
Journal of Building Engineering 30 (2020) 101287


Improving project communication in the architecture, engineering and construction industry: Coupling virtual reality and laser scanning

Fábio Matoseiro Dinis a, *

a CONSTRUCT-GEQUALTEC, Faculty of Engineering, University of Porto, Portugal
b CONSTRUCT-LFC, Faculty of Engineering, University of Porto, Portugal

ARTICLE INFO

Keywords:
Virtual reality
Laser scanning surveying
Building information modelling
Building retrofitting
Project communication

ABSTRACT

In recent years, the demand for accurate, clear and easily understandable information has been steadily rising among the Architecture, Engineering and Construction (AEC) industry's stakeholders. Despite this, a sizeable portion of this industry still follows the traditional approach to Construction Engineering, disregarding major innovations and technological advances. This not only poses a great obstacle to proper communication between project-related entities but also presents a significant challenge for retrofitting projects.
This article proposes a workflow for the improvement of communication in construction projects, in particular between professionals who lack specific BIM skills. By coupling laser scanning and Virtual Reality (VR) within a Building Information Modelling (BIM) work environment, this workflow comprises the entire process from on-site geometric data acquisition, through data treatment and analysis, culminating with the point-cloud importation into a game engine and the development of navigation and interaction tools within the VR environment. The framework is validated through its application to a proof of concept, from which conclusions regarding the workflow's success, limitations and optimization, among other topics, are discussed.

1. Introduction

In recent years, laser scanning has been increasing its importance in the Architecture, Engineering and Construction (AEC) industry, being applied in several areas such as project monitoring [1–3], automated Building Information Modelling (BIM) [4–7], life-cycle assessment [8–10], and retrofitting [11–13]. The latter, in particular, has received the support of several international strategies [14,15] aimed at accomplishing existing industry goals, particularly related to energy efficiency and CO2 emissions [16–18]. However, despite the urgency related to these goals, multiple challenges persist within the retrofitting process. These challenges are often associated with complex and inefficient communication between all the acting parties, caused by the absence of knowledge regarding the building's as-is conditions and the unavoidable involvement of multiple AEC professionals and stakeholders during the retrofitting process and the building's life cycle [18–20].
To this end, this paper demonstrates an auxiliary tool for the design brief, which helps convey the expected requirements and specifications valued by stakeholders (e.g. project owners), using the joint application of Virtual Reality (VR) and laser scanning within a BIM environment. It is expected that this provision of information may contribute to the development of the Asset Information Model (AIM) [21] and further project decisions during the building life cycle. The proposed tool can also be used during the operational stage, creating a suitable interface for non-BIM users to introduce information into the BIM model. The use of a point cloud as the basis for this interface also substantially eliminates the need for a high Level of Development (LOD) BIM model during the operational stage, greatly reducing its associated costs. This objective is accomplished through the proposition of a workflow, followed by its validation in a proof of concept of a possible retrofitting project.
The document is organized as follows: Section 2 presents a review of the available literature on the topics of laser scanning and VR. Section 3 describes the research methodology. Section 4 identifies the adopted Information Technology resources. Section 5 describes the proposed workflow. Section 6 focuses on the proof of concept while also indicating the planned future works. Section 7 presents the conclusions.

Available online 22 February 2020


2352-7102/© 2020 Elsevier Ltd. All rights reserved.

2. Literature review

2.1. Laser scanning for building geometry acquisition

In recent years, major technological advances have allowed for the development of detailed three-dimensional (3D) representations of the as-is building [13,20,22]. These representations are acquired through the application of survey technologies such as laser scanning, photogrammetry, videogrammetry, time-of-flight, and optical triangulation, among others [19,23–28]. Most of these methods tend to require trained operators and expensive equipment [19,23–27]. However, as seen in Refs. [29–31], since 2017, laser scanning has emerged as one of the most relevant topics in the field of BIM, appearing in the top-ranked clusters of knowledge and keywords, in addition to relating to one of the most cited articles [32]. In fact, laser scanning's capacity to perform automatic and quick measurements of distances and angles, combined with its highly accurate capture of complex geometries and minute details, distinguishes this technology from the remaining alternatives [22,27,33]. Furthermore, the development of expert point cloud software such as Leica Cyclone, MeshLab and Autodesk ReCap allows for the quick processing of the acquired data, permitting the elimination of unwanted noise, the alignment and unification of the point clouds, and the conversion of this data into geometric forms [19]. Through this conversion, laser scanning may provide the means for the automatic generation of BIM models, easing the effort to create as-is models while also improving their accuracy and detail. Several research initiatives have focused on the so-called "scan-to-BIM" process with positive results [32,34–38], dividing the process into three main steps: data collection; data processing; and BIM modelling.
Despite this, laser scanning still displays some limitations, requiring expensive equipment and knowledgeable operators. Additionally, its field of view may also be a problem, forcing the user to acquire multiple point clouds from different positions to eliminate occlusions, as seen in Refs. [16,22,34,39].

2.2. Virtual Reality

VR applications have shown encouraging results in a variety of fields related to the AEC industry, presenting solutions for communication and collaboration issues, especially between partners or stakeholders with distinct profiles. In fact, construction projects and related operations are known as eventful and complex scenarios [40], comprising multiple interactions and communication lines between project teams. Methodologies such as BIM, addressed at project information and design management [41], enable teams to work on a collaborative platform, creating and sharing project data [42]. Despite this, as an extensively collaborative technology, BIM still faces limitations in what regards adaptation to the knowledge levels, skills and tasks of different project teams [43,44].
Developing VR scenarios can provide more intuitive, interactive and understandable environments that meet the profile of a broader range of users. Previous case studies describe preliminary favourable results on how BIM-based VR interfaces may offer beneficial platforms for further intuitive interactions with the built environment [45]. In a systematic review of BIM-based VR applications, Sidani et al. [46] describe the main uses of these types of interfaces, the system architecture, and the most used assessment methods. The authors also address the integration of such systems in the building life cycle, the main target groups, as well as the most commonly researched BIM uses.
Several examples of VR interfaces developed for AEC-related applications depict favourable outcomes, e.g. in urban planning [47,48], acoustic assessment [49], supporting design review and decision making [50–53], construction safety [54], improving spatial perception [55] and learning outcomes in Engineering-related areas [56–58].
However, developing suitable VR environments represents a challenging task. In the AEC sector, this limitation is strained by the need to portray a continuously changing real built environment, whose information is often difficult to acquire. Thus, deriving 3D virtual scenarios from point cloud data can significantly reduce the modelling effort necessary to achieve a visually accurate VR environment. As such, VR and laser scanning have previously been applied to a wide range of uses: Brenner and Haala [59] present a method for quickly producing VR models of cities using airborne laser scanning; Fernández-Palacios et al. [60] demonstrate the benefits of using laser scanning and VR in relation to cultural heritage, by producing detailed and photo-realistic virtual environments useful for visualization, documentation, divulgation, museum exhibitions and virtual tourism, among others; Bruno et al. [61] suggest some guidelines for the creation of a virtual exhibition system for realistic high-quality archaeological findings; Balsa-Barreiro and Fritsch [62] detail the challenges of surveying historical cities, presenting a methodology based on laser scanning and photogrammetric techniques for the generation of visually aesthetic and detailed 3D virtual environments; Kersten [63] also displays the potential of coupling these technologies through the creation of a VR model for a portion of an ancient dam. Vincke et al. [52] developed an approach which enables different types of input data formats (i.e., BIM models, meshes, and point clouds) to be introduced into a game engine, providing an inexpensive solution for the visualization of construction works. The authors used the Oculus Rift API to create a virtual scenario where users may travel through the virtual scene, visualize and compare different data types, as well as establish a raw deviation analysis between the as-is BIM and the as-designed BIM models. Thiel et al. [64] describe a rendering system to support the visualization of vast point clouds within an immersive VR scenario. Furthermore, the authors conduct user evaluations regarding a set of interactions to inspect the 3D content (e.g., locomotion techniques, measurement tools, gesture-based interactions). A thorough comparison between different algorithms and techniques to improve the rendering performance of point clouds in VR scenes is presented by Discher et al. [65].

3. Methodology

The present article is based on the development and subsequent validation of a system created to solve a practical empirical problem, following the methodology of previous works [66]. Indeed, the proposed solution is grounded in the principles of Design Science Research, as it intends to provide an innovative artefact and its further validation to address a real-environment application [67]. The methodology presented in Fig. 1 was used to structure the research. It can be divided into four steps:

1. Literature review;
2. Definition of technological requirements;
3. Framework proposal;
4. Framework validation.

Initially, a literature review is presented on the main technologies combined in this study, laser scanning and VR, to map the main developments in the respective study areas. Then, the proposed approach is detailed by defining the technological requirements and developing a framework based on the knowledge made explicit by peers and the authors' own experience. Lastly, the system is validated through a proof of concept that confirms its correspondence and adequacy with the objectives to be achieved.


Fig. 1. Research structure.

4. Information technology resources

Five software tools were used. These tools can be roughly divided into three groups:

• point cloud software;
• VR software;
• and BIM modelling software.

The factors that affected the selection of the software for each group are multifaceted. However, a main factor in the final decision was the software's documented interoperability.
In the first group, Leica Cyclone 9.1 and MeshLab were applied for their well-documented interoperability with one another [68,69], as well as with the laser scanner that was used, Leica's ScanStation P20 [70,71]. In the second group, Unity, a cross-platform game engine which has been applied in recent years to the development of immersive VR environments in AEC applications [72,73], was identified as the optimal platform for the creation of the virtual environment. This was mainly because of its already mentioned widespread use in developing AEC VR environments, as well as its interactivity quality and dedicated assets library [74]. Furthermore, its recognized interoperability with MeshLab was also a deciding factor [75,76]. Finally, for the BIM modelling software, Autodesk Revit and Dynamo (an open-source plugin in Revit for visual programming) were used for their current dominance in the scientific and professional community [20], intuitive and powerful modelling capabilities [77], as well as the well-documented interoperability with Unity [78,79].

5. Workflow overview

In this section, a complete overview of the proposed workflow is presented. In summary, the proposed workflow uses a laser-scanner point cloud as the basis for setting a building's geometry inside a virtual environment, enabling a user to relay information (spoken strings of text) to the design team through interaction with this environment. This workflow has two foreseen uses:

1. provide an innovative and alternative approach to the definition of the expected requirements, qualities and specifications valued by the owner in the design brief, for instance, that should be contained in the AIM according to the specifications of ISO 19650;
2. serve as an interface for non-BIM users during the operational stage (e.g. facility management), when BIM models of the appropriate Level of Development (LOD 500 [80–82]) are not available to be imported into the VR setting.

The tasks comprising this workflow can be seen in Fig. 2, divided into three main stages: i) Point Cloud Acquisition and Treatment; ii) Virtual Reality Environment; and iii) BIM Project Update. Fig. 2 also displays a division of these stages according to their degree of automation. The entire first stage, as well as a portion of the second (Fig. 2, left side), is a largely manual process, including the acquisition and treatment of the point cloud and the initial setting of the VR environment. This portion of the workflow should occur once at project delivery, and then be further updated only when significant changes are made during the operational phase of the building. The remaining portion of the second stage, as well as the last stage (Fig. 2, right side), is mainly automatic and consists of the registration of voice inputs as annotations in the VR scene, which are self-recorded and automatically uploaded to the corresponding location in the BIM authoring tool. Therefore, no modelling skills (e.g., 3D modelling and BIM skills) are needed. The information input triggers an alert in the project's BIM model to be addressed by the design team.

Fig. 2. Proposed workflow.
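The mostly-automatic portion of this workflow rests on a simple hand-off: the VR scene records each voice note together with its 3D position, and the BIM-side script later reads these records back, discarding coordinates that carry no speech. The paper does not specify the intermediate file layout, so the semicolon-separated format and the `Annotation` helper in this sketch are purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    """One voice note pinned to the mesh: a 3D position plus transcribed text."""
    x: float
    y: float
    z: float
    text: str

def serialize(notes):
    """Write annotations as semicolon-separated lines: x;y;z;text."""
    return "\n".join(f"{n.x};{n.y};{n.z};{n.text}" for n in notes)

def parse(raw):
    """Read the lines back, discarding records with no spoken text
    (mirroring the workflow's rule that coordinates without speech
    should not create empty information spheres)."""
    notes = []
    for line in raw.splitlines():
        parts = line.split(";", 3)
        if len(parts) < 4 or not parts[3].strip():
            continue
        x, y, z, text = parts
        notes.append(Annotation(float(x), float(y), float(z), text))
    return notes
```

Keeping the exchange format this small is part of what allows the BIM side to stay fully automatic: any tool that can read plain text can rebuild the annotations.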


In the following sections, this premise will be detailed, describing each required step. These sections follow the workflow illustrated in Fig. 2.

5.1. Point cloud acquisition and treatment

The first stage in the proposed workflow includes the laser scanner surveying of a building, followed by the treatment of the acquired point cloud and its exportation to a mesh design software. The surveying and treatment steps of this stage have been detailed in Ref. [13], where the authors propose and apply a framework for the acquisition and treatment of a building's as-is geometric data using laser scanning. This process includes the planning and setup of the required scan locations, the exportation of the acquired point clouds from the laser scanner to a point cloud software and, lastly, the registration (the process of merging multiple scans into a single shared coordinate system) and cleaning (the process of deleting unwanted data from the scans) of the point clouds. As previously stated in Section 4 and illustrated in Fig. 2, Leica's ScanStation P20 and Cyclone 9.1 were, respectively, the laser scanner and the point cloud software applied in this article to accomplish these steps. To finalize the first stage, the treated point cloud is exported as a Plain Text Data Format (.ptx) file to the mesh design software, which in this article was MeshLab. The .ptx file contains multiple sets of coordinate triplets (X, Y and Z) as well as colour (RGB) and intensity values. In MeshLab, the point cloud first undergoes a reduction in its number of points through the use of "Poisson-disk Sampling" [83] (Fig. 3). This reduction eliminates points in high point density areas (typically near the scan station position) while preserving points in low point density areas (typically farther from the scan station position). This first action yields a manageable point cloud (with a lower file size and fewer negative impacts on the computational power), without compromising relevant point cloud information. This is a manual process, as performance is highly dependent on the point cloud quality and on the hardware. Afterwards, the "Screened Poisson Surface Reconstruction" algorithm [84] is applied to create a mesh from the remaining points (Fig. 4). However, it should be stated that this algorithm may create surfaces that were nonexistent in the initially scanned environment. As such, these surfaces must be eliminated through a semi-manual process using the proper selection algorithms available in MeshLab (Fig. 5). Finally, before exporting the resulting mesh as an Object (.obj) file, the point cloud colours must be transferred to their respective positions on the mesh surface. This process is accomplished through the use of the "Trivial Per-Triangle parametrization" and "Transfer: Vertex colour to texture" algorithms, mapping the entire mesh (securing the proper size and spacing of its triangles) while also creating a Portable Network Graphics (.png) file containing the points' colours as a texture file. Fig. 6 displays the acquired mesh before (left side) and after (right side) overlapping the created texture.

Fig. 3. Original point cloud (left); point cloud after reduction (right).

Fig. 4. Mesh generation using the algorithm "Screened Poisson Surface Reconstruction".

Fig. 5. Removing mesh segments that do not correspond to real scanned surfaces.

Fig. 6. Transferring vertex colours to the mesh: original mesh (left); mesh after the process (right). (For interpretation of the references to colour in this figure legend, the reader is referred to the Web version of this article.)

5.2. Virtual Reality Environment

The VR environment is developed in the Unity game engine. Users can navigate through the VR environment, point to target building elements and add tags containing voice input information. These tags' position and speech information are simultaneously recorded in a Text (.txt) file.
To achieve this, the first step is to import the .obj and .png files as assets into the project. The .obj file contains the acquired mesh without colour information, while the .png file holds the created texture to overlap the .obj using UVW coordinates. The UVW coordinate system is used in computer graphics to distinguish modelling (XYZ coordinates) from mapping [85]. Generally, UVW coordinate systems relate to the process of mapping a texture or material onto an object. The result consists of a set of coloured meshes with their respective textures disposed in their proper position, as initially attained with MeshLab (Fig. 7).
Afterwards, VR techniques were developed to allow the user to navigate within the environment and communicate with the project team. The locomotion technique is mainly accomplished using teleportation mechanisms, that is, by pressing and releasing one of the controllers' touchpads to move to a target location. The voice inputs are recorded using speech recognition functions and, subsequently, pinned to the mesh using a laser pointer (Fig. 8). It should be stated that these interactions were developed to be handled using an HTC VIVE head-mounted display (HMD) and controllers.

5.3. BIM project update

In the third and last stage of the workflow, the .txt file containing the machine-interpreted results of the speech notes inserted by users and their respective 3D coordinates is exported from the interactive VR application to a Dynamo script. This script instantiates an object in the Revit project – an "information sphere" – which contains the spoken notes as text strings. These spheres are placed at the exact coordinates that were retrieved from the .txt file. The 3D representation of these spheres can be seen in Fig. 9.
To perform this step, Dynamo starts by retrieving all the information in the .txt file, creating an array. The cells containing the coordinates and the spoken information are then selected and manipulated to properly locate the spheres inside the Revit project and imbue them with the proper information. Any set of coordinates that is not related to any spoken information is eliminated, preventing the creation of empty spheres. The spheres can then be seen in the Revit project (Fig. 10).
It should be stated that a Dynamo interface was created to help the user easily perform this step. As can be seen in Fig. 11, the interface requires two inputs from the user. The first is the location of the text file. The second is the Revit family to be used. By default, the "information sphere" family is chosen. However, this was left as an input option since users may choose to adopt other families for this purpose.

6. Discussion and future works

Preliminary tests were conducted confirming that the interface corresponds to the original objectives (see Figs. 3–11) of combining laser scanning and VR, within a BIM environment, for the improvement of communication in construction projects. The proof of concept
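Returning to the first stage of the workflow (Section 5.1), the sketch below parses point records and thins dense regions. It assumes one "x y z intensity r g b" record per line and ignores the .ptx header blocks; the voxel-grid thinning is a deliberately simple stand-in for MeshLab's Poisson-disk sampling, reproducing only its qualitative effect of culling dense areas near the scanner while keeping sparse, distant ones:

```python
def parse_points(lines):
    """Parse 'x y z intensity r g b' rows; skip malformed or header lines."""
    pts = []
    for line in lines:
        fields = line.split()
        if len(fields) != 7:
            continue
        x, y, z, intensity = map(float, fields[:4])
        r, g, b = map(int, fields[4:])
        pts.append((x, y, z, intensity, r, g, b))
    return pts

def voxel_downsample(pts, cell=0.05):
    """Keep the first point falling in each cubic cell of size `cell` (metres).
    High-density regions near the scanner lose many points, while sparse,
    distant regions are preserved."""
    seen, kept = set(), []
    for p in pts:
        key = (int(p[0] // cell), int(p[1] // cell), int(p[2] // cell))
        if key not in seen:
            seen.add(key)
            kept.append(p)
    return kept
```

A production pipeline would instead rely on the dedicated tools named in the text (Cyclone, MeshLab); this sketch only makes the density-equalising intent of the reduction step concrete.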

Fig. 7. Importing the mesh as an asset into Unity: mesh without colour information (left); coloured meshes with their respective textures (right). (For interpreta-
tion of the references to colour in this figure legend, the reader is referred to the Web version of this article.)
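The UVW mapping described for the Unity import can be illustrated with a minimal nearest-texel lookup. The texture is represented here as a plain row-major list of RGB tuples rather than a Unity texture object; the point is only the separation between modelling coordinates (XYZ) and mapping coordinates (UV in [0, 1]):

```python
def sample_texture(texture, u, v):
    """Nearest-texel lookup: UV coordinates in [0, 1] address the image
    independently of the vertex's XYZ position, which is the separation
    between modelling space and mapping space described in Section 5.2."""
    h = len(texture)
    w = len(texture[0])
    # Clamp to the valid range, then scale to texel indices.
    col = min(w - 1, max(0, int(u * w)))
    row = min(h - 1, max(0, int(v * h)))
    return texture[row][col]
```

In the actual workflow this lookup is performed by the game engine's renderer; the sketch only shows why the exported .obj needs UV coordinates alongside the .png file.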


Fig. 8. Placing an annotation within the VR interface.
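The usability plan discussed in Section 6 involves the System Usability Scale, which has a fixed, well-known scoring rule: ten statements answered on a 1 to 5 scale, with odd-numbered (positively worded) items contributing (answer - 1), even-numbered (negatively worded) items contributing (5 - answer), and the sum scaled by 2.5 to a 0 to 100 range. A sketch of that rule:

```python
def sus_score(responses):
    """Standard System Usability Scale scoring for ten 1-5 answers.
    Odd-numbered items contribute (answer - 1); even-numbered items
    contribute (5 - answer); the total is scaled by 2.5 to 0-100."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected 10 responses in the range 1-5")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

A respondent who fully agrees with every positive item and fully disagrees with every negative one scores 100; uniformly neutral answers score 50.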

cept. In Ref. [86], the authors describe this method as an evaluation


based on criteria, defined according to the simulated cognitive
processes of the users, required to complete a set of tasks. The method
allows reaching a verdict regarding the user's ability to execute the
correct actions, which are related to the users' goals and capacity to
explore the interface, using the least possible support and instructions
[86]. With a special focus on the ease of learning, this method in-
volves recognizing problematic usability features while going through
specific tasks [88]. Preece, Rogers and Sharp [88] describe five main
aspects encompassing cognitive walkthroughs:

1. Identify the main characteristics of the users, as well as the design


aspects and corresponding series of actions to be evaluated;
Fig. 9. Information sphere 3D representation in Revit. 2. Conduct the analysis with the support of the designers and expert
evaluators;
was performed using a VR-ready laptop (CPU: INTEL I7 – 6700HQ, 3. As the tasks are being completed, the evaluators answer to
RAM: 16 GB DDR4, 256 GB SSD, GPU: NVIDIA GTX 1060 m) and the following questions:
HMD HTC VIVE. • “Will the correct action be sufficiently evident to the user?“;
It should be stated that these preliminary tests were merely fo- • “Will the user notice that the correct action is available?“;
cused on the system's runnability rather than on usability assessment. • “Will the user associate and interpret the response from the
To this end, the authors intend to conduct future works on usability action correctly?“.
assessment based on two well-established approaches from the litera- 4. Register possible causes for usability problems and subsequent
ture, that will elicit relevant inputs to future iterations of the inter- future design modifications;
face: cognitive walkthroughs [86] and guidelines on how to conduct 5. Reconsider the interface design.
usability evaluations [87].
The first approach, cognitive walkthroughs, was regarded as a fea- Regarding the proposed interface, the authors expect future cogni-
sible method attending to the challenges of finding representative tive walkthroughs to focus on the following user profile: project own-
users, as well as the current early stage of the interface – proof of con-

Fig. 10. Information sphere 3D representation.

6
Journal of Building Engineering 30 (2020) 101287

serving as an interface for non-BIM users to introduce information


into the BIM model.
As seen by the proof of concept of a retrofitting scenario, the pro-
posed system is divided into three main stages, two of which concern
the preparation of the hardware and software tools (i.e. the manual
process of acquisition and treatment of the point cloud, and setting
the VR scene). Users may then step into an immersive as-is virtual
building environment and automatically relay information (spoken
strings of text) to the design team through the update of the associ-
ated BIM model. The main contribution of this study to the field of
knowledge is to present an approach to overcome the obstacle to
proper communication between project-related entities using state of
the art technology, especially in cases of retrofitting processes.
Future works will target validation through cognitive walk-
throughs attending to the challenges of finding representative users,
and the current early stage of the interface – proof of concept. The au-
thors expect this method to shed light on the main adequacy issues of
the proposed solution. Subsequently, during future development
stages adjacent to the finished system, usability evaluations will be
used to validate the system.

Fig. 11. Interface created for the performance of the third step.
Declaration of competing interest

ers and on-site AEC professionals with supervising responsibilities There are no conflict of interest.
(e.g. facility managers or maintenance technicians).
The second approach to be considered during future iterations of Acknowledgements
the proposed interface will follow Nielsen's [87] guidelines on how to
conduct usability evaluations of near-finished systems. According to This article has been developed from the results obtained within
Nielsen [87], one can establish a range of levels for a specific usability the framework of the CONSTRUCT project, as well as the support of
attribute through a “usability goal line”, which is then used to assess Fundação para a Ciência e Tecnologia (FCT) through the research pro-
how well users interact with the system. Usability goal lines illustrate ject SFRH/BD/129652/2017.
the minimum acceptable, as well as aimed levels of usability to be Project POCI-01-0145-FEDER-007457 - CONSTRUCT - Institute of
achieved for a certain attribute (see also Rideout [89]). R&D In Structures and Construction is funded by ERDF funds through
Regarding the proposed interface, the authors suggest the selection COMPETE2020 - Programa Operacional Competitividade e Interna-
of the following four usability attributes: cionalização (POCI) – and by national funds through FCT - Fundação
para a Ciência e a Tecnologia.
1. Efficiency – resources consumed to attain a specific level of
performance [87]; References
2. Effectiveness – extent of task completion [90];
[1] Z. Pučko, N. Šuman, D. Rebolj, Automated continuous construction progress
3. Learnability – ease of learning the interface [88]; monitoring using multiple workplace real time 3D scans, Adv. Eng. Inf. 38 (2018)
4. Subjective satisfaction – appeal of using the system compared 27–40, https://doi.org/10.1016/j.aei.2018.06.001.
[2] C. Zhang, D. Arditi, Automated progress control using laser scanning technology,
with the users' initial needs and expectations [87].
Autom. ConStruct. 36 (2013) 108–116, https://doi.org/10.1016/j.autcon.2013.08.
012.
To be able to set the usability goal lines experts may be requested [3] S. El-Omari, O. Moselhi, Integrating 3D laser scanning and photogrammetry for
to test a range of pilot tasks and establish suitable attribute values. progress measurement of construction work, Autom. ConStruct. 18 (1) (2008) 1–9,
https://doi.org/10.1016/j.autcon.2008.05.006.
For instance, in the case of efficiency (1), the time required to achieve [4] H. Macher, T. Landes, P. Grussenmeyer, From point clouds to building
a skill level able to fulfil the proposed task needs to be established. information models: 3D semi-automatic reconstruction of indoors of existing
In the case of the assessment of subjective satisfaction (3) and buildings, Appl. Sci. 7 (10) (2017) 1030, https://doi.org/10.3390/app7101030.
[5] C. Thomson, J. Boehm, Automatic geometry generation from point clouds for
learnability (4), the authors will consider using the System Usability BIM, Rem. Sens. 7 (9) (2015) 11753–11775, https://doi.org/10.3390/
Scale [91,92], a well-established questionnaire providing insight on rs70911753.
the two aforementioned attributes. [6] J. Jung, C. Stachniss, S. Ju, J. Heo, Automated 3D volumetric reconstruction of
multiple-room building interiors for as-built BIM, Adv. Eng. Inf. 38 (2018) 811–
825, https://doi.org/10.1016/j.aei.2018.10.007.
7. Conclusions [7] M. Bassier, B. Van Genechten, M. Vergauwen, Classification of sensor
independent point cloud data of building objects using random forests, J. Build.
Eng. 21 (2019) 468–477, https://doi.org/10.1016/j.jobe.2018.04.027.
The present paper proposes a semi-automatic workflow and a set of software tools as an alternative approach for the design brief regarding the definition of expected requirements, qualities and specifications valued by stakeholders (i.e., project owners). The result is thus an informal log of the owner's intentions and enumeration of changes. Furthermore, the proposed solution intends to provide innovative means to support the flow of information between parties involved in construction projects by coupling laser scanning, VR and BIM technology. The proposed workflow may also be applied in other stages of the project and building's lifecycle, namely in the operational phase,
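The workflow summarized above passes laser-scan point clouds through a data-treatment stage before importing them into a game engine for VR navigation. As a purely illustrative sketch of one common treatment step — not the authors' actual toolchain — the following hypothetical Python snippet performs voxel-grid downsampling, collapsing dense scan data to one averaged point per occupied voxel so the cloud stays renderable in real time:

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Collapse a point cloud onto a voxel grid of the given cell size,
    keeping one averaged point per occupied voxel. A typical density-
    reduction step before importing scan data into a game engine."""
    buckets = defaultdict(list)
    for p in points:
        # Index each point by the voxel cell it falls into.
        key = tuple(int(c // voxel_size) for c in p)
        buckets[key].append(p)
    # Replace each cell's points by their centroid.
    return [
        tuple(sum(coords) / len(pts) for coords in zip(*pts))
        for pts in buckets.values()
    ]

# Two nearby points share a 1 m voxel and merge; the distant one survives.
cloud = [(0.1, 0.1, 0.1), (0.2, 0.2, 0.2), (5.0, 5.0, 5.0)]
reduced = voxel_downsample(cloud, 1.0)
print(len(reduced))  # 2
```

In a production pipeline this step would normally be delegated to a dedicated library (e.g., a voxel-grid filter in a point-cloud processing toolkit) rather than hand-rolled; the sketch only illustrates the principle.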