Advanced Engineering Informatics


Integrated Environmental Design and Robotics Workflow for Double-Skin Facade Perforation Onsite
--Manuscript Draft--

Manuscript Number: ADVEI-D-20-00667

Article Type: VSI: Robotics in Construction

Keywords: double facade skin perforation; environmental design; robotics in construction; VR/AR for monitoring; Digital Twin in Construction

Corresponding Author: Ahmed Khairadeen Ali, PhD candidate
Chung-Ang University, Seoul, Republic of Korea

First Author: Ahmed Khairadeen Ali, PhD candidate

Order of Authors: Ahmed Khairadeen Ali, PhD candidate; One Jae Lee, M.Sc. in Architecture; Song Hayub, Professor

Abstract: In contemporary design practice, a disconnect remains between the design techniques used for early-stage design experimentation and performance analysis and those used for manufacturing and construction. This paper addresses the issues of creating an integrated digital design workflow and provides a research framework for integrating environmental performance into a robotic manufacturing process on the construction site. The method allows the user to import a design surface, identify design parameters, set several environmental performance goals, and then simulate and select a robotic building strategy. Based on these inputs, design alternatives are created and evaluated against their performance criteria, taking into account their robotically simulated constructability. To validate the proposed framework, the design is explored and tested in an experimental case study of double-skin façade perforation generated by the proposed methodology. Initial results describe a heuristic feature that improves simulated robotic constructability and demonstrate the functionality of our prototype.

Cover Letter

Integrated Environmental Design and Robotics Workflow for Double-Skin Facade Perforation Onsite
Ahmed Khairadeen Ali1,2*, One Jae Lee3, Song Hayub1

1. School of Architecture and Building Science, Chung Ang University, Seoul, Republic of Korea
2. Architectural Department, College of Engineering, University of Duhok, Iraq
3. Haenglim Architecture and Engineering Company, Seoul, Republic of Korea

E-mail: ahmedshingaly@gmail.com, o.lee@haenglim.com, hysong@cau.ac.kr

The results demonstrate that double-skin façade design and perforation can be accomplished in an integrated manner on the job site by utilizing robotics and visual programming in the main design and construction stages.

Contributions:
a. This study shows that the iFOBOT system can produce façade perforation designs compliant with LEED v4 and use them to guide a robot arm in the milling stage.
b. iFOBOT proposes a simplified platform that can increase collaboration among disciplines, including environmental analysis, design, construction, robot milling, and perforation monitoring. These stages of façade design can be integrated in a single iFOBOT system and exchange data more efficiently, without requiring an intermediary or different data formats.
c. iFOBOT employs a robot arm for perforation activity on the construction site to increase automation in design and construction and to reduce the manpower needed for this activity.
Highlights

● Robotics can perforate facades on the construction site in compliance with environmental requirements

● Visual algorithms can integrate facade design and construction using robotics

● Environmentally compliant facade perforation design can be integrated with on-site robot milling

● Near real-time monitoring of robotic construction is possible using extended reality


Abstract
In contemporary design practice, a disconnect remains between the design techniques used for early-stage design experimentation and performance analysis and those used for manufacturing and construction. This paper addresses the issues of creating an integrated digital design workflow and provides a research framework for integrating environmental performance into a robotic manufacturing process on the construction site. The method allows the user to import a design surface, identify design parameters, set several environmental performance goals, and then simulate and select a robotic building strategy. Based on these inputs, design alternatives are created and evaluated against their performance criteria, taking into account their robotically simulated constructability. To validate the proposed framework, the design is explored and tested in an experimental case study of double-skin façade perforation generated by the proposed methodology. Initial results describe a heuristic feature that improves simulated robotic constructability and demonstrate the functionality of our prototype.

Keywords: double facade skin perforation, environmental design, robotics in construction, VR/AR for monitoring, Digital Twin in Construction.

Introduction
Automated systems and robotics are recognized as technologies that will revolutionize the building industry by addressing problems of profitability while increasing efficiency [1,2]. On-site robotics has gained significant attention for building applications over the last few years. It has been utilized for various construction operations, including milling, perforation, inspection and structural safety monitoring [3–5], manufacturing of building components [2], assembly of building components [6–9], handling of materials [6,10,11], and surveying and control of construction [12–14].

The rapid development of digital design methodologies and innovations in the Architecture, Engineering and Construction (AEC) industry, and their combination with modern manufacturing and construction innovations, have provided architects with opportunities to step away from Fordist standardization toward a post-Fordist domain of previously unattainable design possibilities [15,16]. Although symmetry and repetition have brought economic efficiencies to the AEC industry, these principles are now being rendered obsolete by an age of digital design, analysis, and manufacturing [6].

Building Information Modeling (BIM) has made the discovery of new forms and multiple design alternatives more effective for a wide range of pre- and post-rationalization design approaches [6,15]. In addition, simulations help designers make more informed design decisions by ensuring greater consistency in the decision-making process for dynamic and synthetic design [17].
The current construction trend is to continue to demonstrate the value of non-standard architecture and to use its intricacy and distinction as an opportunity not only to provide higher-performing design solutions in a multi-objective, analytical way but also to pursue qualitative esthetic metrics and delightful communication [3,13,18,19]. In this regard, robotic manufacturing provides a unique opportunity to rethink the architectural design process: not only designing the building shape but also designing the construction process itself, which produces a manifestation of its own esthetic and intrinsic post-Fordist performance gains [1,7,20]. The rapid convergence of advanced digital and robotic modeling has contributed to the architect's reconsideration as both a contemporary master-builder and a digital toolmaker [21,22]. This further illustrates the need for more comprehensive solutions for the AEC industry and the importance of developing new design methodologies that endorse a more holistic approach to robotics adoption [23].

Despite the many advances, a gap in intuitive workflows remains for designers to integrate simulation results as drivers and constraints early in the design process [21]. Furthermore, diverse software is used for the various design phases (concept generation, modeling, manufacturing, and construction planning), and consequently the continuous exchange of knowledge between the AEC disciplines remains a challenge [24–26]. Current design methods and analytical resources remain limited in their foresight and in the registration of design parameters and constraints from upstream to downstream [16]. In general, these approaches take into account neither the limitations of assembly and construction nor the sensitivity of projects to local conditions and the nature of comparable real-world noise [24,25].

There are currently few robust and efficient programming strategies available to control multiple robots under custom semi-automated construction conditions [2]. Challenges for these programming tasks include (a) a constantly evolving environment (building sites); (b) much smaller production volumes (buildings are one-off products); and (c) a much wider variety of required tasks [27,28].

This research presents an integrated approach to double-skin facade perforation on the construction site using a robot arm, in compliance with the LEED v4 daylight index, by utilizing visual programming and extended reality technologies. As a first step, we focus on coupling environmental performance simulations and double-skin facade perforations to inform robotic construction simulations, which in turn inform perforation design configurations in our system. A case study was built and tested to determine the benefits and disadvantages of the proposed program. This research hypothesizes that the next steps in digital design and robotic manufacturing are the development of methodologies that regard computers and robots as collaborative partners in the design process, as having agency and the capacity to record contextual factors, environmental analyses, and construction constraints to provide architects with design alternatives that meet complex requirements. The authors developed a new integrated robotic workflow architecture to tackle this issue, bypassing several proprietary software systems, each requiring specialized training and significant annual investment. The core purpose of this work is to provide a framework that covers the major facade perforation factors, combining architecture, climate compliance, construction, and quality assurance in one simple-to-use and effective platform. The proposed system is built to serve as a shared workflow among architects, engineers, specialist consultants, and manufacturers in one platform, exchanging information more smoothly. This research is limited to double-skin facade perforation design and does not consider other facade design iterations.

Literature Review

2.1 Involvement of robotics in architecture design and construction
The use of large-scale CAD/CAM manufacturing and robotics in the AEC industry bridges the gap between what is digitally designed and what can be achieved physically [20]. The AEC industry, however, remains a trade-oriented and labor-intensive sector with limited job automation. Research on construction-specific robots started in Japan in the 1980s and centered on various activities related to building [29].
However, the unpredictability of construction sites, equipment limitations, and high cost have resulted in the practical use of only a few construction robots on site [12]. Related work has centered either on applying various approaches to specific robotic construction subtasks, such as assembly robots, or on large-scale additive manufacturing techniques [7,30]. Such approaches use machine control to take advantage of the superior surface-shaping capabilities of concrete and adobe frameworks, or incorporate autonomous distributed robotic systems to execute basic tasks collectively [1,31].

More recently in architecture, multifunctional industrial robots have become popular in automation precisely because, like personal computers, they are not designed for a particular mission but are appropriate for a wide variety of applications [21].
Most notably, the widespread use of industrial robots has allowed the development of specialized manufacturing of architectural objects and mass-personalized manufacturing.
This was followed by an increasing interest in reviving traditional techniques, or "digital craftsmanship," which contributes to creative and regionally appropriate architectural design that is formally complicated and complex, and often of higher quality [16,32].
To date, automated modeling methods have mainly been used as a more efficient version of the drafting table and not as active assistance in the design process [18,20]. Current methods integrate performance input, but only recently has the technology become available to use results-based simulation to inform generative design [17,33]. Although tools for designing complex geometric shapes are now commonplace, there is a lack of intuitive tools that enable designers to design and explore construction processes, particularly robotic solutions and their highly complex and precise geometric configurations [2].
This is due, first, to the open-endedness of design problems and their unknown complexity, which often involves hard-to-calculate synthesis [15]; second, to the general lack of a deep understanding and abstraction of fundamental shaping processes in complexity that would allow us to build tools to support our design intuition [20]; and third, to the exponential advancement of automated production and robotic technology on the one side and the growing need for architects to build programming skills on the other [16,34]. Combined, these factors have inhibited the development and role of computing and robotics in shaping and further contextualizing design awareness and integrating it into architectural design decisions [35].

2.2 Automated continuous construction progress monitoring using 3D scans and extended reality
Building industry concerns are growing for timely and reliable knowledge about the status of a building project [4]. Unfortunately, one of the most persistent challenges of building project management is constant supervision and regulation at all stages of the project [36,37]. That involves assessing progress through site assessments and alignment with the project schedule, while the accuracy of progress results relies strongly on the expertise of the surveyor and the consistency of the measurements [38,39].
Manual visual assessments and standard progress reporting help to provide reviews on progress measurements [39], hardware and inventory tracking [40], health planning [6], and production tracking [41]. However, such a cycle requires time, is vulnerable to error, and is infrequent [16]. Kinect is one of the most common 3D scanning tools used for building textual analysis [40], spatial accuracy of depth data and indoor mapping applications [42,43], near-range 3D measurements and modeling [42,44], and automatic identification of worker actions.

According to Kwon et al. (2014), introducing BIM models and AR offers an opportunity to improve current inspection models concerning defect and quality control [45]. Specific details such as sketches, schedules, and materials from a BIM model can be translated to a marker that renders the actual components on site. Park et al. (2013) describe the AR implementation process in three stages: defining objects and estimating information, predicting positions of virtual objects through the estimated information, and finally integrating the real world with the virtual elements [46,47].

Several studies have evaluated the combined usage of BIM and AR/VR. Bae et al. (2013) analyzed a mobile AR app in which the user takes photographs of the site; by matching them against pre-collected site photographs, the program determines the user's position relative to the site and delivers cyber-information accurately depending on the user's role [48]. The frameworks display a model that can be viewed at various times, and users can use a mobile device to monitor progress from different areas. The integration of the two techniques was also used in scanning a historic house to provide a virtual reality interface with enhanced levels of information [49].

The authors hypothesize that the next moves in digital architecture and robotic manufacturing are the creation of methodologies that regard computers and robots as collaborative partners in the design phase, as providing agency and the capacity to document contextual factors, environmental analyses, and construction restrictions to provide architects with design alternatives that meet specific requirements.
Despite the various advancements, a practical workflow void exists for designers to integrate simulation outcomes as drivers and constraints early in the design phase [21]. Additionally, diverse software is used for different design processes (concept development, modeling, production, and construction planning), and therefore continuous sharing of knowledge between the AEC disciplines remains a challenge. Moreover, based on the results of the works examined, none of the approaches found is yet capable of offering extensive and accurate construction surveillance that covers the whole building (outdoor and indoor) during the construction phase.

Methodology
The objective of this research is to develop a facade construction methodology for the construction site using a robot arm, as an extension of our robot-based facade assembly optimization [18]. The extension of the iFOBOT system proposes double-skin facade perforation on the construction site and adds environmental analysis to the design stage and a monitoring tool to the construction stage of the previous iFOBOT system.

The research questions are as follows. How can a simple facade design tool be built in compliance with LEED environmental analysis rules? How can robotics constraints and environmental analysis validate simulations as performance drivers to improve a designer's decision-making process? Can the design and construction of a double-skin facade be integrated using the iFOBOT system, limiting human input to selecting design parameters, while monitoring the perforation process performed by a robot arm?
A bottom-up approach to facade design, in which environmental analysis and robot-operating strategies drive perforation size and location right from the beginning of double-skin facade design, can provide designers with optimized solutions for constructability issues and feedback on the perforation activity on the construction job site [8,50]. This research therefore explores the hypothesis that robotic simulations can become an integral part of the digital design workflow, combining robotic agency with other objectives, including environmental analysis and heuristic construction efficiency calculation.

This research proposes double-skin facade perforation on the construction job site using a robot arm and environmental analysis. The system is called iFOBOT: the letter "i" stands for minimum human input, "F" stands for facade design and fabrication, and "OBOT" stands for the involvement of a robot arm in the system.

The proposed system architecture comprises four modules: (1) the iFOBOT-environment module, which provides the essential environmental analysis to guide the main double-skin facade outlines; (2) iFOBOT-design, which generates a perforated exterior layer of the double-skin facade according to the previous module's outlines; (3) the iFOBOT-construct module, which controls robot arm movement on the construction site for facade perforation; and (4) the iFOBOT-monitor module, which observes the perforation activity on the construction site and exchanges data between the construction office and the job site in near real-time. The iFOBOT system utilizes the graphical algorithm editor Grasshopper [51] inside the commercial 3D modeling software Rhinoceros 3D [52].

The proposed system's inputs are targeted at building envelopes, including the interior space, interior facade, and exterior facade layer geometry. In addition, the inputs include LEED regulations comprising environmental information, rule constraints, and the geospatial location used for environmental evaluation. The system input is imported into iFOBOT-environment, which produces a LEED-compliance heatmap of the exterior facade layer and passes it to the other system modules. iFOBOT-design then uses the heatmap from iFOBOT-environment as a guide to generate a perforated facade layer and passes it to the next module. After that, iFOBOT-construct uses a perforation toolpath to guide the robot arm in milling the exterior facade wall onsite. iFOBOT-construct's trade-off metrics are daylight factor analysis, collision checking of the robot arm body with surrounding objects, singularity avoidance, reachability checking, and reduction of movement travel time. Finally, iFOBOT-monitor registers robot movement and perforation details and passes them to the construction office for quality checking, then sends review comments back to the construction job site by utilizing 3D scanner and extended reality technologies. The system outputs are the perforated double-skin facade geometry, perforation drawings, environmental analysis, a KUKA KRL file to run the robot arm, and construction progress steps, as depicted in Figure 1.

Figure 1. iFOBOT system architecture and data workflow between its modules
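The dataflow between the four modules can be sketched as a simple pipeline. The Python stub below is illustrative only: the function names and payloads are hypothetical stand-ins for the actual Grasshopper components, not the iFOBOT implementation.

```python
def ifobot_environment(envelope, weather_file):
    """Stub: produce a LEED-compliance heatmap for the exterior facade layer.
    Values are normalized exposure levels (0 = fully compliant, 1 = overexposed)."""
    return {"surface": envelope, "heatmap": [[0.2, 0.8], [0.1, 0.4]]}

def ifobot_design(env_out):
    """Stub: turn heatmap cells into perforation outlines; radius shrinks with exposure."""
    return [{"cell": (i, j), "radius": round(0.05 - 0.04 * v, 4)}
            for i, row in enumerate(env_out["heatmap"])
            for j, v in enumerate(row)]

def ifobot_construct(perforations):
    """Stub: emit one milling toolpath entry per perforation (placeholder for KRL export)."""
    return [{"target": p["cell"], "radius": p["radius"], "depth_m": 0.02}
            for p in perforations]

def ifobot_monitor(toolpaths):
    """Stub: summarize progress for the construction office."""
    return {"milled": len(toolpaths), "status": "ok"}

# one pass of the pipeline over a placeholder envelope and weather file
report = ifobot_monitor(ifobot_construct(ifobot_design(
    ifobot_environment("exterior_layer", "seoul.epw"))))
```

The point of the sketch is the one-directional hand-off: each module consumes only the previous module's output, which is what lets the system avoid intermediate file formats.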
3.1 iFOBOT-environment
In general, the iFOBOT-environment module uses predefined input data, including weather data analysis, to develop an environment rule compliance checking tool that generates an environment-friendly facade perforation heatmap and passes it to iFOBOT-design.

iFOBOT-environment's inputs are weather data, building location, and orientation. These inputs are used to build sun-path control and climate-based performance simulation using the Ladybug and Honeybee environmental plugins [53] inside the Grasshopper canvas.

This method is developed to measure Annual Sunlight Exposure (ASE), a metric that identifies the potential for visual discomfort in interior space. iFOBOT-environment calculates ASE to check the Leadership in Energy and Environmental Design (LEED) v4 daylight credit [18]. No more than 10 percent of the interior space should receive direct sunlight above 1000 lux for more than 250 hours per year (ASE1000,250). The ASE calculation grid size should be no larger than 600 millimeters square, at a work-plane height of 750 millimeters above the finished floor. The calculation uses an hourly time-step analysis based on typical meteorological year data for the nearest available weather station, or an equivalent; it includes any permanent interior obstructions and excludes movable furniture and partitions.

iFOBOT-environment breaks the ASE calculation down into discrete components to compute it for the exterior-layer geometry of the double-skin facade. In particular, ASE tests the percentage of floor area receiving more than 1000 lux of direct sun only, for more than 250 occupied hours per year. The visual program script is built upon the ASE procedure in the LEED credit index [54].

As such, iFOBOT-environment starts by measuring only the direct sun that falls on the exterior horizontal surface. This value is then multiplied by the window transmittance and used to pick sun positions with values just above the 1000 lux threshold. The resulting sun vectors are then used in the "sunlight hours" measurement to calculate the floor area that exceeds 250 hours per year. The resulting values indicate whether the facade design perforations meet or fail these criteria, as depicted in Figure 2.
Figure 2. iFOBOT-environment module workflow, showing the input/output data passed to other modules

3.2 iFOBOT-construct
iFOBOT-construct is responsible for robot movement during the perforation activity on the construction site. It receives the facade perforation design from iFOBOT-design, together with its design parameters, and generates robot control data.

iFOBOT-design provides the perforation outlines and depth needed for the robot arm to mill the facade exterior layer. Designers need to input predefined parameters into iFOBOT-construct for the digital robot movement simulation to work. The input data include the robot arm type selection, location, orientation, milling spindle tool, robot brand, a 3D model of the construction environment, and collision objects surrounding the perforation activity on the construction site. This module uses the input data to calculate the robot arm's safe working space, as depicted in Figure 3 (components e, d). The module then divides the building envelope into WorkStations (WS) according to the robot arm's safe working space parameters, as depicted in Figure 3 (component c). The workstation script was built upon the iFOBOT-B dataset [17].
iFOBOT-construct uses the KUKA|prc plugin [55] inside the Grasshopper environment. It builds a parametric model of robot arm movement and a kinematic simulation of the robot. The outputs of this module are workstation 3D models of the perforation activity, including the robot arm, facade exterior wall, construction constraints, and perforation toolpath. The output also includes a KUKA KRL file that can be executed on the KUKA robot without any additional software, as depicted in Figure 3.
Figure 3. iFOBOT-construct module explaining workflow of robot simulation and perforation
activity.
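The division of the envelope into workstations can be approximated as follows. This is a simplified planar sketch under stated assumptions (a flat facade, one fixed base position per workstation, and a square coverage patch inscribed in the reach circle); the actual iFOBOT-construct logic in KUKA|prc handles the full kinematics, collisions, and singularities.

```python
import math

def plan_workstations(facade_w, facade_h, reach_radius, overlap=0.1):
    """Grid a facade of facade_w x facade_h meters into workstations sized to the
    largest square a robot can cover from one base position (side = r * sqrt(2)),
    shrunk by an overlap margin so adjacent stations share a milling seam."""
    side = reach_radius * math.sqrt(2) * (1.0 - overlap)
    nx = math.ceil(facade_w / side)   # columns of workstations
    ny = math.ceil(facade_h / side)   # rows of workstations
    # base-position centers, evenly distributed over the facade
    return [((ix + 0.5) * facade_w / nx, (iy + 0.5) * facade_h / ny)
            for iy in range(ny) for ix in range(nx)]

stations = plan_workstations(10.0, 3.0, 2.0)  # 10 m x 3 m facade, 2 m reach
```

For this toy facade the plan yields a 4 x 2 grid of eight base positions; shrinking the overlap margin or enlarging the reach reduces the number of repositioning moves.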

3.3 iFOBOT-monitor
The iFOBOT-monitor module covers robot arm monitoring on the construction site by utilizing 3D camera scanning and extended reality. iFOBOT-monitor was added as an essential subsystem of iFOBOT because a robot arm running inside a continuously changing, dynamic environment carries high risk, and a good monitoring tool that captures the robot arm movement and exchanges data with the construction office is essential for onsite perforation activity.

iFOBOT-monitor starts with predefined settings, including the scanner coordinates on the job site, field of view, resolution quality, color coding of the robot arm, robot workstation model location, and BIM level of detail. Based on the works examined, none of the observed approaches is yet capable of providing extensive and accurate construction monitoring that covers the entire building (outdoor and indoor) during the construction process. Existing work focuses on 4D as-built vs. 4D as-designed comparison, whereas our goal was to create a system that addresses the need for additional point cloud acquisition effort and the issue of incomplete point clouds due to obscured or missing building elements (e.g., indoor or outdoor scanning online). The basic idea was to create 4D point clouds continuously, on the go, by collecting 3D scans from the site using suitable low-cost 3D scanning tools such as Kinect [43,47]. This research utilized a Kinect v2 sensor in the case study experiment, but many other sensors fit this category (e.g., the Xtion Pro Live 3D sensor or the Structure Sensor). The approach is explained further on in the article.
The 3D scanner registers point cloud data of the workstation, including robot movements, facade perforations, construction constraints, and surroundings, and stores it on a portable computer at the job site using the Quokka Kinect data processing [56] and Firefly real-time data processing [57] plugins inside the Grasshopper environment. It then processes the point cloud data of the target objects (in our case, facade perforations) into 3D mesh geometry with reduced vertices to make the file light and transferable. Next, iFOBOT-monitor utilizes Speckle [58] to send the 3D mesh geometry to the construction office and sync/update the digital models in the BIM environment. After that, the inspector uses virtual reality to check the progress rate, perform quality checks, add notes, view the schedule, and compare the BIM model with incoming data from the job site using the Mindesk plugin [59] inside the Grasshopper environment. Finally, the inspector sends the digital inspection report feedback to the job site for the worker to view in augmented reality mode using the Fologram plugin [60] inside the Grasshopper environment, as depicted in Figure 4.

Figure 4. iFOBOT-monitor module illustrating progress checking by utilizing 3D laser scanning and extended reality technology
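The vertex-reduction step that keeps the scan payload light can be sketched as a voxel-grid downsample. This is a minimal NumPy stand-in for illustration (the Quokka/Firefly plugins perform the real processing): points falling in the same voxel are averaged into a single centroid before meshing.

```python
import numpy as np

def voxel_downsample(points, voxel=0.05):
    """points: (n, 3) scanned coordinates in meters. Returns one centroid per
    occupied voxel, shrinking the cloud before it is meshed and sent to the office."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inverse, counts = np.unique(keys, axis=0,
                                   return_inverse=True, return_counts=True)
    inverse = inverse.ravel()               # guard against (n, 1) inverse shapes
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inverse, points)        # accumulate points per voxel
    return sums / counts[:, None]           # voxel centroids

cloud = np.array([[0.01, 0.01, 0.01],
                  [0.02, 0.02, 0.02],
                  [1.00, 1.00, 1.00]])
small = voxel_downsample(cloud)  # two near points merge into one centroid; shape (2, 3)
```

The voxel size trades fidelity for payload: a 5 cm grid is usually coarse enough for progress checking while cutting the vertex count by orders of magnitude on dense Kinect scans.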

CASE STUDY
The core thermal model variables are illustrated in Figure 5. This is a single-zone layout, with all surfaces set to adiabatic except the glass façade, reflecting a single space in a larger building. For the thermal, daylight, and glare simulations, each shading variant was applied to this model's exterior. The case study is located at 471-080 Galmae-dong, Guri-si, Gyeonggi-do, South Korea, near Seoul.
Four facade design choices were evaluated to compare the efficiency of traditional shading approaches with a more rigorous, constraint-driven solution. The iFOBOT process starts by running four different exterior facade designs through iFOBOT-environment and checking the portion of space that receives more than 250 hours of daylight, along with the LEED status of the geometry:

● Design 1: 54% of the space receives > 250 hours; fails the LEED index.

● Design 2: 0%; passes the LEED index.

● Design 3: 20%; LEED status of the geometry = B.

● Design 4: 1%; passes the LEED v4 index.

These results are depicted in Figure 5 (components 1, 2, 3, and 4).

The heatmaps of the four facade alternatives demonstrate the traditional Sun-Path solution
model. Determined by an hourly thermal simulation, solar benefit desirability is mapped to the
region of the sky where the sun is at the respective hour. The chart further highlights the tension
implicit in the reality that weather conditions during the summer solstice are not symmetrical;
solar gain can be beneficial for the same sun location in the spring, and unwanted in the fall.

The 3D design model at this stage includes all the features required to incorporate detailing and structural constraints. As the model is developed inside the same software framework that supports environmental optimization, it is simple to re-run the Ladybug simulation in the Rhino environment on the final design. The simulation helps in understanding how shading system performance has been affected by incorporating construction constraints. An iterative feedback loop between simulation and concept detailing is enabled because the detailing requires integration. Once a satisfactory solution has been obtained, the final geometry is frozen for use in the robotic code generation needed for the production process.

4.1 Case study validating iFOBOT-environment and iFOBOT-design
The iFOBOT-design module then used the heatmap of the best facade design (design 4 in our
case) to generate perforations corresponding to the initial facade openings, using an image-sampler
component inside the Grasshopper environment. After that, iFOBOT-design and iFOBOT-environment
pass the perforated facade openings to iFOBOT-construct for on-site construction
using robot arms, as depicted in Figure 5.
The geometry of the perforation is further designed to meet various building requirements and
material constraints. The environmental design phase yields a continuous surface model
of the perforation, which is subdivided in a subsequent stage to fit the supporting framework and the
corresponding environmental heatmap. A Grasshopper definition designed to balance the user-defined
supporting structure and accept the desired relation information automates the
surface subdivision. The resulting individual perforation surface geometry is used to identify each
component's central axis and to locate mounting mechanisms on the support.
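The image-sampler step can be sketched in plain Python: a normalized heatmap sample taken at each opening's centre is mapped to a perforation radius, with hotter (higher-exposure) regions receiving smaller openings to limit direct sun. The radius bounds and the inverse linear mapping are illustrative assumptions, not values from the case study.

```python
def perforation_radius(brightness, r_min=5.0, r_max=40.0):
    """Map a normalized heatmap sample (0 = low solar exposure,
    1 = high) to a perforation radius in millimetres.  Hotter regions
    get smaller openings; bounds are illustrative assumptions."""
    b = min(max(brightness, 0.0), 1.0)   # clamp out-of-range samples
    return r_max - b * (r_max - r_min)

# Sample values read from the heatmap at three opening centres:
radii = [perforation_radius(b) for b in (0.0, 0.5, 1.0)]
```

In the actual workflow this role is played by Grasshopper's image-sampler component driving the parametric perforation surfaces.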
Figure 5. One room in a building case study illustrating iFOBOT-environment and iFOBOT-
design tool

4.2 Case study validating iFOBOT-construct


In practice, this latter stage will be carried out by the designer in operation, while the
earlier phases remain with the construction team comprising the contractor, façade specialist,
and environmental engineer.

The construction stage uses a 6-axis commercial KUKA HA 30/60 robotic arm. A series of custom
Grasshopper definitions were created, deriving robot positioning commands from the geometry
optimized in the previous stages of the workflow. At this point, every digital part of the
central digital construction model is perforated using a flat end-mill bit, rotated and placed
relative to the manufacturing process requirements. Additional material restrictions can be
considered within the scripts and included as components. To meet the dimensional tolerances
expected of high-quality industrial facade perforation products, components are also
dimensionally rectified at the end of production. For the same purpose, the manufacturing
process uses robotic post-processing in the green state, which provides consistent thickness
and the exact shape of the component, as illustrated in Figure 6.
Figure 6. Case study perforation activity in the construction job site using a robot arm

In the case being tested, robotic milling is used to achieve dimensional rectification using
the automation tools developed. With the KUKA plug-in inside Grasshopper, code for surface-milling
tool paths and contour cuts is generated quickly [55]. Without manual tooling, all necessary code
is generated directly from the central Rhinoceros model for the 6-axis robotic work cell.
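KUKA|prc generates the robot code internally from the Grasshopper model; purely as an illustration of what such a toolpath export contains, the sketch below emits minimal KRL-style linear moves for a list of toolpath points. The program name, velocity line, and formatting are hypothetical, not the plugin's actual output.

```python
def toolpath_to_krl(points, velocity=0.05):
    """Emit minimal KRL-style linear moves (LIN) for (x, y, z) toolpath
    points in millimetres -- an illustrative stand-in for the .src file
    that KUKA|prc generates; names and syntax details are assumptions."""
    lines = ["DEF perforation()", f"$VEL.CP = {velocity}"]
    for x, y, z in points:
        # C_DIS requests approximate (blended) positioning between moves
        lines.append(f"LIN {{X {x:.1f}, Y {y:.1f}, Z {z:.1f}}} C_DIS")
    lines.append("END")
    return "\n".join(lines)

# Two points of a milling pass along the facade surface:
src = toolpath_to_krl([(0.0, 0.0, 10.0), (100.0, 0.0, 10.0)])
```

Real exports additionally carry tool orientation, base/tool frames, and spindle I/O, which are omitted here for brevity.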

The integration process was verified by producing a full-scale prototypical facade segment.
Perforations on the facade were informed by the iFOBOT-environment heatmap, showing that the
design properly addresses the material and construction parameters, as depicted in Figure 7.

4.3 Case study validating iFOBOT-monitor

The iFOBOT-monitor handles the exchange of perforation-progress monitoring data between the
jobsite and the construction office. It starts by scanning the scene with a 3D camera and
capturing 3D point-cloud data. Next, it converts the point-cloud data to a 3D mesh. Then it
sends the site data to the construction office, where the inspector reviews it in virtual
reality and prepares an inspection report. Finally, the worker on the jobsite receives the
inspection feedback and visualizes it in augmented reality mode.

In the case study, monitoring started by placing a Kinect V2 camera outside the robot arm's
safe movement envelope, manually adjusted to capture the robot arm body, the facade wall, and
the surroundings. The exterior wall of the double skin facade was then cropped from the point
cloud to reduce computational time and data volume, and converted to a 3D mesh in the
iFOBOT-monitor preparation stage. The researchers then used the Speckle plugin to send the 3D
mesh to the construction office. The inspector syncs the BIM model in the office and receives
the scanned 3D mesh from the jobsite. After that, the inspector uses an Oculus Rift S headset
as a virtual reality host to inspect the as-built model and compare it with the designed
facade wall. The Mindesk plugin inside the Grasshopper environment allows the inspector to
zoom in, write notes, take measurements, draw shapes, check quality, monitor coordinates, and
check work progress while in VR mode. The inspector then prepares a feedback report and
connects it to the Fologram plugin. The worker on the jobsite received the updated report on a
smartphone pre-synced with the construction office and connected to the same Wi-Fi network.
Finally, the worker was able to visualize the inspector's report and overlap it with the
as-built model using a QR code. The iFOBOT-monitor is an essential part of iFOBOT because it
makes data exchange about the robot perforation between the jobsite and the construction
office fast and easy to use, since all of the different technologies are connected to the same
platform. Each loop of iFOBOT-monitor takes an average of 17 minutes to complete. Overlapping
the digital model with the as-built point cloud on the construction site made it possible to
check whether the robot was executing the correct joint movements and milling the perforations
correctly, as depicted in Figure 7.
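Cropping the point cloud to the target wall before meshing, as described above, amounts to an axis-aligned bounding-box filter. A minimal sketch, with a hypothetical function name and coordinates in metres; the actual workflow performs this step with point-cloud plugins inside Grasshopper.

```python
def crop_to_wall(points, bbox_min, bbox_max):
    """Keep only the (x, y, z) points inside the axis-aligned box drawn
    around the target facade wall, cutting data volume before the point
    cloud is meshed and sent to the office.  Illustrative sketch."""
    keep = []
    for p in points:
        if all(lo <= c <= hi for c, lo, hi in zip(p, bbox_min, bbox_max)):
            keep.append(p)
    return keep

# A scan with one stray point far outside the wall's bounding box:
scan = [(0.1, 0.2, 1.0), (5.0, 0.2, 1.0), (0.3, 0.3, 1.5)]
wall = crop_to_wall(scan, (0.0, 0.0, 0.5), (1.0, 1.0, 2.0))
```

The same filter could be applied per frame to keep the 17-minute monitoring loop responsive.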

Figure 7. Case study testing point cloud generation and extended reality technologies of
iFOBOT-monitor module

Discussion
At this point in the study, we have successfully built and evaluated an optimized iFOBOT for
on-site double-skin facade perforation. The case study so far provided an understanding of how
one design parameter, the angulation of the perforation, influences the system's design
generation and behavior. The scale and shape of the double skin facade perforation are
controlled by iFOBOT-design and guided by iFOBOT-environment according to whether the design
passes LEED v4. In the case study, the fourth iteration passed the environmental
rule-compliance check. iFOBOT-construct then developed a milling toolpath for the robot arm to
start perforating on the jobsite. The lab test was successful and took nearly one and a half
hours to finish half of the wall-milling task. The iFOBOT-monitor successfully registered a 3D
point cloud of the perforation activity and converted the target wall to a 3D mesh. It then
sent the 3D mesh geometry to the construction office for inspection and quality checking. The
inspector in the construction office used virtual reality connected to the iFOBOT system to
review, comment, take measurements, and check progress. Finally, the worker received the
inspection report on the construction site and visualized it using augmented reality
technologies. One loop of iFOBOT-monitor data passing takes 17 minutes to finish. The Kinect
camera works best in indoor environments with controlled illumination and does not perform
well outdoors. The authors therefore plan to use outdoor depth-capturing sensors such as the
ZED 2 [61] in the future and integrate them with iFOBOT. We will continue to collect and
evaluate data to further improve the heuristic method, and to refine the modules and the
negotiation between them. The full integration of the geometry and the data flowing through
the iFOBOT modules will be introduced as a next step. Another important enhancement to the
framework is improving the dataflow so that input from the building and the simulations always
provides essential feedback.

Conclusion
This paper introduces an integrated concept for a robotic manufacturing workflow that, while
designed for a high-performance ceramic facade system, is easily transferable to other
material systems. Custom Grasshopper components and custom scripts are merged into a single
software framework covering all required actions. This approach to design automation promotes
the cooperation of all team leaders engaged in tailor-made, high-performance façade design,
concept creation, and construction planning. The automated workflow also serves manufacturing
environments directly: it allows iterative simulation of the building geometry, checking that
the geometry generated in the early stages of design performs as predicted.
Environmental optimization has created highly customized perforation forms on the facade,
potentially resulting in a visually rich and complex façade pattern. In practice, there is
widespread interest in parametric design's strictly formal expressions, an interest that
sometimes produces similarly complex geometry even without performance optimization.
Only by rethinking existing manufacturing processes and carrying the related digital design
through into production environments can such highly variable geometries be more readily
implemented in physical form.
Architectural practice has long faced the dilemma of expensive one-off designed architecture
versus inexpensive yet repetitive high-volume industrial manufacturing. Robotic technologies
and related approaches to design automation can potentially provide a third route around this
age-old dilemma. Focusing on environmental efficiency as a configuration engine does not mean
sacrificing architectural quality; the two are no longer mutually exclusive.

Acknowledgment
This research was supported by Haenglim Architecture and Engineering Company and Chung-Ang
University in Seoul, South Korea. The first author would like to thank the Korean Government
Program (GSP) for its continuous support.

References
[1] P. Kurtser, O. Ringdahl, N. Rotstein, R. Berenstein, Y. Edan, In-field grape cluster size
assessment for vine yield estimation using a mobile robot and a consumer level RGB-D
camera, IEEE Robot. Autom. Lett. 5 (2020) 2031–2038.
[2] O. Kontovourkis, G. Tryfonos, C. Georgiou, Robotic additive manufacturing (RAM) with
clay using topology optimization principles for toolpath planning: the example of a building
element, Archit. Sci. Rev. 63 (2020) 105–118.
[3] S. Choi, E. Kim, Design and implementation of vision-based structural safety inspection
system using small unmanned aircraft, in: 2015 17th Int. Conf. Adv. Commun. Technol.
ICACT, IEEE, 2015: pp. 562–567.
[4] M. Soliman, D.M. Frangopol, A. Mondoro, A probabilistic approach for optimizing
inspection, monitoring, and maintenance actions against fatigue of critical ship details,
Struct. Saf. 60 (2016) 91–101.
[5] L. Caldas, Generation of energy-efficient architecture solutions applying GENE_ARCH: An
evolution-based generative design system, Adv. Eng. Inform. 22 (2008) 59–70.
https://doi.org/10.1016/j.aei.2007.08.012.
[6] L. Ding, W. Jiang, Y. Zhou, C. Zhou, S. Liu, BIM-based task-level planning for robotic brick
assembly through image-based 3D modeling, Adv. Eng. Inform. 43 (2020) 100993.
[7] C.P. Chea, Y. Bai, X. Pan, M. Arashpour, Y. Xie, An integrated review of automation and
robotic technologies for structural prefabrication and construction, Transp. Saf. Environ.
(2020).
[8] M. Yablonina, M. Prado, E. Baharlou, T. Schwinn, A. Menges, Mobile robotic fabrication
system for filament structures, Fabr. Rethink. Des. Constr. 3 (2017).
[9] S. Keating, N. Oxman, Compound fabrication: A multi-functional robotic platform for digital
design and fabrication, Robot. Comput.-Integr. Manuf. 29 (2013) 439–448.
https://doi.org/10.1016/j.rcim.2013.05.001.
[10] S. Raman, E. Cherbaka, R.J. Giovacchini, B. Jennings, T.C. Slater, Construction beam
robot, 2020.
[11] M. Pan, T. Linner, W. Pan, H. Cheng, T. Bock, Structuring the context for construction
robot development through integrated scenario approach, Autom. Constr. 114 (2020)
103174.
[12] S. Cai, Z. Ma, M.J. Skibniewski, S. Bao, H. Wang, Construction Automation and Robotics
for High-Rise Buildings: Development Priorities and Key Challenges, J. Constr. Eng.
Manag. 146 (2020) 04020096.
[13] T. Kim, D. Lee, H. Lim, U. Lee, H. Cho, K. Cho, Exploring research trends and network
characteristics in construction automation and robotics based on keyword network
analysis, J. Asian Archit. Build. Eng. (2020).
[14] P. Rea, E. Ottaviano, Design and development of an Inspection Robotic System for indoor
applications, Robot. Comput.-Integr. Manuf. 49 (2018) 143–151.
https://doi.org/10.1016/j.rcim.2017.06.005.
[15] B. Abbasnejad, M.P. Nepal, A. Ahankoob, A. Nasirian, R. Drogemuller, Building
Information Modelling (BIM) adoption and implementation enablers in AEC firms: a
systematic literature review, Archit. Eng. Des. Manag. (2020) 1–23.
[16] S.O. Babatunde, D. Ekundayo, A.O. Adekunle, W. Bello, Comparative analysis of drivers
to BIM adoption among AEC firms in developing countries, J. Eng. Des. Technol. (2020).
[17] A.K. Ali, O.J. Lee, H. Song, Generic design aided robotically facade pick and place in
construction site dataset, Data Brief. 31 (2020) 105933.
https://doi.org/10.1016/j.dib.2020.105933.
[18] A.K. Ali, O.J. Lee, H. Song, Robot-based facade spatial assembly optimization, J. Build.
Eng. (2020) 101556. https://doi.org/10.1016/j.jobe.2020.101556.
[19] M. Kim, I. Konstantzos, A. Tzempelikos, Real-time daylight glare control using a low-cost,
window-mounted HDRI sensor, Build. Environ. 177 (2020) 106912.
https://doi.org/10.1016/j.buildenv.2020.106912.
[20] J. Hook, Automated digital fabrication concept for composite facades, (2016).
[21] B. Bogosian, L. Bobadilla, M. Alonso, A. Elias, G. Perez, H. Alhaffar, S. Vassigh, Work in
Progress: Towards an Immersive Robotics Training for the Future of Architecture,
Engineering, and Construction Workforce, in: 2020 IEEE World Conf. Eng. Educ.
EDUNINE, IEEE, 2020: pp. 1–4.
[22] A. Austin, B. Sherwood, J. Elliott, A. Colaprete, K. Zacny, P. Metzger, M. Sims, H. Schmitt,
S. Magnus, T. Fong, Robotic Lunar Surface Operations 2, Acta Astronaut. (2020).
[23] N. Hack, K. Dörfler, A.N. Walzer, T. Wangler, J. Mata-Falcón, N. Kumar, J. Buchli, W.
Kaufmann, R.J. Flatt, F. Gramazio, Structural stay-in-place formwork for robotic in situ
fabrication of non-standard concrete structures: A real scale architectural demonstrator,
Autom. Constr. 115 (2020) 103197.
[24] S. Alizadehsalehi, A. Hadavi, J.C. Huang, From BIM to extended reality in AEC industry,
Autom. Constr. 116 (2020) 103254.
[25] D.F. Bucher, D.M. Hall, Common Data Environment within the AEC Ecosystem: moving
collaborative platforms beyond the open versus closed dichotomy, (n.d.).
[26] L. Santos, S. Schleicher, L. Caldas, Automation of CAD models to BEM models for
performance based goal-oriented design methods, Build. Environ. 112 (2017) 144–158.
https://doi.org/10.1016/j.buildenv.2016.10.015.
[27] A. Figliola, A. Battisti, Exploring Informed Architectures, in: Post-Ind. Robot., Springer, n.d.:
pp. 1–45.
[28] D. Tish, N. King, N. Cote, Highly accessible platform technologies for vision-guided,
closed-loop robotic assembly of unitized enclosure systems, Constr. Robot. (n.d.) 1–11.
[29] D. Adachi, D. Kawaguchi, Y.U. Saito, Robots and Employment: Evidence from Japan,
1978-2017, (2020).
[30] P.N. Koustoumpardis, N.A. Aspragathos, Intelligent hierarchical robot control for sewing
fabrics, Robot. Comput.-Integr. Manuf. 30 (2014) 34–46.
https://doi.org/10.1016/j.rcim.2013.08.001.
[31] T. Sentenac, F. Bugarin, B. Ducarouge, M. Devy, Automated thermal 3D reconstruction
based on a robot equipped with uncalibrated infrared stereovision cameras, Adv. Eng.
Inform. 38 (2018) 203–215. https://doi.org/10.1016/j.aei.2018.06.008.
[32] M. Amoretti, M. Reggiani, Architectural paradigms for robotics applications, Adv. Eng.
Inform. 24 (2010) 4–13. https://doi.org/10.1016/j.aei.2009.08.004.
[33] A. Luna-Navarro, R. Loonen, M. Juaristi, A. Monge-Barrio, S. Attia, M. Overend, Occupant-
Facade interaction: a review and classification scheme, Build. Environ. 177 (2020) 106880.
https://doi.org/10.1016/j.buildenv.2020.106880.
[34] D.R. Lumpkin, W.T. Horton, J.V. Sinfield, Holistic synergy analysis for building subsystem
performance and innovation opportunities, Build. Environ. 178 (2020) 106908.
https://doi.org/10.1016/j.buildenv.2020.106908.
[35] J. Ota, Multi-agent robot systems as distributed autonomous systems, Adv. Eng. Inform.
20 (2006) 59–70. https://doi.org/10.1016/j.aei.2005.06.002.
[36] A. Braun, S. Tuttas, A. Borrmann, U. Stilla, Improving progress monitoring by fusing point
clouds, semantic data and computer vision, Autom. Constr. 116 (2020) 103210.
https://doi.org/10.1016/j.autcon.2020.103210.
[37] R. Sacks, R. Navon, I. Brodetskaia, A. Shapira, Feasibility of Automated Monitoring of
Lifting Equipment in Support of Project Control, J. Constr. Eng. Manag. 131 (2005) 604–
614. https://doi.org/10.1061/(ASCE)0733-9364(2005)131:5(604).
[38] F.P. Rahimian, S. Seyedzadeh, S. Oliver, S. Rodriguez, N. Dawood, On-demand
monitoring of construction projects through a game-like hybrid application of BIM and
machine learning, Autom. Constr. 110 (2020) 103012.
[39] D. Rebolj, Z. Pučko, N.Č. Babič, M. Bizjak, D. Mongus, Point cloud quality requirements for
Scan-vs-BIM based automated construction progress monitoring, Autom. Constr. 84
(2017) 323–334. https://doi.org/10.1016/j.autcon.2017.09.021.
[40] I.T. Weerasinghe, J.Y. Ruwanpura, J.E. Boyd, A.F. Habib, Application of Microsoft Kinect
sensor for tracking construction workers, in: Constr. Res. Congr. 2012 Constr. Chall. Flat
World, 2012: pp. 858–867.
[41] F. Sun, M. Liu, Y. Wang, H. Wang, Y. Che, The effects of 3D architectural patterns on the
urban surface temperature at a neighborhood scale: Relative contributions and marginal
effects, J. Clean. Prod. 258 (2020) 120706.
[42] P. Henry, M. Krainin, E. Herbst, X. Ren, D. Fox, RGB-D mapping: Using Kinect-style depth
cameras for dense 3D modeling of indoor environments, Int. J. Robot. Res. 31 (2012) 647–
663. https://doi.org/10.1177/0278364911434148.
[43] L. Caruso, R. Russo, S. Savino, Microsoft Kinect V2 vision system in a manufacturing
application, Robot. Comput.-Integr. Manuf. 48 (2017) 174–181.
https://doi.org/10.1016/j.rcim.2017.04.001.
[44] A. Bhatla, S.Y. Choe, O. Fierro, F. Leite, Evaluation of accuracy of as-built 3D modeling
from photos taken by handheld digital cameras, Autom. Constr. 28 (2012) 116–127.
https://doi.org/10.1016/j.autcon.2012.06.003.
[45] O.-S. Kwon, C.-S. Park, C.-R. Lim, A defect management system for reinforced concrete
work utilizing BIM, image-matching and augmented reality, Autom. Constr. 46 (2014) 74–
81. https://doi.org/10.1016/j.autcon.2014.05.005.
[46] C.-S. Park, D.-Y. Lee, O.-S. Kwon, X. Wang, A framework for proactive construction defect
management using BIM, augmented reality and ontology-based data collection template,
Autom. Constr. 33 (2013) 61–71. https://doi.org/10.1016/j.autcon.2012.09.010.
[47] J.M. Davila Delgado, L. Oyedele, P. Demian, T. Beach, A research agenda for augmented
and virtual reality in architecture, engineering and construction, Adv. Eng. Inform. 45
(2020) 101122. https://doi.org/10.1016/j.aei.2020.101122.
[48] Y. Bae, T.-K. Vu, R.-Y. Kim, Implemental Control Strategy for Grid Stabilization of Grid-
Connected PV System Based on German Grid Code in Symmetrical Low-to-Medium
Voltage Network, IEEE Trans. Energy Convers. 28 (2013) 619–631.
https://doi.org/10.1109/TEC.2013.2263885.
[49] X. Sun, J. Bao, J. Li, Y. Zhang, S. Liu, B. Zhou, A digital twin-driven approach for the
assembly-commissioning of high precision products, Robot. Comput.-Integr. Manuf. 61
(2020) 101839. https://doi.org/10.1016/j.rcim.2019.101839.
[50] T. Schwinn, A. Menges, Fabrication agency: Landesgartenschau exhibition hall, Archit.
Des. 85 (2015) 92–99.
[51] Grasshopper - algorithmic modeling for Rhino, (n.d.). https://www.grasshopper3d.com/
(accessed May 22, 2020).
[52] Rhino 6 for Windows and Mac, (n.d.). https://www.rhino3d.com/ (accessed May 22, 2020).
[53] M.S. Roudsari, M. Pak, A. Smith, Ladybug: a parametric environmental plugin for
grasshopper to help designers create an environmentally-conscious design, in: Proc. 13th
Int. IBPSA Conf. Held Lyon Fr. Aug, 2013: pp. 3128–3135.
[54] E. Naboni, M. Meloni, C. Mackey, J. Kaempf, The Simulation of Mean Radiant
Temperature in Outdoor Conditions: A review of Software Tools Capabilities, in:
International BPSA Conf., 2019.
[55] Association for Robots in Architecture | KUKA|prc, (n.d.).
https://www.robotsinarchitecture.org/kuka-prc (accessed May 28, 2020).
[56] S. Alhadidi, Quokka Plugin (Real-Time 3D Scanning), 2017.
http://alhadidi.com/quokka-plugin-real-time-3d-scanning/ (accessed May 22, 2020).
[57] Firefly Experiments, Firefly Exp. (n.d.). http://www.fireflyexperiments.com (accessed May
22, 2020).
[58] Speckle Docs / Grasshopper Basics, (n.d.).
https://speckle.systems/docs/clients/grasshopper/basics/ (accessed May 22, 2020).
[59] Rhino – Mindesk, (n.d.). https://mindeskvr.com/rhino/ (accessed August 10, 2020).
[60] Fologram | Food4Rhino, (n.d.). https://www.food4rhino.com/app/fologram (accessed June
11, 2020).
[61] ZED Stereo Camera | Stereolabs, (n.d.). https://www.stereolabs.com/zed/ (accessed
August 11, 2020).
[Figure: iFOBOT framework overview. Input (building geometry, interior skin facade, geospatial location, regulations, constraints) feeds iFOBOT-environment (daylight-factor analysis, LEED environmental analysis) and iFOBOT-construct (trade-off matrices: collision check, singularity, reachability, movement travel time); iFOBOT-design and iFOBOT-monitor close the loop inside a 3D parametric environment (Rhinoceros/Grasshopper with Ladybug, Honeybee, Python open-source scripting, KUKA robot control, Fologram, Firefly, and the office BIM model). Output: geometry (.obj), perforation drawings (.dwg), environmental analysis (.txt), KUKA KRL file (.src), and construction progress steps (.dat). Legend: arrows indicate data passing, geometric feedback, and analysis feedback.]

[Figure: iFOBOT-environment workflow. Weather data, location, and the target building envelope (double skin facade, interior and exterior layers) drive sun-path control and climate-based performance simulations, including sunlight-hours analysis for shading design and 3D sun-path visualization. A threshold selector flags the part of the heatmap exceeding 250 sunlight hours per year and the interior space receiving fewer than 100 hours per year, feeding the Annual Sun Exposure (ASE) and LEED v4 daylight-credit rule-compliance check. ASE measures the percentage of floor area that receives more than 1000 lux of direct sun only; spaces with more than 10% ASE will likely cause visual discomfort. The best-performing facade passes to the iFOBOT-design parametric controller (facade design, perforation configuration) and on to iFOBOT-construct.]

[Figure: iFOBOT-construct workflow. Input: building-envelope dimensions and rotation, robot initial location, milling-spindle model, robot brand, and the construction-environment 3D model with constraints, policies, and specifications. The KUKA|prc core setting (command, analysis output, collision, reachability, time values) drives a) facade panelization, b) robot safe-working-window definition (circular and rectilinear vertical windows with minimum, ideal, and maximum distances), and c) perforation-activity workstation selection, plus d)-e) robot-arm safe-working envelopes. Output: workstations (.gh), quantity take-off (.xls), BIM environment (.obj), and a KUKA KRL file (.src).]

[Figure: iFOBOT-monitor workflow. Predefined settings (scanner location, field of view, cropped target view, resolution, color coding, workstation locations, BIM LOD, overlay scale). On the jobsite: 3D laser scan (Xbox Kinect V2) and point-cloud modeling (Quokka, Firefly, Tarsier, and Speckle inside Grasshopper/Rhinoceros). In the construction office: virtual-reality inspection (Mindesk inside Grasshopper, Oculus Rift S) with layered visualization (BIM model, point-cloud/BIM overlay, notes and drawings) for progress check, quality check, adding notes, viewing the schedule, and comparing with the BIM 3D model. Feedback returns to the worker's smartphone via Fologram AR. Stages: scan, prepare, inspect.]

[Figure 5 content: case-study room (interior space, interior wall, exterior wall) and LEED rule-compliance results for the four exterior-layer opening designs: design 1, 54% of space with > 250 hours, LEED status C; design 2, 0%, status A; design 3, 20%, status B; design 4, 1%, status A; location 471-080 Galmae-dong, Guri-si, Gyeonggi-do, South Korea; the passing design yields the perforated facade.]

[Figure 6 content: a) exterior-skin perforation process (perforations, robot toolpath, robot travel distance, one robot position serving both perforation clusters); b) robot-arm reachability and location optimization; c) lab test of the perforation activity in the construction site (lab room borders, Kinect camera location); d) robot-arm joint simulation positions showing no collision or reachability issues.]

[Figure 7 content: 3D laser scan (cropping the target facade for 3D modeling, point-cloud capture), point-cloud modeling (3D mesh geometry, zoomed views of detailed perforation), VR review of the as-built model (add notes, view schedule, compare with the BIM 3D model, progress, defect, and quality checks), and AR feedback on the jobsite: receiving comments via QR code with digital-model/as-built overlap for progress monitoring.]
Author Agreement

CRediT author statement


Ahmed Khairadeen Ali: Visualization, Software, Writing - Review and Editing,
Writing - Original Draft Preparation. One Jae Lee: Methodology, Conceptualization,
Validation, Investigation. Hayub Song: Supervision, Data curation.
Declaration of Interest Statement

Declaration of interests

☒ The authors declare that they have no known competing financial interests or personal relationships
that could have appeared to influence the work reported in this paper.

☐The authors declare the following financial interests/personal relationships which may be considered
as potential competing interests:
