J Forsciint 2019 110006
PII: S0379-0738(19)30418-9
DOI: https://doi.org/10.1016/j.forsciint.2019.110006
Reference: FSI 110006
Please cite this article as: Sieberth T, Dobay A, Affolter R, Ebert L, A toolbox for the rapid
prototyping of crime scene reconstructions in virtual reality, Forensic Science International
(2019), doi: https://doi.org/10.1016/j.forsciint.2019.110006
This is a PDF file of an article that has undergone enhancements after acceptance, such as
the addition of a cover page and metadata, and formatting for readability, but it is not yet the
definitive version of record. This version will undergo additional copyediting, typesetting and
review before it is published in its final form, but we are providing this version to give early
visibility of the article. Please note that, during the production process, errors may be
discovered which could affect the content, and all legal disclaimers that apply to the journal
pertain.
1 Institute of Forensic Medicine, University of Zurich, Winterthurerstrasse 190/52, CH-8057 Zurich, Switzerland
2 3D Zentrum Zurich, University of Zurich, Winterthurerstrasse 190/52, CH-8057 Zurich, Switzerland
Highlights
The use of virtual reality in forensic virtual scenes requires a set of tools.
Interaction with and modification of virtual crime scenes allow for discussion of the scenes.
Already a small number of tools in virtual scenes supports a variety of forensic applications.
Abstract
Virtual reality is currently finding its way into forensic work. The required 3D data are nowadays a standard dataset available in many cases, from homicides to traffic collisions, and include not only data from the scene but also data on weaponry and the persons involved. Current investigations use these 3D data to replicate the incident and as a basis for discussion among forensic personnel. However, modifying the scene on a 2D viewport is often cumbersome due to the loss of the third dimension, and a 3D operator is often required to perform the modifications on the scene. Virtual reality might improve this step through its ease of use and by visualising the third dimension. This publication presents a variety of tools that can be used in forensic investigations. In addition to the tools, examples of their forensic use are presented, showing that already a small number of tools supports a variety of forensic applications.
Keywords:
questions regarding the sequence of events, gunshot trajectories and other forensic
evidence can be answered [16–19].
The general workflow for conducting a 3D reconstruction consists of four main steps. First,
the 3D data must be generated, which means employing the correct modality to document a
certain object, location or medical finding. Second, all data must be prepared for the
reconstruction. In this step, polygon models of relevant anatomical structures are extracted
from the volume datasets of medical scans, photogrammetric models and their textures are calculated from two-dimensional digital photos, and characters are prepared for animation by
rigging them to a kinematic bone system. If required, the polygon count of the resulting
models is reduced or parts of a scene are rebuilt with existing 3D data based on low-polygon
primitives. The purpose of this step is to convert all the data into a polygonal mesh and
import them into the 3D animation software. In the third step, the person performing the
reconstruction confers with field experts to decipher the data in the context of the forensic
question that must be answered. Because of the complexity and the amount of data that can
be available, this step can take a considerable amount of time. Subsequently, when the
traces fit together, the reconstruction is performed, and the visualizations are created - either
as 2D renderings or as virtual reality (VR) scenes [20, 21]. Therefore, the final reconstruction
can consist of multiple plausible scenarios that are dependent on the forensic question and
the available traces, evidence and materials.
What makes these reconstructions time consuming and therefore expensive is the fact that
while the data are in 3D, the display used for the reconstructions as well as the input devices
(the computer mouse and keyboard) work in 2D. Due to a lack of information and input
capabilities, converting several polygon models into the desired position can be tedious and
can require constant adjustment of the viewport perspective and the 3D object.
In 2014, we proposed a system to display crime scene reconstructions to state attorneys [20].
Since then, VR has been used routinely for incident reconstructions, virtual crime scene
visits or even forensic medical examinations [22, 23]. Current-generation VR systems
consist of 2 to 3 components. First, a VR headset displays a perspectively correct image to
the user. Second, controllers, whose positions are also tracked by the system, allow the user
to interact with the virtual environment. Finally, some systems employ external trackers or
lighthouses as external points of reference [24, 25]. The combination of 3D virtual reality
headsets in conjunction with tracked controllers offers more than just visualizing static crime
scene reconstructions. In this article, we present a new application for using VR techniques
to accelerate 3D incident reconstructions.
2. Methods
To perform a 3D reconstruction in VR, several steps are required. First, the recorded data
are prepared. In this step, all the data are converted to polygon meshes and cropped to the
required area of interest to avoid data clutter and subsequent lagging in the VR visualization.
For large datasets, this process can also include reducing the polygon count and finding a
balance between the model detail and the polygon load. This step is performed in dedicated
software, depending on the modality. After the data are prepared as needed, scene
integration in Unity (Version 2018.1.8f1 Personal, Unity Technologies, San Francisco, United
States) is performed, and the required reconstructions are implemented. A preliminary
reconstruction is performed in VR. Subsequently, the reconstructed scene is exported and
transferred to other 3D software, such as CAD software, to perform fine adjustments, render
more realistic images or fit the scene in a larger context.
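The preparation step described above, cropping the data to the required area of interest to avoid clutter and lag, can be illustrated with a minimal sketch. The function names and the brute-force bounding-box test are assumptions for illustration; the actual pipeline uses dedicated software and Unity, not this code.

```python
def inside(v, lo, hi):
    """True if vertex v = (x, y, z) lies inside the axis-aligned box [lo, hi]."""
    return all(lo[i] <= v[i] <= hi[i] for i in range(3))

def crop_mesh(vertices, triangles, lo, hi):
    """Keep only the triangles whose three vertices all lie inside the
    region of interest, and re-index the surviving vertices."""
    keep = [t for t in triangles if all(inside(vertices[i], lo, hi) for i in t)]
    used = sorted({i for t in keep for i in t})
    remap = {old: new for new, old in enumerate(used)}
    new_vertices = [vertices[i] for i in used]
    new_triangles = [tuple(remap[i] for i in t) for t in keep]
    return new_vertices, new_triangles
```

Reducing the polygon count itself (decimation) is a separate, more involved operation usually done in the modality-specific software.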
These tools allow moving around in the scene, manipulating elements within the scene (moving, rotating), measuring, placing indicators and trajectories, taking screenshots, replacing the controller with objects and scaling the user for a better overview or for increased detail. It is important to note that modifications of the polygon meshes or of the texture content within the virtual environment are deliberately not implemented, in order to maintain the integrity of the data. Different tools can be selected and used with the controller. The control scheme varies from tool to tool, and the button functionalities are displayed in the virtual environment. All the scripts developed for this project are available upon request.
2.2.1 Teleportation
One of the most important features is the teleportation function provided with the SteamVR asset. This feature allows the user to teleport instantly from one position to another, effectively covering large distances within a fraction of a second. Furthermore, it is possible to limit the area that a user can reach by restricting the teleportation area, and it is possible to define points of interest (POIs) by adding teleportation markers that can be reached with the teleportation feature [29].
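The area-limiting behaviour reduces to a point-in-region test on the requested target position. The following is a minimal sketch; the `teleport` function and the rectangular floor areas are hypothetical simplifications of what the SteamVR asset actually provides.

```python
def teleport(target, allowed_areas):
    """Accept a teleport request only if the target (x, z floor position)
    lies inside one of the allowed rectangular teleportation areas;
    otherwise reject the request by returning None."""
    x, z = target
    for x0, z0, x1, z1 in allowed_areas:
        if x0 <= x <= x1 and z0 <= z <= z1:
            return target
    return None
```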
The first tool is a screenshot tool, which allows the user to take virtual photos of the current
view in the VR environment. The screenshots with dates and timestamps are saved as
digital files in the case folder. The native screenshot resolution is the resolution displayed by
the VR HMD and can be adjusted to render higher quality images with larger pixel counts.
However, a larger pixel count does not necessarily result in a better image because the
texture on the 3D model and polygon count are limiting parameters for the screenshot quality
[22].
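Saving the virtual photos with dates and timestamps in the case folder might look like the following sketch. The folder layout, the file-name pattern and the `supersample` parameter for rendering larger pixel counts are illustrative assumptions, not the project's actual implementation.

```python
from datetime import datetime
from pathlib import Path

def screenshot_path(case_folder, when=None, supersample=1):
    """Build a date- and timestamped file name for a virtual photo inside
    the case folder. `supersample` is the factor by which the native HMD
    resolution is multiplied for higher-quality renders."""
    when = when or datetime.now()
    stamp = when.strftime("%Y%m%d_%H%M%S")
    return Path(case_folder) / f"screenshot_{stamp}_x{supersample}.png"
```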
The measure tool allows the use of VR controllers to measure distances. A scalebar can be
placed into the scene via dragging. Upon the release of the button, the scale bar is
visualized, and the total length of the ruler is displayed. By placing multiple markers, it is
possible to measure around curves as well (Fig 1).
Figure 1. Measurement tool measuring the diameter of an injury with intermediate
points along the curvature of the round object surface and the straight line distance
between two tables. Both measurements are performed in meters.
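Measuring around curves with intermediate markers, as in Fig 1, amounts to summing the segment lengths between consecutive markers. A minimal sketch, assuming marker positions in metres:

```python
import math

def polyline_length(markers):
    """Total length of a measurement made of consecutive markers, in the
    same units as the marker coordinates (metres in the VR scene).
    Intermediate markers allow measuring along curved surfaces."""
    return sum(math.dist(a, b) for a, b in zip(markers, markers[1:]))
```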
Figure 2. Trajectory tool. The red line at the bottom is the scanned probe that marks a bullet in the wall and the presumed bullet trajectory. Encircled in green is a cylinder showing the manually extended trajectory, which does not account for possible deviation in the firing trajectory.
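Extending a scanned probe into a presumed trajectory, as in Fig 2, is a straight-line extrapolation from two points on the probe. A sketch under that assumption (no deviation cone is modelled, matching the caption's caveat):

```python
import math

def extend_trajectory(p0, p1, length):
    """Return the point `length` units beyond p1 on the line through p0
    and p1, i.e. the straight extension of a scanned probe."""
    d = [b - a for a, b in zip(p0, p1)]
    n = math.sqrt(sum(c * c for c in d))
    return tuple(b + length * c / n for b, c in zip(p1, d))
```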
Downscaling of the user might be necessary whenever there are traces in inaccessible areas such as the floor or ceiling. Downscaling also helps with visualizing small objects or fine textures, which can usually be challenging in VR due to the relatively low resolution of the display. Furthermore, downscaling helps with fine controller-based movements, as it suppresses hand-jitter effects. Upscaling of the user might be required to obtain an overview of a larger scene or to move faster over longer distances.
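The jitter-suppression effect of downscaling can be sketched as a scale factor applied to controller motion: at user scale 0.1, a 1 cm physical hand movement becomes a 1 mm movement in the scene. The function below is an illustrative simplification, not the actual SteamVR implementation.

```python
def world_delta(controller_delta, user_scale):
    """Map a physical controller movement (metres) to a world-space
    movement for a scaled user; a scale below 1 yields proportionally
    finer motion, suppressing hand jitter."""
    return tuple(c * user_scale for c in controller_delta)
```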
In this mode, two types of movement are possible: a combination of walking and teleporting
Figure 3. Scaling and movement. The image on the left shows the scaled user, and the red figure appears proportionally large. The image on the right shows the user hovering over the scene.
One of the key functionalities for reconstruction is interacting with objects in the virtual environment. After all objects have been loaded into the scene using the Unity3D user interface, they can be given the "interactive object" property; objects without this property, such as the floor or the walls, cannot be altered. Objects that are interactive can be grabbed using the controller. Grabbed objects can be freely moved and rotated. To avoid accidental grabs, it is also possible to freeze objects in their position, which requires the user to unfreeze an object before it can be moved or adjusted. Interactive, unfrozen objects are highlighted when the controller touches them. To allow the user to perform matching operations, the transparency of objects with single-texture maps can also be adjusted (Fig 4).
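The interaction rules described above, where only objects flagged as interactive can be grabbed and frozen objects must first be unfrozen, amount to a small state machine. A sketch with hypothetical names; the project's actual scripts are Unity/C# components:

```python
class SceneObject:
    """Minimal model of the interaction rules: only interactive objects
    can be grabbed, and frozen objects must be unfrozen before moving."""
    def __init__(self, name, interactive=False):
        self.name = name
        self.interactive = interactive
        self.frozen = False
        self.position = (0.0, 0.0, 0.0)

    def grab_and_move(self, new_position):
        if not self.interactive:
            return False  # e.g. floor or walls: cannot be altered
        if self.frozen:
            return False  # must be unfrozen first
        self.position = new_position
        return True
```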
Figure 5. Controller attachment made visible in reality.
The positions of the headset and controllers are tracked relative to the scene, allowing subsequent revisualization of the performed actions and movements, which can be used in analyses or with witness statements [23] (Fig 6). In cases of 3D VR reconstruction, all alterations to the scene are documented.
Figure 6. The recorded motion path of a user. The headset and controllers are visible and can move along the original motion path, which is visualized by the red (HMD), green (left controller) and blue (right controller) paths.
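Recording the motion paths shown in Fig 6 reduces to logging timestamped positions per tracked device and looking them up during replay. A minimal sketch (nearest-sample lookup is used here instead of interpolation, as an illustrative simplification):

```python
class MotionRecorder:
    """Records timestamped poses of the headset and controllers so that
    the motion paths can be revisualized later."""
    def __init__(self):
        self.tracks = {}  # device name -> list of (time, position) samples

    def record(self, device, t, position):
        self.tracks.setdefault(device, []).append((t, position))

    def pose_at(self, device, t):
        """Return the recorded position closest in time to t."""
        samples = self.tracks[device]
        return min(samples, key=lambda s: abs(s[0] - t))[1]
```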
2.2.11 Exporting
An important step is to save the adjustments and reconstructions of the scene and to export
the scene to the actual reconstruction software. For this purpose, relevant single objects can
be exported from the Unity3D hierarchy in the .obj format, including their position and rotation attributes.
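An .obj export can be sketched as writing `v` lines with the object's transform baked into the vertex data, followed by 1-based `f` lines for the faces. Rotation handling and Unity's axis conventions are omitted for brevity; this is not the exporter used in the project, only an illustration of the file format.

```python
def export_obj(vertices, faces, translation=(0.0, 0.0, 0.0)):
    """Write a minimal Wavefront .obj string for one scene object,
    baking its translation into the vertices. .obj face indices
    are 1-based."""
    tx, ty, tz = translation
    lines = [f"v {x + tx} {y + ty} {z + tz}" for x, y, z in vertices]
    lines += ["f " + " ".join(str(i + 1) for i in face) for face in faces]
    return "\n".join(lines) + "\n"
```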
The LEAP Motion tracker allows fingers and hands to perform actions within the field of view (Fig 7).
Figure 7. Finger tracking using the Leap controller on the HMD. Open hands and
pointing index fingers can easily be differentiated and increase the usability.
Grasping is a feature that allows the user to grasp objects and move them around before opening the hand and releasing the object, which is similar to the interaction performed with the controllers.
Pointing at objects in VR is difficult for external viewers to see. For this purpose, the LEAP Motion tracker was enabled to recognize a gesture in which the index finger is pointing while the other fingers are retracted. When this gesture is recognized, the direction in which the index finger points is represented by a "laser" emerging from the tip of the finger.
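The pointing gesture can be approximated by checking that only the index fingertip is far from the palm, and the "laser" is then a ray from the fingertip. The 7 cm threshold and the joint names are illustrative guesses, not values from the LEAP SDK:

```python
import math

def is_pointing(palm, fingertips, threshold=0.07):
    """Detect a pointing gesture: the index fingertip is extended (far
    from the palm) while all other fingertips are retracted.
    `fingertips` maps finger names to (x, y, z) positions in metres."""
    def extended(tip):
        return math.dist(palm, tip) > threshold
    others = [f for f in fingertips if f != "index"]
    return extended(fingertips["index"]) and not any(
        extended(fingertips[f]) for f in others)

def pointing_ray(index_base, index_tip):
    """Origin and normalized direction of the 'laser' from the fingertip."""
    d = [b - a for a, b in zip(index_base, index_tip)]
    n = math.sqrt(sum(c * c for c in d))
    return index_tip, tuple(c / n for c in d)
```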
3. Discussion
In this article, we present a system that allows rapid prototyping of incident scene
reconstructions in virtual reality.
The proposed system can incorporate a wide variety of scanning modalities, including laser
scans of crime scenes, surface scans of objects and medical scans such as CT and MRI.
The functionality of the system enables the user to quickly understand the data and explore the scene in the context of an intuitively formulated forensic question. Because of the natural interaction with the data compared to working with animation software such as 3ds Max, less time is lost during this phase of the reconstruction, thus potentially reducing the costs.
The VR system can be easily integrated into the reconstruction workflow, provided that sufficient space is available. Because the system is based on off-the-shelf gaming hardware and software, the costs and entry threshold are relatively low. To date, the system has been used in four real cases that involved matching shoeprints and other injury-inflicting tools.
Thus far, two applications for VR have been developed within the forensic holodeck project.
State attorneys can visualize crime scene reconstructions and visit virtual crime scenes in which eyewitness accounts are provided in VR. During the virtual crime scene visit, data such as the witness's position, the rotation of the head and hands, audio and gunshot trajectories can be recorded. These data can be incorporated into the VR reconstruction environment as well, allowing the presented system to bridge the pure VR visualization and the crime scene visit, and also to show discrepancies between the witness's statement and the forensic reconstruction.
However, there are some limitations to the presented system. It is important to carefully evaluate for each case whether the use of VR is indicated or whether traditional methods are more suitable. The use of VR is currently limited to evaluation and discussion purposes and can aid the process of answering forensic questions. The VR reconstruction is limited to 3D data only, and routine reconstruction tasks such as height estimations or reconstructions based on camera footage cannot be performed. Using finger trackers to reposition objects in the scene is possible but much less accurate than using the provided controllers. Currently, the data must be manually imported and exported between the VR environment and the reconstruction software. A wide variety of data types, ranging from point clouds over polygon meshes from surface scanners to volumetric data from medical scans, are used for reconstructions. Unity, as gaming software, is however limited to visualizing relatively low-resolution polygon meshes. This visualization requires an additional processing step in which all the data are converted to polygon meshes with a reduced polygon count. Switching to a platform other than Unity might enable us to visualize the scanned data directly, further increasing the reconstruction speed. Future studies should
investigate the amount of time that could be saved by using VR for reconstructions with
respect to the type of forensic question.
Conflicts of Interest
There are no conflicts of interest.
Funding
No funding was received for this research.
Ethical Approval
No ethical approval was required for this article.
Author Contributions
Software: programming and software development; implementation of the computer code and supporting algorithms; testing of existing code components.
Validation: verification, whether as a part of the activity or separate, of the overall replication/reproducibility of results/experiments and other research outputs. Lars Ebert, Till Sieberth, Raffael Affolter
Investigation: conducting a research and investigation process, specifically performing the experiments, or data/evidence collection. Till Sieberth
Data curation: management activities to annotate (produce metadata) and scrub data. Akos Dobay, Raffael Affolter
Supervision: oversight and leadership responsibility for the research activity planning and execution, including mentorship external to the core team.
Acknowledgements
The authors express their gratitude to Emma Louise Kessler, MD for her generous donation
to the Zurich Institute of Forensic Medicine, University of Zurich, Switzerland. We also thank
Nicolas Krismer for his help with the 3D printed controller attachment.
References
1. Franckenberg S, Flach PM, Gascho D, Thali MJ, Ross SG Postmortem computed
tomography-angiography (PMCTA) in decomposed bodies - A feasibility study. J
Forensic Radiol Imaging 2015; 3:226–234.
2. Filograna L, Thali M Post-mortem computed tomography (PMCT) imaging of the
lungs: pitfalls and potential misdiagnosis. 2013; 1–13.
3. Laberke PJ, Ampanozi G, Ruder TD, Gascho D, Thali MJ, Fornaro J Fast three-
dimensional whole-body post-mortem magnetic resonance angiography. J Forensic
Radiol Imaging 2017; 10:41–46.
4. Ampanozi G, Schwendener N, Krauskopf A, Thali MJ, Bartsch C Incidental occult
gunshot wound detected by postmortem computed tomography. Forensic Sci Med
Pathol 2013; 9:68–72.
5. Thali MJ, Ross S, Oesterhelweg L, Grabherr S, Buck U, Naether S, et al Virtopsy.
Working on the future of forensic medicine. Rechtsmedizin 2007; 17:7–12.
6. Schweitzer W, Röhrich E, Schaepman M, Thali MJ, Ebert L Aspects of 3D surface
scanner performance for post-mortem skin documentation in forensic medicine using
rigid benchmark objects. J Forensic Radiol Imaging 2013; 1:167–175.
7. Ruder TD, Thali MJ, Hatch GM Essentials of forensic post-mortem MR imaging in
adults. Br J Radiol 2014; 87:.
8. Michienzi R, Meier S, Ebert LC, Martinez RM, Sieberth T Comparison of forensic photo-documentation to a photogrammetric solution using the multi-camera system "Botscan." Forensic Sci Int 2018; 288:46–52.
9. Breitbeck R, Ptacek W, Ebert L, Furst M, Kronreif G Virtobot - A Robot System for
Optical 3D Scanning in Forensic Medicine. 2014; 84–91.
10. Kottner S, Ebert LC, Ampanozi G, Braun M, Thali MJ, Gascho D VirtoScan - a mobile,
low-cost photogrammetry setup for fast post-mortem 3D full-body documentations in
x-ray computed tomography and autopsy suites. Forensic Sci Med Pathol 2017;
13:34–43.
11. Leipner A, Baumeister R, Thali MJ, Braun M, Dobler E, Ebert LC Multi-camera system
for 3D forensic documentation. Forensic Sci Int 2016; 261:123–128.
12. Buck U, Albertini N, Naether S, Thali MJ 3D documentation of footwear impressions
and tyre tracks in snow with high resolution optical surface scanning. Forensic Sci Int
2007; 171:157–164.
13. Franckenberg S, Binder T, Bolliger S, Thali MJ, Ross SG Just Scan It! - Weapon
Reconstruction in Computed Tomography on Historical and Current Swiss Military
20. Ebert LC, Nguyen TT, Breitbeck R, Braun M, Thali MJ, Ross S The forensic holodeck:
an immersive display for forensic crime scene reconstructions. Forensic Sci Med
Pathol 2014; 10:623–626.
21. Thali MJ, Braun M, Buck U, Aghayev E, Jackowski C, Vock P, et al VIRTOPSY—
Scientific Documentation, Reconstruction and Animation in Forensic: Individual and
Real 3D Data Based Geo-Metric Approach Including Optical Body/Object Surface and
Radiological CT/MRI Scanning. J Forensic Sci 2005; 50:1–15.
22. Koller S, Ebert LC, Martinez RM, Sieberth T Using virtual reality for forensic
examinations of injuries. Forensic Sci Int 2019; 295:30–35.
23. Sieberth T, Dobay A, Affolter R, Ebert LC Applying virtual reality in forensics – a
virtual scene walkthrough. Forensic Sci Med Pathol 2019; 15:41–47.
24. HTC Corporation Vive | Discover Virtual Reality. 2017
25. Oculus VR LLC Oculus. 2017
26. Valve Corporation SteamVR Plugin. 2019.
https://assetstore.unity.com/packages/tools/integration/steamvr-plugin-32647.
Accessed 11 Apr 2019.
27. Unity Technologies Unity. 2019. https://unity.com/. Accessed 11 Apr 2019.
28. HTC Corporation VIVETM | Discover Virtual Reality Beyond Imagination. 2019.
https://www.vive.com/eu/. Accessed 11 Apr 2019.
29. Kersten TP, Büyüksalih G, Tschirschwitz F, Kan T, Deggim S, Kaya Y, et al The Selimiye Mosque of Edirne, Turkey - an immersive and interactive virtual reality experience using HTC Vive. Int Arch Photogramm Remote Sens Spat Inf Sci - ISPRS Arch 2017; 42:403–409.