2005 08 15 Grimm Blender For MR
Abstract

This paper presents an approach in which an extended version of Blender supports the development of content-rich Mixed Reality (MR) applications. Combining virtual and real worlds is complex and time-consuming, but many problems in building 3D geometries and in building MR applications are similar. Therefore, Blender is a natural candidate for being part of an MR authoring environment. Based on the requirements of MR applications, it is shown how Blender can be extended to fulfill these requirements (e.g. by integrating a tracking system and a video texture). A proof-of-concept implementation is presented, as well as an outlook on how MR technologies could be used to ease geometric modeling in Blender.

Introduction

Mixed Reality was defined by Milgram and Kishino as the merging of real and virtual worlds somewhere along the 'virtuality continuum' which connects completely real environments to completely virtual ones [1]. In addition to the complex authoring process needed to build an interactive 3D application, several further tasks are required to integrate MR features into an application. Mainly four additional tasks are necessary. The first task is to integrate tracking devices (e.g. mechanical or vision-based) into the runtime framework in order to use the real world as an input device. This integration also includes providing techniques for stable calibration in order to adapt to changing hardware as well as to changing conditions, e.g. different lighting conditions. Second, relationships between real and virtual objects have to be administrated; this includes describing geometric as well as logical dependencies between real and virtual objects. Third, merging of the real and the virtual world has to be enabled. Fourth, visual correctness should be reached; this includes correct occlusion between real and virtual objects, as well as virtual objects casting shadows into the real world and vice versa.

Existing technologies for MR application creation facilitate development on different levels of abstraction. These range from purely library-type technologies, which require programming skills, over script-based technologies, to out-of-the-box technologies, which allow development with dedicated visual editors. Examples are AMIRE [2][3], Arvika [4], DART [5], DWARF [6], Tinmith [7], Studierstube with APRIL [8] and MARS [9]. However, no single tool exists that can be used both during content authoring and while running the application.

Concept

Blender's unique combination of a modeling tool with an integrated game engine as runtime environment allows its use as an authoring tool as well as a runtime framework. To support the described authoring process of MR applications on a technical level, two Blender extensions, for tracking and for merging, are necessary. To fulfill the requirements of the first task (tracking), a marker-based tracking library was added to Blender. To allow merging of real video streams with the virtual world, a video texture was added. The second task (administration of relationships) and the fourth task (visual correctness) are accomplished using the building blocks of Blender's visual editor as well as the Python integration. Both extensions (tracking and merging) are provided as Python modules written in C++. Compared to a pure C++ extension of Blender, this approach later allows easy access either from the Python scripting interface or even as a logic brick in the visual editor. For the development of the Python modules, CXX [10] was used.

Fig 1: live video texture on a cube

The ARToolkit [11] is used for tracking and the frame grabber library [12] for video capturing. With this library, live video as well as video playback can be used. Both plug-ins are accessible through special Blender nodes. The position and orientation of a tracking node (called MRBTrackingNode) is controlled by the tracking extension. Each MRBTrackingNode can be used as a parent node for one or more Blender nodes; thus, it is possible to assign a geometry node to a marker. A property containing the filename of the marker specifies which marker is assigned to which MRBTrackingNode. The video is shown via an extended plane node (called MRBVideoPlane). It is displayed as a video texture on the plane and can be used with arbitrary objects (see Fig 1). A property controls whether a live video stream or a recorded one is shown.
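The parenting mechanism described above can be sketched in plain Python. This is not the actual Blender extension's API; the class names, the marker filename, and the simplified 2D pose are illustrative assumptions that only show how a marker-driven tracking node propagates its pose to attached geometry nodes.

```python
# Illustrative sketch (not the real MRB extension): a marker-driven tracking
# node propagates its pose to child geometry nodes. Poses are simplified to a
# 2D translation plus a rotation angle for brevity.
import math

class TrackingNode:
    """Stands in for an MRBTrackingNode: its pose is set by the tracker."""
    def __init__(self, marker_filename):
        self.marker_filename = marker_filename  # which marker drives this node
        self.x, self.y, self.angle = 0.0, 0.0, 0.0
        self.children = []

    def update_from_tracker(self, x, y, angle):
        # In the real extension, ARToolkit would supply this pose each frame.
        self.x, self.y, self.angle = x, y, angle

    def world_position(self, local_x, local_y):
        # Transform a child's local offset by this node's pose.
        c, s = math.cos(self.angle), math.sin(self.angle)
        return (self.x + c * local_x - s * local_y,
                self.y + s * local_x + c * local_y)

class GeometryNode:
    """Stands in for an ordinary Blender node parented to a tracking node."""
    def __init__(self, name, local_x=0.0, local_y=0.0):
        self.name = name
        self.local_x, self.local_y = local_x, local_y
        self.parent = None

    def attach_to(self, tracking_node):
        self.parent = tracking_node
        tracking_node.children.append(self)

    def world_position(self):
        if self.parent is None:
            return (self.local_x, self.local_y)
        return self.parent.world_position(self.local_x, self.local_y)

# Usage: a building model follows the marker it is parented to.
tracker = TrackingNode("marker_hiro.patt")  # hypothetical marker filename
building = GeometryNode("building", local_x=1.0)
building.attach_to(tracker)
tracker.update_from_tracker(5.0, 2.0, math.pi / 2)  # marker moved and rotated
print(building.world_position())  # approximately (5.0, 3.0)
```

The same principle holds in the actual system: the tracker writes only the parent node's transform, and Blender's normal scene-graph parenting carries the update to every attached geometry node.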
Demonstration

This example shows how an MR application can be built using the presented extended Blender. Step by step, a model of the city of Frankfurt is created. It is based on a satellite image and will be augmented using a tangible interface.

First, you have to model or import the geometries of the buildings and insert MRBTrackingNodes into the scene hierarchy. Second, you have to use the tracking nodes as parent nodes for the geometry nodes (see Fig 2). Third, you have to assign one marker to each MRBTrackingNode by specifying the filenames of the marker files.

Conclusion

Based on the integration of a tracking system and of video textures, it was shown how Blender can be used to build content-rich MR applications.

As soon as hardware shaders become available within Blender, the video texture will be extended with a color-keying shader. Currently, ARToolkit is used as the tracking technology. To allow the development of a broader range of applications, one of the next steps is to integrate alternative tracking technologies (e.g. OpenTracker [13]).

The vision is to use the tracking data for the modeling itself as well (compare with [7]). In combination with new input devices, a very powerful and intuitive user interface could be built.
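The color keying envisioned in the conclusion can be illustrated with a small sketch. A real implementation would run per fragment in a hardware shader; this pure-Python version over RGB tuples is only an assumption-laden stand-in (function name, key color, and tolerance are all illustrative) that shows the idea: pixels close to a key color become transparent, so the virtual scene shows through the video texture.

```python
# Sketch of color keying for the video texture: pixels near the key color
# get alpha 0 (transparent), all others stay opaque. Illustrative only; a
# production version would be a per-fragment hardware shader.

def color_key(pixels, key=(0, 255, 0), tolerance=60):
    """Return RGBA pixels; alpha is 0 where the pixel matches the key color."""
    out = []
    for (r, g, b) in pixels:
        # Euclidean distance in RGB space decides whether the pixel is keyed out.
        dist = ((r - key[0]) ** 2 + (g - key[1]) ** 2 + (b - key[2]) ** 2) ** 0.5
        alpha = 0 if dist <= tolerance else 255
        out.append((r, g, b, alpha))
    return out

frame = [(10, 250, 12), (200, 30, 40)]  # one green-screen pixel, one foreground pixel
print(color_key(frame))  # -> [(10, 250, 12, 0), (200, 30, 40, 255)]
```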