Design and Implementation of Virtual-Real Interactive System For Mixed Reality
Optitrack can capture the movement of objects; reflective markers are attached to the moving objects [8]. The system draws the environment coordinates, traces the marked objects, and calculates their relative positions [9]. The hand tracking markers for this study are shown in Figure 3.

In this experiment, the real person is photographed in front of the green screen, and the foreground image is transmitted to the graphics workstation, where it is separated from the green screen and superimposed on the background image to obtain the final composite. Figure 5 is a screenshot from the camera. Figure 6 shows the camera image combined with the virtual scene.
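The keying and superimposition step can be pictured as a per-pixel chroma key. The following is a minimal sketch under illustrative assumptions (a simple RGB pixel type and a green-dominance threshold); it is not the paper's actual compositing pipeline:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// One RGB pixel, 8 bits per channel.
struct Pixel { int r, g, b; };

// A pixel counts as green-screen background when green clearly
// dominates both red and blue (illustrative threshold, not from the paper).
bool isChromaGreen(const Pixel& p, int margin = 40) {
    return p.g - p.r > margin && p.g - p.b > margin;
}

// Composite: keep the foreground (real person) wherever the pixel is
// not chroma green; elsewhere show the virtual background image.
std::vector<Pixel> composite(const std::vector<Pixel>& fg,
                             const std::vector<Pixel>& bg) {
    std::vector<Pixel> out(fg.size());
    for (std::size_t i = 0; i < fg.size(); ++i)
        out[i] = isChromaGreen(fg[i]) ? bg[i] : fg[i];
    return out;
}
```

A production keyer would work in a different color space and soften edges, but the structure is the same: classify each foreground pixel, then select between the camera image and the virtual scene.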
Using the Optitrack plug-in, the virtual object is bound to the hand markers in UE4 so that the two share position information and their displacements stay synchronized. Figure 7 shows a virtual object bound to a hand.
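Conceptually, the binding is a per-frame copy of the tracked marker's position onto the bound object. The types and update loop below are a self-contained sketch of that idea; they are stand-ins, not the actual OptiTrack/UE4 plug-in API:

```cpp
#include <cassert>

struct Vec3 { double x, y, z; };

// Tracked hand marker as reported by the motion-capture system
// (stand-in for the plug-in's rigid-body data).
struct HandMarker { Vec3 position; };

// Virtual scene object bound to a marker.
struct VirtualObject {
    const HandMarker* boundMarker = nullptr;
    Vec3 position{0, 0, 0};

    // Called once per frame: mirror the marker's position so the
    // virtual object and the real hand move together.
    void tick() {
        if (boundMarker) position = boundMarker->position;
    }
};
```

In UE4 the same effect is achieved by updating the actor's transform each frame from the streamed rigid-body pose.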
Figure 9. An action performed by the avatar after it detects the collision
Figure 11. Virtual character action

2) The virtual character passes the object to the real character

The real character's hand moves continuously, and UE4 requests the location of the hand markers from Optitrack. The virtual character then passes the object to the position of the markers. When the object reaches that location, the real person grabs it. The delivery process is shown in Figure 12.

10. References

[1] J. Arroyo-Palacios and R. Marks, "[POSTER] Believable Virtual Characters for Mixed Reality," 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Nantes, 2017, pp. 121-123.

[2] H. Regenbrecht, K. Meng, A. Reepen, S. Beck and T. Langlotz, "Mixed Voxel Reality: Presence and Embodiment in Low Fidelity, Visually Coherent, Mixed Reality Environments," 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Nantes, 2017, pp. 90-99.
[3] Z. Zhang, B. Cao, D. Weng, Y. Liu, Y. Wang and H.
Huang, "Evaluation of Hand-Based Interaction for Near-
Field Mixed Reality with Optical See-Through Head-
Mounted Displays," 2018 IEEE Conference on Virtual
Reality and 3D User Interfaces (VR), Reutlingen, 2018,
pp. 739-740.