AR and VR in Teaching
Authorized licensed use limited to: Auckland University of Technology. Downloaded on November 03,2020 at 05:33:44 UTC from IEEE Xplore. Restrictions apply.
ThC2.1
virtual teaching system.

Figure 1 distributed virtual teaching system (virtual classroom, virtual laboratory and virtual training field connected over a LAN to teachers 1–3 and students 1–3 through a terminal entrance)

The teaching system based on VR and AR is called the virtual teaching system here. It integrates high-performance computer software and hardware with various advanced sensors to create a comprehensive information environment with immersion and interaction capabilities. The virtual teaching system is shown in Figure 2. It shows that there are overlapping areas between the real world and the virtual teaching environment; this overlapping area forms the AR teaching environment. That is, virtual information is added to the real teaching environment to enhance teachers' and students' perception of the real teaching environment.

Figure 2 virtual teaching system (the real environment and the VR teaching environment overlap to form the AR teaching environment; teachers and students interact through sensing devices and display feedback)

The virtual teaching system consists of a terminal entrance, a content generation system and a broadcast control platform, as shown in Figure 3. The VR terminal is a terminal product integrating hardware, platform, computing center, flow entrance, industry applications and other functions, which meets the requirements of rigid demand, pain points, high frequency, scenes and connection.

The content generation system of the virtual teaching system consists of hardware and software. The hardware comprises computers, cameras and optical equipment that can create the virtual teaching environment and respond to the various operations of teachers and students in real time. The software comprises a virtual teaching development platform, which is used to realize the required teaching functions, and a virtual teaching resource editor.

The broadcast control platform manages and controls the generated content. Authentication management verifies the access rights of users. Teaching management manages and controls the teaching process, teaching steps, teaching methods and teaching tools. User management realizes the management and control of system end users and system maintenance personnel.

Figure 3 composition of virtual teaching system (terminal entrance: PC terminal with VR helmet, mobile terminal with VR glasses, integrated machine; content generation: computer, panoramic camera, optical equipment, virtual teaching development platform, virtual teaching resource editor; broadcast control platform: content management, authentication management, teaching management, user management)

B. The Construction Of Human-Computer Interaction System In Virtual Teaching System

In the ideal case, the virtual teaching system is composed of multiple sensors, which makes human-computer interaction convenient. The system can generate high-quality images, a good stereo effect and a fine, natural sense of force. It has good real-time performance and short delay, so users do not tire easily. The human-computer interaction of the virtual teaching system is shown in Figure 4.

According to changes in the internal model and the external environment, the virtual environment generator computes a realistic virtual teaching environment. Stereovision generation produces vision based on the user's own viewpoint. Stereo sound synthesis produces a stereo effect that takes the user as the origin point. Language recognition identifies users' voice commands and conversation content. Speech synthesis produces the speech of natural human language. Position tracking and motion capture determine the position and direction of the user's head, eyes, hands and body. The somatosensory system provides feedback on gravity, pressure, friction, temperature, etc. The kinematic system provides rules and laws of motion for people and objects.
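The cooperating subsystems described above (virtual environment generator, stereovision generation, stereo sound synthesis, position tracking) can be sketched as a minimal object model. This is an illustrative sketch only; all class and method names here are assumptions for illustration, not part of the system described in the paper.

```python
# Minimal sketch of the human-computer interaction loop of a virtual
# teaching system. All names are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class UserState:
    """Pose data that position tracking / motion capture would supply."""
    head_position: tuple = (0.0, 0.0, 0.0)
    gaze_direction: tuple = (0.0, 0.0, 1.0)


@dataclass
class VirtualTeachingSystem:
    """Ties together the subsystems named in the text."""
    users: list = field(default_factory=list)

    def update(self, user: UserState) -> dict:
        # Each call regenerates the environment from the internal model
        # and the latest tracked user state, as the text describes.
        frame = self.generate_environment(user)
        return {
            "stereo_image": self.render_stereo(frame, user.gaze_direction),
            "stereo_sound": self.synthesize_sound(frame, user.head_position),
        }

    def generate_environment(self, user: UserState) -> dict:
        # Placeholder for the virtual environment generator.
        return {"objects": [], "origin": user.head_position}

    def render_stereo(self, frame: dict, gaze: tuple) -> tuple:
        # Stereo vision is computed from the user's own viewpoint.
        return ("left_eye_view", "right_eye_view")

    def synthesize_sound(self, frame: dict, head_pos: tuple) -> tuple:
        # Stereo sound takes the user's head as the origin point.
        return ("left_channel", "right_channel")


system = VirtualTeachingSystem()
output = system.update(UserState())
print(sorted(output))  # keys of the per-frame output
```

The point of the sketch is only the data flow: tracking feeds the environment generator, whose output fans out to per-user stereo image and sound channels each frame.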
Figure 4 human-computer interaction of virtual teaching system (the virtual environment generator drives stereovision generation, stereo sound synthesis, language recognition, speech synthesis, position tracking, motion capture, the somatosensory system and the kinematic system; output devices include helmet displays, VR glasses, holographic equipment and other visual equipment, integrated machines, stereo equipment, multisensor groups, somatosensory equipment and force sensing devices)

IV. KEY TECHNOLOGIES OF TEACHING SYSTEM BASED ON VR AND AR

A. Vision Technology

When a person observes an object, the two eyes look at it from different positions; because of the different angles, the left and right eyes see images of different sides of the object. These two images are formed simultaneously on the retinas, and the brain judges the distance of the object from the two different images, producing a stereoscopic sense [2]. Naked-eye 3D vision uses optical technology to separate the visual images of the left and right eyes, creating binocular parallax and thus the visual illusion of a stereoscopic image; it can obtain realistic stereo images with space and depth without any auxiliary equipment (head-mounted display, glasses). Holographic projection technology uses the interference and diffraction principles of light to record the intensity and depth information of light and reproduce the real spatial information of an object, enabling the observer to see a stereoscopic image of the object from any visual angle. In the virtual teaching system, teachers and students can get a better learning experience and effect if they do not need to wear any visual equipment and can watch images and objects in a natural way.

B. Sound Technology

Virtual sound technology in the VR teaching system is used to simulate the sound field of real space. It uses virtual sound recording and synthesis technology (artificial-head recording, artificially synthesized virtual sound) and virtual sound playback technology to simulate the transmission of sound in three-dimensional space. For example, a micro microphone is placed in the ear canal of an artificial head, the impulse responses of the head to multiple sound sources at different positions are recorded, and real-time convolution is then carried out to generate a sense of position, so that people can feel sound coming from all directions [2].

C. Tracking And Motion Capture Technology

The tracking and motion capture technology in the VR teaching system needs to accurately acquire the position, direction and posture of the target object in three-dimensional space in real time, as well as capture the whole-body movement of the human, and then feed the acquired information back to the VR system to realize the interaction between the human and the system. For example, users can use eye movements or gesture changes to control the device.

D. Sensing Technology

A sensor can sense the measured information, transform it into an electrical signal or another form of information according to certain rules, and output it; it can meet the requirements of information transmission, processing, storage, display, recording and control. The sensing devices involved in VR include wearable devices, such as helmet displays, data gloves and data clothes, which are used to identify and track the actions and sounds of the operator's head, hands and body. There are also various kinds of sensing devices in the environment, such as vision, hearing, touch and force-sense devices.

E. Somatosensory Technology

The somatosensory technology in the VR teaching system brings the user's bodily sensations from the real world into the digital world; bodily sensation includes vision, hearing, kinesthesia, touch, and even taste and smell. The kinesthetic sense is the perception of force and moment, such as the shape, weight and hardness of objects. Tactile sensation is the feeling of vibration, temperature and tangential force, such as the perception of material and of cold and hot [3]. In practical training, simulation experiments and other teaching activities, somatosensory technology enables teachers and students to really feel the physical characteristics of the experimental object and improves the sense of reality and immersion.

V. THE CONTENT GENERATION OF VIRTUAL TEACHING SYSTEM

A. 3D Content Generated By Computer

In a VR system, the geometric attributes of objects are acquired mainly by the optical method and the stereo parallax method. The optical method projects light onto the object and measures the object's depth information according to the order of the returned light. Based on the principle of triangulation, the stereo parallax method uses two cameras located at different positions to take pictures of the object and uses the parallax of corresponding points to calculate the stereo information of the object, so as to obtain the spatial coordinates of the sampling points. With professional 3D modeling software (such as 3D Max), one can make exquisite 3D models and
animations; 3D scanners can also be used to quickly create high-precision 3D models; and 3D reconstruction software (such as Agisoft Photoscan Professional) can generate 3D models with real coordinates from multiple photos of an object [3].

In terms of motion capture, there are technologies based on electromechanical, electromagnetic and optical markers. Cameras and motion capture devices can be used to obtain motion data and generate motion models.

B. 3D Content Generated By Panoramic Video Shooting

VR panoramic video uses panoramic shooting (horizontal 360° and vertical 360°) and image-splicing technology to stitch video into a spherical image and build the virtual space. VR video generation includes video acquisition, stitching, coding and other processes, and it also goes through later synchronization, multi-video splicing and other work. Post-production software such as PluralEyes can be used to synchronize the sound and picture of multiple cameras. Premiere is video editing software based on non-linear editing, which can clip, splice and synthesize video materials. After the VR video is generated, teachers and students can use VR devices (such as VR helmets and glasses) to experience the virtual space.

VI. CONCLUSION

Virtual reality and augmented reality bring great benefits to teaching. At present, however, most research in China still concerns the interaction between the human perceptual system, the muscular system and the computer; there is little research on how the sensory information obtained by humans in practice is stored and processed in the human brain to understand the objective world. If the above problems are solved and the teaching system is examined and designed from the perspective of educational neuroscience, the system will become an assistant that helps people learn, think and create in a multi-dimensional information space. In addition, VR and AR can deliver their full value in teaching only when human factors are fully considered and human-friendly design is carried out.

REFERENCES

[1] Tan Jiefu; Zhong Zheng; Yao Yongfang. Virtual Reality Foundation and Actual Combat [M]. Beijing: Chemical Industry Press, 2018.
[2] Wang Chengwei; Gao Wen; Wang Xingren. Theory, Implementation and Application of Virtual Reality Technology [M]. Beijing: Tsinghua University Press, 1996.
[3] Lu Yun; Wang Haiquan; Sun Wei. Theory, Technology, Development and Application of Virtual Reality [M]. Beijing: Tsinghua University Press, 2015.
[4] Lou Yan. Overview of Virtual Reality and Augmented Reality Technology [M]. Beijing: Tsinghua University Press, 2016.
[5] Guido Bozzelli; Antonio Raia; Stefano Ricciardi. An integrated VR/AR framework for user-centric interactive experience of cultural heritage: The ArkaeVision project [J]. Digital Applications in Archaeology and Cultural Heritage, 2019.
[6] Kyle Hooks; Wesley Ferguson; Pedro Morillo; Carolina Cruz-Neira. Evaluating the user experience of omnidirectional VR walking simulators [J]. Entertainment Computing, 2020.
[7] Steven Lo; Abdul Shakor Sandel Abaker; Fabio Quondamatteo. Use of a virtual 3D anterolateral thigh model in medical education: Augmentation and not replacement of traditional teaching? [J]. Journal of Plastic, Reconstructive & Aesthetic Surgery, 2020.
[8] Liu Xiangqun; Guo Xuefeng; Zhong Wei. VR/AR/MR Development Practice: Based on Unity and U4E Engine [M]. Beijing: Mechanical Industry Press, 2017.
[9] Zhang Shanli; Shi Fen. Introduction to Virtual Reality [M]. Beijing: Tsinghua University Press, 2016.
[10] He Wei. Unity Virtual Reality Development Canon [M]. Beijing: China Railway Press, 2016.
[11] Zhang Kefa. AR and VR Development Practice [M]. Beijing: Mechanical Industry Press, 2016.