Jessie Y. C. Chen
Gino Fragomeni (Eds.)

LNCS 11575

Virtual, Augmented
and Mixed Reality


Applications and Case Studies
11th International Conference, VAMR 2019
Held as Part of the 21st HCI International Conference, HCII 2019
Orlando, FL, USA, July 26–31, 2019
Proceedings, Part II
Lecture Notes in Computer Science 11575
Commenced Publication in 1973
Founding and Former Series Editors:
Gerhard Goos, Juris Hartmanis, and Jan van Leeuwen

Editorial Board Members


David Hutchison
Lancaster University, Lancaster, UK
Takeo Kanade
Carnegie Mellon University, Pittsburgh, PA, USA
Josef Kittler
University of Surrey, Guildford, UK
Jon M. Kleinberg
Cornell University, Ithaca, NY, USA
Friedemann Mattern
ETH Zurich, Zurich, Switzerland
John C. Mitchell
Stanford University, Stanford, CA, USA
Moni Naor
Weizmann Institute of Science, Rehovot, Israel
C. Pandu Rangan
Indian Institute of Technology Madras, Chennai, India
Bernhard Steffen
TU Dortmund University, Dortmund, Germany
Demetri Terzopoulos
University of California, Los Angeles, CA, USA
Doug Tygar
University of California, Berkeley, CA, USA
More information about this series at http://www.springer.com/series/7409
Jessie Y. C. Chen Gino Fragomeni (Eds.)

Virtual, Augmented
and Mixed Reality
Applications and Case Studies
11th International Conference, VAMR 2019
Held as Part of the 21st HCI International Conference, HCII 2019
Orlando, FL, USA, July 26–31, 2019
Proceedings, Part II

Editors

Jessie Y. C. Chen
US Army Research Laboratory
Aberdeen Proving Ground, MD, USA

Gino Fragomeni
US Army Research Laboratory
Orlando, FL, USA

Lecture Notes in Computer Science
ISSN 0302-9743 ISSN 1611-3349 (electronic)
ISBN 978-3-030-21564-4 ISBN 978-3-030-21565-1 (eBook)
https://doi.org/10.1007/978-3-030-21565-1
LNCS Sublibrary: SL3 – Information Systems and Applications, incl. Internet/Web, and HCI

© Springer Nature Switzerland AG 2019


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the
material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now
known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book are
believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors
give a warranty, expressed or implied, with respect to the material contained herein or for any errors or
omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in
published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Foreword

The 21st International Conference on Human-Computer Interaction, HCI International
2019, was held in Orlando, FL, USA, during July 26–31, 2019. The event incorporated
the 18 thematic areas and affiliated conferences listed on the following page.
A total of 5,029 individuals from academia, research institutes, industry, and
governmental agencies from 73 countries submitted contributions, and 1,274 papers
and 209 posters were included in the pre-conference proceedings. These contributions
address the latest research and development efforts and highlight the human aspects of
design and use of computing systems. The contributions thoroughly cover the entire
field of human-computer interaction, addressing major advances in knowledge and
effective use of computers in a variety of application areas. The volumes constituting
the full set of the pre-conference proceedings are listed in the following pages.
This year the HCI International (HCII) conference introduced the new option of
“late-breaking work.” This applies to both papers and posters, and the corresponding
volume(s) of the proceedings will be published just after the conference. Full papers
will be included in the HCII 2019 Late-Breaking Work Papers Proceedings volume
of the proceedings to be published in the Springer LNCS series, while poster extended
abstracts will be included as short papers in the HCII 2019 Late-Breaking Work Poster
Extended Abstracts volume to be published in the Springer CCIS series.
I would like to thank the program board chairs and the members of the program
boards of all thematic areas and affiliated conferences for their contribution to the
highest scientific quality and the overall success of the HCI International 2019
conference.
This conference would not have been possible without the continuous and
unwavering support and advice of the founder, Conference General Chair Emeritus and
Conference Scientific Advisor Prof. Gavriel Salvendy. For his outstanding efforts,
I would like to express my appreciation to the communications chair and editor of
HCI International News, Dr. Abbas Moallem.

July 2019 Constantine Stephanidis


HCI International 2019 Thematic Areas
and Affiliated Conferences

Thematic areas:
• HCI 2019: Human-Computer Interaction
• HIMI 2019: Human Interface and the Management of Information
Affiliated conferences:
• EPCE 2019: 16th International Conference on Engineering Psychology and
Cognitive Ergonomics
• UAHCI 2019: 13th International Conference on Universal Access in
Human-Computer Interaction
• VAMR 2019: 11th International Conference on Virtual, Augmented and Mixed
Reality
• CCD 2019: 11th International Conference on Cross-Cultural Design
• SCSM 2019: 11th International Conference on Social Computing and Social Media
• AC 2019: 13th International Conference on Augmented Cognition
• DHM 2019: 10th International Conference on Digital Human Modeling and
Applications in Health, Safety, Ergonomics and Risk Management
• DUXU 2019: 8th International Conference on Design, User Experience, and
Usability
• DAPI 2019: 7th International Conference on Distributed, Ambient and Pervasive
Interactions
• HCIBGO 2019: 6th International Conference on HCI in Business, Government and
Organizations
• LCT 2019: 6th International Conference on Learning and Collaboration
Technologies
• ITAP 2019: 5th International Conference on Human Aspects of IT for the Aged
Population
• HCI-CPT 2019: First International Conference on HCI for Cybersecurity, Privacy
and Trust
• HCI-Games 2019: First International Conference on HCI in Games
• MobiTAS 2019: First International Conference on HCI in Mobility, Transport, and
Automotive Systems
• AIS 2019: First International Conference on Adaptive Instructional Systems
Pre-conference Proceedings Volumes Full List

1. LNCS 11566, Human-Computer Interaction: Perspectives on Design (Part I),
edited by Masaaki Kurosu
2. LNCS 11567, Human-Computer Interaction: Recognition and Interaction
Technologies (Part II), edited by Masaaki Kurosu
3. LNCS 11568, Human-Computer Interaction: Design Practice in Contemporary
Societies (Part III), edited by Masaaki Kurosu
4. LNCS 11569, Human Interface and the Management of Information: Visual
Information and Knowledge Management (Part I), edited by Sakae Yamamoto and
Hirohiko Mori
5. LNCS 11570, Human Interface and the Management of Information: Information
in Intelligent Systems (Part II), edited by Sakae Yamamoto and Hirohiko Mori
6. LNAI 11571, Engineering Psychology and Cognitive Ergonomics, edited by Don
Harris
7. LNCS 11572, Universal Access in Human-Computer Interaction: Theory, Methods
and Tools (Part I), edited by Margherita Antona and Constantine Stephanidis
8. LNCS 11573, Universal Access in Human-Computer Interaction: Multimodality
and Assistive Environments (Part II), edited by Margherita Antona and Constantine
Stephanidis
9. LNCS 11574, Virtual, Augmented and Mixed Reality: Multimodal Interaction
(Part I), edited by Jessie Y. C. Chen and Gino Fragomeni
10. LNCS 11575, Virtual, Augmented and Mixed Reality: Applications and Case
Studies (Part II), edited by Jessie Y. C. Chen and Gino Fragomeni
11. LNCS 11576, Cross-Cultural Design: Methods, Tools and User Experience
(Part I), edited by P. L. Patrick Rau
12. LNCS 11577, Cross-Cultural Design: Culture and Society (Part II), edited by
P. L. Patrick Rau
13. LNCS 11578, Social Computing and Social Media: Design, Human Behavior and
Analytics (Part I), edited by Gabriele Meiselwitz
14. LNCS 11579, Social Computing and Social Media: Communication and Social
Communities (Part II), edited by Gabriele Meiselwitz
15. LNAI 11580, Augmented Cognition, edited by Dylan D. Schmorrow and Cali M.
Fidopiastis
16. LNCS 11581, Digital Human Modeling and Applications in Health, Safety,
Ergonomics and Risk Management: Human Body and Motion (Part I), edited by
Vincent G. Duffy

17. LNCS 11582, Digital Human Modeling and Applications in Health, Safety,
Ergonomics and Risk Management: Healthcare Applications (Part II), edited by
Vincent G. Duffy
18. LNCS 11583, Design, User Experience, and Usability: Design Philosophy and
Theory (Part I), edited by Aaron Marcus and Wentao Wang
19. LNCS 11584, Design, User Experience, and Usability: User Experience in
Advanced Technological Environments (Part II), edited by Aaron Marcus and
Wentao Wang
20. LNCS 11585, Design, User Experience, and Usability: Application Domains
(Part III), edited by Aaron Marcus and Wentao Wang
21. LNCS 11586, Design, User Experience, and Usability: Practice and Case Studies
(Part IV), edited by Aaron Marcus and Wentao Wang
22. LNCS 11587, Distributed, Ambient and Pervasive Interactions, edited by Norbert
Streitz and Shin’ichi Konomi
23. LNCS 11588, HCI in Business, Government and Organizations: eCommerce and
Consumer Behavior (Part I), edited by Fiona Fui-Hoon Nah and Keng Siau
24. LNCS 11589, HCI in Business, Government and Organizations: Information
Systems and Analytics (Part II), edited by Fiona Fui-Hoon Nah and Keng Siau
25. LNCS 11590, Learning and Collaboration Technologies: Designing Learning
Experiences (Part I), edited by Panayiotis Zaphiris and Andri Ioannou
26. LNCS 11591, Learning and Collaboration Technologies: Ubiquitous and Virtual
Environments for Learning and Collaboration (Part II), edited by Panayiotis
Zaphiris and Andri Ioannou
27. LNCS 11592, Human Aspects of IT for the Aged Population: Design for the
Elderly and Technology Acceptance (Part I), edited by Jia Zhou and Gavriel
Salvendy
28. LNCS 11593, Human Aspects of IT for the Aged Population: Social Media, Games
and Assistive Environments (Part II), edited by Jia Zhou and Gavriel Salvendy
29. LNCS 11594, HCI for Cybersecurity, Privacy and Trust, edited by Abbas Moallem
30. LNCS 11595, HCI in Games, edited by Xiaowen Fang
31. LNCS 11596, HCI in Mobility, Transport, and Automotive Systems, edited by
Heidi Krömker
32. LNCS 11597, Adaptive Instructional Systems, edited by Robert Sottilare and
Jessica Schwarz
33. CCIS 1032, HCI International 2019 - Posters (Part I), edited by Constantine
Stephanidis

34. CCIS 1033, HCI International 2019 - Posters (Part II), edited by Constantine
Stephanidis
35. CCIS 1034, HCI International 2019 - Posters (Part III), edited by Constantine
Stephanidis

http://2019.hci.international/proceedings
11th International Conference on Virtual, Augmented
and Mixed Reality (VAMR 2019)

Program Board Chair(s): Jessie Y. C. Chen and Gino Fragomeni, USA

• Tamara Griffith, USA
• Fotis Liarokapis, Czech Republic
• Joseph B. Lyons, USA
• Phillip Mangos, USA
• Amar R. Marathe, USA
• Rafael Radkowski, USA
• Maria Olinda Rodas, USA
• Michael S. Ryoo, USA
• Jose San Martin, Spain
• Andreas Schreiber, Germany
• Peter Smith, USA
• Simon Su, USA
• Daniel Szafir, USA
• Tom Williams, USA
• Denny Yu, USA

The full list with the Program Board Chairs and the members of the Program Boards of
all thematic areas and affiliated conferences is available online at:

http://www.hci.international/board-members-2019.php
HCI International 2020
The 22nd International Conference on Human-Computer Interaction, HCI International
2020, will be held jointly with the affiliated conferences in Copenhagen, Denmark, at
the Bella Center Copenhagen, July 19–24, 2020. It will cover a broad spectrum
of themes related to HCI, including theoretical issues, methods, tools, processes, and
case studies in HCI design, as well as novel interaction techniques, interfaces, and
applications. The proceedings will be published by Springer. More information will be
available on the conference website: http://2020.hci.international/.

General Chair
Prof. Constantine Stephanidis
University of Crete and ICS-FORTH
Heraklion, Crete, Greece
E-mail: general_chair@hcii2020.org

http://2020.hci.international/
Contents – Part II

VAMR and Robots

Design of Virtual Reality for Humanoid Robots with Inspiration from Video Games . . . 3
Jordan Allspaw, Lilia Heinold, and Holly A. Yanco

Visualizations for Communicating Intelligent Agent Generated Courses of Action . . . 19
Jessica Bartik, Heath Ruff, Gloria Calhoun, Kyle Behymer,
Tyler Goodman, and Elizabeth Frost

Using HMD for Immersive Training of Voice-Based Operation of Small Unmanned Ground Vehicles . . . 34
Daniel W. Carruth, Christopher R. Hudson, Cindy L. Bethel,
Matus Pleva, Stanislav Ondas, and Jozef Juhar

Scalable Representation Learning for Long-Term Augmented Reality-Based Information Delivery in Collaborative Human-Robot Perception . . . 47
Fei Han, Sriram Siva, and Hao Zhang

Robot Authority in Human-Machine Teams: Effects of Human-Like Appearance on Compliance . . . 63
Kerstin S. Haring, Ariana Mosley, Sarah Pruznick, Julie Fleming,
Kelly Satterfield, Ewart J. de Visser, Chad C. Tossell,
and Gregory Funke

Augmented Reality for Human-Robot Teaming in Field Environments . . . 79
Christopher Reardon, Kevin Lee, John G. Rogers III, and Jonathan Fink

Augmented Reality Based Actuated Monitor Manipulation from Dual Point of View . . . 93
Ying Ren and Jiro Tanaka

Exploring Temporal Dependencies in Multimodal Referring Expressions with Mixed Reality . . . 108
Elena Sibirtseva, Ali Ghadirzadeh, Iolanda Leite, Mårten Björkman,
and Danica Kragic

Mediating Human-Robot Interactions with Virtual, Augmented, and Mixed Reality . . . 124
Daniel Szafir

Brain eRacing: An Exploratory Study on Virtual Brain-Controlled Drones . . . 150
Dante Tezza, Sarah Garcia, Tamjid Hossain, and Marvin Andujar

Human-Robot Interaction During Virtual Reality Mediated Teleoperation: How Environment Information Affects Spatial Task Performance and Operator Situation Awareness . . . 163
David B. Van de Merwe, Leendert Van Maanen, Frank B. Ter Haar,
Roelof J. E. Van Dijk, Nirul Hoeba, and Nanda Van der Stap

Investigating the Potential Effectiveness of Allocentric Mixed Reality Deictic Gesture . . . 178
Tom Williams, Matthew Bussing, Sebastian Cabrol, Ian Lau,
Elizabeth Boyle, and Nhan Tran

Autonomous Agent Teammate-Likeness: Scale Development and Validation . . . 199
Kevin T. Wynne and Joseph B. Lyons

VAMR in Learning, Training and Entertainment

Augmented Reality in Education: A Study on Preschool Children, Parents, and Teachers in Bangladesh . . . 217
Mohammad Fahim Abrar, Md. Rakibul Islam, Md. Sabir Hossain,
Mohammad Mainul Islam, and Muhammad Ashad Kabir

Physically Extended Virtual Reality (PEVR) as a New Concept in Railway Driver Training . . . 230
Małgorzata Ćwil and Witold Bartnik

Developing a VR Training Program for Geriatric Patients with Chronic Back Pain: A Process Analysis . . . 243
Rebecca Dahms, Oskar Stamm, and Ursula Müller-Werdan

A Multi-procedural Virtual Reality Simulator for Orthopaedic Training . . . 256
Gino De Luca, Nusrat Choudhury, Catherine Pagiatakis,
and Denis Laroche

Exploring Extended Reality as a Simulation Training Tool Through Naturalistic Interactions and Enhanced Immersion . . . 272
Daniel Duggan, Caroline Kingsley, Mark Mazzeo, and Michael Jenkins

A Study on the Development of a Mixed Reality System Applied to the Practice of Socially Interactive Behaviors of Children with Autism Spectrum Disorder . . . 283
Yu-Chen Huang and I-Jui Lee

Cicero VR - Public Speaking Training Tool and an Attempt to Create Positive Social VR Experience . . . 297
Michał Jakubowski, Marcin Wardaszko, Anna Winniczuk,
Błażej Podgórski, and Małgorzata Ćwil

Virtual Dome System Using HMDs: An Alternative to the Expensive and Less Accessible Physical Domes . . . 312
Yun Liu, Zhejun Liu, and Yunshui Jin

GVRf and Blender: A Path for Android Apps and Games Development . . . 329
Bruno Oliveira, Diego Azulay, and Paulo Carvalho

Designing Educational Virtual Environments for Construction Safety: A Case Study in Contextualizing Incident Reports and Engaging Learners . . . 338
Alyssa M. Peña, Eric D. Ragan, and Julian Kang

Augmented Reality (AR) Assisted Laryngoscopy for Endotracheal Intubation Training . . . 355
Ming Qian, John Nicholson, David Tanaka, Patricia Dias, Erin Wang,
and Litao Qiu

TurtleGO: Application with Cubes for Children’s Spatial Ability Based on AR Technology . . . 372
Yoonji Song, Jaedong Kim, and Hanhyuk Cho

LumaPath: An Immersive Virtual Reality Game for Encouraging Physical Activity for Senior Arthritis Patients . . . 384
Xin Tong, Diane Gromala, and Federico Machuca

A New Practice Method Based on KNN Model to Improve User Experience for an AR Piano Learning System . . . 398
Hong Zeng, Xingxi He, and Honghu Pan

Enabling Immunology Learning in Virtual Reality Through Storytelling and Interactivity . . . 410
Lei Zhang, Doug A. Bowman, and Caroline N. Jones

VAMR in Aviation, Industry and the Military

Augmented Reality for Product Validation: Supporting the Configuration of AR-Based Validation Environments . . . 429
Albert Albers, Jonas Reinemann, Joshua Fahl, and Tobias Hirschter

Assessing the Effect of Sensor Limitations in Enhanced Flight Vision Systems on Pilot Performance . . . 449
Ramanathan Annamalai, Michael C. Dorneich, and Güliz Tokadlı

Use of an Enhanced Flight Vision System (EFVS) for Taxiing in Low-Visibility Environments . . . 466
Dennis B. Beringer, Andrea Sparko, and Joseph M. Jaworski

The Measurement of the Propensity to Trust Automation . . . 476
Sarah A. Jessup, Tamera R. Schneider, Gene M. Alarcon, Tyler J. Ryan,
and August Capiola

An Augmented Reality Shared Mission Planning Scenario: Observations on Shared Experience . . . 490
Sue Kase, Simon Su, Vincent Perry, Heather Roy, and Katherine Gamble

Human-Computer Interaction for Space Situational Awareness (SSA): Towards the SSA Integrated Sensor Viewer (ISV) . . . 504
Mitchell Kirshner and David C. Gross

Image-Based Ground Visibility for Aviation: Is What You See What You Get? (Pilot Study) . . . 516
Daniela Kratchounova, David C. Newton, and Robbie Hood

Examining Error Likelihood When Using Enhanced Vision Systems for Approach and Landing . . . 529
Steven J. Landry, Denys Bulikhov, Zixu Zhang, and Carlos F. Miñana

Towards a Mixed Reality Assistance System for the Inspection After Final Car Assembly . . . 536
Marco Pattke, Manuel Martin, and Michael Voit

Interaction Paradigms for Air Traffic Control and Management in Mixed Reality . . . 547
Syed Hammad Hussain Shah, Kyungjin Han, and Jong Weon Lee

Exploring Applications of Storm-Scale Probabilistic Warn-on-Forecast Guidance in Weather Forecasting . . . 557
Katie A. Wilson, Jessica J. Choate, Adam J. Clark, Burkely T. Gallo,
Pamela L. Heinselman, Kent H. Knopfmeier, Brett Roberts,
Patrick S. Skinner, and Nusrat Yussouf

Author Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 573


Contents – Part I

Multimodal Interaction in VR

Presence, Immersion and Usability of Mobile Augmented Reality . . . 3
Hyoenah Choi, Youngwon Ryan Kim, and Gerard J. Kim

Explorations in AR: Finding Its Value . . . 16
Mauricio Gomes de Sá Ribeiro, Isabel Lafuente Mazuecos,
Fabiano Marinho, and Alice Neves Gomes dos Santos

Designing Inclusive Virtual Reality Experiences . . . 33
Matt Dombrowski, Peter A. Smith, Albert Manero, and John Sparkman

Spherical Layout with Proximity-Based Multimodal Feedback for Eyes-Free Target Acquisition in Virtual Reality . . . 44
BoYu Gao, Yujun Lu, HyungSeok Kim, Byungmoon Kim, and Jinyi Long

A Multimodal Interface for Virtual Information Environments . . . 59
Jeffrey T. Hansberger, Chao Peng, Victoria Blakely, Sarah Meacham,
Lizhou Cao, and Nicholas Diliberti

AR Assistive System in Domestic Environment Using HMDs: Comparing Visual and Aural Instructions . . . 71
Shuang He, Yanhong Jia, Zhe Sun, Chenxin Yu, Xin Yi, Yuanchun Shi,
and Yingqing Xu

KnobCollector: Custom Device Controller for Dynamic Real-Time Subjective Data Collection in Virtual Reality . . . 84
Rajiv Khadka and Amy Banic

CHARM: Cord-Based Haptic Augmented Reality Manipulation . . . 96
Konstantin Klamka, Patrick Reipschläger, and Raimund Dachselt

To Speak or to Text: Effects of Display Type and I/O Style on Mobile Virtual Humans Nurse Training . . . 115
Justin Loyd, Toni Pence, and Amy Banic

Xavier Electromyographic Wheelchair Control and Virtual Training . . . 133
Albert Manero, Bjorn Oskarsson, John Sparkman, Peter A. Smith,
Matt Dombrowski, Mrudula Peddinti, Angel Rodriguez, Juan Vila,
and Brendan Jones

The Effect of Onomatopoeia to Enhancing User Experience in Virtual Reality . . . 143
Jiwon Oh and Gerard J. Kim

Information Design for XR Immersive Environments: Challenges and Opportunities . . . 153
Elaine M. Raybourn, William A. Stubblefield, Michael Trumbo,
Aaron Jones, Jon Whetzel, and Nathan Fabian

Multimodal Head-Mounted Virtual-Reality Brain-Computer Interface for Stroke Rehabilitation: A Clinical Case Study with REINVENT . . . 165
Athanasios Vourvopoulos, Octavio Marin-Pardo, Meghan Neureither,
David Saldana, Esther Jahng, and Sook-Lei Liew

Rendering, Layout, Visualization and Navigation

Physically-Based Bimanual Volumetric Selection for Immersive Visualizations . . . 183
Angela Benavides, Rajiv Khadka, and Amy Banic

Integrating Historical Content with Augmented Reality in an Open Environment . . . 196
Manuel Condado, Isabel Morais, Ryan Quinn, Sahil Patel,
Patricia Morreale, Ed Johnston, and Elizabeth Hyde

Surface Prediction for Spatial Augmented Reality Using Cubature Kalman Filtering . . . 206
Keegan Fernandes, Adam Gomes, Cong Yue, Yousef Sawires,
and David Wang

Marker Concealment Using Print Color Correction and Its Application . . . 221
Kanghoon Lee, Kyudong Sim, and Jong-Il Park

Visual Effects of Turning Point and Travel Direction for Outdoor Navigation Using Head-Mounted Display . . . 235
Yuji Makimura, Aya Shiraiwa, Masashi Nishiyama, and Yoshio Iwai

Oculus Rift Versus HTC Vive: Usability Assessment from a Teleportation Task . . . 247
Crystal Maraj, Jonathan Hurter, Schuyler Ferrante, Lauren Horde,
Jasmine Carter, and Sean Murphy

Impact of Foveated Rendering on Procedural Task Training . . . 258
Rafael Radkowski and Supriya Raul

User Guidance for Interactive Camera Calibration . . . 268
Pavel Rojtberg

Scaling Gain and Eyeheight While Locomoting in a Large VE . . . 277
Betsy Williams-Sanders, Tom Carr, Gayathri Narasimham,
Tim McNamara, John Rieser, and Bobby Bodenheimer

Emergency Response Using HoloLens for Building Evacuation . . . 299
Sharad Sharma, Sri Teja Bodempudi, David Scribner, Jock Grynovicki,
and Peter Grazaitis

A New Traversal Method for Virtual Reality: Overcoming the Drawbacks of Commonly Accepted Methods . . . 312
Karl Smink, J. Edward Swan II, Daniel W. Carruth, and Eli Davis

Comparative Study for Multiple Coordinated Views Across Immersive and Non-immersive Visualization Systems . . . 321
Simon Su, Vincent Perry, and Venkateswara Dasari

Avatars, Embodiment and Empathy in VAMR

A Face Validation Study for the Investigation of Proteus Effects Targeting


Driving Behavior . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 335
Corinna A. Faust-Christmann, René Reinhard, Alexandra Hoffmann,
Thomas Lachmann, and Gabriele Bleser

Towards a Framework to Model Intelligent Avatars in Immersive Virtual Environments for Studying Human Behavior in Building Fire Emergencies . . . 349
Jing Lin and Nan Li

The Effects of Embodiment in Virtual Reality on Implicit Gender Bias . . . . . 361


Stephanie Schulze, Toni Pence, Ned Irvine, and Curry Guinn

Effects of Character Guide in Immersive Virtual Reality Stories . . . . . . . . . . 375


Qinghong Xu and Eric D. Ragan

Cognitive and Health Issues in VAMR

Spatial Perception of Size in a Virtual World . . . . . . . . . . . . . . . . . . . . . . . 395


Pritam Banik, Debarshi Das, and Si Jung Kim

Design Implications from Cybersickness and Technical Interactions in Virtual Reality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 403
Patricia S. Bockelman, Sharlin Milliard, Matin Salemirad,
Jonathan Valderrama, and Eileen Smith

Characterizing the Cognitive Impact of Tangible Augmented Reality . . . . . . . 416


Michael W. Boyce, Aaron L. Gardony, Paul Shorter, Carlene Horner,
Cortnee R. Stainrod, Jeremy Flynn, Tad T. Brunyé,
and Charles R. Amburn

Evaluation of Immersive Interfaces for Tactical Decision Support . . . . . . . . . 428


Mark Dennison, Mark Mittrick, John Richardson, Theron Trout,
Adrienne Raglin, Eric Heilman, and Timothy Hanratty

Virtual Nature: A Psychologically Beneficial Experience . . . . . . . . . . . . . . . 441


Laura M. Herman and Jamie Sherman

Effects of Weight and Balance of Head Mounted Display on Physical Load . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 450
Kodai Ito, Mitsunori Tada, Hiroyasu Ujike, and Keiichiro Hyodo

The Impact of Motion on Individual Simulator Sickness in a Moving Base VR Simulator with Head-Mounted Display (HMD) . . . . . . . . . . . . . . 461
Mara Kaufeld and Thomas Alexander

Communicating Information in Virtual Reality: Objectively Measuring Team Performance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 473
Shannon Moore, Michael Geuss, and Joseph Campanelli

Quality of Experience Comparison Between Binocular and Monocular Augmented Reality Display Under Various Occlusion Conditions for Manipulation Tasks with Virtual Instructions . . . . . . . . . . . . . . . . . . . . . . . 490
Ming Qian, John Nicholson, and Erin Wang

Cybersickness and Postural Sway Using HMD Orientation. . . . . . . . . . . . . . 500


Lisa Rebenitsch and Breanna Quinby

Author Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 511


VAMR and Robots
Design of Virtual Reality for Humanoid
Robots with Inspiration
from Video Games

Jordan Allspaw, Lilia Heinold, and Holly A. Yanco(B)

Computer Science Department, University of Massachusetts Lowell,


Lowell, MA 01854, USA
Jordan_Allspaw@uml.edu, Lilia_Heinold@student.uml.edu, holly@cs.uml.edu

Abstract. Advances in robotics have led to breakthroughs in several


areas, including the development of humanoid robots. There are now
several different models of humanoid robots available, but operating
them remains a difficult challenge. Current operator control interfaces
for humanoid robots often require very experienced operators and sig-
nificant amounts of time for planning. A large amount of the planning
and cognitive load is attributable to the operator attempting to gain
adequate three-dimensional (3D) situation awareness and task aware-
ness while viewing an interface on a flat, two-dimensional (2D) screen.
Virtual reality (VR) has enormous potential to provide benefits to allow
the operator to quickly and accurately understand the state of a robot
in a scanned 3D environment and to issue accurate commands with less
cognitive load. In the gaming sphere, VR headsets remain a new and
promising interface for playing video games. In some cases, existing video
games are being ported over to VR and, in others, brand new games are
being designed with VR in mind. Control schemes and best practices for
VR are emerging within the video game industry. This paper aims to
leverage their lessons learned and to apply them to the teleoperation of
humanoid robots.

Keywords: Virtual reality · Video games · Robotics ·


Humanoid robotics · Human-robot interaction

1 Introduction
Over the past decade, there has been an increase in the development and use
of humanoid robots, just as virtual reality (VR) headsets have become more
commercially available with improved capabilities. However, VR has not been
used extensively with humanoid robots, although it seems that an immersive
view could help with the operation of such systems.
Our team performed an analysis of the human-robot interaction (HRI) tech-
niques used at the DARPA Robotics Challenge (DRC) Trials [19] and the DRC
Finals [9], both of which had a large collection of humanoid robots used by the
c Springer Nature Switzerland AG 2019
J. Y. C. Chen and G. Fragomeni (Eds.): HCII 2019, LNCS 11575, pp. 3–18, 2019.
https://doi.org/10.1007/978-3-030-21565-1_1

participating teams. Both of these studies examined approaches taken by the


teams for interaction design, such as the number of operators, types of control,
how information was conveyed to the operator(s), and varying approaches to
semi-autonomous task execution. There were a number of strategies employed
by the various teams, but teams predominately relied on a combination of mouse,
keyboard, and gamepad interfaces.
The only known uses of VR in the DRC Finals were two teams – of the 23 in
the competition – utilizing the Oculus Rift Developer Kit (DK) [10] for viewing
point clouds. This technique was used as a situation awareness complement to
viewing point clouds through traditional means (i.e., on a computer monitor).
No teams used VR as their primary interface for either visualization or control. A
discussion with a member of one of the teams revealed that they had investigated
using the Oculus Rift DK for control but found it to be of limited use and did
not end up using it at the DRC Finals.
However, there is a requirement to visualize a great deal of sensor data when
teleoperating, or even supervising, a humanoid robot in complex environments.
For the visualization of sensor data in the DRC Finals [9], the teams in the
study averaged 5.1 active screens, 6.2 camera views, and 2.5 different point cloud
visualizations. There were an average of 1.6 active operators (using an input
device for robot control) and 2.8 passive operators (watching over the shoulders
of active operators, offering strategic advice), representing a large amount of
manpower towards the goal of allowing the operator to gain proper situation
and task awareness of the remote environment by interpreting sensor data from
a 2D interface (the screen) and building a 3D mental model.
Another interesting and common trait was that teams frequently had dif-
ferent visualization and control paradigms depending on the task they were
performing. The analysis concluded that this implied that each team spent a
significant amount of effort creating a specialized interface for each task, in order
to provide enough situation awareness to allow the operator(s) to complete the
task within the allotted time.
Much of the work on both the interface design and the operator task load was
centered around allowing the operator to gain a proper level of task and situation
awareness by interpreting data from the robot, which allowed the operator to
give the robot commands in an effective way. This need is why many interfaces
had multiple camera views. However, with modern sensors, there is potential
for a VR interface to greatly reduce the difficulty of this problem. For example,
depth cameras such as the Kinect or Xtion Pro generate 3D point clouds that
can be easily rendered inside a virtual world.
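The back-projection such sensors perform can be sketched in a few lines. The sketch below assumes a pinhole camera model with intrinsics fx, fy, cx, cy (the calibration values a Kinect- or Xtion-style driver typically reports); it is illustrative, not any particular driver's implementation:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) to 3D points via the
    pinhole camera model used by Kinect/Xtion-style sensors."""
    h, w = depth.shape
    # Pixel coordinate grids: u varies along columns, v along rows.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # discard pixels with no depth return
```

The resulting N x 3 array can be handed to a VR engine's point-cloud renderer and redrawn each frame as new scans arrive.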
In previous work, we proposed an interface where the operator would use a
VR headset to teleoperate a humanoid robot [1]. However, in the time since we
initially designed our VR interface, a large number of video games have been
released for play with VR headsets. Now that VR games are more commonplace,
and there are many popular and well tested games available, we are interested in
analyzing various VR control techniques used in video games to discover what
lessons can be learned.

In this paper, we examine a sample of popular VR video games, analyzing


their approach to controls and data visualization. We then categorize common
control techniques of the VR games in our sample and discuss how those tech-
niques could be adapted for controlling a humanoid robot. Some of the techniques
used in video games will likely not be applicable to robot teleoperation due to the
different assumptions and requirements of the domains. However, there should
still be enough overlap for improvements to be made. We propose an altered
interface from our original design in [1], as well as a series of studies to compare
the two interfaces.

2 Virtual Reality in Video Game Design


We conducted a survey of current VR video games by reviewing online man-
uals and watching gameplay on YouTube in order to classify the methods for
controlling different aspects of the game. If an online manual was not available,
the game was omitted from our survey. Our survey originally included twenty-
one VR games1, of which we found game manuals for fourteen2. These fourteen
games are included in the analysis discussed in this section, with a summary
presented in Table 1.

2.1 Use of First vs Third Person

In all of the VR video games that we surveyed, first person was the primary
method of interaction between the user and the world. In one game, Job Sim-
ulator, the user has the option to move the camera from a first person view to
an over the shoulder third person view. However, even when in third person, the
user controls the character as if they were still in a first person view, i.e., looking
out from the character, not from its side. In all of the multiplayer games that
we surveyed, the user could see other players in third person but their character
stayed in a first person perspective throughout the game.
When in first person, the user could always see some indication of the location
of their hands, usually as gloves on the screen, but sometimes the controllers
themselves; however, they could not see the rest of their character.

1 Job Simulator, Star Trek Bridge Crew, Fantastic Contraption, The Lab, Skyrim VR,
Fallout 4 VR, Obduction, Subnautica, Rick and Morty: Virtual Rick-ality, The Talos
Principle VR, Anshar Wars 2, Settlers of Catan, L.A. Noire: The VR Case Files,
Budget Cuts, Arizona Sunshine, Onward, OrbusVR, Space Pirate Trainer, X-Plane
11, IL2 Battle of Stalingrad, and Eve Valkyrie.
2 Job Simulator, Star Trek Bridge Crew, Fantastic Contraption, Skyrim VR, Fallout 4
VR, Obduction, Subnautica, Rick and Morty: Virtual Rick-ality, The Talos Principle
VR, L.A. Noire: The VR Case Files, Arizona Sunshine, Onward, OrbusVR, and X-
Plane 11.

Table 1. Characteristics of the VR games surveyed in this paper.

(The rotated column headings and check-mark grid of Table 1 did not survive text extraction. The rows are the fourteen surveyed games, listed below; the columns cover the characteristics discussed in Sects. 2.1–2.5: first/third person perspective, movement mode (room-scale, standing, seated, teleport, joystick walking), grip-button manipulation, menu style, and HUD type.)
Job Simulator
Star Trek Bridge Crew
Fantastic Contraption
Skyrim VR
Fallout 4 VR
Obduction
Subnautica
Rick and Morty: Virtual Rickality
The Talos Principle VR
L.A. Noire: The VR Case Files
Arizona Sunshine
Onward
OrbusVR
X-Plane 11

2.2 Movement
The second area we examined to categorize the games is movement: how the
game allows the player to navigate their character within the virtual world. This
includes movement of the player’s camera, as well as of the player’s character
if they are separate. On the gaming platform Steam, VR games are categorized
in three ways: Room-scale, Standing, and Seated [7]. In most of the games that
we surveyed (71%), the user could choose which method they used. This choice
allows the user base to be as wide as possible, allowing each user to choose the
method with which they are most comfortable [11]. The terms are defined as
follows:
– Room-scale: The user moves within a dedicated area in order to move their
character. A chaperone is used to let the user know that they have reached
the boundaries of the tracked area in the real world. To leave the dedicated
area in the game, the user must use an alternate method for movement, such
as joystick or teleportation [3].
– Standing: The user is standing but stationary. Standing allows for some
lateral movement, but the user must stay within a much smaller area than
room-scale [15].
– Seated: As with standing, the user is stationary. Because the user is seated,
there is less room to move, but the motion controllers usually still track the
user’s hands [15].
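The chaperone behavior described for room-scale mode is essentially a boundary-proximity test. A minimal sketch, assuming a square tracked area (the half-extent and margin values are illustrative, not from any particular headset runtime):

```python
def near_chaperone(head_xz, half_extent=1.5, margin=0.3):
    """Return True when the user's head position (x, z on the floor
    plane) is within `margin` meters of the edge of a square tracked
    area of the given half-extent, i.e. when the chaperone grid
    should fade in to warn the user."""
    return any(abs(c) > half_extent - margin for c in head_xz)
```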
Of the games we surveyed, 71% had the option of room-scale mode. In three
of these (21%), the game was limited to a small area, such as a room, and thus
did not have an open world that the user could roam freely. This limitation
allowed the user to not worry about moving outside the room-scale area and the
need for combining movement methods. The games with an open world (43%)

required that the user use a joystick or teleportation to get around outside of
the small square in which they were standing. This method allowed the user to
bend to pick up objects and to imitate using a weapon such as a gun or bow
and arrow.
Eight (57%) of the games used teleportation as one of their movement meth-
ods. The user would point one of their joysticks at a location and press a button,
then a line and circle would appear indicating the exact location to which they
would teleport. When the user would release the button on their joystick, the
character would teleport to the specified location. This teleportation allows the
user to move around quickly in the VR world without moving in real life. Two
games (Rick and Morty, Star Trek Bridge Crew) that did not technically include
point to teleport instead allowed the user to move instantly between designated
locations, by either using portals or by pressing a button on a menu.
Nine (64%) of the games used joystick control (also known as smooth loco-
motion) for movement, where the user is stationary and uses a button on the
joystick for walking, similar to how a user would use the arrow keys on a com-
puter. Most VR controllers have a trackpad for this purpose. Six of these nine
games also had the option to teleport. These games let the user choose their pre-
ferred method for movement, mostly because smooth locomotion causes nausea
for many users, while teleportation has been reported to reduce nausea compared
to smooth locomotion [3].
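The point-and-release teleport described above reduces to a ray-ground intersection: the marker circle is drawn where the controller's ray meets the floor. A minimal sketch, assuming a flat floor and a straight pointing ray (many games instead render a ballistic arc):

```python
import numpy as np

def teleport_target(origin, direction, ground_y=0.0):
    """Return the landing point of the controller's pointing ray on
    the ground plane (where the marker circle is drawn), or None when
    the user is not pointing below the horizon."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    direction = direction / np.linalg.norm(direction)
    if direction[1] >= 0.0:  # pointing level or upward: no valid target
        return None
    t = (ground_y - origin[1]) / direction[1]
    return origin + t * direction
```

On button release, the character's position is simply set to the returned point, which is what makes the technique comfortable: no intermediate motion is ever displayed.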
One of the games had no movement (Star Trek Bridge Crew). This game
also only allowed the user to be seated, and the character would be sitting at a
station. By moving their hands, the user could select various buttons for their
character and switch stations. Similarly, another game (X-Plane 11) based on
fighter planes allowed the user only to teleport outside of the plane, but, inside,
the user would use their hands to press buttons to control the plane and would
not move in real life.

2.3 Manipulation
All of the games used the joystick buttons to control manipulation. There were
two primary methods for allowing a user to pick up an object. The first, similar
to many PC video games, would pop up a menu on the screen when the user was
in proximity of an object, indicating which button the user should press to use
the object. The second was more VR specific, allowing the user to point their
joystick at an object, using a laser pointer to show what object the user wanted
to select. Upon release of a joystick button, the player would grab the object.
In 50% of the games, the user had to hold down a button to hold an object.
When the user released the button, the character released the object as well.
The Vive controller has a dedicated button for this, called the Grip button.

2.4 User Interface


Twelve (86%) of the games used a button to bring up a menu in the game. Seven
of these allowed the user to point and click to select a menu button, where the

user points the controller at a button and a laser appears. The button would
also change appearance (e.g., light up, change size, etc.) similar to how buttons
change appearance on a computer when you hover over them with a mouse.
Just one game (Star Trek Bridge Crew) used button selection, both on menus
and in the game, as one would in real life. The user moved their hand to hover
over the correct button in VR, then used a button to select it. This game had the
buttons laid out in front of the user like a console, instead of the more traditional
menu hovering in front of the user, in order to simulate the look from Star Trek.
The games that did not include the point and click method had the user
click buttons on the controller to select options. This method is more similar
to a traditional console game where the interface would indicate which button
corresponded to which option, and allowed the user to traverse the list with a
joystick or arrow button.

2.5 Heads Up Display (HUD)


A Heads Up Display (HUD) is commonly used in video games to display stats
such as health, amount of ammunition, inventory, etc. [18]. This method seems to
be less common in VR, with only three games using a permanent HUD. In two,
the HUD was simply a method for displaying information about other players
and objects (e.g., a health bar above other players or information hovering above
objects). The last of these had optional hints that would hover over objects.
A pop up HUD was marginally more common, used by five of the games.
Status bars would pop up in certain instances and locations then disappear
again. Sometimes the pop up was not in the user’s control (e.g., a health bar
when attacking an enemy, or a -X indicating decreasing health), and sometimes
the user could choose to view their status whenever they wished.
A wrist accessory HUD (i.e., the user moves their wrist as if they were looking
at a watch, allowing them to view their stats, which is sometimes combined with
a menu) was as common as a permanent HUD, used by three games. This form
of HUD is entirely up to the user as to when they wish to view it.

3 Potential Application of VR Game Design to Robotics


While VR game design has the advantage of many more hours of testing with
many more users than the typical robot interface, the video game world is not
always applicable to robotics. In this section, we discuss the VR game control
paradigms from popular video games, described above, in the context of robotics,
to determine which ones will be most applicable when designing a VR interface
for a robot system.

3.1 How Video Games Differ from Robotics


The biggest difference that needs to be considered when comparing VR for
robotics and for video games is that in robotics, the world that the user is

interacting with is real and not designed. In video games, the designer can bend
the rules of physics and create game specific rules that help themselves or the
user when it comes to control. For example, in a video game, sometimes the
user is allowed to clip through objects that they accidentally run into, to fall
from great heights, and to select objects that are not within grasping distance.
Many of the surveyed games allowed the user to “summon” an object when they
selected it; that is, when the user pressed the correct button, the object would
fly into the player’s hand.
Real world consequences also have to be considered when designing a VR
control system for a robot. With direct control in video games (i.e., when the
user is holding the motion controllers in their hands), it does not matter if the
user accidentally moves their arm in the wrong direction or needs to make a
strange motion to bring up a menu. While controlling a robot, a designer must
account for the fact that if a certain motion will bring up a menu, for example,
the robot will also attempt to make that motion. This design can be problematic
when the motion is impossible for the robot or the robot must hold a certain
position.
It is also worth noting that, in a video game, any object or obstacle the
user comes across is there by design. The user will not come across anything
unexpected (that is, not expected by the designer), so in video games it is much
easier to account for all possibilities that could occur while the user is playing.
This is definitely not true in the real world where robots must operate.

3.2 Applying Common Video Game VR Techniques to Robotics


While the VR control used in video games is not directly applicable to robotics,
design guidance can still be drawn from it, especially when it comes to what
is most comfortable and intuitive for a user. For example, no game surveyed
was exclusively in third person and very few games had the option of third
person view. This indicates that in VR, the most intuitive method of control is
first person – that is, you are the character. It follows that when designing VR
control for a humanoid robot, the user should be able to “be” the robot and see
through its eyes. We do believe, however, that third person views will also have
relevance to human-robot interaction.
Methods for movement in the games were relatively evenly distributed. Most
games offered more than one method so that users could choose a method with
which they were comfortable. This provision of multiple movement methods also
accounts for players with different gaming setups (e.g., some users cannot afford
a full room-scale VR setup or do not have the room for it) and expands the
possible user base of the game. Games with room-scale also often combine one-
to-one motion, where the character moves exactly with the user, along with
locomotion with a joystick or teleportation. This design was usually to account
for the fact that even in room-scale, the player has a limited space in which to
move. A similar problem is encountered in robotics: even if the operator has a
room-scale setup, the robot may have more space to move than the operator.
One way to tackle this problem is to allow the user to directly control the robot

for smaller motions (e.g., grabbing, reaching, and limited lateral motion) and
leave walking to a point and click method. Another way of tackling the limited
space in which the operator has to move is to use an omnidirectional treadmill
[3], but none of the video games surveyed used this method. This design choice
could be attributed to the fact that such a treadmill is not accessible to most gamers
at home and thus is primarily used in arcade setups.
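The hybrid scheme suggested above (direct one-to-one control for nearby motions, point-and-click walking beyond) can be sketched as a simple dispatcher. The reach radius is an assumed robot-specific constant, not a value from any particular platform:

```python
import numpy as np

REACH_RADIUS = 0.8  # meters reachable without stepping (assumed value)

def route_command(robot_pos, clicked_target):
    """Stream one-to-one end-effector motion for targets the robot can
    reach from its current stance; otherwise issue a walk-to command
    so a footstep planner moves the robot toward the clicked point."""
    robot_pos = np.asarray(robot_pos, dtype=float)
    clicked_target = np.asarray(clicked_target, dtype=float)
    if np.linalg.norm(clicked_target - robot_pos) <= REACH_RADIUS:
        return ("reach", clicked_target)
    return ("walk", clicked_target)
```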
In the video games that we surveyed, it was extremely common that while the
user was holding motion controllers, the game showed a pair of hands instead.
When the user pressed a button to grab, the hand would perform a grabbing
motion. Since all of the games surveyed used motion controllers to track the
users’ movements, the users could not physically move, grab, and point with
their hands and had to correlate certain buttons to hand movements. A possible
application to robotics is that a user could control the robot’s hands by using
motion controllers, and the buttons could be mapped to preprogrammed hand
motions performed by the robot.
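Mapping buttons to preprogrammed hand motions amounts to a lookup from button events to stored joint configurations. The primitive names and joint angles below are purely illustrative, not taken from any specific robot hand:

```python
# Hypothetical grasp primitives: named finger-joint configurations
# (radians) for a robot hand. Values are illustrative only.
GRASP_PRIMITIVES = {
    "trigger": {"fingers": 1.6, "thumb": 1.2},  # power grasp (closed)
    "grip":    {"fingers": 0.9, "thumb": 0.4},  # pinch
    "release": {"fingers": 0.0, "thumb": 0.0},  # open hand
}

def hand_command(button):
    """Map a controller button press to a preprogrammed hand motion."""
    if button not in GRASP_PRIMITIVES:
        raise ValueError(f"no primitive bound to button {button!r}")
    return GRASP_PRIMITIVES[button]
```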
In terms of manipulating objects, every video game surveyed which allowed
objects to be grabbed used a button on the controller to select the object of
interest. It was more evenly distributed which games chose to have the user hold
down the “grip button” to keep holding an object or to release the object by
tapping a button. Similar to movement, many games also offered settings for
object manipulation. The user could choose which method was most intuitive to
them, allowing the game to access a larger user base.
Menu access seemed to be tackled in very similar ways across all of the
games. The user would press a designated button on the controller to see the
menu, then would either use a point and click method of selection or use more
buttons on their controller to select. This design can be applied directly to a VR
robotics setup, where an operator could use a designated button to access an
in-VR menu. This design would account for the fact that current 2D interfaces
have many settings and buttons around the view of the robot that would need
to be accessed in VR through a menu. A point and click method of selection
could also be used, since direct movement could be disabled during menu access.
Most of the games that we surveyed did not include a HUD, but this can
be attributed to the fact that many did not need one. Some of the methods of
HUD used (e.g., hovering hints and information above other players) would not
be applicable to robotics. Since HUDs in video games are usually used to display
stats (e.g., health, amount of ammunition, and inventory), it could be used for
something similar in robotics. Some applicable statistics to display for robots
could include battery power, joint states, and other state of health information.
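Such a robot HUD is just a telemetry snapshot rendered as short text lines, mirroring the stat displays (health, ammunition) of the surveyed games. A sketch with assumed field names:

```python
from dataclasses import dataclass

@dataclass
class RobotStatus:
    """Assumed state-of-health fields for a teleoperated humanoid."""
    battery_pct: float
    joint_faults: int
    comms_latency_ms: float

def hud_lines(status):
    """Format telemetry as the short lines a wrist-style HUD panel
    would display when the operator glances at it."""
    return [
        f"BAT {status.battery_pct:.0f}%",
        f"JOINT FAULTS {status.joint_faults}",
        f"LAT {status.comms_latency_ms:.0f} ms",
    ]
```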

4 Initial Virtual Reality Interface


Our initial interface [1] was designed before VR video games were the norm and
while VR best practices were still being developed. Therefore, when designing
our interface, we looked to existing 2D interfaces for inspiration. We particularly
looked at interfaces used during the DRC, due to the wealth of information
available and the fact that the majority of competing robots were humanoid.
when the head is retracted, but which help to prop and support it
when extended for feeding or other purposes. These pieces are
called the cervical sclerites or plates. They are very largely
developed in Hymenoptera, in many Coleoptera, and in Blattidæ,
and have not yet received from anatomists a sufficient amount of
attention. Huxley suggested that they may be portions of head
segments.

Fig. 54.—Extended head and front of thorax of a beetle, Euchroma: a,
back of head; b, front of pronotum; c, chitinous retractile band; d,
cervical sclerites.

Thorax.

The thorax, being composed of the three consecutive rings behind
the head, falls naturally into three divisions—pro-, meso-, and
metathorax. These three segments differ greatly in their relative
proportions in different Insects, and in different stages of the same
Insect's life. In their more highly developed conditions each of the
three divisions is of complex structure, and the sclerites of which it is
externally made up are sufficiently constant in their numbers and
relative positions to permit of their identification in a vast number of
cases; hence the sclerites have received names, and their
nomenclature is of practical importance, because some, if not all, of
these parts are made use of in the classification of Insects. Each
division of the thorax has an upper region, called synonymically
dorsum, notum, or tergum; an inferior or ventral region, called
sternum; and on each side a lateral region, the pleuron. These
regions of each of the three thoracic divisions are further
distinguished by joining to their name an indication of the segment
spoken of, in the form of the prefixes pro-, meso-, and meta-; thus
the pronotum, prosternum, and propleura make up the prothorax.
The thoracic regions are each made up of sclerites whose
nomenclature is due to Audouin.[21] He considered that every
thoracic ring is composed of the pieces shown in Fig. 55, viz. (1) the
sternum (B', a), an unpaired ventral piece; (2) the notum (A),
composed of four pieces placed in consecutive longitudinal order
(A'), and named praescutum (a), scutum (b), scutellum (c), and
post-scutellum (d); (3) lateral pieces, of which he distinguished on each
side an episternum (B', c), epimeron (e), and parapteron (d), these
together forming the pleuron. We give Audouin's Figure, but we
cannot enter on a full discussion of his views as to the thorax; they
have become widely known, though the constancy of the parts is not
so great as he supposed it would prove to be. Sometimes it is
impossible to find all the elements he thought should be present in a
thoracic ring, while in other cases too many sclerites exist. As a rule
the notum of the meso- and metathoraces is in greater part
composed of two pieces, the scutum and the scutellum; while in the
pronotum only one dorsal piece can be satisfactorily distinguished,
though a study of the development may show that really two are
frequently, if not usually, present. On the other hand, one, or more, of
the notal sclerites in some cases shows evidence of longitudinal
division along the middle. The sternum or ventral piece, though
varying greatly in form, is the most constant element of a thoracic
segment, but it has sometimes the appearance of consisting of two
parts, an anterior and a posterior. The pleuron nearly always
consists quite evidently of two parts, the episternum, the more
anterior and inferior, and the epimeron.[22] The relations between
these two parts vary much; in some cases the episternum is
conspicuously the more anterior, while in others the epimeron is
placed much above it, and may extend nearly as far forwards as it. It
may be said, as a rule, that when the sternum extends farther
backwards than the notum, the epimeron is above the episternum,
as in many Coleoptera; but if the sternum be anterior to the notum,
then the episternum is superior to the epimeron, as in dragon-flies.
We would here again reiterate the fact that these "pieces" are really
not separate parts, but are more or less indurated portions of a
continuous integument, which is frequently entirely occupied by
them; hence a portion of a sclerite that in one species is hard, may in
an allied form be wholly or partly membranous, and in such case its
delimitation may be very evident on some of its sides, and quite
obscure on another.

Fig. 55.—Mesothorax of Dytiscus, after Audouin. A, notum; A', pieces
of the notum separated: a, praescutum; b, scutum; c, scutellum; d,
post-scutellum: B, the sternum and pleura united; B', their parts
separated: a, sternum; c, episternum; d, parapteron; e, epimeron.

The parapteron of Audouin does not appear to be really a distinct
portion of the pleuron; in the case of Dytiscus it is apparently merely
a thickening of an edge. Audouin supposed this part to be specially
connected with the wing-articulation, and the term has been
subsequently used by other writers in connexion with several little
pieces that exist in the pleural region of winged Insects.

The prothorax is even more subject to variation in its development
than the other divisions of the thorax are. In the Hymenoptera the
prosternum is disconnected from the pronotum and is capable,
together with the first pair of legs, of movement independent of its
corresponding dorsal part, the pronotum, which in this Order is
always more or less completely united with the meso-thorax; in the
Diptera the rule is that the three thoracic segments are closely
consolidated into one mass. In the majority of Insects the prothorax
is comparatively free, that is to say, it is not so closely united with the
other two thoracic segments as they are with one another. The three
thoracic rings are seen in a comparatively uniform state of
development in a great number of larvae; also in the adult stages of
some Aptera, and among winged insects in some Neuroptera such
as the Embiidae, Termitidae, and Perlidae. In Lepidoptera the
pronotum bears a pair of erectile processes called patagia; though
frequently of moderately large size, they escape observation, being
covered with scales and usually closely adpressed to the sides of the
pronotum.

The two great divisions of the thorax—the mesothorax and the
metathorax—are usually very intimately combined in winged Insects,
and even when the prothorax is free, as in Coleoptera, these
posterior two thoracic rings are very greatly amalgamated. In the
higher forms of the Order just mentioned the mesosternum and
mesopleuron become changed in direction, and form as it were a
diaphragm closing the front of the metasternum. The meso- and
meta-thorax frequently each bear a pair of wings.

We have described briefly and figured (Fig. 55) the sclerites of the
mesothorax, and those of the metathorax correspond fairly well with
them. In addition to the sclerites usually described as constituting
these two thoracic divisions, there are some small pieces at the
bases of the wings. Jurine discriminated and named no less than
seven of these at the base of the anterior wing of a Hymenopteron.
One of them becomes of considerable size and importance in the
Order just mentioned, and seems to be articulated so as to exert
pressure on the base of the costa of the wing. This structure attains
its maximum of development in a genus (? nondescript) of Scoliidae,
as shown in Fig. 56. The best name for this sclerite seems to be that
proposed by Kirby and Spence, tegula. Some writers call it
paraptère, hypoptère, or squamule, and others have termed it
patagium; this latter name is, however, inadmissible, as it is applied
to a process of the prothorax we have already alluded to.
Fig. 56.—Head and thorax of wasp from Bogota: t, tegula; b, base of
wing.

To complete our account of the structure of the thorax it is necessary
to mention certain hard parts projecting into its interior, but of which
there is usually little or no trace externally. A large process in many
Insects projects upwards from the sternum in a forked manner. It
was called by Audouin the entothorax; some modern authors prefer
the term apophysis. Longitudinal partitions of very large size,
descending from the dorsum into the interior, also exist; these are
called phragmas, and are of great importance in some Insects with
perfect flight, such as Hymenoptera, Lepidoptera, and Diptera. There
is no phragma in connection with the pronotum, but behind this part
there may be three. A phragma has the appearance of being a fold
of the dorsum; it serves as an attachment for muscles, and may
probably be of service in other ways. More insignificant projections
into the interior are the little pieces called apodemes (Fig. 57, e);
these are placed at the sides of the thorax near the wings. The
apophyses are no doubt useful in preserving the delicate vital organs
from shocks, or from derangement by the muscular movements and
the changes of position of the body.

Fig. 57.—Transverse section of skeleton of metathorax of Goliathus
druryi, seen from behind: a, metanotum; b, metasternum; c,
phragma; d, entothorax (apophysis or furca); e, apodeme; f,
tendon of articulation. (After Kolbe.)

The appendages of the thorax are (a) inferior, the legs; (b) superior,
the wings. The legs are always six in number, and are usually
present even in larvae, though there exist many apodal larvae,
especially in Diptera. The three pairs of legs form one of the most
constant of the characters of Insects. They are jointed appendages
and consist of foot, otherwise tarsus; tibia, femur, trochanter, and
coxa; another piece, called trochantin, more or less distinctly
separated from the coxa, exists in many Insects. The legs are
prolongations of the body sac, and are in closer relation with the
epimera and with the episterna than with other parts of the crust,
though they have a close relation with the sternum. If we look at the
body and leg of a neuropterous Insect (Fig. 58) we see that the basal
part of the leg—the coxa—is apparently a continuation of one of the
two pleural pieces or of both; in the latter case one of the prolonged
pieces forms the coxa proper, and the tip of the other forms a
supporting piece, which may possibly be the homologue of the
trochantin of some Insects. In some Orthoptera, especially in
Blattidae, and in Termitidae, there is a transverse chitinised fold
interposed between the sternum and the coxa, and this has the
appearance of being the same piece as the trochantin of the anterior
legs of Coleoptera.

Fig. 58.—Hind leg of Panorpa: a, episternum; a′, epimeron; b, coxa; b′,
coxal fold of epimeron; c, trochanter; d, femur; e, tibia; f, tarsus.

Beyond the coxa comes the trochanter; this in many Hymenoptera is
a double piece, though in other Insects it is single; usually it is the
most insignificant part of the leg. The femur is, on the whole, the
least variable part of the leg; the tibia, which follows it, being
frequently highly modified for industrial or other purposes. The joint
between the femur and the tibia is usually bent, and is therefore the
most conspicuous one in the leg; it is called the knee. The other
joints have not corresponding names, though that between the tibia
and the tarsus is of great importance. The spines at the tip of the
tibia, projecting beyond it, are called spurs, or calcares. The tarsus
or foot is extremely variable; it is very rarely absent, but may consist
of only one piece—joint, as it is frequently called[23]—or of any larger
number up to five, which may be considered the characteristic
number in the higher Insect forms. The terminal joint of the tarsus
bears normally a pair of claws; between the claws there is frequently
a lobe or process, according to circumstances very varied in different
Insects, called empodium, arolium, palmula, plantula,
pseudonychium, or pulvillus. This latter name should only be used in
those cases in which the sole of the foot is covered with a dense
pubescence. The form of the individual tarsal joints and the armature
or vestiture of the lower surface are highly variable. The most
remarkable tarsus is that found on the front foot of the male
Dytiscus.

It has been suggested that the claws and the terminal appendage of
the tarsus ought to be counted as forming a distinct joint; hence
some authors state that the higher Insects have six joints to the feet.
These parts, however, are never counted as separate joints by
systematic entomologists, and it has recently been stated that they
are not such originally.

The parts of the foot at the extremity of the last tarsal joint proper are
of great importance to the creature, and vary greatly in different
Insects. The most constant part of this apparatus is a pair of claws,
or a single claw. Between the two claws there may exist the
additional apparatus referred to above. This in some Insects—
notably in the Diptera—reaches a very complex development. We
figure these structures in Pelopaeus spinolae, a fossorial
Hymenopteron, remarking that our figures exhibit the apparatus in a
state of retraction (Fig. 59). According to the nomenclature of Dahl
and Ockler[24] the plate (b) on the dorsal aspect is the pressure plate
(Druck-Platte), and acts as an agent of pressure on the sole of the
pad (C, e); c and d on the underside are considered to be extension-
agents; c, extension-plate; d, extension-sole (Streck-Platte, Streck-
Sohle). These agents are assisted in acting on the pad by means of
an elastic bow placed in the interior of the latter. The pad (e) is a
very remarkable structure, capable of much extension and retraction;
when extended it is seen that the pressure plate is bent twice at a
right angle so as to form a step, the distal part of which runs along
the upper face of the basal part of the pad; the apical portion of this
latter consists of two large lobes, which in repose, as shown in our
Figure (f), fall back on the pad, something in the fashion of the
retracted claws of the cat, and conceal the pressure-plate.

The mode in which Insects are able to walk on smooth perpendicular
surfaces has been much discussed, and it appears highly probable
that the method by which this is accomplished is the exudation of
moisture from the foot; there is still, however, much to be ascertained
before the process can be satisfactorily comprehended. The theory
to the effect that the method is the pressure of the atmosphere
acting on the foot when the sole is in perfect apposition with the
object walked on, or when a slight vacuum is created between the
two, has apparently less to support it.

Fig. 59.—Foot of Pelopaeus, a fossorial wasp: A, tarsus entire; B,
terminal joint, upper side; C, under side. a, claw; b, base of
pressure-plate; c, extension-plate; d, extension-sole; e, pad; f,
lobe of pad retracted.

The legs of the young Insect are usually more simple than those of
the adult, and in caterpillars they are short appendages, and only
imperfectly jointed. If a young larva, with feet, of a beetle, such as
Crioceris asparagi be examined, it may be seen that the leg is
formed by protuberance of the integument, which becomes divided
into parts by simple creases; an observation suggesting that the
more highly developed jointed leg is formed in a similar manner. This
appears to be really the case, for the actual continuity of the limb at
the chief joint—the knee—can be demonstrated in many Insects by
splitting the outer integument longitudinally and then pulling the
pieces a little apart; while in other cases even this is not necessary,
the knee along its inner face being membranous to a considerable
extent, and the membrane continuous from femur to tibia.

Turning to the wings, we remark that there may be one or two pairs
of these appendages. When there is but one pair it is nearly always
mesothoracic, when there are two pairs one is invariably
mesothoracic, the other metathoracic. The situation of the wing is
always at the edge of the notum, but the attachment varies in other
respects. It may be limited to a small spot, and this is usually the
case with the anterior wing; or the attachment may extend for a
considerable distance along the edge of the notum, a condition
which frequently occurs, especially in the case of the posterior
wings. The actual connexion of the wings with the thorax takes place
by means of strong horny lines in them which come into very close
relation with the little pieces in the thorax which we have already
described, and which were styled by Audouin articulatory apodemes.
There is extreme variety in the size, form, texture, and clothing of the
wings, but there is so much resemblance in general characters
amongst the members of each one of the Orders, that it is usually
possible for an expert, seeing only a wing, to say with certainty what
Order of Insects its possessor belonged to. We shall allude to these
characters in treating of the Orders of Insects.

Each wing consists of two layers, an upper and a lower, and
between them there may be tracheae and other structures,
especially obvious when the wings are newly developed. It has been
shown by Hagen that the two layers can be separated when the
wings are recently formed, and it is then seen that each layer is
traversed by lines of harder matter, the nervures. These ribs are
frequently called wing-veins, or nerves, but as they have no relation
to the anatomical structures bearing those names, it is better to
make use of the term nervures. The strength, number, form and
inter-relations of these nervures vary exceedingly; they are thus
most important aids in the classification of Insects. Hence various
efforts have been made to establish a system of nomenclature that
shall be uniform throughout the different Orders, but at present
success has not attended these efforts, and it is probable that no
real homology exists between the nervures of the different Orders of
Insects. We shall not therefore discuss the question here. We may,
however, mention that German savants have recently distinguished
two forms of nervures which they consider essentially distinct, viz.
convex and concave. These, to some extent, alternate with one
another, but a fork given off by a convex one is not considered to be
a concave one. The terms convex and concave are not happily
chosen; they do not refer to the shape of the nervures, but appear to
have been suggested by the fact that the surface of the wing being
somewhat undulating the convex veins more usually run along the
ridges, the concave veins along the depressions. The convex are the
more important of the two, being the stronger, and more closely
connected with the articulation of the wing.

The wings, broadly speaking, may be said to be three-margined: the
margin that is anterior when the wings are extended is called the
costa, and the edge that is then most distant from the body is the
outer margin, while the limit that lies along the body when the wings
are closed is the inner margin.

The only great Order of Insects provided with a single pair of wings
is the Diptera, and in these the metathorax possesses, instead of
wings, a pair of little capitate bodies called halteres or poisers. In the
abnormal Strepsiptera, where a large pair of wings is placed on the
metathorax, there are on the mesothorax some small appendages
that are considered to represent the anterior wings. In the great
Order Coleoptera, or beetles, the anterior wings are replaced by a
pair of horny sheaths that close together over the back of the Insect,
concealing the hind-wings, so that the beetle looks like a wingless
Insect: in other four-winged Insects it is usually the front wings that
are most useful in flight, but the elytra, as these parts are called in
Coleoptera, take no active part in flight, and it has been recently
suggested by Hoffbauer[25] that they are not the homologues of the
front wings, but of the tegulae (see Fig. 56) of other Insects. In the
Orthoptera the front wings also differ in consistence from the other
pair over which they lie in repose, and are called tegmina. There are
many Insects in which the wings exist in a more or less rudimentary
or vestigial condition, though they are never used for purposes of
flight.

The abdomen, or hind body, is the least modified part of the body,
though some of the numerous rings of which it is composed may be
extremely altered from the usual simple form. Such change takes
place at its two extremities, but usually to a much greater extent at
the distal extremity than at the base. This latter part is attached to
the thorax, and it is a curious fact that in many Insects the base of
the abdomen is so closely connected with the thorax that it has all
the appearance of being a portion of this latter division of the body;
indeed it is sometimes difficult to trace the real division between the
two parts. In such cases a further differentiation may occur, and the
part of the abdomen that on its anterior aspect is intimately attached
to the thorax may on its posterior aspect be very slightly connected
with the rest of the abdomen. Under such circumstances it is difficult
at first sight to recognise the real state of the case. When a segment
is thus transferred from the abdomen to the metathorax, the part is
called a median segment. The most remarkable median segment
exists in those Hymenoptera which have a stalked abdomen, but a
similar though less perfect condition exists in many Insects. When
such a union occurs, it is usually most complete on the dorsal
surface, and the first ventral plate may almost totally disappear: such
an alteration may involve a certain amount of change in the sclerites
of the next segment, so that the morphological determination of the
parts at the back of the thorax and front of the abdomen is by no
means a simple matter. A highly modified hind-body exists in the
higher ants, Myrmicidae. In Fig. 60 we contrast the simple abdomen
of Japyx with the highly modified state of the same part in an ant.
Fig. 60.—Simple abdomen of Japyx (A) contrasted with the highly
modified one of an ant, Cryptocerus (B). The segments are
numbered from before backwards.

Unlike the head and thorax, the abdomen is so loosely knitted
together that it can undergo much expansion and contraction. This is
facilitated by an imbricated arrangement of the plates, and by their
being connected by means of membranes admitting of much
movement (Fig. 47, m, p. 88). In order to understand the structure of
the abdomen it should be studied in its most distended state; it is
then seen that there is a dorsal and a ventral hard plate to each ring,
and there is also usually a stigma; there may be foldings or plications
near the line of junction of the dorsal and ventral plates, but these
margins are not really distinct pieces. The pleura, in fact, remain
membranous in the abdominal region, contrasting strongly with the
condition of these parts in the thorax. The proportions of the plates
vary greatly; sometimes the ventral are very large in proportion to the
dorsal, as is usually the case in Coleoptera, while in the Orthoptera
the reverse condition prevails.

Cerci or other appendages frequently exist at the extremity of the
abdomen (Fig. 47, n, p. 88); the former are sometimes like antennae,
while in other cases they may be short compressed processes
consisting of very few joints. The females of many Insects possess
saws or piercing instruments concealed within the apical part of the
abdomen; in other cases an elongate exserted organ, called
ovipositor, used for placing the eggs in suitable positions, is present.
Such organs consist, it is thought, either of modified appendages,
called gonapophyses, or of dorsal, ventral, or pleural plates. The
males frequently bear within the extremity of the body a more or less
complicated apparatus called the genital armour. The term
gonapophysis is at present a vague one, including stings, some
ovipositors, portions of male copulatory apparatus, or other
structures, of which the origin is more or less obscure.

The caterpillar, or larva, of the Lepidoptera and some other Insects,
bears a greater number of legs than the three pairs we have
mentioned as being the normal number in Insects, but the posterior
feet are in this case very different from the anterior, and are called
false legs or prolegs. These prolegs, which are placed on the hind
body, bear a series of hooks in Lepidopterous larvae, but the
analogous structures of Sawfly larvae are destitute of such hooks.

Placed along the sides of the body, usually quite visible in the larva,
but more or less concealed in the perfect Insect, are little apertures
for the admittance of air to the respiratory system. They are called
spiracles or stigmata. There is extreme variety in their structure and
size; the largest and most remarkable are found on the prothorax of
Coleoptera, especially in the groups Copridae and Cerambycidae.

The exact position of the stigmata varies greatly, as does also their
number. In the Order Aptera there may be none, while the maximum
number of eleven pairs is said by Grassi[26] to be attained in Japyx
solifugus: in no other Insect have more than ten pairs been recorded,
and this number is comparatively rare. Both position and number
frequently differ in the early and later stages of the same Insect. The
structure of the stigmata is quite as inconstant as the other points we
have mentioned are.

Fig. 61.—Membranous space between pro- and meso-thoraces of a
beetle Euchroma, showing stigma (st); a, hind margin of
pronotum; b, front leg; c, front margin of mesonotum; d, base of
elytra; e, mesosternum.

The admission of air to the tracheal system and its confinement
there, as well as the exclusion of foreign bodies, have to be provided
for. The control of the air within the system is, according to
Landois[27] and Krancher,[28] usually accomplished by means of an
occluding apparatus placed on the tracheal trunk a little inside of the
stigma, and in such case this latter orifice serves chiefly as a means
for preventing the intrusion of foreign bodies. The occluding
apparatus consists of muscular and mechanical parts, which differ
much in their details in different Insects. Lowne supposes that the air
is maintained in the tracheal system in a compressed condition, and
if this be so, this apparatus must be of great importance in the Insect
economy. Miall and Denny[29] state that in the anterior stigmata of
the cockroach the valves act as the occluding agents, muscles being
attached directly to the inner face of the valves, and in some other
Insects the spiracular valves appear to act partially by muscular
agency, but there are many stigmata having valves destitute of
muscles. According to Lowne[30] there exist valves in the blowfly at
the entrance to the trachea proper, and he gives the following as the
arrangement of parts for the admission of air:—there is a spiracle
leading into a chamber, the atrium, which is limited inwardly by the
occluding apparatus; and beyond this there is a second chamber, the
vestibule, separated from the tracheae proper by a valvular
arrangement. He considers that the vestibule acts as a pump to
force the air into the tracheae.

Fig. 62.—Diagrammatic Insect to explain terms of position. A, apex; B,
base: 1, tibia; 2, last abdominal segment; 3, ideal centre.

Systematic Orientation.

Terms relating to position are unfortunately used by writers on
entomology in various, even in opposite senses. Great confusion
exists as to the application of such words as base, apex, transverse,
longitudinal. We can best explain the way in which the relative
positions and directions of parts should be described by reference to
Figure 62. The spot 3 represents an imaginary centre, situated
between the thorax and abdomen, to which all the parts of the body
are supposed to be related. The Insect should always be described
as if it were in the position shown in the Figure, and the terms used
should not vary as the position is changed. The creature is placed
with ventral surface beneath, and with the appendages extended,
like the Insect itself, in a horizontal plane. In the Figure the legs are,
for clearness, made to radiate, but in the proper position the anterior
pair should be approximate in front, and the middle and hind pairs
directed backwards under the body. The legs are not to be treated as
if they were hanging from the body, though that is the position they
frequently actually assume. The right and left sides, and the upper
and lower faces (these latter are frequently also spoken of as sides),
are still to retain the same nomenclature even when the position of
the specimen is reversed. The base of an organ is that margin that is
nearest to the ideal centre, the apex that which is most distant. Thus
in Fig. 62, where 1 indicates the front tibia, the apex (A) is broader
than the base (B); in the antennae the apex is the front part, while in
the cerci the apex is the posterior part; in the last abdominal
segment (2) the base (B) is in front of the apex (A). The terms
longitudinal and transverse should always be used with reference to
the two chief axes of the body-surface; longitudinal referring to the
axis extending from before backwards, and transverse to that going
across, i.e. from side to side.

CHAPTER IV

ARRANGEMENT OF INTERNAL ORGANS–MUSCLES–NERVOUS SYSTEM–
GANGLIONIC CHAIN–BRAIN–SENSE-ORGANS–ALIMENTARY CANAL–
MALPIGHIAN TUBES–RESPIRATION–TRACHEAL SYSTEM–FUNCTION OF
RESPIRATION–BLOOD OR BLOOD-CHYLE–DORSAL VESSEL OR HEART–
FAT-BODY–OVARIES–TESTES–PARTHENOGENESIS–GLANDS.

The internal anatomy of Insects may be conveniently dealt with
under the following heads:—(1) Muscular system; (2) nervous
system; (3) alimentary system (under which may be included
secretion and excretion, about which in Insects very little is known);
(4) respiratory organs; (5) circulatory system; (6) fat-body; (7)
reproductive system.

Fig. 63.—Diagram of arrangement of some of the internal organs of an
Insect: a, mouth; b, mandible; c, pharynx; d, oesophagus; e,
salivary glands (usually extending further backwards); f, eye; g,
supra-oesophageal ganglion; h, sub-oesophageal ganglion; i,
tentorium; j, aorta; k1, k2, k3, entothorax; l1-l8, ventral nervous
chain; m, crop; n, proventriculus; o, stomach; p, Malpighian tubes;
q, small intestine; r, large intestine; s, heart; t, pericardial septum;
u, ovary composed of four egg-tubes; v, oviduct; w, spermatheca
(or an accessory gland); x, retractile ovipositor; y, cercus; z,
labrum.

Many of the anatomical structures have positions in the body that are
fairly constant throughout the class. Parts of the respiratory and
muscular systems and the fat-body occur in most of the districts of
the body. The heart is placed just below the dorsal surface; the
alimentary canal extends along the middle from the head to the end
of the body. The chief parts of the nervous system are below the
alimentary canal, except that the brain is placed above the beginning
of the canal in the head. The reproductive system extends in the
abdomen obliquely from above downwards, commencing anteriorly
