Term Paper: Big Data Analysis

Overview

Vignette: 12:30PM - 1:30PM: HUMAN INTERFACE

Use Case: The robot assistant has picked up the freshly squeezed juice from the other side of the kitchen
and has placed a glass softly down for Dr. Brown and a plastic cup for her son.

Description: This is a use case describing an artificially intelligent robot that can perform tasks on
behalf of humans, including navigating the home, handling food, interacting with other smart devices
(the juicer), and serving food in different containers to different people. Note that the use case is not
specific to the title of the vignette from which it comes, which is more about AR interfaces for humans.

Summary of Key Big Data Technology Applications


The robot juicing oranges and serving orange juice to Dr. Brown and her son utilizes several big data
technologies discussed in the Unit 1 lectures. The key technologies, each discussed in more detail in a
subsequent section, are summarized as follows:

• Data Curation and Modern Databases: There are at least two distinct sets of data needed to bring the robot orange juice use case to life: the data needed to build the robot routines and algorithms that come out of the box with the robot (or arrive through software updates), and the data about the robot's environment that it must process in real time or near real time. High-performance, column-store-based data warehouses are relevant for the former, and large-scale, high-speed in-memory processing is relevant for the latter.
• Cloud Computing and Distributed Computing Platforms: For the robots to function on a daily basis, they need internal computing and data storage capabilities as well as cloud computing capabilities. The latter is important for interacting with the robot company. Additionally, the many robots the company has sold may collectively form a distributed computing platform that supports awareness at a broader level (e.g., Where are the freshest oranges sold nearby? For this season's oranges, what is the best juicing setting?).
• Data Visualization for Humans: The data that the robot collects about orange juicing and other tasks needs to be presented to at least two groups of humans: those at the robot company monitoring, tracking, and improving robot performance, and consumers like Dr. Brown who use the robots. Both need simple and effective visualizations of what the robots are doing.
• Streaming and Sampling: The robot orange juicer needs both streaming and sampling algorithms to accomplish the fast processing its tasks require. The robot will receive a vast amount of information about oranges, but it only needs to sample a small subset of it to know which oranges to choose and how to juice them. On the other hand, the robot will receive a vast amount of spatial data about its environment, but it only needs to stream a small layer of that data to know where in the room to put the orange juice (a minimal sampling sketch follows this list).
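
As a minimal sketch of the sampling side of this, the snippet below uses reservoir sampling over a hypothetical stream of per-orange ripeness readings; the stream, the ripeness scores, and the sample size are illustrative assumptions, not part of the vignette.

```python
import random

def reservoir_sample(stream, k, seed=None):
    """Keep a uniform random sample of k items from a stream of unknown length.

    The robot never stores the full stream; it keeps only k readings,
    replacing old ones with decreasing probability as more data arrives.
    """
    rng = random.Random(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            j = rng.randint(0, i)
            if j < k:
                reservoir[j] = item
    return reservoir

# Hypothetical stream of per-orange ripeness scores from the robot's camera.
ripeness_stream = (random.uniform(0.0, 1.0) for _ in range(100_000))
sample = reservoir_sample(ripeness_stream, k=50, seed=42)
print(f"Estimated mean ripeness from 50 samples: {sum(sample) / len(sample):.3f}")
```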
Data as the Foundation for Building Robots
Even a relatively simple task for humans, such as using a juicer to make orange juice, requires a
significant amount of data: data to build the orange juicing and juice delivery routines, and data to
execute those routines in a household setting. Current research that would be utilized to fulfill this use
case in Dr. Brown's futuristic world is focused on robotic OLP (off-line programming), VVS (virtual visual
servoing), and simulations. Much of this work is being done by Korean researchers and focuses on
optimizing specific routines in coordination with one another and with environmental awareness.

Off-line programming is a robot programming method in which the routine for the robot is written
externally and then uploaded to the robot for execution. This separation of software and hardware is
already prevalent in consumer electronics, as with Windows updates or new iOS releases, but is less
common in industrial robotics. OLP is highly relevant to the use case because the robot assistants will be
capable of many tasks, which need to be programmed, delivered, and updated.
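
A rough sketch of the off-line programming idea as it might apply here: a routine is authored away from the robot as structured data and then pushed to it. The routine format, step names, and `upload_routine` helper are hypothetical stand-ins for whatever deployment tooling a robot vendor would actually provide.

```python
import json

# An orange-juicing routine authored off-line (e.g. in a simulator),
# expressed as an ordered list of named steps with parameters.
juicing_routine = {
    "name": "orange_juice_v3",
    "steps": [
        {"action": "navigate", "target": "counter_left"},
        {"action": "grasp", "object": "orange", "force_newtons": 4.0},
        {"action": "place", "target": "juicer_feed_tray"},
        {"action": "signal_device", "device": "juicer", "command": "start"},
        {"action": "pour", "source": "juicer_carafe", "target": "glass", "ml": 250},
    ],
}

def upload_routine(routine: dict, robot_endpoint: str) -> None:
    """Serialize the routine and (hypothetically) push it to the robot.

    In a real OLP workflow the vendor's deployment tooling would validate,
    sign, and transmit the update; writing a file stands in for that step.
    """
    payload = json.dumps(routine, indent=2)
    with open(f"{robot_endpoint}_routine.json", "w") as f:
        f.write(payload)

upload_routine(juicing_routine, robot_endpoint="household_robot_01")
```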

Kee-Jin Park, Jin-Dae Kim, and others have led and collaborated on a number of research efforts that use
simulations to improve the accuracy and precision of robot automation in real working
environments.1,2,3,4 The technique used is virtual visual servoing, in which information from sensors is
gathered to create a virtual image of the environment, from which calibrations are made to perform
tasks accurately. Park, Kim, and others have used the data collected from the sensors in simulations
to improve functions such as grinding, diode laser heat treatment, coating, arc welding, hemming, and
more. These are industrial and manufacturing tasks, but they can be thought of as precursors to future
consumer robot tasks, and they require a combination of data analysis to build robot routines and
data analysis to execute those routines in the real world.

The framework for their work comes from two sources: a framework for virtual visual servoing
developed by Marchand and Chaumette and published in 2002,5 and a number of articles on simulation
analysis in robotics from the 1990s.6,7
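
As an illustration of the kind of control loop that visual servoing approaches are built around, the sketch below drives the error between observed and desired image features toward zero using a velocity command proportional to the pseudoinverse of an interaction matrix. The matrix, the feature values, and the simulated update are generic stand-ins for illustration, not code or values from the cited papers.

```python
import numpy as np

def servo_step(s, s_star, L, gain=1.0):
    """One control step: velocity command v = -gain * pinv(L) @ (s - s_star)."""
    error = s - s_star
    v = -gain * np.linalg.pinv(L) @ error
    return v, error

rng = np.random.default_rng(0)
L = rng.normal(size=(8, 6))                    # 4 image points (8 coords), 6-DOF motion
s_star = np.zeros(8)                           # desired feature positions
s = s_star + L @ (0.05 * rng.normal(size=6))   # start slightly off target

for step in range(200):
    v, error = servo_step(s, s_star, L)
    if np.linalg.norm(error) < 1e-3:
        break
    s = s + L @ v * 0.1                        # crude simulation of feature motion

print(f"Feature error after {step} steps: {np.linalg.norm(error):.5f}")
```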

1 Park, K.-J., Kim, J.-D., Cho, C.-S., Kim, B.-S., & Song, S.-R. (2012). A Study on Heat Treatment Automation using Robot Simulation OLP methodology. Proceedings of the Korean Society of Manufacturing Technology Engineers Conference (한국생산제조학회 학술발표대회 논문집), 194.
2 Seung-Chan, L., In-Ho, S., & Jin-Hwan, B. (2008). An Accurate and Efficient Method of the Spray Paint Simulation for Robot OLP. Transactions of the Society of CAD/CAM Engineers, 13(4), 296-304.
3 Kim, C., Hong, K., & Han, Y. (2005). PC-based off-line programming in the shipbuilding industry: Open architecture. Advanced Robotics, 19(4), 435-458.
4 Mark, A., Bohlin, R., Segerdahl, D., Edelvik, F., & Carlson, J. S. (2014). Optimisation of robotised sealing stations in paint shops by process simulation and automatic path planning. International Journal of Manufacturing Research, 9(1), 4-26.
5 Marchand, E., & Chaumette, F. (2002). Virtual visual servoing: A framework for real-time augmented reality. In G. Drettakis & H.-P. Seidel (Eds.), EUROGRAPHICS 2002 Conference Proceedings, 21(3), 289-298. Saarbrücken, Germany.
6 Tarnoff, N., Jacoff, A., & Lumia, R. (1992). Graphical simulation for sensor based robot programming. Journal of Intelligent and Robotic Systems, 5(1), 49-62.
7 Roos, & Behrens. (1997). Off-line programming of industrial robots — Adaptation of simulated user programs to the real environment. Computers in Industry, 33(1), 139-150.
Robots Talking to Robots
Dr. Brown's robot assistant interacts with the juicing machine, which could presumably be a smart
device, essentially a single-task robot. For every interaction that Dr. Brown and her family have with a
robot, there could be countless robot-to-robot communications behind the scenes, sharing information
and coordinating activities. For example, if all the robots need to go grocery shopping for dinner and the
only constraint is that the food must be ready by dinner, the robots could plan to go to the store at
different times. The same could be true for leveling energy grid utilization by robots or even
coordinating bulk purchases to save money.
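
A toy sketch of the staggered-shopping example, assuming made-up robot names, shopping durations, and a hard dinner deadline; real coordination would of course negotiate over many more constraints.

```python
from datetime import datetime, timedelta

def stagger_slots(durations_min, dinner_time):
    """Assign back-to-back, non-overlapping shopping slots ending at the deadline."""
    slots = {}
    end = dinner_time
    for robot, minutes in sorted(durations_min.items(), key=lambda kv: -kv[1]):
        start = end - timedelta(minutes=minutes)
        slots[robot] = (start, end)
        end = start            # the next robot must finish before this one starts
    return slots

dinner = datetime(2030, 5, 1, 18, 0)
plan = stagger_slots({"kitchen_bot": 40, "pantry_bot": 25, "cellar_bot": 15}, dinner)
for robot, (start, finish) in plan.items():
    print(f"{robot}: {start:%H:%M} - {finish:%H:%M}")
```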

Current research on robot-to-robot communication is being done on multiple fronts, as there are
countless applications of and requirements for multi-robot communication, including local robot
networks for performance, swarm robotics, and inter-robot negotiations.

Local Robot Networks: Computing time, network latency, and system availability can be improved by
avoiding sole reliance on cloud robotics and instead letting robots communicate with one another over
virtual local networks.8 Data that needs to be transmitted to or received from a central server can be
preprocessed and sent asynchronously, allowing for smaller data payloads and better performance.
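
A toy sketch of the preprocess-then-send idea: raw readings are summarized locally and only the compact summary is sent asynchronously. The payload shape and the simulated upload are assumptions for illustration, not the API of the cited framework.

```python
import asyncio
import json
import statistics

def summarize(readings):
    """Reduce a batch of raw sensor readings to a compact summary payload."""
    return {
        "count": len(readings),
        "mean": statistics.fmean(readings),
        "max": max(readings),
        "min": min(readings),
    }

async def send(payload: dict) -> None:
    """Stand-in for an asynchronous upload to the robot company's server."""
    await asyncio.sleep(0.01)                 # pretend network round trip
    print("sent", json.dumps(payload))

async def main():
    raw = [20.1, 20.4, 19.8, 21.0, 20.6]      # e.g. a burst of temperature readings
    await send(summarize(raw))                # ship the small summary, not the raw burst

asyncio.run(main())
```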

Robot Swarms: Swarm robotics involves the coordination of large numbers of relatively simple robots to
accomplish tasks in collectives, such as gathering data across large areas where there is a high risk of
robot failure or damage. Current research considers specific aspects of swarm robotics, such as control
frameworks9 and task allocation10, but there is also significant discussion of potential ethical
concerns11,12.

Robot Negotiations: Robots that interact with each other need to be able to handle conflicts between
their respective routines. For example, if the robot assistant wants to use the orange juicer, but Dr.
Brown had already set the juicer to a 2-hour self-cleaning cycle in the morning, the robot needs to figure
out whether to interrupt or wait. Not every interaction between robots will be this simple, and current
research focuses on developing frameworks and mechanisms for handling more complex inter-robot
interactions and negotiations, both virtual and physical.13,14
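
A deliberately simple sketch of the interrupt-or-wait decision in the juicer example; the priorities, time estimates, and decision rule are invented for illustration, and real negotiation frameworks such as those in the cited work handle far richer interactions.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    priority: int            # higher number = more important
    minutes_remaining: int

def negotiate(requesting: Task, running: Task, max_wait_minutes: int = 30) -> str:
    """Decide whether the requesting robot should wait, interrupt, or reschedule."""
    if running.minutes_remaining <= max_wait_minutes:
        return "wait"                  # the conflict resolves itself soon enough
    if requesting.priority > running.priority:
        return "interrupt"             # serving lunch outranks routine cleaning
    return "reschedule"                # neither waiting nor interrupting is justified

juice_request = Task("serve orange juice", priority=3, minutes_remaining=5)
cleaning_cycle = Task("juicer self-clean", priority=1, minutes_remaining=110)
print(negotiate(juice_request, cleaning_cycle))   # -> "interrupt"
```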

8 Osunmakinde, I., & Vikash, R. (2014). Development of a Survivable Cloud Multi-Robot Framework for Heterogeneous Environments. International Journal of Advanced Robotic Systems, 11(10).
9 Pavlic, T., Wilson, S., Kumar, G., & Berman, S. (2015). Control of Stochastic Boundary Coverage by Multirobot Systems. Journal of Dynamic Systems, Measurement, and Control (Transactions of the ASME), 137(3), 034505/1-034505/9.
10 Pavlic, T., Wilson, S., Kumar, G., & Berman, S. (2015). Control of Stochastic Boundary Coverage by Multirobot Systems. Journal of Dynamic Systems, Measurement, and Control (Transactions of the ASME), 137(3), 034505/1-034505/9.
11 Coeckelbergh, M. (2011). From Killer Machines to Doctrines and Swarms, or Why Ethics of Military Robotics Is not (Necessarily) About Robots. Philosophy & Technology, 24(3), 269-278.
12 Shaw, I. (2017). Robot Wars: US Empire and geopolitics in the robotic age. Security Dialogue, 48(5), 451-470.
13 Wang, Zhang, Liu, Li, & Tang. (2017). Cloud-assisted interaction and negotiation of industrial robots for the smart factory. Computers and Electrical Engineering, 63, 66-78.
14 Li, X., Sun, D., Yang, J., & Liu, S. (2011). Connectivity constrained multirobot navigation with considering physical size of robots. 2011 IEEE International Conference on Automation and Logistics (ICAL), 24-29.
