Sobeih 2015
Fig. 1.
Fig. 2.
A. Google Glass
Google Glass [9] is a wearable computer with a heads-up display shaped like a pair of glasses. It has built-in sensors (e.g. gyroscope, accelerometer), Bluetooth, and Wi-Fi, and interacts with users via voice controls, a touchpad, and Google Glass applications (also called Glassware), as shown in Fig. 2. Glass runs on the Android operating system. The Google Glass Explorer Edition incorporates a dual-core processor, 2 GB of RAM, and 16 GB of flash storage. Google released a prototype of this experimental wearable device in 2013 and made it available to the public in 2014. Although Google has stopped producing the prototype, marking the completion of the experimental phase, it remains committed to developing an enterprise version [10].
There are two ways to control Google Glass: 1) through spoken commands via the built-in microphone, where, to invoke an action or a Glassware application, the user speaks the corresponding voice command; and 2) through the touchpad on the side of the device, using tap and swipe gestures.
Fig. 3.
Fig. 4.
A. Sensor environment
To access and control environmental sensors using Glass, it is first necessary to create a sensor environment, which here builds upon our previous work [20]. This sensor environment follows a client-server model, consisting of a server and remote clients connected to a large number and variety of sensors, which can be controlled through a web interface, a smartphone, or smart wearables, as shown on the left side of Fig. 4.
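The paper does not give the wire format used between the server and its clients. The sketch below assumes a simple scheme of JSON payloads on per-sensor MQTT topics; the `sensornet` prefix and the field keys are illustrative assumptions, not the project's actual protocol.

```python
import json

def update_topic(device_id: str, sensor_id: str) -> str:
    """Hypothetical topic on which the server publishes updates for one sensor."""
    return f"sensornet/devices/{device_id}/sensors/{sensor_id}"

def encode_update(value, controllable: bool) -> str:
    """Serialise a sensor update as a JSON payload."""
    return json.dumps({"value": value, "controllable": controllable})

def decode_update(payload: str) -> dict:
    """Parse a JSON update payload back into a dict."""
    return json.loads(payload)
```

Any MQTT library (e.g. the one behind the broker of [22]) could carry these payloads; the helpers above only fix the assumed topic layout and encoding.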
1) Sensor device board: The sensor device board was created using the Phidgets starter kit (building-block tools for sensing and control from different computing devices) [21]. It contains an RFID scanner, a spatial sensor, a motor controller with four motors, and an interface kit to which eight sensors are connected, along with four LEDs to simulate digital outputs and two switches to simulate digital inputs. The setup was completed with a USB wireless adapter and a powered USB hub that allows all of the Phidgets to be connected at the same time.
2) Raspberry Pi, web server and sensor device application: The central component is the server, running on the Raspberry Pi, which is responsible for controlling the broker, advertising itself to the network, and then managing all the update and control traffic between clients and sensors.
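The server's update-handling role described above can be sketched independently of the real MQTT transport. In this sketch, `publish` is an injected stand-in for the broker client's publish call, and the class and topic names are assumptions for illustration only:

```python
class SensorServer:
    """Minimal sketch of the Raspberry Pi server's update logic.

    `publish(topic, payload)` stands in for a real MQTT client's publish
    call; sensor state is kept in a flat dict keyed by (device, sensor).
    """

    def __init__(self, publish):
        self._publish = publish
        self._state = {}

    def set_value(self, device_id, sensor_id, value):
        # Record the new reading, then broadcast it so every subscribed
        # client (web interface, smartphone, Glassware) sees the update.
        self._state[(device_id, sensor_id)] = value
        topic = f"sensornet/devices/{device_id}/sensors/{sensor_id}"
        self._publish(topic, str(value))

# Example with a fake publisher that just collects outgoing messages.
sent = []
server = SensorServer(publish=lambda topic, payload: sent.append((topic, payload)))
server.set_value("pi1", "led0", 1)
```

Injecting the publish function keeps the state-management logic testable without a running broker.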
Fig. 5.
The Glassware is built around a main Activity, allowing access to and display of all the sensor information in the environment as necessary, and exposing the input and navigation facilities built in to Glass so that the user can control those devices.
In order to connect to the server, the Glassware accesses the Wi-Fi transceiver and searches for the service advertising the MQTT broker. Once this service is resolved, the IP address and port of the broker are determined, and the Glassware initiates an MQTT client and logs in to the broker. The Glassware and Glass components are shown in Fig. 5. The application then builds a data model from the received device list and updates it as required on receipt of update messages from the server.
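This data-model step can be sketched as pure logic, assuming the device list arrives as a JSON document; the schema shown is a guess for illustration, since the paper does not specify one:

```python
import json

class Sensor:
    """One sensor entry in the Glassware's data model."""
    def __init__(self, name, value, controllable=False):
        self.name = name
        self.value = value
        self.controllable = controllable

class DeviceModel:
    """Builds and maintains the Glassware's view of the sensor environment."""

    def __init__(self, device_list_json: str):
        # Build the initial model from the device list received at login.
        self.devices = {}
        for dev in json.loads(device_list_json)["devices"]:
            self.devices[dev["id"]] = {
                s["id"]: Sensor(s["name"], s["value"], s.get("controllable", False))
                for s in dev["sensors"]
            }

    def apply_update(self, device_id, sensor_id, value):
        # Called on receipt of an update message from the server.
        self.devices[device_id][sensor_id].value = value

# Example device list (hypothetical schema) and one server-pushed update.
doc = ('{"devices": [{"id": "pi1", "sensors": '
       '[{"id": "led0", "name": "LED 0", "value": 0, "controllable": true}]}]}')
model = DeviceModel(doc)
model.apply_update("pi1", "led0", 1)
```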
From this point the Glassware loads the main navigation activity, which represents the sensor data within the bounds of the Glass card interface. The data model contains three basic object types: Collections, Devices, and Sensors; the latter is further split into Switch Sensors and Range Sensors. Each of these is represented by a different card layout, arranged in a horizontal row that the user traverses by swiping forward and backward on the touchpad, with the general Glass convention of a top-down swipe gesture acting as a back action to return to the previous row. Collections and Devices are represented by a card that displays the name of the object and, when selected with a tap, opens into another row of similar cards. A Device, when selected, opens into a row of sensor cards, each displaying the name of a sensor and its current value. A Switch Sensor is represented by a coloured toggle switch indicating its current position, on or off, by both colour and on/off text. If a sensor is controllable, tapping on the touchpad generates a sensor update object for that sensor, which is dispatched to the server by posting it to the relevant topic; when the successful update is received, the card changes to indicate the new value. A Range Sensor depicts the name of the sensor along with its current numerical value. If it is controllable, it also depicts a slider, which can be focused by tapping the touchpad, after which the value can be adjusted.
The Glassware is launched from the Glass start card with the phrase "connect to open day server". Once in the application there are a number of contextual menus: the first is available within a Collection or Device and allows the user to traverse the group and select a particular card, go back, or restart the navigation activity. While a Sensor card is shown, an additional option enters the sensor menu, which enables the user to toggle a Switch Sensor or set a Range Sensor value.
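The row-and-stack navigation described above (swipe along a row, tap to descend into a Collection or Device, top-down swipe to go back) can be modelled as a small stack of card rows. This is an illustrative reconstruction in Python, not the actual Glassware code:

```python
class CardNavigator:
    """Stack-based model of the Glass card navigation described in the text."""

    def __init__(self, root_row):
        # Each stack entry is [row_of_cards, focused_index].
        self._stack = [[root_row, 0]]

    @property
    def focused(self):
        row, i = self._stack[-1]
        return row[i]

    def swipe(self, step):
        """Swipe forward (+1) or backward (-1) along the current row."""
        row, i = self._stack[-1]
        self._stack[-1][1] = max(0, min(len(row) - 1, i + step))

    def tap(self):
        """Tap opens the focused card's child row, if it has one."""
        children = self.focused.get("children")
        if children:
            self._stack.append([children, 0])

    def swipe_down(self):
        """Top-down swipe acts as a back action to the previous row."""
        if len(self._stack) > 1:
            self._stack.pop()

# A Collection card containing two sensor cards (hypothetical names).
root = [{"name": "Lab", "children": [{"name": "Temp", "value": 21.5},
                                     {"name": "Light", "value": "on"}]}]
nav = CardNavigator(root)
nav.tap()       # descend into the "Lab" collection
nav.swipe(+1)   # move focus to the "Light" sensor card
```

The stack mirrors the rows the user has descended through, so the back gesture is a single pop.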
This prototype has undergone thorough software testing, including black-box and white-box testing. The system functions well and can automatically adjust sensors such as lighting and temperature, giving the user full control of the environment.
IV. CONCLUSION AND FUTURE WORK
Fig. 6.
The main goal of this exploratory work was to prototype, for the first time, a wearable system with particular emphasis on interacting with sensor devices in the environment. Testing and demonstration have shown the successful delivery of this prototype. This work could potentially be used in smart homes and smart spaces for full control of the smart environment.
While the Glassware produced achieved the goals set, there are still many avenues of development that could be undertaken. One is to utilise the Bluetooth and Wi-Fi capabilities of Glass, along with any other relevant sensors, to determine precise location data and feed it to an application that creates a contextual data display automatically. Another is to utilise the camera, and in particular the fact that it sees a similar view to the user: accessing the camera and using it to identify devices in the environment that are in the user's field of view, through some form of photo recognition, would be a great leap forward in hands-free ubiquitous computing. One of the most important aspects is security design. Security is a vital part of the Internet of Things (IoT): the data gathered will be extremely personal and should be protected. Future work will incorporate security aspects into the system.
ACKNOWLEDGMENT
The work reported in this paper has formed part of the DIVINE project, which is funded by Manchester Metropolitan University in the UK.
R EFERENCES
[1] Gartner. (2015, 07) The disruptive impact of IoT on business - Gartner Symposium/ITxpo 2014. [Online]. Available: http://www.gartner.com/newsroom/id/2905717
[2] H. Salah, E. MacIntosh, and N. Rajakulendran. (2014) MaRS market insights: Wearable tech: Leveraging Canadian innovation to improve health. [Online]. Available: www.marsdd.com/news-insights/mars-reports/
[3] Nike. (2015, 07) Nike+ FuelBand. [Online]. Available: http://www.nike.com/
[4] Jawbone. (2015, 07) Jawbone. [Online]. Available: https://jawbone.com/
[5] Fitbit. (2015, 07) Fitbit. [Online]. Available: http://www.fitbit.com/uk
[6] Airo Health. (2015, 07) Airo Health. [Online]. Available: http://www.getairo.com/
[7] Preventice. BodyGuardian remote monitoring system. [Online]. Available: http://www.preventice.com/products/bodyguardian/
[8] IDC. (2015, 07) Worldwide quarterly wearable device tracker. [Online]. Available: http://www.idc.com/tracker/
[9] Google. (2015, 07) Google Glass. [Online]. Available: https://developers.google.com/glass/
[10] BBC. (2015, 07) Google Glass sales halted but firm says kit is not dead. [Online]. Available: http://www.bbc.co.uk/news/technology-30831128
[11] L. Richardson, M. Amundsen, and S. Ruby, RESTful Web APIs, 1st ed. O'Reilly Media, 2013.
[12] L. Kathy-Chin. (2013) A surgeon's review of Google Glass in the operating room. [Online]. Available: http://www.fastcompany.com/3022534/internet-of-things/a-surgeons-review-of-google-glass-in-the-operating-room
[13] P. Marks, "A healthy dose of Google Glass," New Scientist, vol. 219, no. 2936, pp. 22-23, 2013.
[14] C. Wiltz, "EPGL Medical Sciences among first device makers to use Google Glass," Medical Device and Diagnostic Industry, vol. 35, no. 4, 2013.
[15] W. Glauser, "Doctors among early adopters of Google Glass," Canadian Medical Association Journal, p. 109, 2013.
[16] STEMbite. (2015, 07) STEMbite. [Online]. Available: www.youtube.com/STEMbite
[17] S. Vallurupalli, H. Paydak, S. Agarwal, M. Agrawal, and C. Assad-Kottner, "Wearable technology to improve education and patient outcomes in a cardiology fellowship program: A feasibility study," Health and Technology, vol. 3, no. 4, pp. 267-270, 2013.
[18] M. Campbell, "Hive-mind solves tasks using Google Glass ant game," New Scientist, vol. 219, no. 2928, p. 20, 2013.
[19] H. Altwaijry, M. Moghimi, and S. Belongie, "Recognizing locations with Google Glass: A case study," in IEEE Winter Conference on Applications of Computer Vision (WACV). IEEE, 24-26 March 2014, pp. 167-174.
[20] T. Sobeih, "Mimir: Development of a software system to control remote sensor devices using mobile and wearable technologies," Master's thesis, Manchester Metropolitan University, 2014.
[21] Phidgets. (2015, 07) Phidgets. [Online]. Available: http://www.phidgets.com/
[22] MQTT. (2015, 07) MQTT. [Online]. Available: http://mqtt.org/