Indoor Navigation Using Augmented Reality
Candidate’s Declaration
This is to certify that the work which is being presented in the report entitled
“Indoor Navigation using Augmented Reality”, in partial fulfillment of the
requirements for the award of the degree of Bachelor of Technology in Computer
Science and Engineering/Information Technology, submitted in the Department of
Computer Science & Engineering and Information Technology, Jaypee University of
Information Technology, Waknaghat, is an authentic record of our own work carried out
over a period from August 2016 to December 2016 under the supervision of Ms. Ruhi
Mahajan (Assistant Professor, Computer Science & Engineering Department).
The matter embodied in the report has not been submitted for the award of any other
degree or diploma.
This is to certify that the above statement made by the candidates is true to the best of my
knowledge.
ACKNOWLEDGEMENT
We owe our profound gratitude to our project supervisor Ms. Ruhi Mahajan, who took
a keen interest and guided us all along in our project work, titled “Street View using
Augmented Reality”, till the completion of the project by providing all the necessary
information for developing it. The project development helped us in research,
and we got to know a lot of new things in our domain. We are really thankful to her.
TABLE OF CONTENTS
CERTIFICATE
ACKNOWLEDGEMENT
LIST OF FIGURES
LIST OF TABLES
ABSTRACT
1) INTRODUCTION
1.1) PROBLEM STATEMENT
1.2) OBJECTIVE
1.3) METHODOLOGY
2) LITERATURE SURVEY
3.1) SOFTWARE REQUIREMENTS
3.2) HARDWARE REQUIREMENTS
3.3) MAKING BEACONS
3.4) SYSTEM DESIGN
3.5) API IMPLEMENTATION
4) PERFORMANCE ANALYSIS
5) CONCLUSION
5.1) FUTURE SCOPE
6) REFERENCES
LIST OF FIGURES
Figure 2: A simple marker
Figure 15: Class diagram
Figure 18: Unit testing
LIST OF TABLES
Table 3: Test case 1
Table 4: Test case 2
Table 5: Test case 3
Table 6: Test case 4
Table 7: Test case 5
Table 8: Test case 6
Table 9: Test case 7
ABSTRACT
This project is intended to give a complete implementation of indoor as well as
outdoor navigation using augmented reality and beacons (Bluetooth Low Energy devices).
Augmented Reality (AR) is an emerging technology in which one’s perception of the real-
time environment is enhanced by superimposing computer-generated information such as
graphics, text, or audio, as well as objects, onto a display screen. The proposed
application is an Android-based academic project which will be compatible
with all versions of Android having API level above 16 (Jelly Bean).
In particular, this Android application explicitly addresses issues caused by the relatively
inaccurate GPS and attitude sensors during navigation by using beacons, which interact
with the Android application via an API.
1. INTRODUCTION
This project is intended to give implementation of augmented reality to navigate the user
indoor as well as outdoor. The project highlights the use of beacons to retrieve the user’s
current location. The major advantage of using beacon is that the user will have a
seamless experience; the user is not required to scan any marker for augmented reality to
work. Hence the efficiency is higher and hence the time taken to respond to user query is
less than other mechanisms. The working of the application is also described in this
documentation. The documentation also lists out the requirements for the project
implementation.
1.1) PROBLEM STATEMENT
While today’s commodity GPS chips can sometimes obtain a fix
(receive signals from enough satellites to confirm a location) inside a
building, the resulting position is usually not accurate enough to be useful. The
signals from the satellites are attenuated and scattered by roofs, walls and other
objects. Besides, the error range of many GPS chips can be larger than the indoor space
itself.
In particular, this Android application explicitly addresses issues caused by the relatively
inaccurate GPS and orientation sensors during navigation by using beacons, which interact
with the Android application via an API.
1.2) OBJECTIVE
Long-term – The long-term objective is that students are exposed to an industrial
environment, which should help us in future when we work on real-time projects. We
should gain experience of working with a team and learn to cooperate with our team
members.
1.3) METHODOLOGY
Making Beacons: This includes creating the beacons, which will emit the Bluetooth
Low Energy signals that will be detected by the API present in the
Android application.
UI Design: We create an activity in the Android application in which the user will
have to enter the destination (it should match a destination stored in the database);
the current location will then be detected and the user will be guided to the
destination using augmented reality.
Indexing of Beacons in the database: This module will insert the MAC address of the
beacon at the destination, and the MAC addresses of all the beacons which the user
will encounter during the navigation.
Testing: Various kinds of testing, like black box testing, white box testing etc., are
performed.
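The beacon-indexing step described above can be pictured as a simple MAC-to-location lookup table. The sketch below is illustrative only: the report stores this mapping in a database, and the MAC addresses and location labels here are made-up examples.

```java
import java.util.HashMap;
import java.util.Map;

public class BeaconIndex {
    // Maps a beacon's MAC address to a human-readable location label.
    private final Map<String, String> index = new HashMap<>();

    void addBeacon(String mac, String location) {
        // Normalize case so scans and stored entries always match.
        index.put(mac.toUpperCase(), location);
    }

    // Returns the location for a scanned MAC, or null if it is unknown.
    String locate(String mac) {
        return index.get(mac.toUpperCase());
    }

    public static void main(String[] args) {
        BeaconIndex idx = new BeaconIndex();
        idx.addBeacon("AA:BB:CC:DD:EE:01", "Library entrance");
        idx.addBeacon("AA:BB:CC:DD:EE:02", "Lecture hall L1");
        System.out.println(idx.locate("aa:bb:cc:dd:ee:01")); // Library entrance
    }
}
```

In the actual application the same lookup would be backed by the database table that the indexing module populates.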
2. LITERATURE SURVEY
Augmented Reality: the user maintains a sense of presence in the real world; a
mechanism is needed to combine the virtual and real worlds.
Virtual Reality: the senses are under the control of the system; a mechanism is
needed to feed the virtual world to the user.
Augmented reality and virtual reality are inverse reflections of one another with
regard to what each technology seeks to accomplish and deliver for the user. Virtual
reality offers a digital recreation of a real-life setting, while augmented reality
delivers virtual elements as an overlay on the real world.
2.1.2) Different Types of Augmented Reality
There are two types of commonly used augmented reality: marker-based, which uses cameras
and visual markers, and markerless, which uses positional data such as a
mobile’s GPS and compass.
Marker Based:
Augmented Reality (AR) markers are images that can be detected by a
camera and used with software as the anchor for virtual content
placed in a scene. Most are black and white, though colors can be used as long as
the contrast between them can be easily recognized by a camera. Simple augmented
reality markers can consist of one or more basic shapes made up of black squares
against a white background. More elaborate markers can be created using simple images
that are still read properly by a camera, and these codes can even take the form of tattoos.
A camera is used together with AR software to detect augmented reality markers as the
location for virtual objects. Limitations on the types of augmented reality markers that
can be used are based on the software that recognizes them. The simplest types of
augmented reality markers are black and white images that consist of two-dimensional
(2D) barcodes.
Markerless:
In markerless augmented reality the image is gathered through the internet and displayed at
a specific location (which can be determined using GPS). The application doesn’t require a
marker to display the content. It is more interactive than marker-based augmentation.
Since 2010, more and more stress has been placed on Free and Open Source
Software (FOSS). Android is leading the current OS
market, as shown in Figure 5, because it is open source and developed by a
consortium of more than 86 leading MNCs called the Open Handset Alliance
(OHA). Android is also among the fastest-growing technologies, and more and
more applications have been extended and modified by third-party users.
1. Camera: A real-world live video feed is taken as input from the Android cell phone
camera by the Camera module. Displaying this live feed from the Android cell phone
camera is the “reality” in augmented reality. This live video stream is given as an input to
the Image Capturing Module.
2. Image Capturing Module: The input to the Image Capturing Module is the live video
feed received from the camera of a mobile device. This module analyzes the camera feed
by analyzing each frame in the video. It generates binary images, i.e. digital
images that have only two possible values for each pixel. Typically the two
colors used for a binary image are black and white. These binary images are provided
as an input to the Image Processing Module.
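The binarization performed by the Image Capturing Module can be sketched as a simple threshold over a grayscale frame. This is an illustrative stand-alone example, not code from the paper; the threshold value of 128 is an assumption.

```java
public class Binarizer {
    // Convert a grayscale frame into a binary image by thresholding:
    // pixels at or above the threshold become 1 (white), the rest 0 (black).
    static int[][] toBinary(int[][] gray, int threshold) {
        int[][] out = new int[gray.length][];
        for (int r = 0; r < gray.length; r++) {
            out[r] = new int[gray[r].length];
            for (int c = 0; c < gray[r].length; c++) {
                out[r][c] = gray[r][c] >= threshold ? 1 : 0;
            }
        }
        return out;
    }

    public static void main(String[] args) {
        int[][] frame = {{10, 200}, {130, 90}}; // tiny 2x2 "frame"
        int[][] bin = toBinary(frame, 128);
        System.out.println(bin[0][0] + "" + bin[0][1] + bin[1][0] + bin[1][1]); // 0110
    }
}
```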
3. Image Processing Module: Inputs to Image Processing Module are the binary images
from Image Capturing Module. These binary images are processed using an image
processing technique to detect the AR Marker. Detection of AR Marker is essential to
determine the position, where to place the virtual object. Once the AR Marker is detected,
its location is provided as an input to the Tracking Module.
4. Marker Tracking Module: The tracking module is “the heart” of the augmented
reality system; it calculates the relative pose of the camera in real time. The
term pose means the six degrees of freedom (DOF) position, i.e. the 3D location and
3D orientation of an object. The estimated pose is provided as an input to the Rendering
Module.
5. Rendering Module: There are two inputs to the Rendering Module. The first is the
estimated pose from the Tracking Module and the second is the virtual object to be
augmented. The Rendering Module combines the camera image and the virtual components
using the calculated pose and renders the augmented image on the display screen of
the mobile device.
2.1.5) Conclusion:
This paper proposes a marker-based augmented reality application for the Android operating
system which helps to combine virtual objects with the real environment,
facilitating various applications as mentioned in the paper. The main advantage is
the use of low-cost devices as compared to the expensive head-mounted display
devices. Secondly, with the help of this project we need not buy a product first to see how
it will fit our environment. In future, images of objects from various views can be
fetched directly from vendors’ websites; these could be modeled into 3D objects and
augmented. Also, multiple objects will be augmented, which is currently a major
challenge.
2.2) Title: “Markerless Augmented Reality Android App For Interior
Decoration” (2013)
This paper discusses mobile application of Augmented Reality (AR) on the Android
platform. The paper discusses existing SDKs for developing AR applications on
mobile platforms and elaborates their functionalities and limitations.
It is designed for personal computers and not for embedded devices. Hence
porting it directly to mobile platforms is difficult and impractical, because it
uses a lot of FPU (floating point unit) calculations.
2. Vuforia:
Vuforia is an Augmented Reality SDK provided by Qualcomm. It supports
a variety of 2D and 3D target types including Image Targets, 3D Multi-Target
configurations, and a form of addressable fiducial marker known as a Frame
Marker. Additional features of the SDK include localized occlusion detection
using “Virtual Buttons”, runtime image target selection, and the
ability to create and reconfigure target sets programmatically at runtime.
The latest release of the SDK, Vuforia 2.0, also supports cloud recognition,
where a user points his smartphone at a book, packaging or poster and
information about the item, like the price, reviews, etc., is shown.
2.2.2) Algorithm used in developing the application:
Initialization
1. Initialize the video capture and camera parameters and acquire device position
information from the magnetic compass and accelerometer.
Main Loop
2. Freeze video input frame.
3. Get device position and orientation from the magnetic compass.
4. Calculate the camera transformation relative to the device position.
5. Draw the virtual objects in the center of the screen.
Shutdown
6. Close down the video capture.
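Step 4 above, calculating the camera transformation from the device orientation, can be illustrated with a minimal yaw rotation about the vertical axis. This is an assumption-laden simplification, not the paper's implementation: a full 6-DOF pose would also use pitch and roll from the accelerometer.

```java
public class CameraTransform {
    // Rotate a point in the horizontal plane by the compass azimuth (yaw).
    // Returns the point's coordinates in the rotated (camera-aligned) frame.
    static double[] rotate(double x, double y, double azimuthDeg) {
        double a = Math.toRadians(azimuthDeg);
        return new double[] {
            x * Math.cos(a) - y * Math.sin(a),
            x * Math.sin(a) + y * Math.cos(a)
        };
    }

    public static void main(String[] args) {
        // Facing 90 degrees: a point due "east" ends up straight ahead.
        double[] p = rotate(1.0, 0.0, 90.0);
        System.out.printf("%.3f %.3f%n", p[0], p[1]); // 0.000 1.000
    }
}
```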
2.2.3) Conclusion:
This paper provided a brief introduction to handheld Augmented Reality, discussed the
various free SDKs available for developing AR applications and outlined their merits and
limitations. The paper also discussed the hurdles in achieving fast and efficient tracking on
mobile devices. In the context of this paper, an Android application for interior
decoration purposes was developed using the Metaio SDK 4.1 described earlier.
2.3) Title: “Prototyping an Outdoor Mobile Augmented Reality Street View
Application”(2009)
The authors explain the early development of an experimental outdoor mobile augmented
reality (AR) application, “AR street view”, running on a Google Android Dev Phone 1.
This work addresses the design of interaction and visualization techniques for AR
user interfaces on mobile phones. Their experience suggests that usability is compromised
by a variety of issues, including tracking and registration errors caused
by the relatively inaccurate GPS and orientation sensors, the need to hold the phone away
from the body in potentially awkward poses, and the relatively small field of view covered
by the display and camera. The project focuses on research in interaction and visualization
for outdoor mobile phone AR user interfaces (UIs), rather than on outdoor registration
and tracking for these devices.
2.3.1) Implementation:
The prototype consists of four components: AR street view running on the Google
Android Dev Phone 1, the street database server, an external sensor unit, and a UMPC (or
laptop PC) equipped with a Bluetooth adapter. AR street view is built on the Google Android
platform 1.5, supporting a mobile device equipped with a camera, Bluetooth adapter, GPS
receiver, orientation sensor and cellular or Wi-Fi network adapter. 3D graphics are drawn
with OpenGL ES 1.0. Registration and tracking are implemented using either the built-in
location and orientation sensors, or external sensors. The street database server manages
the street coordinates (and attribute data) for the US, decoded from the US Census 2006
TIGER/Line Files. The street database is queried via the MySQL 5.1 spatial
extensions, and accessed over HTTP from the application.
Figure 6: AR street view 3D components. (a) Street objects are registered with the real
world and annotated with labels. (b) Compass reticle shows heading. (c) Information
display shows the quantitative values of GPS, orientation, and other sensor
observations.
1. Position Inaccuracy:
The position tracking errors introduced by the relatively inexpensive location
sensors used in smartphones compromise the usability of applications that rely on
them. For example, the A-GPS in the Android Dev Phone 1 can sometimes report
more than a 40 m offset from its actual position.
2. Orientation Inaccuracy:
Similarly, the phone’s built-in orientation sensor can report significant static and
dynamic orientation errors, off by tens of degrees.
3. Small Field of View:
The optics used in Smartphone cameras typically produce an image with a wider
field of view than would be seen looking directly through a rectangular frame the
size of the display. However, this is still far smaller than the field of view of even a
single human eye. Thus, only a relatively small portion of the user’s field of
view can be visibly augmented at any point in time.
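The field of view discussed above is fixed by the camera's lens geometry. Below is a minimal sketch of the standard pinhole-camera relation; the sensor width and focal length used are illustrative values, not measurements from the paper.

```java
public class FieldOfView {
    // Horizontal FOV of a pinhole camera: 2 * atan(sensorWidth / (2 * focal)).
    static double horizontalFovDeg(double sensorWidthMm, double focalLengthMm) {
        return Math.toDegrees(2.0 * Math.atan(sensorWidthMm / (2.0 * focalLengthMm)));
    }

    public static void main(String[] args) {
        // Illustrative smartphone-like values: 4.8 mm sensor, 3.5 mm focal length.
        System.out.printf("%.1f%n", horizontalFovDeg(4.8, 3.5)); // 68.9 (degrees)
    }
}
```

Even this roughly 69-degree horizontal coverage is well below the field of view of a single human eye, which is why only a small part of the user's view can be augmented.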
2.4) Title: “Location-based Mobile Augmented Reality Applications: Challenges,
Examples, Lessons Learned” (2014)
The technical capabilities of modern smart mobile devices increasingly
enable us to run desktop-like applications with demanding resource
requirements in mobile environments. In this area, numerous concepts, techniques,
and prototypes have been presented, focusing on basic implementation issues of mobile
applications. However, only little work exists that deals with the design and
implementation (i.e., the engineering) of advanced smart mobile applications and reports
on the lessons learned in this context. In this paper, the authors give profound insights
into the design and implementation of such an advanced mobile application,
which enables location-based mobile augmented reality on two different mobile operating
systems (i.e., iOS and Android). In particular, this mobile application is
characterized by high resource demands, since various sensors must be queried at run time
and numerous virtual objects may have to be drawn in real time on the screen of the smart
mobile device (i.e., a high frame rate per second must be achieved).
2.4.1) Architecture:
The architecture has been designed with the goal of being able to easily exchange and
extend its components. It comprises four main modules organized in a multi-tier
architecture. Lower tiers offer their services and functions through interfaces to upper
tiers. Based on this architectural design, modularity can be ensured; i.e., both data
management and distinct elements can be customized and extended on demand.
Figure 7: Algorithm realizing the location View
2.5) Title: “Distance Estimation of Smart Device using Bluetooth”(2013)
Distance estimation identifies the distance between two machines in a wireless network.
The Received Signal Strength Indication (RSSI) of Bluetooth can be used to estimate the
distance between devices. The characteristic of the Bluetooth RSSI
value differs between environments. So, the authors tested the relationship between
distance and Bluetooth RSSI value in different environments, namely an indoor
hall, a meeting room, and an Electromagnetic Compatibility (EMC) chamber environment.
This paper shows the distance characteristic of Bluetooth RSSI from these experimental
results. There are a lot of measurement errors in the Bluetooth RSSI raw data. The minimum
RSSI value is -88 dBm and the maximum RSSI value is -66 dBm at 11 m in the indoor hall
environment. The difference between the maximum and minimum value is 22 dBm. So,
it is hard to estimate the distance using Bluetooth RSSI raw data.
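A common way to turn an RSSI reading into a distance estimate is the log-distance path-loss model, d = 10^((txPower - rssi) / (10 n)). The sketch below is a generic illustration, not the paper's method; txPower (the expected RSSI at 1 m) and the environment factor n are assumptions.

```java
public class RssiDistance {
    // Log-distance path-loss model: d = 10^((txPower - rssi) / (10 * n)).
    // txPower is the expected RSSI at 1 m; n is the path-loss exponent
    // (about 2 in free space). Both values used below are illustrative.
    static double estimateDistance(double rssi, double txPower, double n) {
        return Math.pow(10.0, (txPower - rssi) / (10.0 * n));
    }

    public static void main(String[] args) {
        // An RSSI equal to txPower corresponds to roughly 1 m.
        System.out.println(estimateDistance(-59, -59, 2.0)); // 1.0
        // 20 dB weaker -> 10x the distance under n = 2.
        System.out.println(estimateDistance(-79, -59, 2.0)); // 10.0
    }
}
```

The 22 dBm spread reported above for a single distance shows why feeding raw samples into this formula gives wildly varying estimates.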
2.5.1) Conclusion:
This paper addressed the distance estimation method and distance characteristic of
Bluetooth RSSI. Distance estimation is impossible with the raw RSSI data, and may be
possible, coarsely, with the averaged RSSI data in the indoor hall environment. In the
meeting room environment, it is hard to classify a device as inside or outside the meeting
room with the raw RSSI data, and it may be possible to classify it as inside or outside
with the averaged RSSI data.
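The averaging that the paper finds necessary can be done with a sliding window over recent samples. Below is a minimal sketch; the window size of 3 in the example is an arbitrary choice.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class RssiAverager {
    // Sliding-window average over the last `size` RSSI samples.
    private final Deque<Integer> window = new ArrayDeque<>();
    private final int size;
    private int sum = 0;

    RssiAverager(int size) { this.size = size; }

    // Add a sample and return the current smoothed RSSI.
    double add(int rssi) {
        window.addLast(rssi);
        sum += rssi;
        if (window.size() > size) sum -= window.removeFirst();
        return (double) sum / window.size();
    }

    public static void main(String[] args) {
        RssiAverager avg = new RssiAverager(3);
        avg.add(-88);
        avg.add(-66);
        System.out.println(avg.add(-77)); // -77.0
    }
}
```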
2.6) Android Interface
Android’s default user interface is based on direct manipulation, using touch inputs
that loosely correspond to real-world actions, like swiping, tapping, pinching, and
reverse pinching to manipulate on-screen objects, along with a virtual keyboard. Game
controllers and full-size physical keyboards are supported via Bluetooth. The response to
user input is designed to be immediate and provides a fluid touch interface, often using
the vibration capabilities of the device to provide haptic feedback to the user. Internal
hardware, such as accelerometers, gyroscopes and proximity sensors, is
used by some applications to respond to additional user actions, for
example adjusting the screen from portrait to landscape depending on how the
device is oriented, or allowing the user to steer a vehicle in a racing game by rotating
the device, simulating control of a steering wheel.
Android devices boot to the home screen, the primary navigation and information “hub”
on Android devices, which is analogous to the desktop found on personal computers.
Android home screens are typically made up of app icons and widgets; app icons launch the
associated app, whereas widgets display live, auto-updating content, such as
the weather forecast, the user’s email inbox, or a news ticker, directly on the home
screen. A home screen may be made up of several pages, between which the user can
swipe back and forth. Android’s home screen interface is heavily
customizable, allowing users to adjust the look and feel of their devices to their tastes.
Third-party apps available on Google Play and other app stores can extensively re-
theme the home screen, and even mimic the look of other operating systems,
such as Windows Phone. Most manufacturers, and some wireless carriers,
customize the look and feel of their Android devices to differentiate themselves
from their competitors. Applications that handle interactions with the
home screen are called “launchers” because they, among other purposes, launch the
applications installed on a device.
2.6.1) Android Architecture:
1. Linux Kernel
The basic layer is the Linux kernel. The whole Android OS is built on top of the Linux
kernel, with some further architectural changes. The term kernel means the
core of an operating system.
It is this Linux kernel that interacts with the hardware, and it contains all
the essential hardware drivers. Drivers are programs that control and communicate with
the hardware.
For example, consider the Bluetooth function. Almost all devices have Bluetooth hardware
in them. Therefore the kernel must include a Bluetooth driver to communicate with the
Bluetooth hardware.
The Linux kernel also acts as an abstraction layer between the hardware and the
other software layers. As Android is built on a very popular and proven
kernel, porting Android to a variety of hardware became an almost painless task.
2. Libraries
The next layer is Android’s native libraries. It is this layer that enables the device to
handle diverse types of data. These libraries are written in C or C++ and are
specific to a particular hardware.
Some of the important native libraries include the following:
3. Android Runtime
Android Runtime consists of Dalvik Virtual machine and Core Java libraries.
Dalvik Virtual Machine
It is a kind of JVM used in Android devices to run apps and is optimized for low
processing power and low memory environments. Unlike the JVM, the Dalvik Virtual
Machine doesn’t run .class files; instead it runs .dex files. The .dex files are built
from .class files at the time of compilation and provide higher efficiency in low-resource
environments. The Dalvik VM allows multiple instances of the virtual machine to be
created simultaneously, thereby providing security, isolation, memory management and
threading support.
4. Application Framework
These are the blocks that our applications directly interact with. These programs
provide the integral functions of the phone, like resource management, voice call
management, etc. As a developer, you can treat these as the basic tools with
which we build our applications.
Important blocks of the Application Framework are:
Activity Manager: Manages the activity life cycle of applications.
Content Providers: Manage data sharing between applications.
Telephony Manager: Manages all voice calls. We use the telephony manager if we want to
access voice calls in our application.
Location Manager: Location management, using GPS or cell towers.
Resource Manager: Manages the various types of resources we use in our application.
5. Applications
Applications are the top layer in the Android architecture, and this is where our
applications reside. Several standard applications come pre-installed on
every device, such as:
SMS client app
Dialer
Web browser
Contact manager
The version history of the Android mobile operating system began with the
release of the Android alpha in November 2007. The first commercial version,
Android 1.0, was released in September 2008. Android is continually developed by
Google and the Open Handset Alliance (OHA), and has seen a number of updates to its
base operating system since the initial release.
Android 2.0 (Éclair) - October 2009
Google Maps Navigation is added, bringing free
turn-by-turn directions to the phone.
Support for multiple accounts is added, with separate Contacts,
Email and Calendar sync settings for each account.
The Action Bar is introduced, providing a consistent space for an app’s
most-used functions in the upper-right corner of the running app.
Widgets are more interactive, and let users flip through stacked
content or scroll through content displayed in the widget.
Android 4.0 (Ice Cream Sandwich) - October 2011
New Data Saver mode, which can force apps to reduce bandwidth usage
Figure 10: The android version global usage
2.9) Bluetooth Low Energy (BLE) Beacons

3.1) SOFTWARE REQUIREMENTS:
Android Studio
JDK (Java Development Kit) 1.7
Arduino IDE

3.2) HARDWARE REQUIREMENTS:
CPU: 2.2 GHz processor or above
RAM: 4 GB or above
OS: Windows 7 or above
3.3) MAKING BEACONS:
We need to program the Atmel ATmega328P IC to make the beacon, and then finally
connect it with the nRF module to obtain a beacon.
Figure 11: Using Arduino as an ISP
After uploading the code to the IC (Atmel ATmega328P), we need to connect it with the
Bluetooth module (nRF24L01+) and a battery.
Figure 13: Connection of the Bluetooth module with the IC and the battery
3.4) SYSTEM DESIGN:
A class diagram in the Unified Modeling Language (UML) is a type of static structure
diagram that describes the structure of a program by showing the
system’s classes, their attributes, operations (or methods), and the relationships among
objects.
private static final long SCAN_PERIOD — the time (in ms) for which scanning will be done.
3.5.2) Functions
OctiseBeacons(Context context)
Constructor to give the context to this class.
checkBLEcompatability()
Checks whether the android version is compatible with BLE or not.
Return type: Boolean
1. true : when compatible
2. false : not compatible
checkBluetoothCompatability()
Checks whether the android version is compatible with Bluetooth or not.
Return type: Boolean
1. true : when compatible
2. false : not compatible
BluetoothEnabled(Activity activity)
Checks whether the Bluetooth is enabled or not, prompts a message to the user
to enable Bluetooth if not enabled.
Return type: void
Parameters: activity
Intent is sent to the user to enable the Bluetooth from this activity.
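The compatibility checks listed above follow a simple boolean pattern. Since the real class depends on the Android SDK, the sketch below stubs the platform version with a plain constant; the hard-coded value and the API-level thresholds (18, i.e. Android 4.3, where BLE scanning was introduced, and 16 for the project's minimum) are illustrative assumptions.

```java
public class CompatCheck {
    // Stand-in for android.os.Build.VERSION.SDK_INT; hard-coded here so the
    // sketch runs outside Android. The value is an illustrative assumption.
    static int sdkInt = 19;

    // Mirrors checkBLEcompatability(): BLE scanning needs API level 18+.
    static boolean checkBleCompatibility() {
        return sdkInt >= 18;
    }

    // Mirrors checkBluetoothCompatability(): classic Bluetooth is available
    // on all API levels the project targets (16 and above).
    static boolean checkBluetoothCompatibility() {
        return sdkInt >= 16;
    }

    public static void main(String[] args) {
        System.out.println(checkBleCompatibility());       // true
        System.out.println(checkBluetoothCompatibility()); // true
    }
}
```

In the real class these results decide whether to start a BLE scan or prompt the user, as `BluetoothEnabled(Activity)` does.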
4. PERFORMANCE ANALYSIS
System testing of software is testing conducted on a complete, integrated system to
evaluate the system’s compliance with its specified requirements. System testing falls
within the scope of black box testing, and as such, should require no knowledge
of the inner design of the code or logic.
It is very similar to efficient test case writing. In test case writing you
should describe the test scenarios & make use of use cases.
Figure 17
4.1.2) UNIT TESTING:-
The purpose of unit testing is to isolate each part of the system and show
that the individual parts are correct.
Figure 18
4.2) TEST CASES
5. CONCLUSION
Indoor navigation deals with navigation within buildings. Because GPS reception is
normally non-existent inside buildings, other positioning technologies are used
here when automatic positioning is desired. Wi-Fi or beacons (Bluetooth Low
Energy, BLE) are commonly used in this case to create a so-called “indoor GPS”.
A BLE beacon is convenient in several respects. The project highlights the use of
beacons to retrieve the user’s current location. The major advantage of using beacons is
that the user will have a seamless experience; the user is not required to scan any
marker for augmented reality to work. Hence the efficiency is higher, and the
time taken to respond to a user query is less than with other mechanisms.
We were successfully able to design the BLE beacon, which will emit the
Bluetooth Low Energy signals that can be detected by any Android device running on an
Android version greater than 4.0 (Ice Cream Sandwich). We were likewise
successfully able to design the API (Application Programming Interface), which will
collect all the scanned beacon data, process the data and then return the processed data
to the application.
5.1) FUTURE SCOPE
In this project we are using beacons only for detecting the user’s location, but beacons
can be used in other applications too.

6) REFERENCES
[1] Mr. Raviraj S. Patkar, Mr. S. Pratap Singh, Ms. Swati V. Birje, “Marker Based
Augmented Reality Using Android OS “(2013) - International Journal of Advanced
Research in Computer Science and Software Engineering
http://www.ijarcsse.com/docs/papers/Volume_3/5_May2013/V3I4-0388.pdf
[2] Mr. Prasad Renukdas, Mr. Rohit Ghundiyal, Mr. Harshavardhan Gadgil, Mr. Vishal
Pathare, “Markerless Augmented Reality Android App For Interior Decoration”(2013)-
International Journal of Engineering Research & Technology
https://www.ijert.org/view-pdf/3110/markerless-augmented-reality-android-app-for-interior-decoration-
[3] Mr. Yoshitaka Tokusho , Mr. Steven Feiner, “Prototyping an Outdoor Mobile
Augmented Reality Street View Application”(2009)
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.472.9624&rep=rep1&type=pdf
[4] Geiger, Philip and Schickler, Marc and Pryss, Rüdiger and Schobel,
Johannes and Reichert, “Location-based Mobile Augmented Reality Applications:
Challenges, Examples, Lessons Learned” (2014) - 10th Int'l Conference on Web
Information Systems and Technologies
http://dbis.eprints.uni-ulm.de/1028/1/Geig_2014.pdf
[5] Joonyoung Jung, Dongoh Kang, Changseok Bae, “Distance Estimation of Smart
Device using Bluetooth”(2013) - The Eighth International Conference on Systems and
Networks Communications
https://www.thinkmind.org/index.php?view=article&articleid=icsnc_2013_1_30_20039
[6] Rachel Metz, ‘Augmented Reality Is Finally Getting Real’, 2012. [Online]. Available:
https://www.technologyreview.com/s/428654/augmented-reality-is-finally-getting-real/
[7] Jimmy & Omar, ‘Augmented Reality’, 2012. [Online]. Available:
https://prezi.com/zzpvrwqk1bns/augmented-reality/
[8] Danielle Muoio, ‘The 10 coolest things happening with augmented reality right now’,
2015. [Online]. Available: http://www.businessinsider.in/The-10-coolest-things-happening-with-augmented-reality-right-now/articleshow/49313058.cms#microsofts-augmented-reality-headset-the-hololens-lets-a-roboarmy-invade-your-living-room-it-debuts-next-year-1
[9] Lily Prasuethsut, ‘Everything you need to know about augmented reality: Then, now
& next’, 2016. [Online]. Available: http://www.wareable.com/ar/everything-you-need-to-
know-about-augmented-reality
[10] The Economist, ‘The difference between virtual and augmented reality’, 2016.
[Online]. Available: http://www.economist.com/blogs/economist-explains/2016/04/economist-explains-8