
Indoor Navigation Using Augmented Reality

Project report submitted in fulfillment of the requirement for the degree


of Bachelor of Technology

In

Computer Science and Engineering

By

Rishav Gupta (131304)


Shubham Sharma(131339)

Under the supervision of

(Ms. Ruhi Mahajan)

To

Department of Computer Science & Engineering and Information Technology

Jaypee University of Information Technology Waknaghat, Solan-173234,


Himachal Pradesh
CERTIFICATE

Candidate’s Declaration

This is to certify that the work being presented in the report entitled
“Indoor Navigation using Augmented Reality”, in partial fulfillment of the
requirements for the award of the degree of Bachelor of Technology in Computer
Science and Engineering/Information Technology, submitted to the Department of
Computer Science & Engineering and Information Technology, Jaypee University of
Information Technology Waknaghat, is an authentic record of our own work carried out
over a period from August 2016 to December 2016 under the supervision of Ms. Ruhi
Mahajan (Assistant Professor, Computer Science & Engineering Department).

The matter embodied in the report has not been submitted for the award of any other
degree or diploma.

Rishav Gupta, 131304 Shubham Sharma,131339

This is to certify that the above statement made by the candidates is true to the best of my
knowledge.

Ms. Ruhi Mahajan


Assistant Professor
Computer Science & Engineering Department
Dated:

i
ACKNOWLEDGEMENT

We owe our profound gratitude to our project supervisor Ms. Ruhi Mahajan, who took
keen interest and guided us all along in our project work titled “Indoor Navigation using
Augmented Reality”, till the completion of the project, by providing all the necessary
information for developing it. The project development helped us in research
and we got to know a lot of new things in our domain. We are really thankful to her.

ii
TABLE OF CONTENTS

CERTIFICATE ............................................................................................... i
ACKNOWLEDGEMENT .................................................................................. ii
LIST OF FIGURES ........................................................................................... v
LIST OF TABLES ......................................................................................... vi
ABSTRACT .................................................................................................. vii

1)INTRODUCTION…………………………………………………………………...1

1.1)PROBLEM STATEMENT ……………..…………………..……………………2

1.2)OBJECTIVES……………………..……………………….…………...…………3

1.3)METHODOLOGY……………..…………………………….……….….…………4

2)LITERATURE SURVEY

2.1) RESEARCH PAPER-1………………………………………..…………….5-12


2.2) RESEARCH PAPER-2……………………………………………………..13-15
2.3) RESEARCH PAPER-3……………………………………………………..16-17
2.4) RESEARCH PAPER-4……………………………………………………..18-19
2.5) RESEARCH PAPER-5………………………………………………………...20
2.6) ANDROID INTERFACE…………………………………………………..21-22
2.7) ANDROID ARCHITECTURE…………………………………………….23-25
2.8) ANDROID VERSIONS…………………..………………….…………… 26-28

2.9) BLE BEACONS……………………………………………………………….29

3.) SYSTEM DEVELOPMENT

3.1)SOFTWARE REQUIREMENTS………………………………………………30

3.2)HARDWARE REQUIREMENTS……………………………………………...30

3.3)MAKING BEACONS…………………………………..……………………31-33

iii
3.4)SYSTEM DESIGN…………………………………..……………………..…34-36

3.5)API IMPLEMENTATION…………………………………..………………..37-38

4.) PERFORMANCE ANALYSIS………………………………………………..39-43

5)CONCLUSION………………………………………………………..…………...44

5.1)FUTURE SCOPE……………………………………………………..……........45

6.) REFERENCES………………………………………………………….….....46-47

iv
LIST OF FIGURES

Title Page No.

1. Reality – Virtuality - Continuum 6

2. A simple marker 7

3 Markerless Augmented reality 8

4. Worldwide Smartphone OS market share 9

5. Proposed System Architecture 10

6. AR street view 3-D components 17

7. Algorithm realizing the location view 19

8. User interface for android 4.4 22

9. The android architecture 25

10. The android version global usage 28

11. Using Arduino as ISP 32

12. Pin out of Atmel Atmega 328P IC 32

13. Connection of Bluetooth module with IC 33

14. Use case Diagram 34

15 Class Diagram 35

16 Data Flow diagram 36

17 Black box testing 39

18 Unit testing 40

19 Received RSSI values at different distances 43

v
LIST OF TABLES

Title Page No.

1. Difference between AR and VR 6

2. Global variables used in API 37

3 Test case-1 41

4. Test case-2 41

5. Test case-3 41

6. Test case-4 41

7. Test case-5 42

8. Test case-6 42

9. Test case-7 42

10. Future Scope of beacons 45

vi
ABSTRACT

This project is intended to give an implementation of indoor as well as outdoor
navigation using augmented reality and beacons (Bluetooth Low Energy devices).
Augmented Reality, or AR, is an emerging technology in which one’s perception of the
real-time environment is enhanced by superimposing computer-generated information such
as graphical, textual or audio content, as well as objects, onto a display screen. The
proposed application is an Android mobile based application which will be compatible
with all versions of Android having API level above 16 (Jelly Bean).

In particular, this Android application explicitly addresses issues caused by the relatively
inaccurate GPS and orientation sensors during navigation, by using beacons which interact
with the Android application via an API.

vii
1. INTRODUCTION

Augmented reality (AR) is a live direct or indirect view of a physical, real-world

setting whose elements are augmented by computer-generated sensory input such as
sound, video, graphics or GPS data. Augmented Reality turns the environment
completely around you into a digital interface by placing virtual objects within the real
world, in real time.

The proposed project is an implementation of an Android based application which is

compatible with all Android devices running on versions greater than 4.0 (Ice Cream
Sandwich) and relies on beacons to obtain the user’s location.

This project is intended to give an implementation of augmented reality to navigate the user
indoors as well as outdoors. The project highlights the use of beacons to retrieve the user’s
current location. The major advantage of using beacons is that the user will have a
seamless experience; the user is not required to scan any marker for augmented reality to
work. Hence the efficiency is higher, and the time taken to respond to a user query is
less than with other mechanisms. The working of the application is also described in this
documentation. The documentation also lists the requirements for the project
implementation.

1
1.1) PROBLEM STATEMENT

While today’s consumer GPS chips can sometimes manage to work (receive signals from
enough satellites to compute a location) inside a building, the resulting accuracy is
generally not good enough to be useful. The signals from the satellites are attenuated and
scattered by roofs, walls and other objects. Besides, the error range of many GPS chips
can be larger than the indoor space itself.
In particular, this Android application explicitly addresses issues caused by the relatively
inaccurate GPS and orientation sensors during navigation, by using beacons which interact
with the Android application via an API.

2
1.2) OBJECTIVE

Short-term – The short-term objective of the project is a complete understanding of the

project assigned. The concepts related to the project should be clear and the objective
behind the project should be known to the entire team. The focus should not be only on
theoretical concepts but also on their practical implications. The technologies used to
implement the project should be familiar, and we should be able to apply them to our
respective project.

Long-term – The long-term objective is that students are exposed to an industrial
environment, which should help us in the future when we work on real-time projects. We
should gain experience of working with a team and learn to cooperate with our team
members.

3
1.3) METHODOLOGY

 Making Beacons: This includes creating the beacons which will emit the Bluetooth
low energy signals which will be detected by the API present in the Android
application.

 UI Design: We create an activity in the Android application in which the user will
have to enter the destination (it should match a destination stored in the database),
and then the current location will be detected and the user will be guided to the
destination using augmented reality.

 Indexing of Beacons in database: This module will insert into the database the MAC
address of the beacon which is at the destination, and the MAC addresses of all the
beacons which the user will encounter during the navigation.

 Implementation of the Android application in Android Studio: This module will find
the minimum path based on the beacons encountered during navigation, and in case
the user goes in the wrong direction, the user will be guided back to the right direction
by finding the minimum path to the destination (a sketch of one possible shortest-path
computation is given after this list).

 Testing: Various kinds of testing, like black box testing, white box testing etc., are performed.

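The report does not specify how the minimum path over the encountered beacons is computed. Since the indexed beacons form a small, unweighted graph (edges between physically adjacent beacons), a breadth-first search keyed by beacon MAC address is one simple option. The following Java listing is a minimal sketch of that idea, not the project's actual code; the BeaconRouter class name and the adjacency list built from the indexed beacons are assumptions.

import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Deque;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class BeaconRouter {

    // Adjacency list keyed by beacon MAC address: which beacons are physically adjacent.
    private final Map<String, List<String>> adjacency = new HashMap<String, List<String>>();

    public void addEdge(String macA, String macB) {
        getOrCreate(macA).add(macB);
        getOrCreate(macB).add(macA);
    }

    private List<String> getOrCreate(String mac) {
        List<String> neighbours = adjacency.get(mac);
        if (neighbours == null) {
            neighbours = new ArrayList<String>();
            adjacency.put(mac, neighbours);
        }
        return neighbours;
    }

    // Breadth-first search: returns the shortest chain of beacons from source to destination,
    // or an empty list if the destination cannot be reached.
    public List<String> shortestPath(String sourceMac, String destMac) {
        Map<String, String> parent = new HashMap<String, String>();
        Set<String> visited = new HashSet<String>();
        Deque<String> queue = new ArrayDeque<String>();
        queue.add(sourceMac);
        visited.add(sourceMac);
        while (!queue.isEmpty()) {
            String current = queue.poll();
            if (current.equals(destMac)) {
                // Walk back through the parent links to rebuild the path.
                List<String> path = new ArrayList<String>();
                for (String node = destMac; node != null; node = parent.get(node)) {
                    path.add(node);
                }
                Collections.reverse(path);
                return path;
            }
            List<String> neighbours = adjacency.get(current);
            if (neighbours == null) {
                continue;
            }
            for (String next : neighbours) {
                if (visited.add(next)) {
                    parent.put(next, current);
                    queue.add(next);
                }
            }
        }
        return Collections.emptyList();
    }
}

Whenever the nearest beacon reported by the API changes, the application can recompute shortestPath() from that beacon to the destination beacon, which is how a user who takes a wrong turn would be re-routed.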
4
2. LITERATURE SURVEY

2.1) Title: “Marker Based Augmented Reality Using Android OS “(2013)

Augmented Reality, or AR, is an emerging technology in which one’s perception of the

real-time environment is enhanced by superimposing computer-generated information
such as graphical, textual or audio content, as well as objects, onto a display
screen. The proposed application is an Android mobile based application which
will be compatible with all the existing and upcoming versions of the operating system.
The idea is to allow the user to view a virtual object in the real world using a marker
based AR system. The user provides images of the object, namely the front,
back, top, bottom, left and right side pictures of the object. These are placed onto a 3D
cube which makes up the complete virtual object. Thus an augmented environment is
created through the amalgamation of the real world and the generated object, and it
appears as though the real-world object and the virtual object coexist within the
environment. The advantage of this application compared to already existing 2D
applications is that it displays the object in 3D and enables the user to rotate it virtually.
It is inexpensive, as the user need not actually purchase the object to see how it fits in the
environment; instead he can try it before the purchase itself.
5

2.1.1) Difference between Augmented reality and Virtual Reality

Augmented Reality vs. Virtual Reality:

 AR takes our current reality and adds something to it; VR is an artificial, computer-generated
simulation or recreation of a real-life environment or situation.
 In AR, the user maintains a sense of presence in the real world; in VR, the senses are under
the control of the system.
 AR needs a mechanism to combine the virtual and real worlds; VR needs a mechanism to feed the
virtual world to the user.

Table 1: Difference between Augmented and Virtual Reality

Augmented reality and virtual reality are inverse reflections of one another with regard to
what each technology seeks to accomplish and deliver for the user. Virtual reality
offers a digital recreation of a real-life setting, while augmented reality delivers virtual
elements as an overlay to the real world.

Figure 1: Reality - Virtuality - Continuum
6
2.1.2) Different Types of Augmented Reality

There are two basic types of augmented reality: marker-based, which uses a camera
and visual markers, and markerless, which uses positional data such as a
mobile's GPS and compass.

Marker Based:

Augmented Reality (AR) markers are images that can be detected by a
camera and used with software as the anchor point for virtual content placed
in a scene. Most are black and white, though colours can be used as long as the
contrast between them can be reliably recognized by a camera. Simple augmented
reality markers can consist of one or more basic shapes made up of black squares
against a white background. More elaborate markers can be created using simple images
that are still read properly by a camera, and these codes can even take the form of tattoos.

Figure 2: A Simple Marker

A camera is used together with AR software to detect augmented reality markers as the
location for virtual objects. Limitations on the types of augmented reality markers that
can be used depend on the software that recognizes them. The simplest types of
augmented reality markers are black and white images that consist of two-dimensional
(2D) barcodes.
7
Markerless:
In markerless augmented reality the image is gathered through the internet and displayed at
a specific location (which can be obtained using GPS). The application doesn’t require a
marker to display the content. It is more interactive than marker based augmentation.

Figure 3: Marker-less Augmented Reality


8
2.1.3) Why Android OS?

Since 2010, more and more stress has been given to the adoption
of Free and Open Source Software (FOSS). Android is leading the current OS
market, as shown in figure 4, because it is open source and developed by a
consortium of more than 86 leading MNCs called the Open Handset Alliance
(OHA). Android is also among the fastest growing technologies. More and
more applications have been developed and modified by third-party users.

Figure 4: Worldwide Smartphone OS market share


9
2.1.4) Proposed System Architecture
The proposed system is a marker based system and its architecture, as shown in figure 5,
contains the following modules:
1. Camera
2. Image Capturing Module
3. Image Processing Module
4. Rendering Module
5. Display Screen

Figure 5: Proposed system architecture

1. Camera: A real-world live video is fed as an input from the Android cell phone
camera to the Camera module. Displaying this live feed from the Android cell phone
camera is the “reality” in augmented reality. This live video stream is given as an input to
the Image Capturing Module.

10
2. Image Capturing Module: The input to the Image Capturing Module is the live video
feed from the camera of the mobile device. This module analyses the camera feed
by analyzing each frame in the video. This module generates binary images, i.e. digital
images that have only two possible values for each pixel. Typically the two colours
used for a binary image are black and white. These binary images are provided
as an input to the Image Processing Module.

3. Image Processing Module: Inputs to Image Processing Module are the binary images
from Image Capturing Module. These binary images are processed using an image
processing technique to detect the AR Marker. Detection of AR Marker is essential to
determine the position, where to place the virtual object. Once the AR Marker is detected,
its location is provided as an input to the Tracking Module.

4. Marker Tracking Module: The tracking module is “the heart” of the augmented
reality system; it calculates the relative pose of the camera in real time. The
term pose means the six degrees of freedom (DOF) position, i.e. the 3D location and
3D orientation of an object. The estimated pose is provided as an input to the Rendering Module.

5. Rendering Module: There are two inputs to the Rendering Module. The first is the estimated
pose from the Tracking Module and the second is the virtual object to be augmented. The
Rendering Module combines the real image and the virtual components using the
calculated pose and renders the augmented image on the display screen of
the mobile device.

11
2.1.5) Conclusion:

This paper proposes a marker based augmented reality application on the Android operating
system which helps to combine virtual objects with the real environment,
facilitating various applications as mentioned in the paper. The main advantage is the
use of low cost devices as compared to the expensive head mounted display
devices. Secondly, with the help of this approach the user need not buy a product to see how
it will fit into their environment. In future, images of objects from various views can be
fetched directly from vendors’ websites; these could be modelled into 3D objects and
augmented. Also, multiple objects could be augmented simultaneously, which is currently
a major challenge.
12

2.2) Title: “Marker less Augmented Reality Android App For Interior
Decoration”(2013)

This paper discusses mobile application of Augmented Reality (AR) on the Android
platform. The paper discusses existing SDKs for developing AR applications on
mobile platforms and elaborates their functionalities and limitations.

2.2.1) SDKs currently available for Android development:

1. ARToolkit: It is an open source marker based AR library developed at the


University of Washington. Some of the features of ARToolkit include:
 Single camera position/orientation tracking.
 Tracking code that uses simple black squares.
 The ability to use any square marker patterns.
 Easy camera calibration code.
 Fast enough for real time AR applications.
 Distributed with complete source code.

It is designed for personal computers and not for embedded devices. Hence
porting it directly to mobile platforms is difficult and impractical because it uses a
lot of FPU(floating point unit) calculations.

13
2. Vuforia:
Vuforia is an Augmented Reality SDK provided by Qualcomm. It supports
a variety of 2D and 3D target types including Image Targets, 3D Multi-Target
configurations, and a form of addressable Fiduciary Marker known as a Frame
Marker. Additional features of the SDK include localized Occlusion Detection using
“Virtual Buttons”, runtime image target selection, and the
ability to create and reconfigure target sets programmatically at runtime.
The latest release of the SDK, Vuforia 2.0, also supports cloud recognition,
where a user points his smartphone at a book, packaging or poster and
information about the item, like the price, reviews, etc., is shown.

3. Metaio Mobile SDK:


The Metaio SDK includes a powerful 3-D rendering engine in addition to plug-ins
for Unity. It has advanced tracking features, such as Markerless 2D and 3D
Tracking, client-based visual search. Some of its features are:
 Native App development for iOS, Android and PC (Windows)
 Includes powerful 3D Rendering engine.
 Highly advanced tracking and minimal artefacting.
 FREE commercial usage (with watermark)
 Huge developer support community

14
2.2.2) Algorithm used in developing the application:

Initialization
1. Initialize the video capture and camera parameters and acquire device position
information from the magnetic compass and accelerometer.
Main Loop
2. Freeze video input frame.
3. Get device position and orientation from the magnetic compass.
4. Calculate the camera transformation relative to the device position.
5. Draw the virtual objects in the center of the screen.
Shutdown
6. Close the video capture down
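Purely as an illustration of how the six steps above could be organized, the following Java skeleton stubs out every device-specific call; it is not the paper's code (which uses the Metaio SDK), and all class and method names here are placeholders rather than Metaio or Android APIs.

public class ArLoopSkeleton {

    private volatile boolean running = true;

    public void run() {
        initCameraAndSensors();                          // Step 1: video capture, compass, accelerometer
        while (running) {
            Frame frame = captureFrame();                // Step 2: freeze the current video frame
            Pose devicePose = readOrientation();         // Step 3: device position and orientation
            Pose cameraPose = toCameraSpace(devicePose); // Step 4: camera transform relative to device
            drawVirtualObjects(frame, cameraPose);       // Step 5: draw virtual objects at screen centre
        }
        shutdownCapture();                               // Step 6: close the video capture
    }

    public void stop() {
        running = false;
    }

    // --- Placeholders: the real calls depend on the SDK used (Metaio in the paper). ---
    static class Frame {}
    static class Pose {}
    private void initCameraAndSensors() {}
    private Frame captureFrame() { return new Frame(); }
    private Pose readOrientation() { return new Pose(); }
    private Pose toCameraSpace(Pose devicePose) { return devicePose; }
    private void drawVirtualObjects(Frame frame, Pose cameraPose) {}
    private void shutdownCapture() {}
}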

2.2.3) Conclusion:

This paper provided a brief introduction to handheld Augmented Reality, discussed the
various free SDKs available for developing AR applications and outlined their merits and
limitations. The paper also discussed the hurdles in achieving fast and efficient tracking on
mobile devices. In the context of the paper, an Android application for interior
decoration purposes was developed using the Metaio SDK 4.1 described earlier.

15
2.3) Title: “Prototyping an Outdoor Mobile Augmented Reality Street View
Application”(2009)

The authors explain the early development of an experimental outdoor mobile augmented
reality (AR) application, “AR street view”, running on a Google Android Dev Phone 1.
This work addresses the design of interaction and visualization techniques for AR
user interfaces on mobile phones. Their experience suggests that usability is compromised by a
variety of issues, including tracking and registration errors caused by the relatively
inaccurate GPS and orientation sensors, the need to hold the phone away from
the body in potentially awkward poses, and the relatively small field of view covered by the
display and camera. The project focuses on research into interaction and visualization for
outdoor mobile phone AR user interfaces (UIs), rather than on outdoor registration
and tracking for these devices.

2.3.1) Implementation:

The prototype consists of four components: AR street view running on the Google
Android Dev Phone 1, the street database server, an external sensor, and a UMPC (or laptop
PC) equipped with a Bluetooth adapter. AR street view is built on the Google Android
platform 1.5, supporting a mobile device equipped with a camera, Bluetooth adapter, GPS
receiver, orientation sensor and cellular or Wi-Fi network adapter. 3D graphics are drawn
with OpenGL ES 1.0. Registration and tracking are implemented using either the built-in
location and orientation sensors, or external sensors. The street database server manages the
street coordinates (and attribute data) for the US, decoded from the US Census 2006
TIGER/Line files. The street database is queried using the MySQL 5.1 spatial
extensions, and accessed via HTTP from the application.

16
Figure 6: AR street view 3D components. (a) Street objects are registered with the real
world and annotated with labels. (b) Compass reticle shows heading. (c) Information
display shows the quantitative values of GPS, orientation, and other sensor
observations.

2.3.2) Usability Issues:

1. Position Inaccuracy:
The position tracking errors introduced by the relatively inexpensive location
sensors used in smartphones compromise the usability of applications that rely on
them. For example, the A-GPS in the Android Dev Phone 1 can sometimes report
more than a 40 m offset from its actual position.
2. Orientation Inaccuracy:
Similarly, the phone’s built-in orientation sensor can report significant static and
dynamic orientation errors, off by tens of degrees.
3. Small Field of View
The optics used in Smartphone cameras typically produce an image with a wider
field of view than would be seen looking directly through a rectangular frame the
size of the display. However, this is still far smaller than the field of view of even a
single human eye. Thus, only a relatively small portion of the user’s field of
view can be visibly augmented at any point in time.

17
2.4) Title: “Location-based Mobile Augmented Reality Applications: Challenges,
Examples, Lessons Learned” (2014)

The technical capabilities of modern smart mobile devices more and more
enable us to run desktop-like applications with demanding resource
requirements in mobile environments. In this area, numerous concepts, techniques,
and prototypes have been introduced, focusing on basic implementation issues of mobile
applications. However, only little work exists that deals with the design and
implementation (i.e., the engineering) of advanced smart mobile applications and reports on
the lessons learned in this context. In this paper, the authors give profound insights
into the design and implementation of such an advanced mobile application,
which enables location-based mobile augmented reality on two different mobile operating
systems (i.e., iOS and Android). In particular, this kind of mobile application is
characterized by high resource demands, since various sensors must be queried at run time and
many virtual objects may have to be drawn in real time on the screen of the smart mobile
device (i.e., a high frame rate per second is required).

2.4.1) Architecture:

The architecture has been designed with the goal of being able to easily exchange and extend
its components. It comprises four main modules organized in a multi-tier
architecture. Lower tiers offer their services and functions through interfaces to upper tiers.
Based on this architectural design, modularity can be ensured; i.e., both data management
and distinct elements can be customized and extended on demand.

18
Figure 7: Algorithm realizing the location View

19
2.5) Title: “Distance Estimation of Smart Device using Bluetooth”(2013)
Distance estimation identifies the distance between two devices in a wireless network.
The Received Signal Strength Indication (RSSI) of Bluetooth can be used to estimate
the distance between two devices. The characteristic of the Bluetooth RSSI value
differs between environments. So, the authors tested the relation between
distance and Bluetooth RSSI value in different environments, namely an indoor
hall, a meeting room, and an Electromagnetic Compatibility (EMC) chamber environment. The
paper shows the distance characteristic of Bluetooth RSSI from these experimental results.
There are a lot of measurement errors in the Bluetooth RSSI raw data. The minimum RSSI
value is -88 dBm and the maximum RSSI value is -66 dBm at 11 m in the indoor hall
environment. The difference between the maximum value and the minimum value is 22 dBm. So,
it is hard to estimate the distance using Bluetooth RSSI raw data.
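As an aside not taken from the paper, a common way to turn an averaged RSSI value into a rough distance estimate is the log-distance path loss model, d = 10^((txPower - rssi) / (10 * n)). A minimal Java sketch follows; the measured power at 1 m (-59 dBm) and the path-loss exponent (2.0) are hypothetical calibration values that would have to be measured for a real deployment.

public class RssiDistance {

    // Hypothetical calibration constants: TX_POWER_DBM is the expected RSSI at 1 m,
    // PATH_LOSS_EXPONENT is roughly 2 in free space and higher indoors.
    private static final double TX_POWER_DBM = -59.0;
    private static final double PATH_LOSS_EXPONENT = 2.0;

    // Log-distance path loss model: d = 10 ^ ((txPower - rssi) / (10 * n)).
    public static double estimateDistanceMetres(double rssiDbm) {
        return Math.pow(10.0, (TX_POWER_DBM - rssiDbm) / (10.0 * PATH_LOSS_EXPONENT));
    }

    public static void main(String[] args) {
        System.out.println(estimateDistanceMetres(-59.0)); // about 1 m at the calibration point
        System.out.println(estimateDistanceMetres(-75.0)); // a few metres, strongly environment dependent
    }
}

As the paper's measurements show, the raw RSSI spread is large, so an estimate like this is only meaningful after smoothing or averaging.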

2.5.1) Conclusion:

This paper addressed the distance estimation method and distance characteristic of
Bluetooth RSSI. The distance estimation is impossible with the RSSI raw and may be
possible coarsely with the RSSI average data in indoor hall environment. In meeting room
environment, it is hard to classify into inside and outside of the meeting room with the
RSSI raw data and may be possible to classify into inside and outside of the meeting room
with the RSSI average data.

20
2.6) Android Interface

Android's default user interface is based on direct manipulation, using touch inputs
that loosely correspond to real-world actions, like swiping, tapping, pinching, and reverse
pinching to manipulate on-screen objects, along with a virtual keyboard. Game
controllers and full-size physical keyboards are supported via Bluetooth. The response to
user input is designed to be immediate and provides a fluid touch interface, often using the
vibration capabilities of the device to provide haptic feedback to the user. Internal
hardware, such as accelerometers, gyroscopes and proximity sensors, is
used by some applications to respond to additional user actions, for
example adjusting the screen from portrait to landscape depending on how the
device is oriented, or allowing the user to steer a vehicle in a racing game by rotating the
device, simulating control of a steering wheel.
Android devices boot to the home screen, the primary navigation and information "hub"
on Android devices that is analogous to the desktop found on personal computers. Android
home screens are typically made up of app icons and widgets; app icons launch the
associated app, whereas widgets display live, auto-updating content, such as
the weather forecast, the user's email inbox, or a news ticker, directly on the home
screen. A home screen may be made up of several pages, between which the user can
swipe back and forth. Android's home screen interface is heavily
customizable, allowing users to adjust the look and feel of the device to their tastes.
Third-party apps available on Google Play and other app stores can extensively re-
theme the home screen, and can even mimic the look of other operating systems,
such as Windows Phone. Most manufacturers, and some wireless carriers,
customize the look and feel of their Android devices to differentiate themselves
from their competitors. Applications that handle interactions with the
home screen are called "launchers" because they, among other purposes, launch the
applications installed on a device.
21
2.6.1) Applications:

Android has a growing selection of third-party applications, which can be acquired by

users by downloading and installing the application's APK (Android application package)
file, or by downloading them using an application store program that allows users to
install, update, and remove applications from their devices. Google Play Store is the
primary application store installed on Android devices that comply with Google's
compatibility requirements and license the Google Mobile Services software.
Due to the open nature of Android, a number of third-party application marketplaces also
exist for Android, either to provide a substitute for devices that are not allowed to ship with
the Google Play Store, to provide applications that cannot be offered on Google Play Store
due to policy violations, or for other reasons.

Figure 8: User Interface for Android 4.4(Kitkat)


22
2.7) Android Architecture:

The Android OS is composed of different layers of software. Each layer provides different

services to the layer just above it. Together they make up the OS, middleware and
applications.
The different layers in the Android architecture are:

1. Linux Kernel

The base layer is the Linux kernel. The entire Android OS is built on top of the Linux
kernel, with some additional architectural changes. The term kernel refers to the
core of the whole operating system.
It is this Linux kernel that interacts with the hardware, and it contains all
the essential hardware drivers. Drivers are programs that control and communicate with
the hardware.
For example, consider the Bluetooth function. Almost all devices have Bluetooth hardware
in them. Therefore the kernel must include a Bluetooth driver to communicate with the
Bluetooth hardware.
The Linux kernel also acts as an abstraction layer between the hardware and the
other software layers. As Android is built on a widely popular and proven
kernel, porting Android to a variety of hardware becomes an almost painless task.

2. Libraries

The next layer is Android’s native libraries. It is this layer that enables the device to
handle diverse types of data. These libraries are written in C or C++ and are
specific to a particular hardware.
Some of the important native libraries include the following:

Surface Manager: It is used for compositing windows with off-screen

buffering. Off-screen buffering means the apps can’t draw directly onto the screen;
instead the drawings go to an off-screen buffer. There they are combined with other
drawings to form the final screen the user will see. This off-screen buffering is what
makes the transparency of windows possible.
23
Media framework: The media framework provides different media codecs, allowing
the recording and playback of various media formats.
SQLite: SQLite is the database engine used in Android for data storage purposes.

WebKit: It is the browser engine used to display HTML content.


OpenGL: Used to render 2D or 3D graphics content to the screen.

3. Android Runtime

The Android Runtime consists of the Dalvik Virtual Machine and the core Java libraries.
Dalvik Virtual Machine

It is a kind of JVM used in Android devices to run apps and is optimized for low
processing power and low memory environments. Unlike the JVM, the Dalvik Virtual
Machine doesn’t run .class files; instead it runs .dex files. The .dex files are built from
.class files at the time of compilation and provide higher efficiency in low resource
environments. The Dalvik VM allows multiple instances of the virtual machine to be created
simultaneously, providing security, isolation, memory management and threading support.

4. Application Framework

These are the blocks that our applications directly interact with. These programs
provide the basic functions of the phone, like resource management, voice call
management etc. As a developer, you can think of these as the basic tools with
which we build our applications.
Important blocks of the Application Framework are:
Activity Manager: Manages the activity life cycle of applications.
Content Providers: Manage data sharing between applications.

Telephony Manager: Manages all voice calls. We use the telephony manager if we want to
access voice calls in our application.
Location Manager: Location management, using GPS or cell towers.

Resource Manager: Manages the various types of resources we use in our application.
24
5. Applications

Applications are the top layer in the Android architecture and this is where our
applications reside. Several standard applications come pre-installed with
every device, such as:
 SMS client app
 Dialer
 Web browser
 Contact manager

Figure 9: The Android Architecture


25
2.8) Android Versions

The version history of the Android mobile operating system began with the
release of the Android alpha in November 2007. The first commercial version,
Android 1.0, was released in September 2008. Android is continually developed by
Google and the Open Handset Alliance (OHA), and has seen a number of updates to its
base operating system since the initial release.

Android 1.5 (Cupcake) - April 2009

 Auto-rotation allows for an easier landscape-to-portrait transition.


 Copy-and-paste extends from input fields to the browser.

Android 1.6 (Donut) - September 2009

 CDMA support opens Android up to all carriers.

 Multiple screen resolutions are supported for the first time.
 A battery-usage indicator is introduced to show the user which apps and services
are using up the battery.


Android 2.0 (Éclair) - October 2009

 Google Maps Navigation is added, bringing free
turn-by-turn directions to the phone.
 Support for multiple accounts is added, with separate Contacts,
Email and Calendar sync settings for each account.

Android 2.2 (Froyo) - May 2010

 Dalvik VM: a Just-in-time (JIT) compiler brings major speed
enhancements to Android.
 A new API enables users to push content directly from the Chrome browser on the
desktop to the Android smartphone.
26
Android 2.3 (Gingerbread) - December 2010
 The on-screen keyboard is redesigned to improve typing speed and accuracy, and
suggestions are now shown as you type.
 Copy-and-paste is greatly improved, with arrows on either side of the
initially selected word that can be dragged to grab the appropriate text.

Android 3.0 (Honeycomb) - February 2011

 The Action Bar is introduced, providing a consistent space for an app's
most-used functions in the upper-right corner of the current app.
 Widgets are much more interactive, and let users flip through stacked
content or scroll through content displayed in the widget.
Android 4.0 (Ice Cream Sandwich) - October 2011

 All navigation is brought on-screen, meaning it is now possible to operate a
device with only the power and volume buttons.
 Widgets are now resizable, showing more or less content depending
on how large the user makes them.

Android 4.1 (Jelly Bean) - June 2012

 Google Now is introduced as part of Search. It displays information in cards
that Google believes is relevant to you based on the information
available in your Google apps and your search history.
 High Dynamic Range photography comes natively to Android with an HDR
camera mode.

Android 4.4 (KitKat) - September 2013

 Truly full-screen apps are possible for the first time, hiding even the
status bar.
 Google Drive becomes a default app as a portal to Google's office suite.
27
Android 5.0 (Lollipop) - November 2014

 Notifications now simply pop up as a banner, with options to
deal with them immediately or dismiss them, rather than having them
take over the whole screen.
 RAW image support is now available for photographers who want every
last bit of data available from the image sensor.

Android 6.0 (Marshmallow) – October 2015

Areas that are new or improved include:

 Reduced battery consumption while the device is asleep
 Improved permission controls
 Fingerprint support on devices equipped with a fingerprint sensor
 Easier migration from older devices

Android 7.0 (Nougat) – August 2016

 Ability to zoom in on the screen

 Ability to switch apps by double tapping the overview button

 Added Emergency information section

 Multi-window support, which supports floating apps on a desktop layout

 New Data Saver mode, which can force apps to reduce bandwidth usage
Figure 10: The android version global usage

28
2.9) Bluetooth Low Energy(BLE) Beacons

Bluetooth beacons are hardware transmitters - a class of Bluetooth Low

Energy (LE) devices that broadcast their identifier to nearby portable
electronic devices. The technology enables smartphones, tablets and other
devices to perform actions when in close proximity to a beacon. Bluetooth
beacons use Bluetooth Low Energy proximity sensing to transmit a
universally unique identifier picked up by a compatible app or operating system. The
identifier and the additional bytes sent with it can be used to determine
the device's physical location [4], track customers, or
trigger a location-based action on the device such as a check-in on social
media or a push notification.
Bluetooth beacons differ from some other location-based
technologies in that the broadcasting device (the beacon) is only a one-way transmitter
to the receiving smartphone or other receiving device, and they necessitate a specific app
installed on the device to interact with the beacons. This ensures that only the
installed app (not the Bluetooth beacon transmitter) can track users, potentially
against their will, as they passively walk among the transmitters.
29
3. SYSTEM DEVELOPMENT

3.1) SOFTWARE REQUIREMENTS:

 Android Studio
 JDK(Java Development Kit) 1.7
 Arduino IDE

3.2) HARDWARE REQUIREMENTS:

 For BLE(Bluetooth low energy) Beacon:


 Arduino UNO
 Atmel ATmega328P IC
 Oscillator: 16 MHz
 Capacitor: 22 pF
 Resistor: 10K
 Bluetooth module: nRF24L01+
 Coin battery: 3V
 LED

 System Requirements:
 CPU: 2.2 GHz Processor and above
 RAM: 4 GB or above
 OS: Windows 7 or above

30
3.3) MAKING BEACONS:

3.3.1) Using Arduino as an ISP(In-system Programmer)

We need to program the Atmel ATmega328P IC to make the beacon and then finally
connect it with the nRF module to obtain a beacon.

Steps to be followed to program the IC:


After verifying the connections, connect the Arduino to your laptop.
 Open the ArduinoISP sketch (under File > Examples) for your Arduino board.
 Select the items in Tools > Board and Tools > Serial Port that correspond to the board you
are using as the programmer (not the board being programmed).
 Upload the ArduinoISP sketch.
 Now, in the Tools > Board menu, select the target board on which we want to
upload the code.
 Now select Arduino as ISP in the Tools > Programmer menu.
 After setting Arduino as ISP in the Programmer menu, use the Burn Bootloader
command in the Tools menu.
 Now open the sketch that you want to upload to the target microcontroller (i.e. the
ATmega328P in our case).
 Now go to the File menu and select the option “Upload Using Programmer”.

31
Figure 11: Using Arduino as an ISP

Figure 12: Pin out of Atmel Atmega 328P IC


32
3.3.2) Assembling the beacon

After uploading the code to the IC (Atmel ATmega328P), we need to connect it to the
Bluetooth module (nRF24L01+) and a battery.

Figure 13: Connection of Bluetooth module with the IC and the battery
33
3.4) SYSTEM DESIGN:

3.4.1) USECASE DIAGRAM

A use case diagram represents a list of actions or steps, typically defining the interactions
between a role (known in the Unified Modeling Language (UML) as an “actor”) and a system,
to achieve a goal.

The actor can be a human or an external system.


Figure 14: Use case Diagram
Flow of events:
 Session Use case: A session is started when a user opens the application.
 Enter Destination Use case: The user is required to enter the destination
to which the user wishes to go.
 Get nearest beacon Use case: The API runs in the background, interacts with
the beacons, finds the nearest beacon and stores the result in a schema.
 Find the shortest path Use case: The application then finds the shortest
path to the desired location using the pre-stored information.
 Use Augmented reality to navigate Use case: After finding all the
parameters and knowing the current location of the user, we use
augmented reality to guide the user to the desired destination.
34

3.4.2) CLASS DIAGRAM

A class diagram in the Unified Modeling Language (UML) is a type of static structure
diagram that describes the structure of a system by showing the
system's classes, their attributes, operations (or methods), and the relationships among
objects.

Figure 15: Class diagram


35
3.4.3) DATA FLOW DIAGRAM

Data Flow Diagrams (DFDs) help us in identifying existing business processes. They are a

technique we benefit from particularly when we undertake business process
re-engineering. At its simplest, a data flow diagram looks at how data flows through
a system. It concerns things like where the data will come from and go to, as
well as where it will be stored.

Figure 16: Data flow diagram


36
3.5) API IMPLEMENTATION

3.5.1) Global Variables

1. private static final int REQUEST_ENABLE_BT - constant passed to startActivityForResult(); a
locally defined integer (which must be greater than 0).

2. private static final long SCAN_PERIOD - the time for which scanning will be done (in ms).

3. private BluetoothAdapter mBluetoothAdapter - to access the Bluetooth manager.

4. private Handler mHandler - to synchronize the starting and stopping of scanning.

5. private boolean mScanning - tells the handler whether scanning has to be done or not.

6. int Rssi - to store the signal strength of the beacon.

7. public static List<Beacon> list - to store all the beacons found.

8. int count - the higher the value of count, the higher the smoothness of the RSSI values of
the beacons.

9. Map<String, Float> dic; Map<String, Integer> miss; Map<String, String> nameDic - dictionaries
used to maintain the smoothing (a sketch of one possible smoothing scheme is given after this
table).

10. static Context cnt - to give the context to the API.

Table 2: Global variables used in API

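The report does not show how the count, dic and miss fields are combined; one plausible scheme, sketched below as an assumption rather than the project's actual code, is an exponential running average of RSSI keyed by the beacon's MAC address.

import java.util.HashMap;
import java.util.Map;

public class RssiSmoother {

    // Keyed by beacon MAC address; mirrors the report's Map<String, Float> dic.
    private final Map<String, Float> smoothedRssi = new HashMap<String, Float>();

    // Hypothetical smoothing factor: closer to 0 means smoother but slower to react.
    private static final float ALPHA = 0.25f;

    // Blend each new raw reading into the running average for that beacon.
    public float update(String mac, int rawRssi) {
        Float previous = smoothedRssi.get(mac);
        float next = (previous == null)
                ? rawRssi
                : previous + ALPHA * (rawRssi - previous);
        smoothedRssi.put(mac, next);
        return next;
    }
}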
37
3.5.2) Functions

 OctiseBeacons(Context context)
 Constructor to give the context to this class.

 checkBLEcompatability()
 Checks whether the android version is compatible with BLE or not.
 Return type: Boolean
1. true : when compatible
2. false : not compatible

 checkBluetoothCompatability()
 Checks whether the android version is compatible with Bluetooth or not.
 Return type: Boolean
1. true : when compatible
2. false : not compatible

 BluetoothEnabled(Activity activity)
 Checks whether the Bluetooth is enabled or not, prompts a message to the user
to enable Bluetooth if not enabled.
 Return type: void
 Parameters: activity
 Intent is sent to the user to enable the Bluetooth from this activity.

 scanLeDevice(final Boolean enable)

 Calls the callback which detects the BLE devices (a sketch of a typical
implementation is given below).
 Return type: void
 Parameters: enable
 Scanning is performed only when enable is true.

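The report lists scanLeDevice() but not its body; the following is a minimal sketch of how it is commonly written against the pre-Lollipop Android BLE API (BluetoothAdapter.startLeScan/stopLeScan, deprecated since API 21). The callback body and the Beacon bookkeeping are assumptions; mBluetoothAdapter, mHandler, mScanning and SCAN_PERIOD are the globals from Table 2.

// Fragment of the API class. Requires:
//   import android.bluetooth.BluetoothAdapter;
//   import android.bluetooth.BluetoothDevice;
//   import android.os.Handler;
private BluetoothAdapter.LeScanCallback mLeScanCallback =
        new BluetoothAdapter.LeScanCallback() {
            @Override
            public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord) {
                // Hypothetical handling: record the MAC, name and RSSI of the found beacon,
                // e.g. list.add(new Beacon(device.getAddress(), device.getName(), rssi));
            }
        };

public void scanLeDevice(final boolean enable) {
    if (enable) {
        // Stop scanning automatically after SCAN_PERIOD milliseconds.
        mHandler.postDelayed(new Runnable() {
            @Override
            public void run() {
                mScanning = false;
                mBluetoothAdapter.stopLeScan(mLeScanCallback);
            }
        }, SCAN_PERIOD);
        mScanning = true;
        mBluetoothAdapter.startLeScan(mLeScanCallback);
    } else {
        mScanning = false;
        mBluetoothAdapter.stopLeScan(mLeScanCallback);
    }
}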
38
4. PERFORMANCE ANALYSIS

4.1) SYSTEM TESTING

System testing of software is testing conducted on a complete, integrated system to evaluate
the system's compliance with its specified requirements. System testing falls
within the scope of black box testing, and as such, should require no
knowledge of the inner design of the code or logic.

It is very much about effective test case writing. In test case writing you
should describe the test scenarios and make use of use cases.

4.1.1) BLACK BOX TESTING:-

Black-box testing is a method of software testing that examines the
functionality of an application without peering into its internal structures or
workings.

Specific knowledge of the application's code/internal structure and programming
knowledge in general is not required. The tester is aware of what the software is
supposed to do but is not aware of how it does it. For instance, the tester
knows that a particular input returns a certain, invariable output but is not
aware of how the software produces the output in the first place.

Figure 17: Black box testing
39
4.1.2) UNIT TESTING:-

In computer programming, unit testing is a software testing method by which individual
units of source code, sets of one or more computer program
modules together with associated control data, usage procedures, and
operating procedures, are tested to determine whether they are fit for use. Intuitively, one
can view a unit as the smallest testable part of an application. In procedural
programming, a unit could be an entire module, but it is more
commonly an individual function or procedure.

The goal of unit testing is to isolate each part of the program and show
that the individual parts are correct.
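As an example of a unit test for this project, a pure helper is the easiest target; the following JUnit 4 sketch exercises the hypothetical RssiDistance helper sketched earlier in section 2.5 (both the helper and the test are illustrative assumptions, not the project's actual test suite).

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class RssiDistanceTest {

    // At the calibrated 1 m RSSI the estimate should be exactly 1 m.
    @Test
    public void estimateAtTxPowerIsOneMetre() {
        assertEquals(1.0, RssiDistance.estimateDistanceMetres(-59.0), 1e-6);
    }

    // A weaker signal (more negative RSSI) must map to a larger estimated distance.
    @Test
    public void weakerSignalMeansLargerDistance() {
        double near = RssiDistance.estimateDistanceMetres(-60.0);
        double far = RssiDistance.estimateDistanceMetres(-80.0);
        assertTrue(far > near);
    }
}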

Figure 18: Unit testing
40
4.2) TEST CASES

Test Case 1: Range 1 m

Module: BLE beacon
Operating System: Android OS, version 5.1
Input: Open the application and start scanning
Expected Output: All the beacons in range should be detected
Output: All the beacons in range were found

Test Case 2: Range 1.5 m

Module: BLE beacon
Operating System: Android OS, version 5.1
Input: Open the application and start scanning
Expected Output: All the beacons in range should be detected
Output: All the beacons in range were found

Test Case 3: Range 2 m

Module: BLE beacon
Operating System: Android OS, version 5.1
Input: Open the application and start scanning
Expected Output: All the beacons in range should be detected
Output: All the beacons in range were found

Test Case 4: Range 2.5 m

Module: BLE beacon
Operating System: Android OS, version 5.1
Input: Open the application and start scanning
Expected Output: All the beacons in range should be detected
Output: All the beacons in range were found


41

Test Case 5: Range 3 m

Module: BLE beacon
Operating System: Android OS, version 5.1
Input: Open the application and start scanning
Expected Output: All the beacons in range should be detected
Output: All the beacons in range were found

Test Case 6: Range 4 m

Module: BLE beacon
Operating System: Android OS, version 5.1
Input: Open the application and start scanning
Expected Output: All the beacons in range should be detected
Output: All the beacons in range were found

Test Case 7: Range 5 m

Module: BLE beacon
Operating System: Android OS, version 5.1
Input: Open the application and start scanning
Expected Output: All the beacons in range should be detected
Output: All the beacons in range were found


42
Figure 19: Received RSSI values at different distances

43
5. CONCLUSION

Indoor navigation deals with navigation within buildings. Because GPS reception is
normally non-existent inside buildings, other positioning technologies are used
here when automatic positioning is desired. WiFi or beacons (Bluetooth Low
Energy, BLE) are commonly used in this case to create a so-called "indoor GPS".

The BLE beacon is advantageous in several respects. The project highlights the use of
beacons to retrieve the user’s current location. The major advantage of using beacons is
that the user will have a seamless experience; the user is not required to scan any
marker for augmented reality to work. Hence the efficiency is higher, and the
time taken to respond to a user query is less than with other mechanisms.

We were successfully able to design the BLE beacon, which emits the
Bluetooth low energy signals that can be detected by any Android device running on an
Android version greater than 4.0 (Ice Cream Sandwich). We were likewise
successfully able to design the API (Application Programming Interface), which collects all
the scanned beacon data, processes the data and then returns the processed data to the
application.
44
5.1) Future scope

In this project we are using beacons only for detecting the user’s location, but beacons can
be used in other applications too.

Some applications of beacons which can be implemented in future in restaurants and
homes are mentioned below:

Restaurants:
 Electronic coupons for discounts on food and drinks at a specific location.
 Trigger notifications for discounts on food, the special menu and the chef’s favourite.
 Let the customer know how crowded the restaurant is.
 Let the customer order ahead.
 Contactless payment.
 Surveying, real-time feedback.
 Saves energy and reduces cost.
 For security.
 Push a notification when an old customer is near.

Smart homes:
 Beacons can be used to restrict device access (for children).
 Saves energy and reduces cost.
 For security.
 Seamless control for switching electrical appliances on/off.
 Beacons can help in preventing hazards (e.g. a fire breaking out).
 Beacons placed at particular locations can be associated with a particular app, and a
notification can be triggered so that the user can open it directly.
 Auto lock and login when you are near your laptop.
 Remember where you parked.
 Weather-based notifications.
 Assistant for disabled people.

Table 3: Future scope of beacons
45
REFERENCES

[1] Mr. Raviraj S. Patkar, Mr. S. Pratap Singh, Ms. Swati V. Birje, “Marker Based
Augmented Reality Using Android OS “(2013) - International Journal of Advanced
Research in Computer Science and Software Engineering
http://www.ijarcsse.com/docs/papers/Volume_3/5_May2013/V3I4-0388.pdf

[2] Mr. Prasad Renukdas, Mr. Rohit Ghundiyal, Mr. Harshavardhan Gadgil, Mr. Vishal
Pathare, “Markerless Augmented Reality Android App For Interior Decoration”(2013)-
International Journal of Engineering Research & Technology
https://www.ijert.org/view-pdf/3110/markerless-augmented-reality-android-app-for-
interior-decoration-

[3] Mr. Yoshitaka Tokusho , Mr. Steven Feiner, “Prototyping an Outdoor Mobile
Augmented Reality Street View Application”(2009)
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.472.9624&rep=rep1&type=pdf

[4] Geiger, Philip and Schickler, Marc and Pryss, Rüdiger and Schobel,
Johannes and Reichert, “Location-based Mobile Augmented Reality Applications:
Challenges, Examples, Lessons Learned” (2014) - 10th Int'l Conference on Web
Information Systems and Technologies
http://dbis.eprints.uni-ulm.de/1028/1/Geig_2014.pdf

[5] Joonyoung Jung, Dongoh Kang, Changseok Bae, “Distance Estimation of Smart
Device using Bluetooth”(2013) - The Eighth International Conference on Systems and
Networks Communications
https://www.thinkmind.org/index.php?view=article&articleid=icsnc_2013_1_30_20039

[6] Rachel Metz, ‘Augmented Reality Is Finally Getting Real’, 2012. [Online]. Available:
https://www.technologyreview.com/s/428654/augmented-reality-is-finally-getting-real/

46
[7] Jimmy & Omar, ‘Augmented Reality’, 2012. [Online]. Available:
https://prezi.com/zzpvrwqk1bns/augmented-reality/

[8] Danielle Muoio, ‘The 10 coolest things happening with augmented reality right now’,
2015. [Online]. Available: http://www.businessinsider.in/The-10-coolest-things-
happening-with-augmented-reality-right-now/articleshow/49313058.cms#microsofts-
augmented-reality-headset-the-hololens-lets-a-roboarmy-invade-your-living-room-it-
debuts-next-year-1

[9] Lily Prasuethsut, ‘Everything you need to know about augmented reality: Then, now
& next’, 2016. [Online]. Available: http://www.wareable.com/ar/everything-you-need-to-
know-about-augmented-reality

[10] The Economist, ‘The difference between virtual and augmented reality’, 2016.
[Online]. Available: http://www.economist.com/blogs/economist-
explains/2016/04/economist-explains-8

47
