
G. B. Pant Engineering College, New Delhi


DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING

SYNOPSIS of Minor Project Aug-Dec 2021


On
Product Dimension Mapping with Augmented Reality
B.Tech (CSE - 7th Semester)

Submitted by:
Hunny Vedwal - 41020902718
Mohit Khulbe - 41420902718
Vivek Bharti - 04620902718

Submitted to:
Mrs. Jyoti Tripathi


Problem Statement
Customers often purchase the wrong product because of poor interpretation of size, misleading size comparisons, the use of different measurement standards, and inaccurate depictions of items for marketing purposes, leading to a series of returns and re-orders.

Motivation
Augmented Reality is a relatively new technology, and many companies are trying to implement it in some way because it gives the customer an immersive experience. One promising use case for this technology is the online retail sector.

In a report compiled by Invespcro [1], at least 30% of all products ordered online are returned, compared to 8.89% in brick-and-mortar stores. Among online shoppers, 22% received a product that did not match the description or the images shown on the website, and 23% received the wrong item altogether. Incidents like these erode trust in e-commerce and cause heavy losses for the platforms as well as the sellers.

Before buying any product, an average customer considers many factors, such as size, color, customization options, and whether the product fits their lifestyle. Online retailers are not able to convey these factors as well as offline stores can.

Overview of Project
The Product Dimension Mapping system is an effective tool for reducing the number of product returns and re-orders. In this project, we aim to develop Android software that helps users visualize a product in their own space: a 3D model rendered in real time, with the same size and color as the product in real life. Along with that, we aim to provide a real-time measurement feature with which users can accurately place the product in 3D space and get an idea of what it would actually look like there.

To implement this software, we will use Augmented Reality technology with Google’s ARCore in the Android development environment [2]. We will also try to integrate deep learning and machine learning models into the project.
Why use Augmented Reality?

● Augmented Reality is a relatively new technology that many companies are trying to incorporate into their products.
● It provides an immersive experience: customers can easily visualize the product and get a better idea of it in the real world.
● It is easily accessible, since it requires only a smartphone, which many people already have.
● Unlike VR, which needs additional hardware (a headset), AR requires no extra accessories.
● It can be implemented across the vast online retail sector.

Objective and Scope of the project


In our Android application, we propose solutions for two objectives:

1. To develop Android software that gives users a life-sized 3D visualization of the product in their own space using Augmented Reality, so that users experience a real-time visualization of the product.
2. To provide users with a real-time virtual dimension-measuring feature in the software.

Literature Survey
In this literature survey, we have examined existing software that uses Augmented Reality. We have gone through the technologies used, as well as the limitations and challenges present in the existing work. The existing applications are as follows:

1. Harley-Davidson developed an AR app [3] in collaboration with Theia Interactive that shoppers can use in the showroom to view motorcycles they might be interested in buying and to customize them in the app to see which colors and features they might like. They used Unreal Engine to implement the AR app. To visualize a detailed model while keeping the application fast, they used Unreal’s auto-LOD system, which decreases the number of polygons in the model without a visible drop in quality, thereby achieving a high level of photo-realism. They faced challenges in preparing assets for import but overcame them with the help of Datasmith.
2. The Gatwick Airport passenger app [4] won a number of awards for its creative use of AR technology. With the help of more than 2,000 beacons throughout its two terminals, passengers can use AR maps on their mobile phones to navigate through the airport. As the app matures, it might eventually help improve traffic flow in the airport. However, an implementation using physical beacons is very costly.

3. The Ikea Place app [5] helps users with their buying decisions. The app was built using Apple’s ARKit technology; it allows users to scan their room and design the space by placing Ikea products in a digital image of the room to create a new environment with the new products. However, the application currently offers only a few furniture items for 3D visualization.

4. Cosmetic company Sephora [6] uses AR technology to let customers try out different looks, including eye, lip, and cheek products in various colors, directly on a digital image of their own face. This has increased their sales drastically. One limitation of the application is that not all products are available for facial try-on.

5. Scapic has implemented Flipkart’s “View in your room” [7], which helps users with 3D visualization of products in their own space. It has been implemented using WebXR. However, only furniture and a few large appliances are available for visualization, and surface recognition is only average.

After going through the existing work in this field, we reached the following conclusions:
1. Visualizing a model with a high polygon count makes the application slower. To counter this, a module should be implemented that decreases the polygon count of geometry not currently visible in the AR scene.
2. Some implementations use physical beacons, which are costly when deployed at large scale. Image-recognition models can overcome this, although the application will not be as fast as with physical beacons.
3. Very few products are available for visualization because their 3D models have not been created. Third-party 3D models can be added to the existing dataset [8].
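The polygon-reduction idea in conclusion 1 can be illustrated with a minimal level-of-detail (LOD) selector. This is an illustrative sketch, not ARCore or Unreal API code; the polygon counts and distance thresholds are hypothetical.

```java
// Illustrative sketch (not ARCore or Unreal API code): pick a
// level-of-detail (LOD) variant of a product model based on its
// distance from the camera, so distant models use fewer polygons.
public class LodSelector {
    // Hypothetical polygon counts for three pre-built variants of one model.
    static final int[] POLYGONS = {120_000, 30_000, 6_000}; // high, medium, low

    // Index into POLYGONS: closer models get more detail.
    static int selectLod(double distanceMeters) {
        if (distanceMeters < 1.5) return 0; // close-up: full detail
        if (distanceMeters < 4.0) return 1; // mid-range
        return 2;                           // far away: coarse mesh
    }

    public static void main(String[] args) {
        System.out.println(POLYGONS[selectLod(0.8)]); // prints "120000"
        System.out.println(POLYGONS[selectLod(6.0)]); // prints "6000"
    }
}
```

Unreal’s auto-LOD system generates such variants automatically; the point of the sketch is only that detail can drop with distance without a visible loss in quality.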

Methodology
“Google ARCore”, together with Unity AR Foundation, will be used as the core implementation method. It offers a wide range of customizable features, such as motion tracking, environmental understanding, and depth understanding. As our application focuses on providing accurate 3D visualization of the product, it requires tracking of the environment via the camera feed, along with an understanding of the different items in the vicinity. While placing the 3D model, the available space and depth are crucial for a feasible placement of the product.

Our application will take advantage of features in the following ways:

Google’s ARCore
Fundamentally, ARCore does two things: it tracks the position of the mobile device as it moves, and it builds its own understanding of the real world. ARCore provides SDKs for many of the most popular development environments. These SDKs provide native APIs for all of the essential AR features, such as motion tracking, environmental understanding, and light estimation. With these capabilities, you can build entirely new AR experiences or enhance existing apps with AR features [2].

● Motion Tracking: ARCore's motion tracking uses the phone's camera to identify
interesting points, called features, and tracks how those points move over time. With a
combination of the movement of these points and readings from the phone's sensors,
ARCore determines both the position and orientation of the phone as it moves through
space.
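As a simplified illustration of what pose tracking provides, the sketch below re-expresses a world-space anchor in camera-relative coordinates from the camera’s position and yaw. Real ARCore poses are full 6-DoF transforms with quaternion rotations; this 2D ground-plane version is a stand-in to show the translate-then-rotate step, and all names in it are our own.

```java
// Simplified stand-in for pose math (not the ARCore API): once the
// phone's position and yaw are known, a world-space anchor can be
// re-expressed in camera-relative coordinates by translating and then
// rotating. Real ARCore poses are 6-DoF with quaternion rotations.
public class PoseDemo {
    // Map a world point (x, z) on the ground plane into the frame of a
    // camera at (camX, camZ) rotated by yawRad about the vertical axis.
    static double[] worldToCamera(double camX, double camZ, double yawRad,
                                  double x, double z) {
        double dx = x - camX, dz = z - camZ;
        double cos = Math.cos(-yawRad), sin = Math.sin(-yawRad);
        return new double[] {dx * cos - dz * sin, dx * sin + dz * cos};
    }

    public static void main(String[] args) {
        // A camera at the origin with zero yaw sees an anchor 2 m ahead at z = 2.
        double[] p = worldToCamera(0, 0, 0, 0, 2);
        System.out.printf("%.1f, %.1f%n", p[0], p[1]); // prints "0.0, 2.0"
    }
}
```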

● Depth detection: The Depth API helps a device’s camera understand the size and shape of the real objects in a scene. It creates depth maps, thereby adding a layer of realism to your apps [2]. It helps position the 3D model with precision, consistent with all the other real objects surrounding it.
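To show how a depth map supports placement, the sketch below checks whether a product of a given depth fits between a recognized surface point and the wall behind it. It assumes depth samples arrive as 16-bit millimeter values, as ARCore’s depth image provides; the sample values and helper names are our own, not ARCore calls.

```java
// Sketch of a depth-map placement check (not the ARCore API): depth
// samples are assumed to be 16-bit distances in millimetres, and the
// test is whether the product's depth fits between a surface point and
// the wall behind it. Sample values are hypothetical.
public class DepthCheck {
    // Convert a raw depth sample (millimetres) to metres.
    static double toMetres(int depthMm) {
        return depthMm / 1000.0;
    }

    // Does a product of the given depth fit between the surface and the wall?
    static boolean fits(int surfaceDepthMm, int wallDepthMm, double productDepthM) {
        return toMetres(wallDepthMm) - toMetres(surfaceDepthMm) >= productDepthM;
    }

    public static void main(String[] args) {
        // Floor point 1.2 m away, wall 2.0 m away, sofa 0.9 m deep:
        // only 0.8 m of free space, so the sofa does not fit.
        System.out.println(fits(1200, 2000, 0.9)); // prints "false"
    }
}
```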

● 3D model: We are using the Amazon Berkeley Objects (ABO) dataset for our 3D models. The dataset contains high-quality 3D models with 4K texture maps for physically based rendering, covering more than 7,900 products. The models are provided in the standard glTF 2.0 format [9].

● Scene Viewer: The 3D models will be rendered using Scene Viewer, which enables 3D and AR experiences from a website or Android app. It lets users easily preview, place, view, and interact with web-hosted 3D models in their environment [2].
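Scene Viewer is invoked through an Android intent that targets a specially formed URL carrying the model’s address. The sketch below only builds that URL, since firing the intent requires a device; the glTF address is a hypothetical example, not a real ABO model.

```java
// Build the URL that a Scene Viewer intent points at. The `file`
// parameter carries the web-hosted glTF/GLB model; `mode=ar_preferred`
// asks for the AR experience when the device supports it.
// The model address below is a hypothetical placeholder.
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class SceneViewerUrl {
    static String buildUrl(String gltfUrl) {
        return "https://arvr.google.com/scene-viewer/1.0?file="
                + URLEncoder.encode(gltfUrl, StandardCharsets.UTF_8)
                + "&mode=ar_preferred";
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("https://example.com/chair.glb"));
    }
}
```

On a device, this URL would be handed to an `android.intent.action.VIEW` intent so that Google Play Services for AR can open the model.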

For implementation, our proposed Android software will adhere to the following workflow:

● Camera feed input to the application
● Surface recognition
● Tracking of the recognized surface
● User search for the product’s 3D model in the database [9]
● Google ARCore APIs working in the backend
● Rendering of the 3D model
● Display of the completed image
● Adjustment of the size and ratio of the product
● Confirmation of the buying decision

Fig-1: Application flowchart[10]

Hardware and Software Requirements


Hardware Requirements:
● 2 GB RAM
● 500 MB storage
● Smartphone (Android device)
● Internet connection
● 5-megapixel camera or above
● Any depth camera / time-of-flight sensor (required for depth mapping)

Software Requirements:
● Android 7.0 Nougat and above.
● Google Play Services for AR

● Java / Unity AR Foundation

ARCore API:

○ Depth API

○ Recording and Playback API

○ Lighting Estimation API

○ Augmented Faces API

○ Augmented Images API

○ Cloud Anchors API

○ Android Camera2 API

○ Scene Viewer

References
[1] Khalid Saleh; “E-commerce Product Return Rate – Statistics and Trends [Infographic]”,
Invespcro website, 11 Apr 2021, Accessed on: 23 Sept, 2021, [Online], Available:
https://www.invespcro.com/blog/ecommerce-product-return-rate-statistics/
[2] Google; “Google ARCore documentation”; Google; Accessed on: 19 Sept, 2021;
[Online]; Available: https://developers.google.com/ar/develop/
[3] Ken Pimentel; “Theia Interactive’s Harley Davidson AR experience showcases the
potential of real-time”, Unreal Engine, May 4, 2018, Accessed on: 25 Sept, 2021, [Online],
Available: https://cutt.ly/9EGwWA6
[4] Kathleen Villaluz, “Gatwick Airport Uses Augmented Reality to Help Catching Flights”,
Interesting Engineering, May 28, 2017, Accessed on: 26 Sept, 2021, [Online], Available:
https://interestingengineering.com/gatwick-airport-uses-augmented-reality-help-catching-flig
hts
[5] Hao Jiang; “UX Case Study: IKEA Place”; Medium; 9 June, 2019, Accessed on: 27
Sept, 2021; [Online]; Available:
https://medium.com/@HausJiang/ux-case-study-ikea-place-a66319510023
[6] Alison DeNisco Rayome; “How Sephora is leveraging AR and AI to transform retail and
help customers buy cosmetics”; Tech Republic; 15 Feb, 2018; Accessed on: 26 Sept, 2021;
[Online]; Available:
https://www.techrepublic.com/article/how-sephora-is-leveraging-ar-and-ai-to-transform-retail
-and-help-customers-buy-cosmetics/
[7] Anuj Bhatia; “Flipkart’s new 3D camera kicks off a new vision for AR in retail”; The
Indian Express; 19 July, 2021; Accessed on: 1 Oct, 2021; [Online]; Available:
https://indianexpress.com/article/technology/tech-news-technology/flipkarts-new-3d-camera-
kicks-off-a-new-vision-for-ar-in-retail-7410293/
[8] Just Use App; “Ikea Place App Reviews”; Just Use Website; 30 Aug 2021; Accessed on:
1 Oct, 2021; [Online]; Available:
https://justuseapp.com/en/app/1279244498/ikea-place/reviews
[9] Matthieu Guillaumin; Thomas Dideriksen; Kenan Deng; Himanshu Arora; Arnab Dhua;
Jasmine Collins; Shubham Goel; Jitendra Malik, “Amazon Berkeley Objects (ABO)
Dataset”, Amazon.com, 2020, Accessed on: 22 Sept, 2021, Available:
https://amazon-berkeley-objects.s3.amazonaws.com/index.html
[10] James O Uhomoibhi; Clement E. Onime; Sandro Radicella; “MARE: Mobile
Augmented Reality Based Experiments in Science, Technology and Engineering”;
ResearchGate; 1 Dec, 2021; Accessed on: 1 Oct, 2021; [Online]; Available:
https://www.researchgate.net/figure/Technical-flowchart-for-video-see-through-augmented-re
ality-on-mobile-devices_fig2_296696644
