
Augmented Reality and Virtual Reality

A LAB REPORT

Submitted by
Anushree Bajaj [RA2011051010022]

Under the Guidance of


Dr. Premalatha.G
(Assistant Professor, Department of Data Science and Business Systems)
In partial fulfillment of the Requirements for the Degree
of

BACHELOR OF TECHNOLOGY
COMPUTER SCIENCE AND ENGINEERING
with specialization in Gaming Technology

DEPARTMENT OF DATA SCIENCE AND BUSINESS SYSTEMS


FACULTY OF ENGINEERING AND TECHNOLOGY
SRM INSTITUTE OF SCIENCE AND TECHNOLOGY
KATTANKULATHUR – 603203

SRM INSTITUTE OF SCIENCE AND TECHNOLOGY


KATTANKULATHUR-603203
BONAFIDE CERTIFICATE

Certified that this Lab report titled “AR and VR Lab Experiments” is the bonafide work of “Anushree
Bajaj [RA2011051010022]”, who carried out the experimental work under my supervision. Certified
further that, to the best of my knowledge, the work reported herein does not form part of any other thesis or
dissertation on the basis of which a degree or award was conferred on an earlier occasion for this or any
other candidate.

Dr. Premalatha G                              Dr. M. Lakshmi
Faculty                                       Head of the Department
Assistant Professor, Dept. of DSBS            Dept. of DSBS
18CSE437J - Virtual Reality and Augmented Reality Lab

Index of Experiments

Exp. No   Title of the Experiment
1         Generation of fractal curves and landscapes using algorithms (Using Python)
2         Illustrate the aliasing and anti-aliasing techniques (Using Python)
3         Generation of Mandelbrot and Julia set fractals (Using Python)
4         Construct the primitives with different color models and simulate the conversion from one model to another (Using Python)
5         Develop a new texture and apply various mappings on 3D objects (Using Unity)
6         Implementation of ray tracing concepts with a collection of 3D models (Using Unity)
7         Conversion of assemblies to VR models (Using Unity)
8         Creation of a digital mockup and addition of behavior (Using Unity)
9         Develop an animation sequence to illustrate the concepts of kinematics and dynamics (Using Unity)
10        Build a 3D scene using VRML and explore it using various navigations (Using Unity)
11        Tracking using AR (Using Unity)
12        Haptic sensing in AR (Using Unity)
13        Ergonomic and aesthetic studies (Unreal & Unity)
14        Identification of a real-life problem in thrust areas (Example: Game Development)
15        Creation of a full-fledged immersive environment for product / system evaluation (AR Project)


Exp.No 1: Generation of fractal curves and landscapes using algorithms.

Date:

Generation of fractal curves

Fractal curves are geometric objects that are self-similar at multiple scales. This means that if
you zoom in on a fractal curve, you will see the same basic pattern repeated at a smaller
scale. Some examples of fractal curves include the Koch snowflake, the Sierpinski triangle,
and the Mandelbrot set.

There are many different algorithms for generating fractal curves. One common approach is
to use a recursive algorithm. This type of algorithm works by repeatedly applying a set of
rules to a starting object. For example, the following recursive algorithm can be used to
generate the Koch snowflake:

Koch snowflake algorithm:

Input: A line segment

Output: A Koch snowflake curve

1. Draw the line segment.
2. Divide the line segment into three equal parts.
3. Replace the middle part with the two upper sides of an equilateral triangle whose base is the removed middle part.
4. Repeat steps 2 and 3 recursively for each of the four resulting segments.

The Koch snowflake curve is complete when the recursion stops (in practice, after a fixed number of levels).
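As a concrete sketch of this recursion (a minimal example; matplotlib is assumed to be available for plotting, and the helper koch_curve is illustrative rather than a library function):

Python
import matplotlib.pyplot as plt

def koch_curve(p1, p2, depth):
    """Recursively replaces the segment p1-p2 with the four-segment Koch motif."""
    if depth == 0:
        return [p1, p2]
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = (x2 - x1) / 3, (y2 - y1) / 3
    a = (x1 + dx, y1 + dy)              # one third of the way along the segment
    b = (x1 + 2 * dx, y1 + 2 * dy)      # two thirds of the way along the segment
    # Apex of the equilateral triangle erected on the middle third.
    peak = ((x1 + x2) / 2 - (y2 - y1) * 3 ** 0.5 / 6,
            (y1 + y2) / 2 + (x2 - x1) * 3 ** 0.5 / 6)
    points = []
    for q1, q2 in [(p1, a), (a, peak), (peak, b), (b, p2)]:
        points += koch_curve(q1, q2, depth - 1)[:-1]  # drop duplicated endpoints
    return points + [p2]

curve = koch_curve((0.0, 0.0), (1.0, 0.0), depth=4)
xs, ys = zip(*curve)
plt.plot(xs, ys)
plt.axis("equal")
plt.show()

Repeating this for the three sides of an equilateral triangle produces the full snowflake.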

Another common approach to generating fractals is to use an escape-time iteration. Rather than
drawing a curve directly, this method iterates a recurrence for each point of the plane and
colors the point according to how quickly the iteration diverges. For example, the following
recurrence generates the Mandelbrot set:

Mandelbrot set recurrence:

z = z^2 + c

where:

* z is the complex number being iterated (starting from z = 0)

* c is the point of the complex plane being tested

A point c is considered outside the Mandelbrot set once the absolute value of z exceeds 2; points whose orbits stay bounded belong to the set.


Generation of fractal landscapes

Fractal landscapes are geometric objects that are self-similar at multiple scales. This means
that if you zoom in on a fractal landscape, you will see the same basic pattern repeated at a
smaller scale. Some examples of fractal landscapes include mountains, clouds, and coastlines.

There are many different algorithms for generating fractal landscapes. One common approach
is to use the random midpoint displacement algorithm. This algorithm works by starting with
a square grid and then iteratively displacing the midpoint of each square by a random amount.
The amount of displacement is typically chosen from a normal distribution with a mean of
0.0 and a standard deviation that is gradually decreased as the algorithm progresses.
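A minimal one-dimensional sketch of random midpoint displacement (the two-dimensional version, often called diamond-square, follows the same pattern; the function name and parameters here are illustrative):

Python
import numpy as np

def midpoint_displacement(n_levels, roughness=0.5, seed=None):
    """Generates a 1D fractal terrain profile with 2**n_levels + 1 points."""
    rng = np.random.default_rng(seed)
    size = 2 ** n_levels + 1
    height = np.zeros(size)
    std = 1.0          # standard deviation of the displacement
    step = size - 1
    while step > 1:
        half = step // 2
        # Displace the midpoint of every segment by a random amount.
        for i in range(half, size - 1, step):
            height[i] = (height[i - half] + height[i + half]) / 2 + rng.normal(0.0, std)
        std *= roughness   # gradually decrease the standard deviation
        step = half
    return height

profile = midpoint_displacement(8, roughness=0.6, seed=42)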

Another common approach to generating fractal landscapes is to use a diffusion algorithm.


Diffusion algorithms work by simulating the diffusion of a substance through a medium. The
substance can be anything, such as heat, smoke, or water. The diffusion process is typically
simulated using a partial differential equation.
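As a minimal sketch of the diffusion idea (illustrative only; production terrain generators typically combine diffusion with noise and erosion models), the following code smooths a random heightmap by iterating the discrete heat equation:

Python
import numpy as np

def diffuse(heightmap, steps=50, rate=0.2):
    """Smooths a heightmap by iterating the discrete heat equation."""
    h = heightmap.astype(float).copy()
    for _ in range(steps):
        # 5-point Laplacian: sum of the four neighbours minus four times the centre.
        laplacian = (np.roll(h, 1, 0) + np.roll(h, -1, 0) +
                     np.roll(h, 1, 1) + np.roll(h, -1, 1) - 4 * h)
        h += rate * laplacian   # rate < 0.25 keeps the iteration stable
    return h

terrain = diffuse(np.random.rand(128, 128))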

Applications of fractal curves and landscapes

Fractal curves and landscapes have many different applications. They are used in computer
graphics to generate realistic images of natural objects, such as mountains, trees, and clouds.
They are also used in mathematics to study complex phenomena, such as turbulence and
chaos.

Here are some specific examples of applications of fractal curves and landscapes:

- Computer graphics: Fractal curves and landscapes are used to generate realistic images of natural objects, such as mountains, trees, and clouds.
- Mathematics: Fractal curves and landscapes are used to study complex phenomena, such as turbulence and chaos.
- Signal processing: Fractal curves and landscapes are used to analyze and compress signals, such as images and audio recordings.
- Medical imaging: Fractal curves and landscapes are used to identify and diagnose diseases, such as cancer and Alzheimer's disease.
- Finance: Fractal curves and landscapes are used to analyze and predict market trends.

Output:


Result:

Fractal curves and landscapes are beautiful and complex mathematical objects. They have
many different applications in computer graphics, mathematics, signal processing, medical
imaging, and finance. Algorithms for generating fractal curves and landscapes are relatively
simple to implement, but they can produce stunning results.


Exp.No 2: Illustrate the aliasing and anti-aliasing techniques

Date:

Aliasing:

Aliasing is a phenomenon that occurs when a signal is sampled at a rate lower than twice its
highest frequency component (the Nyquist rate). This can cause the signal to be reconstructed
incorrectly, resulting in jagged edges and other artifacts.
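A tiny numerical illustration of this effect (a sketch using NumPy only): a 9 Hz sine sampled at 10 Hz, well below its 18 Hz Nyquist rate, produces exactly the same samples as a 1 Hz sine of opposite sign, so the two signals cannot be distinguished after sampling.

Python
import numpy as np

f_signal, f_sample = 9.0, 10.0          # signal frequency and sampling rate
t = np.arange(0, 1, 1 / f_sample)       # one second of sample instants

samples = np.sin(2 * np.pi * f_signal * t)
# The 9 Hz component "folds" down to f_sample - f_signal = 1 Hz.
alias = np.sin(2 * np.pi * (f_sample - f_signal) * t)

print(np.allclose(samples, -alias))     # True: the samples match a 1 Hz sine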

Box filter: The box filter is a simple smoothing technique that reduces aliasing artifacts by
averaging the values of adjacent pixels. It can be implemented in Python using the following code:

Python
import numpy as np
import cv2

def box_filter(image):
    """Applies a 3x3 box (mean) filter to the given image.

    Args:
        image: A numpy array representing the image.

    Returns:
        A numpy array representing the filtered image.
    """
    # Normalized 3x3 averaging kernel.
    kernel = np.ones((3, 3)) / 9
    filtered_image = cv2.filter2D(image, -1, kernel)
    return filtered_image

Gaussian filter:
The Gaussian filter is a more sophisticated smoothing technique that uses a Gaussian kernel
to smooth the image. It can be implemented in Python using the following code:

Python
import numpy as np
import cv2

def gaussian_filter(image):
    """Applies a 3x3 Gaussian filter to the given image.

    Args:
        image: A numpy array representing the image.

    Returns:
        A numpy array representing the filtered image.
    """
    # Build a 2D Gaussian kernel from the outer product of a 1D kernel.
    kernel_1d = cv2.getGaussianKernel(3, 1)
    kernel = kernel_1d @ kernel_1d.T
    filtered_image = cv2.filter2D(image, -1, kernel)
    return filtered_image


Bilateral filter:
The bilateral filter is a more advanced, edge-preserving smoothing technique that takes into
account both the spatial and intensity differences between pixels. It can be implemented in
Python using the following code:

Python
import numpy as np
import cv2

def bilateral_filter(image):
    """Applies a bilateral filter to the given image.

    Args:
        image: A numpy array representing the image.

    Returns:
        A numpy array representing the filtered image.
    """
    filtered_image = cv2.bilateralFilter(image, d=9, sigmaColor=75, sigmaSpace=75)
    return filtered_image

Which smoothing technique to use depends on the specific application. The box filter is the
simplest and fastest technique, but it is not as effective at removing aliasing artifacts as the
Gaussian or bilateral filters. The Gaussian filter is more effective at removing aliasing, but
it is slower than the box filter. The bilateral filter is the most effective at smoothing while
preserving edges, but it is also the slowest.

Python
import numpy as np
import cv2

image = cv2.imread("image.jpg")

# Apply the Gaussian filter to the image.
filtered_image = gaussian_filter(image)

# Save the filtered image.
cv2.imwrite("filtered_image.jpg", filtered_image)

Anti-aliasing:

Anti-aliasing is a technique that is used to reduce aliasing. There are many different anti-
aliasing techniques, but they all work by smoothing out the edges of a signal before it is
sampled.


Anti-aliasing

Anti-aliasing is a technique used in computer graphics to reduce the visual artifacts known as
aliasing, which occur when displaying images at lower resolutions or on devices that cannot
accurately represent fine details. Aliasing causes jagged or "staircase-like" edges on diagonal
lines or curves. Anti-aliasing plays a crucial role in improving the visual quality of computer
graphics, especially in applications like video games, computer-generated imagery (CGI),
and graphic design.

Anti-aliasing is a process used to smooth out these jagged edges and create a more visually
appealing image. It involves various algorithms designed to blend the colors of pixels along
the edges, thereby reducing the sharp contrast between the object and the background.

Types of Anti-aliasing:

a. Supersampling Anti-aliasing (SSAA): Also known as Full-Scene Anti-aliasing (FSAA), this method renders the image at a higher resolution and then downscales it to the target resolution (see the sketch after this list). It can provide high-quality results but is computationally expensive.

b. Multisample Anti-aliasing (MSAA): MSAA only samples certain points on each pixel, reducing the workload compared to SSAA. It helps to smooth edges but may not eliminate all aliasing artifacts.

c. Fast Approximate Anti-aliasing (FXAA): FXAA is a post-processing technique that analyzes the rendered image and applies smoothing algorithms to reduce aliasing. It is fast and efficient but might blur some details.

d. Temporal Anti-aliasing (TAA): TAA is an advanced technique that blends information from previous frames to reduce aliasing and flickering in motion sequences.
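As a minimal illustration of the supersampling idea from (a) (a sketch with NumPy only; the helper names are illustrative), the following code rasterizes a hard diagonal edge at four times the target resolution and box-downsamples it, so edge pixels take intermediate grey values instead of jumping from 0 to 1:

Python
import numpy as np

def render_edge(size):
    """Rasterizes a hard diagonal edge: 1 above the line y = x, else 0."""
    y, x = np.mgrid[0:size, 0:size]
    return (y > x).astype(float)

def supersample(size, factor=4):
    """Renders at factor x the target resolution, then box-downsamples."""
    hi = render_edge(size * factor)
    # Average each factor-by-factor block down to one output pixel.
    return hi.reshape(size, factor, size, factor).mean(axis=(1, 3))

aliased = render_edge(64)          # jagged, staircase-like edge
antialiased = supersample(64, 4)   # edge pixels take fractional coverage values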

Output:


Results:

Anti-aliasing is a valuable technique that can be used to improve the quality and realism of
images and videos. However, it is important to be aware of the computational cost and
performance implications before using it.


Exp.No 3: Generation of Mandelbrot and Julia set fractals.

Date:

The Mandelbrot set and Julia sets are two of the most famous fractals. They are both
generated using complex numbers, and they both exhibit a remarkable degree of self-
similarity.

The Mandelbrot set is the set of complex numbers c for which the iteration of the following
equation, starting from z = 0, does not diverge:

z = z^2 + c

where z is the complex number being iterated and c is the point of the plane being tested.

The Julia sets are a family of fractals generated by the same equation:

z = z^2 + c

but with the roles reversed: here c is a fixed complex constant that selects the particular Julia set, and z starts at the point of the plane being tested.

Procedure

Import the necessary libraries:

Python

import numpy as np

import matplotlib.pyplot as plt


Define the functions to generate the Mandelbrot set and Julia sets:

Python

def mandelbrot(c, max_iterations=100):
    """Computes the escape time of the Mandelbrot iteration for the point c.

    Args:
        c: A complex number (the point being tested).
        max_iterations: The maximum number of iterations to perform.

    Returns:
        The number of iterations before |z| exceeded 2, or 0 if the orbit
        stayed bounded (c is taken to belong to the set).
    """
    z = 0
    for i in range(max_iterations):
        z = z ** 2 + c
        if abs(z) > 2:
            return i
    return 0

def julia(z, c, max_iterations=100):
    """Computes the escape time of the Julia iteration for the point z.

    Args:
        z: A complex number (the point being tested).
        c: The fixed complex constant that selects the Julia set.
        max_iterations: The maximum number of iterations to perform.

    Returns:
        The number of iterations before |z| exceeded 2, or 0 if the orbit
        stayed bounded (z is taken to belong to the set).
    """
    for i in range(max_iterations):
        z = z ** 2 + c
        if abs(z) > 2:
            return i
    return 0


Generate the Mandelbrot set and Julia sets:

Python

# Generate the Mandelbrot set.
mandelbrot_set = np.zeros((500, 500))
for i in range(500):
    for j in range(500):
        c = complex(i / 500 * 4 - 2, j / 500 * 4 - 2)
        mandelbrot_set[i, j] = mandelbrot(c)

# Generate the first Julia set (c = -0.8 + 0.156i); z varies per pixel.
julia_set_1 = np.zeros((500, 500))
for i in range(500):
    for j in range(500):
        z = complex(i / 500 * 4 - 2, j / 500 * 4 - 2)
        julia_set_1[i, j] = julia(z, complex(-0.8, 0.156))

# Generate the second Julia set (c = 0.35 + 0.35i).
julia_set_2 = np.zeros((500, 500))
for i in range(500):
    for j in range(500):
        z = complex(i / 500 * 4 - 2, j / 500 * 4 - 2)
        julia_set_2[i, j] = julia(z, complex(0.35, 0.35))


Plot the Mandelbrot set and Julia sets:

Python

# Plot the Mandelbrot set.
plt.imshow(mandelbrot_set, cmap="hot")
plt.colorbar()
plt.title("Mandelbrot Set")
plt.show()

# Plot the first Julia set.
plt.imshow(julia_set_1, cmap="hot")
plt.colorbar()
plt.title("Julia Set 1")
plt.show()

# Plot the second Julia set.
plt.imshow(julia_set_2, cmap="hot")
plt.colorbar()
plt.title("Julia Set 2")
plt.show()


Output:

Results:

We can explore the Mandelbrot set and Julia sets by changing the parameters of the
mandelbrot() and julia() functions. For example, you can change the max_iterations parameter.


Exp.No 4: Construct the primitives with different color models and simulate the conversion from one model to another.

Date:

There are three main color models used in computer graphics: RGB, HSV, and CMYK. Here
is a brief description of each model:

RGB: The RGB color model is based on the three primary colors of light: red, green, and
blue. Each pixel in an RGB image is represented by three values, one for each of the primary
colors. The values are typically in the range of 0 to 255, with 0 being the darkest and 255
being the brightest.

HSV: The HSV color model is based on the three perceptual attributes of color: hue,
saturation, and value. Hue is the color itself, such as red, green, or blue. Saturation is the
intensity of the color. Value is the brightness of the color.

CMYK: The CMYK color model is based on the four primary colors of ink: cyan, magenta,
yellow, and black. Each pixel in a CMYK image is represented by four values, one for each
of the primary colors. The values are typically in the range of 0 to 100, with 0 being the least
ink and 100 being the most ink.

Here is a table that shows the constructs of the different color models:

Color model   Constructs
RGB           Red, green, blue
HSV           Hue, saturation, value
CMYK          Cyan, magenta, yellow, black

Converting from one color model to another is a relatively straightforward process. The
following algorithm can be used to convert from RGB to HSV:

def rgb_to_hsv(rgb):
    """Converts an RGB color to an HSV color.

    Args:
        rgb: A tuple of three values in [0, 1], representing the red,
            green, and blue components of the RGB color.

    Returns:
        A tuple of three values in [0, 1], representing the hue,
        saturation, and value components of the HSV color.
    """
    r, g, b = rgb
    mx, mn = max(r, g, b), min(r, g, b)
    delta = mx - mn

    # Calculate the hue (as a fraction of a full turn around the color wheel).
    if delta == 0:
        hue = 0.0
    elif mx == r:
        hue = (((g - b) / delta) % 6) / 6
    elif mx == g:
        hue = ((b - r) / delta + 2) / 6
    else:
        hue = ((r - g) / delta + 4) / 6

    # Calculate the saturation.
    saturation = 0.0 if mx == 0 else delta / mx

    # Calculate the value.
    value = mx

    return hue, saturation, value

The following algorithm can be used to convert from HSV to RGB:

def hsv_to_rgb(hsv):
    """Converts an HSV color to an RGB color.

    Args:
        hsv: A tuple of three values in [0, 1], representing the hue,
            saturation, and value components of the HSV color.

    Returns:
        A tuple of three values in [0, 1], representing the red, green,
        and blue components of the RGB color.
    """
    h, s, v = hsv
    i = int(h * 6) % 6          # which 60-degree sector the hue falls in
    f = h * 6 - int(h * 6)      # fractional position within the sector

    # Calculate the red, green, and blue components.
    p = v * (1 - s)
    q = v * (1 - s * f)
    t = v * (1 - s * (1 - f))

    if i == 0:
        return v, t, p
    if i == 1:
        return q, v, p
    if i == 2:
        return p, v, t
    if i == 3:
        return p, q, v
    if i == 4:
        return t, p, v
    return v, p, q
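No conversion code is given above for CMYK; for completeness, here is a minimal sketch using the standard naive RGB/CMYK formulas with all components in [0, 1] (an assumption for illustration; real printing workflows use ICC color profiles instead):

Python
def rgb_to_cmyk(rgb):
    """Converts an RGB color (components in [0, 1]) to naive CMYK."""
    r, g, b = rgb
    k = 1 - max(r, g, b)            # black is the ink common to all channels
    if k == 1:
        return 0.0, 0.0, 0.0, 1.0   # pure black
    c = (1 - r - k) / (1 - k)
    m = (1 - g - k) / (1 - k)
    y = (1 - b - k) / (1 - k)
    return c, m, y, k

def cmyk_to_rgb(cmyk):
    """Converts a naive CMYK color back to RGB."""
    c, m, y, k = cmyk
    return (1 - c) * (1 - k), (1 - m) * (1 - k), (1 - y) * (1 - k)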

Simulation of the conversion from one color model to another

The conversion can be simulated by converting a sample color from one model to another and back again, using the functions defined above.

Output:


Results:

Here is a table that summarizes the constructs of the different color models:

Color model   Constructs
RGB           Red, green, blue
HSV           Hue, saturation, value
CMYK          Cyan, magenta, yellow, black

Exp.No 5: Develop a new texture and apply various mapping on 3D objects.

Date:


To develop a new texture and apply various mapping on a 3D object using Unity, you can
follow these steps:

1. Develop the new texture

You can use a variety of tools to develop a new texture, such as a photo editing software like
Adobe Photoshop or a dedicated 3D texture painting software like Substance Painter.

Once you have developed your new texture, you need to import it into Unity. To do this,
simply drag and drop the texture file into the Project window.

2. Apply the texture to the 3D object

To apply the texture to a 3D object, you need to create a material in Unity. To do this, go to
Assets > Create > Material.

Once you have created a material, you can assign the texture to it by dragging and dropping
the texture file onto the Albedo property in the Inspector window.

3. Apply texture mapping

Unity supports a variety of texture mapping techniques, including diffuse mapping, bump
mapping, specular mapping, and normal mapping.

To apply diffuse mapping, simply assign the texture to the Albedo property of the material.

To apply bump or normal mapping, assign the texture to the Normal Map property of the material; also set the imported texture's Texture Type to "Normal map" in its import settings so Unity interprets it correctly.

To apply specular mapping, assign the texture to the Metallic property of the material (or the Specular property, depending on the material's workflow).

4. Adjust the texture mapping settings

Once you have applied texture mapping, you may need to adjust the settings to get the
desired result.

For example, you may need to adjust the tiling, offset, and rotation of the texture. You may
also need to adjust the intensity of the texture.

5. Save and apply the material

Once you are satisfied with the texture mapping, you can save the material and apply it to the
3D object.

Here is an example of how to develop a new texture and apply diffuse mapping to a 3D
object in Unity:


1. Create a new Unity project.
2. Import a new texture file into the project.
3. Create a new material and assign the texture to the Albedo property.
4. Create a new cube object in the scene.
5. Assign the material to the cube object.
6. Play the scene and view the cube object.

You should now see the texture applied to the cube object.

Output:


Results:

We can use the Shader Graph in Unity to create more complex texture mapping effects. We
can use the Asset Postprocessor in Unity to automatically generate texture maps, such as
normal maps and roughness maps. We can use the Unity Standard Shader to apply multiple
textures to a single material. This can be useful for creating realistic materials, such as wood
and metal.

Exp.No 6: Implementation of ray tracing concepts with the collection of 3D models.

Date:

Ray tracing is a rendering technique that simulates the physical behavior of light to create
realistic images. It works by tracing the path of each ray of light from the camera to the light
source and back, calculating how the light interacts with the objects in the scene along the
way.

To implement ray tracing concepts with the collection of 3D models using Unity, you can
follow these steps:

Import the 3D models into Unity.

- Create a new material and assign the shader for ray tracing.
- Set the properties of the material, such as the albedo, roughness, and metallic.
- Place the 3D models in the scene.
- Add a camera to the scene.
- Set the rendering mode of the camera to ray tracing.
- Render the scene.

Here is an example of how to implement ray tracing concepts with the collection of 3D
models using Unity:

using UnityEngine;

public class RayTracing : MonoBehaviour
{
    public Material rayTracingMaterial;

    private void Start()
    {
        // Import the 3D models.
        GameObject[] models = Resources.LoadAll<GameObject>("Models");

        // Create a new material and assign the shader for ray tracing.
        Material material = new Material(rayTracingMaterial);

        // Set the properties of the material.
        material.SetFloat("_Albedo", 0.5f);
        material.SetFloat("_Roughness", 0.5f);
        material.SetFloat("_Metallic", 0.0f);

        // Place the 3D models in the scene and apply the material.
        foreach (GameObject model in models)
        {
            GameObject instance = Instantiate(model);
            instance.GetComponent<Renderer>().material = material;
        }

        // Add a camera to the scene.
        Camera camera = GameObject.Find("Camera").GetComponent<Camera>();

        // Set the rendering mode of the camera.
        camera.renderingPath = RenderingPath.DeferredShading;
        camera.allowHDR = true;

        // Render the scene.
        camera.Render();
    }
}

Output:


Results:

This script imports the 3D models from the Models folder, creates a new material using the
ray-tracing shader, and sets the properties of the material. It then places the 3D models in the
scene, adds a camera, sets the camera's rendering mode, and renders the scene. You can use the
rayTracingMaterial property to assign a different material for ray tracing, and the _Albedo,
_Roughness, and _Metallic properties to adjust the appearance of the materials.

Exp.No 7: Conversion of assemblies to VR models.

Date:

To convert assemblies to VR models using Unity, you can follow these steps:

- Import the assemblies into Unity. You can do this by dragging and dropping the assembly files into the Project window.
- Create a new material for the VR models. To do this, go to Assets > Create > Material.
- Assign a shader to the material. The shader should be compatible with VR.
- Set the properties of the material. This may include the albedo, roughness, metallic, and normal map.
- Apply the material to the assemblies.
- Add a VR camera to the scene.
- Set the rendering mode of the camera to VR.
- Build and run the project.

Here is an example of how to convert assemblies to VR models using Unity:

using UnityEngine;

public class AssemblyToVRModel : MonoBehaviour
{
    public Material vrMaterial;
    public Texture normalMapTexture;

    private void Start()
    {
        // Import the assembly models (stored as prefabs in Resources/Assemblies).
        GameObject[] assemblies = Resources.LoadAll<GameObject>("Assemblies");

        // Create a new material for the VR models.
        Material material = new Material(vrMaterial);

        // Set the properties of the material.
        material.SetFloat("_Albedo", 0.5f);
        material.SetFloat("_Roughness", 0.5f);
        material.SetFloat("_Metallic", 0.0f);
        material.SetTexture("_NormalMap", normalMapTexture);

        // Place the assemblies in the scene and apply the material.
        foreach (GameObject assembly in assemblies)
        {
            GameObject instance = Instantiate(assembly);
            instance.GetComponent<Renderer>().material = material;
        }

        // Add a VR camera to the scene.
        Camera camera = GameObject.Find("Camera").GetComponent<Camera>();

        // Set the rendering mode of the camera to VR.
        camera.renderingPath = RenderingPath.MultiPass;
        camera.stereoTargetEye = StereoTargetEyeMask.Both;

        // Build and run the project to view the VR models.
    }
}

Output:


Results:

This code will import the assemblies from the Assemblies folder, create a new material for
the VR models, set the properties of the material, and apply the material to the assemblies.
Then, it will add a VR camera to the scene and set the rendering mode of the camera to VR.

Exp.No 8: Creation of a digital mockup and addition of behavior.

Date:

To create a digital mockup and add behavior to it using Unity:

- Import the digital mockup into Unity. We can do this by dragging and dropping the mockup file into the Project window.
- Add a Rigidbody component to the mockup. This will allow the mockup to be affected by physics.
- Add a script to the mockup to control its behavior. For example, you could add a script to make the mockup move around or rotate.
- Play the scene and test the behavior of the mockup.


Here is an example of a simple script to make a mockup move around:

using UnityEngine;

public class MoveMockup : MonoBehaviour
{
    public float speed = 1f;

    private void Update()
    {
        // Get the Rigidbody component of the mockup.
        Rigidbody rigidbody = GetComponent<Rigidbody>();

        // Apply a force to the mockup to move it forward.
        rigidbody.AddForce(transform.forward * speed);
    }
}
To use this script, simply attach it to the mockup object in the scene. Then, play the scene and
the mockup will start moving forward.

You can use more complex scripts to control the behavior of the mockup. For example, you
could write a script to make the mockup follow a specific path or to interact with other
objects in the scene.

Script to make the mockup follow a specific path:

using UnityEngine;

public class FollowPath : MonoBehaviour
{
    public Transform[] pathPoints;
    public float speed = 1f;
    public int currentPathPoint = 0;

    private void Update()
    {
        // Get the Rigidbody component of the mockup.
        Rigidbody rigidbody = GetComponent<Rigidbody>();

        // Move the mockup towards the current path point.
        Vector3 direction = (pathPoints[currentPathPoint].position - transform.position).normalized;
        rigidbody.AddForce(direction * speed);

        // Check if the mockup has reached the current path point.
        if (Vector3.Distance(transform.position, pathPoints[currentPathPoint].position) < 0.1f)
        {
            // Move to the next path point.
            currentPathPoint++;

            // If the mockup has reached the end of the path, loop back to the beginning.
            if (currentPathPoint >= pathPoints.Length)
            {
                currentPathPoint = 0;
            }
        }
    }
}

Output:

Result:

We can use scripts to create all sorts of different behaviors for digital mockups. For example,
we could write a script to make the mockup explode, break into pieces, or interact with other
objects in the scene.

Exp.No 9: Develop an animation sequence to illustrate the concepts of kinematics and dynamics using Unity

Date:

To develop an animation sequence to illustrate the concepts of kinematics and dynamics using Unity, you can follow these steps:


Create a new Unity project.
Import the necessary assets, such as 3D models, textures, and animations.
Create a scene that illustrates the concepts of kinematics and dynamics. For example, you could create a scene with a ball rolling down a hill or a rocket launching into space.
Add a script to the scene to control the animation sequence. The script should use the kinematic and dynamic equations to calculate the motion of the objects in the scene.
Play the scene and test the animation sequence.

Here is an example of a simple script to animate a ball rolling down a hill:

using UnityEngine;

public class BallAnimation : MonoBehaviour
{
    public float gravity = 9.81f;
    private Vector3 velocity = Vector3.zero;

    private void Update()
    {
        // Calculate the acceleration of the ball (kinematics: a = g, downwards).
        Vector3 acceleration = Vector3.down * gravity;

        // Integrate the velocity of the ball (v = v0 + a * dt).
        velocity += acceleration * Time.deltaTime;

        // Integrate the position of the ball (p = p0 + v * dt).
        transform.position += velocity * Time.deltaTime;
    }
}
To use this script, simply attach it to the ball object in the scene. Then, play the scene and the
ball will start rolling down the hill.

You can use more complex scripts to create more sophisticated animation sequences. For
example, you could write a script to animate a rocket launching into space or a car driving
down a road.

Here is an example of a script to animate a rocket launching into space:

using UnityEngine;

public class RocketAnimation : MonoBehaviour
{
    public float thrust = 100f;
    public float gravity = 9.81f;

    private Rigidbody rb;

    private void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    private void FixedUpdate()
    {
        // Calculate the net force on the rocket (dynamics: thrust minus gravity).
        Vector3 force = thrust * transform.up + gravity * Vector3.down;

        // Apply the force and let the physics engine integrate velocity and position.
        rb.AddForce(force);
    }
}
Output:

Result:

To use this script, simply attach it to the rocket object in the scene; then play the scene and the
rocket will launch into space. We can use scripts to create all sorts of different animation
sequences to illustrate the concepts of kinematics and dynamics. For example, you could create
an animation sequence to illustrate the motion of a projectile, the motion of a pendulum, or
the motion of a car on a banked curve.


Exp.No 10: Build a 3D scene using VRML and explore it using various navigations.

Date:

To build a 3D scene using VRML and explore it using various navigations using Unity, you
can follow these steps:

1. Import the VRML file into Unity

You can do this by dragging and dropping the VRML file into the Project window. Unity will
automatically import the VRML file and create a new scene containing the 3D objects.

2. Add a VR camera to the scene

To do this, go to GameObject > Create Other > VRTK > Prefabs > CameraRig. This will create
a new camera rig object in the scene.

3. Add navigation scripts to the camera rig object

Unity provides a variety of navigation scripts that you can use to explore the 3D scene. For
example, you can add the VRTK_FreeFly script to allow the user to move freely through the
scene. You can also add the VRTK_BasicTeleport script to allow the user to teleport to
different locations in the scene.

4. Play the scene and explore it using the navigation scripts

Once you have added the navigation scripts to the camera rig object, you can play the scene
and start exploring the 3D scene.

Here is an example of how to add the VRTK_FreeFly script to the camera rig object:

using UnityEngine;
using VRTK;

public class FreeFlyNavigation : MonoBehaviour
{
    private VRTK_FreeFly freeFly;

    private void Start()
    {
        freeFly = GetComponent<VRTK_FreeFly>();
    }

    private void Update()
    {
        if (freeFly != null)
        {
            // Move the camera rig object based on the user's input.
            freeFly.UpdateFreeFly();
        }
    }
}
To use this script, simply attach it to the camera rig object in the scene.

Here is an example of how to add the VRTK_BasicTeleport script to the camera rig object:

using UnityEngine;
using VRTK;

public class TeleportNavigation : MonoBehaviour
{
    private VRTK_BasicTeleport teleport;

    private void Start()
    {
        teleport = GetComponent<VRTK_BasicTeleport>();
    }

    private void Update()
    {
        if (teleport != null)
        {
            // Teleport the camera rig object to the user's destination.
            teleport.Teleport();
        }
    }
}

Output:

Result:
To use this script, simply attach it to the camera rig object in the scene. We can use other
navigation scripts from the VRTK to explore the 3D scene in different ways. For example, you
can use the VRTK_TouchpadMove script to allow the user to move around the scene using a
touchpad. You can also use the VRTK_Pointer script to allow the user to interact with objects
in the scene using a pointer.


Exp.No 11: Tracking using AR.

Date:

To track using AR using Unity, you can use the AR Foundation package. AR Foundation is a
cross-platform framework that provides a common set of APIs for building AR applications.

Here are the steps to track using AR using Unity:

- Install the AR Foundation package. You can do this by going to Window > Package Manager and searching for "AR Foundation".
- Create a new Unity project.
- Import the AR Foundation package into the project.
- Add an AR Session Origin component to the scene. This component will provide the AR system with a reference point for tracking.
- Add an AR Trackable Manager component to the scene. This component will manage the trackables that the AR system will track.
- Create a new prefab for the trackable that you want to track. The prefab should have a mesh renderer and a collider.
- Add the AR Trackable component to the prefab.
- Instantiate the trackable prefab in the scene.
- Play the scene. The AR system will now track the trackable in the scene.
- You can use the AR Trackable component to get the position and rotation of the trackable in the scene. You can also use it to determine if the trackable is visible to the user.

Here is an example of how to use the AR Trackable component to get the position and rotation
of the trackable in the scene:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;   // TrackingState lives in this namespace

public class TrackableTracker : MonoBehaviour
{
    private ARTrackable trackable;

    private void Start()
    {
        trackable = GetComponent<ARTrackable>();
    }

    private void Update()
    {
        if (trackable.trackingState == TrackingState.Tracked)
        {
            // Get the position and rotation of the trackable.
            Vector3 position = trackable.transform.position;
            Quaternion rotation = trackable.transform.rotation;

            // Do something with the position and rotation of the trackable.
        }
    }
}
You can also use the AR Trackable component to determine if the trackable is visible to the
user.

Here is an example of how to use the AR Trackable component to determine if the trackable is
visible to the user:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;   // TrackingState lives in this namespace

public class TrackableVisibility : MonoBehaviour
{
    private ARTrackable trackable;

    private void Start()
    {
        trackable = GetComponent<ARTrackable>();
    }

    private void Update()
    {
        if (trackable.trackingState == TrackingState.Tracked && trackable.isInView)
        {
            // The trackable is visible to the user.
        }
    }
}

Output:

Result:

We can use the AR Trackable component to track all sorts of objects, such as images, faces,
and planes. We can use the tracked objects to build AR experiences such as AR games, AR
navigation, and AR product visualization.

Exp.No 12: Haptic Sensing in AR


Date:

One way to implement haptic sensing in AR using Unity is to use the Haptics plugin from the Unity
Asset Store. This plugin provides a set of APIs for controlling haptic feedback on a variety of
devices, including mobile devices and VR headsets.

To use the Haptics plugin to implement haptic sensing in AR, you will need to:

- Install the Haptics plugin from the Unity Asset Store.
- Import the Haptics plugin into your Unity project.
- Create a new script to control the haptic feedback.
- Use the Haptics plugin APIs to generate haptic feedback at the desired time.
- Attach the script to the object in the AR scene that you want to generate haptic feedback for.

A simple script to control haptic feedback using the Haptics plugin:

C#
using UnityEngine;
using Haptics;

public class HapticFeedback : MonoBehaviour
{
    public HapticsDevice hapticDevice;

    private void Start()
    {
        hapticDevice = HapticsDevice.GetDevice();
    }

    private void Update()
    {
        // Generate haptic feedback (as written, this fires every frame).
        hapticDevice.Vibrate(100, 100);
    }
}

To use this script, simply attach it to the object in the AR scene that you want to generate haptic
feedback for. As written, the script triggers vibration every frame; in practice, you would call
Vibrate from a touch or collision handler so that feedback occurs when the object is touched.


Another way to implement haptic sensing in AR using Unity is to use the AR Foundation plugin
from the Unity Asset Store. This plugin provides a framework for building AR applications on
mobile devices.

To use the AR Foundation plugin to implement haptic sensing in AR, you will need to:

- Install the AR Foundation plugin from the Unity Asset Store.
- Import the AR Foundation plugin into your Unity project.
- Create a new script to control the haptic feedback.
- Use the AR Foundation plugin APIs to generate haptic feedback at the desired time.
- Attach the script to the AR Session Origin object in the AR scene.

A script to control haptic feedback using the AR Foundation plugin:

C#
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARHapticFeedback : MonoBehaviour
{
    private ARSessionOrigin arSessionOrigin;

    private void Start()
    {
        arSessionOrigin = GetComponent<ARSessionOrigin>();
    }

    private void Update()
    {
        // Generate haptic feedback.
        arSessionOrigin.RequestHapticFeedback(0.5f, 0.5f);
    }
}

Output:

Result:

To use this script, simply attach it to the AR Session Origin object in the AR scene. When the
user touches the surface that the AR Session Origin is tracking, the script will generate haptic
feedback. We can use haptic feedback to enhance the AR experience in a variety of ways. For
example, you can use haptic feedback to provide feedback to the user when they are interacting
with virtual objects or when they are completing tasks in the AR environment.


Exp.No 13: Ergonomic and aesthetic studies

Date:

Ergonomic studies

Unreal Engine (UE) can be used to create realistic simulations of human activities, which can
be used to study the ergonomics of different tasks and environments. For example, UE can be
used to create a simulation of a worker assembling a product on a factory line. This simulation
can be used to identify potential ergonomic hazards and to design safer and more efficient work
processes.

UE can also be used to create simulations of medical procedures. This can be used to train
surgeons on new procedures and to develop new surgical techniques. For example, UE can be
used to create a simulation of a minimally invasive surgery. This simulation can be used to
train surgeons on how to perform the surgery without making large incisions.

Aesthetic studies

UE can also be used to create simulations of different environments and objects. This can be
used to study the aesthetics of different designs and to develop new design concepts. For
example, UE can be used to create a simulation of a new product design. This simulation can
be used to test the product's appearance and usability.

UE can also be used to create simulations of different architectural designs. This can be used
to study the aesthetic impact of different designs and to develop new architectural concepts.
For example, UE can be used to create a simulation of a new skyscraper design. This simulation
can be used to test the skyscraper's appearance and to identify any potential aesthetic problems.

Here are some specific examples of how UE has been used to conduct ergonomic and aesthetic
studies:

Ergonomics: An automobile company is using UE to create simulations of its factories in order to improve the ergonomics of its assembly lines.

Medical: The University of California, San Francisco is using UE to create simulations of medical procedures in order to train surgeons and develop new surgical techniques.

Product design: Nike is using UE to create simulations of new product designs in order to test their appearance and usability.

Architecture: The architectural firm Zaha Hadid Architects is using UE to create simulations of new building designs in order to study their aesthetic impact and to identify any potential aesthetic problems.

UE is a powerful tool that can be used to conduct a wide variety of ergonomic and aesthetic
studies. By creating realistic simulations of different tasks, environments, and objects, UE can
help us to understand and improve the design of our world.


Unity being used for ergonomic and aesthetic studies

Here are some examples of how Unity is being used for ergonomic and aesthetic studies:

Ergonomics: The University of Michigan is using Unity to create simulations of assembly line
work to study the effects of different workstation designs on worker fatigue and productivity.

Medical: The Mayo Clinic is using Unity to create simulations of surgical procedures to train
surgeons and develop new surgical techniques.

Product design: The company IKEA is using Unity to create simulations of new product
designs to test their usability and aesthetic appeal.

Architecture: The architectural firm Foster + Partners is using Unity to create simulations of
new building designs to test their environmental impact and aesthetic effect.

Result:

Overall, Unity is a powerful and versatile tool that can be used to conduct a wide variety of
ergonomic and aesthetic studies. It is a good choice for users who are looking for an affordable
and easy-to-use game engine that can create realistic simulations.


Exp.No 14: Study on identification of a real-life problem in thrust areas: Game Development

Date:

Here are some real-life problems in thrust areas in game development:

Problem: Lack of diversity in game development teams and games.

Examples:
- The majority of game developers are white and male.
- Games often perpetuate stereotypes and biases about different groups of people.
- There is a lack of games that accurately and authentically represent the diversity of the real world.

Impact:
- The lack of diversity in game development teams and games can lead to games that are not inclusive or representative of all players.
- This can have a negative impact on players' experiences and can discourage people from pursuing a career in game development.

Potential solutions:
- Game development companies need to make a concerted effort to hire more diverse teams.
- Developers need to be mindful of the diversity of their players and create games that are inclusive and representative of everyone.
- The game development community needs to support and promote diverse voices and perspectives.
Problem: The high cost of game development.

Examples:
- Game development is becoming increasingly complex and expensive.
- This makes it difficult for independent developers to compete with AAA studios.
- It also makes it difficult for developers to experiment with new ideas and take risks.

Impact:
- The high cost of game development can lead to a lack of innovation and diversity in the games market.
- It can also make it difficult for new developers to enter the industry.

Potential solutions:
- Game development companies need to find ways to reduce the cost of game development.
- Governments and other organizations can provide financial support to independent developers.
- There are also a number of tools and resources available to help developers reduce the cost of game development.
Problem: The environmental impact of video games.

Examples:
- The production of video game consoles and games requires the use of energy and resources.
- Video games can also generate electronic waste.
- The data centers that power online games consume a lot of energy.

Impact:
- The video game industry can have a negative impact on the environment.
- This is especially concerning given the growing popularity of video games.

Potential solutions:
- Game developers can use more sustainable practices in the development and production of their games.
- Players can also take steps to reduce the environmental impact of their gaming habits, such as by buying used games and recycling their old consoles and games.

Result:

These are just a few examples of real-life problems in thrust areas in game development. As
the game development industry continues to grow and evolve, new challenges will emerge.
However, with the support of the game development community, we can work to address these
challenges and create a more inclusive, sustainable, and innovative industry.
