A LAB REPORT
Submitted by
Anushree Bajaj [RA2011051010022]
BACHELOR OF TECHNOLOGY
COMPUTER SCIENCE ENGINEERING
with specialization in Gaming Technology
Certified that this lab report titled “AR and VR Lab Experiments” is the bonafide work of Anushree
Bajaj [RA2011051010022], who carried out the experimental work under my supervision. Certified
further that, to the best of my knowledge, the work reported herein does not form part of any other thesis or
dissertation on the basis of which a degree or award was conferred on an earlier occasion for this or any
other candidate.
Exp. No | Date | Title of the Experiment | Page No | Sign
1 | | Generation of fractal curves and landscapes using algorithms (Using Python) | 1 |
18CSE437J - Virtual Reality and Augmented Reality Lab | Reg. No: RA2111051010022
Exp.No: 1 Generation of fractal curves and landscapes using algorithms (Using Python)
Date:
Fractal curves are geometric objects that are self-similar at multiple scales. This means that if
you zoom in on a fractal curve, you will see the same basic pattern repeated at a smaller
scale. Some examples of fractal curves include the Koch snowflake, the Sierpinski triangle,
and the Mandelbrot set.
There are many different algorithms for generating fractal curves. One common approach is
to use a recursive algorithm. This type of algorithm works by repeatedly applying a set of
rules to a starting object. For example, the Koch snowflake is generated by starting with an
equilateral triangle and, at each step, replacing the middle third of every line segment with
the two outer sides of a smaller equilateral triangle that points outward.
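As a sketch, this replacement rule can be implemented recursively in Python (the function names here are illustrative, not part of any library):

```python
import math

def koch(p1, p2, depth):
    """Returns the points of one Koch side from p1 to p2 (p2 excluded)."""
    if depth == 0:
        return [p1]
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = (x2 - x1) / 3, (y2 - y1) / 3
    a = (x1 + dx, y1 + dy)              # end of the first third
    b = (x1 + 2 * dx, y1 + 2 * dy)      # start of the last third
    # Apex of the equilateral triangle erected on the middle third.
    apex = ((x1 + x2) / 2 - math.sqrt(3) / 6 * (y2 - y1),
            (y1 + y2) / 2 + math.sqrt(3) / 6 * (x2 - x1))
    return (koch(p1, a, depth - 1) + koch(a, apex, depth - 1) +
            koch(apex, b, depth - 1) + koch(b, p2, depth - 1))

# Three iterations on a unit segment: 4**3 = 64 segments, 65 points.
points = koch((0.0, 0.0), (1.0, 0.0), 3) + [(1.0, 0.0)]
print(len(points))  # 65
```

Applying the same rule to all three sides of a triangle yields the full snowflake; each iteration multiplies the number of segments by four.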
Fractal landscapes are geometric objects that are self-similar at multiple scales. This means
that if you zoom in on a fractal landscape, you will see the same basic pattern repeated at a
smaller scale. Some examples of fractal landscapes include mountains, clouds, and coastlines.
There are many different algorithms for generating fractal landscapes. One common approach
is to use the random midpoint displacement algorithm. This algorithm works by starting with
a square grid and then iteratively displacing the midpoint of each square by a random amount.
The amount of displacement is typically chosen from a normal distribution with a mean of
0.0 and a standard deviation that is gradually decreased as the algorithm progresses.
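As a minimal sketch, the one-dimensional version of this algorithm (a ridgeline rather than a full grid; names illustrative) looks like:

```python
import random

def midpoint_displacement(n_iters, roughness=0.5, seed=0):
    """1-D fractal terrain: repeatedly insert randomly displaced midpoints."""
    random.seed(seed)
    heights = [0.0, 0.0]        # flat start: two endpoints at height 0
    scale = 1.0                 # std. deviation of the displacement
    for _ in range(n_iters):
        new = []
        for a, b in zip(heights, heights[1:]):
            new.append(a)
            # Displace the midpoint by a normally distributed amount.
            new.append((a + b) / 2 + random.gauss(0.0, scale))
        new.append(heights[-1])
        heights = new
        scale *= roughness      # gradually decrease the deviation
    return heights

terrain = midpoint_displacement(4)
print(len(terrain))  # 2**4 + 1 = 17 points
```

The two-dimensional diamond-square variant applies the same displace-and-shrink idea to a square grid.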
Fractal curves and landscapes have many applications across fields. Some specific examples:
Computer graphics: generating realistic images of natural objects, such as mountains, trees, and clouds.
Mathematics: studying complex phenomena, such as turbulence and chaos.
Signal processing: analyzing and compressing signals, such as images and audio recordings.
Medical imaging: identifying and diagnosing diseases, such as cancer and Alzheimer's disease.
Finance: analyzing and predicting market trends.
Output:
Result:
Fractal curves and landscapes are beautiful and complex mathematical objects. They have
many different applications in computer graphics, mathematics, signal processing, medical
imaging, and finance. Algorithms for generating fractal curves and landscapes are relatively
simple to implement, but they can produce stunning results.
Date:
Aliasing:
Aliasing is a phenomenon that occurs when a signal is sampled at a lower frequency than its
highest component frequency. This can cause the signal to be reconstructed incorrectly,
resulting in jagged edges and other artifacts.
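This can be demonstrated numerically: sampled at 10 Hz, a 9 Hz sine is indistinguishable from a 1 Hz sine of opposite sign (a small NumPy sketch):

```python
import numpy as np

fs = 10.0                        # sampling rate: 10 samples per second
t = np.arange(0, 1, 1 / fs)      # one second of sample instants

high = np.sin(2 * np.pi * 9 * t)     # 9 Hz tone, above the 5 Hz Nyquist limit
alias = -np.sin(2 * np.pi * 1 * t)   # the 1 Hz alias it collapses onto

# At these sample points the two signals agree exactly.
print(np.allclose(high, alias))  # True
```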
Box filter: The box filter is a simple smoothing (anti-aliasing) technique that works by averaging the values of
adjacent pixels. This can be implemented in Python using the following code:
Python
import numpy as np
import cv2

def box_filter(image):
    """Applies a 3x3 box (mean) filter to the given image.

    Args:
        image: A numpy array representing the image.

    Returns:
        A numpy array representing the filtered image.
    """
    kernel = np.ones((3, 3), np.float32) / 9.0
    return cv2.filter2D(image, -1, kernel)
Gaussian filter:
The Gaussian filter is a more sophisticated anti-aliasing technique that uses a Gaussian kernel
to smooth the image. This can be implemented in Python using the following code:
Python
import numpy as np
import cv2

def gaussian_filter(image):
    """Applies a 3x3 Gaussian filter to the given image.

    Args:
        image: A numpy array representing the image.

    Returns:
        A numpy array representing the filtered image.
    """
    # getGaussianKernel returns a 1-D (3x1) kernel; take its outer
    # product to build the full 2-D kernel before filtering.
    kernel = cv2.getGaussianKernel(3, 1)
    filtered_image = cv2.filter2D(image, -1, kernel @ kernel.T)
    return filtered_image
Bilateral filter:
The bilateral filter is a more advanced smoothing technique that takes into account both the
spatial and intensity differences between pixels, so it reduces noise while preserving edges.
This can be implemented in Python using the following code:
Python
import numpy as np
import cv2

def bilateral_filter(image):
    """Applies a bilateral filter to the given image.

    Args:
        image: A numpy array representing the image.

    Returns:
        A numpy array representing the filtered image.
    """
    # d=9: neighborhood diameter; 75/75: filter sigmas in color and space.
    return cv2.bilateralFilter(image, 9, 75, 75)
Which filter to use depends on the specific application. The box filter is the
simplest and fastest, but the least effective at suppressing aliasing artifacts. The
Gaussian filter suppresses aliasing more effectively but is slower than the box filter.
The bilateral filter smooths while preserving edges, but it is also the slowest.
Python
import numpy as np
import cv2

# Load an image and apply one of the filters defined above.
image = cv2.imread("image.jpg")
filtered = gaussian_filter(image)
cv2.imwrite("filtered.jpg", filtered)
Anti-aliasing:
Anti-aliasing is a technique that is used to reduce aliasing. There are many different anti-
aliasing techniques, but they all work by smoothing out the edges of a signal before it is
sampled.
Anti-aliasing
Anti-aliasing is a technique used in computer graphics to reduce the visual artifacts known as
aliasing, which occur when displaying images at lower resolutions or on devices that cannot
accurately represent fine details. Aliasing causes jagged or "staircase-like" edges on diagonal
lines or curves. Anti-aliasing plays a crucial role in improving the visual quality of computer
graphics, especially in applications like video games, computer-generated imagery (CGI),
and graphic design.
Anti-aliasing is a process used to smooth out these jagged edges and create a more visually
appealing image. It involves various algorithms designed to blend the colors of pixels along
the edges, thereby reducing the sharp contrast between the object and the background.
Types of Anti-aliasing:
a. Supersample Anti-aliasing (SSAA): SSAA renders the scene at a higher resolution, sampling
each pixel several times, and then downscales the result. It removes most aliasing but is
computationally expensive.
b. Multisample Anti-aliasing (MSAA): MSAA only samples certain points on each pixel,
reducing the workload compared to SSAA. It helps to smooth edges but may not eliminate all
aliasing artifacts.
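The supersampling idea can be illustrated in plain NumPy, independent of any engine: render a hard edge at double resolution, then average 2x2 blocks down to display resolution; edge pixels come out gray instead of jagged (a toy sketch, not Unity's implementation):

```python
import numpy as np

def downsample_2x(hi_res):
    """Average each 2x2 block of samples into one output pixel (2x SSAA)."""
    h, w = hi_res.shape
    return hi_res.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A 4x4 "rendering" of a diagonal edge: 1.0 = covered, 0.0 = background.
hi = np.tril(np.ones((4, 4)))
lo = downsample_2x(hi)
print(lo)  # edge pixels average to 0.75 instead of snapping to 0 or 1
```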
Output:
Results:
Anti-aliasing is a valuable technique that can be used to improve the quality and realism of
images and videos. However, it is important to be aware of the computational cost and
performance implications before using it.
Date:
The Mandelbrot set and Julia sets are two of the most famous fractals. They are both
generated using complex numbers, and they both exhibit a remarkable degree of self-
similarity.
The Mandelbrot set is the set of complex numbers c for which the following iteration,
started from z = 0, does not diverge:
z = z^2 + c
where z is the complex number being iterated and c is the point being tested.
The Julia sets are a family of fractals generated by the same iteration,
z = z^2 + c
but with c held as a fixed constant while the starting value of z varies across the complex plane.
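The iteration can be tested for a single point before rendering whole images (a small illustrative helper, not part of any library):

```python
def in_mandelbrot(c, max_iterations=100):
    """True if z = z**2 + c (starting from z = 0) stays bounded by |z| <= 2."""
    z = 0
    for _ in range(max_iterations):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True

print(in_mandelbrot(0))    # True: 0 never escapes
print(in_mandelbrot(1))    # False: 0, 1, 2, 5, ... escapes quickly
```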
Procedure
Python
import numpy as np
import matplotlib.pyplot as plt

Define the functions to generate the Mandelbrot set and Julia sets:
Python
def mandelbrot(c, max_iterations=100):
    """Returns the iteration at which z escapes |z| > 2, or 0 if it never does.

    Args:
        c: A complex number.
    """
    z = 0
    for i in range(max_iterations):
        z = z**2 + c
        if abs(z) > 2:
            return i
    return 0

def julia(z, c, max_iterations=100):
    """Same iteration, but c is fixed and the starting point z varies.

    Args:
        z: The starting complex number.
        c: The fixed complex constant of the Julia set.
    """
    for i in range(max_iterations):
        z = z**2 + c
        if abs(z) > 2:
            return i
    return 0

Compute the sets over a 500 x 500 grid of the complex plane:
Python
xs = np.linspace(-2, 2, 500)
ys = np.linspace(-2, 2, 500)

mandelbrot_set = np.zeros((500, 500))
julia_set_1 = np.zeros((500, 500))
julia_set_2 = np.zeros((500, 500))

for i in range(500):
    for j in range(500):
        p = complex(xs[j], ys[i])
        mandelbrot_set[i, j] = mandelbrot(p)
        julia_set_1[i, j] = julia(p, complex(-0.8, 0.156))
        julia_set_2[i, j] = julia(p, complex(0.35, 0.35))

Plot the results:
Python
plt.imshow(mandelbrot_set, cmap="hot")
plt.colorbar()
plt.title("Mandelbrot Set")
plt.show()

plt.imshow(julia_set_1, cmap="hot")
plt.colorbar()
plt.title("Julia Set (c = -0.8 + 0.156i)")
plt.show()

plt.imshow(julia_set_2, cmap="hot")
plt.colorbar()
plt.title("Julia Set (c = 0.35 + 0.35i)")
plt.show()
Output:
Results:
We can explore the Mandelbrot set and Julia sets by changing the parameters of the
mandelbrot() and julia() functions, for example by increasing max_iterations for finer
detail or by choosing different constants c for the Julia sets.
Exp.No: 4 Construct the primitives with different color models and simulate the
conversion from one model to another.
Date:
There are three main color models used in computer graphics: RGB, HSV, and CMYK. Here
is a brief description of each model:
RGB: The RGB color model is based on the three primary colors of light: red, green, and
blue. Each pixel in an RGB image is represented by three values, one for each of the primary
colors. The values are typically in the range of 0 to 255, with 0 being the darkest and 255
being the brightest.
HSV: The HSV color model is based on the three perceptual attributes of color: hue,
saturation, and value. Hue is the color itself, such as red, green, or blue. Saturation is the
intensity of the color. Value is the brightness of the color.
CMYK: The CMYK color model is based on the four primary colors of ink: cyan, magenta,
yellow, and black. Each pixel in a CMYK image is represented by four values, one for each
of the primary colors. The values are typically in the range of 0 to 100, with 0 being the least
ink and 100 being the most ink.
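As a sketch of how RGB relates to CMYK, here is a naive conversion (components scaled to the range 0 to 1; it ignores real ink profiles):

```python
def rgb_to_cmyk(r, g, b):
    """Naive RGB -> CMYK conversion; all components in the range 0 to 1."""
    k = 1.0 - max(r, g, b)          # black = darkness of the brightest channel
    if k == 1.0:                    # pure black: avoid division by zero
        return 0.0, 0.0, 0.0, 1.0
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return c, m, y, k

print(rgb_to_cmyk(1.0, 0.0, 0.0))  # pure red -> (0.0, 1.0, 1.0, 0.0)
```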
Here is a table that shows the components of the different color models:

Model | Components | Typical range
RGB | Red, Green, Blue | 0-255 per channel
HSV | Hue, Saturation, Value | H: 0-360 degrees; S, V: 0-100%
CMYK | Cyan, Magenta, Yellow, Black (Key) | 0-100% per channel
Converting from one color model to another is a relatively straightforward process. Python's
standard colorsys module implements the RGB/HSV conversions (all components in the range
0 to 1):
Python
import colorsys

def rgb_to_hsv(rgb):
    """Converts an (r, g, b) tuple to (h, s, v).

    Args:
        rgb: A tuple of three values, representing the red, green, and blue
            components, each in the range 0 to 1.

    Returns:
        A tuple of (hue, saturation, value), each in the range 0 to 1.
    """
    r, g, b = rgb
    return colorsys.rgb_to_hsv(r, g, b)

def hsv_to_rgb(hsv):
    """Converts an (h, s, v) tuple back to (r, g, b).

    Args:
        hsv: A tuple of three values, representing the hue, saturation, and
            value components, each in the range 0 to 1.

    Returns:
        A tuple of (red, green, blue), each in the range 0 to 1.
    """
    h, s, v = hsv
    return colorsys.hsv_to_rgb(h, s, v)

The following simulates the conversion from one color model to another:
Python
print(rgb_to_hsv((1.0, 0.0, 0.0)))   # pure red -> (0.0, 1.0, 1.0)
print(hsv_to_rgb((0.0, 1.0, 1.0)))   # and back -> (1.0, 0.0, 0.0)
Output:
Results:
The RGB, HSV, and CMYK color models describe color with different sets of components, and
the conversions between them are straightforward to simulate in code.
Date:
To develop a new texture and apply various mapping on a 3D object using Unity, you can
follow these steps:
You can use a variety of tools to develop a new texture, such as a photo editing software like
Adobe Photoshop or a dedicated 3D texture painting software like Substance Painter.
Once you have developed your new texture, you need to import it into Unity. To do this,
simply drag and drop the texture file into the Project window.
To apply the texture to a 3D object, you need to create a material in Unity. To do this, go to
Assets > Create > Material.
Once you have created a material, you can assign the texture to it by dragging and dropping
the texture file onto the Albedo property in the Inspector window.
Unity supports a variety of texture mapping techniques, including diffuse (albedo) mapping,
normal (bump) mapping, and specular or metallic mapping.
To apply diffuse mapping, simply assign the texture to the Albedo property of the material.
To apply bump or normal mapping, assign the texture to the Normal Map property of the
material; the texture itself must be imported with its Texture Type set to Normal Map.
To apply specular or metallic mapping, assign the map to the Metallic property (or the
Specular property, depending on the shader setup).
Once you have applied texture mapping, you may need to adjust the settings to get the
desired result.
For example, you may need to adjust the tiling, offset, and rotation of the texture. You may
also need to adjust the intensity of the texture.
Once you are satisfied with the texture mapping, you can save the material and apply it to the
3D object.
Here is an example of how to develop a new texture and apply diffuse mapping to a 3D
object in Unity:
Output:
Results:
We can use the Shader Graph in Unity to create more complex texture mapping effects. We
can use the Asset Postprocessor in Unity to automatically generate texture maps, such as
normal maps and roughness maps. We can use the Unity Standard Shader to apply multiple
textures to a single material. This can be useful for creating realistic materials, such as wood
and metal.
Date:
Ray tracing is a rendering technique that simulates the physical behavior of light to create
realistic images. It works by tracing the path of each ray of light from the camera to the light
source and back, calculating how the light interacts with the objects in the scene along the
way.
To implement ray tracing concepts with the collection of 3D models using Unity, you can
follow these steps:
Create a new material and assign the shader for ray tracing.
Set the properties of the material, such as the albedo, roughness, and metallic.
Place the 3D models in the scene.
Add a camera to the scene.
Set the rendering mode of the camera to ray tracing.
Render the scene.
Here is an example of how to implement ray tracing concepts with the collection of 3D
models using Unity:
using UnityEngine;
// Create a new material and assign the shader for ray tracing
// (the material and model variables are set up elsewhere in the script).
material.SetFloat("_Albedo", 0.5f);
material.SetFloat("_Roughness", 0.5f);
material.SetFloat("_Metallic", 0.0f);

// Place the imported 3D model in the scene.
Instantiate(model);

// Configure the camera for deferred shading with HDR.
camera.renderingPath = RenderingPath.DeferredShading;
camera.allowHDR = true;
camera.Render();
Output:
Results:
The script imports 3D models from the Models folder, creates a new material with the ray
tracing shader, and sets the properties of the material. It then places the 3D models in the
scene, adds a camera, sets the camera's rendering mode, and renders the scene. The
rayTracingMaterial property can be used to assign a different material for ray tracing, and
the _Albedo, _Roughness, and _Metallic properties adjust the appearance of the materials.
Date:
To convert assemblies to VR models using Unity, you can follow these steps:
Import the assemblies into Unity. You can do this by dragging and dropping the
assembly files into the Project window.
Create a new material for the VR models. To do this, go to Assets > Create >
Material.
Assign a shader to the material. The shader should be compatible with VR.
Set the properties of the material. This may include the albedo, roughness, metallic,
and normal map.
Apply the material to the assemblies.
Add a VR camera to the scene.
Set the rendering mode of the camera to VR.
Build and run the project.
Here is an example of how to convert assemblies to VR models using Unity:
using UnityEngine;
Output:
Results:
This code will import the assemblies from the Assemblies folder, create a new material for
the VR models, set the properties of the material, and apply the material to the assemblies.
Then, it will add a VR camera to the scene and set the rendering mode of the camera to VR.
Date:
Import the digital mockup into Unity. We can do this by dragging and dropping the
mockup file into the Project window.
Add a Rigidbody component to the mockup. This will allow the mockup to be
affected by physics.
Add a script to the mockup to control its behavior. For example, you could add a
script to make the mockup move around or rotate.
Play the scene and test the behavior of the mockup.
using UnityEngine;
You can use more complex scripts to control the behavior of the mockup. For example, you
could write a script to make the mockup follow a specific path or to interact with other
objects in the scene.
using UnityEngine;
            // If the mockup has reached the end of the path, loop back to the beginning.
            if (currentPathPoint >= pathPoints.Length)
            {
                currentPathPoint = 0;
            }
        }
    }
}
Output:
Result:
We can use scripts to create all sorts of different behaviors for digital mockups. For example,
We could write a script to make the mockup explode, break into pieces, or interact with other
objects in the scene.
Date:
using UnityEngine;
You can use more complex scripts to create more sophisticated animation sequences. For
example, you could write a script to animate a rocket launching into space or a car driving
down a road.
using UnityEngine;
rigidbody.AddForce(acceleration);
Result:
Using this script, simply attach it to the rocket object in the scene. Then, play the scene and the
rocket will start launching into space. We can use scripts to create all sorts of different
animation sequences to illustrate the concepts of kinematics and dynamics. For example, you
could create an animation sequence to illustrate the motion of a projectile, the motion of a
pendulum, or the motion of a car on a banked curve.
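The projectile case can also be checked outside the engine with closed-form kinematics (a small Python sketch; the function name is illustrative):

```python
import math

g = 9.81  # m/s^2, gravitational acceleration

def projectile_position(v0, angle_deg, t):
    """Closed-form kinematics: (x, y) at time t for launch speed v0 (m/s)."""
    a = math.radians(angle_deg)
    x = v0 * math.cos(a) * t
    y = v0 * math.sin(a) * t - 0.5 * g * t * t
    return x, y

# Time of flight for a 45-degree launch at 10 m/s over flat ground.
t_flight = 2 * 10 * math.sin(math.radians(45)) / g
x, y = projectile_position(10, 45, t_flight)
print(abs(y) < 1e-9)   # True: the projectile is back at launch height
```

The same trajectory can then be compared against the positions Unity's physics produces frame by frame.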
Exp.No: 10 Build a 3D scene using VRML and explore it using various navigations.
Date:
To build a 3D scene using VRML and explore it using various navigations using Unity, you
can follow these steps:
You can do this by dragging and dropping the VRML file into the Project window. Unity will
automatically import the VRML file and create a new scene containing the 3D objects.
To do this, go to GameObject > Create Other > VRTK > Prefabs > CameraRig. This will create
a new camera rig object in the scene.
Unity provides a variety of navigation scripts that you can use to explore the 3D scene. For
example, you can add the VRTK_FreeFly script to allow the user to move freely through the
scene. You can also add the VRTK_BasicTeleport script to allow the user to teleport to
different locations in the scene.
Once you have added the navigation scripts to the camera rig object, you can play the scene
and start exploring the 3D scene.
Here is an example of how to add the VRTK_FreeFly script to the camera rig object:
using UnityEngine;
using VRTK;
}
}
To use this script, simply attach it to the camera rig object in the scene.
Here is an example of how to add the VRTK_BasicTeleport script to the camera rig object:
using UnityEngine;
using VRTK;
Output:
Result:
To use this script, simply attach it to the camera rig object in the scene. We can use other
navigation scripts from the VRTK to explore the 3D scene in different ways. For example, you
can use the VRTK_TouchpadMove script to allow the user to move around the scene using a
touchpad. You can also use the VRTK_Pointer script to allow the user to interact with objects
in the scene using a pointer.
Date:
To track using AR using Unity, you can use the AR Foundation package. AR Foundation is a
cross-platform framework that provides a common set of APIs for building AR applications.
Install the AR Foundation package. You can do this by going to Window > Package
Manager and searching for "AR Foundation".
Create a new Unity project.
Import the AR Foundation package into the project.
Add an AR Session Origin component to the scene. This component will provide the
AR system with a reference point for tracking.
Add an AR Trackable Manager component to the scene. This component will manage
the trackables that the AR system will track.
Create a new prefab for the trackable that you want to track. The prefab should have a
mesh renderer and a collider.
Add the AR Trackable component to the prefab.
Instantiate the trackable prefab in the scene.
Play the scene. The AR system will now track the trackable in the scene.
You can use the AR Trackable component to get the position and rotation of the
trackable in the scene. You can also use the AR Trackable to determine if the trackable
is visible to the user.
Here is an example of how to use the AR Trackable component to get the position and rotation
of the trackable in the scene:
using UnityEngine;
using UnityEngine.XR.ARFoundation;
Here is an example of how to use the AR Trackable component to determine if the trackable is
visible to the user:
using UnityEngine;
using UnityEngine.XR.ARFoundation;
Output:
Result:
We can use the AR Trackable component to track all sorts of objects, such as images, faces,
and planes. We can use the tracked objects to build AR experiences such as AR games, AR
navigation, and AR product visualization.
Date:
One way to implement haptic sensing in AR using Unity is to use the Haptics plugin from the
Unity Asset Store. This plugin provides a set of APIs for controlling haptic feedback on a
variety of devices, including mobile devices and VR headsets.
To use the Haptics plugin to implement haptic sensing in AR, you will need to:
C#
using UnityEngine;
using Haptics;
To use this script, simply attach it to the object in the AR scene that you want to generate haptic
feedback for. When the object is touched, the script will generate haptic feedback.
Another way to implement haptic sensing in AR using Unity is to use the AR Foundation
package, installed through the Package Manager. This package provides a framework for
building AR applications on mobile devices.
To use the AR Foundation plugin to implement haptic sensing in AR, you will need to:
C#
using UnityEngine;
using UnityEngine.XR.ARFoundation;
Output:
Result:
To use this script, simply attach it to the AR Session Origin object in the AR scene. When the
user touches the surface that the AR Session Origin is tracking, the script will generate haptic
feedback. We can use haptic feedback to enhance the AR experience in a variety of ways. For
example, you can use haptic feedback to provide feedback to the user when they are interacting
with virtual objects or when they are completing tasks in the AR environment.
Date:
Ergonomic studies
Unreal Engine (UE) can be used to create realistic simulations of human activities, which can
be used to study the ergonomics of different tasks and environments. For example, UE can be
used to create a simulation of a worker assembling a product on a factory line. This simulation
can be used to identify potential ergonomic hazards and to design safer and more efficient work
processes.
UE can also be used to create simulations of medical procedures. This can be used to train
surgeons on new procedures and to develop new surgical techniques. For example, UE can be
used to create a simulation of a minimally invasive surgery. This simulation can be used to
train surgeons on how to perform the surgery without making large incisions.
Aesthetic studies
UE can also be used to create simulations of different environments and objects. This can be
used to study the aesthetics of different designs and to develop new design concepts. For
example, UE can be used to create a simulation of a new product design. This simulation can
be used to test the product's appearance and usability.
UE can also be used to create simulations of different architectural designs. This can be used
to study the aesthetic impact of different designs and to develop new architectural concepts.
For example, UE can be used to create a simulation of a new skyscraper design. This simulation
can be used to test the skyscraper's appearance and to identify any potential aesthetic problems.
Here are some specific examples of how UE has been used to conduct ergonomic and aesthetic
studies:
Product design: Nike is using UE to create simulations of new product designs in order to test
their appearance and usability.
Architecture: The architectural firm Zaha Hadid Architects is using UE to create simulations
of new building designs in order to study their aesthetic impact and to identify any potential
aesthetic problems.
UE is a powerful tool that can be used to conduct a wide variety of ergonomic and aesthetic
studies. By creating realistic simulations of different tasks, environments, and objects, UE can
help us to understand and improve the design of our world.
Here are some examples of how Unity is being used for ergonomic and aesthetic studies:
Ergonomics: The University of Michigan is using Unity to create simulations of assembly line
work to study the effects of different workstation designs on worker fatigue and productivity.
Medical: The Mayo Clinic is using Unity to create simulations of surgical procedures to train
surgeons and develop new surgical techniques.
Product design: The company IKEA is using Unity to create simulations of new product
designs to test their usability and aesthetic appeal.
Architecture: The architectural firm Foster + Partners is using Unity to create simulations of
new building designs to test their environmental impact and aesthetic effect.
Result:
Overall, Unity is a powerful and versatile tool that can be used to conduct a wide variety of
ergonomic and aesthetic studies. It is a good choice for users who are looking for an affordable
and easy-to-use game engine that can create realistic simulations.
Exp.No 14:
Study on identification of a real-life problem in thrust areas: Game Development
Date:
Problem: The lack of diversity in game development.
Examples:
The lack of diversity in game development teams and games can lead to games that are not
inclusive or representative of all players.
This can have a negative impact on players' experiences and can discourage people from
pursuing a career in game development.
Potential solutions:
Game development companies need to make a concerted effort to hire more diverse teams.
Developers need to be mindful of the diversity of their players and create games that are
inclusive and representative of everyone.
The game development community needs to support and promote diverse voices and
perspectives.
Problem: The high cost of game development.
Examples:
The high cost of game development can lead to a lack of innovation and diversity in the games
market.
It can also make it difficult for new developers to enter the industry.
Potential solutions:
Game development companies need to find ways to reduce the cost of game development.
Governments and other organizations can provide financial support to independent developers.
There are also a number of tools and resources available to help developers reduce the cost of
game development.
Problem: The environmental impact of video games.
Examples:
The production of video game consoles and games requires the use of energy and resources.
Video games can also generate electronic waste.
The data centers that power online games consume a lot of energy.
Impact:
The video game industry can have a negative impact on the environment.
This is especially concerning given the growing popularity of video games.
Potential solutions:
Game developers can use more sustainable practices in the development and production of
their games. Players can also take steps to reduce the environmental impact of their gaming
habits, such as by buying used games and recycling their old consoles and games.
Result:
These are just a few examples of real-life problems in thrust areas in game development. As
the game development industry continues to grow and evolve, new challenges will emerge.
However, with the support of the game development community, we can work to address these
challenges and create a more inclusive, sustainable, and innovative industry.