
Higher Nationals

Internal verification of assessment decisions – BTEC (RQF)


INTERNAL VERIFICATION – ASSESSMENT DECISIONS

Programme title        HND in Computing
Assessor               Mr. Abdul Rahman
Internal Verifier
Unit(s)                45: Emerging Technologies
Assignment title       Researching an ET for a potential client
Student's name         Mohammed Aiyoob Anas
List which assessment criteria the Assessor has awarded:    Pass    Merit    Distinction

INTERNAL VERIFIER CHECKLIST

Do the assessment criteria awarded match those shown in the assignment brief? Y/N

Is the Pass/Merit/Distinction grade awarded justified by the assessor's comments on the
student work? Y/N

Has the work been assessed accurately? Y/N

Is the feedback to the student:
• Constructive? Y/N
• Linked to relevant assessment criteria? Y/N
• Identifying opportunities for improved performance? Y/N
• Agreeing actions? Y/N
Give details:

Does the assessment decision need amending? Y/N

Assessor signature Date

Internal Verifier signature Date


Programme Leader signature (if required)    Date
Confirm action completed
Remedial action taken
Give details:

Assessor signature Date


Internal Verifier signature                 Date

Programme Leader signature (if required)    Date
Higher Nationals - Summative Assignment Feedback Form
Student Name/ID       Mohamed Aiyoob Anas / COL00079253
Unit Title            45: Emerging Technologies
Assignment Number     1                     Assessor
Submission Date       2022/9/12             Date Received 1st submission
Re-submission Date                          Date Received 2nd submission
Assessor Feedback:
LO1 Assess what Emerging Technologies are necessary and appropriate when designing software
applications for the future
Pass, Merit & Distinction P1 P2 M1 M2 D1
Descripts

LO2 Research state-of-the-art Emerging Technologies and choose one you believe will have significant
impact in the future
Pass, Merit & Distinction P3 P4 M3 M4
Descripts

LO3 Discuss the current state and future impact of your chosen Emerging Technology

Pass, Merit & Distinction P5 P6 M5 M6 D2


Descripts

LO4 Evaluate the political, economic and social factors which play a role in the competition between
Emerging Technologies and their success or failure in the future

Pass, Merit & Distinction P7 M7 D3


Descripts

Grade: Assessor Signature: Date:

Resubmission Feedback:

Grade: Assessor Signature: Date:

Internal Verifier’s Comments:

Signature & Date:

* Please note that grade decisions are provisional. They are only confirmed once internal and external moderation has taken place and
grade decisions have been agreed at the assessment board.
Assignment Feedback
Formative Feedback: Assessor to Student

Action Plan

Summative feedback

Feedback: Student to Assessor

Assessor Date
signature

Student Date
signature
Pearson Higher Nationals in
Computing
45: Emerging Technologies
Assignment 01 of 01
General Guidelines

1. A cover page or title page – You should always attach a title page to your assignment. Use the
previous page as your cover sheet and make sure all the details are accurately filled in.
2. Attach this brief as the first section of your assignment.
3. All assignments should be prepared using word processing software.
4. All assignments should be printed on A4 sized paper. Use single-sided printing.
5. Allow 1” margins at the top, bottom and right, and a 1.25” margin on the left of each page.

Word Processing Rules

1. The font size should be 12 point, and the font style should be Times New Roman.
2. Use 1.5 line spacing. Left justify all paragraphs.
3. Ensure that all headings are consistent in terms of font size and font style.
4. Use the footer function of the word processor to insert your name, subject, assignment
number and page number on each page. This is useful if individual sheets become detached
for any reason.
5. Use the word processor's spell check and grammar check functions to help edit your
assignment.

Important Points:

1. It is strictly prohibited to use text boxes to add text to the assignments, except for
compulsory information, e.g. figures, tables of comparison, etc. Adding text boxes to the body
other than for the aforementioned compulsory information will result in rejection of your work.
2. Avoid using page borders in your assignment body.
3. Carefully check the hand-in date and the instructions given in the assignment. Late submissions
will not be accepted.
4. Ensure that you give yourself enough time to complete the assignment by the due date.
5. Excuses of any nature will not be accepted for failure to hand in the work on time.
6. You must take responsibility for managing your own time effectively.
7. If you are unable to hand in your assignment on time and have valid reasons such as illness, you
may apply (in writing) for an extension.
8. Failure to achieve at least PASS criteria will result in a REFERRAL grade.
9. Non-submission of work without valid reasons will lead to an automatic REFERRAL. You will
then be asked to complete an alternative assignment.
10. If you use other people's work or ideas in your assignment, reference them properly using the
HARVARD referencing system to avoid plagiarism. You have to provide both in-text citations and
a reference list.
11. If you are proven guilty of plagiarism or any academic misconduct, your grade could be
reduced to a REFERRAL or, at worst, you could be expelled from the course.
Student Declaration

I hereby, declare that I know what plagiarism entails, namely to use another’s work and to present it as
my own without attributing the sources in the correct form. I further understand what it means to
copy another’s work.

1. I know that plagiarism is a punishable offence because it constitutes theft.


2. I understand the plagiarism and copying policy of Edexcel UK.
3. I know what the consequences will be if I plagiarise or copy another’s work in any of the
assignments for this program.
4. I declare therefore that all work presented by me for every aspect of my program, will be my
own, and where I have made use of another’s work, I will attribute the source in the correct
way.
5. I acknowledge that the attachment of this document, signed or not, constitutes a binding
agreement between myself and Pearson UK.
6. I understand that my assignment will not be considered as submitted if this document is not
attached to the assignment.

Student's Signature: aiyoobanas0@gmail.com        Date: 2022/09/12
(Provide E-mail ID)                               (Provide Submission Date)
Higher National Diploma in Computing
Assignment Brief
Student Name /ID Number Mohamed Aiyoob Anas / COL00079253
Unit Number and Title 45: Emerging Technologies

Academic Year 2021/2022


Unit Tutor

Assignment Title Dex Consulting - ET & the Future

Issue Date

Submission Date

IV Name & Date

Submission format

Submission for this assignment should be a document and a presentation.

1. Final Report – Arrange all your answers in a professionally written report.


2. Research report as a part of the research conducted - Develop a research report using
research gathered about your chosen Emerging Technology, industry and end user
3. 15-minute Presentation – Prepare a presentation to demonstrate your findings, gather
feedback and answer questions.

For the final report and the research report, you are expected to use an appropriate
structure – including headings, paragraphs, subsections and illustrations as appropriate – and
all work must be supported with research and referenced using the Harvard referencing system.

For your presentation you will be expected to utilize appropriate tools (PowerPoint, etc.) and
include support material such as wireframes, diagrams, sketches, user interviews, etc. where
appropriate.
Unit Learning Outcomes:

LO1 - Assess what Emerging Technologies are necessary and appropriate when designing
software applications for the future.

LO2 - Research state-of-the-art Emerging Technologies and choose one you believe will have
significant impact in the future.

LO3 - Discuss the current state and future impact of your chosen Emerging Technology.

LO4 - Evaluate the political, economic and social factors which play a role in the competition
between emerging technologies and their success or failure in the future.
Assignment Brief and Guidance:

Scenario

‘Dex Consulting’ is a leading research and consultancy firm researching new market trends and
Emerging Technologies for corporate clients and the consumer market. You currently work as a
trainee technology analyst for ‘Dex Consulting’. As part of your role, your manager has tasked
you with researching an Emerging Technology suitable for a potential client. You are required to
identify a specific user group you believe will be most influenced by this Emerging Technology.

As part of this assignment, you must develop a report using research data gathered about your
chosen Emerging Technology, industry and end user, and present your findings in a 15-minute
presentation.

You may add supporting evidence and material, such as user personas, hype cycles, etc., to the report.

Activity 01
 Assess the formats, characteristics and trends of Emerging Technologies and evaluate how they
can challenge the status quo of markets, established practices and end user
experiences. Your answer should be supported with valid and relevant examples.

 Review different emerging technologies relevant to software development by exploring
their advantages/disadvantages. Evaluate and justify the relevance and purpose of the
emerging technologies reviewed above when designing innovative and useful software
applications for the future.

Activity 02

Select and research a specific emerging technology that will have an impact on the software
development industry. Organize your research findings and produce a small research report
covering the following:

 Select a specific emerging technology as stated in the scenario and relate it to existing
technologies to demonstrate how the selected ET is likely to merge with or replace an
existing technology in the industry. Defend your choice of emerging technology by
evaluating why you believe it will have the most impact on future software development.

 Contrast and evaluate the benefits, features, advantages and disadvantages of the selected
ET. Identify the industry and the end user group that will be most influenced by the
selected ET and review how they will be impacted by it.

 Critically evaluate the above findings while justifying the selected ET and its impact on its
end users and the software development industry as a whole.

Activity 03

Demonstrate your research findings in a 15-minute presentation to the client to whom you are
recommending the emerging technology. The presentation should cover the
following:

 Demonstrate the findings of the research conducted on emerging technology, its benefits,
features, advantages and disadvantages.

 Gather feedback from the audience and answer the questions they raise about your
research. Document the feedback received and questions raised by the end users and
attach to your report.
Grading Rubric

Learning Outcomes and Assessment Criteria

Grading Criteria Achieved Feedback

LO1 Assess what Emerging Technologies are necessary and appropriate when designing software applications for the future

P1 Assess formats, characteristics and trends of Emerging Technologies.

P2 Explore the advantages and disadvantages of Emerging Technology.

M1 Evaluate the ability of Emerging Technology to disrupt the status quo throughout
industries, markets, user adoption and established practices.

M2 Review various forms of Emerging Technologies, focusing on their relevance to
software development and computing.

D1 Evaluate Emerging Technologies and justify their use when designing software
applications for the future.

LO2 Research state-of-the-art Emerging Technologies and choose one you believe will have significant impact in the future

P3 Select a specific Emerging Technology.

P4 Review a specific industry and end user group that will be the most influenced by this
Emerging Technology.

M3 Evaluate the benefits, features, advantages and disadvantages of this Emerging
Technology.

M4 Show how Emerging Technologies can converge with existing technologies or replace
them.

LO3 Discuss the current state and future impact of your chosen Emerging Technology

P5 Organise your research and findings.

P6 Contrast the benefits, features, advantages and disadvantages of your chosen Emerging
Technology.

M5 Relate how your chosen Emerging Technologies can converge with existing technologies
or replace them.

M6 Develop a report of your research and findings.

LO2 & LO3

D2 Defend your choice of Emerging Technology in relation to your belief it will have the
most impact on software application design and development in the future.

LO4 Evaluate the political, economic and social factors which play a role in the competition between Emerging Technologies and their success or failure in the future

P7 Evaluate your report findings and research.

M7 Arrange a presentation to demonstrate your findings, gather feedback and answer
questions.

D3 Critique the benefits, features, advantages and disadvantages of your chosen Emerging
Technology.
Acknowledgement

Working on this assignment has been an amazing, thrilling, sometimes difficult,
but always intriguing experience. This task could not have been accomplished
without the help of a large number of people. First of all, I would like to express
my deep and sincere gratitude to my lecturer, Mr Abdul Rahman. This work
could not have been completed correctly without his guidance and help. He
constantly offers us his help and teaches us how to approach our assignments
in order to obtain a good result. I would also like to thank my institute,
ESOFT Metro Campus, for providing us with the essential resources.

Mohamed Aiyoob Anas


Contents

Activity 01
What is Emerging Technology?
What are the emerging technologies?
Artificial intelligence (AI)
Internet of things (IoT)
Blockchain
Big data
Automation
Robotics
Immersive Media
Cloud computing
Nanotechnology
Holography
Characteristics of emerging technologies
Advantages of Emerging Technology
Disadvantages of Emerging Technology
5 industries and the technology challengers for each industry
Real estate
Healthcare industry
Retail
Transportation
Education industry
4 emerging technologies and how they affect the software/IT industry
How Artificial intelligence affects the software/IT industry
How Automation affects the software/IT industry
How robotics affects the software/IT industry
How cloud computing affects the software/IT industry
How emerging technology is going to affect the IT industry in the future
Artificial intelligence
Automation
Activity 02
The chosen technology for Dex Consulting is Artificial intelligence (AI)
What Is Artificial Intelligence (AI)?
Understanding Artificial Intelligence (AI)
Artificial Intelligence Applications
How artificial intelligence works
Features of Artificial Intelligence
How Artificial Intelligence Will Change the Future
Types of Artificial Intelligence
Benefits of choosing Artificial intelligence (AI)
Advantages and Disadvantages of Artificial Intelligence
How Artificial intelligence relates to the transportation industry
How Artificial intelligence relates to the agriculture industry
How Artificial intelligence relates to the healthcare industry
How Artificial intelligence relates to the Utilities & Energy industry
How the chosen technology impacts the software industry
Role of AI in Software Development
Technological convergence and opportunities
Technological convergence definition
Convergence of Artificial intelligence
How artificial intelligence converges with IoT
How Artificial intelligence converges with Blockchain
How Artificial intelligence converges with VR
How Artificial intelligence converges with AR
Activity 03
Content
3 emerging technologies that we are going to discuss under influencing factors
Factors
How political factors impact AI
How economic factors impact AI
How social factors impact AI
How political factors impact Cloud computing
How economic factors impact Cloud computing
How social factors impact Cloud computing
How political factors impact Blockchain
How economic factors impact Blockchain
How social factors impact Blockchain
Presentation of Emerging technologies
Google form survey
Survey results analysis
Conclusion
References
Activity 01

What is Emerging Technology?

Emerging technology is a term that generally refers to a new technology, but it can also refer
to the ongoing development of an existing technology; it can have slightly different meanings
when used in different fields such as media, business, science, or education. The term
typically refers to technologies that are now being developed or that are projected to be
accessible within the next five to ten years, and it is usually reserved for technologies that are
having, or are expected to have, substantial societal or economic implications.
Emerging digital technologies have created new opportunities while also posing new legal
issues, notably in the areas of intellectual property, trademarks, patents, royalties, and
licensing. The emergence of new digital communication technologies and media, for
example, has given birth to unique difficulties concerning the digital reproduction and
dissemination of copyrighted works. The federal government, relevant sectors, and public
interest groups have taken (and continue to take) action to establish adequate safeguards and
provide legal clarity to copyright holders, digital technology enterprises, the public, and other
interested parties (Wiston, 2022).

What are the emerging technologies?

Artificial intelligence (AI)


Machines may now mimic the powers of the human mind thanks to artificial intelligence. AI
is becoming more prevalent in everyday life, from the creation of self-driving cars to the
spread of smart assistants such as Siri and Alexa. As a result, numerous technology
businesses across a wide range of sectors are investing in artificially intelligent technologies.
Less than a decade after assisting the Allies in winning World War II by deciphering the Nazi
encryption system Enigma, mathematician Alan Turing changed history yet again with a
simple question: "Can machines think?"
Turing's 1950 work "Computing Machinery and Intelligence" and the Turing Test that
followed set the essential purpose and vision of AI.
At its heart, artificial intelligence (AI) is the branch of computer science that seeks to
answer Turing's question in the affirmative (Investopedia, 2022).
DEFINITION OF ARTIFICIAL INTELLIGENCE: FOUR TYPES OF APPROACHES

 Thinking humanly: simulating cognition in the human mind.
 Thinking rationally: simulating cognition via logical reasoning.
 Acting humanly: acting in a way that resembles human behavior.
 Acting rationally: acting in a way that is intended to attain a certain aim.

The first two concepts are about cognitive processes and reasoning, whereas the rest are
about conduct. Norvig and Russell are particularly interested in rational agents who behave
to maximize their chances of success, stating that "all of the abilities required for the Turing
Test also allow an agent to act rationally."

Patrick Winston, a former MIT professor of AI and computer science, described AI as
"algorithms enabled by constraints, exposed by representations that support models targeted
at loops that tie thinking, perception and action together."

While these definitions may appear abstract to the ordinary person, they help to focus
the discipline as an area of computer science and provide a roadmap for integrating machine
learning and other subsets of AI into machines and programs.

The Four Types of Artificial Intelligence

AI is classified into four categories based on the type and complexity of the tasks a system
can perform. Automated spam filtering, for example, belongs to the most basic type
of AI, while the far-off possibility of machines that can perceive people's thoughts and
feelings belongs to an altogether different AI subset.

1. Reactive Machines
2. Limited Memory
3. Theory of Mind
4. Self-awareness
Reactive Machines

A reactive machine adheres to the most fundamental AI principles and, as the name suggests,
is only capable of utilizing its intellect to observe and react to the environment in front of it.
Because a reactive machine lacks memory, it cannot depend on prior experiences to influence
real-time decision making.

Because reactive machines perceive the world directly, they are designed to perform only a few
specialized tasks. Intentionally narrowing a reactive machine's worldview is not a cost-cutting
tactic, however; rather, it means this form of AI will be more trustworthy and dependable: it
will respond consistently to the same stimuli.

Deep Blue, a chess-playing supercomputer created by IBM in the 1990s that defeated world
champion Garry Kasparov in a match, is a well-known example of a reactive machine. Deep Blue
could only identify the pieces on a chess board and know how each moves according to the
rules of chess, as well as recognize each piece's current location and determine what the most
logical move would be at that time. The computer was not anticipating prospective moves by
its opponent or attempting to better place its own pieces. Every turn was seen as its own
reality, apart from any previous movements.

Google's AlphaGo is another example of a reactive game-playing AI. AlphaGo is likewise


incapable of predicting future moves, instead relying on its own neural network to assess
current game developments, giving it an advantage over Deep Blue in a more complex game.
AlphaGo has also defeated world-class Go players, including champion Lee Sedol in 2016.

Though restricted in scope and difficult to modify, reactive machine AI can achieve a level of
complexity and dependability when designed to perform recurring tasks.
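
To make the contrast concrete, below is a minimal, purely illustrative Python sketch of a
reactive agent (the thermostat rule and thresholds are hypothetical): it keeps no memory and
maps the current observation directly to an action, which is why it always responds
identically to the same stimulus.

# A toy reactive agent: no memory, no model of the past. The mapping
# from the current observation to an action is fixed, so identical
# inputs always produce identical outputs.

def reactive_thermostat(current_temp_c: float) -> str:
    """Decide an action from the present reading alone."""
    if current_temp_c > 26.0:
        return "cool"
    if current_temp_c < 18.0:
        return "heat"
    return "idle"

# The same stimulus always yields the same response.
assert reactive_thermostat(30.0) == reactive_thermostat(30.0)
print(reactive_thermostat(21.5))  # -> idle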
Limited Memory

When gathering information and considering prospective options, AI with limited memory
can preserve prior facts and forecasts – effectively peering into the past for indications on
what may happen ahead. AI with limited memory is more complicated and has more potential
than reactive computers.

Limited memory AI is developed when a team regularly trains a model to interpret and use
fresh data, or when an AI environment is constructed so that models can be automatically
trained and updated.
Six steps must be followed when using limited memory AI in ML: the training data must be
produced, the ML model must be created, the model must be able to make predictions,
the model must be able to receive human or environmental feedback, that feedback must be
stored as data, and these steps must be repeated in a cycle.

There are three major ML models that utilize limited memory AI:

 Reinforcement learning, which teaches itself to produce better predictions through
trial and error.

 Long short-term memory (LSTM), which uses previous data to anticipate the next
item in a sequence (see the sketch after this list). When generating predictions, LSTMs
prioritize recent information and devalue data from the more distant past, yet still use
it to draw inferences.

 E-GANs, or evolutionary generative adversarial networks, which evolve over time,
exploring slightly changed routes based on prior experiences with each new choice.
This model is always looking for a better path and uses simulations, statistics, or
chance to forecast outcomes throughout its evolutionary mutation cycle.
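
As a minimal sketch of the LSTM idea only (it assumes TensorFlow/Keras is installed; the
sine-wave data, window size and layer sizes are arbitrary choices for illustration), the model
below reads a short window of past values and predicts the next item in the sequence:

import numpy as np
from tensorflow import keras

# Synthetic sequence: predict x[t] from the previous 8 values.
series = np.sin(np.linspace(0, 20, 500)).astype("float32")
window = 8
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., None]  # shape: (samples, timesteps, features)

model = keras.Sequential([
    keras.layers.LSTM(16, input_shape=(window, 1)),  # keeps short-term context
    keras.layers.Dense(1),                           # next-value forecast
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, verbose=0)

print(model.predict(X[:1], verbose=0))  # forecast of the next value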
Theory of Mind

Theory of Mind is, well, theoretical. We have not yet developed the technological and
scientific capabilities required to attain this next level of artificial intelligence.

The notion is founded on the psychological premise that other living creatures have
thoughts and feelings that influence one's own behavior. This would imply that AI machines
could understand how people, animals, and other machines feel and make decisions through
self-reflection and determination, and then use that information to draw their own conclusions.
Machines would essentially have to be able to comprehend and interpret the idea of "mind,"
the changes of emotions in decision making, and a slew of other psychological concepts in
real time, creating a two-way relationship between people and AI.

Self-awareness

Once Theory of Mind has been created, AI will be able to become self-aware at some point
in the future. This type of AI has human-level consciousness and recognizes its own
presence in the environment, as well as the presence and emotional state of others. It
would be able to grasp what others may require based not only on what they communicate,
but also on how they communicate it.

Self-awareness in AI is dependent on both human researchers comprehending the concept of


consciousness and then discovering how to reproduce it so that it may be incorporated into
machines.
What Is the Importance of Artificial Intelligence?

AI has several applications, ranging from accelerating vaccine research to automating the
identification of possible fraud.
According to CB Insights, AI private market activity reached a new high in 2021, with
worldwide financing increasing 108 percent over the previous year. AI is creating waves in
a multitude of areas due to its rapid adoption.
According to Business Insider Intelligence's AI in Banking 2022 research, more than half of
financial services organizations now utilize AI technologies for risk management and
revenue creation. The use of AI in banking might result in savings of up to $400 billion.

In terms of medicine, a World Health Organization report from 2021 stated that, while
incorporating AI into the healthcare profession presents hurdles, the technology "holds
considerable potential," since it might lead to benefits such as more informed health policy
and improved patient diagnosis accuracy.

AI has also made an impact in the entertainment industry. According to Grand View
Research, the worldwide market for AI in media and entertainment is expected to reach
$99.48 billion by 2030, up from $10.87 billion in 2021. This growth includes AI
applications such as plagiarism detection and the creation of high-definition visuals.

Artificial Intelligence Pros and Cons

While AI is widely regarded as a vital and rapidly expanding asset, this burgeoning
discipline is not without its drawbacks.
In 2021, the Pew Research Center polled 10,260 Americans on their thoughts regarding AI.
According to the findings, 45 percent of respondents are equally excited and concerned,
with 37 percent being more concerned than excited. Furthermore, more than 40% of
respondents thought driverless automobiles were harmful to society. However, the notion of
employing AI to detect the spread of misleading information on social media was more
generally welcomed, with nearly 40% of those polled saying it was a good idea.
Internet of things (IoT)
The internet of things, or IoT, is a networked system of interconnected computing devices,
mechanical and digital machines, objects, animals, or people with unique identifiers (UIDs)
and the capacity to transfer data over a network without requiring human-to-human or
human-to-computer interaction.

A thing in the internet of things can be a person implanted with a heart monitor, a farm
animal implanted with a biochip transponder, a car with built-in sensors to alert the driver
when tire pressure is low, or any other natural or man-made object that can be assigned an
Internet Protocol (IP) address and can transfer data over a network.

Organizations in a range of sectors are increasingly utilizing IoT to run more effectively,
better understand consumers in order to provide better customer service, improve decision-
making, and raise the value of the business (Oracle, 2022).

How does IoT work?

An IoT ecosystem is made up of web-enabled smart devices that employ embedded systems
including processors, sensors, and communication gear to gather, send, and act on data from
their surroundings. IoT devices exchange sensor data by connecting to an IoT gateway or
other edge device, where data is either transferred to the cloud for analysis or examined
locally. These gadgets occasionally interact with one another and act on the information
they receive. The gadgets conduct the majority of the work without human interaction,
while individuals may engage with them to set them up, give them instructions, or view the
data.
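
As a rough illustration of that flow, the sketch below shows a device publishing sensor
readings to a gateway over MQTT, a messaging protocol widely used in IoT. It assumes the
paho-mqtt package (1.x-style constructor); the broker address, topic and readings are
hypothetical placeholders, not a real deployment.

import json
import random
import time

import paho.mqtt.client as mqtt

BROKER = "gateway.local"     # hypothetical IoT gateway / edge device
TOPIC = "sensors/room1/env"  # hypothetical topic

client = mqtt.Client()        # paho-mqtt 1.x style constructor
client.connect(BROKER, 1883)  # standard unencrypted MQTT port

for _ in range(3):  # a real device would loop indefinitely
    reading = {
        "temperature_c": round(random.uniform(18, 30), 1),  # stand-in for a real sensor
        "humidity_pct": round(random.uniform(30, 70), 1),
        "ts": time.time(),
    }
    # The gateway can forward this to the cloud or analyze it locally.
    client.publish(TOPIC, json.dumps(reading))
    time.sleep(60)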
Why is IoT important?

People may use the internet of things to live and work smarter, as well as gain greater
control over their lives. IoT is critical to business in addition to providing smart gadgets to
automate homes. IoT gives organizations a real-time view of how their systems actually
operate, providing insights into everything from machine performance to supply chain and
logistics operations.

Companies may use IoT to automate operations and cut labor expenses. It also reduces
waste and enhances service delivery, lowering the cost of manufacturing and delivering
items while also providing transparency into consumer interactions.
What are the advantages of IoT for businesses?

Organizations may profit from the internet of things in a variety of ways. Some advantages
are industry-specific, while others apply across various industries. Some of the most popular
advantages of IoT enable firms to:

 keep track of their overall business processes;
 enhance the customer experience (CX);
 save both time and money;
 improve staff productivity;
 integrate and adapt business models;
 improve business decisions; and
 increase revenue.

IoT pushes organizations to rethink their business approaches and provides them with the
tools to better their company strategy.
In general, IoT is most prevalent in manufacturing, transportation, and utility sectors, where
sensors and other IoT devices are used; but it has also found applications in agriculture,
infrastructure, and home automation, propelling some firms toward digital transformation.
Farmers may profit from IoT by making their jobs simpler. Sensors may collect data on
rainfall, humidity, temperature, and soil composition, among other things, to aid in the
automation of agricultural practices.

The ability to monitor infrastructure operations is another element that IoT may help with.
Sensors, for example, might be used to track events or changes in structures such as
buildings, bridges, and other infrastructure. This has advantages such as cost savings, time
savings, quality-of-life workflow adjustments, and paperless workflow.
IoT may be used by a home automation company to monitor and control mechanical and
electrical systems in a building. Smart cities can help residents minimize waste and energy
usage on a larger scale.
IoT has an impact on every industry, including healthcare, banking, retail, and
manufacturing.
What are the advantages and disadvantages of IoT?
The following are some of the benefits of IoT:

 ability to access information from anywhere at any time on any device;
 improved communication between connected electronic devices;
 saving time and money by transferring data packets over a connected network; and
 automating tasks to help improve the quality of a business's services and reduce the
need for human intervention.

The following are some of the downsides of IoT:

 As the number of connected devices grows and more information is shared between
them, the possibility of a hacker stealing private information grows.
 Enterprises may someday have to deal with enormous numbers of IoT devices,
maybe millions, and collecting and managing data from all of those devices will be
difficult.
 If there is a flaw in the system, every linked device will most certainly get corrupted.
 Because there is no universal IoT interoperability standard, it is difficult for devices
from various manufacturers to connect with one another.

IoT applications for consumers and businesses


The internet of things has several real-world applications, spanning from consumer and
commercial IoT to manufacturing and industrial IoT (IIoT). IoT applications are used in a
variety of industries, including automotive, telecommunications, and energy.

Smart houses outfitted with smart thermostats, smart appliances, and connected heating,
lighting, and electrical gadgets, for example, may be managed remotely via computers and
smartphones in the consumer market.

Wearable devices with sensors and software may gather and analyze user data, transmitting
signals about the users to other technologies in order to make their lives easier and more
comfortable. Wearable gadgets are also employed in public safety, such as increasing first
responder reaction times during emergencies by offering efficient routes to a place or
tracking construction workers' or firefighters' vital signs at potentially hazardous locations.

In healthcare, IoT has several advantages, including the capacity to more closely monitor
patients through data analysis. IoT technologies are frequently used in hospitals to fulfill
activities such as inventory management for drugs and medical devices.

Smart buildings, for example, can save energy by utilizing sensors that identify how many
people are in a room. The temperature can be adjusted automatically, such as putting on the
air conditioner if sensors detect that a conference room is full or turning down the heat if
everyone has left the workplace.
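
The occupancy logic described above reduces to a simple rule set; here is a minimal,
hypothetical sketch (the thresholds and sensor values are invented for illustration):

# Occupancy-driven HVAC rules, as described above. Sensor values
# and thresholds are hypothetical.

def hvac_action(occupants: int, room_temp_c: float) -> str:
    if occupants == 0:
        return "turn heating/cooling down"  # everyone has left
    if occupants >= 10 and room_temp_c > 24.0:
        return "turn air conditioning on"   # full conference room
    return "hold current setting"

print(hvac_action(occupants=12, room_temp_c=26.0))
print(hvac_action(occupants=0, room_temp_c=21.0))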

In agriculture, IoT-based smart farming systems may monitor agricultural fields' light,
temperature, humidity, and soil moisture via linked sensors. IoT is also useful for
automating irrigation systems.

IoT sensors and installations, such as smart lighting and smart meters, can help ease traffic
congestion, conserve energy, monitor and manage environmental problems, and improve
sanitation in a smart city.

IoT security and privacy issues

The internet of things involves the usage of billions of data points and the connection of
billions of devices to the internet, all of which must be protected. IoT security and privacy
are important concerns due to its extended attack surface.

Mirai, a botnet that hacked the domain name server provider Dyn and took down many websites
for an extended period in one of the largest distributed denial-of-service (DDoS) attacks
ever seen, was one of the most prominent IoT attacks of 2016. Attackers
gained network access by exploiting inadequately protected IoT devices.
Because IoT devices are so tightly linked, a hacker only has to exploit one weakness to
corrupt all of the data, rendering it useless. Manufacturers that fail to update their gadgets
on a regular basis, if at all, leave them open to hackers.

Furthermore, linked gadgets frequently prompt users to provide personal information such
as names, ages, residences, phone numbers, and even social network accounts – information
that hackers might exploit.

Hackers aren't the only threat to the internet of things; IoT consumers are also concerned
about their privacy. Companies that manufacture and distribute consumer IoT devices, for
example, might utilize those devices to collect and sell personal data from customers.

Aside from exposing personal data, IoT endangers key infrastructure such as power,
transportation, and financial services.

How is IoT changing the world? Take a look at connected cars.

By enabling connected cars, IoT is revolutionizing the automobile industry. Car owners may use
IoT to operate their vehicles remotely, for example preheating the car before the driver gets in
it or summoning a car by phone. Thanks to IoT's capacity for device-to-device communication,
cars will even be able to book their own service appointments.

The connected automobile enables automakers and dealers to flip the car ownership model on
its head. Previously, manufacturers had a distant relationship with individual buyers (or none
at all); the manufacturer's association with the vehicle essentially ended once it was
delivered to the dealer. With connected cars, automobile manufacturers and dealers can
maintain an ongoing relationship with their customers. Instead of selling
automobiles, they may charge drivers use fees and provide "transportation-as-a-service"
using self-driving cars. IoT enables car makers to continually upgrade their vehicles with
new software, which is a significant departure from the conventional paradigm of auto
ownership in which vehicles deteriorate in performance and value.
Advantages of IoT

The Internet of Things has various benefits in our daily lives. Some of its benefits are as
follows:

Reduce the need for human intervention: As IoT devices connect and communicate with
one another, they may automate activities, improving the quality of a business's services and
minimizing the need for human involvement.

Save time: Reducing human effort saves us a lot of time. One of the key benefits of
adopting the IoT platform is the ability to save time.

Improved data collection: Information is easily accessible, even when we are far from our
actual location, and it is updated frequently and in real time. As a result, these devices can
access data from anywhere at any time on any device.

Improved security: An integrated system can help with better control of homes and towns
via mobile phones. It enhances security and offers personal protection.

Efficient resource usage: By understanding the functioning and operation of each device,
we can improve resource utilization and monitor natural resources.

Reduced usage of other electronic equipment: Because electric devices are directly
connected and can communicate with a controller computer, such as a cell phone, power
consumption is reduced and less separate electronic equipment is needed.

Use in traffic systems: Asset tracking, delivery, surveillance, traffic or transportation


tracking, inventory control, individual order tracking, and customer management may all be
made more cost-effective by utilizing IoT technology.
Disadvantages of IoT

As the Internet of Things brings benefits, it also brings a considerable set of problems. The
following are some of the downsides of IoT:

Concerns about security: IoT systems are networked and interact via networks. As a
result, despite any security precautions, the system offers limited control and can be
exposed to many types of network attacks.

Concern about privacy: The IoT system collects essential personal data in great detail
without the user's active participation.

Increased unemployment: Both unskilled and skilled employees are at risk of losing
their jobs, resulting in high unemployment rates. Smart security cameras, robots, smart
ironing systems, smart washing machines, and other amenities are taking the place of the
humans who used to execute these tasks.

The system's complexity: It is relatively difficult to design, build, manage, and maintain
the vast technology underlying an IoT system.

High likelihood of the entire system becoming corrupted: If there is a problem in the
system, every linked device may get corrupted.
Lack of international standards: Because there is no international standard for IoT
interoperability, it is difficult for devices from various manufacturers to connect with one
another.

High internet reliance: IoT devices are extremely reliant on the internet and cannot
operate properly without it.
Blockchain

A blockchain is a distributed database or ledger that is shared among the nodes of a computer
network. Like a database, a blockchain stores information electronically in digital format.
Blockchains are well recognized for their critical function in cryptocurrency systems like
Bitcoin, where they keep a secure and decentralized record of transactions. The blockchain's
novelty is that it ensures the accuracy and security of a data record and produces trust
without the requirement for a trusted third party.

The way data is organized differs significantly between a traditional database and a
blockchain. A blockchain accumulates information in groupings known as blocks, which
store sets of data. When a block's storage capacity is reached, it is closed and connected to
the previous full block, producing a data chain known as the blockchain. All new
information that follows that newly added block is assembled into a newly formed block,
which is then added to the chain once it is complete.

A database typically organizes its data into tables, whereas a blockchain, as the name suggests,
organizes its data into chunks (blocks) that are linked together. When implemented in a
decentralized manner, this data structure creates an irreversible timeline of data. When a block
is completed, it becomes permanent and forms a part of this timeline, and each block added to
the chain is given a precise timestamp (Investopedia, 2022).
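
A minimal Python sketch of that structure (illustrative only, not a real blockchain
implementation): each block stores data, a timestamp and the hash of the previous block, so
editing any finished block breaks the chain behind it.

import hashlib
import json
import time

def block_hash(body: dict) -> str:
    payload = json.dumps(body, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def new_block(data, prev_hash: str) -> dict:
    block = {"data": data, "timestamp": time.time(), "prev_hash": prev_hash}
    block["hash"] = block_hash(block)  # hash of the block's own contents
    return block

def chain_is_valid(chain) -> bool:
    for prev, cur in zip(chain, chain[1:]):
        body = {k: v for k, v in cur.items() if k != "hash"}
        if cur["hash"] != block_hash(body) or cur["prev_hash"] != prev["hash"]:
            return False
    return True

chain = [new_block("genesis", prev_hash="0" * 64)]
chain.append(new_block(["tx1", "tx2"], prev_hash=chain[-1]["hash"]))
chain.append(new_block(["tx3"], prev_hash=chain[-1]["hash"]))

print(chain_is_valid(chain))      # True: every block matches its hash and link
chain[1]["data"] = ["forged tx"]  # tamper with a completed block
print(chain_is_valid(chain))      # False: block 1 no longer matches its stored hash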

How Does a Blockchain Work?


Blockchain's purpose is to enable digital information to be recorded and disseminated, but
not altered. A blockchain, in this sense, serves as the foundation for immutable ledgers, or
records of transactions that cannot be changed, erased, or destroyed. As a result,
blockchains are also known as distributed ledger technologies (DLT).

The blockchain concept was initially proposed as a research project in 1991, and it
preceded its first widespread application, Bitcoin, in 2009. Since then, the usage of blockchains
has grown exponentially, thanks to the development of multiple cryptocurrencies,
decentralized finance (DeFi) apps, non-fungible tokens (NFTs), and smart contracts.
[Figures omitted: Transaction Process; Attributes of Cryptocurrency]

Blockchain Decentralization
Consider a corporation that has a server farm with 10,000 machines that are utilized to
manage a database containing all of its clients' account information. This corporation owns a
warehouse facility that houses all of these computers under one roof and has complete control
over each of these systems and all of the information they hold. However, this creates a single
point of failure. What happens if the power goes out at that location? What if its Internet
connection fails? What if it catches fire and burns to the ground? What if a malicious actor
wipes out everything with a single keystroke? The data is either lost or damaged in any
situation.

A blockchain allows the data in that database to be distributed across several network nodes
in different places. This not only adds redundancy but also ensures the accuracy of the data
stored—if someone tries to change a record in one instance of the database, the other nodes
are not affected, preventing a bad actor from doing so. If a single user tampers with Bitcoin's
transaction record, the other nodes will cross-reference each other and readily identify the
node with inaccurate information. This method helps to establish an exact and
transparent order of events. This way, no single node in the network can alter the
information stored within it.

As a result, information and history (such as cryptocurrency transactions) are irreversible.


Such a record may be a list of transactions (as with cryptocurrencies), but a blockchain can
also hold a variety of other information, such as legal contracts, state identifications, or a
company's product inventory.

To validate new entries or records to a block, a majority of the processing power in the
decentralized network must agree. Blockchains are protected by a consensus mechanism such as
proof of work (PoW) or proof of stake (PoS) to prevent bad actors from confirming bogus
transactions or double-spending. These mechanisms allow for consensus even when
no single node is in command.
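
A minimal sketch of the proof-of-work idea (the data string and the four-leading-zeros
difficulty are toy choices; real networks use vastly higher difficulty): a nonce is varied until
the block's hash meets the target, making valid blocks expensive to produce but cheap to verify.

import hashlib

def mine(block_data: str, difficulty: int = 4):
    """Search for a nonce whose hash has `difficulty` leading zeros."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest  # expensive to find...
        nonce += 1

nonce, digest = mine("prev_hash|tx1,tx2")
# ...but cheap to verify: anyone can re-run one hash to check the claim.
print(nonce, digest)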
Transparency

Because of the decentralized structure of Bitcoin's blockchain, all transactions may be
transparently observed by running a personal node or using blockchain explorers, which
allow anybody to see transactions taking place in real time. Every node maintains its own
allow anybody to witness transactions taking place in real time. Every node maintains its own
copy of the chain, which is updated as new blocks are confirmed and added. This means that
you could follow Bitcoin wherever it went if you wanted to.

Exchanges, for example, have been hacked in the past, and customers who stored Bitcoin on
the exchange lost everything. While the hacker is completely anonymous, the Bitcoins they
stole are easily traceable. It would be known if the Bitcoins stolen in some of these attacks
were relocated or spent someplace.

The records recorded on the Bitcoin blockchain (as well as the majority of others) are, of
course, encrypted. This implies that only the record's owner may decode it and disclose their
identity (using a public-private key pair). As a consequence, blockchain users may stay
anonymous while maintaining transparency.

Is Blockchain Secure?

In numerous ways, blockchain technology delivers decentralized security and trust. To begin,
new blocks are always kept in a linear and chronological order. That is, they are always
appended to the blockchain's "end." It is exceedingly difficult to go back and change the
contents of a block once it has been added to the end of the blockchain unless a majority of
the network has achieved a consensus to do so. This is due to the fact that each block contains
its own hash, as well as the hash of the block preceding it and the previously mentioned timestamp.
A mathematical function converts digital information into a string of numbers and letters to
generate hash codes. If that information is edited in any way, then the hash code changes as
well.
Assume a hacker, who also operates a node on a blockchain network, wishes to change a
blockchain and steal bitcoin from everyone else. If they changed their single copy, it would
no longer be in sync with everyone else's copy. When everyone else compares their copies to
each other, this one copy will stand out, and the hacker's version of the chain will be
dismissed as invalid.

To be successful, the hacker must simultaneously possess and change 51% or more of the
copies of the blockchain, so that their new copy becomes the majority copy and, therefore,
the agreed-upon chain. Such an assault would also need an enormous amount of money and
resources, since they would have to rewrite all of the blocks due to the varied timestamps and
hash codes.

Given the size and speed at which many cryptocurrency networks are growing, the cost of
accomplishing such a feat would almost certainly be prohibitive.
This would be not only exceedingly costly, but also likely futile. Such an action would not go
unnoticed by network participants, who would detect such substantial changes to the
blockchain. Members of the network would then hard fork off to a new version of the chain
that was not impacted. This would lead the value of the attacked version of the token to
collapse, rendering the attack ultimately futile because the bad actor now controls a worthless
asset. The same thing would happen if a bad actor attacked the next Bitcoin fork. It is built
this way so that taking part in the network is far more economically incentivized than
attacking it.

Blockchain vs. Bitcoin

Stuart Haber and W. Scott Stornetta, two researchers who aimed to develop a system in which
document timestamps could not be tampered with, proposed blockchain technology in
1991. But it wasn't until over two decades later, with the January 2009 introduction of
Bitcoin, that blockchain saw its first real-world implementation.
A blockchain is the foundation of the Bitcoin protocol. Bitcoin's pseudonymous developer,
Satoshi Nakamoto, described the digital currency in a research paper as "a new electronic
cash system that's totally peer-to-peer, with no trusted third party."

The crucial point to remember here is that while Bitcoin utilizes blockchain to transparently
record a ledger of payments, blockchain may theoretically be used to immutably store any
amount of data items. As previously said, this might take the shape of transactions, election
votes, goods inventories, state identifications, deeds to residences, and much more.

Currently, tens of thousands of initiatives are attempting to use blockchains for purposes
other than transaction recording, such as voting securely in democratic elections. Because of
the immutability of blockchain, fraudulent voting would become significantly more difficult.
For example, a voting system may be designed so that each person of a country receives a
separate coin or token. Each candidate would then be assigned a unique wallet address, and
voters would transmit their token or cryptocurrency to the address of the candidate for whom
they desire to vote. Because blockchain is transparent and traceable, it would eliminate both
the need for human vote counting and the ability of unscrupulous actors to tamper with
physical ballots.

Pros

 Improved accuracy by eliminating the need for human verification


 Cost savings from removing third-party verification
 Decentralization makes it harder to tamper with.
 Transactions are safe, confidential, and quick.
 Technology that is transparent
 Gives inhabitants of nations with unstable or undeveloped governments a financial
option and a mechanism to safeguard personal information.
Cons

 Significant technological costs are associated with bitcoin mining.


 Transactions per second are low.
 History of use in illicit operations, such as on the dark web
 Regulation varies by jurisdiction and is still in flux.
 Data storage constraints

Big data
What is big data?
Big data refers to huge, complex, structured and unstructured data collections that are created
and transferred in real time from a wide range of sources. These characteristics are captured
by the three Vs of big data:

1. Volume: The enormous volume of data stored.


2. Velocity: The lightning speed at which data streams must be processed and
analyzed.
3. Variety: The various sources and formats in which data is collected, such as
numbers, text, video, photos, and audio.

Data is continually created these days, whether we launch an app, search Google, or simply go
from place to place with our mobile devices. The result? Massive amounts of
valuable data that businesses and organizations must manage, store, visualize, and analyze.

Traditional data tools aren't meant to handle this level of complexity and volume, so a flurry
of specialist big data software and architectural solutions have emerged to help (Guru99,
2022).
WHAT ARE BIG DATA PLATFORMS?
Big data platforms are specifically intended to manage massive amounts of data that enter
the system at rapid speeds and in a broad variety. These big data platforms often include a
variety of servers, databases, and business intelligence tools that enable data scientists to
manipulate data in order to discover trends and patterns.
Volume

Big data is massive. Unlike regular data, which is measured in megabytes, gigabytes, and
terabytes, big data is measured in petabytes and zettabytes.

Consider this comparison from the Berkeley School of Information: one gigabyte is the
equivalent of a seven-minute HD film, but a single zettabyte is comparable to 250 billion
DVDs.
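
A quick sanity check of that scale comparison is shown below. It assumes decimal units and
roughly 4 GB per DVD; both figures are assumptions for illustration, not from the source.

```python
# Quick sanity check of the gigabyte/zettabyte comparison above.
GB = 10**9           # bytes in a gigabyte (decimal convention)
ZB = 10**21          # bytes in a zettabyte
dvd_capacity_gb = 4  # a single-layer DVD holds roughly 4-4.7 GB

dvds_per_zettabyte = ZB / (dvd_capacity_gb * GB)
print(f"{dvds_per_zettabyte:.2e} DVDs per zettabyte")  # ~2.5e11, i.e. ~250 billion
```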

This only scratches the surface. According to an EMC estimate, the digital universe is
doubling every two years and was anticipated to reach 44 zettabytes (44 trillion gigabytes) by 2020.

Big data gives the architecture for dealing with this type of data. It would be impossible to
mine for insights without adequate storage and processing options.

Velocity

Everything about big data is quick, from the rate at which it is generated to the amount of
time required to examine it. Some have compared it to drinking from a fire hose.

Companies and organizations must be able to harness this data and develop insights from it in
real-time, or it will be useless. Real-time processing enables decision makers to move fast,
providing them an advantage over their competitors.

While certain types of data may be batch processed and stay relevant over time, most big data
is flooding into businesses at breakneck speed and necessitates instant action for the best
results. Sensor data from medical gadgets is an excellent example. The capacity to process
health data in real time can offer users and doctors with potentially life-saving information.
Variety

Approximately 95% of all big data is unstructured, which means it does not fit neatly into a
basic, traditional model. A big data stream might include everything from emails and
videos to scientific and meteorological data, each with its own set of characteristics.

How is big data used?

Big data is intrinsically complicated due to its diversity, necessitating the development of
systems capable of handling its many structural and semantic distinctions.

Big data necessitates the use of specialized NoSQL databases that can store data in a way that
does not need strict adherence to a certain paradigm. This gives the flexibility required to
assess seemingly incongruous sources of information in order to get a comprehensive
understanding of what is going on, how to respond, and when to act.
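
A minimal sketch of that schema flexibility appears below, with plain Python dictionaries
standing in for the documents a NoSQL store such as MongoDB would hold; the field names
are invented for illustration.

```python
# Sketch of the schema flexibility a NoSQL document store provides.
# Plain Python dicts stand in for documents; a real system might use
# MongoDB, Couchbase, etc. All field names here are illustrative.

documents = [
    {"type": "email", "sender": "a@example.com", "body": "Quarterly numbers..."},
    {"type": "sensor", "device_id": 17, "reading": 36.6, "unit": "C"},
    {"type": "video", "duration_s": 314, "codec": "h264"},
]

# Each record carries its own structure, so seemingly incongruous sources
# can sit side by side and be queried by whatever fields they happen to have.
sensor_readings = [d["reading"] for d in documents if d.get("type") == "sensor"]
print(sensor_readings)  # [36.6]
```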

When gathering, processing, and analyzing large amounts of data, it is frequently categorized
as operational or analytical data and stored accordingly.

Operational systems distribute huge quantities of data over several servers and contain input
such as inventories, customer data, and purchases – the day-to-day information of a business.

Analytical systems are more advanced than operational systems, capable of managing
extensive data processing and delivering decision-making insights to enterprises. To optimize
data gathering and usage, these systems are frequently linked into existing processes and
infrastructure.

Data is ubiquitous, regardless of classification. Our phones, credit cards, software programs,
automobiles, records, websites, and the great majority of "things" in our environment can
transfer massive quantities of data, and this data is quite valuable.
In practically every business, big data is utilized to uncover patterns and trends, answer
questions, get insights into consumers, and solve complicated challenges. Companies and
organizations utilize information for a variety of purposes, including expanding their
operations, analyzing consumer decisions, improving research, forecasting, and targeting
critical audiences for advertising.

EXAMPLES OF BIG DATA

 Personalized buying experiences in e-commerce
 Financial market simulation
 Creating billions of data points to accelerate cancer research
 Media suggestions from streaming services such as Spotify, Hulu, and Netflix
 Crop production prediction for farmers
 Analyzing traffic patterns to reduce city congestion
 Data analytics technologies that recognize retail buying behavior and optimize
product placement
 Helping sports teams increase their efficiency and value
 Recognizing educational patterns among individual students, schools, and districts

Here are a few industries where the big data revolution is already in full swing:

Finance

Big data and predictive analytics are used in the banking and insurance industries for fraud
detection, risk assessments, credit rankings, brokerage services, and blockchain technology,
among other things.

Big data is also being used by financial organizations to improve cybersecurity and
customize financial choices for clients.
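
A toy illustration of the fraud-detection idea mentioned above is sketched below: a simple
outlier test (a z-score over past transaction amounts) flags suspicious activity. Real systems
use far richer features and models, and every figure here is invented.

```python
# Toy fraud-flagging sketch: score each transaction by how far it sits
# from the account's typical spend (a z-score). Data is made up.
from statistics import mean, stdev

amounts = [42.0, 38.5, 51.0, 45.2, 39.9, 4800.0]  # the last one looks odd
mu, sigma = mean(amounts[:-1]), stdev(amounts[:-1])

for amt in amounts:
    z = (amt - mu) / sigma
    if abs(z) > 3:  # a common, if crude, threshold for outliers
        print(f"Flag for review: {amt} (z={z:.1f})")
```
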
Healthcare

Big data solutions are being used by hospitals, researchers, and pharmaceutical businesses
to enhance and develop healthcare.

Healthcare is improving treatments, conducting more effective research on diseases such as
cancer and Alzheimer's, developing new drugs, and gaining critical insights into population
health patterns thanks to access to massive amounts of patient and population data.

Media & Entertainment

You've seen big data at work if you've ever used Netflix, Hulu, or any other streaming
service that provides suggestions.

Media companies use our reading, viewing, and listening habits to create personalized
experiences. Netflix even utilizes data on images, titles, and colors to inform client
preference selections.

Agriculture

Big data and technology are quickly improving the farming business, from creating seeds to
predicting crop yields with incredible precision.

With the flood of data over the last two decades, information has become more plentiful
than food in many nations, prompting academics and scientists to employ big data to
combat hunger and malnutrition. Some progress is being made in the effort to eradicate
world hunger, with organizations such as the Global Open Data for Agriculture & Nutrition
(GODAN) supporting open and unfettered access to global nutrition and agricultural data.
Advantages of Big Data

The following are the advantages or benefits of Big Data:

 Big data analysis yields novel solutions. It aids in consumer
understanding and targeting, and in the optimization of corporate operations.
 It aids in the advancement of science and research.
 It enhances healthcare and public health by making patient records available.
 It is used in financial trading, sports, polling, security/law enforcement, and so forth.
 Anyone may access large amounts of information through surveys and find
answers to their questions.
 Additions to the data are made every second.
 A single platform may carry a near-unlimited amount of data.

Disadvantages of Big Data

The following are some of Big Data's problems or disadvantages:

 Traditional storage may be expensive when it comes to storing large amounts of
data.
 A large amount of big data is unstructured.
 Big data analysis can conflict with privacy principles.
 It can be used to manipulate customer records.
 It has the potential to exacerbate social stratification.
 Big data analysis is not useful in the short run; it must be studied over a longer
period in order to reap the benefits.
 Big data analysis results can sometimes be misleading.
 Rapid updates in big data can cause results to diverge from real-world values.
Automation

The development and implementation of technology to create and distribute goods and
services with little human intervention is referred to as automation. The use of automation
technologies, techniques, and processes increases the efficiency, reliability, and/or speed of
operations formerly conducted by people.

Automation is employed in a variety of industries, including manufacturing, transportation,
utilities, defense, facilities, operations, and, more recently, information technology.

Popular corporate automation technologies include business process automation (BPA) and
robotic process automation (RPA). In general, the term BPA refers to applying
the notion of automation to entire business processes, whereas RPA refers to automating a
single, repetitive job.
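
A minimal Python stand-in for the RPA idea is sketched below: one repetitive job (filing
invoices into per-month folders) scripted end to end. The paths and filename pattern are
hypothetical, and commercial RPA tools also automate GUI-level work that a script like this
does not touch.

```python
# Stand-in for the RPA idea: one repetitive job scripted end to end.
# Paths and the filename pattern are hypothetical examples.
from pathlib import Path
import shutil

inbox = Path("invoices/inbox")
for pdf in inbox.glob("invoice_*.pdf"):
    # Filenames assumed to look like invoice_2022-09_acme.pdf
    month = pdf.stem.split("_")[1]
    target = inbox.parent / month
    target.mkdir(exist_ok=True)
    shutil.move(str(pdf), str(target / pdf.name))
```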

The term "hyperautomation" is occasionally used in IT marketing to distinguish rules-based


machine learning vendor "solutions" from more sophisticated systems that employ artificial
intelligence and deep learning (Techopedia, 2022).

Why use automation?

Typically, automation is used to reduce labor or to replace humans in the most menial or
repetitive activities. Automation may be found in almost all industries and niches; however,
it is more prominent in manufacturing, utilities, transportation, and security.

Most industrial operations, for example, employ some type of automated procedure, such as
robotic assembly lines. The only human input necessary is to specify and manage the
processes, while the assembly of the different components is left to machines, which
automatically turn raw materials into completed items.
Examples of Automation

A software script in the information technology area can test a software product and provide
a report. There are additional software solutions on the market that can produce code for an
application. Users must merely configure the tool and specify the procedure.

Another emerging type of high-quality automation is advanced business intelligence in
apps. Automation has substantially boosted productivity in various industries over the
previous few decades, saving time and money.

Automation may take various shapes in our daily lives, from the most basic to the most
complicated. Household thermostats regulating boilers, the first automatic telephone
switchboards, electronic navigation systems, and the most complex algorithms behind self-
driving automobiles are all examples.

Home automation - is the use of a mix of hardware and software technologies to control
and manage appliances and gadgets in the home.

Network automation - is the process of automating a computer network's configuration,
management, and operations.

Workplace automation - entails the use of computers and software to digitize, store,
process, and transmit the majority of regular operations and procedures in a typical office.

Automated website testing - streamlines and standardizes website testing for
configuration changes that occur throughout the development phase.

Data center automation - enables software programs to conduct the majority of data center
activities. Included in this category are automated system operations, often known as lights-
out operations.

Test automation - means that software code is tested for quality assurance (QA)
automatically using scripts and other automation techniques.
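
A minimal sketch of test automation using Python's built-in unittest module follows; the
function under test is hypothetical, invented purely to show the pattern of scripted QA checks.

```python
# Minimal test-automation example: a script checks code quality
# automatically instead of a person re-testing by hand.
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical function under test."""
    return round(price * (1 - percent / 100), 2)

class DiscountTests(unittest.TestCase):
    def test_ten_percent(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_zero_percent(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

if __name__ == "__main__":
    unittest.main()  # QA can run this on every change, e.g. in CI
```
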
Job Loss Due to Automation

Automation ensures that methods are efficiently utilized in the supply of goods and
services. However, it also leads many people (particularly unskilled workers) to become
redundant and displaced.

Automation will very probably have a significant negative impact on employment and pay
for those jobs that do not require specialized training or skills. Many of these people,
however, may readily be retrained for other employment, and the influence of this
technology on our society is revolutionary enough to generate new possibilities for
everyone.

According to the World Bank's World Development Report 2019, the positive economic
consequences of new sectors and jobs greatly exceed the negative ones, although
technological unemployment remains a source of worry.

Regardless of advancements in automation, some manual involvement is always necessary,
even if the tool can execute the majority of the duties. Automation specialists who work on
the development, implementation, and management of such technologies are in great
demand.
Advantages of Automation

Improved working environment


It is feasible to improve working conditions and safety inside your manufacturing process or
facility by utilizing automation. Automation has the potential to decrease health and safety
concerns, remove physical handling, and lower the danger of repetitive strain injury.

Increased competitiveness, sales and profit


You may become more competitive in your market by utilizing automation. This is because
as your manufacturing process becomes more automated, human error decreases, product
quality improves, and cost per component decreases owing to greater production rates and a
reduction in the resources necessary to make the items.

No labor crisis
Finding labor for monotonous, repetitive activities is becoming increasingly difficult, and
this is expected to worsen in the UK after Brexit. Because unemployment in the UK is
presently at its lowest level since July 1975, many industries are struggling to find
manufacturing employees, particularly for heavy physical labor. Automation can reduce the
need for certain sorts of operations to be performed by employees.

Increase production capacity


Automation boosts your production capacity since equipment may be configured to run
unattended 24 hours a day, seven days a week. Because automated machines do not take
breaks, sick leave, or vacations, even if they are just functioning during conventional shift
hours, this alone may frequently result in a 140%+ boost in productivity. Automated
machinery can also often operate quicker and generate more precise, defect-free items.
Disadvantages of Automation

Capital expenditure
While automation can be quite productive and provide a great ROI, it may also have a rather
large initial cost. As a result, before making a selection, we recommend examining both the
required expenditure and the expected ROI. Before assessing whether or not there is a
business case for investment, it is critical to calculate the ROI by including enhanced
throughput value, lower labor costs, and a reduction in defects/recalls in addition to capital
expenditure. You may calculate your estimated payback and evaluate financial projections
with the aid of an automation project calculator.
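
A worked example of the payback calculation just described is sketched below; every figure
is invented for illustration, not taken from any vendor calculator.

```python
# Sketch of the payback calculation described above (all figures invented).
capital_expenditure = 250_000.0    # up-front cost of the automation cell
annual_labor_savings = 60_000.0    # lower labor costs
annual_throughput_gain = 45_000.0  # value of enhanced throughput
annual_defect_savings = 15_000.0   # fewer defects/recalls

annual_benefit = (annual_labor_savings + annual_throughput_gain
                  + annual_defect_savings)
payback_years = capital_expenditure / annual_benefit
print(f"Estimated payback: {payback_years:.1f} years")  # ~2.1 years
```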

Gets rid of jobs


It is true that with the advent of technology, certain jobs may become obsolete, but this does
not have to be a negative consequence of automation. Instead of doing mind-numbing,
tedious, or disagreeable jobs, employees might be trained to work in other areas of your
company. Many businesses have discovered that after implementing automation, their
revenues increased, resulting in the creation of new jobs in other areas of the firm.

Bespoke automation becomes redundant when production processes change


As with any sort of machinery, if you modify your manufacturing method or product so that
a certain machine is no longer required, the machine becomes obsolete. As a result, it is
critical to future-proof any automation you add in your manufacturing process. A qualified
automation business will design your automation system to be quickly adaptable to changes
in your product design or manufacturing process. For example, typical flexible automation,
such as robots, may be simply deployed elsewhere in a manufacturing process if the old
method becomes obsolete.
Robotics

What Is Robotics?

Robotics is the use of science, engineering, and technology to create machines called robots
that mimic or substitute for human actions. Robots have always intrigued pop culture, with
examples like R2-D2, the Terminator, and WALL-E. These exaggerated, humanoid robot
conceptions are frequently a parody of the real thing. But are they more foresighted than
we realize? Robots are acquiring cognitive and mechanical skills that do not rule out the
possibility of an R2-D2-like machine in the future.

The breadth of what is called robotics expands as technology advances. In 2005, 90 percent
of all robots were found in automobile assembly plants. These robots are mostly made up of
mechanical arms that are charged with welding or screwing on certain elements of an
automobile.

Today's definition of robotics encompasses the invention, construction, and deployment of
bots that do jobs such as exploring the planet's toughest climates, supporting law
enforcement, simplifying medical operations, and executing rescue missions (Bulitin, 2022).

WHAT EXACTLY IS A ROBOT?

A robot is a programmed machine that can execute a job, whereas robotics is the branch of
study that focuses on the development of robots and automation. Each robot has varying
degrees of autonomy. These levels vary from human-controlled bots to totally autonomous
bots that accomplish tasks without any outside interference.
Robotics Defined
While the field of robotics is developing, several features of a robot remain consistent:

1. Robots are made of some kind of mechanical structure. A robot's mechanical
component aids it in completing duties in the environment for which it was created.
For example, the Mars 2020 Rover's wheels are independently powered and
composed of titanium tubing, allowing it to securely grasp the red planet's tough
terrain.
2. Robots need electrical components that operate and power the gear. Essentially, an
electric current — such as a battery — is required to power the vast majority of
robots.
3. Robots all have some amount of computer programming. A robot would be little
more than a piece of simple hardware if it did not have a set of instructions
informing it what to do.

As artificial intelligence and software continue to advance, we're certain to see the promise
of the robotics sector realized sooner rather than later. In the near future, owing to
developments in these technologies, robots will continue to become smarter, more adaptable,
and more energy efficient. They'll also be a key focus in smart factories, where they'll tackle
increasingly challenging problems and help safeguard global supply networks.

The robotics business is full of laudable promises of advancement that science fiction
could only dream of a few years ago. Robots will be found executing activities that
humans could never imagine doing alone, from the darkest depths of our seas to hundreds
of kilometers in outer space.

Etymology of the Robot


Robot comes from the Czech word robota, meaning "forced labor." The term first
appeared in the 1920 play R.U.R., in reference to the characters of the play, who were
mass-produced workers incapable of creative thinking.
Types of Robotics

Mechanical bots exist in a variety of forms and sizes to do the tasks for which they are
created. All robots differ in terms of design, usefulness, and degree of autonomy. From the
0.2 millimeter-long "RoboBee" to the 200-meter-long "Vindskip," robots are developing to
perform jobs that humans just cannot.

There are five different sorts of robots that do different activities based on their capabilities.
The following is an overview of various kinds and what they do.

Pre-Programmed Robots

Pre-programmed robots do simple, monotonous activities in a controlled setting. A mechanical
arm in an assembly line is an example of a pre-programmed robot. The arm has one purpose
(weld a door, insert a part into the engine, etc.), and its goal is to execute it longer, quicker, and
more efficiently than a person.

Humanoid Robots

Humanoid robots are robots that replicate or look like humans. These robots typically
engage in human-like actions (such as running, leaping, and carrying goods) and are
occasionally made to resemble people, including human features and attitudes. Hanson
Robotics' Sophia and Boston Dynamics' Atlas are two renowned examples of humanoid
robots.
Autonomous Robots

Human operators are not required for autonomous robots to function. These robots are often
designed to do jobs in open spaces without the need for human supervision. They are
extremely distinctive in that they utilize sensors to assess their surroundings and then use
decision-making mechanisms (typically a computer) to pick the best next action depending
on their data and purpose. The Roomba vacuum cleaner, which utilizes sensors to move
freely throughout a home, is one example of an autonomous robot.
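
The sense-decide-act loop such robots follow can be sketched in a few lines of Python; the
sensor, probabilities, and actions below are made up purely to show the pattern, not taken
from any real robot's control software.

```python
# The sense -> decide -> act loop the paragraph describes, reduced to
# a toy Python sketch. Sensor values and actions are invented.
import random

def read_bump_sensor() -> bool:
    """Stand-in for a real sensor driver."""
    return random.random() < 0.2  # pretend we hit an obstacle 20% of the time

def decide(obstacle: bool) -> str:
    # Decision-making mechanism: pick the best next action from sensor data.
    return "turn_left" if obstacle else "move_forward"

for _ in range(5):  # a real robot loops until its job is done
    action = decide(read_bump_sensor())
    print("executing:", action)
```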

Teleoperated Robots

Teleoperated robots are semi-autonomous bots that use a wireless network to allow remote
human control. These robots are often used in harsh geographical, meteorological, and
environmental situations. Human-controlled submarines used to repair undersea pipe breaks
during the BP oil spill are examples of teleoperated robots, as are drones used to locate
landmines on a battlefield.

Augmenting Robots

Augmenting robots, often known as VR robots, can augment existing human skills or
replace those that have been lost. The topic of robotics for human enhancement is one in
which science fiction might become reality very soon, with bots capable of redefining
humanity by making people quicker and stronger. Current augmenting robots include
robotic prosthetic limbs and exoskeletons used to carry heavy weights.
Advantages of Robotics

Cost Effectiveness
There will be no lunchtimes, vacations, sick leave, or shift changes set aside for robotic
automation. It may be made to work in a repeated cycle, and as long as it is properly
maintained, it will do so unless programmed differently. This also prevents the possibility of
repetitive strain injury (RSI) arising.

Increased output at a cheaper cost has clear advantages for any firm. The initial cost may be
returned in a very short period of time, and the profits from there are, to put it mildly,
exponential.

Improved Quality Assurance


Few employees enjoy completing repeated activities, and concentration levels naturally fall
after a given amount of time. This break in attention is known as vigilance decrement, and it
may frequently result in costly blunders for the company and, in some cases, significant
damage to a member of staff.

Robotic automation removes these risks by creating and testing products to ensure they
meet the specified quality without fail. More goods being created to a higher level opens up
a slew of new economic opportunities for firms to capitalize on.

Work In Hazardous Environments


Aside from potential occupational injuries, employees in certain sectors may be required to
operate in unstable or unsafe conditions. For example, if a high concentration of chemicals
is present, robotic automation is the appropriate choice, since it will continue to perform
without coming to harm.

Because of the nature of the task, production locations that demand extremely high or low
temperatures generally have a high turnover of workers. Automated robots can reduce
material waste and eliminate the need for people to put themselves in danger.
Disadvantages of Robotics

Potential Job Losses


One of the most serious worries about the adoption of robotic automation is the impact on
employees' jobs. If a robot can execute at a quicker and more constant rate, humans may be
rendered obsolete. While these concerns are understandable, they are not entirely correct.

The same could be said of the early years of the industrial revolution, and as history has
shown, people have continued to play an important role. Amazon is an excellent illustration
of this: its employment rate increased dramatically over a period in which it went
from utilizing about 1,000 robots to over 45,000.

Initial Investment Costs


This is often the most significant barrier that will determine whether a firm will invest in
robotic automation now or later. When contemplating the installation of this technology, a
complete business case must be developed. The returns can be large, and they frequently
occur in a short period of time. However, the cash flow must be sustainable in the
meantime, and the company's stability is not worth the risk if the rewards are merely
modest. However, in most cases, a repayment schedule will be offered, making it much
easier to budget and manage expenses.

When determining whether or not there is a business justification for investment, increased
throughput and defect reduction must be evaluated alongside capital expenditure. Intangible
advantages must also be evaluated, and we have produced a free calculator to assist you
with this.
Immersive Media

The word "immersion" is gaining use in the technological world nowadays.

When anything in the digital world is defined as "immersive," it frequently refers to the
domain of extended reality. Immersive media, which comes in a variety of formats, allows
individuals to interact with material on a deeper level. Rather than simply watching a film,
immersive media may create a world in which you can stop and interact with the objects
inside it.

Immersive media examples range from augmented reality to virtual reality, holopresence,
and more. Experts estimate that this landscape will be valued at roughly $180 billion by 2022.

As extended reality provides more chances for learning, employment, and social
engagement, the need for immersive media grows.

Immersive media refers to immersive technologies that try to recreate or mimic the physical
environment through digital simulation. It is the meeting of technology and reality. Virtual
reality, augmented reality, mixed reality, and Imagine 4D's latest technology,
Multimmersion™, are all examples of immersive media. These and other future
technologies fall under the umbrella term "Extended Reality" (XR).

According to NASSCOM research, immersive media will be worth $180 billion by
2022, and reports suggest immersive media solutions will be major participants in a range
of sectors. Are you familiar with the five categories of immersive media? Continue reading
to discover more about these intriguing and now commonly utilized technologies (XR
Today, 2022).
5 types of immersive media

 Virtual Reality (VR): A digital environment substitutes the physical world of the
user. VR entirely immerses users in this digital environment, which is often
accomplished via the use of head-mounted displays (i.e., VR goggles).
 Augmented Reality (AR) is when digital material is superimposed on a real-world
setting. Rather than replacing reality, this technology enriches it. Snapchat's picture
filters are one example of AR.
 Mixed Reality (MR): As the name implies, MR is a fusion of the real and digital
worlds. This experience creates a setting in which technology and the physical world
coexist and interact. We may think of it as the marriage of VR and AR. A headset is
also required for MR.
 3D content: 3D films and photos allow you to immerse yourself in a one-of-a-kind
image or video. This provides for immersion in the media, but without the use of a
smartphone or comparable device, there is typically no ability to engage with the
information directly.
 Extended Reality (XR) is a catch-all word for virtual reality (VR), augmented
reality (AR), mixed reality (MR), and Imagine 4D's new immersive media.

How Will Immersive Media Change the World?

Experts can develop relationships between humans and technology that we could never have
imagined before by making experiences more immersive. In a VR environment, for
example, a corporation may construct a digital twin of a product, allowing engineering and
manufacturing teams to experiment with designs and ideas for how to improve that product.

A group of VR professionals might then collaborate in real time on that digital twin,
modifying it without the use of heavy machinery or material resources. As a consequence,
everyone engaged benefits from a much faster and more efficient creative process.
Access to immersive media promotes not just creativity but also productivity. Immersive
technology may provide consumers with virtual guided tours of a site before they visit it in
the travel landscape. Home purchasers in real estate might tour a potential home
before it is built. Healthcare providers can utilize smart glasses in the AR environment to
gain advice on a patient's circulatory system or to locate the root of an issue.

Immersive media also facilitates more efficient collaboration in an age of hybrid and distant
working. Beyond simply simulating face-to-face contact, such as video conferencing, virtual
environments may enable people to share a digital space.

The Future is Immersive

Immersive media is an approach that businesses and individuals may employ to deepen the
connection between people and the machines with which they work. We may employ
immersive media to gain a deeper understanding of critical subjects and communicate ideas
on a larger scale. Immersive media may improve team cooperation and increase customer
service prospects.

The desire for immersive media will continue to rise as people seek new ways to use the
digital environment. Who knows what our next degree of immersion will entail?

How can Immersive Media be used within different industries?

Immersive technologies are becoming more popular among regular people, but the
corporate sector is experiencing and will continue to experience an immersive media boom.

Here are some instances of how immersive technologies are now being used:

Automotive industry: For example, in the automotive business, VR technology allows
engineers and designers to quickly experiment with the appearance and construction of a car
before ordering pricey prototypes.
Healthcare: Healthcare workers can use immersive technologies for training and practice in
a low-risk 3D environment before working on real bodies.

Tourism: Immersive technologies enable consumers to take virtual guided tours all over the
world before deciding on and booking a vacation. Immersive media allows marketers to go
beyond simply offering photos and videos by letting visitors virtually experience the
location before making a commitment.

Real Estate: Thanks to immersive media and 3D material, house purchasers may
experience their future home before it is even created. This allows developers to upsell their
goods and allows clients to personalize their homes to their preferences.

Immersive media has several industry applications and use cases. Imagine 4D is
excited to be working in this field and assisting businesses in realizing their full
potential.
Cloud computing

What is cloud computing?

Cloud computing is the delivery of various services over the Internet. Data storage, servers,
databases, networking, and software are examples of these resources.

Instead of storing files on a proprietary hard drive or local storage device, cloud-based
storage allows them to be saved to a distant database. As long as an electronic gadget has
internet connectivity, it has access to data and the software applications needed to execute
it.
Cloud computing is becoming increasingly popular among individuals and organizations for
a variety of reasons, including cost savings, enhanced productivity, speed and efficiency,
performance, and security (Investopedia, 2022).

 Cloud computing refers to the supply of various services through the Internet, such
as data storage, servers, databases, networking, and software.
 Cloud storage has increased in popularity among people who want more storage
space and organizations looking for an effective off-site data backup option.
 Cloud storage allows you to save files to a distant database and retrieve them
whenever you want.
 Services can be both public and private—public services are delivered online for a
price, whilst private services are delivered to individual consumers over a network.
 Cloud security has grown in importance in the IT industry.

Cloud computing gets its name from the fact that the information being accessed is located
remotely in the cloud or a virtual area. Cloud service providers allow customers to store
files and apps on remote servers and then access the material through the Internet. This
implies that the user does not need to be at a certain location to obtain access to it, allowing
them to work remotely.
Cloud computing offloads all of the hard labor of crunching and processing data from the
device you carry around or sit and work at, moving that work to massive computer clusters
far away in cyberspace. The Internet becomes the cloud, and your data, work, and apps are
accessible from any device that can connect to the Internet, anywhere in the globe.

The cloud may be both public and private. For a price, public cloud providers offer their
services through the Internet. Private cloud services, on the other hand, only serve a limited
number of users. These services are a network system that provides hosted services. A
hybrid option is also available, which includes components of both public and private
services.

Types of Cloud Services

Regardless of the type of service, cloud computing services provide users with a range of
functions, including:

 Email
 Storage, backup, and data retrieval
 Creating and testing apps
 Analyzing data
 Audio and video streaming
 Delivering software on demand

Cloud computing is still a relatively new service, but it is being utilized by a wide range of
organizations, including large enterprises, small businesses, NGOs, government agencies,
and even individual consumers.
Deployment Models

There are several sorts of clouds, each with its own characteristics. Public clouds offer
services and storage on servers connected to the Internet. These are managed and controlled
by third-party firms, which handle and control all hardware, software, and general
infrastructure. Clients gain access to services through accounts that anybody may access.
Private clouds are usually reserved for a single business or organization. The cloud
computing service might be hosted by the company's data center. A private network is used
to deliver many private cloud computing services.
Hybrid clouds, as the name suggests, combine public and private services. This approach
gives the user greater freedom and aids in the optimization of the user's infrastructure and
security.

Types of Cloud Computing


Cloud computing, unlike a microchip or a cellphone, is not a single piece of technology. It
is, rather, a system made up of three services: software-as-a-service (SaaS), infrastructure-
as-a-service (IaaS), and platform-as-a-service (PaaS).

 SaaS (software-as-a-service) refers to the licensing of a software program to clients.
Licenses are often supplied on a pay-as-you-go or on-demand basis. Microsoft
Office 365 is an example of this sort of system.
 Infrastructure-as-a-service (IaaS) is a means of offering everything from operating
systems to servers and storage as an on-demand service via IP-based connection.
Clients can avoid purchasing software or servers by obtaining these resources
through an outsourced, on-demand service. IBM Cloud and Microsoft Azure are two
popular examples of IaaS systems.
 Platform-as-a-service (PaaS) is the most complicated of the three levels of cloud
computing. PaaS is similar to SaaS, with the main distinction being that instead of
delivering software online, it is a platform for creating software that is delivered
through the Internet. Platforms such as Salesforce.com and Heroku fit within this
approach.
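
As one hedged illustration of the IaaS model, object storage can replace owned servers with
a few API calls. The sketch below uses AWS S3 via the boto3 library and assumes that AWS
credentials are already configured and that the bucket name (hypothetical) exists.

```python
# Sketch: using IaaS object storage instead of buying your own servers.
# Assumes configured AWS credentials; the bucket name is hypothetical.
import boto3

s3 = boto3.client("s3")
s3.upload_file("report.docx", "example-company-bucket", "backups/report.docx")

# The file now lives on rented infrastructure, retrievable from any
# internet-connected device:
s3.download_file("example-company-bucket", "backups/report.docx",
                 "report_copy.docx")
```
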
Advantages of Cloud Computing

Cloud-based software provides several benefits to businesses of all sizes, including the
flexibility to access software from any device, whether through a native app or a browser.
As a consequence, users may seamlessly transfer their data and preferences from one device
to another.

Cloud computing entails much more than simply accessing information across different
devices. Users may check their email from any computer and save files using cloud
computing services such as Dropbox and Google Drive. Users may also utilize cloud
computing services to back up their music, data, and images, ensuring that they are
promptly accessible in the event of a hard drive failure.

It also has enormous cost-cutting possibilities for large enterprises. Before the cloud became
a realistic option, businesses had to acquire, build, and maintain expensive information
management systems and infrastructure. Companies may replace pricey server centers and
IT personnel with fast Internet connections, allowing employees to do jobs online by
interacting with the cloud.

Individuals can conserve storage space on their computers or laptops by using the cloud
framework. It also allows users to upgrade software more rapidly since software vendors
may distribute their goods on the web rather than more traditional, physical ways such as
discs or flash drives. Adobe clients, for example, may access Creative Cloud programs via
an Internet-based subscription. This enables users to simply get updated versions and
patches for their apps.
Disadvantages of the Cloud

There are hazards associated with cloud computing, despite its speed, efficiency, and
innovations.

Security has always been a major worry with the cloud, particularly when dealing with
sensitive medical records and financial data. While rules require cloud computing firms to
strengthen their security and compliance safeguards, the problem persists. Encryption
safeguards essential information, but if the encryption key is lost, the data is gone forever.
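
The point about lost keys can be demonstrated with the Python cryptography package's
Fernet construction; this is a sketch of the principle, not a recommendation for protecting
real medical or financial records.

```python
# Demonstration: without the exact key, the ciphertext is unrecoverable.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"sensitive medical record")

wrong_key = Fernet.generate_key()
try:
    Fernet(wrong_key).decrypt(ciphertext)
except InvalidToken:
    print("Lost/wrong key: the data is gone for good")

print(Fernet(key).decrypt(ciphertext))  # only the original key works
```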

Natural disasters, internal faults, and power outages may all strike servers managed by
cloud computing corporations. The geographical reach of cloud computing cuts both ways: a
blackout in California might immobilize customers in New York, and a company in Texas
could lose data if its Maine-based supplier crashes.

There is a learning curve for both employees and managers, as with any technology.
However, when a large number of people access and manipulate information through a
single gateway, unintended errors can spread across the system.

The World of Business


Businesses may use cloud computing in a variety of ways. Certain users retain all
applications and data in the cloud, while others employ a hybrid approach, keeping some
programs and data on private servers and some in the cloud.

When it comes to offering services, the following companies are major participants in the
corporate computing sphere:

 Google Cloud
 Amazon Web Services (AWS)
 Microsoft Azure
 IBM Cloud
 Alibaba Cloud
Nanotechnology

Definition 01- what is nanotechnology

In the domain of computer science, nanotechnology is a sort of engineering aimed at
creating electronic components and devices measured in nanometers, which are exceedingly
small in size and structure. Nanotechnology enables the construction of functional materials
and systems at the atomic or molecular scale. It includes ideas from physics, biology,
engineering, and a variety of other fields.

Nanotechnology is frequently abbreviated as nanotech (NNI, 2022).

Definition 02 – what is nanotechnology

Nanotechnology is a scientific subject that employs system or component development
approaches to create products with extremely fine grain sizes. To create nanomaterials or
products, nanotechnology employs a variety of methodologies, including bottom-up, top-
down, and functional system development. A product is created from the smallest form
factor to a bigger product using a bottom-up method. In a top-down method, a huge product
may be reverse engineered to create nanometer-scaled items. A functional approach
considers the entire system and may include both bottom-up and top-down techniques.

Nanotechnology is used in a wide range of industries and applications, including computers,
biology, electronics, and chemical engineering.
How Nanotechnology Is Changing the Way We Design and Build Computers

As computers continue to be optimized for smaller and smaller dimensions, nanotechnology
in computing is becoming increasingly important.

Nanotechnology is the application of "very small objects" in technology, which may be
generalized to encompass any number of technologies that use parts with thicknesses of
nanometers or less.

Nanotechnology promises to make computers quicker, more powerful, and smoother-running
while using less overall volume. It's a promising field in computing, but in order
to appreciate how important it is, we should first take a step back and define what
nanotechnology is in the first place.

How nanotechnology is changing computing

Moving on to nanotechnology in the computing industry, we can get a clearer sense of how
prevalent the technology is in the space.

Carbon nanotubes are one technology that is assisting in the development of smaller and
quicker transistor designs, particularly at IBM. IBM is developing carbon nanotube
transistors to try to prevent silicon-based transistors from becoming obsolete. Around 2020,
silicon transistors were expected to hit their physical limit for optimization.
Carbon nanotube-based transistors, on the other hand, would provide an ongoing
replacement.

The objective of creating smaller and smaller transistors for computers is to stay up with
Moore's Law. This rule, or forecast, asserts that the number of transistors that can be packed
into a given size circuit will double every two years.
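
Moore's Law as stated above is easy to express as a formula: the transistor count multiplies
by 2^(years/2). A short sketch follows, projecting from an invented baseline chip; the
starting figures are illustrative assumptions, not data from the source.

```python
# Moore's Law as stated above: transistor counts double every two years.
# The baseline chip below is a hypothetical example.
baseline_year, baseline_transistors = 2000, 42_000_000

def projected_transistors(year: int) -> float:
    return baseline_transistors * 2 ** ((year - baseline_year) / 2)

for year in (2010, 2020):
    print(year, f"{projected_transistors(year):.2e}")
# 2010 -> ~1.3e9, 2020 -> ~4.3e10
```
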
Advantages and disadvantages of Nanotechnology

Manufacturing Advantages of Nanomaterials


Nanotechnology is already creating novel materials that have the potential to change
numerous industries. Nanotubes and nanoparticles, for example, which are tubes and
particles just a few atoms wide, and aerogels, incredibly light yet strong
materials with extraordinary insulating qualities, might open the way for
new processes and improved goods. Furthermore, nanobots and nanofactories, which are
robots only a few nanometers long, might aid in the creation of innovative materials
and products.
Advantages of Energy and Electronics
Nanotechnology has the potential to change the way we collect and use energy.
Nanotechnology, in particular, is projected to make solar power more affordable by
lowering the cost of building solar panels and accompanying equipment. As a consequence,
energy storage devices will become more efficient. Nanotechnology will also enable novel
techniques of energy generation and storage.

Nanotechnology is likely to transform the realm of electronics. Quantum dots, for example,
are small light-producing cells that might be utilized for lighting or display screens.

Nanotechnology's Medical Advantages


Nanotechnology has the potential to revolutionize medicine. Nanobots might be used to
clear blockages in a patient's arteries. Surgeries may become considerably quicker and
more precise. Cell-by-cell healing of injuries is possible. It may even be feasible to cure
genetic disorders by repairing damaged genes. Nanotechnology might potentially be used to
improve medicine manufacture by customizing pharmaceuticals at the molecular level to
improve efficacy and decrease negative effects.
Holography

Holography is a technology that allows a light field to be captured and later reconstructed
when the original light field is no longer there because the original objects are gone.
Holography is analogous to sound recording, in which a sound field made by vibrating
matter, such as musical instruments or vocal cords, is stored such that it may be recreated
later without the presence of the original vibrating matter. However, it is much more
comparable to Ambisonic sound recording in that it can recreate any listening angle of a
sound field (TechTarget, 2022).

Laser

In laser holography, the hologram is recorded using a laser light source that is exceedingly
pure in color and orderly in composition. Various setups and holograms can be created,
but all involve the interaction of light coming from different directions to produce a
microscopic interference pattern that is photographed onto a plate, film, or other medium.

In one typical configuration, the laser beam is divided into two halves, one of which is
known as the object beam and the other as the reference beam. The object beam is enlarged
by passing it through a lens and is used to illuminate the subject. The recording medium is
placed where the light will strike it after being reflected or dispersed by the subject. The
medium's edges will eventually function as a window through which the subject may be
seen, so their placement is chosen with this in mind. The reference beam is widened and
directed at the medium, where it interacts with the light from the subject to produce the
desired interference pattern.

Holography, like traditional photography, requires an adequate exposure time to properly
affect the recording medium. Unlike traditional photography, the light source, optical
components, recording medium, and subject must all stay immobile relative to each other
during the exposure, to within roughly a quarter of the wavelength of the light, otherwise the
interference pattern will be smeared and the hologram ruined. That is only achievable with
living beings and some fragile materials if a very powerful and extremely brief pulse of
laser light is employed, a risky process that is rarely performed outside of scientific and
industrial laboratory settings. Exposures of several seconds to several minutes are usual,
using a considerably lower-powered continuously operating laser.

Apparatus

A hologram can be created by shining a portion of the light beam directly onto the recording
medium and the other portion onto the object such that some of the scattered light falls onto
the recording medium. A more adaptable method of recording a hologram requires the laser
beam to be sent through a sequence of devices that alter it in various ways. The first
component is a beam splitter, which separates the beam into two identical beams pointing in
different directions:

 One beam (the 'illumination' or 'object beam') is stretched out by lenses and focused
onto the scene by mirrors. Some of the light reflected from the scene falls onto the
recording medium.
 The second beam (dubbed the "reference beam") is likewise spread using lenses,
but it is directed such that it does not come into contact with the scene and instead
goes directly onto the recording medium.

The recording medium can be made of a variety of materials. One of the most frequent is a
film very similar to photographic film (silver halide photographic emulsion), but with
considerably smaller light-reactive grains (ideally with diameters less than 20 nm), allowing it
to achieve the much greater resolution required for holograms. A layer of this recording
medium (for example, silver halide) is adhered to a transparent substrate, which is typically
glass but can also be plastic.
Process

When the two laser beams reach the recording medium, their light waves intersect and
interfere with each other. This interference pattern is permanently recorded on the recording
medium. The pattern appears random because it shows how the scene's light interfered with
the original light source - but not the original light source itself. The interference pattern is
an encoded rendition of the scene that requires a specific key - the original light source - to
view its contents.

This missing key is then given by illuminating the produced film with a laser identical to the
one used to record the hologram. When this beam lights the hologram, it is diffracted by the
surface pattern of the hologram. This results in a light field that is identical to the one
created by the scene and scattered over the hologram.
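
In standard holography notation (added here for clarity; not from the source text), if O and
R are the complex amplitudes of the object and reference beams at the recording medium,
the recorded intensity is

I = |O + R|^2 = |O|^2 + |R|^2 + O R* + O* R

The cross terms O R* and O* R are what encode the phase of the object wave. Illuminating
the developed hologram with R alone produces a diffracted term proportional to |R|^2 O,
which is precisely the reconstructed object wave described above.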

Advantages:

 Reasonably priced
 Increased storage capacity
 Increased realism of objects (depth)
 Allows for the creation of multiple pictures on a single plate, as well as 3D images
 Capability to integrate with other technologies

Disadvantages:

 Produce static images
 Cannot create graphics with intricate movement
 Image production and viewing require complex, precise technology
 Inline holography has a low axial resolution
 The reference illumination beams are collinear
Characteristics of emerging technologies

Emerging technologies and emerging practices are not defined by newness

Although the words emerging and new are often treated as synonymous, emerging
technologies and practices may not be new. Emerging technologies and practices may be
recent developments (such as using 3D printers or publishing one's own data) or older ones
(such as using an open-source learning management system). Even though it may be true
that most emerging technologies are newer technologies, the mere fact that they are new
does not necessarily categorize them as emerging (Veletsianos, 2022).

Emerging technologies and emerging practices are evolving organisms that exist in a
state of "coming into being"

The word "evolving" refers to a dynamic state of change in which technologies and
practices are continuously refined and developed. To illustrate this, consider the chalkboard
and dry-erase board, the use of which is generally established within the educational
community and thus, while still in use, is no longer evolving. Contrast this to Twitter, the
currently popular social networking and micro-blogging platform. Although various
practices and activities on the Twitter platform can be said to be established, numerous
aspects of the technology, as well as practices associated with it, are emerging as platform
refinements change the way the technology is used and users engage in practices that may
depart from those originally anticipated.
Not-yetness: Emerging technologies and emerging practices are not yet fully
understood or researched
One distinguishing characteristic of emerging technologies and practices is that we are not
yet able to understand their implications for education, teaching, and learning or for
learners, instructors, and institutions. We also lack an understanding of the contextual,
negotiated, and symbiotic relationship between practices and technologies. For example,
what effect might the opportunity to socialize with classmates via social networking sites
have for online learners? How do automated grading practices reconfigure the role of
instructors? Could social networking sites or MOOCs break down digital divides between
haves and have-nots? Or are social networking sites simply another medium through
which societal inequalities are perpetuated? What are the pedagogical affordances of
social networking sites? How may learning analytics support online instructors? How may
we design supportive and engaging self-paced learning environments? Can location-aware
devices enhance communal learning experiences?

Emerging technologies and emerging practices have promising but as yet unfulfilled
potential
The final characteristic of an emerging technology or practice is its promise of significant
impact, which is as yet mostly unfulfilled. Individuals and organizations may recognize that
particular technologies and practices offer significant potential for enacting change (e.g.,
improving learner-learner interaction, reducing student cost, supporting classroom equity),
but such potential has not yet been realized. The fields most associated with the use of
technology in education, including online and distance learning, often exhibit techno-
utopian and techno-deterministic thinking. In particular, technology and certain practices
associated with it are often expected to revolutionize the way individuals learn
and teach. Yet scholars and practitioners alike are wise to maintain some skepticism about
promises of transformation that ignore the environmental factors that surround innovations.
Even though technology has had a significant impact on how education is delivered,
managed, negotiated, and practiced, this book, and past research, remind us that the
environment in which such impacts occur is influenced by a variety of factors, including
politics and economics.
Advantages of Emerging Technology

Emerging technologies have several advantages:

 It enables you to convey your thoughts so that others might benefit from them (blogs).

 Technology allows you to communicate with individuals who are thousands of miles
away (Skype).

 Emerging technology can provide you with information practically instantly (Twitter
or Dropbox).

 Technology, particularly games, enables you to study in a more enjoyable manner.

 MOOCs, an emerging technology, can deliver education to individuals who cannot
afford it or who want to learn more than what school provides.

Disadvantages of Emerging Technology

Here are some disadvantages

 It is quite impossible to totally delete anything from the internet, so once an idea is
online, it can typically be traced back to you.

 If a person does not secure their virtual image when using emerging technologies, it
might lead to identity theft.

 It removes the personal side of life, particularly if learning is done online.

 It is challenging to employ new technologies to teach, since pupils prefer to surf the
internet rather than pay attention.
5 industries and the technology challenges each industry faces

Real estate

How AI challenges real estate industry


The employment of artificial intelligence in real estate presents significant hurdles, but they
are not insurmountable. To begin, AI works best in any industry when it can learn on its
own. An AI specialist is required for designing solutions with automated learning
capabilities in order to allow automated AI activities. Fortunately, this problem can be
solved by a skilled team of professionals!

Second, AI technologies must be used in accordance with the law, with data security as a
primary concern. One of the greatest difficulties in AI and real estate is protecting data
from potential hackers. It is a great responsibility to handle the personal information of
thousands of real estate clients. However, several AI experts are developing novel
approaches to data protection against cybercrime.

Finally, data scarcity is a significant barrier when employing AI in the real estate market.
The quality of AI algorithms is heavily dependent on the data they receive and use to
provide adequate output. Having both high-quality and large amounts of data allows an AI
program to make smarter conclusions. The fundamental issue emerges when the available
data sets are insufficiently diversified - for example, information on residential and
commercial properties, building characteristics, sales histories, rentals, tenants, and lease
agreements. However, innovative data mining processes can help the real estate business
overcome the difficulty of data scarcity.
How Blockchain challenges real estate industry

While blockchain technology has the potential to solve many problems in the real estate
business, there are always hurdles associated with shifting to an emerging technology that has
not yet reached its full maturity.

It is crucial to highlight that blockchain technology is still in its infancy, and widespread
adoption in the real estate business will present its own set of obstacles.

Regulation
Navigating complicated regional rules throughout the world is a major challenge for the
adoption of any new technology, including blockchain-based systems. Some blockchain-
based real estate investment platforms, for example, do not enable US investors to join since
the relevant rules are stringent and add significant administrative overhead to the sale and
transfer of tokens. As a result, it is easier for these platforms to simply prevent Americans
from joining, even if this means losing a large market of potential investors.

Scaling
Every year, millions, if not billions, of global transactions take place in the real estate market.
This necessitates networks capable of handling enormous transaction volumes swiftly and
effectively.

However, Ethereum can presently only manage roughly 15 transactions per second, whereas
Bitcoin can only handle about 5. Visa, on the other hand, claims to be able to
process over 24,000 transactions per second. For large-scale real estate firms that demand
ultra-fast processing times, a transaction bottleneck would be a serious concern.
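
A back-of-envelope check of that bottleneck is sketched below, using the throughput figures
as cited above to estimate how long each network would take to clear one million
transactions (the transaction volume is an arbitrary round number for illustration).

```python
# Time to clear one million transactions at each network's quoted rate
# (throughput figures as cited in the text above).
volume = 1_000_000
for network, tps in {"Bitcoin": 5, "Ethereum": 15, "Visa": 24_000}.items():
    hours = volume / tps / 3600
    print(f"{network}: {hours:,.1f} hours")
# Bitcoin ~55.6 h, Ethereum ~18.5 h, Visa ~0.01 h (about 42 seconds)
```
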
Healthcare industry
How AI challenges healthcare industry

UNIFYING MIND AND MACHINE THROUGH BRAIN-COMPUTER INTERFACES

Using computers to communicate is by no means a novel concept, but developing direct
interfaces between technology and the human mind without the need for keyboards, mice,
and displays is a cutting-edge area of study with major potential for some patients.

Neurological illnesses and nervous system injuries can impair certain patients' ability to
communicate, move, and connect meaningfully with others and their surroundings. BCIs
powered by artificial intelligence may be able to restore such essential experiences to
individuals who worry they may be gone forever.

Change is hard.
Change is difficult to implement in healthcare. Many significant entrenched interests exist,
and there is sometimes no effective market to foster innovation and drive efficiency.
Johnson & Johnson received FDA clearance in 2013 for Sedasys, an automated anaesthesia
system used in routine procedures such as colonoscopies. It performed thousands of
successful procedures in Canada and the United States, but the profession fought back and
sales were slow, despite the fact that the machine cost $150 per procedure while a human
anaesthetist cost $2,000. J&J exited the business a few years later.

Eric Topol, an accomplished American cardiologist, scholar, and author, accuses his peers
in the medical profession of being condescending to their patients. He believes that too
many doctors are stingy with the information they gather about our health, on the premise
that a little knowledge is bad. In his 2015 book, "The Patient Will See You Now," he
advocates for a role reversal in which the "medical god" (the MD) is deposed and the patient
becomes CEO of their own healthcare. His 2019 book, "Deep Medicine," has a subtitle that
summarizes his optimistic outlook: "How Artificial Intelligence Can Make Healthcare
Human Again."
Retail

How Blockchain challenges the retail industry

Value changes
However, the usage of blockchain may pose various issues for the retail business. For
starters, the value of blockchain currencies fluctuates dramatically. Blockchain development
firms have yet to devise methods to protect Bitcoin from volatile market swings. Measured
against the dollar, this means that a retailer's bitcoin holdings could lose a great deal of
value. Large online retailers can mitigate this risk by confining the majority of their
transactions to fiat currencies. Smaller businesses that rely completely on blockchain
would suffer greatly if Bitcoin demand fell.

Fragility
Bitcoin is not controlled by a single identifiable owner, and the systems are not managed by
a corporate entity that can be held accountable for changes in value. As a result, blockchain
currencies are extremely vulnerable. An entire system can cease to exist in a matter of
seconds, rendering holdings worthless. This is especially true for younger cryptocurrencies
that have yet to achieve a significant market share. Fiat money, on the other hand, has
managed to remain fairly stable over a long period of time. This makes it difficult for shops
to trust blockchain technology sufficiently to incorporate it into their operations.

Digital attacks
Finally, blockchain technology remains vulnerable to cyber threats. Despite the various
security precautions available for individual transactions, a full virtual system cannot be
totally protected from attack. As shown in the recent past, such hacks can result in the loss of
millions of dollars. A digital attack on a merchant might result in the loss of hard-earned
money, which could have irreversible consequences for the firm. As a result, unless
additional security and regulation are put in place for virtual currencies, internet retailers may
find it far safer to stick to conventional money as their primary form of payment.
Transportation

How AI challenges the transportation industry

Despite its promise, there are significant obstacles that the transportation industry must
overcome before AI can be completely implemented. The following are four frequent issues
that the industry faces:

AI Requires Human Assistance


Despite its ability to perform tasks, AI is not a fully autonomous system. Human help and
control are required to guarantee efficient operation.

Cost of Adoption
Integrating AI technology is not inexpensive, especially if you lack in-house developers and
engineers. The multiple sensors, transmitters, and computer resources required for AI
implementation might also be costly. This makes AI too expensive for many enterprises,
making them less likely to employ AI technology until prices fall dramatically.

System Reliability
Many artificial intelligence systems are still far from ideal. The numerous mishaps with
autonomous cars are evidence of this. Organizations must invest time and effort developing
protections until these flaws are fixed or otherwise accounted for. For the time being, this
additional need deters many enterprises.

Job Flow
AI may considerably improve system efficiency, but it frequently comes at the price of
human employment. While some businesses may be unconcerned about this danger,
replacing large numbers of humans with AI is not sustainable: it leads to decreased
consumer engagement and greater employee discontent. For AI to be applied sustainably,
enterprises must develop methods to redistribute work rather than entirely replace it.
How IoT challenges the transportation industry

If the gadgets and communication methods work effectively, Internet of Things (IoT)
transportation applications can increase convenience, safety, and quality of life. If the system
malfunctions or even goes out briefly, it might cause property damage as well as significant
harm to passengers and onlookers. Manufacturers of IoT devices and software businesses
might face legal consequences.

Every company in the IoT ecosystem should be aware of the potential liabilities and take
steps to mitigate such risks. Here are four major risk areas that technology firms should be
aware of when they create IoT solutions for the transportation industry.

Property damage: Damage to property might occur if an IoT device has a flaw or fails to
perform properly. These hazards include loss of use or physical damage to personal
property, such as a laptop, or real property, such as an office space.

For example, an IoT sensor intended to collect data from a locomotive in order to inform
predictive maintenance may fail to read the locomotive's performance properly. This might
result in the vehicle not receiving timely maintenance, leading to possibly hundreds of
thousands of dollars of avoidable damage if the locomotive collides with other property. The
device manufacturer might be sued for delivering incorrect data.

Bodily injury: If a gadget fails to work as stated at any point, the device creator may be
held accountable for any injuries that occur. IoT and software firms may face bodily injury
risks, making them accountable in the case of a malfunction.

A self-driving truck, for example, manufactured as a joint venture between an autonomous
vehicle manufacturer and an IoT software development business may include a software
flaw. If the flaw causes an accident with oncoming traffic, the injured drivers and
passengers may be able to sue the software creator for the related medical bills.
Education industry

How Blockchain challenges the education industry

Concerns about security


When it comes to data storage, security is always a worry. While blockchain brings
security features to the education market, it also introduces certain risks. Although
blockchain is thought to be extremely safe and secure, this does not mean that it is
impenetrable. Blockchain systems have certain vulnerabilities and weaknesses that pose a
danger to the educational sector. Academic credentials and students' education records will
be stored on the blockchain, and this information will be useful to students throughout their
careers. However, in order to comply with federal data protection legislation, institutions
may be required to apply strict norms and regulations for data storage and theft prevention.
As a result, it is quite difficult for institutions to adopt such a significant technology across
the whole education sector at once.

Scalability
When a large volume of data is uploaded to a technology such as blockchain, scalability
might become a serious concern. Education is offered in every country, so the application
of blockchain would be rather widespread. The volume of data involved will grow with
time, necessitating the addition of new blocks. This will put long-term strain on blockchain
technology, resulting in a reduction in the pace of blockchain transactions. Furthermore,
each transaction will require peer-to-peer verification, which will lengthen the process.

Low adoption rate


Despite the fact that blockchain technology is highly intriguing for the education industry, the
main advantage accrues to graduates, who can store their credentials and school successes in
it. Companies and international organizations, on the other hand, may find it difficult to
implement the blockchain system. The most obvious reason for this is unfamiliarity with
how the technology and its processes work. Furthermore, the absence of technical
advancement in outmoded institutions may contribute to the difficulties of blockchain deployment.
4 emerging technologies and how they affect the software/IT industry

How Artificial intelligence affects the software/IT industry

AI plays a key role in the design, code generation and testing of software. Let us discuss each
area in detail:

Requirement Gathering
As a conceptual step of SDLC, requirement collecting necessitates the most human
interaction. Artificial intelligence provides a wide range of techniques/tools, such as Google
ML Kit and Infosys Nia, to automate some processes and, to some extent, reduce human
interaction. This phase places a strong focus on finding flaws early, before moving on to
design. Natural language processing, an AI method, allows machines to grasp the user's
needs expressed in natural language and automatically build high-level software models. Of
course, this strategy has significant drawbacks, such as the difficulty of validating the
generated models, but it remains one of today's active research topics.

Software Design
To provide a definite solution, project planning and design require specialist knowledge and
expertise. Choosing the best design for each step is a difficult challenge for designers.
Retraction and forward planning drive dynamic modifications to the design until the customer
achieves the desired solution. By using AI technologies to automate some difficult activities,
the most capable methodologies may be applied to develop projects. Designers, for example,
can utilize AIDA (Artificial Intelligence Design Assistant) to understand the client's
demands and preferences and then use that information to design a suitable project.
AIDA is a website creation tool that analyses numerous software design combinations and
delivers the best customized design based on the client's requirements.
Automatic Code Generation
Taking a business idea and turning it into code for a large project takes time and effort. To
address time and money problems, experts have proposed a technique that involves writing
code before beginning development. However, the strategy is ineffective when there are
doubts, such as what the target code is intended to perform, because gathering these details
takes as long as building code from scratch. AI-assisted intelligence programming will
alleviate some of the workload.

Imagine explaining the project concept in your natural language and having the system
understand it and convert it into executable code. Though it may appear to be science
fiction, artificial intelligence in software development might change the tale. Natural
language processing and AI techniques will make this possible.
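To make the idea concrete, here is a minimal sketch of natural-language-to-code generation. It assumes the open-source Hugging Face transformers library and the publicly released Salesforce/codegen-350M-mono checkpoint; neither is named in this report, and production tools work quite differently.

# A hedged sketch: generate Python code from a natural-language comment.
# Assumes `pip install transformers torch` and the CodeGen checkpoint below.
from transformers import pipeline

generator = pipeline("text-generation", model="Salesforce/codegen-350M-mono")

prompt = "# Python function that returns the average of a list of numbers\ndef"
result = generator(prompt, max_new_tokens=48, do_sample=False)

print(result[0]["generated_text"])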

Artificial Intelligence in Testing Services


Software testing is an important stage in software development, since it verifies the product's
quality. If particular software tests are redone every time the source code is modified, testing
can be time-consuming and costly. Here, AI in software testing saves the day once more.

There are several programs that use AI to create test cases and perform regression testing.
These AI solutions can help you automate testing and ensure error-free releases. AI and
machine learning-based testing platforms include Appvance, Functionize, and Testim.io.
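As an illustration of the underlying idea (not of how Appvance, Functionize, or Testim.io actually work internally), the following sketch ranks regression tests by predicted failure risk, using a scikit-learn classifier trained on invented historical data. Running the riskiest tests first shortens the feedback loop when the full suite is too slow to run on every change.

# Hypothetical features per test: [lines changed near the test's target,
# days since its last failure, runtime in seconds]; label 1 = test failed.
from sklearn.ensemble import RandomForestClassifier

X_history = [[120, 2, 30], [5, 60, 4], [80, 7, 25], [2, 90, 3], [60, 1, 12]]
y_history = [1, 0, 1, 0, 1]

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_history, y_history)

new_tests = {"checkout_flow": [90, 3, 20], "help_page": [1, 80, 2]}
for name, features in new_tests.items():
    risk = model.predict_proba([features])[0][1]  # probability of failure
    print(f"{name}: predicted failure risk {risk:.2f}")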

Deployment Control
Machine learning and AI technologies have influenced software deployment as well, for
example through increased efficiency in deployment control tasks. The deployment phase is
the step in the software development paradigm where developers often upgrade programs or
apps to newer versions.

If developers fail to perform a procedure correctly during an upgrade, there is a considerable
risk in running the program. AI can protect developers from such vulnerabilities during
upgrades and lessen the chance of deployment failure.
How Automation affects the software/IT industry

Machines have acquired an advantage in their conflict with humans. Previously, computers
defeated us in games such as chess, Jeopardy, and Go, but with the introduction of AI-driven
algorithms, occupations such as physicians, attorneys, teachers, and IT professionals, among
others, are now being challenged.

According to McKinsey, existing technology might automate about half of all labor tasks,
displacing up to 30% of global workers by 2030.

However, automation will provide new positions and possibilities that did not previously
exist. The greater question is whether those new jobs will be enough to replace the obsolete
ones.

Lower manpower requirements:

More and more jobs within the IT industry will be automated, and people currently doing such
jobs will either be redeployed elsewhere or simply let go.

Faster Service Delivery Time/Speed:


IT companies will be able to deliver services faster to their clients, resulting in better
customer satisfaction and positive financial impact.

Greater quality of Delivery to customers:


Customers will enjoy enhanced quality in the deliverables they receive. This means
companies can save on the cost of rework and corrective actions.

Reduced Cost of providing services:


Companies will be able to provide more services at the same cost, or the same services at a
greatly reduced cost. This will have a positive impact on companies' revenue, their profits,
or both.
How robotics affects the software/IT industry

Machines replacing people in the workplace has been a constant source of anxiety since the
Industrial Revolution, and it has become a more prominent topic of discussion in recent
decades with the advent of automation. However, thus far, excitement has overshadowed
evidence regarding how automation, particularly robots that do not require people to operate,
affects employment and pay.

The recently released research "Robots and Jobs: Evidence from US Labor Markets," co-
authored by MIT professor Daron Acemoglu and Boston University professor Pascual
Restrepo, PhD '16, reveals that industrial robots have a detrimental impact on labor.

The researchers discovered that for every robot introduced per 1,000 workers in the United
States, wages fall by 0.42% and the employment-to-population ratio falls by 0.2 percentage
points – a loss of around 400,000 jobs to date. The impact is greater in regions where robots
are deployed: introducing one additional robot in a commuting zone (geographic areas
utilized for economic studies) decreases employment in that area by six people.

To carry out their investigation, the economists devised a model in which robots and humans
compete for the production of certain activities.

Industries are adopting robots to varying degrees, and the effects vary across the country and
among different groups — the automotive industry has adopted robots to a greater extent than
other sectors, and workers with lower and middle incomes who perform manual labor and
live in the Rust Belt and Texas are among those most likely to have their jobs affected by
robots.

"Given all of the fear and enthusiasm about robotics, it's certainly a really serious problem,"
Acemoglu said. "Our findings indicates that robots boost productivity." They are critical for
continuous growth and for companies, but they also kill employment and lower labor
demand.
How cloud computing affects the software/IT industry

The competitive dynamics of the hardware, software, and consultancy businesses are being
influenced by cloud computing. Firms looking to boost processing capacity formerly spent
substantially on costly, high-margin server gear, generating a massive market for computer
makers. However, hardware companies are increasingly concerned that the cloud will
undermine these sectors. The move from hardware to services is visible in IBM's quarterly
results. The company recently revealed that its overall profitability was up 12%, despite a
20% drop in hardware sales (J. Fortt, "Goodbye, PC (and Mac). Hello, Services," Fortune,
4 February 2009). What accounted for the difference? The expansion of Big Blue's
services division.

For many in the computer business, the transition to cloud computing has also changed the
margin structure. While Moore's Law has reduced the cost of servers, implementing SaaS and
running a commercial cloud remain prohibitively expensive—far more so than just
manufacturing more copies of conventional, packaged software. Microsoft stunned Wall
Street by announcing that it would need to invest at least $2 billion more in server farm
capital expenditure that year than analysts had predicted. The company's stock, which is
among the most widely owned in the world, fell 11% in a single day (S. Mehta, "Behold the
Server Farm," Fortune, July 28, 2006). As a result, many portfolio managers began to pay
more attention to the cloud's business ramifications.

Cloud computing can stimulate innovation, altering the desirable skill mix and career
prospects for information technology employees. Customers who use cloud computing may
have more money to reinvest in strategic endeavors and innovation if they spend less money
on expensive infrastructure improvements. IT occupations may also evolve. Nonstrategic
skills such as hardware operations and maintenance are projected to decline in demand.
How emerging technologies are going to affect the IT industry in the future

Artificial intelligence

As artificial intelligence continues to have an influence on organizations throughout the
world, it's safe to conclude that one of the most affected areas will be IT. But how will AI
impact the roles of developers, analysts, and engineers in the industry? Could it reshape the
IT landscape as we know it?

There’ll be more time to concentrate on high level, value-add tasks


Let's face the elephant in the room: job loss is the first topic of discourse anytime artificial
intelligence is introduced. However, most IT workers believe that their jobs will not, or
cannot, be completely replaced by machines. Yes, artificial intelligence will automate many
activities, but these will usually be aspects of people's duties rather than whole occupations.
In fact, the need for specialist, specialized talents will rise.

In principle, AI should help IT workers improve their skills. By automating tedious
processes, it will give developers and engineers more time to concentrate on high-level,
value-add, project-heavy work, and allow more resources to be spent on business-critical
initiatives.

An increase in specialist skills


According to World Economic Forum research, the most important new vocations by 2022
will be Data Analysts, Data Scientists, and AI and Machine Learning Specialists. As AI
facilitates digital transformation, the job of software developers and IT workers becomes
more advanced, and the need for these individuals will only grow.
As we go into a more AI-dominated IT world, the demand for people with these talents will
only grow, with the following skills being especially appealing to potential employers:
 Hadoop ecosystem
 HDFS
 MapReduce
 YARN
 Spark
 Hive
 HBase
 Flume
 Sqoop
 SQL
 NoSQL
 AWS
 Azure
 Google Cloud
 Java
 Scala
 Python

Professionals becoming more specialist


"As artificial intelligence continues to disrupt the sector, experts will need to become more
specialized in their employment," says Ross. Jobs that can be mechanized will be
automated, and information technology will become more streamlined. People with
specialized knowledge and abilities will become increasingly desirable as the sector grows."
AI Ops will very certainly boost productivity in software development. As technologies
such as customized software, speech and facial recognition, deep learning, and big data
progress, so will the demand for specialists in these fields.

Jobs like Big Data and Machine Learning are already in great demand, and this trend will
continue.
Automation

Most jobs that people did in 1600 AD have been taken over by automation, at least in part.
We don't plant seeds by hand; we have a planter. We don't thresh grain by hand; we have a
combine harvester. We don't make boards by hand (splitting and adzing them); we have
sawmills. Nor do we make nails by hand; we have nail-making machines. All of those
things have been automated, or at least machine assisted.

Yet people still work in the farm industry, and still work in the construction industry (far
fewer people work in farming due to automation, but people still do).

Many jobs that people did in 1900 (that didn't even exist in 1600) have been automated too.
We don't connect phone calls by hand; the central switch does it for us. We don't collect
fares when passengers board the horse tram; the passenger waves a smartcard at the
streetcar reader instead.

And so on.

Many jobs will be replaced. The proportion of workers in different industries will change.

But people will still have jobs; they will just shift in nature. There will always be a need for
artists and craftsmen, for prototypes, for troubleshooters, for customer-facing workers, for
knowledge workers, and so on.
Activity 02

The chosen technology for Dex Consulting is Artificial Intelligence (AI).

What Is Artificial Intelligence (AI)?


Artificial intelligence (AI) is the replication of human intellect in machines that are
programmed to think and act like humans. The phrase may also refer to any machine that
demonstrates human-like characteristics such as learning and problem-solving.

The capacity of artificial intelligence to rationalize and execute actions that have the highest
likelihood of reaching a certain objective is its ideal feature. Machine learning (ML) is a
subset of artificial intelligence that refers to the notion that computer systems can
automatically learn from and adapt to new data without the assistance of humans. Deep
learning approaches facilitate autonomous learning by absorbing massive volumes of
unstructured data such as text, photos, or video (BuiltIn, 2022).

 The replication or approximation of human intellect in machines is referred to as artificial intelligence (AI).
 Artificial intelligence's aims include computer-enhanced learning, thinking, and perception.
 AI is being applied in a variety of areas today, from banking to healthcare.
 Weak AI is simplistic and focused on a single task, whereas strong AI performs more complex and human-like activities.
 Some skeptics are concerned that widespread usage of sophisticated AI would have a detrimental impact on society.

Understanding Artificial Intelligence (AI)


When most people hear the word artificial intelligence, they immediately think of robots.
This is due to the fact that big-budget films and novels weave storylines about human-like
robots wreaking havoc on Earth. However, this could not be further from the truth.
Artificial intelligence is founded on the idea that human intellect may be characterized in
such a manner that a computer can simply duplicate it and complete tasks ranging from the
most basic to the most complicated. Artificial intelligence's aims include imitating human
cognitive processes. To the degree that such processes can be concretely described,
researchers and developers in the field are making very quick progress in replicating tasks
such as learning, reasoning, and perception. Some predict that inventors may soon be able
to create systems that exceed humans' ability to learn or reason about any topic. Others,
though, remain dubious, since all cognitive activity is laden with value judgements shaped
by human experience.

As technology progresses, earlier criteria for artificial intelligence become obsolete.
Machines that perform basic calculations or recognize text using optical character
recognition, for example, are no longer regarded as possessing artificial intelligence, since
these functions are now assumed to be inherent in computers.

AI is constantly evolving to benefit a wide range of sectors. Machines are wired utilizing a
multidisciplinary method that incorporates mathematics, computer science, linguistics,
psychology, and other disciplines.

Artificial Intelligence Applications


Artificial intelligence has a plethora of uses. The technology may be used in a variety of
sectors and businesses. AI is being studied and employed in the healthcare business for
medicine administration and treatment of individual patients, as well as assisting in surgical
operations in the operating room.

Computers that play chess and self-driving automobiles are two further instances of devices
having artificial intelligence. Each of these machines must consider the implications of
whatever action they perform, as each action has an influence on the eventual result. In
chess, the goal is to win the game. The computer system in self-driving cars must account
for all external data and calculate it in order to behave in a way that prevents a collision.
Artificial intelligence is also utilized in the financial industry to detect and flag suspicious
activities in banking and finance, such as unusual debit card usage and large account
deposits, all of which aid a bank's fraud department. AI applications are also being utilized
to help expedite and simplify trading. This is accomplished by making it easier to predict
the supply, demand, and pricing of securities.

Special Considerations
Artificial intelligence has been scrutinized by both scientists and the general public since its
inception. One recurring motif is the notion that robots will grow so advanced that humans
would be unable to keep up with them, and they will take off on their own, recreating
themselves at an exponential rate.

Another concern is that machines can invade people's privacy and possibly be armed. Other
debates center on the ethics of artificial intelligence and whether intelligent systems like
robots should be accorded the same rights as people.

Self-driving vehicles have been a source of contention, since these machines are designed to
minimize risk and casualties. When confronted with the choice of colliding with one or
more people at the same time, these cars would determine which option would do the least
amount of harm.

Many individuals are concerned about how artificial intelligence may affect human jobs.
Many businesses are aiming to deploy clever technology to automate specific occupations,
raising concerns that individuals would be driven out of the workforce. Self-driving cars
might eliminate the need for taxis and car-sharing services, and manufacturers could simply
replace human labor with robots, rendering people's talents obsolete.

The first artificial intelligence is claimed to have been a checkers-playing computer
constructed in 1951 by computer scientists at Oxford University (UK).
How artificial intelligence works

AI systems learn from patterns and characteristics in the data that they study by combining
vast volumes of data with sophisticated, iterative processing methods. Every time an AI
system processes data, it tests and assesses its own performance and gains new knowledge.
Because AI never requires a break, it can swiftly complete hundreds, thousands, or even
millions of jobs, learning a tremendous amount in a short period of time and becoming
incredibly proficient at whatever it is being trained to do.
However, knowing how AI genuinely works requires understanding that AI is more than
just a single computer program or application; it is a whole subject, or science. The
objective of AI science is to create computer systems capable of imitating human behavior
and solving complicated issues using human-like cognitive processes. To achieve this goal,
AI systems employ a wide range of methodologies and procedures, as well as a wide range
of diverse technologies. By looking at these approaches and technologies, we can begin to
truly comprehend what AI does and, as a result, how it works, so let's go over them next.
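The learn-evaluate loop described above can be made concrete with a toy example. The sketch below assumes scikit-learn and its bundled handwritten-digits dataset; it illustrates the general pattern of learning from data and assessing performance, not any specific AI product.

from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Study the data": fit the model on examples with known answers.
model = LogisticRegression(max_iter=2000).fit(X_train, y_train)

# "Assess its own performance": score the model on data it has never seen.
print(f"Accuracy on unseen digits: {model.score(X_test, y_test):.2%}")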

Beyond such imprecise pronouncements as "machines that are intelligent," AI is a
multifaceted subject with no obvious unified definition. To comprehend how AI works, it is
necessary to first comprehend how the term "artificial intelligence" is defined.

The definitions have been divided into four categories:

 Thinking humanly
 Thinking rationally
 Acting humanly
 Acting rationally
The first two of these domains are concerned with mental processes and reasoning, such as
the capacity to learn and solve problems in a way akin to the human mind. The remaining
two of these sections are concerned with behaviors and activities. These abstract concepts
contribute to the development of a road map for incorporating machine learning programs
and other areas of artificial intelligence work into machines.

Some AI technologies are fueled by continuous machine learning, whilst others are powered
by more commonplace sets of rules. Different forms of AI function in different ways, which
means it is vital to understand the many types of AI in order to grasp how they differ from
one another.

Features of Artificial Intelligence

Eliminate Dull and Boring Tasks


At some time in our lives, we have all completed a chore just because we had to, rather than
because we wanted to. That work was uninteresting or tedious to us. However, with a
computer, you will never get bored in the same way.

An artificially intelligent system will perform and continue to perform the task that is
assigned to it, no matter how many times it must do so. Furthermore, such solutions only
make tiresome, large-scale work simpler for consumers.

Take, for example, Dialogflow, a Google subsidiary that claims credit for developing
the Google Assistant. We give this assistant so many orders in a single day! The assistant
can do everything from "Ok Google, call mom" to "Ok Google, order sandwiches."

At the same time, we can use this assistant to send out a large number of calendar invites
to people. All we have to do is select a time for an event and enter the list of guests. The
rest of the job is done by the assistant.
Data Ingestion
One of the most crucial aspects of artificial intelligence is data ingestion. Artificial
intelligence systems must cope with massive volumes of data. Even a small firm of
approximately 50 workers produces large amounts of data to examine; we cannot even
conceive of the amount of data handled by corporations like Facebook.

Furthermore, an artificially intelligent system maintains knowledge about many entities from
various sources, and all of it arrives in the system synchronously or simultaneously.

The amount of data that we are all creating is increasing at an exponential rate, which is
where AI comes in. Such data is dynamically updated, making it challenging for traditional
database systems to assimilate it all. As a result, AI-enabled systems have gone above and
beyond, gathering and analyzing data that might be valuable to everyone.

One such example of artificial intelligence would be Elucify, which is basically a database of
multiple business contacts. Elucify operates on a simple premise: donate to receive.

The user must first establish an account and sign in before the system may access and share
the user's contact information. In exchange, the user receives relevant business connections,
some of whom may be future clients. Elucify, in other words, is crowdsourcing this
information. This explains a lot about the coaching institutes that contact your friend first,
then you, and then other friends from the same batch.
Imitates Human Cognition
It is referred to as an artificially intelligent system because it imitates or replicates the way
the human mind works and solves issues. This is what distinguishes AI.

An AI, like humans, strives to perceive the world and act appropriately by studying it,
drawing conclusions, and then interacting with it.

However, as of today, it is not totally achievable, but developers and scientists are working
on systems that cater to the theory of mind and self-awareness of artificially intelligent
systems.

This raises the prospect of an AI system being able to perfectly imitate the human mind and
behave precisely like a human.

To be honest, I'm looking forward to the day when my own AI system will engage with
people in my place, and an asocial me will binge watch some web series at home.

For example, Baidu, a Chinese digital behemoth, is a search engine that only functions in
Chinese, particularly for Chinese residents.

Baidu also has a voice cloning technique that can now clone a human voice from 3-4 seconds
of audio, down from the 30 minutes previously required. Well, it's both horrifying and fantastic!

Prevent Natural Disasters


We are all in favor of employing AI for our companies, gaming accounts, and other similar
things. It is now our responsibility to advance AI and perfect it so that governments may
employ it in disaster management.

When given data from thousands of prior disasters, artificially intelligent systems may
reliably anticipate the future of disasters that may occur.
Today, using artificial intelligence features such as these, scientists are studying over a lakh
(100,000) of previously recorded earthquakes and related calamities, such as tremors and
volcanic eruptions, in order to develop a neural network.

The mechanism of this network was evaluated on around 30,000 occurrences, and the
system's predictions were shown to be more exact than traditional methodologies.

Similarly, AI systems are researching seismic data, which is information on changes in the
tectonic plates beneath the earth's surface.

Scientists are also looking at ash particles that emerge from lava during a volcanic eruption,
as well as other geological data, to anticipate abrupt eruptions.

Facial Recognition and Chatbots


A facial recognition system allows a machine to unlock or
provide authorized access to a person by validating or recognizing the individual's face. A
camera included within the system detects the face and compares it to a previously recorded
face in its memory. Nowadays, most smartphones use face recognition to unlock the phone.
My pals are known for playing practical jokes on me. I inadvertently exposed my phone's
password to them one day. I changed the security option from password to face recognition
because I was afraid they might misuse it. Only I can now unlock my phone, which has
spared me from any humiliation. Chatbots are software programs that chat with the user to
address whatever problems they are having, through either auditory or textual techniques.

These programs replicate human behavior while conversing with a human via an application.
Many businesses, like Swiggy and Nykaa, have begun to use chatbots for customer
assistance.

Swiggy chatbots provide services centered on problems that clients have when their meal is
stalled in traffic or the products they requested are unavailable. The Nykaa chatbot exists to
give consumers product recommendations.
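For a sense of the building blocks behind features like face unlock, here is a minimal face-detection sketch. It assumes OpenCV and its bundled Haar cascade, and the image filename is hypothetical; real phone unlock systems layer far more sophisticated recognition models on top of detection like this.

import cv2

# Load OpenCV's bundled frontal-face Haar cascade (detection, not recognition).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("portrait.jpg")  # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detected {len(faces)} face(s)")
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("portrait_annotated.jpg", image)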
How Artificial Intelligence Will Change the Future
As technology advances, artificial intelligence will have an influence on our lives in a variety
of ways. Many of these changes are already taking shape, and the future will witness
advancements that we haven't even considered yet. However, these are some of the
impending ways in which AI will alter our environment.

Driverless Cars and Robots


AI and robotics advancements have led to growth in fields such as autonomous automobiles
and delivery drones. Autonomous transportation technologies have the potential to
revolutionize how we carry goods and people throughout the world.

Fake News
This negative feature of artificial intelligence is already having an influence on society. It
may become increasingly difficult to believe what we see or hear in the media, whether
through voice or picture reproduction.

Language and Speech Recognition


Machine learning algorithms can now recognize what individuals say with over 95%
accuracy. This paves the way for automatic transcription of spoken language into written
text, as well as options for language translation.
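A small sketch shows how accessible this capability has become. It assumes the open-source SpeechRecognition package and a hypothetical WAV recording; the report itself does not name a specific toolkit.

import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.AudioFile("meeting_clip.wav") as source:  # hypothetical recording
    audio = recognizer.record(source)

try:
    # Sends the audio to Google's free web API for transcription.
    print(recognizer.recognize_google(audio))
except sr.UnknownValueError:
    print("Speech was unintelligible")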

Facial Recognition and Surveillance


This is another gray area for AI, as many people are opposed to utilizing face recognition for
surveillance reasons. In China, the use of face recognition in conjunction with CCTV is
already being encouraged in order to track criminals and follow persons who are acting
suspiciously. Despite privacy rules, there is a significant probability that artificial
intelligence, especially technologies capable of properly recognizing emotion, will be used to
monitor individuals more broadly in the future.
Types of Artificial Intelligence
Artificial intelligence is classified into two types: weak and strong. Weak artificial
intelligence embodies a system designed to carry out one particular task. Weak AI systems
include video games, such as the chess example mentioned above, as well as personal
assistants like Amazon's Alexa and Apple's Siri. You ask the assistant a question, and it
answers it for you.

What Are the 4 Types of AI?


Artificial intelligence is classified into four kinds.

Reactive AI: Algorithms are used in reactive AI to optimize outputs based on a set of
inputs. Chess AIs, for example, are reactive systems that optimize the strategy to win the
game. Reactive AI is typically rather static, incapable of learning or adapting to novel
situations. As a result, given identical inputs, it will produce the same output.
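A classic concrete example of reactive AI is a game-tree search. The sketch below implements minimax for tic-tac-toe rather than chess, to keep it short: it deterministically maps a board position (input) to the optimal move (output), with no memory or learning between calls, which is exactly the reactive behavior described above.

def winner(board):
    lines = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]
    for a, b, c in lines:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    # Score finished games from X's perspective: +1 X wins, -1 O wins, 0 draw.
    w = winner(board)
    if w is not None:
        return (1 if w == "X" else -1), None
    moves = [i for i, cell in enumerate(board) if cell == " "]
    if not moves:
        return 0, None
    best = None
    for m in moves:
        board[m] = player
        score, _ = minimax(board, "O" if player == "X" else "X")
        board[m] = " "
        # X maximizes the score, O minimizes it.
        if best is None or (player == "X") == (score > best[0]):
            best = (score, m)
    return best

board = ["X", " ", "O",
         " ", " ", "O",
         " ", "X", " "]  # an illustrative mid-game position, X to move
print("Best move for X (cell index 0-8):", minimax(board, "X")[1])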

Limited memory AI: Limited memory AI can learn from past experiences and update itself
in response to new observations or data. The amount of updating is frequently limited
(hence the name), and the memory is quite short. Self-driving cars, for example, can "read
the road" and adjust to new conditions, even "learning" from previous experience.

Theory-of-mind AI: Theory-of-mind AI is entirely adaptable and has a vast ability to learn
and recall prior experiences. Advanced chatbots that can pass the Turing Test and deceive a
person into thinking the AI is a human being are examples of this sort of AI. These AIs,
while smart and amazing, are not self-aware.

Self-aware AI: As the name implies, self-aware AI becomes sentient and aware of its own
existence. Some scientists believe that an AI will never become conscious or "alive"; for
now, this type remains in the realm of science fiction.
Benefits of choosing Artificial intelligence (AI)
Artificial intelligence (AI) is expanding the capabilities of machine-enabled functions. This
cutting-edge technology enables robots to behave autonomously, resulting in the efficient
execution of iterative tasks.
AI enables the development of a next-generation workplace that thrives on the seamless
cooperation of organizational systems and employees. As a result, rather than making human
resources redundant, developing technology augments their efforts. In reality, AI gives
businesses the luxury of freeing up staff for higher-level activities.

The following are the primary benefits of choosing AI:

 AI reduces the time it takes to complete a task. It allows for multitasking and reduces
the demand on current resources.
 AI allows for the performance of previously complicated activities at low cost.
 AI functions continuously and without interruption, with no downtime.
 AI improves the capacities of people with disabilities.
 AI has broad commercial potential and can be used across sectors.
 AI makes decision-making easier by making it quicker and smarter.

Advantages and Disadvantages of Artificial Intelligence

Advantages of AI

Human Error Reduction:


Because people make mistakes from time to time, the term "human error" was coined.
Computers, however, do not make these mistakes if they are programmed correctly.
Artificial intelligence makes choices based on previously gathered data, applying a certain
set of algorithms. As a result, errors are reduced, and the prospect of achieving greater
accuracy and precision improves.
For example, in weather forecasting, AI has removed the bulk of human error.

Instead of humans, it takes risks:


This is one of the most significant benefits of artificial intelligence. We can overcome many
of humanity's risky limitations by constructing AI robots that can do dangerous activities
for us. They can be utilized effectively in any type of natural or man-made disaster, whether
it is travelling to Mars, defusing a bomb, exploring the deepest regions of the oceans, or
mining for coal and oil.

Example: Have you heard about the Chernobyl nuclear power plant explosion in Ukraine?
There were no AI-powered robots at the time to help us reduce the effects of radiation by
controlling the fire in its early phases, since any person who got too near the core died in
a matter of minutes. The authorities ultimately sprayed sand and boron from a safe distance
using helicopters.

Available around the clock:


A typical human will work for 4-6 hours every day, excluding breaks. Humans are built
in such a way that they need some time off to refresh themselves and prepare for a
new day of work, and they even have weekly days off to balance their work life and
personal life. But, unlike humans, we can use AI to make machines operate 24 hours a day,
seven days a week, with no pauses.

Example: Educational institutions and helplines receive a lot of questions and problems that
can be resolved with AI.

Helping in Repetitive Jobs:

We will be executing several repetitive tasks on a daily basis, such as writing thank-you
emails and checking particular papers for flaws, among other things. We can use artificial
intelligence to efficiently automate these monotonous processes and even remove "boring"
work for people, freeing them up to be more creative.
Digital Assistance:
Some of the most modern firms connect with users via digital assistants, which reduces the
need for human personnel. Many websites utilize digital assistants to supply services that
consumers demand. We may discuss our requirements with them. Some chatbots are created
in such a manner that it is difficult to tell if we are interacting with a chatbot or a real being.

Example: We all know that firms have customer service staff responsible for
resolving customer concerns and questions. Organizations can use AI to create a voice bot
or chatbot that assists consumers with all of their questions. Many firms have already
begun to use them on their websites and mobile applications.

Disadvantages of AI

High Costs of Creation:


Because AI is always evolving, hardware and software must be upgraded to meet the most
recent requirements. Machines require repair and maintenance, which incurs significant
expenditure. Because they are extremely complicated machines, their construction entails
exorbitant costs.

Making People Lazy:


With its applications automating the majority of the job, AI is making people lazy. Humans
are prone to get hooked to these advancements, which may pose a concern for future
generations.

Unemployment:
As AI replaces the bulk of repetitive labor and other duties with robots, human intervention
is decreasing, which will present a significant challenge for employment standards. Every
firm is aiming to replace its least skilled employees with AI robots that can do comparable
work more efficiently.
No Emotions:
There is no doubt that robots are far superior when it comes to efficiency, but they cannot
replace the human connection that binds the team together. Machines cannot form bonds
with people, which is a necessary characteristic in team management.

Lack of innovative thinking:


Machines can only accomplish the tasks for which they were intended or programmed;
anything else causes them to fail or provide irrelevant results, which can be disastrous.

SUMMARY:

These are some of the benefits and drawbacks of artificial intelligence. Every new
discovery or breakthrough will contain both, but we as humans must be mindful of
this and use the good aspects of the idea to make the world a better place. Artificial
intelligence has enormous potential benefits. The challenge for humans will be to keep
the "rising of the robots" under control. Some argue that if artificial intelligence falls
into the wrong hands, it has the potential to destroy human civilization. However, no
AI program created at that scale is yet capable of destroying or enslaving mankind.
How Artificial intelligence relates to the transportation industry
After a series of achievements, humanity eventually created a vehicle that ran on four
wheels and was powered by petroleum. Now, cars are increasingly driving themselves
while their owners relax inside. AI has had a significant influence on transportation
technology, and various possibilities for labor automation have emerged as a result of the
advancement of artificial intelligence in transportation. Here is a summary of some of the
breakthroughs and benefits of artificial intelligence for transportation.

The discoveries and benefits of AI for transportation development will be highlighted in the
next paragraph. We'll talk about self-driving cars, traffic management, drone control, and
self-driving trucks. The article will also cover AI in public transit, shipping, and aviation, as
well as other aspects of the transportation business.

Self-driving Vehicles
To operate the automobile without the assistance of the driver, AI transportation employs a
sophisticated collection of sensors, cameras, and radar. Various firms, including BMW,
Audi, and Tesla, have developed and tested such automation.

This high degree of AI in transportation is not only tough to accomplish but also extremely
complicated. According to Google's tests, some self-driving cars have traveled over 140,000
kilometers. Complex neural networks have been designed to discover patterns comparable
to those recognized by the human brain, which multitasks when driving.

Data in neural networks includes identifying traffic lights, pedestrians, curbs, trees, street
signs, traffic congestion, and anything else on the route. The system will get more
sophisticated as it navigates more.
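As a deliberately tiny illustration of the kind of convolutional network such perception systems build on (real self-driving stacks are vastly larger and trained on enormous labelled datasets), the following PyTorch sketch pushes a stand-in camera frame through a classifier; the class names and random input are invented, and the untrained network's "prediction" is arbitrary, showing only the data flow.

import torch
import torch.nn as nn

classes = ["traffic_light", "pedestrian", "street_sign", "clear_road"]

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, len(classes)),  # sized for 64x64 input frames
)

frame = torch.randn(1, 3, 64, 64)  # stand-in for one 64x64 RGB camera frame
logits = model(frame)
print("Predicted class:", classes[logits.argmax(dim=1).item()])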

Furthermore, deploying self-driving cars is critical for lowering logistical costs.


Autonomous taxis are already being tested in Tokyo, and industry projections indicate that
this technology will help boost public transit in remote places. The Tokyo trial is a great
illustration of how AI is already being utilized in transportation. Furthermore, autonomous
rail can cut transit costs by up to 45%.

Traffic Management
Living and working in a large city and attempting to get to work or school on time is a
tiresome situation that many individuals experience on a daily basis. AI-powered traffic
management is a step toward reducing daily traffic congestion and driver weariness.

To mention a few applications, AI enables automated traffic signals, high-resolution
cameras, real-time processing, and automatic plate recognition systems. Recently, India's
capital opted to use such intelligent traffic control: 7,500 infrared sensor cameras and 1,000
LED boards will be installed. AI technology will aid in reducing road congestion by
directing traffic during peak hours using live feeds, sensors, and Google Maps. This can
reduce manual intervention to a bare minimum and provide a smarter and more efficient
mode of transportation.

Artificial intelligence in transportation may also be utilized to cut down on traffic
congestion. In the transportation business, AI technology captures real-time route
information, delivers it to the cloud, and analyzes it for traffic forecasts. The technology
will alert users to the most direct route to their destination. In this example, the employment
of AI in transportation will increase route safety and reduce waiting time.

Drone Control
Beautiful aerial views are now possible thanks to drone technology. However, when paired
with AI, drones can deliver additional information. AI drones, for example, may be used to
survey building sites, which would take more than a week if done manually. There are also
plans to employ drones to control and monitor traffic because they can provide a far broader
picture of the entire region.

In Rwanda, the delivery of blood supplies has been reduced to 15 minutes thanks to the use
of drones. Drones using artificial intelligence can also be used to avert road accidents or to
give emergency assistance. When combined with AI, this little toy-like contraption proves
to be exceptionally capable.
How Artificial intelligence relates to the agriculture industry

Consider having at least 40 critical processes to watch, excel in, and monitor at the same
time throughout a vast farming region, which is commonly measured in hundreds of acres.
Understanding how weather, seasonal sunshine, migration patterns of animals, birds, and
insects, crop-specific fertilizers and pesticides, planting cycles, and irrigation cycles all
effect production is a wonderful challenge for machine learning. Excellent data has never
been more important in determining the financial success of a crop cycle. As a result,
farmers, co-ops, and agricultural development organizations are doubling down on data-
centric techniques and broadening the breadth and scale of how they employ AI and
machine learning to improve agricultural yields and quality.

Surveillance systems
AI and machine learning-based surveillance systems monitor real-time video feeds
from every crop field, recognize animal or human breaches, and promptly send an
alert. AI and machine learning lessen the possibility of domestic and wild animals
accidentally destroying crops, or of a break-in or burglary at a remote farm location.
With the fast advancements in video analytics enabled by AI and machine learning
algorithms, everyone involved in farming can safeguard the perimeters of their crops and
buildings. AI and machine learning video surveillance systems scale equally well for a
large-scale agricultural enterprise as for a single farm. Machine learning surveillance
systems may be designed, or trained over time, to distinguish between employees and
vehicles.
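A very small stand-in for such video analytics is classical background subtraction. The sketch below assumes OpenCV and a hypothetical camera recording; production systems add trained object detectors on top of this to tell an employee from an intruder or an animal.

import cv2

capture = cv2.VideoCapture("field_camera.mp4")  # hypothetical video feed
subtractor = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = capture.read()
    if not ok:
        break
    mask = subtractor.apply(frame)  # foreground = pixels that changed
    # A large foreground area suggests something entered the field of view.
    if cv2.countNonZero(mask) > 5000:  # illustrative threshold
        print("Possible breach detected - send alert")

capture.release()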

Price forecasting
Price forecasting for crops based on production rates is important in creating price strategies
for a certain crop. Understanding crop production rates and quality levels aids agricultural
enterprises, co-ops, and farmers in negotiating the best price for their harvests. Considering
the overall demand for a certain crop to establish whether its price elasticity is inelastic,
unitary, or highly elastic dictates the pricing strategy. Knowing this information alone can
save agricultural firms millions of dollars in lost income each year.
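That elasticity check can be shown with a worked example. The midpoint (arc) formula below is standard economics; the price and quantity figures are invented for one hypothetical crop.

def arc_elasticity(q1, q2, p1, p2):
    # Percentage change in quantity divided by percentage change in price,
    # each measured against the midpoint of the two values.
    pct_q = (q2 - q1) / ((q1 + q2) / 2)
    pct_p = (p2 - p1) / ((p1 + p2) / 2)
    return pct_q / pct_p

# Hypothetical: price rises from $200 to $220/ton, demand falls 1000 -> 950 tons.
e = arc_elasticity(1000, 950, 200, 220)
label = "elastic" if abs(e) > 1 else "inelastic" if abs(e) < 1 else "unitary"
print(f"Price elasticity of demand: {e:.2f} ({label})")

Because the absolute elasticity here is below 1, demand for this hypothetical crop is inelastic, so a modest price rise would lift total revenue.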

Robots and smart vehicles


Because there is a labor crisis in agriculture today, AI and machine learning-based smart
tractors, agribots, and robots are becoming a feasible choice for many distant agricultural
companies that are struggling to recruit personnel. When large-scale agricultural firms can't
find enough workers, they resort to robotics to harvest hundreds of acres of crops while also
providing protection around remote areas. Programming self-propelled robotic gear to apply
fertilizer to each row of crops reduces operational costs while increasing field output.
Agricultural robot complexity has increased rapidly, as seen in the dashboard of the
VineScout robot in operation.

Monitoring livestock health


Monitoring livestock health, including vital signs, daily activity levels, and food intake, is
one of the most rapidly developing applications of AI and machine learning in agriculture.
Understanding how each variety of livestock reacts to feed and boarding circumstances is
crucial in determining how to care for them effectively in the long run. Using AI and
machine learning to determine what keeps dairy cows content and happy is critical in order
to produce more milk. For the many farms that rely on cows and cattle, this area brings
entirely new insights into how farms might be more profitable.

Irrigation leaks
AI contributes to increasing farming efficiency by detecting irrigation leaks, adjusting
irrigation systems, and monitoring how effectively frequent crop watering enhances
production rates. Water is the most limited resource in many regions of North America,
particularly in communities that rely heavily on agriculture as their main source of income;
it can be the difference between a farm or agricultural enterprise remaining profitable or
not. Linear programming is frequently used to calculate the optimal quantity of water
required by a certain field or crop to achieve an acceptable production level. Supervised
machine learning methods are great for ensuring that fields and crops receive enough water
to optimize yields while wasting none.
How Artificial intelligence relates to the healthcare industry

Administrative
Artificial intelligence has a variety of administrative uses in healthcare. In comparison to
patient care, the use of artificial intelligence in administrative settings is less game-changing.
However, artificial intelligence in hospital administration can deliver significant
efficiencies. AI in healthcare may be utilized for a number of tasks, such as claims
processing, clinical documentation, revenue cycle management, and medical records
management.

Machine learning is another use of artificial intelligence in healthcare that is relevant to
claims and payment administration. It may be used to match data from disparate databases.
Insurers and providers must check the accuracy of the millions of claims submitted each
day. Identifying and resolving coding mistakes and false claims saves time, money, and
resources for all parties.

Diagnosis and Treatment


For the last 50 years, disease diagnosis and treatment have been at the heart of artificial
intelligence (AI) in healthcare. Early rule-based systems had the ability to diagnose and
treat disease properly, but they were not widely adopted in clinical practice. They were not
considerably better at diagnosing than humans, and their integration with clinical workflows
and health record systems was less than optimal.

Using artificial intelligence in healthcare for diagnostic and treatment plans, whether rules-
based or algorithmic, can be challenging to integrate with clinical processes and EHR
systems. Integration concerns, more than the accuracy of recommendations, have been the
greater impediment to mainstream use of AI in healthcare. Much of the AI and healthcare
capability offered by medical software manufacturers for diagnosis and treatment is
stand-alone and targets only a subset of care. Some EHR software manufacturers are
beginning to include limited healthcare analytics functionalities powered by AI in their
product offerings, although they are still in the early stages. To fully benefit from the use of
artificial intelligence in healthcare utilizing a standalone EHR system, providers must
undertake significant integration initiatives themselves.
Rule-based Expert Systems

In the 1980s and beyond, expert systems based on versions of 'if-then' rules were the
dominant AI technology in healthcare. To this day, artificial intelligence is routinely
employed in healthcare for clinical decision assistance. Many electronic health record
systems (EHRs) now provide a set of rules as part of their software offerings.

Human experts and engineers are typically used to create a comprehensive set of rules in a
specific knowledge area for expert systems. They work well up to a point and are simple to
follow and process. However, once the number of rules becomes too vast, generally in the
thousands, the rules might begin to clash and break apart. Furthermore, if the knowledge
area changes significantly, altering the rules might be time-consuming and hard. In
healthcare, machine learning is gradually replacing rule-based systems with techniques
focused on analyzing data using proprietary medical algorithms.

Natural Language Processing

For than 50 years, artificial intelligence and healthcare technology have sought to
understand human language. Most NLP systems incorporate speech recognition or text
analysis, followed by translation. NLP tools that can comprehend and categorize clinical
documents are a frequent use of artificial intelligence in healthcare. NLP systems can
evaluate unstructured clinical notes on patients, providing remarkable insight into quality
understanding, improved methodologies, and better patient outcomes.
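As a minimal sketch of such classification (assuming scikit-learn; the notes and labels are made up), unstructured text can be vectorized and routed to a category:

# Tiny clinical-note classifier: TF-IDF features + logistic regression
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

notes = [
    "patient reports chest pain and shortness of breath",
    "follow-up visit, wound healing well, no infection",
    "persistent cough and fever for three days",
    "routine check-up, all vitals within normal range",
]
labels = ["cardiology", "post-op", "respiratory", "general"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(notes, labels)
print(clf.predict(["shortness of breath when climbing stairs"]))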
How Artificial intelligence relates to the Utilities & Energy industry

Optimizing Energy Production and Scheduling

Cost and time overruns are a challenge when establishing offshore wind farm projects. Weather delays, resource and product limits, and schedule hazards all contribute to this. The problem's complexity grows as a result of platform construction, fishing and environmental limits, and government and local-government regulations. It is therefore critical to develop comprehensive project planning and scheduling models that take into account these interacting components and related hazards in offshore wind farm projects.

For example, an AI-based solution enables wind turbine operators to predict turbine blade,
generator, and gearbox problems while improving energy output. Cloud-based systems
provide offshore operators access to advanced analytics software with AI algorithms that
analyze incoming data for abnormalities and, as a result, predict issues in the monitored
equipment.
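One plausible shape for such an anomaly detector (a sketch under assumed sensor names, not any vendor's actual product) is an unsupervised model fitted on normal telemetry:

# Flagging abnormal turbine telemetry with an unsupervised detector
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical healthy sensor log: [vibration_mm_s, gearbox_temp_C, output_kW]
rng = np.random.default_rng(0)
normal = rng.normal([2.0, 65, 1500], [0.3, 3, 80], (200, 3))

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

reading = np.array([[6.5, 92, 900]])  # unusually high vibration and temperature
print("anomaly" if detector.predict(reading)[0] == -1 else "normal")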

Asset Tracking and Maintenance/Digital Twins

One of the most significant areas where digital twin (DT) technology may play a key role is
asset management, which includes monitoring and maintenance, project planning, and
lifecycle management. In such a case, Digital Twins help Energy & Utilities organizations
to manage concerns such as production imbalances, quick changes in global economic
situations such as the COVID-19 pandemic, and equipment dependability issues. Energy
and utilities firms require systems with real-time visibility and flexibility given by digital
twins technology to be responsive in these busy and rather chaotic times, similar to scenario
planning for machines in the digital era!
Emission Tracking
Several energy corporations have already committed to net-zero emissions goals. Despite
economic constraints, many businesses are attempting to decarbonize their operations and
value chains. According to BCG research, the potential total benefit of applying AI to
business sustainability by 2030 amounts to $1.3 trillion to $2.6 trillion in value gained
through increased revenues and cost reductions. Energy and utility companies are now
adopting AI technologies to track the amounts of fugitive emissions of greenhouse gases
that escape from pipelines and energy equipment, allowing them to better manage them.

AI Led Inventory Management
Companies suffer costs when inventory lags behind demand. AI aids in increasing
efficiency in network planning and forecasting demand, allowing merchants to be more
proactive. As energy and utility providers acquire greater visibility into demand patterns,
they may plan for change, such as new driving habits, by altering the number of car
charging stations and directing consumers to neighboring sites where recharging terminals
are not in use. This results in more satisfied consumers and cheaper operating expenses.

Defect Detection
One of the difficulties that Energy and Utilities firms face is finding suspicious
pipes/wiring/machines or faults in fault-prone operations. Defects discovered at the end of
the energy production line due to substandard wind turbines result in significant losses for
energy businesses, turbine owners, turbine manufacturers, and budget resources.

To that end, AI may aid in the validation of manufacturing quality and give deep analytical insights into faults. Compared with the core procedures, AI-powered Defect Detection
solutions are incredibly cost-effective. Deep learning pattern recognition allows video
streams captured with cameras to alert if an employee is not appropriately clothed for the set
of operations. Furthermore, predictive analytics alert operators on the health condition of
the equipment, allowing for proactive interventions to avert a disaster with serious
ramifications for health, safety, and the environment.
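The predictive-analytics alert mentioned above can be sketched with something as simple as an exponentially weighted moving average (EWMA) that trips when a smoothed sensor signal drifts past a limit; the alpha, limit, and readings are all assumptions made for the example.

# EWMA health alert on a (made-up) gearbox temperature trend
def ewma_alerts(readings, alpha=0.3, limit=75.0):
    avg, alerts = readings[0], []
    for i, r in enumerate(readings[1:], start=1):
        avg = alpha * r + (1 - alpha) * avg  # smooth the raw signal
        if avg > limit:
            alerts.append((i, round(avg, 1)))
    return alerts

temps = [62, 64, 63, 70, 74, 78, 81, 85]
print(ewma_alerts(temps))  # -> [(7, 77.2)]: the smoothed trend crosses the limit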
How the chosen technology impacts the software industry

Role of AI in Software Development
AI will influence how we design applications, and we can anticipate better apps being produced as a result. Understanding AI will determine the future of software development; most firms are already interested in it. 80% of businesses are investing in AI, and almost 47% of digitally mature businesses have established an AI strategy. AI technologies alone were expected to provide $2.9 trillion in corporate value by 2021.

If you wish to use this method, you must first grasp the function of AI in software
development and examine what has changed. Here are some of the capabilities that AI may
bring to software development in order to provide highly tailored goods or services to your
clients.
Areas where AI impacts Software Development
AI plays a key role in the design, code generation and testing of software. Let us discuss each
area in detail:

Requirement Gathering

As a conceptual step of the SDLC, requirement gathering necessitates the most human interaction. Artificial intelligence provides a wide range of techniques/tools, such as Google ML Kit and Infosys Nia, to automate some processes and, to some extent, reduce human interaction. This phase places a strong focus on finding flaws early, before going on to design. Natural language processing, an AI method, will allow machines to grasp the user's needs in natural language and automatically build high-level software models. Of course, this strategy has significant drawbacks, such as difficulty in balancing the developed systems. It is, nonetheless, still one of today's hot research subjects.

Software Design

To provide a definite solution, project planning and design require specialist knowledge and expertise. Choosing the best design for each step is a difficult challenge for designers. Revisions and forward planning drive dynamic modifications to the design until the customer achieves the desired solution. By using AI technologies to automate some difficult activities, the most capable methodologies may be used to develop projects. Designers, for example, can utilize AIDA (Artificial Intelligence Design Assistant) to understand the client's demands and preferences and then use that information to design the suitable project. AIDA is a website creation tool that analyses numerous software design combinations and delivers the best customized design based on the client's requirements.
Automatic Code Generation
Taking a business idea and turning it into code for a large project takes time and effort. To
address time and money problems, experts have proposed a technique that involves writing
code before beginning development. However, the strategy is ineffective when there are
doubts, such as what the target code is intended to perform, because gathering these details
takes as long as building code from scratch. AI-assisted intelligence programming will
alleviate some of the workload.

Imagine explaining the project concept in your native language and having your system understand it and convert it into executable code. Despite the fact that it sounds like science fiction, artificial intelligence in software development can change the tale! Natural language processing and AI techniques will make this possible.

AI in Testing Services
Software testing is an important stage in software development since it verifies the product's
quality. Because particular tests must be redone whenever the source code is modified, testing can be time-consuming and costly. This is where AI in software testing saves the day once more.

There are several programs that use AI to create test cases and do regression testing. These
AI solutions can help you automate testing and assure error-free testing. AI and machine
learning-based testing platforms include Appvance, Functionize, and Testim.io.

Deployment Control
Machine learning and AI technologies have influenced software deployment as well, for example through increased efficiency in deployment control tasks. The deployment phase is the step in the software development paradigm where developers often upgrade programs or apps to newer versions. If developers fail to perform a procedure appropriately during an upgrade, there is a considerable danger in running the program. AI can protect developers from such vulnerabilities during upgrades and lessen the chance of deployment failure.
Technological convergence and opportunities

Technological convergence definition

The phrase technological convergence is frequently characterized in extremely generic and simple terms as a process through which telecommunications, information technology, and media, which were formerly mostly independent of one another, are growing together. Technological convergence has both technical and functional aspects. The technical side refers to any infrastructure's capacity to carry any sort of data, whilst the functional side suggests that users may be able to seamlessly combine computing, entertainment, and voice capabilities in a single device capable of performing a variety of jobs.

Opportunities of Convergence

If properly handled, technological convergence may play an essential role in any nation's
national economic and social growth. Governments can use this opportunity to support
market growth while also meeting previously unmet societal communication demands.

1/ Increased Market Competition

Convergence has reduced the barriers to entry for new operators and service providers. The
development of new market participants increases competition, providing customers with a
diverse range of providers and services to select from and lowering communication costs.
Furthermore, as technology advances, industry borders blur, allowing service providers to
deliver services in different sectors. Cable providers, for example, can provide voice
telephone and internet services in addition to television access. Content suppliers may now
easily reach customers without owning a distribution network. For example, a firm may
create TV content and distribute it over a cable network without having to own one.
2/ Emergence of New Services and Applications
Convergence will provide established businesses with an opportunity to operate more
effectively, boost returns on technology investments, and reap additional economic benefits
through the creation of new services and quick market expansion. Convergence brings up
new sales marketplaces for businesses, as seen with mobile providers. As the industry
becomes saturated, they are looking to non-voice services like video streaming, portals,
messaging, information services, and gaming to drive future revenue growth. New
applications have spawned new forms of leisure (such as online gaming) and sociability (i.e.,
chat rooms). Because we can all chat, text, and send video over one network, the convergence
of voice, video, and data provides customers with new methods to communicate.

3/ Convenience and Simplicity
At the device level, customers see convergence as a way to enjoy the ease of having several
gadgets in one, saving on both size and ownership costs. A single mobile phone device, for
example, may receive television programs and play videos, allowing for simplicity and ease
in device ownership because one device can be used to access various services.

Challenges in a Converging World
As a result of technological convergence, a variety of challenges with adaptation to the new
environment have arisen. Telecommunications companies, service providers, lawmakers,
regulators, and users are all involved.

1/ New Regulatory Framework
The convergence of services on the same platform calls into question commonly held beliefs
about the appropriate way to license and regulate providers. Regulatory frameworks were
traditionally built for an age when there were obvious functional boundaries between services
and infrastructure, but these rules are becoming unsuitable for coping with today's
environment. At first look, the most pressing concerns appear to be interoperability,
connectivity, consumer protection, and universal access. Existing interconnection
mechanisms primarily focus on the interconnection of telecom networks based on circuit
switching technology, but broadcasting networks, for example, are either uncontrolled or
subject to various sorts of regulation. Furthermore, in a convergent environment that is heavily reliant on packet-switched networks, dedicated circuits are neither connected nor provisioned.
2/ Bandwidth Shortage and Infrastructure Upgrade
Convergence gives rise to new bandwidth-intensive services and applications, necessitating
the presence of broadband infrastructure. Only with broadband connectivity is the usage of
complicated services (such as multimedia services) appealing or even conceivable. While
industrialized economies may not be facing a bandwidth constraint crisis, the same cannot be said for the majority of emerging nations, as their communications infrastructures continue to rely on narrowband technology. These countries are dealing with the challenge of financing and rolling out the broadband infrastructure that convergence demands.

3/ Strategic Alignment by Operators and Service Providers
As market entry barriers are reduced, allowing a greater number of new players to enter the
market and offer a wide range of different service packages, established operators and service
providers must rethink their business models and strategies, not only to compete with these new providers but also to upgrade their networks and integrate new services into their own offerings.
Another problem is persuading customers of the value brought by the new services they must
pay for.

4/ Privacy, Security and Reliability
As society grows more linked and reliant on ICT networks, hackers continue to devise
increasingly devious methods to exploit human and computer weaknesses to their malevolent
advantage. This puts operators, service providers, and users under pressure to take
precautions against network intrusions, assaults, and viruses. Similarly, as technology and
systems get more complicated, the danger of instability increases. Product designers, makers,
and operators are tasked with ensuring the dependability of these new technologies.
Convergence of Artificial intelligence

The process of merging artificial intelligence and human intelligence into a single system is
known as AI convergence. These technologies complement one another, and there are several
instances of how they are applied. Self-driving automobiles are one example. Because humans have a limited quantity of physical energy, they can only drive for so long; they must pull over, swap places in the automobile, or rest when they grow tired. By letting AI drive for lengthy periods of time, we can save human energy and lessen the danger of accidents.

In many ways, artificial intelligence (AI) technologies are being incorporated into goods and
services. Artificial intelligence is most typically used in two ways:

 By boosting human abilities and skills (dubbed "Intelligent Personal Assistants").
 By completely replacing some jobs and procedures ("Machine Learning").

Both forms of artificial intelligence have been present for decades, but the rate of invention
is increasing.

With AI convergence and other developing technologies, there are several opportunities for organizations to benefit. The most apparent use is automation. We
already have systems capable of scanning papers, tracking data, and analyzing data.
However, these systems are still in their infancy. The capacity to use AI for cognitive
services is a more robust application of AI. Chatbots, for example, may be used to automate
customer assistance. These bots would be able to communicate with customers through
phone calls, live chats, and email. They can answer basic queries and assist customers in
completing transactions.
How artificial intelligence converges with IoT

Of course, the Internet of Things is a wave of connection rather than a particular technology.
Much has been written on the commercial ramifications of this in the business-to-business
and industrial industries. More than that, information will progressively come to affect us all
in a variety of ways as it is incorporated into the products we buy and use in our everyday
lives.

The CTO of a FinTech startup fund has spent the majority of his leisure time over the last
few months fiddling with a soldering iron and some frightening-looking plastic boxes. He has
been gradually connecting everything in his house to the internet.

Six months into this DIY undertaking, his house has developed its own personality and social
media profile. And he has learned a lot along the road. One important takeaway has been that
connecting items to the internet is the simple part. The issue comes in designing software to
have these objects do valuable things in reaction to their surroundings and other linked
entities.

That is the difficulty. And here is where Artificial Intelligence shines. Again, much has been
said on the influence of AI in the last year. But one thing is certain. This is not a 'coming'
technology; it is here and now.

If that thought bothers you, allow me to provide some solace. What AI genuinely offers us, in
my opinion, is something shockingly beneficial... a new method of developing software that
will regulate and improve the world around us.

The Google Self-Driving Car is perhaps the most well-known example of this, a technology
that, when completed, may eliminate human fallibility on the road and allow us all to become
more productive. The convergence of AI and other technologies is already making an impact,
and many of us eagerly await the day when virtual personal assistants will run our lives, free
up our time, and make us more successful. So, rather than fearing AI-induced redundancy,
perhaps we should embrace it.
The confluence of AI with the Internet of Things now offers the same potential as the
convergence of the internet and the mobile phone in 1999. Neither technology has achieved
its full potential, and each might yield more than we anticipate. Their combined influence,
however, will be immense - far more extensive than we have ever envisaged.

Write a quick but thorough summary of everything you do in a normal day as a thought
experiment. Track your actions from the minute you wake up, shower, and put on your
clothes, through your commute and working day, and into your personal and leisure life.
Consider how each of those things may be done for you – and better – with the assistance of
AI and the Internet of Everything. Consider what life would be like if you were connected to
the things around you, which were then intelligently connected to each other and controlled
by intelligent software.

Depending on your natural optimism, I believe you will find this either tremendously thrilling
or terrifying.
How Artificial intelligence converges with Blockchain

When blockchain and artificial intelligence are combined, they form a game-changing
technology that is accelerating the rate of innovation in every sector. Without a doubt,
Blockchain and AI are marching together and leaving their mark on the technological
paradigm.

This strong combination will very certainly have an impact on practically every industry,
including healthcare, education, banking, fintech, law enforcement, and manufacturing. To
manage business processes and optimize cost-savings and profits, financial institutions are
integrating both AI and blockchain-based solutions.

According to Deloitte's global blockchain study in 2019, 57 percent of firms see cost savings
as one of the key benefits of participating in blockchain network consortia.

Secured Payment Network
Traditional financial institutions confront several obstacles, such as compromised payment
networks, delays in transaction processing, and high transaction costs, which make the entire
transaction time-consuming and difficult.

Blockchain technology provides a borderless payment network, and its decentralized approach allows for speedier, frictionless payments. However, security concerns have prevented many bankers from introducing this technology into the banking sphere, even though scammers and fraudsters find it difficult to breach the network, since each transaction requires a set of public and private keys.

However, by incorporating AI capabilities into a blockchain account, the owner can readily pinpoint any human participation in a crime. Security vulnerabilities in payment networks can be reduced further by combining AI behavioral analysis and biometrics.
Controlled Automation for Financial Activities
To achieve regulated automation of financial processes, blockchain and AI must be combined. By pairing blockchain-powered smart contracts with artificial intelligence, you can validate the smart contracts and identify potential vulnerabilities and theft. This enables totally safe, transparent, and efficient financial transactions, replacing manual checks with an automated method.
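To make the smart-contract idea concrete, here is a toy escrow written in plain Python; real smart contracts run on-chain (for example in Solidity on Ethereum), so the class and names below are only illustrative stand-ins.

# Toy escrow 'contract': funds release automatically when conditions are met
class EscrowContract:
    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.funded = False

    def deposit(self, sender, value):
        if sender == self.buyer and value == self.amount:
            self.funded = True

    def confirm_delivery(self, sender):
        if sender == self.buyer and self.funded:
            return f"release {self.amount} to {self.seller}"  # automatic payout
        return "conditions not met"

contract = EscrowContract("alice", "bob", 100)
contract.deposit("alice", 100)
print(contract.confirm_delivery("alice"))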

What are the primary advantages of integrating these two revolutionary technologies?

The confluence of these two technologies, as well as their implementation in the real and
economic worlds, demonstrate that they were built for each other.

Consider some of the benefits of using blockchain in combination with AI applications.

Blockchain makes AI secure
Implementing blockchain technology allows you to decipher AI judgments, which might otherwise be tough to grasp. Blockchain can record the vast variety of factors a model weighs independently of one another, assisting you in understanding the decision it is attempting to make.

By documenting the decision-making process, blockchain records, explains, and interprets AI judgments, allowing for possible insights and bringing transparency to the public.

The computer system, unlike people, does not have its own brain; instead, it is programmed
to accomplish tasks. The machine-learning driven mining algorithm has made our lives much
more pleasant since the advent of AI.
AI and Blockchain work well together
Because of the encryption approach, the data stored in blockchain blocks is extremely safe, which means we may store highly sensitive personal data in these blocks in encrypted form. You may be wondering how AI and blockchain technology complement each other.

AI has provided several benefits for keeping data safe and secure. The key advantage is that
with AI, we can deal with data in an encrypted form, which can decrease incidences of
security risk.

E-commerce giants such as Amazon and Netflix, for example, utilize a mix of these two
technologies since AI can function on encrypted data. One example of AI is the
recommendation algorithms used by Amazon and Netflix.

AI improves blockchain security
While the blockchain itself is largely resistant to attack, the various layers and applications built on top of it are not. By combining AI and blockchain, you can provide a secure application while maintaining a stable system structure.

AI helps in handling blockchain more efficiently
Without a defined programming technique, computers find it difficult to complete a task.
Because blockchain data is encrypted, it requires a lot of computing power to work on it.
Take, for example, the cryptocurrency bitcoin.

Hashing algorithms using a "brute force" approach are used to mine blocks on the Bitcoin
network. This method practically tries every character combination until it discovers one that
can be utilized for transaction verification.
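The brute-force search can be demonstrated in a few lines; this toy proof-of-work uses a tiny difficulty so it finishes instantly, whereas real Bitcoin targets are astronomically harder.

# Toy proof-of-work: find a nonce whose SHA-256 digest starts with N zeros
import hashlib

def mine(block_data: str, difficulty: int = 4) -> int:
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

print("found nonce:", mine("tx1;tx2;tx3"))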

This procedure can be handled more conveniently and intelligently with the aid of AI. AI and
ML-powered mining algorithms are fed more training data, allowing them to outperform
humans.
How Artificial intelligence converges with VR

Convergence is the next natural step in the realm of technology. To perform what your
smartphone does alone, you would have needed a specialized pager, phone, camcorder,
camera, portable gaming device, GPS tracker, and a Walkman only a few decades ago. We
create technologies just to combine them later for improved and higher outcomes.

Two technologies stand out when discussing current innovation: virtual reality (VR) and artificial intelligence (AI). The first seeks to construct an alternate reality for humanity, while the second seeks to imbue computers with the vision and understanding of a sentient creature. We've made great strides in recent years in refining both, and the prospect of combining the two opens up endless possibilities.

The combination of AI and VR has the potential to create some mind-bending worlds.

Enhanced Virtual Shopping
If you've been living under a rock, you may be unaware that virtual shopping exists, allowing
you to check out a product in a virtual setting before placing an order. Cadillac's VR
showroom and shopping for garments with mixed reality help are already realities.

The team from Store No. 8 demonstrated a VR smartphone app that allows customers to check out camping goods in a simulated Yosemite. Katie Finnegan, who leads innovation at the company, said:

“You can see the tent in the environment in which you’ll use it,” she says. “You can unzip the
opening, get inside, lay on the ground and say, ‘You know what? This is too tight,’ then
swipe your hand to try another tent.”

Don't you think this is fantastic? Consider AI that learns from consumer behavior, desires,
and mannerisms. The VR world will become considerably more engaging and functional with
the addition of sufficient and well-managed AI. Brands and retail businesses may deploy in-
store AI-powered virtual salespeople that can give recommendations, listen to consumer
feedback, and even complete a transaction.
Revolution in Tourism

Tourism and technology go hand in hand, and if there is one area that can benefit greatly from
the use of AI and VR, it is the tourism industry. Some businesses have already embraced the
future and are utilizing Virtual Reality to show potential clients what they have to offer and
what they may experience personally.

When it comes to tourism or holidays, it's not about the details, but about the whole package.
A clever mix of AI and VR here can transform an already fantastic holiday into a memorable
one. AI can be utilized in every situation where decisions must be made, whereas VR can
recreate situations to offer users a visual representation of what is to come.

Booking.com Global Director of Customer Service, James Waters, believes that 80% of their
clients prefer to gather their own travel information rather than dealing with a booking
assistant, salesman, etc. As a result, Booking.com has created an artificially intelligent
backend that can assist users in completing their searches and bookings without the need for a
middleman. AI-powered software can handle tasks such as flight booking, ticket processing,
and hotel reservations, making the entire process more comfortable and efficient.

On the other side, tourism companies may employ VR to transport potential customers to a
location before they visit there. The wonderful thing about technological convergence is that
it benefits everyone. It's a win-win situation for everyone. Pictures are nice, and movies are
even better, but a virtual world that simulates reality is the closest thing to the genuine thing.
Gaming and Entertainment

Without question, the industry that gave Virtual Reality fresh life is gaming. The notion of
living a game rather than just playing one was appealing, and the gaming community was
eager to spend money on it. The outcome was the Oculus Rift, which was eventually
purchased by Facebook.

While VR has emerged from the shadow of gaming with breakthroughs and constant growth,
a large portion of VR comes from the same industry that supported it over a decade ago.
Many headsets, such as the Sony PSVR, HTC Vive, Oculus Rift, Google Cardboard, and
others, are currently accessible to customers, and the majority of the apps made for these
headsets are games.

VR in games is already prospering, but the entry of AI into the gaming sector should thrill us.
Surprisingly, video games are also used to teach AI. If you've ever played video games on
any platform, you'll know that the only sentient person capable of making judgments on the
fly is you. Non-player characters, or NPCs, have always been mechanical whether you're
playing a first-person shooter, an open world game, or a role-playing game.

There's nothing wrong with them in and of themselves. However, they simply fail to respond
to player inputs and instead follow hard-coded algorithms that their creators wrote in advance. With the use of AI in gaming, we would be able to make non-player
characters far more intelligent, capable of making their own decisions, making the game
more participatory and interesting. AI might gradually improve game quality in terms of
player engagement.
How Artificial intelligence converges with AR

Augmented Reality (AR) is a digital medium that allows users to integrate virtual context into
the actual environment in a multidimensional, interactive fashion. Cameras and sensors
provide information about the surrounding environment to AR software.

AI improves the AR experience by allowing deep neural networks to replace standard computer vision algorithms and introduce new capabilities like object detection, text analysis, and scene categorization.

How does AI transform AR?

Historically, augmented reality software relied on standard computer vision techniques known as Simultaneous Localization and Mapping (SLAM). In order to map and track the surroundings, SLAM algorithms compare visual information across camera frames.

Modern AR apps, on the other hand, rely on deep learning to give more complex capabilities.

AI Applications in AR

AR developers may use AI algorithms to provide AR features such as better interaction with
the physical world. Machine learning and deep learning AI technologies are ideally suited to
AR environments because:
 Because cameras are continuously on, there is a chance to collect additional data for the AI system to learn from.
 Because AR environments rely on several sensors (e.g., the device's gyroscopes,
sensors, accelerometers, and GPS), the input to the AI system is extensive. This
improves dependability over systems that rely solely on a single sensor.
Among the AI uses in AR are:

Object labeling
Machine learning classification models are used in object labeling. When a camera frame is
sent through the model, the picture is matched with a pre-defined label in the user's
classification library, and the label is superimposed over the real object in the AR world.
Volkswagen Mobile Augmented Reality Technical Assistance (MARTA), for example,
marks car parts and gives information about current problems as well as advice on how to
correct them.
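A hedged sketch of that labeling flow: a stand-in classifier matches a frame's feature vector against a pre-defined label library, and the AR layer overlays the winning label at the detected screen position (the features and coordinates are invented for the example).

# Nearest-neighbour label lookup for an AR overlay (all values illustrative)
def classify(frame_features, library):
    return min(
        library,
        key=lambda item: sum((a - b) ** 2 for a, b in zip(item["features"], frame_features)),
    )["label"]

label_library = [
    {"label": "engine", "features": (0.9, 0.1, 0.2)},
    {"label": "battery", "features": (0.1, 0.8, 0.3)},
]

frame = (0.85, 0.15, 0.25)  # feature vector extracted from the camera frame
anchor = (120, 240)         # screen position where the object was detected
print(f"overlay '{classify(frame, label_library)}' at {anchor}")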

Object detection and recognition
Convolutional neural network (CNN) techniques are used in object identification and
recognition to estimate the location and extent of items in a scene. Following the detection of
the object, the AR software may produce digital objects to overlay the physical one and
mediate interaction between the two. IKEA's ARKit program, for example, scans the
surrounding environment, measures vertical and horizontal planes, assesses depth, and then
proposes goods that match the specific area.

Text recognition and translation
Text recognition and translation integrate artificial intelligence (AI) Optical Character
Recognition (OCR) methods with text-to-text translation engines such as DeepL. A visual
tracker maintains track of the word and allows the translation to be superimposed over the
AR scene. This feature is available in Google Translate.

Automatic Speech Recognition
Automatic Speech Recognition (ASR) uses neural networks to recognize speech, sometimes combined with image processing to extract text from the scene. Specific spoken words cause a picture matching the word's description in the library to be projected onto the AR area. The Panda sticker app is one example.
Activity 03

Content

1. Emerging technologies
2. Factors
3. How political factors impact AI
4. How economic factors impact AI
5. How social factors impact AI
6. How political factors impact Cloud computing
7. How economic factors impact Cloud computing
8. How social factors impact Cloud computing
9. How political factors impact Blockchain
10. How economic factors impact Blockchain
11. How social factors impact Blockchain
12. Presentation of Emerging technologies
13. Google form survey
14. Results analysis pie chart
15. Conclusion

The 3 emerging technologies whose influencing factors we are going to discuss:

1. Artificial intelligence (AI)
2. Cloud computing
3. Blockchain

Factors

1. Political
2. Economic
3. Social
How political factors impact AI

Governments throughout the world have prioritized the development of AI technology, organizing entities to create plans and policies to expedite and deploy AI in business, markets, and governments. With this in mind, this study investigates the relationship
markets, and governments. With this in mind, this study investigates the relationship
between the institutional dynamics of political regimes and the development of AI. It
examines 30 developing nations in order to understand the political and governance
processes that underlie the consequences of AI policy. This study examines how different
institutional frameworks yield varied outcomes in terms of AI development, delving into the
link between politics and policy. The study used data from the Bertelsmann Stiftung's
Transformation Index (BTI) and the Organization for Economic Cooperation and
Development (OECD) to examine situations using fuzzy-set quantitative comparative
analysis (QCA).

AI Can Help Governments Save Money
The perceived benefits to the public sector make automation and algorithmic decision-
making appealing. Governments are short on resources, whether they be manpower, money,
or skill. Algorithms provide a simple fix: hire a vendor with competence in a certain area to
outsource the problem without recruiting expensive data scientists. Algorithms also provide
an untrained audience the appearance of neutrality.

AI in Politics
Recently, we have witnessed instances of AI politicians, like Alisa, a digital Russian lady who ran for president in 2018 after launching her robotic candidacy in 2017, and Sam of New Zealand, the world's first virtual politician, who uses social media to reach out to people and share her ideas on climate change, healthcare, and education, among other things. However, there is one key question that must be addressed: "Why would I vote for a robot?"

One explanation might be that historically, ambassadors, diplomats, and other types of emissaries were at risk of being hurt or even killed; hence the ancient adage, "Don't shoot the messenger!" Today, new technology and Internet of Things (IoT) applications enable us to save lives. Because of so-called "cyber embassies," diplomats can enter a room digitally (through holoportation or telepresence) rather than physically, which saves lives, money, and time! Machines and robots have benefits: they can work around the clock, never get weary, never forget, and are honest and exact - not malevolent or searching for personal advantage.
How economic factors impact AI

Economic potential of AI

According to the majority of research, AI will have a big economic influence. According to
Accenture research, AI may double yearly global economic growth rates by 2035 across 12 industrialized economies that collectively produce more than 50% of global economic output. AI will fuel this expansion in three ways. First, it will result in a
significant improvement in labor productivity (up to 40%) as a result of breakthrough
technologies that enable more effective workforce-related time management. Second, AI
will generate a new virtual workforce capable of problem solving and self-learning, referred
to in the research as "intelligent automation." Third, the economy will profit from the spread
of innovation, which will impact many industries and provide new revenue sources.

According to a PricewaterhouseCoopers (PwC) report, the accelerated development and use of AI might enhance global GDP by up to 14% (the equivalent of US$15.7 trillion) by 2030.
According to the paper, the next wave of digital revolution will be unleashed with the help
of data created by the Internet of Things (IoT), which is expected to be many times higher
than data generated by the present 'Internet of People.' It will increase uniformity and, as a
result, automation, while also improving product and service customization. According to
PwC, there are two major pathways via which AI will affect the global economy. The first
includes AI resulting in near-term productivity benefits through automation of mundane
operations.

This will entail increased usage of technology such as robotics and self-driving cars.
Productivity will also increase when organizations use AI technology to supplement and
support their existing staff. It will necessitate investing in software, systems, and machines
based on assisted, autonomous, and augmented intelligence; this will not only allow the
workforce to perform its tasks better and more efficiently, but will also free up time for it to
focus on more stimulating and higher-value-added activities. Automation would reduce the
requirement for labor input, resulting in total productivity benefits.
How social factors impact AI

Artificial Intelligence's Beneficial Effects on Society

Artificial intelligence has the potential to significantly increase workplace efficiency and supplement the work that humans do. When AI takes over monotonous or risky duties, it frees up human labor to focus on tasks that need creativity and empathy, among other things. People's happiness and job satisfaction may grow if they do work that they enjoy.

Artificial intelligence has the potential to significantly impact healthcare through improving
monitoring and diagnostic skills. AI can cut operational expenses and save money by
enhancing the operations of healthcare institutions and medical organizations. According to
one McKinsey estimate, big data may save medicine and pharma up to $100 billion every year. The genuine impact will be felt in patient care. The potential for customized treatment plans and pharmacological regimens, as well as improved access to information across medical institutions to help inform patient care, will be game changers.

With the arrival of autonomous transportation and AI affecting our traffic congestion
concerns, our society will gain countless hours of productivity, not to mention the various
ways it will boost on-the-job productivity. Humans will be able to spend their time in a
number of various ways once they are free of unpleasant commutes.

Artificial intelligence will improve our ability to detect and solve criminal activity. Facial
recognition technology is becoming as prevalent as fingerprint recognition. The application
of AI in the legal system also gives several chances to find out how to use the technology
successfully without invading an individual's privacy.
How political factors impact Cloud computing

Voter Services Using Cloud Computing
What are the applications of cloud computing for voter services? While the majority of
voting devices are not cloud-connected, cloud computing can administer county election
websites, store ballot data and voter registration rolls, give live results on election evenings,
and manage overseas voting for military personnel.

Several state governments have also contemplated adopting cloud computing to help handle
voting amid public health disasters such as COVID-19. However, some industry experts
urge caution before rushing to develop online voting websites or apps since they leave no
paper trail, which may weaken voters' trust in the outcomes.

The National Democratic Institute (NDI)
To account for restricted resources and people, the National Democratic Institute (NDI), an
organization dedicated to advancing fair and democratic elections across the world,
transferred its operations to the cloud. This change provided a more flexible IT
infrastructure, allowing them to build and manage applications more simply, as well as scale
up and down services as needed. In the process, they cut expenditures by 90% and increased
security.

The Republican National Committee
The Republican National Committee (RNC), an organization that supports the Republican
Party in United States elections, transferred its database to the cloud, enhancing security and
lowering expenses. The RNC required cloud services that could analyze election outcomes,
voter registration numbers, census data, and voter views and make the information available
to Republican candidates. Cloud computing enabled the RNC to gather vital insights on
American voters by creating a storage warehouse that kept data accessible.
How economic factors impact Cloud computing

Cloud computing helps organizations to be more efficient, streamlined, and connected, hence saving time and money. It also allows IT teams to create more inventive and efficient
technologies. More IT innovation, in turn, leads to increased business innovation, which
leads to increased income, which leads to the creation of new employment – and this is
occurring all over the world.

According to IDC's Economic Impact Model (EIM), the adoption of cloud computing in
digital transformation initiatives produced more than $1 trillion in global revenue in 2019.

According to the report, worldwide Salesforce clients may expect a $375 billion net gain in
revenue from year-end 2019 to year-end 2024, including $53 billion in Western Europe.
How social factors impact Cloud computing

The most frequently cited cloud computing benefit is IT's lower cost of ownership: a company can convert fixed IT costs into variable spend for the 60-80% of the budget used to "keep the lights on," lowering the total cost of IT as a percentage of revenue, and the savings can then be re-invested to grow the business.

While IT savings are appealing, the most compelling advantage of cloud computing is how
it improves business agility, particularly how the cloud can help establish totally new
enterprises with little to no up-front cash.

And when cloud computing is leveraged to start enterprises aimed at promoting the larger
social good, it has the potential to alter the world.

San Francisco is a pioneer in fostering collaboration between social entrepreneurs and impact investors to drive global social change in Silicon Valley, where technological innovation and venture finance meet.

Cloud computing is also hastening endeavors to build socially responsible new firms.

Impact investing is a new asset class in which investors actively apply capital to enterprises
that provide both social and environmental benefits while still making a profit. Proponents
think that impact investment can provide the incentive needed for charities and government
to better solve large-scale social concerns.
How political factors impact Blockchain

Blockchain is expected to generate a more transparent society since it allows for decentralized and transparent government. Governments that embrace the use of blockchain
in services such as land title transfers or administrative procedures are projected to see an
increase in the percentage of individuals engaging in the political process by voting,
expressing an opinion, and supporting projects.

This might result in Government 2.0, a version of the government structure and culture that
uses smart contracts to follow through on campaign pledges and practice responsive and
transparent leadership. When some government operations are decentralized, resulting in
"government as a service," the elected government may focus on using technology and data
to improve society's quality of life and enabling everyone to pursue happiness. In light of
a new type of globalization taking place — a globalization of citizenship and profession —
a tax and legal framework is needed to enable the new type of global living and working.

Two use cases that can help tackle government corruption
Public Procurement
A blockchain-based process can directly address procurement's corruption-risk factors by
facilitating third-party oversight of tamper-evident transactions and enabling greater
objectivity and uniformity through automated smart contracts, thereby improving
transaction and actor transparency and accountability.

Land Title Registries
Property registries based on blockchain technology have the potential to create a secure,
decentralized, publicly verifiable, and immutable record system via which people may
firmly verify their land rights. These characteristics decrease the possibility of self-
interested manipulation of property rights and, in general, strengthen the durability of land
ownership.
How economic factors impact Blockchain

The blockchain economy
It is the future acceptance of cryptocurrencies and digital ledger systems. It entails the
replacement of the present monetary system, which means that there will be no central
authority to authenticate transactional data.

How will blockchain change the economy?
Blockchain's influences on many areas, regardless of business field, are numerous. We see it embedding the economic elements of transparency, decentralization, and immutability.
What we can predict is that the introduction of Blockchain technology in the economy's
ecosystem will only make it better and more transparent.

What impact will blockchain have on the sharing economy?
Some challenges have been stifling the sharing economy, such as unclear surge pricing,
untrustworthy riders and delivery people, and so on. Blockchain, with its immutable record
and smart contracts, allows for system transparency and security.

How might blockchain help to cut costs?

Blockchain plays an active role in decreasing operational and hidden expenses - process
streamlining, security, proactiveness in terms of keeping records open and decentralized,
and so on.
How social factors impact Blockchain

The use of blockchain offers far-reaching possibilities for social impact, including:

 Transparency
 Supply chain management
 Digital identity
 Personal data protection
 Legitimacy
 Compliance
 Trust

Big IT firms keep their algorithms hidden, but blockchain's selling point is transparency and
unassailable record keeping. Because of blockchain's alternative trust-based, peer-to-peer
networks, some engineers believe blockchain and cryptocurrencies can realign capitalism.

Consider how blockchain is already reducing prices, realigning the concept of boundaries,
and altering the world as we know it.

Positive Examples of Blockchain for Social Impact
Many commercial companies, governments, and non-governmental organizations (NGOs)
are already adopting blockchain to make a difference in the world. Let's take a look at some
case studies and the far-reaching consequences of blockchain.

A recent study discovered that 40% of fish purchased in restaurants, marketplaces, and
fishmongers throughout the world were mislabeled and, in some circumstances, had traces
of pig. There was no clear supply network. What if blockchain could track boats, catches,
markets, and delivery?
IBM has worked with several major food industry firms to use blockchain to establish a
transparent supply chain for vegetables. This increases brand trust, and individuals can
verify to see whether the labels and packaging are accurate. People can follow every stage
of the supply chain and see whether pesticides were used, if the food is local, and where it
was grown. AgriDigital, for example, is improving grain supply chains between farmers and
markets by adding real-time delivery and payment information.

Proof Points has used blockchain to convert data from supply chain traceability technologies
so that customers can verify product claims of sustainability or provenance. Because of
blockchain, businesses must back up their positive effect claims. Customers may use true
information to pick honest and respectable suppliers with their money.

The charity and NGO industry may save money on bank fees by using blockchain to send
financial relief to people in disaster zones. Consensys, a blockchain technology startup,
created a cash and voucher scheme for Oxfam in the Pacific Island of Vanuatu using the
Ethereum blockchain. Aid recipients, shops, and Oxfam utilized blockchain and
cryptocurrency to build an open, rapid, and transparent system that was less expensive than
banks.

Blockchain has several possible applications. Government expenditures, as well as financial transactions for anything from tax payments to earnings, may be tracked and made public.
In terms of healthcare, your medical information and treatments might be securely
preserved and promptly accessible to professionals in the event of an emergency.
Presentation of Emerging technologies
Google form survey
Survey results analysis

People from 5 different age groups responded to the survey.
Only 16.7% of respondents have good knowledge of technology.
The majority of respondents do not know what emerging technology is.
50% of respondents do not have good experience with emerging technology, while 16.7% do.
The majority of respondents have really good knowledge of artificial intelligence.
66.7% do not even know what IoT is, but a few respondents have decent knowledge of it.
83.3% of respondents have good knowledge of cloud computing, while 16.7% are not aware of it.
66.7% of respondents have experienced VR with a VR headset.
Most respondents do not know what a hologram is; they have never heard the word before.
The majority have good knowledge of blockchain, perhaps because of the famous cryptocurrency Bitcoin.
33.3% have average knowledge of nanotechnology, 16.7% have very poor knowledge of it, and very few have decent knowledge of it.
The majority of respondents have very good knowledge of robotics, perhaps because of some famous movies.
The majority do not know the term big data; only 33.3% have decent knowledge of it.
66.7% of respondents have decent knowledge of AR, so the majority are aware of it.
50% of respondents have knowledge of automation, while the other 50% do not.
66.7% of respondents know that OneDrive is a cloud computing technology; on the other hand, 33.3% do not know that OneDrive is cloud storage.
Conclusion

I now end this assignment, having completed all of the theoretical sections in Activity One and provided appropriate evidence for each and every piece of information acquired from various sources.

The practical portion was then handled in relation to the scenario supplied by the institution, and all steps were carried out against the grading rubrics as well, to finish this project in a meaningful manner.
References
Builtin. (2022, 09 01). Retrieved from What is Robotics: https://builtin.com/robotics
Builtin. (2022, 09 01). Retrieved from What is AI: https://builtin.com/artificial-intelligence
Guru99. (2022, 09 01). Retrieved from What is Big Data: https://www.guru99.com/what-is-big-data.html
Investopedia. (2022, 09 01). Retrieved from What is AI: https://www.investopedia.com/terms/a/artificial-intelligence-ai.asp
Investopedia. (2022, 09 01). Retrieved from What is Blockchain: https://www.investopedia.com/terms/b/blockchain.asp
Investopedia. (2022, 09 01). Retrieved from What is Cloud Computing: https://www.investopedia.com/terms/c/cloud-computing.asp
NNI. (2022, 09 01). Retrieved from What is Nanotechnology: https://www.nano.gov/nanotech-101/what/definition
Oracle. (2022, 09 01). Retrieved from What is IoT: https://www.oracle.com/internet-of-things/what-is-iot/
Techopedia. (2022, 09 01). Retrieved from What is Automation: https://www.techopedia.com/definition/32099/automation
TechTarget. (2022, 09 01). Retrieved from What is Hologram: https://www.techtarget.com/whatis/definition/hologram
Veletsianos. (2022, 09 01). Retrieved from Emerging technology: https://www.veletsianos.com/2016/06/13/defining-characteristics-of-emerging-technologies-and-emerging-practices/
Winston. (2022, 09 01). Retrieved from What is ET: https://www.winston.com/en/legal-glossary/emerging-technology.html#:~:text=Emerging%20technology%20is%20a%20term,business%2C%20science%2C%20or%20education.
XR Today. (2022, 09 01). Retrieved from What is Immersive Media: https://www.xrtoday.com/mixed-reality/what-is-immersive-media-an-introduction/