
UNDERSTANDING NEURAL NETWORKS
WCA BAADAL KATHAIT
THE HISTORY OF DEEP LEARNING
• Deep learning was conceptualized by Geoffrey Hinton in the 1980s. He is widely considered to be the
founding father of the field of deep learning. Hinton has worked at Google since March 2013 when his
company, DNNresearch Inc., was acquired.

• Hinton’s main contribution to the field of deep learning was to model machine learning techniques on the workings of the human brain.

• More specifically, he championed the "neural network": a deep learning algorithm structured similarly to the organization of neurons in the brain. Hinton took this approach because the human brain is arguably the most powerful computational engine known today.
WHY DID DEEP LEARNING NOT WORK
IMMEDIATELY?
• If deep learning was originally conceived decades ago, why is it just beginning to gain momentum
today?

• It’s because any mature deep learning model requires an abundance of two resources:
• Data
• Computing power

• At the time of deep learning’s conceptual birth, researchers did not have access to enough data or computing power to build and train meaningful deep learning models. This has changed over time, which has led to deep learning’s prominence today.
UNDERSTANDING NEURONS IN DEEP
LEARNING
• Neurons in deep learning models are nodes through which data and computations flow.

• Neurons work like this:

• They receive one or more input signals. These input signals can come from either the raw data set or from
neurons positioned at a previous layer of the neural net.
• They perform some calculations.
• They send some output signals to neurons deeper in the neural net through a synapse.
HERE IS A DIAGRAM OF THE FUNCTIONALITY OF A
NEURON IN A DEEP LEARNING NEURAL NET:
WHAT IS A NEURAL NETWORK?
• A neural network is a series of algorithms that endeavors to recognize underlying relationships in a set of data through a process that mimics the way the human brain operates. In this sense, neural networks refer to systems of neurons, either organic or artificial in nature.

• Neural networks can adapt to changing input, so the network generates the best possible result without needing to redesign the output criteria.
KEY TAKEAWAYS

• Neural networks are a series of algorithms that mimic the operations of an animal brain to recognize relationships between vast amounts of data.

• As such, they tend to resemble the connections of neurons and synapses found in the brain.

• They are used in a variety of applications in financial services, from forecasting and marketing research to fraud detection and risk assessment.

• Neural networks with several processing layers are known as "deep" networks and are used for deep learning algorithms.

• The success of neural networks for stock market price prediction varies.
WHAT ARE THE COMPONENTS OF A NEURAL
NETWORK?
• There are three main components: an input layer, a processing layer, and an output layer. The inputs may be weighted based on various criteria. Within the processing layer, which is hidden from view, there are nodes and connections between these nodes, meant to be analogous to the neurons and synapses in an animal brain.
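A minimal sketch of how data flows through the three components — input layer, hidden processing layer, and output layer. The weights and the sigmoid activation here are illustrative assumptions, not values from any trained network:

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def forward(x, hidden_weights, output_weights):
    # Processing (hidden) layer: each node takes a weighted sum of the inputs
    hidden = [sigmoid(sum(xi * wi for xi, wi in zip(x, w)))
              for w in hidden_weights]
    # Output layer: combines the hidden activations the same way
    return [sigmoid(sum(hi * wi for hi, wi in zip(hidden, w)))
            for w in output_weights]

# Two inputs -> two hidden nodes -> one output
y = forward([1.0, 0.5],
            hidden_weights=[[0.2, -0.4], [0.7, 0.1]],
            output_weights=[[0.6, -0.3]])
print(y)
```

Training would adjust the weights to fit data; this sketch only shows the forward flow of signals.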
TYPES OF NEURAL NETWORKS

1. Convolutional Neural Network
A convolutional neural network is one adapted for analyzing and identifying visual data such as digital images or photographs.

2. Recurrent Neural Network
A recurrent neural network is one adapted for analyzing time series data, event history, or temporal ordering.

3. Deep Neural Network
Also known as a deep learning network, a deep neural network, at its most basic, is one that involves two or more processing layers.
APPLICATIONS OF NEURAL NETWORKS:
• Neural networks are broadly used, with applications for financial operations, enterprise planning,
trading, business analytics, and product maintenance. Neural networks have also gained widespread
adoption in business applications such as forecasting and marketing research solutions, fraud detection,
and risk assessment.

• A neural network evaluates price data and unearths opportunities for making trade decisions based on
the data analysis. The networks can distinguish subtle nonlinear interdependencies and patterns other
methods of technical analysis cannot. According to research, the accuracy of neural networks in making
price predictions for stocks differs. Some models predict the correct stock prices 50 to 60 percent of the
time, while others are accurate in 70 percent of all instances. Some have posited that a 10 percent
improvement in efficiency is all an investor can ask for from a neural network.
WHAT ARE ACTIVATION FUNCTIONS IN DEEP
LEARNING?
• In the last section, we learned that neurons receive input signals from the preceding layer of a neural
network. A weighted sum of these signals is fed into the neuron's activation function, then the activation
function's output is passed onto the next layer of the network.

• There are four main types of activation functions that we’ll discuss in this tutorial:

• Threshold functions
• Sigmoid functions
• Rectifier functions, or ReLUs
• Hyperbolic Tangent functions
THRESHOLD FUNCTIONS
• A threshold function computes a different output signal depending on whether its input lies above or below a certain threshold. Remember, the input value to an activation function is the weighted sum of the input values from the preceding layer in the neural network.
• Mathematically speaking, here is the formal definition of a deep learning threshold function: f(z) = 1 if z ≥ 0, and f(z) = 0 if z < 0.

• The threshold function is sometimes also called a unit step function.

• Threshold functions are similar to boolean variables in computer programming. Their computed value is
either 1 (similar to True) or 0 (equivalent to False).
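The boolean-like, unit step behavior described above can be sketched as follows (the threshold parameter `t` defaults to 0, matching the definition on this slide):

```python
def threshold(z, t=0.0):
    # Unit step: 1 (True-like) when the weighted sum reaches the
    # threshold, 0 (False-like) otherwise
    return 1 if z >= t else 0

print(threshold(0.3))   # 1
print(threshold(-0.3))  # 0
```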
THE SIGMOID FUNCTION
• The sigmoid function is well-known among the data science community because of its use in logistic regression, one of the core machine
learning techniques used to solve classification problems.

• The sigmoid function can accept any value, but always computes a value between 0 and 1.

• Here is the mathematical definition of the sigmoid function: σ(z) = 1 / (1 + e^(−z)).

• One benefit of the sigmoid function over the threshold function is that its curve is smooth. This means
it is possible to calculate derivatives at any point along the curve.
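A minimal sketch of the sigmoid and its derivative. Because the curve is smooth, the derivative exists at every point, and it happens to have the convenient closed form σ'(z) = σ(z)(1 − σ(z)):

```python
import math

def sigmoid(z):
    # Squashes any real input into the open interval (0, 1)
    return 1 / (1 + math.exp(-z))

def sigmoid_derivative(z):
    # Smoothness means the derivative exists everywhere:
    # sigma'(z) = sigma(z) * (1 - sigma(z))
    s = sigmoid(z)
    return s * (1 - s)

print(sigmoid(0))             # 0.5
print(sigmoid_derivative(0))  # 0.25
```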
THE RECTIFIER FUNCTION
• The rectifier function does not have the same smoothness property as the sigmoid function from the last
section. However, it is still very popular in the field of deep learning.
• The rectifier function is defined as follows:
• If the input value is less than 0, then the function outputs 0
• If not, the function outputs its input value
• Here is this concept expressed mathematically: f(z) = max(0, z).

• Rectifier functions are often called Rectified Linear Unit activation functions, or ReLUs for short.
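The two-case definition above amounts to a one-line function:

```python
def relu(z):
    # Outputs 0 for negative inputs; passes non-negative inputs
    # through unchanged, i.e. max(0, z)
    return max(0.0, z)

print(relu(-2.0))  # 0.0
print(relu(3.0))   # 3.0
```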
THE HYPERBOLIC TANGENT FUNCTION
• The hyperbolic tangent function is the only activation function included in this presentation that is based on a hyperbolic function.
• Its mathematical definition is: tanh(z) = (e^z − e^(−z)) / (e^z + e^(−z)).

• The hyperbolic tangent function is similar in appearance to the sigmoid function, but its output values range from −1 to 1 rather than from 0 to 1, so half of its range lies below zero.
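The definition above written out directly (Python's standard library also provides `math.tanh`, which computes the same function):

```python
import math

def tanh(z):
    # (e^z - e^-z) / (e^z + e^-z); equivalent to math.tanh(z)
    return (math.exp(z) - math.exp(-z)) / (math.exp(z) + math.exp(-z))

print(tanh(0.0))  # 0.0
```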
THANK YOU
