Fundamentals of Neural Networks

What is a Neural Net?

• A neural net is an artificial representation of the human brain that tries to simulate its learning process. An artificial neural network (ANN) is often called a "neural network" or simply a neural net (NN).
• Traditionally, the term neural network referred to a network of
biological neurons in the nervous system that process and transmit
information.
• An artificial neural network is an interconnected group of artificial
neurons that uses a mathematical or computational model for
information processing, based on a connectionist approach to
computation.
• Artificial neural networks are made of interconnected artificial
neurons, which may share some properties of biological neural
networks.
• An artificial neural network is a network of simple processing
elements (neurons) that can exhibit complex global behavior,
determined by the connections between the processing elements
and by the element parameters.
Neural computers mimic certain processing capabilities of the
human brain.
- Neural computing is an information-processing paradigm, inspired
by biological systems, composed of a large number of highly
interconnected processing elements (neurons) working in unison to
solve specific problems.
- Artificial Neural Networks (ANNs), like people, learn by example.
- An ANN is configured for a specific application, such as pattern
recognition or data classification, through a learning process.
- Learning in biological systems involves adjustments to the synaptic
connections that exist between the neurons. This is true of ANNs as
well.
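The ideas above — learning by example, and learning as adjustment of connections — can be sketched with a single artificial neuron. This is a minimal illustration, not a method from the text: the names (`step`, `train`), the AND task, and the learning rate are all assumptions chosen for brevity.

```python
def step(x):
    """Threshold activation: the neuron fires (1) if its weighted sum is positive."""
    return 1 if x > 0 else 0

def train(samples, epochs=20, lr=0.1):
    """Perceptron learning rule: nudge connection weights toward correct outputs."""
    w = [0.0, 0.0]   # connection strengths ("synaptic weights")
    b = 0.0          # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            y = step(w[0] * x1 + w[1] * x2 + b)
            err = target - y
            # Adjust each connection in proportion to its input,
            # mimicking synaptic adjustment in biological learning.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learn the logical AND from examples rather than an explicit program.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(data)
```

After training, the adjusted weights reproduce all four AND examples; no rule for AND was ever programmed in, which is the sense in which "ANNs, like people, learn by example".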

Why Neural Networks


Neural networks follow a different paradigm for computing.
■ Conventional computers are good at fast arithmetic and at doing
exactly what the programmer asks them to do.
■ Conventional computers are not so good at interacting with
noisy data or data from the environment, massive parallelism, fault
tolerance, and adapting to circumstances.
■ Neural network systems help where we cannot formulate an
algorithmic solution, or where we can get lots of examples of the
behavior we require.
■ Von Neumann machines are based on the processing/memory
abstraction of human information processing; neural networks
are based on the parallel architecture of biological brains.
■ Neural networks are a form of multiprocessor computer system,
with - simple processing elements, - a high degree of
interconnection, - simple scalar messages, and - adaptive interaction
between elements.
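The last bullet — simple elements, dense interconnection, scalar messages — can be sketched as a tiny feedforward pass. The weights, layer sizes, and function names here are illustrative assumptions, not values from the text:

```python
import math

def neuron(inputs, weights, bias):
    """One simple processing element: weighted sum plus sigmoid squashing."""
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))   # a single scalar message passed on

def layer(inputs, weight_rows, biases):
    """Elements within a layer are independent - they could run in parallel."""
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

# Two inputs -> two hidden elements -> one output element.
# Every element does the same simple job; complexity comes from the wiring.
hidden = layer([0.5, -1.0], [[0.8, 0.2], [-0.4, 0.9]], [0.0, 0.1])
output = layer(hidden, [[1.0, -1.0]], [0.0])
```

Each element is trivial on its own; the "complex global behavior" of the earlier definition emerges only from the pattern of connections and parameters.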

Genetic Algorithms

Genetic Algorithms (GAs) were invented by John Holland in the early
1970s to mimic some of the processes observed in natural evolution.
Later, in 1992, John Koza used GAs to evolve programs to perform
certain tasks; he called his method "Genetic Programming" (GP).
GAs simulate natural evolution, a combination of selection,
recombination, and mutation, to evolve a solution to a problem.
GAs simulate the survival of the fittest among individuals over
consecutive generations for solving a problem. Each generation
consists of a population of character strings that are analogous to the
chromosomes in our DNA (deoxyribonucleic acid). DNA contains the
genetic instructions used in the development and functioning of all
known living organisms.

What are Genetic Algorithms

■ Genetic Algorithms (GAs) are adaptive heuristic search algorithms
based on the evolutionary ideas of natural selection and genetics.
■ GAs are a part of evolutionary computing, a rapidly growing area of
artificial intelligence. GAs are inspired by Darwin's theory of
evolution: "survival of the fittest".
■ GAs represent an intelligent exploitation of a random search used
to solve optimization problems.
■ GAs, although randomized, exploit historical information to direct
the search into regions of better performance within the search space.
■ In nature, competition among individuals for scanty resources
results in the fittest individuals dominating over the weaker ones.
Why Genetic Algorithms

"Genetic Algorithms are good at taking large, potentially huge search
spaces and navigating them, looking for optimal combinations of
things, solutions you might not otherwise find in a lifetime."
- Salvatore Mangano, Computer Design, May 1995.
- A GA is better than conventional AI in that it is more robust.
- Unlike older AI systems, GAs do not break easily if the inputs
change slightly, or in the presence of reasonable noise.
- In searching a large state space, a multi-modal state space, or an
n-dimensional surface, a GA may offer significant benefits over more
typical search and optimization techniques, such as linear
programming, heuristic search, depth-first, and breadth-first search.
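The selection / recombination / mutation cycle described above can be sketched compactly. The fitness function here is an assumption for illustration: maximize the number of 1-bits in a 20-bit "chromosome" (the classic OneMax toy problem); population size, generation count, and probabilities are likewise illustrative.

```python
import random

random.seed(1)
N_BITS, POP, GENS = 20, 30, 60
PC, PM = 0.9, 0.02   # crossover and mutation probabilities

def fitness(chrom):
    """Survival of the fittest: more 1-bits means a fitter individual."""
    return sum(chrom)

def tournament(pop):
    """Selection: the fitter of two randomly drawn individuals wins."""
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    """Recombination: with probability PC, swap tails at a random cut point."""
    if random.random() < PC:
        cut = random.randint(1, N_BITS - 1)
        return p1[:cut] + p2[cut:]
    return p1[:]

def mutate(chrom):
    """Mutation: occasionally flip a bit, injecting new characteristics."""
    return [1 - g if random.random() < PM else g for g in chrom]

# Evolve a random population over consecutive generations.
pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP)]

best = max(pop, key=fitness)
```

Although every operator is randomized, tournament selection keeps feeding the historically better strings back into recombination, which is exactly the "intelligent exploitation of a random search" the text describes.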
Synergism of the GA-Fuzzy System Approach

At the starting stage, a high crossover probability and a low mutation
probability yield good results, because a large number of crossover
operations produce better chromosomes for a finite number of
generations; after that, the fitness value of each chromosome vector
becomes almost equal. Beyond this, the effect of crossover is
insignificant due to the little variation among the chromosome
vectors in that particular population. At later stages, increasing the
mutation rate of the chromosomes inculcates new characteristics
into the existing population and therefore diversifies it. The
philosophy behind varying Pc and Pm is therefore that the response
of the optimization procedure depends largely on the stage of
optimization: a high fitness value may require relatively low
crossover and high mutation probabilities for further improvement;
alternatively, at low fitness values the response would be better with
relatively high crossover and low mutation probabilities.

Schuster (1985) proposed heuristics for the optimal setting of the
mutation probability (Pm). Fogarty (1981) and Booker (1987)
investigated time dependencies of the mutation and crossover
probabilities, respectively. Grefenstette (1981) and Schaffer (1981)
found optimal settings for all these GA parameters by experiment.

In this work, a GA-Fuzzy approach is used in which the ranges of the
parameters, crossover probability (Pc) and mutation probability (Pm),
have been divided into LOW, MEDIUM, and HIGH membership
functions. The GA parameters (Pc and Pm) are varied based on the
fitness function values, as per the following logic: the value of the
best fitness for each generation (BF) is expected to change over a
number of generations, but if it does not change significantly over a
number of generations (UN), then this information is considered to
cause changes in both Pc and Pm. The diversity of a population is
one of the factors that influences the search for a true optimum. The
variance of the fitness values of the objective function (VF) of a
population is a measure of its diversity; hence, it is also considered
another factor on which both Pc and Pm may be changed. The
membership functions and membership values for these three
variables (BF, UN, and VF) are selected after several trials to obtain
optimum results.
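One way this adaptation could look in code is sketched below. This is not the work's tuned controller: the triangular membership breakpoints and the Pc/Pm values attached to each LOW/MEDIUM/HIGH label are assumed, and for brevity only BF (normalized to [0, 1]) drives the rules, whereas the text uses BF, UN, and VF together.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 at a and c, peaking at 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def memberships(bf):
    """Degrees to which the normalized best fitness BF is LOW / MEDIUM / HIGH."""
    return {"LOW": tri(bf, -0.5, 0.0, 0.5),
            "MEDIUM": tri(bf, 0.0, 0.5, 1.0),
            "HIGH": tri(bf, 0.5, 1.0, 1.5)}

def adapt(bf):
    """Rule base from the text: low fitness -> high Pc, low Pm;
    high fitness -> low Pc, high Pm (weighted-average defuzzification).
    The Pc/Pm values per label are illustrative assumptions."""
    m = memberships(bf)
    pc_for = {"LOW": 0.9, "MEDIUM": 0.7, "HIGH": 0.5}
    pm_for = {"LOW": 0.01, "MEDIUM": 0.05, "HIGH": 0.10}
    total = sum(m.values())
    pc = sum(m[k] * pc_for[k] for k in m) / total
    pm = sum(m[k] * pm_for[k] for k in m) / total
    return pc, pm

# Early search (low BF): crossover dominates; late search (high BF):
# mutation rises to diversify the near-converged population.
pc_lo, pm_lo = adapt(0.1)
pc_hi, pm_hi = adapt(0.9)
```

Because the memberships overlap, Pc and Pm shift smoothly between the regimes rather than switching abruptly, which matches the staged behavior the text argues for.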
