Lecture-25_3.pdf

───────────────

▪ An artificial Ant Colony System (ACS) is an agent-based system which simulates the natural
behavior of ants and develops mechanisms of cooperation and learning. • ACS was proposed
by Dorigo et al. (1997) as a new heuristic to solve combinatorial optimization problems.

▪ The main idea of ACO is to model a problem as the search for a minimum cost path in a
graph.

▪ Agents: ants • They communicate through stigmergy
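
To illustrate the "minimum-cost path in a graph" idea, here is a minimal ACO sketch for the travelling
salesman problem. It uses the basic Ant System pheromone model rather than Dorigo's full ACS; all
names and parameter values (alpha, beta, rho, q) are illustrative assumptions, and distances are assumed
to be positive.

import random

def aco_tsp(dist, n_ants=20, n_iters=100, alpha=1.0, beta=2.0, rho=0.5, q=1.0):
    # dist: symmetric matrix of positive city-to-city distances (list of lists)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]          # pheromone on each edge
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            tour = [random.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                choices = [j for j in range(n) if j not in tour]
                # probabilistic choice biased by pheromone (tau) and heuristic (1/dist)
                weights = [(tau[i][j] ** alpha) * ((1.0 / dist[i][j]) ** beta)
                           for j in choices]
                tour.append(random.choices(choices, weights)[0])
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # evaporation, then pheromone deposit proportional to tour quality
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1.0 - rho)
        for tour, length in tours:
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += q / length
                tau[j][i] += q / length
    return best_tour, best_len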

▪ Agents: Individuals that belong to a group (but are not necessarily identical)

▪ What is a Swarm? • They contribute to and benefit from the group • They can recognize,
communicate, and/or interact with each other • A swarm is better understood if thought of as
agents exhibiting a collective behavior • Termites swarm to build colonies • Birds swarm to find
food

▪ SI is an AI technique based on the collective behavior in decentralized, self-organized systems

▪ Examples of Swarms in Nature: • Classic Example: Swarm of Bees • Ant colony • Agents: ants
• Flock of birds • Agents: birds • Traffic • Agents: cars • Immune system • Agents: cells and
molecules

▪ Why Insects? • Insects have a few hundred brain cells • However, organized insects have been
known for: • Architectural marvels • Complex communication systems • Resistance to hazards in
nature

▪ Two Common SI Algorithms 1. Ant Colony Optimization • The study of artificial systems
modeled after the behavior of real ant colonies 2. Particle Swarm Optimization • A population-based
stochastic optimization technique • Inspiration: swarms of bees, etc.
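
A minimal PSO sketch, minimising a real-valued function; the function name, bounds, and the inertia
and acceleration coefficients (w, c1, c2) are illustrative assumptions, not values from the lecture.

import random

def pso(f, dim, n_particles=30, n_iters=200, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    # initialise particle positions, velocities, personal bests and global best
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]
    pbest_val = [f(xi) for xi in x]
    g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity update: inertia + cognitive pull + social pull
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (g[d] - x[i][d]))
                x[i][d] += v[i][d]
            val = f(x[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = x[i][:], val
                if val < f(g):
                    g = x[i][:]
    return g, f(g)

# example: minimise the sphere function in 3 dimensions
best, best_val = pso(lambda p: sum(t * t for t in p), dim=3)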

▪ Attributed features: • fast • good optimizer for real-valued optimisation • relatively large body of theory

▪ Special: • self-adaptation of (mutation) parameters is standard

▪ There are basically 4 types of ESs • 1. The Simple (1+1)-ES

▪ 2. The (μ+1)-ES

▪ 3. The (μ+λ)-ES

▪ 4. The (μ,λ)-ES

▪ Evolution Strategies • Based on the concept of the evolution of evolution • Developed:
Germany in the 1970's • Developed by: Rechenberg

▪ Evolution Strategies • ES considers both genotypic and phenotypic evolution • Emphasis is on
phenotypic behavior of individuals • Each individual is represented by its genotype and strategy
parameters • Both genotype and strategy parameters are evolved
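
A minimal sketch of a (μ+λ)-ES in which each individual carries a genotype x and a strategy
parameter σ, and both are evolved (lognormal self-adaptation of the step size). Function names,
bounds and the learning rate tau are illustrative assumptions.

import random, math

def es_mu_plus_lambda(f, dim, mu=5, lam=20, n_gens=200, sigma0=1.0):
    tau = 1.0 / math.sqrt(dim)                   # self-adaptation learning rate
    pop = [([random.uniform(-5, 5) for _ in range(dim)], sigma0) for _ in range(mu)]
    for _ in range(n_gens):
        offspring = []
        for _ in range(lam):
            x, sigma = random.choice(pop)
            # mutate the strategy parameter first, then the genotype with it
            new_sigma = sigma * math.exp(tau * random.gauss(0.0, 1.0))
            new_x = [xi + new_sigma * random.gauss(0.0, 1.0) for xi in x]
            offspring.append((new_x, new_sigma))
        # (mu+lambda) survivor selection: parents compete with offspring
        pop = sorted(pop + offspring, key=lambda ind: f(ind[0]))[:mu]
    return pop[0]

best_x, best_sigma = es_mu_plus_lambda(lambda p: sum(t * t for t in p), dim=5)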

▪ EP aimed at achieving intelligence • Intelligence was viewed as adaptive behaviour

▪ Mutated individuals are accepted only if they improve on the parent's fitness • Typically applied to: •
applications concerning shape optimization • numerical optimisation • continuous parameter
optimisation • computational fluid dynamics, etc.

▪ ESs are closer to Lamarckian evolution

▪ The differences between GA and ES are the representation and the survival selection mechanism

▪ EP algorithm is driven by two main evolutionary operators, namely mutation and selection.

▪ Mutation operators • a. Gaussian • b. Cauchy • c. Lévy • d. Exponential • e. Chaos •
f. Combined distributions
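
A small sketch of the two most common choices, Gaussian and Cauchy mutation, applied to a
real-valued genotype; the sigma/scale values are arbitrary assumptions.

import random, math

def gaussian_mutation(x, sigma=0.1):
    # classical EP mutation: add zero-mean Gaussian noise to every gene
    return [xi + random.gauss(0.0, sigma) for xi in x]

def cauchy_mutation(x, scale=0.1):
    # Cauchy noise (sampled via the inverse CDF) has heavier tails, so it makes
    # occasional large jumps that can help escape local optima
    return [xi + scale * math.tan(math.pi * (random.random() - 0.5)) for xi in x]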

▪ Once a fitness value is assigned to each individual, any of a number of selection methods can be used • Elitism •
Tournament selection • Proportional selection • Nonlinear ranking selection • etc.
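
For example, tournament selection can be sketched in a few lines (minimisation is assumed here;
the tournament size k is an arbitrary choice).

import random

def tournament_select(population, fitness, k=3):
    # pick k individuals at random and return the best of them
    contenders = random.sample(range(len(population)), k)
    winner = min(contenders, key=lambda i: fitness[i])
    return population[winner]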

▪ EP is one of the four major evolutionary algorithm paradigms • Developed: USA 1960s •
Invented by: Dr. Lawrence J. Fogel

▪ EP emphasizes the development of behavioral models and not genetic models

▪ EP is derived from the simulation of adaptive behavior in evolution • EP considers phenotypic
evolution • EP iteratively applies two evolutionary operators: - Variation through application of
mutation operators - Selection

▪ Main components • 1. Initialization • 2. Evaluation • 3. Mutation = the only source of
variation • 4. Selection = main purpose is to select the new population

▪ Categories of EP • Based on the characteristics of the scaling function, there are the following
categories of EP algorithms: • 1. Non-adaptive EP = the deviations in step sizes remain static •
2. Dynamic EP = the deviations in step sizes change over time • 3. Self-adaptive EP = the step
sizes are evolved and change dynamically

▪ Most common recombination • Exchange two randomly chosen subtrees among the parents •
Recombination has two parameters: • Probability Pc to choose recombination vs. mutation •
Probability to choose an internal point within each parent as crossover point
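
A minimal sketch of GP subtree crossover on trees stored as nested lists (e.g. ['+', ['*', 'x', 'x'], 1.0],
with element 0 being the operator). The helper names and the p_internal value are assumptions, the root
is never chosen as a crossover point, and both parents are assumed to contain at least one function node.

import random, copy

def all_paths(tree, path=()):
    # enumerate index paths to every node of the nested-list tree
    yield path
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):
            yield from all_paths(child, path + (i,))

def get_subtree(tree, path):
    for i in path:
        tree = tree[i]
    return tree

def set_subtree(tree, path, new):
    for i in path[:-1]:
        tree = tree[i]
    tree[path[-1]] = new

def subtree_crossover(p1, p2, p_internal=0.9):
    # swap two randomly chosen subtrees between copies of the parents;
    # internal (function) nodes are preferred with probability p_internal
    c1, c2 = copy.deepcopy(p1), copy.deepcopy(p2)
    def pick(tree):
        paths = [p for p in all_paths(tree) if p]            # exclude the root
        internal = [p for p in paths if isinstance(get_subtree(tree, p), list)]
        if internal and random.random() < p_internal:
            return random.choice(internal)
        return random.choice(paths)
    pt1, pt2 = pick(c1), pick(c2)
    s1, s2 = get_subtree(c1, pt1), get_subtree(c2, pt2)
    set_subtree(c1, pt1, s2)
    set_subtree(c2, pt2, s1)
    return c1, c2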

▪ Like in most other EAs, genetic operators in GP are applied to individuals that are
probabilistically selected based on fitness.

▪ The most commonly employed method for selecting individuals in GP is • Tournament
selection • followed by fitness-proportionate selection

▪ Roulette wheel selection

▪ Initialising the Population • Similar to other EAs, in GP the individuals in the initial population
are randomly generated. • There are a number of different approaches • The simplest
methods are • The Full & Grow methods and Ramped half-and-half.

▪ In the Full method nodes are taken at random from the function set until the maximum tree
depth is reached, and beyond that depth only terminals can be chosen.

▪ The Grow method allows for the creation of trees of varying size and shape.

▪ In Ramped half-and-half method, half the initial population is constructed using Full and half is
constructed using Grow. • This is done using a range of depth limits, hence the term “ramped”
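
A minimal sketch of Full, Grow and Ramped half-and-half initialisation for trees stored as nested
lists. The function/terminal sets, the 50% terminal probability in Grow, and the depth range are
illustrative assumptions.

import random

FUNCTIONS = {'+': 2, '-': 2, '*': 2}      # operator -> arity (assumed set)
TERMINALS = ['x', 'y', 1.0]

def gen_tree(max_depth, method):
    # Full: only functions until max_depth, then terminals.
    # Grow: a terminal may be chosen at any level below the maximum depth.
    if max_depth == 0 or (method == 'grow' and random.random() < 0.5):
        return random.choice(TERMINALS)
    op = random.choice(list(FUNCTIONS))
    return [op] + [gen_tree(max_depth - 1, method) for _ in range(FUNCTIONS[op])]

def ramped_half_and_half(pop_size, min_depth=2, max_depth=6):
    depths = list(range(min_depth, max_depth + 1))
    pop = []
    for i in range(pop_size):
        depth = depths[i % len(depths)]            # "ramped" range of depth limits
        method = 'full' if i % 2 == 0 else 'grow'  # half Full, half Grow
        pop.append(gen_tree(depth, method))
    return pop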

▪ GP evolves computer programs, traditionally represented in memory as tree structures. • Trees
can be easily evaluated in a recursive manner. • Every tree node has an operator function

▪ Every terminal node has an operand, making mathematical expressions easy to evolve and
evaluate.
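
A minimal sketch of the recursive evaluation of such a tree, using the same nested-list
representation assumed above; variable terminals are looked up in an environment dict.

import operator

OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

def evaluate(tree, env):
    # internal nodes are [op, child1, child2, ...]; terminals are names or numbers
    if isinstance(tree, list):
        args = [evaluate(child, env) for child in tree[1:]]
        return OPS[tree[0]](*args)
    if isinstance(tree, str):
        return env[tree]        # variable terminal
    return tree                 # numeric constant terminal

# e.g. evaluate(['+', ['*', 'x', 'x'], 1.0], {'x': 3.0}) -> 10.0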

▪ Non-tree representations have been suggested and successfully implemented, such as linear
genetic programming which suits the more traditional imperative languages. • Most non-tree
representations have structurally noneffective code (introns).

▪ Its impact has been lessened by Moore’s law and parallel computing

▪ Current challenges for GP include: • Economizing on GP resource usage • Ensuring better
quality results • Extracting more reliable convergence • Applying GP to challenging problem
domains

▪ Genetic Programming • Developed: USA in the 1990’s • Early names: J. Koza • Typically
applied to: machine learning tasks (prediction, classification)

▪ Genetic Programming • Attributed features: • competes with neural nets and the like • needs
huge populations (thousands) • slow • Special: • non-linear chromosomes: trees, graphs •
mutation possible but not necessary (disputed!)

▪ Automotive Design • GAs are used both to design composite materials and aerodynamic
shapes for race cars and regular means of transportation, and to return combinations of the best
materials that provide faster, lighter, more fuel-efficient vehicles.

▪ Robotics • GAs can be programmed to search for a range of optimal designs and components
for each specific use, or to return results for entirely new types of robots that can perform
multiple tasks and have more general application.

▪ Computer Gaming • Those who spend some of their time playing computer Sims games will
often find themselves playing against sophisticated AI GAs instead of against other human
players online

▪ Radiology • GAs have been applied for feature selection = to identify a region of interest in
mammograms • Gene Expression Profiling • GAs have been developed to make analysis of GEP
much quicker and easier.

▪ GA: Onemax problem • Algorithm • Produce an initial population • Find the fitness of all
individuals in the population • While (termination criterion is not reached) do • Select fitter individuals
as parents for reproduction • Crossover with probability pc • Mutation with probability pm •
Evaluate the fitness of the modified individuals • Generate a new population • End while
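
A minimal Python sketch of this algorithm for OneMax (maximise the number of 1s in a bit string).
The parameter values and the use of tournament selection are illustrative assumptions; the lecture's
outline does not fix a particular selection method.

import random

def onemax_ga(n_bits=30, pop_size=50, pc=0.9, pm=0.01, n_gens=100):
    fitness = lambda ind: sum(ind)
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(n_gens):
        new_pop = []
        while len(new_pop) < pop_size:
            # select fitter individuals as parents (tournament of size 3)
            p1 = max(random.sample(pop, 3), key=fitness)
            p2 = max(random.sample(pop, 3), key=fitness)
            # one-point crossover with probability pc
            if random.random() < pc:
                cut = random.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            # bit-flip mutation with probability pm per gene
            for child in (c1, c2):
                for i in range(n_bits):
                    if random.random() < pm:
                        child[i] = 1 - child[i]
            new_pop.extend([c1, c2])
        pop = new_pop[:pop_size]
        if max(map(fitness, pop)) == n_bits:     # termination criterion met
            break
    return max(pop, key=fitness)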

▪ Issues for GA Practitioners • Choosing basic implementation issues: • representation •
population size, mutation rate, ... • selection, deletion policies • crossover, mutation operators •
Termination criteria, etc.

▪ Benefits of GA • Concept is easy to understand • Modular, separate from application •
Supports multi-objective optimization • Good for “noisy” environments • Always an answer •
Inherently parallel; easily distributed

▪ Easy to exploit previous or alternate solutions • Flexible building blocks for hybrid applications
• Substantial history and range of use

▪ When to use GA • Alternate solutions are too slow or overly complicated • Need an exploratory
tool to examine new approaches • Problem is similar to one that has already been successfully
solved by using a GA • Want to hybridize with an existing solution

▪ Simple Genetic Algorithm { initialize population; evaluate population; while
TerminationCriteriaNotSatisfied { select parents for reproduction; perform recombination and
mutation; evaluate population; } }

▪ A genetic operator is an operator used in GA to guide the algorithm towards a solution to a
given problem

▪ There are three main types of operators (mutation, crossover and selection)

▪ They are used to create and maintain genetic diversity, and select between solutions

▪ Crossover operator • also known as recombination

▪ Mutation is fairly simple. • Just change the selected alleles based on what you feel is
necessary and move on.

▪ It gives preference to better solutions, allowing them to pass on their 'genes' to the next
generation.

▪ It may also simply pass the best solution from the current generation directly to the next without
being mutated; this is known as elitism or elitist selection
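
A small sketch of elitism: copy the best individuals unchanged into the next generation and fill the
rest with offspring. The fitness function and the make_offspring callback are hypothetical placeholders
for whatever evaluation and variation operators are in use.

def next_generation(pop, fitness, make_offspring, n_elite=1):
    # keep the n_elite best individuals unchanged (maximisation assumed)
    elite = sorted(pop, key=fitness, reverse=True)[:n_elite]
    return elite + [make_offspring(pop) for _ in range(len(pop) - n_elite)]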

▪ Elements of GAs • Encoding technique (gene, chromosome) • Initialization procedure
(creation) • Evaluation function (environment) • Selection of parents (reproduction) • Genetic
operators (mutation, recombination) • Parameter settings (practice and art)

▪ Popular and well-studied selection methods include roulette wheel selection and tournament
selection.
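
A minimal roulette wheel (fitness-proportionate) selection sketch; it assumes non-negative fitness
values and maximisation.

import random

def roulette_wheel_select(population, fitness_values):
    # probability of being picked is proportional to fitness
    total = sum(fitness_values)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitness_values):
        running += fit
        if running >= pick:
            return individual
    return population[-1]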

▪ GA as a multi-point optimization technique for multi-dimensional spaces.

▪ The size of the population is typically in the range from 20 to 200 or 300.

▪ Traditional methods explore 1, 2, or 3 points in the search space on each iteration.

▪ Traditional methods require a starting point to begin the optimization.

▪ If a standard GA is used to optimize a function of continuous variables, it does not operate on the
continuous values directly

▪ Instead it uses "mapping" to code each continuous variable into an internal binary string of
fixed length.
▪ Such a mapping transforms the entire range of a continuous variable into a limited set of
binary coded numbers • The bigger the binary string, the larger the search space.
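
A small sketch of such a mapping: decoding a fixed-length binary gene back into a real value in a
given range. The function name and the 8-bit example are illustrative assumptions.

def decode(bits, lo, hi):
    # map a list of 0/1 genes onto [lo, hi]; a longer string gives finer resolution
    as_int = int("".join(map(str, bits)), 2)
    max_int = 2 ** len(bits) - 1
    return lo + (hi - lo) * as_int / max_int

# e.g. decode([1, 0, 1, 1, 0, 1, 0, 1], -5.0, 5.0) maps an 8-bit gene into [-5, 5]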

▪ GA does not require any auxiliary information except the objective function values.

▪ GAs use probabilities in their operators.

▪ This nature of narrowing the search space as the search progresses is adaptive and is the
unique characteristic of Genetic Algorithms

▪ The method chosen for any particular case will depend on the character of the objective
function, the nature of the constraints, and the number of independent and dependent variables.

▪ The current literature identifies three main search methods: • Calculus-based, enumerative
and random

▪ These methods run into a number of difficulties when faced with complex problems.

▪ The major difficulty arises when one algorithm is applied to solve a number of different
problems. • This is because each classical method is designed to solve only a particular class of
problems efficiently

▪ Most classical methods do not have a global perspective and often converge to a locally
optimal solution.

▪ Inability to be used in a parallel computing environment

▪ Most classical algorithms are serial in nature

▪ A GA is a search technique used in computing to find true or approximate solutions to
optimization and search problems.

▪ Categorized as global search heuristics.

▪ inspired by evolutionary biology

▪ Uses concepts of “Natural Selection” and “Genetic Inheritance” • Provide efficient, effective
techniques for optimization and machine learning applications • Developed by John Holland, his
colleagues, and students at the University of Michigan (1970’s)

▪ A typical genetic algorithm requires two things to be defined: • 1. a genetic representation of
the solution domain, and • 2. a fitness function to evaluate the solution domain.
▪ Chromosomes could be represented as: • Bit strings (0101 ... 1100) • Real numbers (43.2
-33.1 ... 0.0 89.2) • Permutations of elements (E11 E3 E7 ... E1 E15) • Lists of rules (R1 R2 R3 ...
R22 R23) • Program elements (genetic programming) • ... any data structure
