
Animal Learning

Animal behaviour
What is learning?
• A positive change in behavior
• A change in behavior as a result of experiences that one has had
• Learning is the cognitive acquisition of knowledge or skills.
• In this lesson we will look at learning from an individual's point of view.
• How do animals learn?
What is learning?
• Learning refers to a relatively permanent change in behavior as a result of
experience (Shettleworth, 1998)
• A relatively permanent change?
• Consider this: a rat that hasn't eaten for 24 hours is more likely to eat
dinner than a rat that has just eaten.
• So are we to say that the rat foraging because it hasn't eaten in a while
has learned, simply because its behavior differs from that of the rat that
has just eaten?
The definition of learning (a relatively permanent change in behavior as a result of
experience) suggests an interesting relationship between learning and what
evolutionary ecologists refer to as phenotypic plasticity.
Phenotype is typically defined as an observable characteristic of an organism.
Phenotypic plasticity is broadly defined as the ability of an organism to produce
different phenotypes depending on environmental conditions.
• Consider the invertebrate bryozoan Membranipora membranacea, which lives in
colonies. Under ordinary conditions, these colonies lack the spines that serve
as an antipredator defense mechanism.
• However, individuals grow spines relatively quickly when exposed to predatory
cues.
Phenotypic Plasticity
• Inducible defenses. In some bryozoans, like Membranipora membranacea,
colonies produce spines when predators are present. (A) Spines protruding
from a colony as a defense against predators (arrows point to spines), and
(B) a colony of Membranipora membranacea. (Photo credits: © Ken Lucas/Visuals
Unlimited; © Sue Daly/naturepl.com)
• If learning is “a relatively permanent change in behavior as a result of
experience,” then it is one type of phenotypic plasticity, because behavior
is part of an organism's phenotype.
• So all learning is a type of phenotypic plasticity, but not all phenotypic
plasticity involves learning.
• To see why, consider the “flushing” behavior often seen in foraging
birds. While searching for food, some birds may move their tails and
wings in a way that flushes insects out from cover—insects that the bird
then eats. In the painted redstart (Myioborus pictus), for example, when
individuals are under branches, they increase their wing and feather
motion and flush insects from the overhanging branches.
• Do birds know that increasing wing motion will yield more insects, or
• is this a genetically encoded response?
Phenotypic plasticity
• What scientists found was that the same increase
in wing flapping also occurs in the laboratory, even
when the birds are not rewarded for such
behavior:
• Juvenile birds under branches also started flapping more, even when they
got no food for doing so (Jablonski et al., 2006).
• These results suggest that flushing insects under
branches does represent a case of phenotypic
plasticity—the ability of an organism to produce
different phenotypes depending on environmental
conditions (whether the birds are under trees or
not)— but it is not a case of learning.
• What is phenotypic plasticity?
• Why is learning a type of phenotypic plasticity?
How do animals learn?

• Heyes (1994) believes that the question of how animals learn can be tied
to why animals learn. She notes that there are three commonly recognized
types of experience that can lead to learning: single stimulus, stimulus-
stimulus, and response-reinforcer. Each of these facilitates certain forms
of learning.
Learning from a single-stimulus experience

• Suppose we put a blue stick in a rat's cage numerous times a day.
• If, over time, the rats become more likely to turn their heads in the
direction of the blue stick (that is, if they become more sensitive to the
stimulus with time), sensitization has occurred.
• Conversely, if, over time, the animals become less likely to turn their
heads, habituation is said to have taken place.
• Sensitization and habituation are two simple single-stimulus forms of
learning.
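The contrast between these two single-stimulus processes can be sketched as a toy simulation. This is only an illustration: the function name, the decay/growth rates, and the trial counts are invented here, not measured values from the text.

```python
def response_strength(trial, rate, mode):
    """Orienting response (0 to 1) after `trial` prior stimulus presentations.

    mode="habituation": the response decays toward 0 with repetition.
    mode="sensitization": the response grows toward 1 with repetition.
    """
    if mode == "habituation":
        return (1 - rate) ** trial            # exponential decay
    if mode == "sensitization":
        return 1 - (1 - rate) ** (trial + 1)  # saturating growth
    raise ValueError(mode)

# Repeated presentations of the same blue stick: the response falls
# (habituation) or rises (sensitization) across trials.
habituated = [round(response_strength(t, 0.3, "habituation"), 2) for t in range(5)]
sensitized = [round(response_strength(t, 0.3, "sensitization"), 2) for t in range(5)]
print(habituated)  # decreasing across trials
print(sensitized)  # increasing across trials
```

The key point the sketch captures is that both processes are driven by the same experience (repeated exposure to one stimulus); only the direction of the behavioral change differs.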
PAVLOVIAN (CLASSICAL) CONDITIONING

• Suppose that rather than giving a rat a single stimulus like the blue stick, from
the start we pair this stimulus with a second stimulus, let’s say the odor of a cat
—an odor that rats fear, even when they are exposed to it for the first time.
• Let’s imagine that five seconds after the blue stick is in place, we spray the odor
of a cat into one corner of the cage.
• If the rat subsequently learns to pair stimulus 1 (blue stick) with stimulus 2 (cat
odor) and responds to the blue stick by climbing under the chip shavings (a safer
location) in its cage as soon as it appears, but before the odor is sprayed in, we
have designed an experiment in Pavlovian or classical conditioning.
• A conditioned stimulus (CS) is defined as a stimulus that initially
fails to elicit a particular response but comes to do so when it
becomes associated with a second (unconditioned) stimulus.
• In our rat example, the blue stick is the conditioned stimulus, as
initially the rat will have no inherent fear of it. The
unconditioned stimulus (US) is a stimulus that elicits a vigorous
response in the absence of training.
• In our rat example, the US would be the cat odor, which
inherently causes a fear response in rats. Once the rat has
learned to hide after the blue stick (CS) alone is in place, we can
speak of its hiding as being a conditioned response (CR) to the
presence of the blue stick.
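This kind of CS-US association is commonly formalized with the Rescorla-Wagner learning rule. The rule itself is standard in learning theory, but it is not part of the text above, and the parameter values in this sketch are invented:

```python
def pair_cs_us(trials, alpha=0.3, lam=1.0):
    """Associative strength of the CS (blue stick) after each CS-US pairing.

    alpha: learning rate; lam: maximum strength the US (cat odor) can support.
    """
    v = 0.0
    history = []
    for _ in range(trials):
        v += alpha * (lam - v)  # update in proportion to the "surprise" (lam - v)
        history.append(round(v, 3))
    return history

# Strength climbs toward lam: the blue stick alone comes to predict danger,
# at which point hiding is a conditioned response (CR).
print(pair_cs_us(6))
```

Learning is fastest on early trials, when the US is most surprising, and slows as the CS comes to fully predict it.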
Important terms in Pavlovian conditioning
• Any stimulus that is considered positive, pleasant, or rewarding is referred to as an
appetitive stimulus (e.g. food, the presence of a potential mate, etc.)
• Any stimulus that is unpleasant—shock, noxious odors, and so forth—is labeled an
aversive stimulus.
• When the first event (placement of the blue stick) in a conditioning experiment
predicts the occurrence of the second event (cat odor), there is a positive
relationship between events.
• Conversely, if the first event predicts that the second event will not occur—imagine
that a blue stick is always followed by not feeding an animal at its normal feeding
time— there is a negative relationship.
• Positive relationships—for example, blue stick predicts cat odor—produce excitatory
conditioning. Negative relationships—for example, blue stick leads to no food at a
time when food is usually presented—produce inhibitory conditioning.
• Pavlovian conditioning experiments can become complicated when
second-order conditioning is added. In second-order conditioning,
once a conditioned response (CR) has been learned by pairing the US and
CS1, a new stimulus is presented before CS1; if the new stimulus itself
eventually elicits the conditioned response, then it has become a
conditioned stimulus (CS2). In our case, a rat that has learned to pair
the blue stick (CS1) with danger might now see a yellow light (CS2)
preceding the appearance of the blue stick. Once the rat has learned to
pair the yellow light (CS2) with the danger associated with cat odor,
second-order Pavlovian conditioning has occurred.
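Under the same kind of associative update used above, second-order conditioning can be sketched as two phases: the stick first acquires strength from the cat odor, then the light acquires strength from the stick. All function names and numbers here are hypothetical:

```python
def condition(v, alpha, support):
    """One conditioning trial: move strength v toward whatever `support` the pairing offers."""
    return v + alpha * (support - v)

v_stick = 0.0
for _ in range(10):              # phase 1: blue stick (CS1) paired with cat odor (US)
    v_stick = condition(v_stick, 0.3, 1.0)

v_light = 0.0
for _ in range(10):              # phase 2: yellow light (CS2) paired with the stick, no US
    v_light = condition(v_light, 0.3, v_stick)

print(round(v_stick, 2), round(v_light, 2))  # the light now also elicits hiding
```

Note that in phase 2 the light never co-occurs with the cat odor itself; its strength is bounded by what the stick has already acquired, which is why second-order responses are typically weaker than first-order ones.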
INSTRUMENTAL (OPERANT) CONDITIONING
• Instrumental conditioning, also known as operant or goal-directed learning, occurs when the
response that is made by an animal is reinforced (increased) by the presentation of a reward or
the termination of an aversive stimulus, or when the response is suppressed (decreased) by the
presentation of an aversive stimulus or the termination of a reward.
• In instrumental learning, the animal must undertake some action or response in order for the
conditioning process to produce learning.
• The classic example of instrumental learning is a rat pressing some sort of lever (that is, taking
an action) to get food to drop into its cage. Rats associate pressing on the lever (response) with
some probability of getting food (outcome) and learn this task.
• The earliest work on instrumental learning was that of Edward Thorndike and involved testing
how quickly cats could learn to escape from “puzzle boxes” that Thorndike had constructed.
Thorndike postulated the law of effect, which states that if a response in the presence of a
stimulus is followed by a positive event, the association between the stimulus and the response
will be strengthened. Conversely, if the response is followed by an aversive event, the
association will be weakened.
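The law of effect can be illustrated with a toy simulation in which responses followed by a reward are strengthened and therefore chosen more often. The response names, strength values, and reward size are all invented for the sketch:

```python
import random

def law_of_effect(trials, rewarded="press_lever", seed=0):
    """Choose responses in proportion to their strength; reward strengthens the S-R bond."""
    rng = random.Random(seed)
    strength = {"press_lever": 1.0, "groom": 1.0, "sniff": 1.0}
    choices = []
    for _ in range(trials):
        total = sum(strength.values())
        r = rng.uniform(0, total)
        for response, s in strength.items():  # roulette-wheel selection by strength
            r -= s
            if r <= 0:
                break
        if response == rewarded:
            strength[response] += 0.5         # a positive event strengthens the response
        choices.append(response)
    return choices

choices = law_of_effect(200)
print(choices[-20:].count("press_lever"))  # lever pressing dominates late trials
```

Early on, all three responses are equally likely; because only lever pressing is ever followed by food, its strength grows and it crowds out the alternatives, which is exactly the strengthening Thorndike described.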
• Work in instrumental learning was revolutionized by B. F.
Skinner, who devised what is now known as a Skinner box
(Skinner, 1938). His idea was to create a continuous
measure of behavior that could somehow be divided into
meaningful units. When a rat pushes down on a lever, it is
making an operant response because the action changes
the rat’s environment by adding food to it (Figure 5.14).
Because “lever pushing” is a relatively unambiguous event
that is easily measurable, and because it occurs in an
environment over which the rat has control, the Skinner
box has facilitated the work of psychologists doing
research within the instrumental learning paradigm.
