Learning and Behavior 7th Edition Paul Chance Test Bank
1. A given reinforcement schedule tends to produce a distinctive pattern and rate of performance. These
are called schedule _______.
a. patterns
b. profiles
c. effects
d. matrixes
Ans: C Ref: 194
2. John spent his summer picking cantaloupes for a farmer. The farmer paid John a certain amount for
every basket of cantaloupes picked. John worked on a _________.
a. fixed ratio schedule
b. variable ratio schedule
c. fixed interval schedule
d. variable interval schedule
Ans: A Ref: 195
3. The schedule to use if you want to produce the most rapid learning of new behavior is _______.
a. CRF
b. FR 2
c. FI 3”
d. VI 3”
Ans: A Ref: 195
9. Bill spends his summer in the city panhandling. Every day he takes a position on a busy corner and
accosts passersby, saying, “Can you spare some change?” Most people ignore him, but every now and
then someone gives him money. Bill’s reinforcement schedule is best described as a _________.
a. fixed ratio schedule
b. variable ratio schedule
c. fixed interval schedule
d. variable interval schedule
Ans: B Ref: 198
10. Your text reports the case of a man who apparently made hundreds of harassing phone calls. The
man’s behavior was most likely on a(n) _________.
a. FR schedule
b. VR schedule
c. FI schedule
d. VI schedule
Ans: B Ref: 200
11. The schedule that is likely to produce a cumulative record with scallops is the _________.
a. FR schedule
b. VR schedule
c. FI schedule
d. VI schedule
Ans: C Ref: 200
12. A pigeon is confronted with two disks, one green, the other red. The bird receives food on a VI 20"
schedule when it pecks the green disk, and on a VI 10" schedule when it pecks the red one. You
predict that the bird will peck _________.
a. one disk about as often as the other
b. the green disk almost exclusively
c. the green disk about twice as often as the red disk
d. the red disk about twice as often as the green disk
Ans: D Ref: 203
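A sketch of the rationale for item 12, using the proportional form of the matching law (the fraction notation below is an assumption based on standard usage, not taken from the item itself):

```latex
% Relative response rates match relative reinforcement rates.
% VI 10" (red) delivers about 1 reinforcer per 10 s; VI 20" (green), 1 per 20 s.
\frac{B_{\text{red}}}{B_{\text{green}}} = \frac{r_{\text{red}}}{r_{\text{green}}}
  = \frac{1/10}{1/20} = 2
```

Hence the bird should peck the red disk about twice as often as the green one.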
15. The reappearance of previously effective behavior during extinction is called ____________.
a. spontaneous recovery
b. recovery
c. resurgence
d. fulfillment
Ans: C Ref: 205
16. Williams found that the greater the number of reinforcements before extinction, the _______.
a. greater the number of responses during extinction
b. faster the rate of extinction
c. stronger the response during extinction
d. greater the frustration during extinction
Ans: A Ref: 207
17. In a _____ schedule, reinforcement is contingent on the continuous performance of a behavior for
some period of time.
a. fixed duration
b. continuous reinforcement
c. fixed time
d. DRH
Ans: A Ref: 208
21. A schedule that does not require the performance of a particular behavior is the _________.
a. FT schedule
b. FD schedule
c. FI schedule
d. FR schedule
Ans: A Ref: 209
22. Harry spent his summer in the city panhandling. Every day he would sit on the sidewalk, put a
cardboard sign in front of him that said, "Please help," and place his hat on the sidewalk upside down.
Then he would wait. Every now and then someone would put money into his hat. Harry's
reinforcement schedule is best described as a _________.
a. fixed ratio schedule
b. variable ratio schedule
c. fixed interval schedule
d. variable time schedule
Ans: D Ref: 209
24. ___________ schedules differ from other schedules in that the rules describing the contingencies
change systematically.
a. Adaptive
b. Evolutionary
c. Progressive
d. Idiosyncratic
Ans: C Ref: 209
25. __________ refers to the point at which a behavior stops or its rate falls off sharply.
a. Block
b. Border time
c. Break point
d. Camel’s back
Ans: C Ref: 210
27. Things are going pretty well for George (see item 26) until he jumps from reinforcing every tenth
response to reinforcing every 50th response. At this point, the pigeon responds erratically and nearly
stops responding entirely. George's pigeon is suffering from _________.
a. ratio strain
b. ratiocination
c. satiation
d. reinforcer deprivation
Ans: A Ref: 211
29. Shirley trains a rat to press a lever and then reinforces lever presses on an FR 10 schedule when a red
light is on, and an FI 10" schedule when a green light is on. In this case, lever pressing is on a
_________.
a. multiple schedule
b. chain schedule
c. concurrent schedule
d. redundant schedule
Ans: A Ref: 212
30. A schedule in which reinforcement is contingent on the behavior of more than one subject is a
_________.
a. multiple schedule
b. mixed schedule
c. tandem schedule
d. cooperative schedule
Ans: D Ref: 213
34. The explanation of the PRE that puts greatest emphasis on internal cues is the ________ hypothesis.
a. discrimination
b. frustration
c. sequential
d. response unit
Ans: B Ref: 217
35. One explanation for the PRE implies that the effect is really an illusion. This is the _________.
a. discrimination hypothesis
b. frustration hypothesis
c. sequential hypothesis
d. response unit hypothesis
Ans: D Ref: 218
37. In one form of the matching law, B_A stands for the behavior under consideration and B_0 represents
_______.
a. reinforcement for B_A
b. the baseline rate of B_A
c. all behaviors other than B_A
d. all behavior that is over expectation
Ans: C Ref: 222
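For reference, one common proportional form of the matching law, written with the item's symbols (the reinforcement-rate symbols r_A and r_0 are an assumption based on standard notation):

```latex
% B_A: rate of the behavior under consideration; B_0: rate of all other behavior.
% r_A, r_0: the corresponding reinforcement rates.
\frac{B_A}{B_A + B_0} = \frac{r_A}{r_A + r_0}
```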
True/False
40. Although important, the matching law is restricted to a narrow range of species, responses,
reinforcers, and reinforcement schedules. F (224)
41. In VI schedules, the reinforcer occurs periodically regardless of what the organism does. F (203)
42. One everyday example of a VR schedule is the lottery. T (198, inferred)
43. In a multiple schedule, the organism is forced to choose between two or more reinforcement
schedules. F (212)
44. When a response is placed on extinction, there is often an increase in emotional behavior. T (205)
45. When food is the reinforcer, it is possible to stretch the ratio to the point at which an animal expends
more energy than it receives. T (211)
46. One difference between FT and FI schedules is that in FT schedules, reinforcement is not contingent
on a behavior. T (209)
47. The thinner of two schedules, VR 5 and VR 10, is VR 10. T (198)
48. Harlan Lane and Paul Shinkman put a college student’s behavior on extinction following VI
reinforcement. The student performed the behavior 8,000 times without reinforcement. T (215)
49. The response unit hypothesis suggests that there really is no such thing as the partial reinforcement
effect. T (219)
50. One effect of the extinction procedure is an increase in the variability of behavior. T (205)
51. The more effort a behavior requires, the fewer times the behavior will be performed during extinction.
T (207)
52. Extinction often increases the frequency of emotional behavior. T (205)
53. Extinction often increases the variability of behavior. T (205)
Completion
54. The rule describing the delivery of reinforcement is called a ________of reinforcement.
Ans: schedule (194)
55. CRF stands for ________. Ans: continuous reinforcement (195)
56. The explanation of the PRE that puts greatest emphasis on internal cues is the ________ hypothesis.
Ans: frustration (217)
57. In CRF, the ratio of reinforcers to responses is 1 to 1; in FR 1, the ratio is _______. Ans: 1 to 1 (196)
58. Choice involves ________ schedules. Ans: concurrent (214)
59. When behavior is on a FR schedule, animals often discontinue working briefly following
reinforcement. These periods are called ________.
Ans: post-reinforcement pauses/pre-ratio pauses/between-ratio pauses (196)
60. The term ________ refers to the pattern and rate of performance produced by a particular
reinforcement schedule. Ans: schedule effects (194)
Short Essay
64. Explain why fatigue is not a good explanation for postreinforcement pauses. (197)
Answers should note that more demanding (fatiguing) schedules do not necessarily produce longer
pauses than less demanding schedules. Students might also argue that the fatigue explanation is
circular.
65. A teacher has a student who gives up at the first sign of difficulty. How can the teacher increase the
child’s persistence? (210)
This is essentially the same question as review question 4. Answers should make use of stretching the
ratio.
66. In a tandem schedule, behavior is performed during a series of schedules, but food reinforcement
comes only at the end of the last schedule. What reinforces behavior during the earlier schedules
when food is not provided? (213)
It could be argued that the food reinforcement reinforces all performances. However, students
should mention that each change from one schedule to the next brings the subject closer to food
reinforcement and may therefore be reinforcing.
67. A rat’s lever pressing is on a concurrent VI 5” VI 15” schedule. Describe the rat’s behavior. (221)
Students should indicate that for every lever press on the VI 15” schedule, there will be about three
responses on the VI 5” schedule.
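The 3-to-1 figure in the answer to item 67 can be checked with the matching relation (a sketch; the notation is assumed, not from the text):

```latex
% VI 5" yields about 1 reinforcer per 5 s; VI 15", about 1 per 15 s.
\frac{B_{\text{VI 5''}}}{B_{\text{VI 15''}}} = \frac{1/5}{1/15} = 3
```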
68. How might you use what you know about reinforcement schedules to study the effects of air
temperature on behavior? (inferred, but see 225; figure 7-12)
Answers should indicate an understanding of the value of schedule-induced steady rates to study the
effects of independent variables, such as air temperature.
This is review question 16, rephrased. Students might discuss the use of schedules in defining and
studying personality characteristics such as laziness, the effects of drugs and other variables on
behavior, and their use in studying extinction effects and other basic phenomena.