Chapter 6 Reinforcement Schedules: Experimental Analyses and Applications
1. In a cumulative record, a straight diagonal line, going up from left to right, indicates
a. no responding
b. an increasing response rate
c. a steady response rate
d. a decreasing response rate
Answer: c
Rationale:
A straight diagonal line on a cumulative record represents a steady response rate because it
indicates consistent responding over time without significant increases or decreases in
behavior.
2. Playing the lottery is an example of behavior maintained by a
a. fixed ratio schedule
b. variable ratio schedule
c. variable-interval schedule
d. concurrent schedule
Answer: b
Rationale:
Playing the lottery is reinforced on a variable ratio schedule because the reinforcement
(winning) is delivered after an unpredictable number of responses (ticket purchases), which
typically leads to high and persistent rates of responding.
3. If a parent gives a child a special privilege every time the child successfully completes five
homework assignments, this is an example of a
a. fixed ratio schedule
b. variable ratio schedule
c. fixed-interval schedule
d. variable-interval schedule
Answer: a
Rationale:
This scenario illustrates a fixed ratio schedule because reinforcement (the special privilege) is
delivered after a fixed number of responses (completing five homework assignments).
4. In a fixed interval 60-second schedule, the delivery of a reinforcer
a. depends only on the passage of time
b. occurs only if the subject responds rapidly
c. requires at least one response
d. always occurs precisely 60 seconds after the last reinforcer
Answer: c
Rationale:
In a fixed interval schedule, reinforcement is delivered for the first response that occurs after
a fixed interval of time, regardless of how many responses occur during that time. Thus, at
least one response is required to obtain reinforcement.
5. In a random ratio 60 schedule, the maximum number of responses that might be required
for reinforcement is
a. 60
b. 120
c. 180
d. not predictable
Answer: d
Rationale:
In a random ratio schedule, the number of responses required for reinforcement varies
unpredictably. Therefore, there is no fixed maximum number of responses that might be
required for reinforcement.
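The unpredictability of a random ratio schedule can be made concrete with a short simulation. The sketch below is illustrative only (it is not from the text); it models a random ratio 60 schedule as each response being reinforced independently with probability 1/60, so the required count is geometrically distributed with mean 60 and no upper bound:

```python
import random

def responses_for_reinforcer(rng, p=1 / 60):
    """One random-ratio trial: each response is reinforced with
    probability p, so the number of responses required follows a
    geometric distribution with mean 1/p and no upper bound."""
    count = 1
    while rng.random() >= p:
        count += 1
    return count

rng = random.Random(42)
runs = [responses_for_reinforcer(rng) for _ in range(10_000)]
mean = sum(runs) / len(runs)
print(f"mean={mean:.1f}, min={min(runs)}, max={max(runs)}")
```

Over many trials the mean hovers near 60, but individual trials routinely require far more than 180 responses, which is why no fixed maximum can be stated.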
6. An accelerating pattern of responding is most typical of behavior on a
a. fixed ratio schedule
b. variable ratio schedule
c. fixed-interval schedule
d. variable-interval schedule
Answer: c
Rationale:
An accelerating pattern of responding, characterized by a gradual increase in response rate as
the reinforcement time approaches, is most typical of behavior on a fixed-interval schedule.
7. A stop-and-go pattern of responding is most typical of performance on a
a. fixed ratio schedule
b. variable ratio schedule
c. fixed-interval schedule
d. variable-interval schedule
Answer: a
Rationale:
A stop-and-go pattern of responding, characterized by bursts of responding followed by
pauses, is most typical of performance on a fixed ratio schedule where reinforcement is
delivered after a fixed number of responses.
8. If a telemarketer makes many calls and occasionally gets someone to buy his product, this
is an example of a
a. fixed ratio schedule
b. variable ratio schedule
c. fixed-interval schedule
d. variable-interval schedule
Answer: b
Rationale:
The behavior of the telemarketer is reinforced on a variable ratio schedule because
reinforcement (making a sale) is delivered after an unpredictable number of responses (calls),
which typically leads to a high and persistent rate of calling behavior.
9. Rapid, steady responding is most typical of performance on a
a. fixed ratio schedule
b. variable ratio schedule
c. fixed-interval schedule
d. variable-interval schedule
Answer: b
Rationale:
Rapid, steady responding is most typical of behavior on a variable ratio schedule, where
reinforcement is delivered after an unpredictable number of responses, leading to high and
consistent response rates.
10. If a person working at her computer keeps checking her email and occasionally gets a
message from a friend, this is similar to a
a. fixed ratio schedule
b. variable ratio schedule
c. fixed-interval schedule
d. variable-interval schedule
Answer: d
Rationale:
This scenario is similar to behavior maintained on a variable-interval schedule, where
reinforcement (receiving an email) is delivered after varying intervals of time, leading to a
steady rate of checking behavior.
11. A chained schedule of reinforcement always makes use of
a. two or more distinctly different responses
b. two or more distinctly different discriminative stimuli
c. two or more primary reinforcers
d. all of the above
Answer: b
Rationale:
In a chained schedule, two or more component schedules operate in sequence, each signaled by its own discriminative stimulus, and only the final component ends with the primary reinforcer. The response can be the same in every link, and only one primary reinforcer is needed, so the defining feature is the use of two or more distinctly different discriminative stimuli.
12. Extinction is usually slowest after reinforcement on a
a. continuous reinforcement schedule
b. fixed ratio schedule
c. variable ratio schedule
d. fixed interval schedule
Answer: c
Rationale:
Extinction is slowest after reinforcement on a variable ratio schedule. During training, reinforcement follows an unpredictable number of responses, so long runs of unreinforced responding are common; the conditions of extinction therefore closely resemble the conditions of training, and the subject continues responding at a high rate long after reinforcement has stopped.
13. According to the generalization decrement hypothesis, extinction after partial
reinforcement is slow because
a. the animal continues to respond out of frustration
b. the conditions of extinction are fairly similar to conditions in which the animal has been
reinforced in the past
c. the animal cannot discriminate any change from the previous reinforcement condition
d. all of the above
Answer: b
Rationale:
The generalization decrement hypothesis suggests that during extinction, animals respond
less when the conditions are significantly different from those present during reinforcement.
Partial reinforcement schedules involve intermittent reinforcement, making the extinction
conditions more similar to those of reinforcement compared to continuous reinforcement
schedules.
14. Performance on a reinforcement schedule can be greatly affected by
a. the rate of reinforcement
b. the quality of reinforcement
c. the amount of effort required to make the response
d. all of the above
Answer: d
Rationale:
All of the listed factors can significantly impact performance on a reinforcement schedule.
The rate of reinforcement, the quality of reinforcement, and the effort required to make a
response all influence the effectiveness of reinforcement and subsequently affect behavior.
15. Behavioral momentum can be measured as a behavior’s
a. forcefulness
b. duration
c. rate of occurrence
d. resistance to change
Answer: d
Rationale:
Behavioral momentum refers to the resistance of a behavior to change. It can be measured by
observing how resistant a behavior is to disruption or alteration even when reinforcement
conditions change or are removed.
16. A behavior is likely to have strong momentum if
a. the reinforcement rate is high
b. the reinforcement rate is low
c. the behavior is disrupted by some external event
d. it is a contingency-shaped behavior
Answer: a
Rationale:
A behavior is likely to have strong momentum when the reinforcement rate is high because
frequent reinforcement strengthens the association between the behavior and the
reinforcement, making the behavior more resistant to change.
17. If a person’s performance on a fixed-interval schedule is controlled by rule-governed
behavior, the person
a. will wait until the interval has elapsed, then make just one response
b. will exhibit an accelerating response pattern
c. will respond as rapidly as possible
d. might exhibit any of the above patterns, depending on what rule they were following
Answer: d
Rationale:
Rule-governed behavior refers to behavior controlled by verbal or written instructions rather
than direct experience with reinforcement. Depending on the specific instructions given, a
person's performance on a fixed-interval schedule may vary, leading to any of the mentioned
patterns of behavior.
18. In a typical laboratory situation, the post-reinforcement pause on an FR schedule is
primarily a function of
a. fatigue
b. satiation
c. the number of responses remaining before the next reinforcer
d. none of the above
Answer: c
Rationale:
The post-reinforcement pause on a fixed-ratio (FR) schedule varies with the size of the upcoming ratio requirement: the more responses that remain before the next reinforcer, the longer the pause. Because pauses occur even after small ratios and early in a session, fatigue and satiation cannot be the primary cause.
19. Suppose a pigeon is trained on a multiple schedule consisting of FR 10 when the response
key is red and FR 100 when the response key is blue. If in the middle of one session the
pigeon experiences the sequence FR 10, FR 100, FR 10, we would expect to see the longest
post-reinforcement pause
a. before the first FR 10
b. before the FR 100
c. after the FR 100
d. after the second FR 10
Answer: b
Rationale:
In a multiple schedule, each component schedule is signaled by a distinctive stimulus. Post-reinforcement pauses depend mainly on the size of the upcoming ratio requirement, not on the ratio just completed. The longest pause should therefore occur when the key color signals the FR 100 component, that is, before the FR 100.
20. On a variable interval schedule, reinforcement is most likely after
a. a short interresponse time
b. a long interresponse time
c. rapid responding
d. steady responding
Answer: b
Rationale:
On a variable interval (VI) schedule, the first response after a variable amount of time has
elapsed is reinforced, regardless of how many responses were made during the interval.
Because the interval timer runs whether or not the subject responds, the longer the pause
since the last response, the greater the chance that the interval has elapsed, so reinforcement
is most likely after a long interresponse time.
21. On a variable-ratio schedule, reinforcement is most likely after
a. a long interresponse time
b. a pause in responding
c. one reinforcer has just been delivered
d. none of the above
Answer: d
Rationale:
On a variable-ratio schedule, every response has the same probability of producing
reinforcement, regardless of how long the subject paused beforehand or whether a reinforcer
has just been delivered. Because reinforcement depends only on the number of responses,
none of the conditions described in options a, b, or c makes reinforcement more likely.
22. If an animal doubles its response rate on a variable-interval schedule, the rate of
reinforcement
a. will double
b. will not change
c. will usually increase slightly
d. will decrease
Answer: c
Rationale:
On a variable-interval schedule, reinforcers become available after unpredictable amounts of
time, and the first response after an interval elapses collects the reinforcer. A moderate
response rate already collects most of the reinforcers the timer sets up, so doubling the
response rate mainly shortens the delay between an interval elapsing and the next response.
The rate of reinforcement therefore usually increases only slightly.
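A quick simulation can make this concrete. The sketch below is illustrative only (it is not from the text); it approximates a VI 30-second schedule in one-second ticks, with a response occurring each tick with a fixed probability:

```python
import random

def simulate_vi(response_prob, mean_interval=30.0, seconds=200_000, seed=0):
    """Simulate a variable-interval schedule in 1-second ticks.
    Each tick, a response occurs with probability `response_prob`.
    After each collected reinforcer, the next one is armed after an
    exponential delay averaging `mean_interval` seconds, and the
    first response once it is armed collects it."""
    rng = random.Random(seed)
    armed_at = rng.expovariate(1 / mean_interval)
    reinforcers = 0
    for t in range(seconds):
        if rng.random() < response_prob and t >= armed_at:
            reinforcers += 1
            armed_at = t + rng.expovariate(1 / mean_interval)
    return reinforcers / (seconds / 60)  # reinforcers per minute

slow = simulate_vi(0.05)  # about one response every 20 s
fast = simulate_vi(0.10)  # double the response rate
print(f"slow: {slow:.2f}/min, fast: {fast:.2f}/min")
```

Doubling the response rate raises the obtained reinforcement rate only modestly (roughly 1.2 versus 1.5 per minute in this sketch), never doubling it, because the interval timer, not the response count, sets the pace.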
23. Evidence from laboratory experiments on why responding is faster on VR than VI
schedules has favored
a. the molar theory
b. the molecular theory
c. the fatigue theory
d. the generalization decrement theory
Answer: b
Rationale:
The molecular theory explains schedule performance in terms of the reinforcement of
individual interresponse times (IRTs). On a VI schedule, the longer the subject waits between
responses, the more likely it is that the interval has elapsed, so long IRTs (slow responding)
are differentially reinforced. On a VR schedule, every response has the same probability of
reinforcement no matter how short the preceding pause, so short IRTs are reinforced as often
as long ones and fast responding pays off sooner. Laboratory experiments, including studies
that directly reinforce particular IRTs, have favored this molecular account of why
responding is faster on VR than on VI schedules.
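The differential reinforcement of long pauses on VI schedules can be demonstrated with a simulation. The sketch below is illustrative only (it is not from the text); it shows that on a simulated VI schedule, reinforced responses tend to follow longer pauses than unreinforced ones:

```python
import random

def vi_irt_means(mean_interval=30.0, mean_irt=5.0, n_responses=50_000, seed=0):
    """Simulate VI responding in continuous time and compare the
    interresponse times (IRTs) of reinforced vs. unreinforced responses.
    IRTs are drawn exponentially with mean `mean_irt`; a reinforcer is
    armed after an exponential delay averaging `mean_interval`, and the
    first response after arming collects it."""
    rng = random.Random(seed)
    t = 0.0
    armed_at = rng.expovariate(1 / mean_interval)
    reinforced, unreinforced = [], []
    for _ in range(n_responses):
        irt = rng.expovariate(1 / mean_irt)
        t += irt
        if t >= armed_at:
            reinforced.append(irt)
            armed_at = t + rng.expovariate(1 / mean_interval)
        else:
            unreinforced.append(irt)
    return (sum(reinforced) / len(reinforced),
            sum(unreinforced) / len(unreinforced))

r_mean, u_mean = vi_irt_means()
print(f"mean reinforced IRT: {r_mean:.1f} s, unreinforced: {u_mean:.1f} s")
```

Because long pauses are disproportionately followed by reinforcement on VI but not on VR, the molecular account holds that VI schedules selectively strengthen slow responding.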
24. To help an autistic child learn to say the word ball, the therapist might use his hand to
guide the child's mouth and lips into the proper position. This is an example of
a. shaping
b. prompting
c. fading
d. partial reinforcement
Answer: b
Rationale:
Prompting involves providing cues or assistance to help an individual perform a desired
behavior. In this scenario, the therapist guiding the child's mouth and lips into the proper
position serves as a prompt to help the child say the word "ball."
25. To be effective, a token reinforcement system
a. must provide the patient with tokens that are physical objects, such as poker chips
b. should only allow the tokens to purchase desired activities, not physical objects
c. should be terminated abruptly once a patient's behavior has improved
d. none of the above
Answer: d
Rationale:
A token reinforcement system can be effective regardless of whether the tokens are physical
objects or abstract representations. Additionally, tokens can be exchanged for various types
of reinforcers, including both activities and physical objects. The termination of a token
reinforcement system should be gradual rather than abrupt to ensure sustained behavior
change.
26. Unlike many attempts of ordinary people to change another individual's behavior, with
behavior modification
a. the rules for reinforcement are always applied consistently
b. primary reinforcers are always used
c. continuous reinforcement is always used
d. all of the above
Answer: a
Rationale:
Behavior modification involves applying principles of reinforcement and punishment
systematically to change behavior. One key aspect is the consistent application of
reinforcement rules, which helps ensure that desired behaviors are reinforced appropriately
and consistently.
27. Behavioral marital therapy usually does not involve
a. measuring and recording the behaviors of one's spouse
b. learning how to punish a spouse's bad behaviors
c. learning communication and problem-solving skills
d. creating a written contract between spouses
Answer: b
Rationale:
Behavioral marital therapy focuses on improving communication, problem-solving skills, and
understanding between partners. Punishment of a spouse's behavior is generally not a
component of behavioral marital therapy, as it tends to focus on positive reinforcement of
desired behaviors and effective communication strategies.
28. Behavioral marital therapy can include
a. a contingency contract
b. behavior exchange
c. training in communication and problem-solving skills
d. all of the above
Answer: d
Rationale:
Behavioral marital therapy utilizes various techniques, including contingency contracts,
behavior exchange, and training in communication and problem-solving skills, to address
relational issues and promote positive interactions between partners. These approaches aim to
reinforce desired behaviors and improve the overall quality of the relationship.
Test Bank for Learning and Behavior
James E. Mazur
9780205864812, 9780205246441