Module 10—Operant and Cognitive Approaches
MULTIPLE CHOICE
1. Bart, the 10-foot-tall Kodiak bear, learned and performed 45 behaviors for a starring role in
movies through:
a. operant conditioning
b. imprinting
c. insight
d. classical conditioning
Answer: A
2. Instrumental conditioning is also known as:
a. classical conditioning
b. imprinting conditioning
c. instinctual conditioning
d. operant conditioning
Answer: D
3. The focus of operant conditioning is on how:
a. people learn from the examples of others without themselves being reinforced
b. repetition results in reflexive habits without awareness
c. behavior is influenced by its consequences and whether they increase or decrease the
likelihood of that behavior being repeated
d. stimuli are paired to elicit a response
Answer: C
4. The type of learning that focuses on the consequences of behavior is called:
a. classical conditioning
b. operant conditioning
c. process conditioning
d. latent learning
Answer: B
5. The Great Carlo is an internationally known lion trainer. You ask him how he goes about
training his lions. He responds by saying something about consequences that increase the
chance that the desired behavior will again be performed. You recognize his method as:
a. trial-and-error conditioning
b. cognitive learning
c. classical conditioning
d. operant conditioning
Answer: D
6. You recently visited a website describing techniques for parents to manage their
children’s behavior. You notice that many of the techniques are based on operant
conditioning. What do these techniques have in common?
a. using consequences to influence behavior
b. using observation and imitation
c. pairing UCS with CS
d. associating UCS with UCR
Answer: A
7. In ____, the consequences of a behavior influence whether an organism will perform the
same behavior in the future.
a. latent learning
b. target learning
c. operant conditioning
d. classical conditioning
Answer: C
8. You want to change the behavior of your roommate. You decide to reward his behaviors
that you like and punish his behaviors that you do not like. You are using:
a. latent learning
b. cognitive learning
c. operant conditioning
d. classical conditioning
Answer: C
9. Plotnik’s example of skateboarder Tony Hawk shows that:
a. learning can also occur just by observation, without external rewards
b. human learning differs greatly from animal learning
c. rewards actually interfere with learning
d. complex behaviors can be acquired through classical conditioning
Answer: A
10. Those who study cognitive learning claim that learning can take place in an individual:
a. who has not received any noticeable rewards, but who simply observes and imitates
b. only if the learning is reflexive in nature
c. who shows a change in behavior
d. only when the behavior is followed by an effect
Answer: A
11. You are babysitting your three-year-old niece and notice that she is acting very much like
a character from a television show. Being an astute psychology student, you reason that you
are most likely witnessing:
a. operant conditioning
b. instrumental conditioning
c. classical conditioning
d. cognitive learning
Answer: D
12. What names are most associated with operant conditioning?
a. B. F. Skinner and Carl Rogers
b. Ivan Pavlov and George Miller
c. Edward Thorndike and B. F. Skinner
d. Albert Bandura and Ivan Pavlov
Answer: C
13. Whiskers is a cat being used in Thorndike’s studies on learning. The amount of time it
takes Whiskers to get out of the puzzle box to get a piece of fish is decreasing. Thorndike
would conclude that:
a. Whiskers’ behaviors that lead to escaping the box to get the fish are being strengthened.
b. Whiskers’ behaviors that lead to escaping the box to get the fish are being weakened.
c. Whiskers is imprinting on Thorndike’s behavior.
d. Whiskers’ behaviors are under a continuous schedule of reinforcement.
Answer: A
14. Trial-and-error learning is associated with experiments conducted with hungry cats placed
in a puzzle box. This work was conducted by:
a. B. F. Skinner
b. Ivan Pavlov
c. Edward Thorndike
d. Albert Bandura
Answer: C
15. In your backyard you have a bird feeder from which a particular squirrel likes to eat. You
have tried many ways to prevent the squirrel from stealing food from the feeder. You notice that the squirrel’s
random behavior gradually turns into goal-directed behavior. This best illustrates what
Thorndike called the:
a. Law of Effect
b. principle of continuity
c. law of consequence
d. classical conditioning
Answer: A
16. Thorndike developed the Law of Effect by studying:
a. the saliva of dogs
b. how a cat learns to escape from a puzzle box
c. how a rat learns to press a lever
d. how to train a bear to hold a stuffed animal
Answer: B
17. What is described as the idea that behaviors followed by positive consequences are
strengthened, while behaviors followed by negative consequences are weakened?
a. behavioral facilitation
b. the principle of continuity
c. cognitive learning
d. the Law of Effect
Answer: D
18. Thorndike found that a cat learned to escape from a puzzle box by the consequences of its
behavior. How could he arrive at such a conclusion?
a. The CR was increasing in magnitude over time.
b. The cat learned by watching Thorndike open the puzzle box.
c. The CS (freedom) provided information about the occurrence of the UCS (piece of fish).
d. The time it took cats to escape decreased over time.
Answer: D
19. The Law of Effect is to ____ as operant conditioning is to ____.
a. Tolman; Bandura
b. Thorndike; Skinner
c. Skinner; Thorndike
d. Skinner; Rescorla
Answer: B
20. The unit of behavior that Skinner could measure is called a(n):
a. reinforcer
b. conditioned stimulus
c. operant response
d. behavioral index
Answer: C
21. ____ is a term suggested by Skinner to indicate a response that can be changed by its
consequences and is a unit of behavior that can be easily measured.
a. Prepared response
b. Conditioned response
c. Effect response
d. Operant response
Answer: D
22. Bart the bear picks up a teddy bear. In operant conditioning, this behavior is called a(n)
____. He is given an apple, which is the ____ of picking up the teddy bear.
a. reinforcer; consequence
b. conditioned response; reinforcer
c. operant response; consequence
d. consequence; reinforcer
Answer: C
23. Conditioning a rat to press a lever is most associated with the work of:
a. Ivan Pavlov
b. B. F. Skinner
c. Edward Thorndike
d. Albert Bandura
Answer: B
24. The apparatus used by Skinner to study operant conditioning is called a(n):
a. Skinner box
b. classical chamber
c. puzzle box
d. Pavlov box
Answer: A
25. If B. F. Skinner were alive today, his website would most likely be named:
a. www.cognitive.com
b. www.insight.com
c. www.operant.com
d. www.classical.com
Answer: C
26. In operant conditioning, behavior that can be modified by its ____ is called a(n) ____.
a. antecedents; stimulus
b. consequences; operant response
c. consequences; unconditioned stimulus
d. consequences; conditioned stimulus
Answer: B
27. Dr. Peck wishes to operantly condition a pigeon to pick a black card out of a set of white
cards. To ensure that Peck’s pigeon picks the proper card, the reinforcer must:
a. precede the desired behavior
b. occur at the same time as the desired behavior
c. become a predictor for the desired behavior
d. follow the desired behavior
Answer: D
28. Shaping is defined as:
a. promising a reward in return for performing desired behavior
b. pairing two stimuli to elicit the desired behavior
c. reinforcing behaviors that successively lead up to the desired behavior
d. changing behavior through the use of punishment
Answer: C
29. In the early stages of shaping a rat to press a lever, the trainer would:
a. deliver a food pellet if the rat simply faced the lever
b. feed the rat many food pellets to familiarize the rat with the reinforcer
c. delay the delivery of food pellets to present a challenge to the rat
d. wait for the rat to press the lever three times in a row before delivering a food pellet
Answer: A
30. A researcher is trying to get a pigeon to play “Joy to the World” on a toy piano. If the
pigeon plays the first note followed by a reinforcer, and then the pigeon plays the first and
second note followed by a reinforcer, the researcher is most likely using:
a. classical conditioning
b. shaping
c. cognitive theory
d. stimulus substitution
Answer: B
31. A college student taking a composition class hears that the professor wants a topic for the
paper in two weeks, followed by a bibliography in four weeks, then an outline in six weeks, a
first draft in eight weeks, and the final version in 12 weeks. These deadlines, which reinforce
behaviors that lead up to the completed paper, best illustrate:
a. a variable-ratio schedule
b. stimulus substitution
c. cognitive theory
d. shaping
Answer: D
32. Karen is “potty training” Andrew. First she gives him a cookie when he spends time near
the potty, then he gets a cookie if he sits on the potty, and finally he gets a cookie for making
“poo-poo” in the potty. Karen is using a procedure called:
a. negative reinforcement
b. generalization
c. shaping
d. intermittent reinforcement
Answer: C
33. Right before a game, a baseball player goes through a series of ritualistic behaviors that
he says give him good luck, but that are in fact not associated with any reinforcer. This
ball player illustrates:
a. discrimination
b. generalization
c. observational learning
d. superstitious behaviors
Answer: D
34. According to the textbook, what is the best explanation for a professional baseball player
eating chicken every day that a game is scheduled?
a. superstitious behaviors
b. generalization
c. observational learning
d. discrimination
Answer: A
35. Alfredo brings his lucky pencil with him for his exams. His pencil was accidentally paired
with a good grade on his exams. Alfredo’s behavior is an example of a:
a. variable-ratio schedule
b. reinforcement
c. conditioned response
d. superstitious behavior
Answer: D
36. Ali is trying to summarize operant conditioning. Which of the following does the best
job?
a. critical analyses can obstruct behavior
b. conditioning a consequence organizes behavior
c. constant attention commands operant behaviors
d. consequences are contingent on behavior
Answer: D
37. If parents wanted to increase the study habits of their children, they might consider using
operant conditioning. The first step would be to:
a. identify the target behavior or the goal for the child
b. select reinforcers
c. reinforce appropriate behavior
d. present reinforcers through the shaping procedure
Answer: A
38. What effect would a reinforcer have on a child’s behavior?
a. it decreases the likelihood that behavior will be repeated only if a reinforcer is given before
the child performs the behavior
b. it depends on what the child’s behavior is
c. it decreases the likelihood that the behavior will be repeated
d. it increases the likelihood that the behavior will be repeated
Answer: D
39. You are visiting some friends who have a three-year-old who is being toilet trained. You
hear the mother reinforcing the child after he says that he has to urinate, after he enters the
bathroom, and after he uses the toilet. The mother has used the operant technique called:
a. discrimination
b. spontaneous recovery
c. shaping
d. extinction
Answer: C
40. One of the steps in using operant conditioning to overcome a child’s refusal to eat certain
foods includes reinforcing her when she notices the food, then when it is placed in her mouth,
then when she tastes it, and finally when she swallows it. This best describes:
a. shaping
b. conditioned responses
c. spontaneous recovery
d. continuous reinforcement
Answer: A
41. Which of the following is not among the four steps in using operant conditioning to teach
young children to taste, chew, and eat a food?
a. identify the target behavior
b. provide a reinforcer after the desired behavior is performed
c. shape the behavior
d. pair the unconditioned stimulus with the conditioned stimulus
Answer: D
42. The goal of operant conditioning is to ____, while the goal in classical conditioning is to
____.
a. create an association between stimuli; create an association between behavior and
consequences
b. create an expectation that a conditioned stimulus will lead to behavior; increase or decrease
the rate of some behavior
c. decrease the rate of some behavior; increase the rate of some behavior
d. increase or decrease the rate of some behavior; create a new response to a neutral stimulus
Answer: D
43. Classical is to operant as:
a. learned is to memorized
b. undesirable is to desirable
c. involuntary is to voluntary
d. learned is to innate
Answer: C
44. Classical is to operant as:
a. elicited is to emitted
b. undesirable is to desirable
c. observable is to invisible
d. consequences is to expectancy
Answer: A
45. In operant conditioning, the response is ____. In classical conditioning, that response is
____.
a. involuntary; voluntary
b. reflexive; involuntary
c. involuntary; reflexive
d. voluntary; involuntary
Answer: D
46. In classical conditioning, a stimulus is paired with ____; in operant conditioning, a
behavior is paired with ____.
a. a reward; a stimulus
b. another stimulus; a consequence
c. a reflex; a stimulus
d. a consequence; another organism
Answer: B
47. Learned behaviors in operant conditioning are ____ and in classical conditioning they are
____.
a. solicited; elicited
b. emitted; elicited
c. elicited; emitted
d. involuntary; voluntary
Answer: B
48. As compared to classical conditioning, the behaviors to be learned in operant conditioning
are:
a. reflexive
b. elicited
c. automatic
d. voluntary
Answer: D
49. Professor Cao is writing words on the overhead that describe operant conditioning. You
notice that she makes a mistake. Which word did she accidentally write down that does not
refer to operant conditioning?
a. voluntary
b. contingency
c. conditioned response
d. consequences
Answer: C
50. If you wish to modify your roommate’s behavior to clean up, which type of learning
would you use and why?
a. classical conditioning—cleaning is a conditioned response
b. operant conditioning—cleaning is a voluntary response
c. operant conditioning—cleaning is an unconditioned response
d. classical conditioning—cleaning can be conditioned using backward conditioning
Answer: B
51. In operant conditioning, ____ increases or decreases the chances that the ____ will occur
again.
a. behavior; consequences
b. response; stimulus
c. reflex; stimulus
d. consequences; behavior
Answer: D
52. Reinforcement is to ____, as punishment is to ____.
a. decrease; increase
b. decrease; decrease
c. increase; decrease
d. operant conditioning; classical conditioning
Answer: C
53. According to operant conditioning, an organism is more likely to perform a behavior in
the future if the behavior is:
a. reinforced
b. reflexive
c. substituted
d. spontaneously recovered
Answer: A
54. ____ is a consequence that has the effect of decreasing the chance that the behavior that
came before it will happen again.
a. Negative reinforcement
b. Shaping
c. Punishment
d. Operant response
Answer: C
55. “A consequence of a behavior that decreases the likelihood of that behavior occurring
again” is the definition of:
a. negative reinforcement
b. punishment
c. partial reinforcement
d. learned helplessness
Answer: B
56. Ben, a child with an intellectual disability, has been observed eating inedible objects and substances.
Ben’s parents are concerned and have taken him to a psychologist. The psychologist has
diagnosed Ben as having:
a. autism
b. pica
c. rumination
d. Grant’s disease
Answer: B
57. Pica has been successfully treated using operant conditioning. Each time an inedible
object was selected, the subject received ____. Each time an appropriate, edible object was
selected, ____ was presented.
a. praise; criticism
b. a consequence; reinforcement
c. negative reinforcement; reinforcement
d. mild punishment; reinforcement
Answer: D
58. A professor says to her student, “Nice job on that test.” She has used:
a. positive reinforcement
b. generalization
c. negative reinforcement
d. negative punishment
Answer: A
59. When Beaver learns the meaning of 10 new vocabulary words, his father Ward says,
“That’s a good boy, Beaver.” Ward’s praise is a(n):
a. UCS
b. conditioned stimulus
c. negative reinforcer
d. positive reinforcer
Answer: D
60. Negative reinforcement is:
a. a pleasant stimulus that increases the likelihood of the response occurring again
b. an unpleasant stimulus that increases the likelihood of the response occurring again
c. an unpleasant stimulus that decreases the likelihood of the response occurring again
d. the removal of an unpleasant stimulus that increases the likelihood of the response
occurring again
Answer: D
61. You have a painful headache and so you take an aspirin to eliminate the pain. The aspirin
works and now you are free of your headache. Taking the aspirin is an example of a:
a. negative reinforcer—it increases the chance of taking aspirin again the next time you have
a headache
b. negative reinforcer—it decreases the chance of taking aspirin again the next time you have
a headache
c. positive reinforcer—it increases the chance of taking aspirin again the next time you have a
headache
d. positive reinforcer—it decreases the chance of taking aspirin again the next time you have
a headache
Answer: A
62. Kristin wants to go out and play, but her mother has said no. Kristin goes to her room and
plays her rock music very loudly. The noise drives her mother crazy and Kristin is allowed to
go out and play if she will turn off her music. In this example, ____ was a form of negative
reinforcement.
a. playing the music
b. turning off the music
c. going crazy
d. going out to play
Answer: B
63. Which of the following is the best example of negative reinforcement?
a. being put in jail for driving while drunk
b. not being allowed to go to the movies on Saturday night
c. a spanking for bad behavior
d. elimination of pain after taking an aspirin
Answer: D
64. Reinforcers, whether positive or negative, have the same effect on behavior,
which is to:
a. decrease the probability that the behavior will be repeated
b. increase the probability that the behavior will be repeated
c. increase the probability that the behavior will be extinguished
d. decrease the probability that the behavior will be spontaneously recovered
Answer: B
65. Positive reinforcement ____ the likelihood that the preceding behavior will be repeated.
Negative reinforcement ____ the likelihood that the preceding behavior will be repeated.
a. increases; increases
b. increases; decreases
c. decreases; increases
d. decreases; decreases
Answer: A
66. Ricardo and Luis are out walking. Luis says, “Hey, I’ve got a pebble in my shoe,” and
proceeds to take off his shoe and to remove the pebble. “That feels better,” says Luis. Ricardo
believes that Luis’ behavior of removing the pebble is a(n) ____ because it increases the
chance that Luis will repeat the behavior if another pebble gets in his shoe.
a. positive punisher
b. positive reinforcer
c. negative reinforcer
d. negative punisher
Answer: C
67. Reinforcement is to increase as punishment is to ____.
a. increase
b. decrease
c. condition
d. negative
Answer: B
68. What do positive reinforcement, negative reinforcement, positive punishment, and
negative punishment all have in common?
a. They are all examples of responses used in classical conditioning.
b. They all increase the chances that behavior will be repeated.
c. All of them are consequences in operant conditioning.
d. They all decrease the chances that behavior will be repeated.
Answer: C
69. In operant conditioning, a stimulus that increases the probability of a behavior occurring
again is called a:
a. reinforcer
b. punisher
c. generalizer
d. conditioner
Answer: A
70. Whenever little Bobby cries, his father spanks him. Bobby’s father is trying to decrease
Bobby’s crying through the use of:
a. negative punishment
b. negative reinforcement
c. positive reinforcement
d. positive punishment
Answer: D
71. The little child who gets a good hard spanking for running out into the street is
experiencing an operant conditioning procedure called:
a. positive reinforcement
b. negative reinforcement
c. positive punishment
d. negative punishment
Answer: C
72. During a lecture on learning, a fellow student accidentally stubs his toe on a table leg and
lets out a “yelp.” Having heard it, the professor says, “Is that behavior [toe stubbing] likely to
happen again in the future?” Answer the professor and indicate the reason for your answer.
a. No—the behavior was followed by negative reinforcement (pain)
b. No—the behavior was an example of stimulus substitution
c. No—the behavior was followed by positive punishment (pain)
d. No—the consequence was followed by the behavior
Answer: C
73. What refers to presenting an aversive stimulus after a response that decreases the odds
that the response will recur?
a. negative punishment
b. punishment
c. positive punishment
d. latent punishment
Answer: C
74. What refers to removing a reinforcing stimulus after a response that decreases the odds
that the response will recur?
a. negative punishment
b. extinction
c. positive punishment
d. latent punishment
Answer: A
75. You remember a friend of yours in elementary school stuck his tongue on a pole on a
playground swing set in the middle of winter. He yelled in pain, but finally pulled his tongue
off the pole. He said, “I’ll never do that again, it hurts!” His behavior of putting his tongue on
the pole involved ____ since he never did it again.
a. negative punishment
b. positive punishment
c. salient punishment
d. primary punishment
Answer: B
76. Miranda comes home late one evening past her curfew only to find her parents waiting up
for her. Her father says, “Miranda, you’re late! You may not use the car for an entire month.”
Miranda’s father is using:
a. negative punishment
b. negative reinforcement
c. positive punishment
d. schedule of punishment
Answer: A
77. Positive punishment ____ the likelihood that the preceding behavior will be repeated.
Negative punishment ____ the likelihood that the preceding behavior will be repeated.
a. increases; increases
b. increases; decreases
c. decreases; increases
d. decreases; decreases
Answer: D
78. A primary reinforcer ____ the likelihood that the preceding behavior will be repeated. A
secondary reinforcer ____ the likelihood that the preceding behavior will be repeated.
a. increases; increases
b. increases; decreases
c. decreases; increases
d. decreases; decreases
Answer: A
79. A pigeon pecks on a sign and is given food. The food is a:
a. secondary consequence
b. primary stimulus
c. primary reinforcer
d. secondary reinforcer
Answer: C
80. Since chocolate activates the brain’s pleasure centers, it can be considered a:
a. secondary consequence
b. primary stimulus
c. primary reinforcer
d. secondary reinforcer
Answer: C
81. A stimulus that is associated with stimuli such as water, food, and shelter will become a:
a. primary reinforcer
b. continuous reinforcer
c. secondary reinforcer
d. partial reinforcer
Answer: C
82. Which of the following would not be an example of a primary reinforcer?
a. a drink of water
b. a sexual encounter
c. a hundred-dollar bonus
d. a warm place to sleep
Answer: C
83. Betty Lou gives her son Pierre a piece of pecan pie if he does all his homework. Betty
Lou is providing Pierre with a ____ reinforcer.
a. primary
b. secondary
c. negative
d. partial
Answer: A
84. Monica gave William a nice tie for his help in locating a good used car. The tie is an
example of a ____ reinforcer.
a. primary
b. secondary
c. negative
d. partial
Answer: B
85. The value of a secondary reinforcer is:
a. innate
b. its association with things like tokens and money
c. learned
d. evident to all humans
Answer: C
86. Which of the following would not be used as a secondary reinforcer when teaching young
children to read?
a. ice cream
b. poker chips
c. praise
d. colored stickers on a chart
Answer: A
87. The example of a Massachusetts school requiring students to wear backpacks that contain
shock devices illustrates the use of ______.
a. conditioned stimulus
b. punishment
c. secondary reinforcer
d. discrimination
Answer: B
88. Mrs. Paulson, a third-grade teacher, gives her students a sticker when they do a good job
on their homework. A sticker is an example of a(n):
a. primary reinforcer
b. secondary reinforcer
c. basic reinforcer
d. advanced reinforcer
Answer: B
89. Which of the following is not a secondary reinforcer?
a. high grades
b. money
c. shelter
d. a gold star
Answer: C
90. What technique involves removing a child’s access to reinforcing stimuli after
noncompliance occurs?
a. classical conditioning
b. stimulus substitution
c. time out
d. secondary reinforcer
Answer: C
91. Time out is a procedure that:
a. uses positive punishment
b. gives an unpleasant consequence to the child for inappropriate behavior
c. removes a child from a situation where they might receive reinforcement for their
noncompliance
d. has been shown to be ineffective in reducing temper tantrums
Answer: C
92. Little Drew doesn’t like his spaghetti dinner so he throws a temper tantrum. His dad
Robert puts Drew in an empty room for three minutes and closes the door. Robert is using a
procedure called:
a. avoidance conditioning
b. negative reinforcement
c. learned helplessness
d. time out
Answer: D
93. The various rules, programs, and ways that reinforcers occur after performing some
behavior are called:
a. cumulative records
b. shaping procedures
c. behavior modifications
d. schedules of reinforcement
Answer: D
94. If you wish to determine the past behavior of a rat in a Skinner box, you can review:
a. schedules of reinforcement
b. shaping procedures
c. cumulative records
d. videotapes
Answer: C
95. ____ give us a picture of an animal’s ongoing responses and reinforcements across time.
a. Cumulative records
b. Shaping procedures
c. Schedules of reinforcement
d. Puzzle box records
Answer: A
96. If a behavior is reinforced each and every time it occurs, its reinforcement schedule is:
a. an interval schedule of reinforcement
b. continuous reinforcement
c. complete reinforcement
d. stable reinforcement
Answer: B
97. As Toan gets on the bus to go to school each morning, the bus driver says, “Good
morning, Toan. It’s good to see you!” This is an example of:
a. interval reinforcement
b. basic reinforcement
c. partial reinforcement
d. continuous reinforcement
Answer: D
98. If you give your dog a treat each time she performs a trick, you are using:
a. interval reinforcement
b. basic reinforcement
c. partial reinforcement
d. continuous reinforcement
Answer: D
99. If you sometimes give your dog a treat after she performs a trick, you are using:
a. interval reinforcement
b. basic reinforcement
c. partial reinforcement
d. continuous reinforcement
Answer: C
100. When is continuous reinforcement most appropriate?
a. when the behavior is a voluntary response
b. when the behavior is an involuntary reflex
c. in the initial stages of operant conditioning
d. only after the conditioning has taken place
Answer: C
101. Shirley is about to teach a group of eight-year-olds the backstroke. She wants to do this
using operant conditioning. At the outset of the swimming course, Shirley should:
a. appear quite stern so that later praise will seem more meaningful
b. praise them for no particular reason but to establish rapport
c. praise every correct thing the young swimmers do
d. praise them at the end of each lesson only, since that is what she would do if they were in
proper training for a meet
Answer: C
102. Partial reinforcement is defined as reinforcement in which:
a. behaviors are not reinforced every time they occur
b. the organism gives up before full reinforcement is obtained
c. only secondary reinforcers are utilized
d. punishment is used to shape behaviors
Answer: A
103. Robert is reinforced by his teacher every sixth time he turns in a homework assignment.
Robert’s teacher is using a ____ schedule of reinforcement.
a. fixed-interval
b. fixed-ratio
c. variable-interval
d. variable-ratio
Answer: B
104. A ____ refers to a reinforcer occurring only after an unchanging number of responses
take place.
a. fixed interval
b. fixed ratio
c. variable interval
d. variable ratio
Answer: B
105. Out in the garden, Lucille is given a dime for every five weeds she pulls. What
reinforcement schedule is she on?
a. fixed ratio
b. fixed interval
c. variable ratio
d. variable interval
Answer: A
106. “Every other set of encyclopedias you sell, I will give you $100,” says your supervisor.
You realize that you are on a ____ schedule of reinforcement.
a. fixed-ratio
b. fixed-interval
c. variable-ratio
d. variable-interval
Answer: A
107. When Bob plays cards with his friends, he finds that his winning hands seem to come in
bunches. Then, he may go as many as 10 hands without winning anything. Which schedule
best describes the reinforcement Bob receives when playing cards?
a. fixed ratio
b. fixed interval
c. variable ratio
d. variable interval
Answer: C
108. “Poor fool,” you think to yourself when your friend tells you she lost on the lottery
again, “another helpless victim of the ____ schedule of reinforcement.”
a. fixed-ratio
b. variable-ratio
c. fixed-interval
d. variable-interval
Answer: B
109. Mom tells Billy that she will pay him to pull weeds from her flower garden. Because
Mom is busy, she sometimes gives Billy a dime when he pulls five weeds, sometimes when
he pulls 10 weeds, and other times when he pulls 20 or more weeds. The reinforcement
schedule Mom is using is:
a. continuous
b. fixed interval
c. variable ratio
d. fixed ratio
Answer: C
110. “Maybe this lottery ticket will be a winner. The last few I’ve bought were losers,
but I do buy a winner every once in a while.” This person’s lottery ticket-buying behavior is
on what schedule of reinforcement?
a. continuous
b. fixed interval
c. variable ratio
d. fixed ratio
Answer: C
111. A pattern in which students at a reform school clean up their rooms only before the
weekly inspections is typical of which kind of reinforcement schedule?
a. variable interval
b. variable ratio
c. fixed interval
d. fixed ratio
Answer: C
112. Bruce gives his son Kenny a quarter if he practices his tennis serve for 15 minutes.
Bruce is using which type of reinforcement schedule?
a. variable interval
b. fixed interval
c. variable ratio
d. fixed ratio
Answer: B
113. Every 50 minutes, the class takes a break if their behavior is appropriate. They are on a
____ schedule of reinforcement.
a. variable-interval
b. fixed-interval
c. variable-ratio
d. fixed-ratio
Answer: B
114. Suzanne randomly checks up on her employees several times throughout the day to
praise them if they are working hard. None of the employees know when Suzanne will be
checking up on them. Suzanne is using a ____ schedule of reinforcement.
a. fixed-interval
b. fixed-ratio
c. variable-interval
d. variable-ratio
Answer: C
115. You like to fish. Sometimes, it takes you 30 minutes to catch a fish; other times, you
catch a fish every 5 minutes. Fishing, in this case, is on a ____ schedule of reinforcement.
a. fixed-interval
b. fixed-ratio
c. variable-interval
d. variable-ratio
Answer: C
116. A ____ refers to a reinforcer occurring only after an unchanging amount of time has
elapsed.
a. fixed-interval
b. fixed-ratio
c. variable-interval
d. variable-ratio
Answer: A
117. Dakota is using operant conditioning to get his dog Rover to bring him his slippers. He
sounds a bell, has Rover bring the slippers, and reinforces the behavior with a dog biscuit.
One day, a church bell sounds outside and Rover brings Dakota his slippers. Rover’s behavior
illustrates:
a. discrimination
b. shaping
c. intermittent reinforcement
d. generalization
Answer: D
118. In operant conditioning, generalization has occurred when:
a. an organism emits the same response to similar stimuli
b. a response is not emitted in the presence of unreinforced stimuli
c. a behavior is no longer reinforced
d. an organism realizes that it has been operantly conditioned
Answer: A
119. After being praised for learning the word “doggie”, a young child will point to anything
with four legs and a tail and say “doggie”—even if the “doggie” is really a cat, a horse, or a
cow. This child is demonstrating:
a. discrimination
b. spontaneous recovery
c. extinction
d. generalization
Answer: D
120. If an organism emits a response only in the presence of reinforced stimuli and not in the
presence of unreinforced stimuli, then the organism is displaying:
a. discrimination
b. spontaneous recovery
c. extinction
d. generalization
Answer: A
121. A child learns that a particular large four-legged animal is a horse. When he learns that
the striped animal in the zoo is a zebra, he is able to demonstrate:
a. selective responding
b. selective attention
c. discrimination
d. spontaneous generalization
Answer: C
122. In classical conditioning, ____ is the tendency for some stimuli but not others to elicit a
conditioned response.
a. discrimination
b. selective attention
c. generalization
d. spontaneous extinction
Answer: A
123. In operant conditioning, extinction occurs because:
a. of disinterest
b. reinforcement no longer follows the behavior
c. the task is difficult
d. of delay of reinforcement
Answer: B
124. According to operant conditioning, if a behavior is no longer followed by a reinforcer,
the frequency of the behavior will:
a. become more intense
b. remain unchanged
c. increase
d. decrease
Answer: D
125. In classical conditioning, ____ refers to the reduction in a response when the
conditioned stimulus is no longer followed by the unconditioned stimulus.
a. stimulus discrimination
b. conditioned generalization
c. spontaneous recovery
d. extinction
Answer: D
126. According to the principles of operant conditioning, a response will undergo extinction
if the response is:
a. too difficult to maintain
b. no longer reinforced
c. reflexive in nature
d. reinforced too often
Answer: B
127. After operantly conditioning a rat to press a bar, a psychologist stops providing the
reinforcing pellets. The rat eventually stops pressing the bar. Bar pressing has undergone:
a. spontaneous recovery
b. extinction
c. shaping
d. generalization
Answer: B
128. After a period of extinction, a temporary increase in the rate of responding is called:
a. spontaneous recovery
b. extinction
c. discrimination
d. generalization
Answer: A
129. ____ involves mental processes and learning through observation.
a. Operant conditioning
b. Classical conditioning
c. Gestalt learning
d. Cognitive learning
Answer: D
130. Who said “...cognitive science is the [downfall] of psychology”?
a. B. F. Skinner
b. Ivan Pavlov
c. Edward Tolman
d. Albert Bandura
Answer: A
131. Little three-year-old Noelle likes to imitate whatever her big sisters are doing, but she
does so only later when she is by herself in her room. This learning is most probably:
a. operant conditioning
b. classical conditioning
c. cognitive learning
d. imprinting
Answer: C
132. Cognitive learning refers to:
a. associating NS with UCS
b. problem solving
c. the role of stimulus recognition in classical conditioning
d. learning that involves mental processes such as attention
Answer: D
133. “If you can’t observe it, then you shouldn’t study it.” Which of the following theorists
would be most likely to make that statement?
a. Albert Bandura
b. Edward Tolman
c. Wolfgang Koehler
d. B. F. Skinner
Answer: D
134. Which of the following theorists argued that learning involves a mental representation of
the environment?
a. Albert Bandura
b. Edward Tolman
c. Wolfgang Koehler
d. B. F. Skinner
Answer: B
135. “I can see in my mind the layout of the town I visited last summer.” This person is using
her ____ of the town.
a. latent schema
b. cognitive map
c. cognitive network
d. imprinting
Answer: B
136. If the shortest path to a food box is blocked, a rat will select the next shortest path if the
rat has:
a. developed a cognitive map
b. been continuously reinforced
c. been classically conditioned
d. been punished
Answer: A
137. Which of the following theorists argued that learning can take place when someone
watches another person and later performs that behavior even when not reinforced?
a. Albert Bandura
b. Edward Tolman
c. Wolfgang Koehler
d. B. F. Skinner
Answer: A
138. You want to write a paper on the effects that watching, imitating, and modeling have on
behavior. Which of the following journals should you look in?
a. Journal of Behaviorism
b. Journal of Classical Conditioning
c. Journal of Social Cognitive Learning
d. Journal of Operant Conditioning
Answer: C
139. Alex watches a violent TV show and then pretends to shoot his brother James with a toy
pistol. A psychologist would say that Alex has learned to “shoot” his brother through:
a. classical conditioning
b. observational learning
c. behavior modification
d. operant conditioning
Answer: B
140. Children learned to hit a Bobo doll through:
a. reinforcement of aggressive behaviors
b. watching an adult model hit a Bobo doll
c. classical conditioning principles
d. reflexive reactions to a stressful situation
Answer: B
141. Which subject in Bandura’s Bobo doll study was most likely to show aggressive behavior?
a. Rachel, because she was instructed to do so by her teacher.
b. Tamara, since she was told by an adult to hit the Bobo doll.
c. Paul, who saw a model hit the doll.
d. Claudia, because she was reinforced for her aggression.
Answer: C
142. The most important conclusion from the Bobo doll study is that:
a. behavior can be modified through negative punishment
b. behavior can be modified by providing secondary reinforcers
c. we create cognitive maps of dolls
d. behavior can be modified by simply watching a live model
Answer: D
143. What happens to mirror neurons when we observe someone?
a. they become less active
b. they reduce their communication with the thalamus
c. they become activated
d. they reduce their communication with the hippocampus
Answer: C
144. “I know and understand this material,” says Paul. His instructor would agree with him.
However, when it comes time to prove his understanding on the exam, he typically doesn’t do
well. This exemplifies the idea of:
a. the learning-performance distinction
b. insight learning
c. a lack of preparedness
d. shaping
Answer: A
145. The learning-performance distinction suggests that:
a. children learn better if required to perform some behavior
b. when something is learned, it is immediately performed
c. reinforcement does not play a role in observational learning
d. learning may occur but may not always be immediately evident
Answer: D
146. “I didn’t know you knew how to do that!” says a bewildered parent to his young
daughter. Apparently, the young girl would watch her dad turn on and play games on the
computer. This imitation had been going on for several months, but this was the first time she
demonstrated her learned behavior. Her father explained the delay by using the notion of:
a. cognitive learning
b. observational learning
c. learning-performance distinction
d. operant conditioning
Answer: C
147. The four processes necessary for observational learning are attention, memory, imitation,
and ____.
a. discrimination
b. generalization
c. motivation
d. reinforcement
Answer: C
148. “If they don’t pay attention, they’ll never be able to do it,” a frustrated teacher complains
as she attempts to model the steps on how to solve a math problem. Her goal is to have the
students learn, but even if she gets her students to pay attention, they still must:
a. have a good reason to model the teacher’s behavior
b. generalize to other settings and be motivated
c. associate the behavior with a UCS
d. be reinforced for doing the behavior that is being modeled
Answer: A
149. “I watched a show on television last month about a person eating several cups of
earthworms. I paid attention to it and I remember it very well. I suppose I could do it, but
there’s no good reason for me to do it.” Which process of social cognitive learning is lacking
in this person’s case?
a. motivation
b. generalization
c. discrimination
d. imitation
Answer: A
150. “I watched a show on television last month about people who can perform amazing feats
of balance. I paid attention to the show, I remember the show, and I wish I could do the same
feats, but I cannot.” Using the four processes of social cognitive learning, which process is
lacking in this person’s case?
a. attention
b. memory
c. discrimination
d. imitation
Answer: D
151. ____ is a mental process marked by the sudden and unexpected solution of a problem.
a. Categorical learning
b. Operant conditioning
c. Insight learning
d. Cognitive learning
Answer: C
152. You are a member of a committee that has been trying to solve a community problem for
several months. During a recent low point in the meeting, someone stands up and yells, “Ah
ha, I’ve got the solution.” You recognize this to be an example of:
a. insight learning
b. latent conditioning
c. categorical learning
d. cognitive learning
Answer: A
153. Kohler believed that chimps learned to solve problems through:
a. trial and error
b. reinforcement
c. memory
d. insight
Answer: D
154. What problem was the chimp in Kohler’s study attempting to solve?
a. getting out of a box to get a banana
b. getting a banana that was hung high overhead
c. pushing a box off a banana
d. peeling a banana
Answer: B
155. One criticism of Kohler’s suggestion that chimps demonstrate insight learning is:
a. that the chimps were not exposed enough to the problem of getting the banana
b. Kohler did not explain how the chimps solved problems, but merely described their behavior
c. that the schedule of reinforcement was not identified
d. that chimps are prepared to climb and jump and do so in the wild
Answer: B
156. “Ah ha!” is to ____ as reinforcement is to ____.
a. insight learning; operant conditioning
b. imprinting; classical conditioning
c. preparedness; cognitive theory
d. spontaneous recovery; insight learning
Answer: A
157. A newspaper article has the headline, “Scientists find innate tendency that helps
learning.” You realize that the “innate tendency” refers to:
a. cognitive factors
b. environmental stimuli
c. biological factors
d. behavioral factors
Answer: C
158. Why would animals and humans have biological predispositions to learn certain
behaviors? The behaviors have:
a. value for scientists to study
b. value for creating strife among groups
c. value for psychopathology
d. adaptive functions
Answer: D
159. Dr. Barr studies animals in their natural environments and is very curious about their
behavior. Most likely Dr. Barr is a(n):
a. ethologist
b. zoologist
c. biologist
d. behaviorist
Answer: A
160. If you had to write a slogan for the idea of critical or sensitive periods, what would it be?
a. “Don’t knock it until you’ve tried it”
b. “There’s more than one way to skin a cat”
c. “It’s now or never”
d. “Misery loves company”
Answer: C
161. The time period in which imprinting occurs is called the:
a. prepared time
b. ethological period
c. imprint schedule
d. critical period
Answer: D
162. Which of the following statements regarding imprinting is not true?
a. imprinting is irreversible
b. imprinting takes place during a critical or sensitive period
c. imprinting is evident in mature animals as well as in newborn animals
d. imprinting improves the chance that the animals will survive
Answer: C
163. A young chick will establish a social attachment to anything (or anyone) that moves or
provides food due to:
a. stimulus substitution
b. imprinting
c. biological restraint
d. observational learning
Answer: B
164. Kay raises ducks on a farm. Soon after being hatched, one of the ducks begins following
Kay around. The duck’s behavior is an example of:
a. classical conditioning
b. operant conditioning
c. spontaneous recovery
d. imprinting
Answer: D
165. You and a friend are present at the hatching of some ducklings. The ducklings notice you
first and are now trying to follow you. Your friend says, “Don’t worry, they’ll get over it.”
Is your friend right or wrong?
a. right, because imprinting is only temporary
b. right, because the ducklings will learn that you can’t feed them
c. wrong, because they won’t have other ducks to learn from
d. wrong, because imprinting is irreversible
Answer: D
166. A police dog is quickly taught to detect the smell of drugs such as marijuana and
cocaine. Which of the following best explains the ease with which the dog acquires this
ability?
a. prepared learning
b. imprinting
c. desensitization
d. spontaneous recovery
Answer: A
167. The nutcracker bird has an impressive memory that enables it to find previously hidden
food. It remembers where the hidden food is by recalling important landmarks such as trees.
The bird’s ____ is responsible for its remarkable memory.
a. prepared learning
b. imprinting
c. larger than normal hippocampus
d. cerebrum
Answer: C
168. A biological tendency found in animals to be able to recognize, attend to, and store
certain cues more easily than other cues is called:
a. prepared learning
b. imprinting
c. ethology
d. insight
Answer: A
169. The fact that all human babies between seven and eight months old start to babble
supports the idea of infants being:
a. slower to vocalize than other mammals
b. socially ready to speak at an early age
c. biologically prepared to produce sounds
d. taught how to produce sounds
Answer: C
170. In a recent study, subjects watched one of three movie clips, followed by a staged
confrontation and competitive reaction time test. Which group of subjects acted the most
aggressively?
a. subjects in the control group
b. subjects who watched physically and relationally aggressive clips
c. subjects who watched the non-aggressive clip
d. only males among those who watched the physically aggressive clip
Answer: B
171. Verbal bullying, ostracizing peers, and spreading rumors are examples of _____
aggression.
a. relational
b. instrumental
c. opportunistic
d. peer
Answer: A
172. How did Shinichi Suzuki adapt the Suzuki method to three- and four-year-olds who do
not have fully developed verbal skills?
a. The Suzuki method is not appropriate for this age group.
b. The principles of classical conditioning are used to give instructions to young children.
c. Information is given to the child through games and exercises.
d. The child is constantly rewarded for imitating the teacher.
Answer: C
173. The basic principles of the Suzuki method of instruction closely resemble:
a. the four mental processes of social cognitive learning
b. the principles of operant conditioning
c. the principles of classical conditioning
d. the structure of cognitive maps
Answer: A
174. With regard to motivation, what would Shinichi Suzuki never do in instructing the child
in violin?
a. push the child beyond his/her capacity
b. provide a model who was active and interesting for the child
c. use games
d. start at the young age of three years old
Answer: A
175. The treatment or therapy used to modify problem behavior based on the principles of
learning is called:
a. observational learning
b. covert rehearsal
c. behavior modification
d. self-reinforcement
Answer: C
176. Which disorder is characterized by abnormal or impaired development in social
interactions and communication?
a. depression
b. autism
c. Down syndrome
d. ADHD
Answer: B
177. What deficiencies are important to address in autistic children through intensive
behavioral treatment?
a. forming relationships and communication
b. creativity and communication
c. problem solving and communication
d. planning and problem solving
Answer: A
178. Lovaas’ approach to helping autistic children is most effective among children with:
a. the least severe symptoms
b. moderate symptoms
c. the most severe symptoms
d. the onset of symptoms after the age of 5
Answer: A
179. The program described in your textbook for autistic children used principles based
primarily upon:
a. classical conditioning
b. cognitive learning
c. operant conditioning
d. psychodynamic theory
Answer: C
180. Intensive behavior modification is most effective with an autistic child when it begins:
a. when the child is 2–3 years old
b. suddenly, so that the child is caught off guard
c. gradually, so that the child can slowly grow accustomed to it
d. when the child is 6–7 years old
Answer: A
181. The administration of a spanking will be most effective when it is:
a. applied immediately after the unwanted behavior
b. administered mildly and gradually increased
c. included with signs of caring such as attention
d. not used in conjunction with a time-out procedure
Answer: A
182. Spanking is an example of:
a. negative reinforcement
b. negative punishment
c. positive punishment
d. time out
Answer: C
183. Harlan and Juanita spank their five-year-old daughter when she misbehaves. However,
after taking a psychology course, Juanita suggests to Harlan that to increase spanking’s
effectiveness they ought to:
a. use it in conjunction with a time-out procedure
b. wait a couple of hours after the inappropriate behavior and then give the spanking
c. make the spanking very, very mild
d. tell their daughter the reason for the spanking
Answer: D
184. A key to the success of the time-out procedure is that it:
a. eliminates reinforcement of undesired behaviors
b. induces fear that suppresses undesired behaviors
c. is more intense than punishment
d. can be administered more immediately than punishment
Answer: A
185. To increase the effectiveness of time out, parents should consider:
a. negative reinforcement
b. combining it with positive reinforcement of desired behaviors
c. negative punishment
d. administering it for one to two hours
Answer: B
186. If you want to train a killer whale to do a trick, what positive reinforcers would work
best and when?
a. access to other killer whales about an hour after the whale correctly performs the specific
behavior
b. removing an unpleasant sound right after the whale correctly performs the specific
behavior
c. foods, toys, and back scratches right after the whale correctly performs the specific
behavior
d. foods, toys, and back scratches about 3 minutes after the whale correctly performs the
specific behavior
Answer: C
187. According to the Critical Thinking section, how do trainers provide immediate feedback
to a killer whale after it has performed successfully?
a. by sounding a whistle
b. an underwater light is turned off
c. the whale is given a toy to play with
d. an underwater light is turned on
Answer: A
188. As a trainer is working with a killer whale, what primary reinforcer is being associated
with the sound of a whistle?
a. a specific behavior
b. to follow a target
c. an underwater light
d. food
Answer: D
189. If a killer whale fails to perform a trick, what is the appropriate response from the
trainer?
a. to turn off an underwater light
b. to be motionless and still
c. to sound the whistle
d. to offer a smaller-than-normal amount of food
Answer: B
TRUE/FALSE
1. Classical conditioning involves behavior and its consequences.
Answer: False
2. According to the Law of Effect, behavior that is followed by positive consequences is
weakened.
Answer: False
3. One of the criticisms of Skinner is that he failed to use an objective measure of behavior.
Answer: False
4. Skinner studied operant conditioning.
Answer: True
5. When successive behaviors are reinforced, the process of generalization is occurring.
Answer: False
6. Reinforcers increase the likelihood that behaviors will be repeated.
Answer: True
7. In operant conditioning, the reinforcer is contingent on the UCR.
Answer: False
8. Behavior is emitted and then followed by a consequence in operant conditioning.
Answer: True
9. Negative reinforcement is a type of punishment.
Answer: False
10. Punishment reduces the chance that a behavior will be repeated.
Answer: True
11. Something unpleasant is presented in negative punishment.
Answer: False
12. Time out removes a misbehaving child from an opportunity for reinforcement.
Answer: True
13. Money is a primary reinforcer.
Answer: False
14. Partial schedules of reinforcement maintain behavior over the long term.
Answer: True
15. If you reinforce a person every 30 minutes, she is on a fixed-ratio schedule.
Answer: False
16. A variable-ratio schedule indicates that behavior is reinforced after an unchanging number
of behaviors have occurred.
Answer: False
17. If a trained dog sits when a stranger says “sit” then generalization has taken place.
Answer: True
18. Later in his life, Skinner recognized the importance of cognitive factors in learning.
Answer: False
19. Prepared learning refers to how animals often learn certain combinations of CS and UCS
more easily than other combinations.
Answer: True
20. According to social cognitive theory, we watch or imitate others’ behaviors.
Answer: True
21. The Lovaas program is most effective for children with less severe autism.
Answer: True
22. When the principles of learning are used to alter undesirable behavior, it is called
behavior modification.
Answer: True
23. Operant conditioning explains how animals learn cognitive maps.
Answer: False
24. Using classical conditioning, trainers use food and toys to reinforce killer whales.
Answer: False
25. A trained killer whale associates a whistle with food.
Answer: True
Test Bank for Introduction to Psychology
Rod Plotnik, Haig Kouyoumdjian
9781133939535, 9781305008113, 9781285061306