Chapter 6 Multiple Choice Questions

1. In ____ conditioning, it is what happens ____ the behavior that is critical.
a. instrumental; after
b. classical; after
c. instrumental; before
d. both classical and instrumental; before
Answer: a. instrumental; after
Rationale: In instrumental conditioning (also known as operant conditioning), the consequence (what happens after the behavior) is critical in determining the likelihood of the behavior occurring again. This is in contrast to classical conditioning, where the critical factor is the association between stimuli.

2. The ____ are examples of discrete trial apparatuses and the ____ is an example of a free operant apparatus.
a. maze, puzzle box, Skinner box; runway
b. maze, puzzle box, runway; Skinner box
c. puzzle box, runway, Skinner box; maze
d. maze, runway, Skinner box; puzzle box
Answer: b. maze, puzzle box, runway; Skinner box
Rationale: Discrete trial apparatuses, such as the maze, puzzle box, and runway, constrain the subject's behavior to trials with a defined beginning and end. Free operant apparatuses, such as the Skinner box, allow the behavior to occur freely and repeatedly over time.

3. Skinner defined reinforcers and punishers technically according to
a. their intensity.
b. the probability of their occurrence.
c. their effect on behavior.
d. the extent to which they are perceived as pleasant versus unpleasant.
Answer: c. their effect on behavior.
Rationale: Skinner defined reinforcers as stimuli that increase the likelihood of a behavior occurring again and punishers as stimuli that decrease that likelihood. The critical factor in this definition is their effect on behavior.

4. When combined with the terms "reinforcement" or "punishment," the word "positive" means
a. something that is appetitive.
b. something that is subtle.
c. something is added or presented.
d. both something that is appetitive and something that is subtle.
Answer: c. something is added or presented.
Rationale: In operant conditioning, "positive" refers to the addition or presentation of a stimulus following a behavior, regardless of whether the stimulus is appetitive or aversive.

5. Two types of negative reinforcement are _____ and _____.
a. time out; response cost
b. escape; response cost
c. escape; avoidance
d. avoidance; response cost
Answer: c. escape; avoidance
Rationale: Negative reinforcement involves the removal or avoidance of an aversive stimulus following a behavior. "Escape" involves terminating an ongoing aversive stimulus, while "avoidance" involves preventing the aversive stimulus from occurring altogether.

6. Two types of negative punishment are _____ and _____.
a. time out; response cost
b. escape; response cost
c. escape; avoidance
d. avoidance; response cost
Answer: a. time out; response cost
Rationale: Negative punishment involves the removal of a desired stimulus following a behavior, leading to a decrease in the likelihood of that behavior occurring again. "Time out" involves removing access to positive reinforcement for a period of time, while "response cost" involves removing specific reinforcers as a consequence of the behavior.

7. Melissa stayed out past her curfew and subsequently lost car privileges for a week. As a result, she never again stayed out past her curfew. This example best illustrates the process of
a. positive reinforcement.
b. negative reinforcement.
c. positive punishment.
d. negative punishment.
Answer: d. negative punishment.
Rationale: Negative punishment involves the removal of a desired stimulus (car privileges) following a behavior (staying out past curfew), resulting in a decrease in the likelihood of that behavior occurring again (Melissa never again staying out past curfew).

8. Innate is to learned as ____ reinforcer is to ____ reinforcer.
a. secondary; primary
b. primary; secondary
c. intrinsic; extrinsic
d. extrinsic; intrinsic
Answer: b. primary; secondary
Rationale: Primary reinforcers are innate, biological reinforcers such as food, water, or shelter. Secondary reinforcers are learned reinforcers that acquire their reinforcing properties through association with primary reinforcers or other secondary reinforcers.

9. Money and praise are common examples of ____ reinforcers.
a. primary
b. secondary
c. unconditioned
d. generalized
Answer: d. generalized
Rationale: Money and praise are generalized reinforcers because they are associated with, and provide access to, a wide variety of primary and secondary reinforcers. They are not inherently reinforcing like primary reinforcers but gain their reinforcing properties through association with many other reinforcers.

10. A(n) ____ stimulus serves as a signal that a response will be followed by a reinforcer.
a. operant
b. discriminative
c. conditioned
d. appetitive
Answer: b. discriminative
Rationale: A discriminative stimulus is a cue or signal that indicates the likelihood of reinforcement following a particular response. It sets the occasion for the behavior to occur by signaling the availability of reinforcement.

11. In correct order, the three-term contingency consists of:
a. antecedent, consequence, and behavior.
b. antecedent, behavior, and consequence.
c. consequence, behavior, and antecedent.
d. behavior, antecedent, and consequence.
Answer: b. antecedent, behavior, and consequence.
Rationale: The three-term contingency in behavior analysis refers to the relationship between antecedents (stimuli that precede a behavior), the behavior itself, and consequences (stimuli that follow the behavior). The correct order is antecedent, behavior, and consequence.

12. You press the power button to turn on your computer and wait for the desktop to appear before opening a program. This is an example of:
a. shaping.
b. prompting.
c. fading.
d. a stimulus-response chain.
Answer: d. a stimulus-response chain.
Rationale: A stimulus-response chain is a sequence of behaviors in which each response produces a stimulus that sets the occasion for the next response. Here, pressing the power button (response) produces the appearance of the desktop (stimulus), which in turn sets the occasion for opening a program (the next response).

13. You press the power button to turn on your computer and wait for the desktop to appear before opening a program. In this example, seeing the desktop is both:
a. a discriminative stimulus for turning on the computer and a conditioned reinforcer for opening the program.
b. a conditioned reinforcer for turning on the computer and a discriminative stimulus for opening the program.
c. a warning stimulus for turning on the computer and a conditioned reinforcer for opening the program.
d. a warning stimulus for opening the program and a conditioned reinforcer for turning on the computer.
Answer: b. a conditioned reinforcer for turning on the computer and a discriminative stimulus for opening the program.
Rationale: Seeing the desktop serves as a consequence (conditioned reinforcer) for the behavior of turning on the computer, thus reinforcing that behavior. At the same time, seeing the desktop serves as a discriminative stimulus for the behavior of opening a program, indicating that it is an appropriate time to engage in that behavior.

14. Shaping is the:
a. reinforcement of new operant behavior.
b. gradual reinforcement of new operant behavior.
c. reinforcement of gradual approximations to a new behavior.
d. creation of new behavior through gradual reinforcement.
Answer: c. reinforcement of gradual approximations to a new behavior.
Rationale: Shaping involves reinforcing successive approximations of a desired behavior until the desired behavior is fully formed. It entails reinforcing behaviors that are closer and closer to the target behavior until the target behavior is achieved.

15. The sound of a click can be an effective tool for shaping after it has been paired with ____, thereby making it a ____.
a. food; secondary reinforcer.
b. shock; primary punisher.
c. food; primary reinforcer.
d. shock; secondary punisher.
Answer: a. food; secondary reinforcer.
Rationale: In shaping, a click (or any other conditioned reinforcer) becomes effective after being paired with a primary reinforcer such as food. Over time, the click itself becomes reinforcing because it predicts the delivery of the primary reinforcer (food), making it a secondary reinforcer.

16. A child is asked to trace dotted lines in order to learn to write the letters of the alphabet. This is an example of the ____ procedure.
a. shaping.
b. prompting.
c. fading.
d. discrimination training.
Answer: b. prompting.
Rationale: Prompting involves providing assistance or cues to help an individual perform a behavior. In this scenario, the child is provided with dotted lines as prompts to guide them in tracing the letters of the alphabet.

17. After the child successfully traces dotted lines of a letter of the alphabet, the space between dots is gradually increased until the dots are eliminated altogether. This is an example of the ____ procedure.
a. shaping.
b. prompting.
c. fading.
d. discrimination training.
Answer: c. fading.
Rationale: Fading involves gradually removing prompts or assistance once the behavior is mastered so that the individual can perform the behavior independently. In this case, increasing the space between the dots gradually removes the prompting until the child can trace the letters without any assistance.

18. A powerful sequence for using prompts to establish verbal control of behavior is:
a. verbal prompt, physical prompt, gestural prompt.
b. verbal prompt, gestural prompt, physical prompt.
c. physical prompt, gestural prompt, verbal prompt.
d. gestural prompt, physical prompt, verbal prompt.
Answer: c. physical prompt, gestural prompt, verbal prompt.
Rationale: This sequence starts with the most intrusive prompt (physical), then moves to a less intrusive prompt (gestural), and finally to the least intrusive prompt (verbal). It allows for the gradual fading of prompts, leading to independent behavior.

19. Putting on a heavy parka before going out into the cold is an example of a(n) ____ response, while putting it on after you go outside and become cold is an example of a(n) ____ response.
a. operant; reflexive.
b. avoidance; escape.
c. escape; avoidance.
d. reflexive; operant.
Answer: b. avoidance; escape.
Rationale: Avoidance behavior occurs in anticipation of an aversive stimulus, such as putting on a parka before going out into the cold to avoid feeling cold. Escape behavior occurs in response to an ongoing aversive stimulus, such as putting on a parka after feeling cold to escape the discomfort.

20. It is relatively easy to teach a rat to ____ to avoid a shock but much more difficult to teach a rat to ____ to avoid shock.
a. press a bar; run down a runway.
b. run down a runway; press a bar.
c. press a bar; walk down a runway.
d. walk down a runway; press a bar.
Answer: b. run down a runway; press a bar.
Rationale: Running is part of the rat's natural defensive (flight) repertoire, so running down a runway is readily learned as an avoidance response. Bar pressing is not part of that defensive repertoire, which makes it much more difficult to establish as an avoidance response.

Test Bank for Adaptive Learning and the Human Condition, Jeffrey C. Levy (ISBN 9780205950775)