L&M CH. 6

Card Set Information

Author:
chuchi
ID:
24169
Filename:
L&M CH. 6
Updated:
2010-06-25 00:16:00
Tags:
Mazur Psychology
Folders:

Description:
L&M CH. 6
The flashcards below were created by user chuchi on FreezingBlue Flashcards.


  1. Thorndike referred to the principle of strengthening a behavior by its consequences as __________; in modern terminology, this is called __________.
    the Law of Effect, reinforcement
  2. In photographing cats in the puzzle box, Guthrie and Horton found that the behaviors of an individual cat were ____________ from trial to trial, but they were _________ from cat to cat.
    similar, different
  3. Superstitious behaviors are more likely to occur when an individual has _________ of the reinforcer.
    little or no control
  4. When using food to shape the behavior of a rat, the sound of the food dispenser is a ____________, and the food itself is a _________.
    conditioned reinforcer, primary reinforcer
  5. Thorndike's research with the puzzle box is an example of a ________ procedure, whereas Skinner's research used a __________ procedure.
    discrete trial, free operant
  6. The three parts of a three-term contingency are the _________, the ____________, and the _________.
    discriminative stimulus, operant response, reinforcer
  7. Each stimulus in the middle of a response chain serves as a _________ for the previous response and as a __________ for the next response.
    conditioned reinforcer, discriminative stimulus
  8. The procedure in which pigeons start to peck at a lighted response key when it precedes food deliveries is called ___________.
    autoshaping
  9. The Brelands used the term instinctive drift to refer to cases where an animal stopped performing _________ behaviors and started performing ___________ behaviors as its training progressed.
    reinforced, instinctive
  10. Another name for Thorndike's Law of Effect is: (This law is based upon reinforcement)
    The Principle of Positive Reinforcement
  11. Which of the following terms refers to the method of successive approximation to a goal? (A method used in behavior modification)
    A. Shaping
    B. Backward Chaining
    C. Resurgence
    D. Stimulus Control
    Shaping
  12. Which of the following was an apparatus used to establish the Law of Effect by Thorndike? (Similar to the Skinner Box)
    A. The Columbia Obstruction Box
    B. The Puzzle Box
    C. The Water Maze
    D. The Shuttle Box
    The Puzzle Box
  13. A previously neutral stimulus that has acquired the capacity to strengthen responses because that stimulus has been repeatedly paired with food is called: (Type of reinforcer)
    Conditioned Reinforcer
  14. In operant conditioning, when an animal displays innate behaviors associated with a given reinforcer, even though these behaviors are not reinforced, this is called: (Type of biological constraint of operant conditioning)
    Instinctive drift
  15. Which of the following reflects the idea that different reinforcers evoke different systems or collections of behaviors? (Way to explain autoshaping behavior)
    A. Sign Tracking
    B. Instinctive Drift
    C. Behavior Systems Analysis
    D. Superstitious behavior
    Behavior Systems Analysis
  16. Resurgence is the reappearance of a previously reinforced response that occurs when a more recently reinforced response is extinguished. (Similar to spontaneous recovery)
    True
  17. Another name for operant conditioning is instrumental conditioning. (The subject's behavior is instrumental in obtaining the reinforcer)
    True
  18. The best example of Skinner's concept of a generalized reinforcer is money. (A conditioned reinforcer paired with a large class of primary reinforcers)
    True
  19. Skinner referred to ____ behaviors as those that occur frequently early in the interval between reinforcers. (Examples include pecking towards the floor or moving along the front wall of the Skinner box.)
    Interim
  20. A ____ reinforcer is a stimulus that naturally strengthens any response it follows. (Critical component for shaping)
    Primary
  21. In ____ chaining, the teacher starts by reinforcing the first response of the chain, then gradually adds the second response, and so on. (Often used to train animals for sequence behaviors)
    Forward
  22. The procedure in which pigeons start to peck at a lighted response key when it precedes food deliveries is called ______: (Behavior that occurs without reinforcement)
    Autoshaping
  23. In instrumental conditioning, procedures that make use of lever pressing or similar responses are called:
    Free operant conditioning
  24. A sequence of behaviors that must occur in a specific order, with the primary reinforcer being delivered after the final response of the sequence.
    response chain
  25. The broad topic of how stimuli that precede a behavior can control the occurrence of that behavior.
    stimulus control
  26. The method of successive approximation.
    shaping
  27. If a response is followed by a reinforcer, the frequency of that response will increase.
    law of effect
  28. The subject's behavior is instrumental in obtaining the reinforcer.
    instrumental conditioning
  29. A previously neutral stimulus that has acquired the capacity to strengthen responses due to its repeated pairing with a primary reinforcer.
    conditioned reinforcer
  30. A reinforcer is anything that
    increases the frequency of a given response occurring again
  31. Thorndike demonstrated his principle of
    Law of Effect using the puzzle box experiment
  32. Superstitious behavior is the result of
    accidental reinforcement
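Accidental reinforcement can be sketched as a tiny simulation (a hypothetical illustration with made-up behaviors and numbers, not from the chapter): food arrives on a fixed-time schedule regardless of what the pigeon is doing, and whatever behavior happens to precede each delivery is strengthened, so one arbitrary "superstitious" behavior tends to dominate.

```python
import random

# Hypothetical sketch of Skinner's (1948) superstition experiment.
# Behaviors, weights, and schedule values are illustrative assumptions.
random.seed(1)

behaviors = ["turn", "peck floor", "head toss", "wing flap"]
strength = {b: 1.0 for b in behaviors}

for second in range(1, 601):  # 10 simulated minutes
    # emit a behavior with probability proportional to its current strength
    current = random.choices(behaviors,
                             weights=[strength[b] for b in behaviors])[0]
    if second % 15 == 0:      # fixed-time food delivery, every 15 s
        strength[current] += 1.0  # accidental reinforcement of whatever occurred

dominant = max(strength, key=strength.get)
```

Because each accidental reinforcement makes that same behavior more likely to be occurring at the next delivery, the process is self-reinforcing: `dominant` typically ends up far stronger than the others even though food never depended on it.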
  33. Successive approximation or shaping has many applications specifically with regard to
    behavior modification in the classroom
  34. Skinner used the term operant conditioning (also called instrumental conditioning) to
    describe behaviors that are strengthened by reinforcement
  35. Three components of operant conditioning are:
    1) the stimuli that precedes the response, 2) the response itself, and 3) the reinforcer
  36. Analogous to spontaneous recovery in classical conditioning, the reappearance of a previously reinforced response when a more recently reinforced response is extinguished is called
    resurgence
  37. A response chain consists of
    alternating series of stimuli and responses, and only the last response is followed by the primary reinforcer
  38. The phenomenon of autoshaping is used to
    challenge the generality of the principle of reinforcement
  39. puzzle box
    • The Law of Effect:
    • Thorndike’s experiment
  40. Cats could escape by making a response (e.g., pulling on a string)
    • The Law of Effect:
    • Thorndike’s experiment
  41. Measure of performance was escape latency
    o first escapes occurred by chance; escape latencies decreased steadily over repeated trials
    • The Law of Effect:
    • Thorndike’s experiment
  42. Thorndike’s version of the principle of reinforcement, which states that responses that are followed by pleasant or satisfying stimuli will be strengthened and will occur more often in the future.
    o Positive Reinforcement
    The Law of Effect
  43. This principle states that there is a parallel between the _____ of the camera and the ________ in the experiments by ________
    • Law of Effect and the Stop-Action Principle:
    • action, reinforcer, Guthrie and Horton.
  44. The specific bodily position and the muscle movements occurring at the moment of reinforcement will have a
    • Law of Effect and the Stop-Action Principle:
    • higher probability of occurring on the next trial
  45. Skinner’s (1948) superstition experiment
    Superstitious Behaviors
  46. Whatever behavior happened to be occurring when the reinforcer was delivered was strengthened.
    Skinner’s (1948) superstition experiment
  47. A behavior that occurs because, by accident or coincidence, it has previously been followed by a reinforcer.
    superstition
  48. common among athletes
    superstition
  49. Superstitions that are widely held are probably due to
    communication with others
  50. Some superstitions were originally
    valid beliefs (e.g., bad luck to light 3 cigarettes with 1 match)
  51. A procedure for teaching new behavior in which closer and closer approximations to the desired behavior are reinforced.
    Shaping, or Successive Approximations
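Successive approximation can be simulated in a few lines (a hypothetical sketch with invented numbers, not from the chapter): reinforce any response that meets the current criterion, which both shifts the animal's typical response toward the reinforced form and lets the trainer raise the criterion toward the target.

```python
import random

# Hypothetical sketch of shaping a rat's head height by successive
# approximation. All parameters (heights, learning rate, step size)
# are illustrative assumptions.
random.seed(0)

tendency = 2.0    # average head height the rat currently emits (cm)
criterion = 2.0   # height currently required for reinforcement (cm)
target = 10.0     # final desired height (cm)

for trial in range(300):
    response = random.gauss(tendency, 1.0)        # behavioral variability
    if response >= criterion:                     # close enough: reinforce
        tendency += 0.2 * (response - tendency)   # reinforced form strengthens
        criterion = min(target, criterion + 0.1)  # demand a closer approximation
```

The loop captures why shaping works at all: behavioral variability occasionally produces a closer approximation, reinforcement makes that form more typical, and the criterion can then be tightened without extinguishing responding.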
  52. Lever pressing in a rat
    Shaping (example)
  53. Hypothetical distribution of height of rat’s head
    (shaping example)
  54. Shaping behaviors in the
    classroom
  55. Shaping as a ______ in behavior modification (e.g., teaching self-care skills to the mentally disabled)
    tool
  56. Different from the discrete trial procedure used by Thorndike; the operant response can occur at any time and can occur repeatedly for as long as the subject remains in the Skinner box.
    • Research of B.F. Skinner:
    • The Free Operant Procedure
  57. Response rate
    The Free Operant Procedure
  58. According to Skinner, operant conditioning involves a ___________
    Three-Term Contingency
  59. 1) the context in which the response occurs
    - the discriminative stimulus
    Three-Term Contingency
  60. 2) the response itself
    - lever pressing
    Three-Term Contingency
  61. 3) the reinforcer
    - reinforcer
    Three-Term Contingency
  62. This is the reappearance of a previously reinforced response that occurs when a more recently reinforced response is extinguished
    • Operant Conditioning:
    • Resurgence
  63. stimulus that naturally strengthens any response it follows (e.g., food, water, comfort)
    • Operant Conditioning:
    • Primary Reinforcer
  64. This reinforcer acts as a surrogate for the primary reinforcer
    • Operant Conditioning:
    • Conditioned Reinforcer
  65. A class of conditioned reinforcers that are associated with a number of different primary reinforcers
    Example: money
    • Operant Conditioning:
    • Generalized Reinforcers
  66. learning to respond to one stimulus but not another
    • Operant Conditioning:
    • Discrimination Learning
  67. a stimulus that indicates whether or not responding will lead to reinforcement
    • Operant Conditioning:
    • Discriminative stimulus
  68. Extinction, discrimination, and generalization
    occur in operant conditioning just as they do in classical conditioning
  69. A sequence of behaviors that must occur in a specific order, with the primary reinforcer being delivered only after the final response of the sequence
    Response Chains
  70. Of backward chaining, forward chaining, and the total task method, which is most effective?
    • Backward and Forward Chaining
    • It depends on the training situation
  71. Can be trained three ways in operant conditioning (forward chaining, backward chaining, total task method)
    Response Chains
  72. With extensive experience, the subject’s performance drifts away from the reinforced behaviors toward instinctive behaviors that occur when the animal is seeking the reinforcer.
    • Biological Constraints on Operant Conditioning:
    • Instinctive Drift
  73. Experiment by Brown and Jenkins (1968); Similarity with superstitious behavior
    • Biological Constraints on Operant Conditioning:
    • Autoshaping
