AP Psychology Unit 6 Vocabulary

Card Set Information

Author:
apelletier33
ID:
185304
Filename:
AP Psychology Unit 6 Vocabulary
Updated:
2012-11-26 19:26:35
Tags:
AP Psychology Unit Vocabulary
Folders:

Description:
AP Psychology Unit 6 Vocabulary

The flashcards below were created by user apelletier33 on FreezingBlue Flashcards.


  1. A relatively permanent change in an organism's behavior due to experience.
    Learning
  2. An organism's decreasing response to a stimulus with repeated exposure to it.
    Habituation
  3. Learning that certain events occur together. The events may be two stimuli (as in classical conditioning) or a response and its consequences (as in operant conditioning).
    Associative learning
  4. A type of learning in which one learns to link two or more stimuli and anticipate events.
    Classical conditioning
  5. The view that psychology (1) should be an objective science that (2) studies behavior without reference to mental processes. Most research psychologists today agree with (1) but not with (2).
    Behaviorism
  6. In classical conditioning, the unlearned, naturally occurring response to the unconditioned stimulus (US), such as salivation when food is in the mouth.
    Unconditioned response (UR)
  7. In classical conditioning, a stimulus that unconditionally (naturally and automatically) triggers a response.
    Unconditioned stimulus (US)
  8. In classical conditioning, the learned response to a previously neutral (but now conditioned) stimulus (CS).
    Conditioned response (CR)
  9. In classical conditioning, an originally irrelevant stimulus that, after association with an unconditioned stimulus (US), comes to trigger a conditioned response.
    Conditioned stimulus (CS)
  10. In classical conditioning, the initial stage, when one links a neutral stimulus and an unconditioned stimulus so that the neutral stimulus begins triggering the conditioned response. In operant conditioning, the strengthening of a reinforced response.
    Acquisition
  11. A procedure in which the conditioned stimulus in one conditioning experience is paired with a new neutral stimulus, creating a second (often weaker) conditioned stimulus. For example, an animal that has learned that a tone predicts food might then learn that a light predicts the tone and begin responding to the light alone. (Also called second-order conditioning.)
    Higher-order conditioning
  12. The diminishing of a conditioned response; occurs in classical conditioning when an unconditioned stimulus (US) does not follow a conditioned stimulus (CS); occurs in operant conditioning when a response is no longer reinforced.
    Extinction
  13. The reappearance, after a pause, of an extinguished conditioned response.
    Spontaneous recovery
  14. The tendency, once a response has been conditioned, for stimuli similar to the conditioned stimulus to elicit similar responses.
    Generalization
  15. In classical conditioning, the learned ability to distinguish between a conditioned stimulus and stimuli that do not signal an unconditioned stimulus.
    Discrimination
  16. The hopelessness and passive resignation an animal or human learns when unable to avoid repeated aversive events.
    Learned helplessness
  17. Behavior that occurs as an automatic response to some stimulus.
    Respondent behavior
  18. A type of learning in which behavior is strengthened if followed by a reinforcer or diminished if followed by a punisher.
    Operant conditioning
  19. Behavior that operates on the environment, producing consequences.
    Operant behavior
  20. Thorndike's principle that behaviors followed by favorable consequences become more likely, and that behaviors followed by unfavorable consequences become less likely.
    Law of effect
  21. In operant conditioning research, a chamber (also known as a Skinner box) containing a bar or key that an animal can manipulate to obtain a food or water reinforcer; attached devices record the animal's rate of bar pressing or key pecking.
    Operant chamber
  22. An operant conditioning procedure in which reinforcers guide behavior toward closer and closer approximations of the desired behavior.
    Shaping
  23. In operant conditioning, a stimulus that elicits a response after association with reinforcement (in contrast to related stimuli not associated with reinforcement).
    Discriminative stimulus
  24. In operant conditioning, any event that strengthens the behavior it follows.
    Reinforcer
  25. Increasing behaviors by presenting positive stimuli, such as food. This is any stimulus that, when presented after a response, strengthens the response.
    Positive reinforcement
  26. Increasing behaviors by stopping or reducing negative stimuli, such as shock. This is any stimulus that, when removed after a response, strengthens the response. (Note: this is not punishment.)
    Negative reinforcement
  27. An innately reinforcing stimulus, such as one that satisfies a biological need.
    Primary reinforcer
  28. A stimulus that gains its reinforcing power through its association with a primary reinforcer; also known as a secondary reinforcer.
    Conditioned reinforcer
  29. Reinforcing the desired response every time it occurs.
    Continuous reinforcement
  30. Reinforcing a response only part of the time; results in slower acquisition of a response but much greater resistance to extinction than does continuous reinforcement.
    Partial (intermittent) reinforcement
  31. In operant conditioning, a reinforcement schedule that reinforces a response only after a specified number of responses.
    Fixed-ratio schedule
  32. In operant conditioning, a reinforcement schedule that reinforces a response after an unpredictable number of responses.
    Variable-ratio schedule
  33. In operant conditioning, a reinforcement schedule that reinforces a response only after a specified time has elapsed.
    Fixed-interval schedule
  34. In operant conditioning, a reinforcement schedule that reinforces a response at unpredictable time intervals.
    Variable-interval schedule
  35. An event that decreases the behavior that it follows.
    Punishment
  36. A mental representation of the layout of one's environment. For example, after exploring a maze, rats act as if they have learned a cognitive map of it.
    Cognitive map
  37. Learning that occurs but is not apparent until there is an incentive to demonstrate it.
    Latent learning
  38. A sudden and often novel realization of the solution to a problem.
    Insight
  39. A desire to perform a behavior effectively for its own sake.
    Intrinsic motivation
  40. A desire to perform a behavior to receive promised rewards or avoid threatened punishment.
    Extrinsic motivation
  41. A system for electronically recording, amplifying, and feeding back information regarding a subtle physiological state, such as blood pressure or muscle tension.
    Biofeedback
  42. Learning by observing others. Also called social learning.
    Observational learning
  43. The process of observing and imitating a specific behavior.
    Modeling
  44. Frontal lobe neurons that fire when performing certain actions or when observing another doing so. The brain's mirroring of another's action may enable imitation and empathy.
    Mirror neurons
  45. Positive, constructive, helpful behavior. The opposite of antisocial behavior.
    Prosocial behavior
  46. Punishment...
    May be used frequently because, if the punished person stops misbehaving for a while, this reinforces the punisher.
  47. Which of the following is an application of shaping?
    A mother playing catch with her daughter gradually backs up to increase the distance between the two of them.
  48. Mirror neurons may...
    Be the mechanism by which the brain accomplishes observational learning.
  49. Which of the following illustrates generalization?
    A rabbit that has been conditioned to blink to a tone also blinks when a similar tone is sounded.
  50. Albert Bandura's Bobo doll experiments demonstrated that...
    Children are more likely to imitate the behavior of adults.
  51. Robert Rescorla and Allan Wagner conducted experiments that established...
    The importance of cognitive factors in classical conditioning.
  52. Edward Thorndike's law of effect...
    States that rewarded behavior is more likely to happen again.
  53. To produce the acquisition of a conditioned response, one should...
    Pair a neutral stimulus with an unconditioned stimulus several times.
  54. To determine just what an organism can learn to distinguish, you would use...
    A discriminative stimulus.
  55. A student studies long and hard to avoid the bad feelings associated with a low grade on a test. In this case, the studying behavior is being strengthened because of...
    Negative reinforcement.
  56. Taste aversion research has demonstrated that...
    There are biological predispositions involved in learning.
  57. Mary checks her phone a couple times an hour for incoming text messages. Her behavior is being maintained on a ________ ________ reinforcement schedule.
    Variable interval
  58. A dog is trained to salivate when it hears a tone. Then the tone is sounded repeatedly without a US until the dog stops salivating. Later, when the tone sounds again, the dog salivates again. This is a description of...
    Spontaneous recovery
  59. Latent learning demonstrates that...
    Cognition plays an important role in operant conditioning.
  60. Classical and operant conditioning were both initially based on the principles of...
    Behaviorism
  61. Our capacity to learn new behaviors that help us cope with changing circumstances.
    Adaptability
  62. A relatively permanent behavior change due to experience.
    Learning
  63. What are the three types of learning?
    • 1. Classical conditioning
    • 2. Operant conditioning
    • 3. Observational learning
  64. What do we learn by?
    Association
  65. The process of learning associations.
    Conditioning.
  66. In this type of conditioning, we learn to associate two stimuli and thus to anticipate events.
    Classical conditioning
  67. In this type of conditioning, we learn to associate a response (our behavior) and its consequences and thus to repeat acts followed by good results and avoid acts followed by bad results.
    Operant conditioning
  68. Through this type of learning, we learn from others' experiences.
    Observational learning
  69. An example of this type of conditioning is we learn to expect and prepare for significant events such as food or pain.
    Classical conditioning
  70. An example of this type of conditioning is we also learn to repeat acts that bring good results and to avoid acts that bring bad results.
    Operant conditioning
  71. By watching others we learn new behaviors.
    Observational learning
  72. What did Pavlov's work lay the foundation for?
    Many of psychologist John B. Watson's ideas, which formed behaviorism.
  73. Something the participant can notice but doesn't associate with the US.
    Neutral events
  74. Conditioned =
    Learned
  75. Unconditioned =
    Unlearned.
  76. What are the five major conditioning processes?
    • 1. Acquisition
    • 2. Extinction
    • 3. Spontaneous recovery
    • 4. Generalization
    • 5. Discrimination
  77. Why is classical conditioning biologically adaptive?
    Because it helps humans and other animals prepare for good or bad events.
  78. Conditioning helps an animal survive and reproduce by...
    • Responding to cues that help it
    • -gain food
    • -avoid dangers
    • -locate mates
    • -produce offspring
  79. Higher-order conditioning tends to be weaker than first-stage conditioning, but it still influences our everyday lives. It is also called...
    Second-order conditioning
  80. Biological influences on learning.
    • 1. Genetic predispositions
    • 2. Unconditioned responses
    • 3. Adaptive responses
  81. Psychological influences on learning.
    • 1. Previous experiences.
    • 2. Predictability of associations.
    • 3. Generalization
    • 4. Discrimination
  82. Social-cultural influences on learning.
    • 1. Culturally learned preferences.
    • 2. Motivation, affected by presence of others.
  83. The learned ability to distinguish between a conditioned stimulus (which predicts the US) and other irrelevant stimuli.
    Discrimination
  84. In their dismissal of "mentalistic" concepts such as consciousness, who underestimated the importance of cognitive processes (thoughts, perceptions, expectations) and biological constraints on an organism's learning capacity?
    Pavlov and Watson
  85. What did Robert Rescorla and Allan Wagner show?
    That an animal can learn the predictability of an event.
  86. An awareness of how likely it is that the US will occur.
    Expectancy
  87. When the CS is something similar to stimuli associated with sexual activity in the natural environment, such as the stuffed head of a female quail.
    Ecologically relevant
  88. What does learning enable animals to do?
    Adapt to their environments.
  89. Why is Pavlov's work important? The importance lies in this finding:
    Many other responses to many other stimuli can be classically conditioned in many other organisms.
  90. Who showed us how a process such as learning can be studied objectively?
    Pavlov.
  91. 1. Former drug users often feel a craving when they are again in the drug-using context, with people or in places they associate with previous highs.
    2. When a particular taste accompanies a drug that influences immune response, the taste by itself may come to produce an immune response.
    Applications of classical conditioning
  92. To teach an elephant to walk on its hind legs or a child to say please, we must turn to another type of learning-
    Operant conditioning.
  93. These are both forms of associative learning.
    Classical conditioning and operant conditioning.
  94. Forms associations between stimuli (a CS and the US it signals).
    Classical conditioning
  95. Behavior that operates on the environment to produce rewarding or punishing stimuli is called...
    Operant behavior
  96. By asking "Is the organism learning associations between events it does not control, or between its behavior and resulting events?" what can you distinguish?
    Distinguish classical from operant conditioning.
  97. Using Thorndike's law of effect as a starting point, what did Skinner develop?
    A behavioral technology that revealed principles of behavior control.
  98. For his pioneering studies, Skinner designed an operant chamber, popularly known as a...
    Skinner box.
  99. In his experiments, Skinner used this, a procedure in which reinforcers, such as food, gradually guide an animal's actions toward a desired behavior.
    Shaping
  100. Description of this operant conditioning term is to add a desirable stimulus.
    Positive reinforcement
  101. Examples of this operant conditioning term are getting a hug; receiving a paycheck.
    Positive reinforcement
  102. Description of this operant conditioning term is to remove an aversive stimulus.
    Negative reinforcement
  103. An example of this operant conditioning term is fastening a seatbelt to turn off beeping.
    Negative reinforcement
  104. What are two ways to increase behavior?
    • 1. Positive reinforcement
    • 2. Negative reinforcement
  105. Strengthens a response by presenting a typically pleasurable stimulus after a response.
    Positive reinforcement
  106. Strengthens a response by reducing or removing something undesirable or unpleasant, as when an organism escapes an aversive situation.
    Negative reinforcement
  107. Negative reinforcement is not...
    Punishment.
  108. Removes a punishing (aversive) event.
    Negative reinforcement
  109. Whether it works by reducing something aversive, or by giving something desirable, this is any consequence that strengthens behavior.
    Reinforcement
  110. Conditioned reinforcers, which get their power through learned association with primary reinforcers, are also called what?
    Secondary reinforcers
  111. Although initial learning is slower, intermittent reinforcement produces greater what than continuous reinforcement?
    Resistance to extinction.
  112. Reinforcement linked to number of responses.
    A ratio schedule.
  113. Reinforcement linked to amount of time.
    An interval schedule.
  114. Does a ratio schedule or an interval schedule produce a higher response rate?
    A ratio schedule.
  115. Every so many: reinforcement after every nth behavior, such as buy 10 coffees, get 1 free, or pay per product unit produced.
    Fixed-ratio schedule of reinforcement
  116. After an unpredictable number: reinforcement after a random number of behaviors, as when playing slot machines or fly-casting.
    Variable-ratio schedule of reinforcement
  117. Every so often: reinforcement for behavior after a fixed time, such as Tuesday discount prices.
    Fixed-interval schedule of reinforcement
  118. Unpredictably often: reinforcement for behavior after a random amount of time, as in checking for e-mail.
    Variable-interval schedule of reinforcement
  119. What produces more consistent responding, an unpredictable (variable) schedule or a predictable (fixed) schedule?
    An unpredictable (variable) schedule.
  120. Variable-interval schedules reinforce the first response after _______ time intervals.
    Varying
  121. Any consequence that decreases the frequency of a preceding behavior.
    Punisher
  122. What are the four drawbacks of physically punishing children?
    • 1. Punished behavior is suppressed, not forgotten.
    • 2. Punishment teaches discrimination.
    • 3. Punishment can teach fear.
    • 4. Physical punishment may increase aggressiveness by modeling aggression as a way to cope with problems.
  123. Punishment tells you what not to do; reinforcement...
    tells you what to do.
  124. There is more to learning than associating a response with a consequence; there is also...
    Cognition.
  125. Biological constraints predispose organisms to?
    Learn associations that are naturally adaptive.
  126. Description of this type of punisher is to administer an aversive stimulus.
    Positive punishment
  127. Possible examples of this type of punisher are spanking; a parking ticket.
    Positive punishment
  128. Description of this type of punisher is to withdraw a desirable stimulus.
    Negative punishment
  129. Possible examples of this type of punisher are time-out from privileges (such as time with friends); revoked driver's license.
    Negative punishment
  130. Basic idea of this type of conditioning is organisms learn associations between events they don't control.
    Classical conditioning
  131. Basic idea of this type of conditioning is organisms learn associations between their behavior and resulting events.
    Operant conditioning
  132. Response of this type of conditioning is involuntary, automatic.
    Classical conditioning
  133. Response of this type of conditioning is voluntary, operates on environment.
    Operant conditioning
  134. Acquisition of this type of conditioning is associating events; CS announces US.
    Classical conditioning
  135. Acquisition of this type of conditioning is associating response with a consequence (reinforcer or punisher).
    Operant conditioning
  136. Extinction of this type of conditioning is CR decreases when CS is repeatedly presented alone.
    Classical conditioning
  137. Extinction of this type of conditioning is responding decreases when reinforcement stops.
    Operant conditioning
  138. Spontaneous recovery of this type of conditioning is the reappearance, after a rest period, of an extinguished CR.
    Classical conditioning
  139. Spontaneous recovery of this type of conditioning is the reappearance, after a rest period, of an extinguished response.
    Operant conditioning
  140. Generalization of this type of conditioning is the tendency to respond to stimuli similar to the CS.
    Classical conditioning
  141. Generalization of this type of conditioning is organisms' responses to similar stimuli are also reinforced.
    Operant conditioning
  142. Discrimination of this type of conditioning is the learned ability to distinguish between a CS and other stimuli that do not signal a US.
    Classical conditioning
  143. Discrimination of this type of conditioning is organisms learn that certain responses, but not others, will be reinforced.
    Operant conditioning
  144. Cognitive processes of this type of conditioning is organisms develop the expectation that the CS signals the arrival of the US.
    Classical conditioning
  145. Cognitive processes of this type of conditioning is organisms develop the expectation that a response will be reinforced or punished; they also exhibit latent learning, without reinforcement.
    Operant conditioning
  146. Biological predispositions of this type of conditioning is natural predispositions constrain what stimuli and responses can easily be associated.
    Classical conditioning
  147. Biological predispositions of this type of conditioning is organisms best learn behaviors similar to their natural behaviors; unnatural behaviors instinctively drift back toward natural ones.
    Operant conditioning
  148. Skinner and others worked toward a day when teaching machines and textbooks would shape learning in small steps, immediately reinforcing correct responses (as with the use of computers).
    Operant conditioning principles applied in school.
  149. The key is to shape behavior, by first reinforcing small successes and then gradually increasing the challenge.
    Operant conditioning principles applied in sports.
  150. Reward specific, achievable behaviors, not vaguely defined "merit".
    Operant conditioning principles applied at work.
  151. Operant conditioning principles also remind us that reinforcement should be..?
    IMMEDIATE
  152. To disrupt this cycle, parents should remember the basic rule of shaping:
    Notice people doing something right and affirm them for it.
  153. To do this, you need to reinforce your own desired behaviors and extinguish the undesired ones.
    To build up your self-control.
  154. What are the four steps to build up your self-control?
    • 1. State your goal-to cease smoking, eat less, exercise more, or stop procrastinating-in measurable terms, and announce it.
    • 2. Monitor how often you engage in your desired behavior.
    • 3. Reinforce the desired behavior.
    • 4. Reduce the rewards gradually.
  155. Giving this type of stimulus is positive reinforcement; taking it away is negative punishment.
    Desired (for example, a compliment).
  156. Giving this type of stimulus is positive punishment; taking it away is negative reinforcement.
    Undesired/aversive (for example, an insult).
  157. Higher animals, especially humans, can learn without direct experience through observational learning, because we learn by observing and imitating others. Also called..?
    Social learning
  158. Mirror neurons help give rise to children's empathy and to their ability to infer another's mental state, an ability known as...
    Theory of mind
  159. What do our brain's mirror neurons underlie?
    Our intensely social nature.
  160. What do many business organizations use to train communications, sales, and customer service skills?
    Behavior modeling
  161. Observational learning may have...
    Antisocial effects.
  162. What two factors seem to account for the violence-viewing effect?
    • 1. Imitation
    • 2. Desensitization of viewers
  163. Why is Pavlov's work important?
    That significant psychological phenomena can be studied objectively, and that classical conditioning is a basic form of learning that applies to all species. Later research modified this finding somewhat by showing that in many species cognition and biological predispositions place some limits on conditioning.
  164. Expanding on Edward Thorndike's law of effect, B.F. Skinner and others found that?
    The behavior of rats or pigeons placed in an operant chamber (Skinner box) can be shaped by using reinforcers to guide closer and closer approximations of the desired behavior.
  165. What did Skinner underestimate?
    The limits that cognitive and biological constraints place on conditioning.
  166. What do Albert Bandura's experiments on observational learning (also called social learning) demonstrate?
    How we observe and imitate others.
