Sensation and Perception Part 3

2014-05-02 10:57:49

Attention to Speech and Music

  1. What Attention does (4)
    • Prepares
    • Selects
    • Organizes
    • Integrates Sensations
  2. What we direct attention to
    • Novelty (Relevant info)
    • Movement
    • Intensity
    • Contrast
  3. Orienting Attention
    • adjustment to important info, specific stimuli
    • adjustment of eyes, ears, etc.
  4. Focusing Attention
    isolating awareness to one thing
  5. Dividing attention
    paying attention to 2+ objects at once
  6. Selective Attention
    • focusing on certain subsets of info, filtering out others 
    • cocktail party phenomenon
    • selective looking - visual search
  7. Focused attention tasks (do we see visuals not being paid attention to?)
    • Central task - focus on the stimulus that answers the question; very accurate
    • Peripheral task - the stimulus that answers the question is not in focus; still very accurate
    • Dual task - stimulus in the periphery plus a secondary central stimulus (disk or face); accurate for the face
    • A face has meaning and does not need focused attention to reach awareness; a disk does
  8. Selective Attention and Listening
    • two messages, one played to each ear, with attention paid to one: what changes in the unattended message are noticed?
    • 1: noticed a gender change and the absence of a message, not content changes or reversed speech
    • 2: noticed the message switching ears
    • 3: noticed their own name in the ignored message
  9. Inattention blindness and task
    • how does attention affect awareness and understanding of stimuli?
    • If focused on stimuli to answer question, intro. of secondary stimuli not noticed (blind) 60%-80%
  10. Change Blindness
    • Hard to notice changes in information-heavy scene
    • If focused on one thing (map and directions), won't notice a big change elsewhere (tourist asking for directions)
  11. Gist Awareness
    Background awareness; brief exposure (Rapid Serial Visual Presentation, RSVP) gives a gist, but neither the specifics nor the scene is committed to memory
  12. Filter Theory of Attention
    • Selective attention leading to selective awareness (and thus blindness), where attended specifics of the scene are committed to memory and others are filtered out
    • emotional or relevant info often gets attention
  13. Attention Cuing and Response Time
    • If cue is given in direction of future stimulus, response time (RT) to pay attention to stimulus decreased
    • Neutral cue has medium RT
    • Cue opposite of future stimulus increases RT
  14. Attention and V4 tasks (2)
    • 1: Effective (high firing rate) and ineffective (low firing rate) stimuli presented, attention changed from effective to ineffective
    • Only the stimuli given attention affected neuron firing rate
    • 2: Attention shift into and out of receptor field (RF) following cue
    • Neurons firing rate changed with cue, before stimulus changed
    • Neuron firing rate (and thus attention) doesn't need eye movement to show change
  15. Feature Search
    Selective looking, one specific feature, easy, fast, number of items doesn't matter, use parallel search (equal attention) over whole field
  16. Conjunction Search
    Selective features, multiple features, difficult, RT increases WITH number of items, requires serial search over field
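The contrast between the two search types is essentially a claim about reaction-time (RT) functions, and can be sketched as a toy model; the base RT and slope values below are illustrative assumptions, not figures from these cards.

```python
# Toy RT model for the two search modes described above.
# base_ms and slope_ms are illustrative assumptions, not measured values.

def feature_search_rt(set_size, base_ms=450):
    """Parallel (feature) search: attention spreads over the whole
    field at once, so RT is flat regardless of the number of items."""
    return base_ms

def conjunction_search_rt(set_size, base_ms=450, slope_ms=25):
    """Serial (conjunction) search: items are inspected one by one,
    so RT grows linearly with the number of items."""
    return base_ms + slope_ms * set_size

print(feature_search_rt(4), feature_search_rt(16))          # flat
print(conjunction_search_rt(4), conjunction_search_rt(16))  # rising
```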
  17. Binding Problem and Attention
    • Once features are seen, how to match (bind) features with objects in field
    • Distributed representation: integrate distributed brain signals responding to different distributed features
  18. Feature Integration Theory
    • Fixes the binding problem: when we pay attention to one single object, the features of that object are easily integrated (bound) back to it
    • (conjunction search)
  19. Biased Competition Theory
    • If features of different objects are in single RF, the features compete for representation
    • Competition increases as complexity of representation increases (up visual pathway)
    • Selectively pay attention to one object to bias representation to this object
  20. Top-down and Bottom-Up Attention
    • Top-Down- info. suggests where to pay attention, voluntary control of attention, slow
    • Bottom-Up- stimulus demands attention (reflex) involuntary control, fast response
  21. Posterior parietal cortex and Attention
    Damage leads to unilateral (left) field neglect; like V4, neurons respond to a cue for an attention shift whether or not eye movement occurs
  22. Frontal Eye Field
    • Frontal cortex
    • Controls eye movement as well as attention shifting without eye movement
  23. PPC neurons and attention tasks
    • fixation- no attention shift, baseline firing rate
    • saccade- attention shift with eye movement, firing increase before eye movements
    • peripheral- attention shift in peripheral (no eye movement), firing rate still increases before shift (with cue)
  24. Balint's syndrome
    • Parietal lobe damage; can't focus attention to connect features (necessary for conjunction search)
    • Parietal- Petyr Balint- Balish - connects the dots
  25. Neural Correlates of Consciousness
    How to relate awareness with brain activity - correlate certain neuron features with shift in awareness (perceptual bistability) or lack of awareness (blindsight)
  26. Perceptual Bistability
    • Binocular Rivalry: Present two different images to each eye, shift awareness between images as they alternate 
    • (house-face images: house - parahippocampal activity, face - fusiform gyrus activity)
    • Neural Correlate of Consciousness
  27. Blindsight
    • Recognize, discriminate, and react to visual stimulus (pupillary reflex, eye movement) without awareness of that stimuli (caused by V1 damage)
    • Reaction to stimulus seen in amygdala (emotional stimulus) and in superior colliculus and other visual areas with corollary discharge theory
    • Neural Correlate of Consciousness
  28. Multitasking
    Constantly switching attention and thus tasks, requiring time in between each shift
  29. Camouflage mechanisms
    • visual encoding
    • grouping encoding
    • object encoding
    • search (increase search time)
  30. Camo Adaptation Function (3) and mechanism (2)
    • distract or divert attention
    • prevent detection
    • blend with environment
    • Lateral Inhibition and Edge Detection prevention
  31. Camouflage Types (4)
    • Conceal oneself
    • Mimic another subject
    • Deceive searcher
    • Blend with background
    • Hiding is NOT an example of camo.
  32. Mimicry and types (5)
    • become as similar as possible to something else, prevent recognition or detection (mimic background)
    • behavior
    • looks
    • smell
    • sound
    • location
  33. Crypsis and ex (3)
    • prevention of detection while in plain sight (NOT hiding)
    • countershading
    • disruptive patterns and colors
    • matching background
  35. Disruptive coloration and patterns
    • random but strong colors, preventing detection of continuous structure (pattern-black and white stripes on zebra)
    • lose structure continuity by preventing edge detection
  36. Masquerade
    looking like (mimicking) something else to prevent recognition, not detection
  37. Countershading and functions (4)
    • Matching background and light differences: the dark top side, lit from above, ends up matching the light underside, which sits in shadow
    • 1. delete own shadow
    • 2. match one or multiple backgrounds (two backgrounds, two directions)
    • 3. change shading to change 3D detection (UV light)
    • 4. obliterate shadow to prevent 3D detection
  38. Abbott Thayer
    • Concealing-Coloration
    • Negative shading aka Upside-down shading aka countershading
    • Dazzle Paint (confuse motion -speed and direction, confuse shape - fake ship front)
  39. Material Properties and Camo
    surfaces reflect light at certain spectral power distributions and intensities, creating intensity borders
  40. Intensity Borders (2)
    • related to material properties 
    • Illumination edge - a shadow border (darker, shorter-wavelength light), detectable in UV
    • Internal marking - texture pattern of reflected light
  41. Spectral information and Camo (5 types)
    • Mie and Rayleigh Scattering affecting light spectrum visible
    • Mie scattering- atmosphere near sun appears sun color
    • color changes quickly - border
    • color and intensity quickly - border unless shadow (illumination edge)
    • change over time - assign borders by light change, detect motion and assign common fate
  42. Edge detection and grouping (2)
    • Alter/disrupt small edges with color/pattern
    • Delete or add extraneous edge information
  43. Stopping Edge Detectors
    • Edge detector neurons respond best to strong, sharp separations
    • Blur real edges with graded pigmentation, or create strong "fake" edges perpendicular or adjacent to the real one to disguise it, called ILLUSORY CONTOURS
  44. Motion and Camo Mechanisms (3)
    • Optic Flow Mimicry
    • Motion Signal Minimization
    • Motion Disruption
  45. Motion Signal Minimization
    prevent detection of motion by minimizing either motion itself or the signal it creates
  46. Motion Disruption
    change form (illusory contours) or motion cues to alter motion perception
  47. Optic Flow Mimicry
    • Blend with moving background as searcher moves through environment
    • Real point and infinity point strategies
    • Dragonfly and Bat do both strategies
  48. Shape of Object and Camo
    • Prevent background discontinuity to remove 3D shape
    • Produce 2D pattern on surface to remove 3D shape
  49. Visual Search and Camo
    • Efficient searching requires serial search over multiple features/objects
    • The more varied the environment and the more varied the camouflaged object is, the longer the search
  50. Sound and Camo
    • Modify call (frequency, pattern, structure, etc)
    • Prevent detection by minimization
    • Mimic sound of other subject
    • Prevent localization (throw voice)
  51. Smell and Camo
    Insects mimic chemicals or limit chemicals detected (caterpillar mimics twig scent for ants)
  52. Substrate Vibrations and Camo
    • signals used for mating and coupling
    • signals used to detect predator or prey
  53. Electricity and Camo (detection (2) and mimicry)
    • Fish can detect change in electric fields
    • Passive detection - detect change in electric fields by movement of other objects
    • Active detection - send out electric signal like sonar
    • Send electric signal to match lightning to prevent detection
  54. Rayleigh Duplex Theory of Sound
    • Sound detection in horizontal plane by two processes
    • Interaural Level Difference
    • Interaural Time Difference
  55. Sound Perception And Sensation Definitions
    • Perception - Experience of hearing
    • Sensation - pressure changes of medium producing detectable waves in ear
  56. Sound Production
    • Diaphragm condenses or compresses air, pushing it out (increasing pressure)
    • Diaphragm rarefacts in air, pulling it back in (decreasing pressure)
  57. Physical Sound Properties (3)
    • Amplitude (decibel, dB)
    • Frequency (proportional to wavelength) (hertz -Hz)
    • Waveform (complexity)
  58. Perceptual Sound Properties
    • Loudness
    • Pitch
    • Timbre
  59. Pure Tone
    a periodic sound wave with a single frequency, called a sine wave
  60. Amplitude
    • decibels; perceived loudness roughly doubles every 10 dB; pressure is maximal at the peak of the wave
    • 85 dB max safe level (8 hr. exposure)
    • sensitivity max at 4000 Hz
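The decibel scale on this card can be made concrete with the standard sound-pressure-level formula; the 20 µPa reference pressure is the usual convention for sound in air, an assumption not stated on the card.

```python
import math

P_REF = 20e-6  # reference pressure in pascals (standard convention for air)

def spl_db(pressure_pa):
    """Sound pressure level: dB = 20 * log10(p / p_ref)."""
    return 20 * math.log10(pressure_pa / P_REF)

# A tenfold increase in pressure adds 20 dB; perceived loudness
# roughly doubles for every 10 dB increase, as the card notes.
print(round(spl_db(20e-6)))   # 0 dB at the reference pressure
print(round(spl_db(200e-6)))  # 20 dB
```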
  61. Frequency
    • Hertz, pitch changes one octave every time frequency doubles
    • Range- 20 - 20000 Hz
    • Best at 2000 Hz
  62. Complex Periodic Sounds
    Starts with the fundamental frequency (first harmonic); all other frequencies in the complex sound (harmonics) are multiples of the first harmonic
  63. Additive Complexity Synthesis
    Adding together harmonics (frequencies) to complexify sounds
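Additive synthesis can be sketched in a few lines; the 200 Hz fundamental and the amplitude values below are arbitrary example choices.

```python
import math

def complex_tone(t, fundamental, amplitudes):
    """Additive synthesis: sum harmonics that are integer multiples of
    the fundamental (first harmonic); amplitudes[k] scales harmonic k+1."""
    return sum(a * math.sin(2 * math.pi * fundamental * (k + 1) * t)
               for k, a in enumerate(amplitudes))

# A 200 Hz complex tone built from three harmonics (200, 400, 600 Hz).
# The result is periodic with the fundamental's period, 1/200 s.
v1 = complex_tone(0.0013, 200, [1.0, 0.5, 0.25])
v2 = complex_tone(0.0013 + 1 / 200, 200, [1.0, 0.5, 0.25])
print(abs(v1 - v2) < 1e-9)  # True: the waveform repeats
```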
  64. Frequency Spectrum
    Display of each harmonic (frequency) in complex sound
  65. Attack and Decay of tones
    Beginning of sound that builds up and the end of the sound that dies off
  66. Timbre
    • complexity of sound, determines all other features besides pitch and loudness and duration, allowing for sound distinction between sources
    • Removing first harmonic (fundamental frequency) creates different timbre but retains perceived pitch
  67. Fourier analysis
    Decomposes a complex sound into its frequency components, each represented as its own sine wave
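Fourier analysis as described here can be sketched with a naive discrete Fourier transform; the 64-sample window and the component amplitudes are arbitrary example choices.

```python
import cmath
import math

def dft_magnitudes(signal):
    """Naive DFT: measures how strongly each sine-wave frequency
    component is present in a sampled signal."""
    n = len(signal)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(signal))) / n
            for k in range(n)]

# One analysis window of a complex tone: a fundamental plus a
# half-amplitude second harmonic (frequencies in cycles per window).
n = 64
sig = [math.sin(2 * math.pi * i / n) + 0.5 * math.sin(4 * math.pi * i / n)
       for i in range(n)]
mags = dft_magnitudes(sig)
# Bin 1 (the fundamental) comes out twice as strong as bin 2.
```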
  68. Anechoic Chamber
    Absorbs all sound, preventing any sound reflection, lowest decibels (99.9% sound absorbed)
  69. Reverberation Chamber
    reflects all sound possible for as long as possible, measures sound recorders and sound producers
  70. Audibility Curve
    • Compares how easy to detect sound (absolute threshold curve) at each frequency
    • best (lowest threshold) at 2000-4000 Hz
  71. Equal Loudness Contours
    • Compares how loud we perceive a sound at different frequencies
    • compare to 1000 Hz standard tone
    • Equal loudness of frequencies at 80 dB
    • Low and High frequencies softer at 40 dB
  72. Inverse square law of sound
    Energy of sound decreases with distance squared
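A one-line sketch of the inverse square law; intensity is expressed relative to a reference distance of 1, an assumption for illustration.

```python
def relative_intensity(distance, ref_distance=1.0):
    """Inverse square law: sound intensity falls off with the
    square of the distance from the source."""
    return (ref_distance / distance) ** 2

print(relative_intensity(2.0))  # doubling distance quarters intensity: 0.25
print(relative_intensity(3.0))  # tripling it leaves one ninth
```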
  73. Scala Vestibuli
    • Vestibular Canal connected to oval window, receiving sound first before sending to Tympanic Canal
    • Vestibular - Vest - Best - First
  74. Scala Tympani
    Tympanic canal connected to round window, receives sound from vestibular canal
  75. Cochlear duct
    • Between vestibular canal and tympanic canal
    • containing organ of corti, basilar membrane, and tectorial membrane
  76. Reissner's membrane
    Separates the vestibular canal and the cochlear duct
  77. Basilar membrane
    separates the cochlear duct and the tympanic canal
  78. Perilymph
    fluid that fills the scala vestibuli and scala tympani in the cochlea
  79. helicotrema
    space at the tip (apex) of the outstretched basilar membrane in the cochlea
  80. Endolymph
    fluid filling organ of corti, assisting with transduction of sound from hair cells to auditory nerve
  81. Motile Response
    • Activated by bending of stereocilia on outer hair cells
    • allowing amplification and sharpening of basilar membrane vibration and thus of sound transduction
  82. Nerve fibers signal frequency (2 factors)
    • How- Firing rate corresponds to frequency
    • Which - selective neurons only fire at certain frequencies, based on selective hair cells on basilar membrane
  83. Helmholtz: Signalling Frequency How and Which (two theories)
    • Place (code) theory - certain hair cells respond based on their location on the basilar membrane and on which frequency vibrates the basilar membrane
    • doesn't work for low frequencies, which barely move the membrane
    • Frequency (temporal code) theory - neuron firing rates correspond to frequency
  84. Travelling Wave
    • Depiction indicating power of movement created by wave as wave moves across basilar membrane
    • the detected frequency corresponds to the location where the wave reaches peak power
  85. Tonotopic Map
    mapping frequency (high to low) onto the basilar membrane (base to apex) across the cochlea
  86. Physiological Frequency tuning Curves
    • representation of detected threshold of various frequencies
    • presented by pure tones with white noise (masked threshold)
    • showing the detection sensitivity along basilar membrane
    • For higher frequencies, curves become narrower, showing greater sensitivity
    • outer hair cells have slight curve, indicating amplification effect
  87. Phase locking
    • explanation supporting the temporal (frequency) code theory,
    • neurons respond with fire rate corresponding to frequency
    • "phase-lock" with peak of sine wave to fire
    • Groups of fibers needed, fire in bursts with pauses
  88. Dynamic Range
    • range of amplitudes audible and whose frequencies we can discriminate between
    • limited by maximum firing rate of neurons (grouped together to increase max level)
  89. Fiber Dynamic Range
    • range of amplitudes a neuron can fire at (within range of frequencies as well)
    • Ranges from baseline (no firing) to saturation (maximum firing)
  90. Hearing Loss (temporary and permanent)
    • Threshold Shift upwards, less sensitive
    • Temporary - lasts minutes, hours, or days
    • Permanent - deafness (outer, middle, inner ear, nerve, or cortex damage)
  91. Auditory Fatigue
    • when sounds last too long, are too loud, or change too quickly, nerves can't react fast enough
    • Nerves enter a refractory state (hyperpolarize)
    • nerves can be permanently damaged/destroyed
  92. Audiogram
    Compares detected sound level (standard loudness) across frequencies to determine sensitivity
  93. Conductive Hearing Loss
    • Blocking sound from reaching receptors in the cochlea
    • Con-congress-block
  94. Sensorineural Hearing Loss
    • Neural- loss of neurons, nerve, or cortex ability
    • Damage to hair cells, nerve fiber, auditory pathway due to age or noise or congenital or tumors or drugs
  95. Presbycusis
    • Hearing loss due to age, worsened by drug or noise exposure
    • presbyterian- old man on drugs rocking out
    • Worse for higher frequencies
  96. Tinnitus
    • perception of sound, continuous or random, that is not actually there
    • result of sensorineural damage
    • treated with white noise generator
  97. Conductive Hearing Loss Causes (4) and solution (1)
    • cerumen blocking canal
    • otosclerosis- growth of bone in middle ear blocking ossicle movement
    • torn tympanic membrane (ear drum)
    • otitis media- middle ear inflammation
    • 1: Bypassed by bone conduction to the cochlea
  98. Noise-induced damage causes
    • cochlea damage where amplitude is too high
    • Tears basilar membrane
    • destroys tip links in stereocilia
    • damages outer hair cells
    • Cell death
  99. Cell-death causes by noise-damage
    • excitotoxicity- glutamate flooding in 
    • loss of blood flow to cochlea
    • free radical damage to tissue and hair cells
  100. Cochlear implant (2 parts)
    • External - detects sound, sends it to the internal detector (transducer) as an electrical signal
    • Internal - receives the electrical signal, stimulates the cochlea by going through the round window into the cochlear duct
  101. Auditory pathway to auditory cortex
    vestibulocochlear nerve >> cochlear nucleus (brainstem) >> trapezoid body (brainstem) and superior olivary nucleus (brainstem) and inferior colliculus (midbrain) >> medial geniculate body (thalamus)
  102. Auditory Core and Surrounding (Hierarchy pathway)
    A1 (core)>>rostral core >> rostrotemporal core >> belt (complex sound) >> parabelt (complex sound)
  103. Auditory What Pathway
    • Anterior portion of core >> prefrontal cortex
    • identify sound (decision)
  104. Auditory Where Pathway
    • Posterior core >> belt >> parietal cortex >> prefrontal cortex 
    • localize sound
  105. Azimuth coordinates
    • left and right directions of sound, determined by:
    • interaural level diff.
    • acoustic shadow
    • interaural time diff.
    • head motion
    • cone of confusion
  106. Elevation
    • medial plane, up and down sound coordinates
    • detected by spectral shape cue determined by pinnae
  107. Distance
    • Distance from head center of sound source
    • Determined by perceived loudness:
    • blurring effect
    • echoes
    • doppler effect
  108. Angle and azimuth
    • If two sounds, need angle of separation between sources
    • 75% correct when the azimuth audible angle differs by <10 degrees
  109. interaural time difference
    • arrival time differences
    • differ from 0-600 microsec. depending on azimuth angle
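The azimuth dependence of ITD can be sketched with Woodworth's spherical-head approximation; the head radius (8.75 cm) and speed of sound (343 m/s) are assumed textbook values, not figures from the card.

```python
import math

def itd_microseconds(azimuth_deg, head_radius=0.0875, speed_of_sound=343.0):
    """Woodworth's approximation: ITD = (r / c) * (theta + sin theta),
    with theta the azimuth in radians. Head radius and speed of sound
    are assumed values, not from the card."""
    theta = math.radians(azimuth_deg)
    return (head_radius / speed_of_sound) * (theta + math.sin(theta)) * 1e6

print(round(itd_microseconds(0)))   # 0 microseconds for a source straight ahead
print(round(itd_microseconds(90)))  # maximal at the side, roughly 650 microseconds
```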
  110. Head motion and cone of confusion
    • cone of confusion - sound sources at equal angle and distance from the head; turning the head alters the ITD and ILD at each ear
    • a sound source at 45 degrees has the same ITD and ILD as a sound source at 135 degrees
  111. Spectral Shape Cue
    • The pinnae alter sound by amplifying and dampening sound waves depending on how elevated the sound source is
    • The more complex the sound, the easier to discriminate elevation (more frequencies to work with)
  112. Percieved loudness and distance
    • application of inverse square law on energy of sound wave to determine distance
    • greater reduction for higher frequencies
  113. blurring effect and distance
    the further away the sound source, the more blurred (less pure) it is, decreases with sound level
  114. echoing and distance
    • if reflected sound is heard more than direct sound, the source is far away
    • if direct sound heard more, source is near
  115. Doppler effect
    • the sound is louder and higher in frequency if the source is moving toward you (you are in front of the source)
    • softer and lower in frequency if it is moving away (you are behind the source)
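The Doppler effect described here follows the standard moving-source formula; the 440 Hz tone and 30 m/s speed below are arbitrary example values.

```python
def doppler_frequency(f_source, source_speed, c=343.0, approaching=True):
    """Observed frequency for a moving source and a stationary listener:
    f' = f * c / (c - v) when approaching, f * c / (c + v) when receding."""
    if approaching:
        return f_source * c / (c - source_speed)
    return f_source * c / (c + source_speed)

# A 440 Hz source moving at 30 m/s sounds higher approaching, lower receding.
print(round(doppler_frequency(440, 30)))                     # 482
print(round(doppler_frequency(440, 30, approaching=False)))  # 405
```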
  116. Echolocation characteristics (what can be determined)
    • size and shape
    • texture (composition)
    • elevation
    • distance
    • azimuth
  117. Ventriloquism necessary factors
    • Conflicting info from visual (sound appears to come from the puppet) and auditory (sound from the puppeteer) sources; have the visual dominate by:
    • proximity to sound source
    • matching movement (timing) with sound source
    • match context of sound source
  118. Medial superior olivary complex
    • Neurons tuned for interaural time differences, receiving a signal from each ear
    • gives info on azimuth of sound source
    • Coincidence detectors: only fire if receiving a signal from each ear (each cochlear nucleus) at the same time
  119. Auditory Cortex and ILD
    • neurons in auditory cortex tuned to detect loudness differences between each ear
    • respond with population code to give azimuth info.
  120. Auditory Stream
    • Grouping frequencies together because they have same source or similar sources
    • Separate auditory space into streams for analysis
  121. Auditory Grouping Decisive Factors
    • Onset time similarity
    • Location similarity - starting location and follow same path and change slowly (common fate)
    • Similarity of timbre and pitch - sound similar
  122. Harmonic coherence
    Grouping sounds into an auditory stream when they have the same fundamental frequency and thus the same harmonics
  123. Grouping by Synchrony and Asynchrony
    onset, change, and offset time of sound need to be similar to mix auditory streams
  124. Frequency similarity and sequential grouping
    two diff. frequencies, grouped together when alternating if they have similar frequencies
  125. Temporal proximity and sequential grouping
    Two diff. frequencies alternating grouped together if time between alternations is longer
  126. Illusory glides
    filling in frequency change path when masked by white noise
  127. Sensory Substitution Device
    • Use sound to give info about visual field
    • black and white - loudness
    • bottom to top - low to high frequency
    • left to right- time sound is presented
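The three mappings on this card can be sketched directly; the frequency range and column timing below are assumptions for illustration (real devices choose their own values).

```python
def image_to_sound_events(image, f_low=500.0, f_high=5000.0, col_dt=0.1):
    """Sensory substitution sketch: brightness -> loudness,
    row (bottom to top) -> low to high frequency,
    column (left to right) -> time at which the tone sounds.
    image: rows of brightness values in [0, 1], row 0 = top."""
    n_rows = len(image)
    events = []
    for r, row in enumerate(image):
        # Row 0 is the top of the image, so it gets the highest frequency.
        freq = f_low + (f_high - f_low) * (n_rows - 1 - r) / (n_rows - 1)
        for c, brightness in enumerate(row):
            events.append((c * col_dt, freq, brightness))
    return events

# 2x2 image with a bright top-left pixel: a loud high tone at time 0.
events = image_to_sound_events([[1.0, 0.0], [0.0, 0.5]])
print(events[0])  # (0.0, 5000.0, 1.0)
```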
  129. Phoneme
    smallest unit of speech, without meaning, just a distinct sound (~100 total); phonemes compose morphemes under rules of combination
  130. Morpheme
    • consists of phonemes, units of language with meaning, composing words
    • context- identify root with meaning
    • function- identify word with additives (suffixes and prefixes)
  131. Acoustic Signal
    sound produced when air is pushed through the vocal cords
  132. Vowels
    unrestricted airflow through the articulators, producing formants (2-3)
  133. Formants
    frequencies produced by the resonant frequencies of the vocal tract as air passes through it, shaped by how the articulators direct airflow
  134. Consonants
    restrict airflow (constriction), shown by rapid formant changes both before and after
  135. Coarticulation
    overlap of articulation of two phonemes, influence one another
  136. Fundamental frequency in vocal cords
    • Depends on size and shape of vocal cords
    • Amount of airflow
    • size and shape of larynx
  137. Spectrogram
    frequency and amplitude over time, showing how frequency and intensity change as a vowel is produced
  138. Consonant production factors
    • manner of articulation - how airflow is restricted (stopped, nasal, etc.)
    • location of articulation - where airflow is restricted (dental, glottal, etc.)
    • voicing - whether or not vocal folds vibrate (voiced) or not (voiceless)
  139. Auditory perceptual constancy and variability problem
    • fixes variability problem where signal doesn't exactly match phonemes because they vary based on:
    • context
    • coarticulation
    • sound producer
    • Able to interpret two diff signals as same consonant by probability or completion
  140. Phoneme transition probability
    top down, assessing whether or not phoneme is likely to appear in the position in a phrase or word
  141. Phonemic restoration
    perceptual constancy tool to complete particular missing phonemes by context and top-down (even unconscious)
  142. Voice Onset Time
    • consonant followed by vowel
    • time between frequencies of consonant end and frequencies of vowel begin
    • gap creates phonetic boundary between detected sounds
  143. McGurk Effect
    • compromise perception of phoneme when auditory cues do not match visual cues (blend the two)
    • eyes see: ga
    • ears hear: ba 
    • perceived as: da (midpoint)
  144. Ventral pathway of speech production
    • meaning of words 
    • primary auditory cortex, combining the auditory signal with input from the visual field, communicates with Broca's and Wernicke's areas
  145. Dorsal pathway of speech production
    Production of speech itself; coordinates speech sounds with what needs to be produced; communication between Wernicke's area in the auditory cortex and Broca's area and the frontal motor cortex
  146. Angular Gyrus damage - "aphasia"
    Cannot speak, read, or understand words
  147. Broca's "non-fluent" aphasia
    cannot produce words
  148. Wernicke's "fluent" aphasia
    cannot understand or contextualize words; produces fluent jargon
  149. Other dimensions of music besides pitch, loudness, timbre, timing
    dynamics, rhythm, tempo, structure, music theory
  150. Octave components
    • first to last note doubles fundamental frequency
    • intervals between the 13 notes (separated by perceived frequency steps) are called semitones
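In equal temperament each of the 12 semitone steps multiplies frequency by the same ratio, 2^(1/12), so the 13th note doubles the fundamental; the A4 = 440 Hz reference below is an assumption for the example.

```python
def semitone_up(freq, steps=1):
    """Equal-tempered pitch: each semitone multiplies frequency
    by 2**(1/12), so 12 semitones give exactly one octave."""
    return freq * 2 ** (steps / 12)

print(round(semitone_up(440, 1), 2))  # one semitone above A4
print(round(semitone_up(440, 12)))    # 880: one octave doubles the frequency
```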
  151. Pitch helix, turn, chroma, height, distance
    • turn- one octave
    • chroma - restarts per turn, shows frequency heightening for each tone
    • height - determines turn number and thus, octave
    • distance- constant between each note, showing equal pitch intervals
  152. Dynamics of music
    variation of loudness throughout the piece
  153. Rhythm
    • time changes throughout the piece
    • tempo-overall timing
    • beat- pulse describing tempo
    • meter- pattern of pulses organized in piece
  154. harmonicity
    • how much note (and their harmonics) combinations blend/coincide
    • determines dissonance or consonance
  155. Music perception (top down) factors
    knowledge, repetition, familiarity, context, "fit" of notes within scale
  156. Left hemisphere and music
    speech and timing processing
  157. Right hemisphere and music
    perception of pitch
  158. motor cortex and music
    music and speech production
  159. Amusia
    • cannot perceive differences in pitch or melody
    • congenital
    • thicker right inferior frontal cortex and auditory cortex limit connections
  160. Intimacy time
    difference in time between the direct sound and its first reflection (ideally about 20 ms)
  161. Bass ratio
    • ratio of low to middle frequencies reflected
    • better if lower is reflected more
  162. Spaciousness factor
    • ratio of sound received that is reflected versus direct (the more indirect the better)
    • the bigger the room, the longer the ideal reverberation time