55 Cards in this Set

  • Front
  • Back
  • 3rd side (hint)
phase 1: instrumental conditioning of an operant response; phase 2: classical conditioning; phase 3: transfer phase (the participant is allowed to engage in the instrumental response while the CS from phase 2 is presented periodically, to observe its effects)
transfer-of-control test; Rescorla and Colwill
reinforcement schedule in which a response is reinforced only if it occurs after a specified amount of time has elapsed since the previous response
differential reinforcement of low rates (DRL)
the gradually increasing rate of responding that occurs between successive reinforcements on a fixed interval schedule
fixed interval scallop
as the time of availability of the next reinforcer draws closer, the response rate increases, which is evident as an acceleration in the cumulative record
reinforcement schedule in which reinforcer is delivered for the first response that occurs after a fixed amount of time following the last reinforcer
fixed interval schedule
the set time is constant from one occasion to the next
reinforcement schedule in which a fixed number of responses must occur in order for the next response to be reinforced
fixed ratio schedule
schedule of reinforcement in which only some of the occurrences of the instrumental response are reinforced; thus, the instrumental response is reinforced occasionally or intermittently
intermittent reinforcement; partial reinforcement
interval between one response and the next
interresponse time (IRT)
reinforcement schedule in which a response is reinforced only if it occurs after a set amount of time following the last reinforcement
interval schedule
restriction on how long a reinforcer remains available
limited hold
available only for limited periods; can be added to both fixed interval and variable interval schedules
rule for instrumental behavior which states that the relative rate of responding equals the relative rate of reinforcement
matching law
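The matching law can also be stated as an equation; a standard two-alternative form (the symbols B and r are my notation, not from the cards) is:

```latex
\frac{B_1}{B_1 + B_2} = \frac{r_1}{r_1 + r_2}
```

where B_i is the rate of responding on alternative i and r_i is the rate of reinforcement obtained from that alternative: the proportion of responding devoted to an alternative matches the proportion of reinforcers it yields.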
mechanism for achieving matching by responding so as to improve local rates of reinforcement for response alternatives
melioration
a pause in responding that typically occurs after the delivery of the reinforcer on fixed ratio and fixed interval schedules of reinforcement
postreinforcement pause
zero rate of responding that occurs just after reinforcement
the high and invariant rate of responding observed after the postreinforcement pause on fixed ratio reinforcement schedules
ratio run
when does the ratio run end?
when the participant is reinforced
reinforcement schedule in which reinforcement depends on the number of responses the participant performs, irrespective of when those responses occur
ratio schedule
reinforcement depends on the number of responses the organism has performed
disruption of responding that occurs when a fixed ratio response requirement is increased too rapidly
ratio strain
reinforcement schedule in which response is reinforced depending on how soon that response is made after the previous one
response-rate schedule
a program or rule that determines how and when the occurrence of a response will be followed by the delivery of the reinforcer
schedule of reinforcement
pattern of results is highly predictable
reinforcement schedule in which reinforcement is provided for the first response that occurs after a variable amount of time from the last reinforcement
variable interval schedule
the value of the schedule refers to the average amount of time needed for reinforcement
reinforcement schedule in which the number of responses necessary to produce reinforcement varies from trial to trial; value of the schedule refers to the average number of responses needed for reinforcement
variable ratio schedule
numerical value represents the average number of responses required
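The fixed schedules defined on the cards above follow simple mechanical rules, so they are easy to state as code. A minimal Python sketch (function names are mine, not from the cards) of how a fixed ratio and a fixed interval schedule decide whether a given response earns a reinforcer:

```python
def fixed_ratio(n):
    """Fixed ratio (FR n): reinforce every nth response, counting from the last reinforcer."""
    count = 0
    def on_response():
        nonlocal count
        count += 1
        if count >= n:        # nth response since the last reinforcer
            count = 0
            return True       # reinforcer delivered
        return False
    return on_response

def fixed_interval(t):
    """Fixed interval (FI t): reinforce the first response made at least t
    seconds after the last reinforcer; earlier responses go unreinforced."""
    last = 0.0
    def on_response(now):
        nonlocal last
        if now - last >= t:
            last = now
            return True
        return False
    return on_response

# Variable schedules (VR, VI) work the same way, except the required count or
# interval is redrawn each time from a distribution whose mean is the schedule value.

fr3 = fixed_ratio(3)
print([fr3() for _ in range(6)])  # [False, False, True, False, False, True]
```

Note how the FR rule counts only responses, while the FI rule ignores everything but the clock; this is the distinction the ratio-versus-interval cards are drawing.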
instance in which delivery of the reinforcer happens to coincide with a particular response even though that response was not responsible for the reinforcer presentation
accidental reinforcement; adventitious reinforcement
a pleasant or satisfying stimulus
appetitive stimulus
an instrumental conditioning procedure in which the instrumental response prevents the delivery of an aversive stimulus
avoidance
involves an aversive stimulus scheduled to be presented sometime in the future; the participant does something to prevent the aversive stimulus
theoretical idea that an organism's evolutionary history makes certain responses fit or belong with certain reinforcers
belongingness; proposed by Thorndike
stimulus that becomes an effective reinforcer because of its association with a primary or unconditioned reinforcer
conditioned reinforcer; secondary reinforcer
may bridge the delay between the instrumental response and the primary reinforcer
simultaneous occurrence of two events
contiguity; temporal contiguity
instrumental conditioning procedure in which a positive reinforcer is delivered if the participant fails to perform a particular response
differential reinforcement of other behavior; omission training
method of instrumental conditioning in which the participant can perform the instrumental response only during specified periods, usually determined by placement of the participant in an experimental chamber or by the presentation of a stimulus
discrete-trial method
instrumental conditioning procedure in which the instrumental response terminates the aversive stimulus
escape
a form of negative reinforcement
method of instrumental conditioning that permits repeated performance of the instrumental response without the participant being removed from the experimental chamber
free-operant method
gradual drift of instrumental behavior away from the responses required for reinforcement to species-typical or instinctive responses related to the reinforcer and to other stimuli in the experimental situation
instinctive drift
Breland and Breland
activity that occurs because it is effective in producing a particular consequence or reinforcer
instrumental behavior
an aspect of behavior that produces a significant outcome (the behavior occurs because similar actions produced the same type of outcome in the past)
response that increases in frequency after the delivery of a periodic reinforcer and then declines as the time for the next reinforcer approaches
interim response
rule for instrumental behavior which states that if a response in the presence of a stimulus is followed by a satisfying event, the S-R association will be strengthened; if the response is followed by an annoying event, the S-R association is weakened
law of effect; proposed by Thorndike (involves S-R learning)
animals learn an association between the response and stimuli present at the time of the response (the consequence of the response is not one of the elements in the association)
interference with the learning of new instrumental responses as a result of exposure to inescapable and unavoidable aversive stimulation
learned-helplessness effect
theoretical idea that assumes that during exposure to inescapable and unavoidable aversive stimulation, participants learn that their behavior does not control environmental events
learned-helplessness hypothesis
preliminary stage of instrumental conditioning in which a stimulus is repeatedly paired with the reinforcer to enable the participant to learn to go and get the reinforcer when it is presented
magazine training
sound of the food-delivery device may be repeatedly paired with food so that the animal learns to go to the food cup when food is delivered; involves classical conditioning, e.g., the sound of the food-delivery device (food magazine) paired with delivery of a food pellet into the cup
What should you do with every question of this test?
response turns on an appetitive stimulus; there is a positive contingency
positive reinforcement
response turns on an aversive stimulus; there is a positive contingency with an aversive stimulus
punishment
instrumental response prevents delivery of an aversive stimulus
negative reinforcement
less responding for a reinforcer following previous experience with a more desired reinforcer than in the absence of such prior experience
negative contrast
instrumental conditioning procedure in which there is a negative contingency between the instrumental response and the aversive stimulus; the aversive stimulus is terminated or prevented from occurring when the response is performed
negative reinforcement
instrumental conditioning procedure in which the instrumental response prevents delivery of a reinforcing stimulus
omission training
response defined by the effect it produces in the environment
operant response
greater responding for a favorable reinforcer following previous experience with a less desired reinforcer than in the absence of such prior experience
positive contrast
instrumental conditioning procedure in which there is a positive contingency between the instrumental response and a reinforcing stimulus; perform the response, receive the reinforcing stimulus
positive reinforcement
instrumental conditioning procedure in which there is a positive contingency between the instrumental response and an aversive stimulus; performing the response produces the aversive stimulus
punishment
causal relation between response and a reinforcer
response-reinforcer contingency
how fast an animal moves in a runway
running speed
reinforcement of successive approximations to a desired response (and nonreinforcement of earlier response forms)
shaping
behavioral contrast effects produced by frequent shifts between favorable and unfavorable reward conditions
simultaneous behavioral contrast
behavior that increases in frequency because of accidental pairings of the delivery of a reinforcer with occurrences of the behavior
superstitious behavior
response that is most likely at the end of the interval between successive reinforcements that are presented at fixed intervals
terminal response
technically a ratio schedule of one
continuous reinforcement