38 Cards in this Set

Fixed-interval (FI) schedule of reinforcement
-A reinforcer is contingent on
-the first response,
-after a fixed interval of time,
-since the last opportunity for reinforcement
What does a fixed-interval schedule produce when it comes to responding?
Fixed-interval scallop
Fixed-interval Scallop
-A fixed-interval schedule often produces a scallop --
-a gradual increase in the rate of responding,
-with responding occurring at a high rate
-just before reinforcement is available.
-No responding occurs for some time after reinforcement.
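As an illustrative aside (not part of the deck itself), the FI contingency defined above can be sketched as a short Python function; the response times and the 30-second interval are made-up numbers.

```python
# Sketch of a fixed-interval (FI) contingency: a response is reinforced
# only if it is the first response after `interval` seconds have elapsed
# since the last reinforcement.

def fi_schedule(response_times, interval):
    """Return the times at which responses are reinforced under FI."""
    reinforced = []
    next_available = interval  # reinforcement first becomes available here
    for t in sorted(response_times):
        if t >= next_available:
            reinforced.append(t)
            next_available = t + interval  # interval restarts after reinforcement
    return reinforced

# Responses cluster just before the interval elapses (the scallop),
# but only the first response after the interval pays off:
print(fi_schedule([25, 28, 29, 30.5, 55, 58, 61], interval=30))
# -> [30.5, 61]
```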
What did Joe do with his term paper?
He did not work on it until the 8th week, and then he worked increasingly more on it as the deadline neared
Was Joe's term-paper schedule a fixed interval?
No
What were the six differences between an FI and the term-paper schedule?
1) Early responding affects the term paper, not an FI
2) The amount of reinforcer increases with work on a term paper, not an FI
3) The relevant response class is clear in an FI, not in a term paper
4) Term-paper schedules have calendars and clocks
5) Term-paper schedules have deadlines
6) Term-paper schedules involve reinforcers that are too delayed to reinforce the causal response
FI vs TP

Does early responding affect anything?
FI: no
TP: yes
FI vs TP

Do you get more if you work harder?
FI: no
TP: yes
FI vs TP

Is the relevant response class clear?
FI: yes
TP: no
FI vs TP

Are there calendars and clocks?
FI: no
TP: yes
FI vs TP

Is there a deadline?
FI: no
TP: yes
FI vs TP

Is the reinforcer too delayed?
FI: no
TP: yes
Is a TV schedule a fixed interval?
No
Why isn't a TV schedule a fixed interval?
Because you have a calendar and a clock

and a deadline.

The contingency is avoidance of the loss of the opportunity to receive a reinforcer
Is a paycheck schedule a fixed interval?
No. There is no fixed-interval scallop
What is the Seinfeld example of a fixed interval?
You're watching Seinfeld, but a commercial comes on
-You switch to Jerry Springer
-You check back on Seinfeld's channel with increasing frequency as the commercial interval wears on
-Eventually, one of your flips is rewarded with Seinfeld

If all of the commercial breaks are the same length, it is a fixed interval
What was the superstition in the pigeons?
A response key was not hooked up to the food hopper, but the hopper was connected to a timer that was set for repeating intervals of 15 seconds.

Every 15 seconds the hopper would open and food would be available, regardless of what the bird had been doing

One of the birds made a counterclockwise turn at the moment the food hopper became available. After that response was reinforced, the bird started making more and more counterclockwise turns.

Other birds developed head-tossing or pendulum motions
Fixed-time schedule of reinforcer delivery
-A reinforcer is delivered
-after the passage of a fixed period of time,
-independently of the response.
Superstitious Behavior
-Behaving as if the response causes
-some specific outcome,
-when it really does not
Variable-interval (VI) schedule of reinforcement
-A reinforcer is contingent on
-the first response
-after a variable interval of time
-since the last opportunity for reinforcement
What kind of response rates do VI schedules generate?
Consistent response rates;

slopes tend to be even and uniform;

not rapid
Variable-interval responding
-Variable-interval schedules produce
-a moderate rate of responding,
-with almost no postreinforcement pausing
What are the four classic schedules of intermittent reinforcement?
1) Fixed-ratio
2) Fixed-interval
3) Variable-ratio
4) Variable-interval
Resistance to Extinction
-The number of responses or
-the amount of time
-before a response extinguishes
Resistance to extinction and intermittent reinforcement
-Intermittent reinforcement
-makes the response
-more resistant to extinction
-than does continuous reinforcement
What is the difference between Fixed interval (FI) and Fixed time schedules?
Fixed time schedules do not require a response
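To make the contrast concrete (an illustrative sketch, not from the deck; the 60-second session and 15-second timer are assumed numbers), a fixed-time schedule can be written with no reference to responding at all, unlike the FI function of a response:

```python
# Sketch of a fixed-time (FT) schedule: reinforcers are delivered on a
# timer, independently of any response (cf. the superstition experiment,
# where the hopper opened every 15 seconds no matter what the bird did).

def fixed_time_deliveries(session_len, interval):
    """Delivery times under fixed time: no response is required."""
    return list(range(interval, session_len + 1, interval))

print(fixed_time_deliveries(60, 15))
# -> [15, 30, 45, 60]
```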
-Any stimulus, event, or condition
-whose presentation immediately follows a response and
-increases the frequency of that response
Reinforcement contingency
-The response-contingent
-presentation of
-a reinforcer
-resulting in an increased frequency of that response
What's the relation between rate of responding and rate of reinforcement?
-With ratio schedules, the faster you respond, the more reinforcers you will get per hour

-With interval schedules, responding faster doesn't help; you only have to respond faster than the shortest interval
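A worked comparison (illustrative only; the response rates, ratio, and interval below are made-up numbers) shows why responding faster pays on ratio schedules but not on interval schedules:

```python
def ratio_rate(resp_per_min, ratio):
    """Fixed-ratio: reinforcers per hour grow with response rate."""
    return resp_per_min * 60 / ratio

def interval_rate(resp_per_min, interval_min):
    """Fixed-interval: capped at one reinforcer per interval,
    no matter how fast you respond (as long as you respond at all)."""
    return min(resp_per_min * 60, 60 / interval_min)

# Doubling the response rate doubles the reinforcers on a ratio schedule...
print(ratio_rate(10, 20), ratio_rate(20, 20))      # 30.0 60.0
# ...but changes nothing on an interval schedule:
print(interval_rate(10, 2), interval_rate(20, 2))  # 30.0 30.0
```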
How did Max get rid of Ted's behavior?
He used a penalty interval schedule;

he gave Ted 12 tokens and took one away each time Ted misbehaved; each lost token represented 5 minutes of missed lunch and extra minutes in class

A prerecorded beep went off every 5 minutes to remind Max to check on Ted

It took only 3 class sessions to eliminate Ted's misbehavior
Principle of resistance to extinction
Intermittent reinforcement makes behavior more resistant to extinction than does continuous reinforcement
Why does intermittent reinforcement increase resistance to extinction?
Stimulus generalization

Because extinction and intermittent reinforcement are more similar (both have periods where responses will produce no reinforcement)
What was the example of limited hold in the pit?
Manuel was a prisoner in a labor camp;

a wind gust blowing the curtain aside in the tower, revealing the commandant's daughter, was the limited hold

he had to look up more often because the opportunity to see the lady was available for a limited time

The lady turned out to be a dressmaker's dummy
Limited hold
the opportunity to produce the reinforcer is available for a limited time;

a time period during which the response will produce the after condition; responding either before or after the limited hold will have no effect
What effect does a limited hold have on response rates?
Responses occur at a high rate in order to produce all the reinforcers that become available
Deadline
the time before which we should make a response or a set of responses or complete a task
When should you respond in a 10 minute fixed interval with a 1 minute limited hold?
From the 10th minute to the 11th minute (the limited hold)
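The arithmetic of that card can be checked with a small sketch (illustrative, not from the deck): under a fixed interval with a limited hold, a response pays off only between the end of the interval and the end of the hold.

```python
# Reinforcement window for a fixed interval with a limited hold:
# responding either before the interval elapses or after the hold
# expires has no effect.

def in_window(response_time, interval, hold):
    """True if the response falls inside the limited hold."""
    return interval <= response_time <= interval + hold

# 10-minute fixed interval with a 1-minute limited hold (minutes):
print(in_window(9.5, 10, 1))   # False: too early
print(in_window(10.5, 10, 1))  # True:  inside the limited hold
print(in_window(11.5, 10, 1))  # False: the opportunity has passed
```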
Can an S^D be associated with both a deadline and a limited hold?