
19 Cards in this Set

Intermittent reinforcement
-A reinforcer follows the response
-only once in a while
Continuous Reinforcement
-A reinforcer follows each response
Schedule of Reinforcement
-The way reinforcement occurs
-because of the number of responses,
-time between responses,
-and stimulus conditions
Which type is best for shaping or maintaining difficult behavior?
-Continuous reinforcement
What was an example of shaping with continuous reinforcement?
Andrew's speech
Fixed-ratio schedule of reinforcement
-A reinforcer follows
-a fixed number of responses
Fixed-ratio responding
-After a response is reinforced,
-no responding occurs for a period of time,
-then responding occurs at a high, steady rate
-until the next reinforcer is delivered
Postreinforcement Pause
-the pause after the consumption of a reinforcer
-before the next ratio of responses begins
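The fixed-ratio contingency described in the cards above can be sketched as a tiny simulation (a minimal illustration; the function and names are ours, not from any behavior-analysis software):

```python
def fixed_ratio(ratio):
    """Return a respond() function that delivers a reinforcer
    (returns True) after every `ratio` responses."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == ratio:
            count = 0    # after the reinforcer, the next ratio starts fresh
            return True  # reinforcer delivered
        return False     # no reinforcer yet
    return respond

fr5 = fixed_ratio(5)
outcomes = [fr5() for _ in range(10)]
# only the 5th and 10th responses are reinforced
```

Note that a fixed-ratio schedule with a ratio of 1 is just continuous reinforcement: every response produces the reinforcer.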
General rule for establishing intermittently reinforced behavior
-First use continuous reinforcement and
-gradually increase the intermittency of reinforcement
-as responding stabilizes at a high rate
What type of graph do behavior analysts often use when studying schedules of reinforcement?
The cumulative graph
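A cumulative graph plots the running total of responses against time, so a steep slope means a high response rate and a flat stretch means a pause. A minimal sketch of computing a cumulative record from toy response data (plotting library omitted):

```python
from itertools import accumulate

# 1 = a response occurred in that time bin, 0 = no response (toy data)
responses = [0, 0, 1, 1, 1, 0, 0, 1, 1, 1]

# The cumulative record: flat where responding pauses, rising where it occurs
cumulative_record = list(accumulate(responses))
```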
Variable-ratio schedule of reinforcement
-A reinforcer follows
-after a variable number of responses
Variable-ratio responding
-Variable-ratio schedules produce
-a high rate of responding,
-with almost no postreinforcement pausing
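The variable-ratio contingency can be sketched the same way; here the required count is redrawn after each reinforcer (drawing uniformly around the mean is our simplification; real VR schedules can use other distributions):

```python
import random

def variable_ratio(mean_ratio, rng):
    """Return a respond() function that reinforces after a variable
    number of responses averaging `mean_ratio`."""
    required = rng.randint(1, 2 * mean_ratio - 1)
    count = 0
    def respond():
        nonlocal required, count
        count += 1
        if count >= required:
            count = 0
            required = rng.randint(1, 2 * mean_ratio - 1)  # new ratio each time
            return True
        return False
    return respond

vr10 = variable_ratio(10, random.Random(0))
reinforcers = sum(vr10() for _ in range(10_000))
# roughly one reinforcer per 10 responses on average
```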
On what schedule of reinforcement are most of the contingencies in our everyday lives?
Continuous reinforcement (or continuous punishment)
Discrete-Trial procedure
There is an S^D, a single response, and an outcome, followed by an S^Delta; then the next trial begins
Free operant procedures
There may or may not be an S^D; there can be several responses, reinforced either continuously or intermittently
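The discrete-trial structure can be sketched as an event sequence (a toy illustration; the event names mirror the cards' S^D / S^Delta terminology):

```python
def discrete_trial(respond):
    """One trial: present the S^D, allow a single response,
    deliver the outcome, then present the S^Delta."""
    events = ["S^D"]
    if respond():                    # at most one response per trial
        events.append("response")
        events.append("reinforcer")  # the outcome follows the response
    events.append("S^Delta")         # ends the trial before the next begins
    return events

log = discrete_trial(lambda: True)
# log == ["S^D", "response", "reinforcer", "S^Delta"]
```

By contrast, in a free-operant procedure the subject can respond repeatedly at any time, which is why the ratio schedules above apply to it.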
What are four differences between the Skinner box and gambling schedules?
Gambling has:

1) Interspersed learned reinforcers
2) Amount of reinforcer varies from ratio to ratio
3) Smaller ratio of responses before reinforcement
4) Emotional reinforcers
What are 3 examples of variable-ratio schedules?
1) Steve Stud using a bad pickup line (a very large VR)
2) Uncle telling his dog to sit (reinforced on a variable ratio)
3) A kid whines until the parents give in to what the kid wants
What was Yasako's story?
She would vacuum before her chemistry study group met, but her downstairs neighbor, Mr. Bill, would bang on the ceiling with his broom, swear, and threaten her until she stopped.

After a variable number of broom hits, she stops vacuuming. This is variable-ratio reinforcement of Mr. Bill's banging by the removal of an aversive condition (negative reinforcement/escape).
What was Nasty Ned's story?
He would bully his classmates, but the teacher only noticed every once in a while.

This was a variable-ratio penalty schedule.