57 Cards in this Set

Vollmer, Iwata

Establishing Operations and Reinforcer Effects

Procedure
residential setting
5 profoundly retarded male participants
ages 25, 27, 28, 29, 36

2 responses:
-block placement task
-switch closure task

3 consequences:
food, music, social interaction

3 conditions:
baseline, deprivation, satiation
Vollmer, Iwata

Establishing Operations and Reinforcer Effects

Results
No consequences during baseline.

Deprivation acted as an establishing operation that increased the probability of responses.

Satiation acted as an establishing operation that decreased the probability of responses.

*Response rates during reinforcement conditions vary as a function of relative deprivation versus satiation.
Hammond

The Effect of Contingency Upon The Appetitive Conditioning of Free Operant Behavior

Procedure experiment I
10 male albino rats
23-hour water deprivation
1-hour magazine training
Conditions varied from very high positive contingency, to high positive, to moderately high positive, to zero, to moderately high positive, to zero.

(response: lever press)
Hammond
The Effect of Contingency Upon The Appetitive Conditioning of Free Operant Behavior

Results experiment I
Decline in responding upon introduction of the zero contingency (greater decline on the second shift to zero contingency)

Greater responding with positive contingency
Hammond

The Effect of Contingency Upon The Appetitive Conditioning of Free Operant Behavior

Procedure experiment II
47 male albino rats (grouped into 5)

Conditions varied across three phases for each group:
- Group 1: very high positive contingency, moderately high positive, zero
- Group 2: very high positive, moderately high positive, strongly negative
- Group 3: very high positive, high positive, zero
- Group 4: very high positive, high positive, high positive
- Group 5: very high positive, high positive, intermediate positive
Hammond

The Effect of Contingency Upon The Appetitive Conditioning of Free Operant Behavior

Results experiment II
A negative contingency was more effective at suppressing behavior than a zero contingency.
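The contingencies in these Hammond cards can be summarized as the difference between the probability of the reinforcer given a response and given no response. A minimal sketch of that arithmetic in Python, with made-up probability values (not Hammond's actual parameters):

```python
# Toy illustration of response-reinforcer contingency as a probability difference:
# delta_p = P(reinforcer | response) - P(reinforcer | no response).
# The probability values below are invented for illustration only.

def delta_p(p_given_response, p_given_no_response):
    """Positive = responding pays off; zero = no contingency;
    negative = responding makes the reinforcer less likely."""
    return p_given_response - p_given_no_response

print(delta_p(0.12, 0.00))  # positive contingency: responding is maintained
print(delta_p(0.05, 0.05))  # zero contingency: responding declines
print(delta_p(0.00, 0.05))  # negative contingency: suppresses responding more than zero
```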
Nonassociative learning
reflexive, S-R

A change in behavior produced by the presentation of a stimulus.
Respondent conditioning
classical/Pavlovian

The conditioning of reflexive behavior by S-S contingencies (or pairings)

S-R link not influenced by consequences of responses
Operant conditioning
R-S, Instrumental conditioning

Behavior emitted by an organism and reinforced by environmental events following responding.

Consequences of behavior influence future response rate.

Reinforcement and Punishment are basic operant processes.
How are nonassociative learning, respondent conditioning, and operant conditioning similar and different?
Nonassociative learning and respondent conditioning both concern reflexive behavior, whereas operant behaviors are learned from a past history of consequences.

NAL does not require an association between two stimuli, whereas RC includes pairing of a US and a CS.
NAL = S-R
RC = S+S-R

In both RC and OC, responses are influenced by stimuli, whether before or after the behavior.
Habituation processes
desensitization

A decrease in the effect of a stimulus on the response following repeated presentations.

ex: repeated presentation of a novel tone with no consequence elicits reduced orienting responses.

ex: the rooting response to cheek stimulation in neonates decreases with repeated presentations.
Habituation functional relations
Habituation occurs more slowly the more intense the stimulus.
Repeated occurrences of habituation result in smaller effects.
After habituation occurs, later re-presentation of the stimulus elicits a stronger response.
Repeated presentations of a stimulus result in reduced response rates.
Sensitization processes
An increase in the eliciting effect of a stimulus following repeated presentations.

ex: watching a horror movie and then hearing a novel noise outside (increased startle response)
ex: Delivery of foot shock and novel tone (increased startle response)
Sensitization functional relations
The higher the intensity of the stimulus, the stronger the sensitization effects.

The longer the time between stimuli, the stronger the sensitization effect.
Different types of respondent conditioning
respondent extinction, respondent stimulus generalization, sensory preconditioning, respondent stimulus discrimination, higher-order respondent conditioning, simultaneous conditioning, trace conditioning, delay conditioning, backward conditioning, temporal conditioning, blocking, overshadowing
Positive Reinforcement
(definition and example)
Any stimulus whose presentation following a response increases the probability of that response.

ex: pigeon pecks a key to receive grain (pecking increases)

ex: child presses a button to get an M&M (button presses increase)
Negative Reinforcement
(definition and example)
Any stimulus whose removal following a response increases the probability of that response.

ex: pigeon pecks a key that reduces illumination (pecking increases)

ex: child presses button to terminate work (button presses increase)
Positive Punishment
(definition and example)
Any stimulus whose presentation after a response decreases the probability of that response.

ex: pigeon pecks key and illumination increases (key pecks decrease)

ex: child presses button and gets more work (button presses decrease)
Negative Punishment
(definition and example)
Any stimulus whose removal following a response decreases the probability of that response.

ex: pigeon pecks a key and food is removed (pecking decreases)

ex: child presses a button and an M&M is taken away (button presses decrease)
Shaping
definition and examples
Reinforcing successive approximations of a target behavior.

DRL, DRH, T-maze performance, Place preference, backward chaining, forward chaining, Three-position response sequence
2 behavioral processes of shaping
differential reinforcement
extinction
differential reinforcement
contingency in which only some response topographies are reinforced and others are not.
extinction
absence of reinforcement for some response topographies

absence of contingency
How does differential reinforcement increase or decrease behavior?
DR increases the target behavior through a contingency requiring that behavior to be emitted in order to receive reinforcement. Only the response topography of the target behavior is reinforced.

Response topographies that are not reinforced decrease (are extinguished).

DRL, DRO, DRI, DRH
How does extinction increase or decrease behavior?
Extinction decreases behavior by removing the contingency (and the prior reinforcement).

The reinforcer that was maintaining the behavior is no longer delivered.
Contingency

(definition and how it relates to operant conditioning)
A consequence that depends on the meeting of a response requirement.

All operant behavior is based on the concept of contingency.
Contiguity

(definition and how it relates to respondent conditioning)
next to or near in time or sequence.

Time relationship between CS and US.

Contiguity is critical in respondent conditioning.

The CS and US must be close enough in time to form an association and far enough apart in time to be discriminated from one another.
Examples of basic behavioral processes that can be synthesized into a more complex behavioral process.
Rat lever response:

Rat presses one lever and a shock is avoided, so that lever-pressing response increases.
Rat presses another lever and is shocked, so that lever-pressing response decreases.

This process involves avoidance (negative reinforcement) and positive punishment.
Examples of basic behavioral processes that can be synthesized into a more complex behavioral process.
Three-position response sequence.

Involves differential reinforcement, extinction, and conditioned reinforcement to establish a response sequence.
Examples of basic behavioral processes that can be synthesized into a more complex behavioral process.
Individual processes:
extinction, habituation, US-CS conditioning, neg reinforcement, pos punishment, etc.

Synthetic processes:
Differential reinforcement (extinction, reinforcement)

Satiation
(positive reinforcement, abolishing operation)
Response class
Behaviors (or different topographies) that produce a similar type of reinforcement (positive or negative).

A group of responses with the same function (each response in the group produces the same effect on the environment).
How response classes relate to positive and negative reinforcers.
Positive reinforcement: there are many ways to open a bag of chips.

Negative reinforcement: there are many ways to kill a bug.
Free operant avoidance
each response postpones an aversive stimulus
RS/SS examples
RS: response-shock interval

ex: (RS of 5s) If the rat responds, it avoids a shock for 5s.

SS: shock-shock interval

ex: (SS of 20s) If the rat does not respond, it gets shocked every 20s.

ex: RS (5s), SS (15s)
If the rat responds, it postpones the shock for only 5s; if it does not respond, it gets shocked every 15s. Because the RS interval is shorter than the SS interval, responding is likely to be lower in this condition.
RS/SS interval definitions
The response-shock (RS) interval is the amount of time that a shock is postponed following a response.

The shock-shock (SS) interval is the amount of time between shocks given no response.
RS/SS functional relations
When the RS interval is greater than the SS interval, responding enters into a negative reinforcement contingency (the rat presses the lever to avoid shock).

The longer the RS interval, the higher the response probability.

The longer the SS interval, the lower the response probability.
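A minimal sketch in Python of how the RS and SS clocks interact on a free operant avoidance schedule. The interval values and the fixed per-second response probability are invented for illustration; only the RS/SS logic comes from the cards above.

```python
import random

def simulate_avoidance(rs, ss, p_respond, duration=300, seed=1):
    """Toy 1-second-resolution simulation of free operant (Sidman) avoidance.

    rs: response-shock interval -- each response schedules the next shock rs seconds away
    ss: shock-shock interval -- with no responding, shocks recur every ss seconds
    p_respond: made-up per-second probability that the subject responds
    """
    random.seed(seed)
    next_shock = ss              # before any response, shocks run on the SS clock
    responses = shocks = 0
    for t in range(duration):
        if random.random() < p_respond:   # a response postpones the next shock by rs
            responses += 1
            next_shock = t + rs
        if t >= next_shock:               # the clock ran out: deliver a shock
            shocks += 1
            next_shock = t + ss           # after a shock, the SS clock takes over again
    return responses, shocks

# RS > SS: responding avoids nearly all of the ~60 shocks that no responding would produce.
print(simulate_avoidance(rs=20, ss=5, p_respond=0.0))
print(simulate_avoidance(rs=20, ss=5, p_respond=0.3))

# RS < SS: the same response rate barely changes the shock count, so responding does not pay off.
print(simulate_avoidance(rs=5, ss=15, p_respond=0.0))
print(simulate_avoidance(rs=5, ss=15, p_respond=0.3))
```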
IRT
Inter-response Time

the time between 2 consecutive responses
How do IRTs relate to responding under the control of positive or negative reinforcement?
Negative reinforcement: the longer the IRT required for reinforcement, the lower the probability of a response.

The shorter the IRT required for reinforcement, the higher the responding.

(Is it the same for both positive and negative reinforcement?)
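A minimal sketch of the IRT arithmetic in Python. The response timestamps and the 10-second criterion are made-up values; the point is only how IRTs are computed and how a long-IRT requirement thins out reinforcement.

```python
# Made-up response timestamps, in seconds from the start of the session.
response_times = [2.0, 6.0, 17.0, 18.0, 30.0]

# Each IRT is the time between two consecutive responses.
irts = [later - earlier for earlier, later in zip(response_times, response_times[1:])]
print(irts)  # [4.0, 11.0, 1.0, 12.0]

# A schedule that requires long IRTs (e.g., a DRL-like 10-second criterion) reinforces
# only responses preceded by an IRT at or above the criterion, so the longer the
# required IRT, the fewer responses end up reinforced.
criterion = 10.0
reinforced = [irt >= criterion for irt in irts]
print(reinforced)  # [False, True, False, True]
```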
"The Premack Principle"

(relativity theory of reinforcement)
the relativity of reinforcers and punishers.

ex: access to higher probability operants can be used as reinforcers to increase the occurrence of lower probability operants.

Making the opportunity to engage in a high-frequency behavior contingent upon a lower-frequency behavior will function as reinforcement for the low-frequency behavior.
Synthesis of habituation and sensitization
If a strong stimulus occurs repeatedly at a low rate, sensitization will occur before habituation.

Reduction in stimulus strength reduces sensitization effects (reciprocal: stronger = greater).

Increased time between presentations increases sensitization effects (reciprocal: shorter = lesser).
Respondent extinction
The CS is no longer paired with the US, so the conditioned response declines.

Intermittent CS+US pairings are needed to maintain the response; otherwise respondent extinction occurs.
Variables that influence respondent extinction
strength of US: the stronger the US the less often you have to pair US+CS

duration of US: longer the duration of the US the less often pairing needs to occur

time since last presentation of US

Number of times CS extinction has taken place: the more times extinguished, the more often you must have CS+US pairings.
Respondent stimulus generalization
The tendency of the response to occur to a CS that varies slightly in some physical dimension (duration, frequency, intensity).

The organism may make the same or a similar response to a slightly different CS.
Respondent Stimulus Discrimination
The organism learns to respond to the CS that is paired with the US, but not to stimuli that do not receive pairings with the US.
Higher-order respondent conditioning
pairing of a CS with a different CS+US pairing (chained)

CS2 produces weaker response than CS1
Sensory Preconditioning
Pairing of a CS with a different CS+US pairing (unchained)

CS2 produces weaker response than CS1
Simultaneous Conditioning
CS precedes US by less than 5s

rapid conditioning of CS

learning occurs quickly
Trace Conditioning
CS is presented with a relatively long delay before the onset of the US

Longer the delay, the less effective the conditioning
Delay Conditioning
CS is presented continuously before the onset of US

conditioning strength decreases with increased delays between CS and US
Backward Conditioning
US is presented before CS

Least effective type of respondent conditioning

extinguishes quickly

cannot come under the control of higher-order conditioning
Temporal Conditioning
US presented at constant temporal intervals

The passage of time may become discriminative; that is, the passage of time may become a CS.
Which type of conditioning paradigm results in the most rapid extinction?
backward
Which type of conditioning paradigm results in the greatest resistance to extinction?
simultaneous
Does stimulus presentation occur in temporal conditioning?
yes
Which type of conditioning paradigm is least likely to come under the control of higher-order conditioning?
backward
Blocking
Previous conditioning with one CS (CS1) impairs conditioning of a novel CS (S2) added later.

Training: CS1 + S2 + US - R

Test:
CS1 - response
S2 - no response

S2 doesn't predict anything even though it's added to the compound.
Overshadowing
Simultaneous pairing of CSs results in only one stimulus becoming conditioned.

No way to predict which stimulus will become conditioned.