64 Cards in this Set

  • Front
  • Back
  • 3rd side (hint)
discriminative stimulus (S^D)
a stimulus in the presence of which responses are reinforced and in the absence of which they are not reinforced; a signal that indicates that a response will be followed by a reinforcer; sets the occasion for the behavior to happen
EX: Susan's presence is the S^D for Jonathan to tell a joke
escape behavior
A behavior that results in the termination of an aversive stimulus
EX: Opening an umbrella when it's raining
extrinsic reinforcement
The reinforcement provided by a consequence that is external to the behavior
EX: Studying in order to get an A
intrinsic reinforcement
Reinforcement provided by the mere act of performing the behavior
EX: Reading a book because you want to (vs reading it for a good grade)
generalized reinforcer
A secondary reinforcer that has been associated with several other reinforcers
EX: Money is a generalized reinforcer because it can be exchanged for many other reinforcers (food, clothes)
law of effect
As stated by Thorndike, the proposition that behaviors that lead to a satisfying state of affairs are strengthened or "stamped in," while behaviors that lead to an unsatisfying or annoying state of affairs are weakened or "stamped out"
Stupid
operant conditioning
A type of learning in which the future probability of a behavior is affected by its consequence
EX: If a mouse presses the lever and gets food, it becomes more likely to press the lever again
reinforcer
An event that 1) follows a behavior and 2) increases the future probability of that behavior
EX: lever press --> food pellet
punisher
An event that 1) follows a behavior and 2) decreases the future probability of that behavior
EX: tell a joke --> the listener frowns --> less likely to tell jokes in the future
three-term contingency
The relationship between a discriminative stimulus (S^D), an operant behavior, and a reinforcer (or punisher);
Antecedent (S^D) --> Behavior (R) --> Consequence (S^R)
EX: You notice something, say, Susan (Antecedent) --> then you do something, say, tell a joke (Behavior) --> then you get something; Susan laughs (Consequence)
positive reinforcement
The presentation of a stimulus (one that is usually considered pleasant or rewarding) following a response, which then leads to an increase in the future strength of the response
EX: Ordering coffee (Behavior) --> Receiving the coffee (+S^R)
negative reinforcement
The removal of a stimulus (one that is usually considered unpleasant or aversive) following a response, which then leads to an increase in the future strength of the response
EX: Take aspirin (Response) --> Eliminating the headache (-S^R)
positive punishment
The presentation of a stimulus (one that is usually considered unpleasant or aversive) following a response, which then leads to a decrease in the future strength of the response
EX: Swatting at a bee gets you stung; won't swat at it again
negative punishment
The removal of a stimulus (one that is usually considered pleasant or rewarding) following a response, which then leads to a decrease in the future strength of the response
EX: Staying up past curfew (behavior) --> losing car privileges (-S^P)
primary reinforcer (unconditioned reinforcer)
An event that is innately reinforcing
EX: obtaining food, water, sex, etc. (born to like)
secondary reinforcer (conditioned reinforcer)
An event that is reinforcing because it has been associated with some other reinforcer
EX: Praise, money
natural reinforcers
Reinforcers that are naturally provided for a certain behavior; they are a typical consequence of the behavior within that setting
EX: Money is a natural consequence of selling merchandise
shaping
The gradual creation of new operant behavior through reinforcement of successive approximations of the behavior
EX: Teaching a dog to catch a Frisbee by reinforcing successive approximations of the full catch
Schedule of Reinforcement
The response requirement that must be met to obtain reinforcement
EX: Does each lever pressed by the rat result in a food pellet, or are several lever presses required?
Fixed Ratio Schedule (FR Schedule)
A schedule in which reinforcement is contingent upon a fixed predictable number of responses
EX: On a fixed ratio 5 schedule (FR 5), a rat has to press the lever 5 times to obtain 1 food pellet
Variable Ratio Schedule (VR Schedule)
A schedule in which reinforcement is contingent upon a varying, unpredictable number of responses
EX: On a variable ratio 5 (VR 5) schedule, a rat has to emit an average of 5 lever presses for food
Fixed Interval Schedule (FI Schedule)
A schedule in which reinforcement is contingent upon the first response after a fixed predictable period of time
EX: For a rat on a fixed interval 30 second (FI 30-sec) schedule, the first lever press after a 30-second interval has elapsed results in food
Variable Interval Schedule (VI Schedule)
A schedule in which reinforcement is contingent upon the first response after a varying, unpredictable period of time
EX: VI 30-sec means the first lever press after an average interval of 30 seconds results in food
Fixed Duration (FD) Schedule
A schedule in which reinforcement is contingent upon continuous performance of a behavior for a fixed predictable period of time
EX: Rat must run continuously for 60 seconds in the wheel for 1 pellet of food (FD 60-sec)
Variable Duration (VD) Schedule
A schedule in which reinforcement is contingent upon continuous performance of a behavior for a varying, unpredictable period of time
EX: Rat must run in the wheel for an average of 60 seconds to earn 1 pellet of food (VD 60-sec)
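The ratio, interval, and duration schedules above differ only in what counts toward the next reinforcer: number of responses, elapsed time since the last reinforcer, or continuous responding. As a rough illustration (a minimal sketch with invented function names and parameters, not part of the card set), the following Python snippet shows how FR, VR, and FI rules decide when a pellet is delivered for a rat pressing the lever once per second:

    import random

    def run_fr(ratio, presses):
        # Fixed ratio: every `ratio`-th press earns a pellet.
        return sum(1 for i in range(1, presses + 1) if i % ratio == 0)

    def run_vr(mean_ratio, presses):
        # Variable ratio: each press pays off with probability 1/mean_ratio,
        # so on average `mean_ratio` presses are needed per pellet.
        return sum(1 for _ in range(presses) if random.random() < 1.0 / mean_ratio)

    def run_fi(interval, presses, gap=1.0):
        # Fixed interval: the first press after `interval` seconds have elapsed
        # since the last pellet earns the next pellet; presses occur every `gap` seconds.
        pellets, last, t = 0, 0.0, 0.0
        for _ in range(presses):
            t += gap
            if t - last >= interval:
                pellets += 1
                last = t
        return pellets

    print("FR 5  :", run_fr(5, 100))      # 20 pellets
    print("VR 5  :", run_vr(5, 100))      # about 20 pellets, varies run to run
    print("FI 30 :", run_fi(30, 100))     # 3 pellets in 100 seconds of pressing

A VI schedule would look like run_fi but with the interval redrawn at random (around its mean) after each pellet.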
Response-Rate Schedule
A schedule in which reinforcement is contingent upon the organism's rate of response
(see the three types below: DRH, DRL, DRP)
Differential Reinforcement of High Rates (DRH)
A schedule in which reinforcement is contingent upon emitting at least a certain number of responses in a certain period of time - or, more generally, reinforcement is provided for responding at a fast rate
EX: A rat might receive a food pellet only if it emits at least 30 lever presses within a period of a minute
Differential Reinforcement of Low Rates (DRL)
A schedule in which a minimum amount of time must pass between each response before the reinforcer will be delivered - or, more generally, reinforcement is provided for responding at a slow rate
EX: A rat might receive a food pellet only if it waits at least 10 seconds between lever presses
Differential Reinforcement of Paced Responding (DRP)
A schedule in which reinforcement is contingent upon emitting a series of responses at a set rate - or, more generally, reinforcement is provided for responding neither too fast nor too slow
EX: A rat might receive a food pellet if it emits 10 consecutive responses with each response separated by an interval of no less than 1.5 seconds and no more than 2.5 seconds
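These three rate schedules differ only in which inter-response times they reinforce. Here is a small, hypothetical sketch of the DRL rule (the numbers are illustrative, not from the cards):

    def drl_reinforced(inter_response_times, min_gap=10.0):
        # DRL: a response is reinforced only if at least `min_gap` seconds
        # have passed since the previous response.
        return [gap >= min_gap for gap in inter_response_times]

    # A rat whose successive lever presses are 4 s, 12 s, 9 s, and 15 s apart:
    print(drl_reinforced([4, 12, 9, 15]))   # [False, True, False, True]

DRH would flip the comparison (reinforce only fast responding, e.g. at least N responses per unit time), and DRP would reinforce only gaps that fall inside a band (e.g., between 1.5 and 2.5 seconds).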
Non-Contingent Schedule of Reinforcement
A schedule in which the reinforcer is delivered independently of any response
(see the two types below: FT and VT)
Fixed Time Schedule (FT)
A schedule in which the reinforcer is delivered following a fixed, predictable period of time regardless of the organism's behavior
EX: FT 30-sec = a pigeon receives access to food every 30 seconds regardless of its behavior
Variable Time Schedule (VT)
A schedule in which the reinforcer is delivered following a varying, unpredictable period of time regardless of the organism's behavior
EX: on a VT 30-sec schedule = pigeon receives food after an avg. interval of 30 seconds regardless of its behavior
Chained Schedule
A schedule consisting of a sequence of 2 or more simple schedules, each with its own S^D and the last of which results in a terminal reinforcer
EX: A green key light comes on and the pigeon must peck the key 20 times, after which the light turns red. The red light acts both as an S^R (conditioned reinforcer) for the first link and as the S^D for the next link, which ends in the terminal reinforcer (food)
Goal-Gradient Effect
An increase in the strength and/or efficiency of responding as one draws near to the goal
EX: A student writing an essay is likely to take shorter breaks and work more intensely as he nears the end
Drive Reduction Theory
An event is reinforcing to the extent that it is associated with a reduction in some type of physiological drive
EX: Food deprivation produces a hunger drive, which propels an animal to look for food
Premack Principle
The notion that a high-probability behavior can be used to reinforce a low-probability behavior
EX: When a rat is hungry, eating becomes the high-probability behavior, and running on a wheel is the low-probability behavior - therefore, obtaining food can be used to reinforce running on the wheel
Response Deprivation Hypothesis
The notion that a behavior can serve as a reinforcer when 1) access to the behavior is restricted and 2) its frequency thereby falls below its preferred level of occurrence
EX: A rat's baseline is to run for 1 hour a day. If it is allowed only 15 minutes a day, it is in a state of deprivation, so it will work to obtain the additional 45 minutes of running
Bliss Point
The theory that an organism with free access to alternative activities will distribute its behavior in such a way as to maximize overall reinforcement
EX: A rat with free access to a running wheel and a maze distributes its time between them (say, 30 minutes on the wheel and 1 hour in the maze) in whatever mix maximizes its overall reinforcement
Extinction
The non-reinforcement of a previously reinforced response, the result of which is a decrease in the strength of that response
EX: If a rat presses the lever and no longer gets food, it will eventually stop pressing the lever
Extinction Burst
A temporary increase in the frequency and intensity of responding when extinction is first implemented
EX: Put money in candy machine; after not getting candy, press the button harder/repeatedly
Resurgence
The reappearance during extinction of other behaviors that had once been effective in obtaining reinforcement
EX: A rat first earns food by running a 20-ft maze and is later trained to run a 40-ft maze. When running the 40-ft maze is put on extinction, the rat reverts to the 20-ft pattern that had once produced the reinforcer
Partial Reinforcement Effect
The process whereby behavior that has been maintained on an intermittent (partial) schedule of reinforcement extinguishes more slowly than behavior that has been maintained on a continuous schedule
EX: A rat that has had to press the lever 10 times per pellet (intermittent) will take longer to extinguish than a rat reinforced for every single press (continuous)
Spontaneous Recovery
The reappearance of an extinguished response following a rest period after extinction
EX: Suppose we extinguish a rat's lever pressing; the next day, it will likely press the lever again (though more weakly this time)
Differential Reinforcement of Other Behavior (DRO)
Reinforcement of any behavior other than a target behavior that is being extinguished
EX: Paying attention to a child only if he is doing something other than fighting with his sister
Stimulus Control
A situation in which the presence of a discriminative stimulus reliably affects the probability of a behavior
Stupid (just S^D)
Stimulus Generalization
In operant conditioning, the tendency for an operant response to be emitted in the presence of a stimulus that is similar to the S^D
EX: Rat hears 2000 Hz tone, presses lever; then hears 2200 Hz tone (similar), still presses lever
Generalization Gradient
A graphic description of the strength of responding in the presence of stimuli that are similar to the S^D and vary along a continuum
A peaked ("triangle-looking") curve centered on the S^D
Discrimination Training
As applied to operant conditioning, the differential reinforcement of responding in the presence of one stimulus (the S^D) and not another (the discriminative stimulus for extinction, S^Δ)
EX: 2000 Hz --> press lever --> food. ALSO, 1200 Hz --> press lever --> no food.
Discriminative Stimulus for Extinction (S^Δ)
A stimulus that signals the absence of reinforcement
EX: 2000 Hz --> press lever --> food; 1200 Hz --> press lever --> no food. The 1200 Hz tone is the S^Δ
Peak Shift Effect
Following discrimination training, the peak of a generalization gradient will shift from the S^D to a stimulus that is further removed from the S^Δ
The peak of the gradient shifts away from the S^Δ
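To visualize the gradient and the peak shift described in the two cards above, here is a rough matplotlib sketch (the curve shapes and the size of the shift are invented purely for illustration), using the 2000 Hz S^D and 1200 Hz S^Δ from the discrimination-training example:

    import numpy as np
    import matplotlib.pyplot as plt

    tones = np.linspace(1000, 3000, 200)      # tone frequency (Hz)
    sd, s_delta = 2000, 1200                  # values taken from the cards above

    # Before discrimination training: responding peaks at the S^D itself.
    before = np.exp(-((tones - sd) ** 2) / (2 * 300.0 ** 2))
    # After discrimination training: the peak shifts away from the S^Delta
    # (here, illustratively, to about 2200 Hz).
    after = np.exp(-((tones - (sd + 200)) ** 2) / (2 * 250.0 ** 2))

    plt.plot(tones, before, label="generalization gradient")
    plt.plot(tones, after, label="gradient after discrimination training (peak shift)")
    plt.axvline(sd, linestyle="--", label="S^D = 2000 Hz")
    plt.axvline(s_delta, linestyle=":", label="S^Delta = 1200 Hz")
    plt.xlabel("Tone frequency (Hz)")
    plt.ylabel("Response strength (arbitrary units)")
    plt.legend()
    plt.show()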
Multiple Schedule
A complex schedule consisting of 2 or more independent schedules presented in sequence, each resulting in reinforcement and each having a distinctive S^D
EX: A red key light comes on and the pigeon pecks the key to earn food on one schedule; the light then changes to green and pecking earns food on a different schedule; then it turns red again, and the sequence continues
Fading
The process of gradually altering the intensity of a stimulus
EX: A fade (in or out) in music
Learned Helplessness
A decrement in learning ability that results from repeated exposure to uncontrollable aversive events
EX: Seligman's dogs: after repeated inescapable shocks, they failed to escape even when escape later became possible
Experimental Neurosis
A disorder produced in the laboratory by placing subjects in situations where they must make discriminations or produce problem-solving responses that are beyond their capacity. It becomes a learned-helplessness paradigm when aversive stimulation consistently follows their inevitable failures
Sowwy, no example. Too tired
Two-Process Theory of Avoidance
The theory that avoidance behavior is a result of 2 distinct processes: 1) classical conditioning, in which a fear response comes to be elicited by a CS, and 2) operant conditioning, in which moving away from the CS is negatively reinforced by a reduction in fear
EX: 1) Light is the NS that is a signal for a shock, the US, which then leads to fear, the UR. At this point, light is the CS that leads to fear, the CR.
2) Light is the S^D, and the response is climbing over the barrier, and the S^R is the reduction in fear
One-process Theory of Avoidance
The theory that avoidance behavior is the result of a single process, *operant conditioning*, in which the avoidance response is negatively reinforced by a reduction in the overall rate of aversive stimulation (rather than by a reduction in fear)
too bad, so sad
Concurrent Schedule of Reinforcement
A complex schedule consisting of the simultaneous presentation of 2 or more independent schedules, each leading to reinforcers
EX: Both red and green lights are presented; the pigeon chooses between the two; difference is the scheduling (say, VI 20-sec vs VI 50-sec)
Matching Law
The principle that the proportion of responses emitted on a particular schedule matches the proportion of reinforcers obtained on that schedule
EX: Respond twice as much on a VI 5-sec schedule as on a VI 10-sec schedule, because it delivers twice as many reinforcers
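In its common textbook form (the symbols here are mine, not from the cards), the matching law is written as

    R_A / (R_A + R_B) = S_A / (S_A + S_B)

where R_A and R_B are the responses emitted on the two alternatives and S_A and S_B are the reinforcers obtained from them. For the example above: a VI 5-sec schedule delivers roughly 12 reinforcers per minute and a VI 10-sec schedule roughly 6, so 12 / (12 + 6) = 2/3 of responses should go to the VI 5-sec alternative, i.e., twice as many as the 1/3 that go to the VI 10-sec alternative.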
Undermatching
A deviation from matching in which the proportion of responses on the richer schedule vs poorer schedule is less different than would be predicted by matching
yep
Bias
A deviation from matching in which one alternative attracts a higher proportion of responses than would be predicted by matching, regardless of whether that alternative contains the richer vs poorer schedule
hmm...
Melioration Theory
A theory of matching that holds that the distribution of behavior in a choice situation shifts towards those alternatives that have higher value regardless of the long-term effect on overall amount of reinforcement
Shift toward richer schedule
Self-control
With respect to choice between 2 rewards, selecting a larger, later reward over a smaller, sooner reward
EX: marshmallow!
Impulsiveness
With respect to choice between 2 rewards, selecting a smaller, sooner reward over a larger, later reward
yep
Commitment Response
An action carried out at an early point in time that serves to either eliminate or reduce the value of an upcoming temptation
EX: In the morning, a student decides she must study that evening. As a commitment response, she gives her brother $20 and tells him, "If I don't study tonight, you keep the $20."