89 Cards in this Set
- Front
- Back
Operant Conditioning
|
aka Instrumental Conditioning
addresses the limitations of classical conditioning, which must build on innate reflexes reacting to the environment |
|
How do you change a behavior?
|
by changing consequences
ex) you get points for coming to class but are penalized for not coming, so the behavior changes through its association with the consequence |
|
Critical Environmental Stimulus
|
comes after the response and is obtained by the action of the learner
|
|
Discrete Trial Method
|
only one response can occur per trial
|
|
Who used the discrete trial method?
|
Thorndike
|
|
What did Thorndike study?
|
interested in animal intelligence
|
|
What was an experiment used by Thorndike?
|
Cat in a puzzle box; if the cat pulled a lever, it could escape and food was released; cats learned the association between the lever response and escape/food
|
|
Did time increase or decrease with number of trials in Thorndike's experiment?
|
escape time decreased because of the learned association between the lever and the food
|
|
Law of effect
|
when a response is followed by a desirable consequence, the probability of that response is increased; when a response is followed by an undesirable consequence, the probability of that response is decreased
|
|
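The law of effect can be read as a simple update rule on response probability. Here is a minimal Python sketch of that reading--my own illustration, not part of the card set; the learning rate and starting probability are arbitrary assumptions:

```python
# Hypothetical sketch of the law of effect as a probability-update rule.
# The rate (0.1) and starting probability (0.5) are illustrative only.

def update_response_probability(p, consequence, rate=0.1):
    """Raise p after a desirable consequence, lower it after an
    undesirable one; p always stays within [0, 1]."""
    if consequence == "desirable":
        p = p + rate * (1 - p)   # response becomes more likely
    elif consequence == "undesirable":
        p = p - rate * p         # response becomes less likely
    return p

p = 0.5
p = update_response_probability(p, "desirable")    # p rises above 0.5
p = update_response_probability(p, "undesirable")  # p falls back toward 0.5
```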
When learning a behavior, what relationship are you learning?
|
learning relationship between the response and the consequence
|
|
What type of method of operant conditioning did B.F. Skinner use?
|
used the Free-Operant Method
|
|
What is the Free-Operant Method?
|
participants are free to respond when and as often as they like; used by B.F. Skinner
|
|
What is Skinner famous for?
|
The Skinner Box
|
|
What is the Skinner Box?
|
Skinner Box = an operant chamber
Magazine training is used to teach the animal where and when to get food--when the animal presses the bar, food is released and the behavior is positively reinforced |
|
Positive Reinforcement
|
Adds something to increase the wanted response
|
|
Negative Reinforcement
|
The response takes something aversive away, which increases the response
|
|
Reinforcement
|
increases the response rate
|
|
Punishment
|
decreases the response rate
|
|
Primary reinforcers
|
effective at birth--ex) food, sex, water, visual stimulation
|
|
Secondary Reinforcers
|
acquire reinforcing properties through experience
by becoming associated with primary reinforcers--ex] money |
|
The Premack Principle
|
any higher-probability activity can reinforce a lower-probability activity
|
|
What is an example of the Premack Principle?
|
when you were little, your parent said, "if you finish your spinach, you can have dessert"
-reinforcer at the end |
|
Does this example work in regards to the Premack principle: If you eat all of your spinach, you can have some potatoes
|
yes, as long as eating potatoes is the higher-probability activity--generally children like potatoes more than spinach
|
|
Importance of Premack Principle
|
-it gives you a way to find usable reinforcers
-used in educational settings and mental hospitals |
|
What always comes first in the Premack Principle?
|
always have to do the thing you hate before you do the thing you like
|
|
Problems with the Premack Principle
|
-need to figure out high vs. low response probabilities
-response probabilities change-things you like change |
|
What did Skinner's experiment do? (punishment)
|
-trained rats to press a bar
--one group received extinction (no food)
--one group received mild punishment (slap on paw)
--one group received no punishment (only food) |
|
What was the IV in Skinner's Punishment Experiment?
|
the consequence
|
|
How many levels were in Skinner's Punishment experiment?
|
3 levels--no punishment, mild punishment, extinction
|
|
What did Skinner's punishment experiment conclude?
|
the punishment was ineffective
|
|
Side effects of punishment
|
it can induce fear
it can induce learned helplessness
it can induce pain-elicited aggression
it may serve as a model for aggression |
|
An example of learned helplessness
|
Two teachers gave fifth graders a series of math problems
--teacher A always gave solvable math probs
--teacher B always gave unsolvable math probs
-when teacher B finally gave solvable math problems, the students failed to solve them even though they had solved exactly the same problems earlier for teacher A
LEARNED HELPLESSNESS |
|
Important issues in operant conditioning
|
1. shaping 2. generalization 3. discrimination 4. extinction 5. superstitious behavior
|
|
Adventitious Reinforcement
|
accidental pairing of response and delivery of reinforcer
|
|
What is an example of adventitious reinforcement?
|
Dr. Golding's lucky socks; each accidental pairing 'stamps in' a particular response
|
|
In Skinner's Superstitious experiment, how often did the pigeons receive food?
|
Every 15 seconds
|
|
What was the outcome of Skinner's Superstitious experiment?
|
The pigeons developed superstitious behaviors--whatever each bird happened to be doing when food arrived was accidentally reinforced, as if the behavior controlled the reinforcer
|
|
Mechanics of Operant Conditioning
|
Timing is crucial--don't delay the consequence; you always want the individual to be clear about the association, i.e., that its response causes the outcome.
|
|
Continuous Operant Conditioning
|
-reinforcement/punishment after every response
-does not work very well because people get lazy and there is a loss of motivation |
|
What is an example of Continuous Operant Conditioning?
|
SNL, Bill Murray getting a cracker after every trick
|
|
Partial Operant Conditioning
|
4 types of Partial, combining two dimensions:
-variable
-fixed
-ratio
-interval |
|
Variable
|
Random
|
|
Fixed
|
the same amount of time or number of responses every time
|
|
Ratio
|
responses
|
|
Interval
|
Time
|
|
Variable Interval
|
Reinforcement/Punishment for the first response after a random amount of time
|
|
What is an example of variable interval?
|
reinforcement after 2 sec, 8 sec, 54 sec, etc.
|
|
Variable Ratio
|
Reinforcement/ Punishment after random number of responses
|
|
What is an example of variable ratio?
|
Reinforcement after response 4, response 13, response 39, etc
|
|
Fixed Ratio
|
Reinforcement/Punishment after SPECIFIC number of responses
|
|
What is an example of Fixed Ratio?
|
reinforcement after every 4th response--response 4, response 8, response 12, etc.
|
|
Fixed Interval
|
Reinforcement/Punishment for the first response after a SPECIFIC amount of time
|
|
What is an example of Fixed Interval?
|
Reinforcement after 3 sec, 6 sec, 9 sec, etc
|
|
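The four partial schedules above can be sketched as "reinforce now?" checks. This is my own Python illustration, not part of the card set; the thresholds (every 4th response, 3-second interval) are arbitrary assumptions:

```python
# Hypothetical sketch of the four partial reinforcement schedules.
# Thresholds are illustrative only.
import random

def fixed_ratio(response_count, n=4):
    # reinforce after every n-th response (response 4, 8, 12, ...)
    return response_count % n == 0

def variable_ratio(mean=4):
    # reinforce after a random number of responses (about 1 in `mean`)
    return random.random() < 1 / mean

def fixed_interval(elapsed, interval=3.0):
    # reinforce the first response once `interval` seconds have passed
    return elapsed >= interval

def variable_interval(elapsed, required):
    # `required` is redrawn at random after each reinforcement
    return elapsed >= required
```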
What is one of the best partial schedules and an example?
|
Variable Ratio--slot machines; casinos always make more money and have a greater chance of winning than the people playing--e.g., an even-money bet that wins only 18/38 of the time
|
|
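The 18/38 figure corresponds to an even-money casino bet with 18 winning outcomes out of 38. A quick expected-value check (my own addition, not from the cards) shows why the house comes out ahead:

```python
# Expected value of a $1 even-money bet that wins 18 times out of 38.
# Illustrative arithmetic only; uses exact fractions to avoid rounding.
from fractions import Fraction

p_win = Fraction(18, 38)
p_lose = 1 - p_win                # 20/38
ev = p_win * 1 + p_lose * (-1)    # expected profit per $1 bet
print(ev)         # -1/19
print(float(ev))  # about -0.0526, i.e. the player loses ~5 cents per $1
```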
What is another word for Operant Conditioning and Classical Conditioning?
|
Behaviorism
|
|
Behavior Modification
|
applying principles of reinforcement/punishment to human behaviors
|
|
What is an example of Behavior Modification?
|
Trying to fix the fact that your roommate leaves laundry all over the floor all the time
|
|
What are three ways of solving the laundry problem?
|
punish--yell and scold, and throw away the laundry
train an incompatible behavior--buy a laundry hamper and reward using it
reward absence of unwanted behavior--invite friends over when the room is clean |
|
What are the three most important factors in behavior modification using positive reinforcement?
|
1. Select the behavior to be increased; be as specific as possible
2. Choose the reinforcement; it must be effective for the individual
3. Immediacy: maximum reinforcement is given immediately after the desired response |
|
Pitfalls of behavior modification using positive reinforcement
|
May lead to feelings of being controlled
ex] what does the child understand about his/her behavior?--do they understand that clean clothes don't belong on the floor, OR are they just doing it to get the reinforcement? |
|
Extinction
|
Withhold the reinforcement that maintains the undesirable behavior
|
|
Can extinction be a form of punishment?
|
yes, it can cause emotional responses, aggression, and frustration
|
|
Cognitive Learning
|
The problem with a strict behavioristic view of learning is that learning can depend on mental processes that cannot be directly observed, and that organisms can be active processors of information
|
|
What is an example from class of Observational Learning?
|
The movie clip of Thelma and Louise--> robbing the bank
|
|
What is an example of Observational Learning from the book?
|
Bandura and the Bobo doll experiment
|
|
What were the hypotheses of the Bobo Doll experiment?
|
1. Boys are more aggressive than girls
2. Boys who watch male models are more aggressive
3. The control group would be less aggressive
4. People exposed to aggression would be more aggressive |
|
What was the design of the Bobo Doll experiment?
|
2x2x2 with a control
|
|
What were the measures of the Bobo Doll experiment?
|
Verbal Aggression, Mallet Aggression, Nonimitative Aggression, Other Maladaptive Aggression
|
|
What were the results of the Bobo Doll experiment?
|
Overall males were more aggressive than females
|
|
Causal Attributions
|
Perceiving the causes of behavior often involves learning a cognitive relationship
|
|
What are two examples of causal attributions?
|
-you know you did well because you studied well
-you know you did poorly because Dr. Golding tricked you |
|
Dispositional Attributions
|
Attributing behavior to something about the person themselves
|
|
Situational Attributions
|
Attributing behavior to the situation, not the person
|
|
Food Aversion
|
When you eat something and get sick later, there is a tendency to avoid that food in the future
|
|
What is wrong with the example of Food Aversion?
|
One-trial learning goes against the notion of shaping
--there is time between eating and throwing up--why wouldn't she associate the sickness with something else?
--the two events are not close together in time, which goes against behaviorism |
|
Language
|
Predisposition to learn language
--there is more to language than one-to-one association |
|
What is an example of Language?
|
Seinfeld --even if you've never seen it before, it's still funny
|
|
What is wrong with Language?
|
Ambiguity
--"i love U 2," "i love you two," "i love you too"--you know the difference because of context
--"they are cooking apples" (is "cooking" an ADJ or a Verb?) |
|
Biological Strength
|
Innate predisposition to learn (or not learn) certain things
-cannot use behavioristic techniques to train all behaviors |
|
What is an example of biological strength?
|
Breland and Breland wanted to train a raccoon to put $$ in a bank, but it never actually did
--everything to raccoons means food
--Tabula Rasa--not true! |
|
What is the example from class that was used to explain Cognitive maps?
|
Rats were trained to run through a partially flooded maze for food
-once they learned the maze, it was fully flooded so the rats had to swim |
|
S-R prediction of flooded maze
|
rats will not find the correct path to the goal box, because they must use a different set of muscular movements than when they first learned the route
|
|
Tolman's Cognitive map Prediction
|
Rats would swim to goal box based on cognitive map
|
|
What were the results of the Flooded Maze experiment?
|
Rats would swim to goal box without error
|
|
Cognition
|
Actively process information (all animals)
- learning is too complex to be explained by simple associations |
|
What is an example of cognition in remembering something?
|
deciding what to rehearse
--songs, associating with bdays, channels, etc. |
|
Principles of Cognitive Psych- Study Unobservables
|
Use observable behavior to draw inferences about unobservables ex] memory
|
|
Principles of Cognitive Psych-Organism is an active seeker & processor of info
|
this means you have some control
-ex] penny getting chocolate |
|
Principles of Cognitive Psych- In general Cog psych is nonreductional
|
Not safe to assume that what you observe with rats is true of humans
|