18 Cards in this Set

  • Front
  • Back
Edward Thorndike
Edward Thorndike: functionalist (how the mind helps you adapt to your environment).

Also a behaviorist, invented the Law of Effect: the basis for operant conditioning.
John Watson
John Watson: Classical conditioning experiment on Little Albert.
After Watson, what was hot? Who's Clark Hull?
Behaviorism (1920s–1960s): Hot after Watson.

Clark Hull: Theory of Motivation (drive-reduction theory): the goal of behavior is to reduce biological drives.

Other behaviorists: Edwin Guthrie, B.F. Skinner.
Konrad Lorenz started what field?
Konrad Lorenz: Started Ethology (the study of animal behavior done in the field). Studied behavior in great detail, and its function in context.
Classical (respondent) conditioning
Classical (respondent) conditioning:

Pavlov: OK, the food was the UCS; salivating to the food in the mouth was the UCR. The new stimulus (the bell) became a CS when it started eliciting saliva. The learned saliva response was the CR... shocking.
Timing for Pavlov (what's forward conditioning? Backward conditioning? What's acquisition? What's extinction? What's spontaneous recovery? What's generalization?)
Timing was crucial in Pavlov's classical conditioning. The CS was always presented BEFORE the UCS (bell before food powder). That's called FORWARD conditioning.

2. Backward conditioning: Presenting the UCS, then the CS. Generally doesn't work.

3. Acquisition: The period during which an organism is learning the association of the stimuli.

4. Extinction: Repeatedly presenting the CS without the UCS, until the CR fades.

5. Spontaneous Recovery: After extinction and a rest period, presenting the CS without the UCS will again elicit a WEAK CR.

6. Generalization: The tendency for stimuli similar to the CS to elicit the CR (like, conditioned to fear a big dog, now you fear a poodle).
What's second-order conditioning?

What's sensory preconditioning?
1. Second-Order Conditioning: A neutral stimulus is paired with a CS rather than a UCS. Stage 1: Regular classical conditioning (dog salivates to a bell ring). Stage 2: A new neutral stimulus (like a light flash) is presented right before the CS (the bell), but no UCS follows. After several trials, the dog'll salivate to the formerly neutral stimulus (just the light).

2. Sensory Preconditioning: Two neutral stimuli are paired together, then one of the neutral stimuli is paired with a UCS.

Stage 1: Two neutral stimuli are paired (ex. light and ring).
Stage 2: One of the neutral stimuli is paired with a UCS (like food)
Then, flash the light (no bell) to see if sensory preconditioning happened (if so, the light flash would elicit salivation!!)
Who's Robert Rescorla? What's the Contingency Explanation of Classical Conditioning?

What's Contiguity? What's Blocking?
1. Temporal contiguity: The CS and UCS are presented close together in time. The contiguity explanation says this closeness alone is why classical conditioning happens.

2. 1960s: Robert Rescorla had the Contingency Explanation of Classical Conditioning: the CS needs to be a GOOD SIGNAL for the UCS (the CS has to have informational value and be a good predictor of the UCS).

3. Blocking: Not only do the UCS and CS need to be contiguous and contingent, the CS needs to provide NONREDUNDANT information about the occurrence of the UCS (it needs to be predictive).
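Rescorla's contingency idea and blocking both fall out of the later Rescorla-Wagner model, where learning is driven by prediction error: a CS gains strength only to the extent the UCS is not already predicted. A minimal Python sketch (the stimulus names and parameter values are my own illustration, not from these cards):

```python
# Rescorla-Wagner: every CS present on a trial shares one prediction
# error, lam - sum(V), where V is each CS's associative strength.
def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """trials: a list of sets naming the CSs present; the UCS always follows."""
    V = {}
    for present in trials:
        error = lam - sum(V.get(cs, 0.0) for cs in present)
        for cs in present:
            V[cs] = V.get(cs, 0.0) + alpha * error
    return V

# Phase 1: the light alone is conditioned until it fully predicts the UCS.
# Phase 2: light + tone compound. The light leaves almost no prediction
# error, so the tone learns almost nothing (blocking).
V = rescorla_wagner([{"light"}] * 20 + [{"light", "tone"}] * 20)
print(round(V["light"], 2), round(V["tone"], 2))
```

The tone ends up with near-zero strength even though it was paired with the UCS 20 times: contiguity without informational value produces little conditioning.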
Operant (instrumental) Conditioning (aka reward learning)

(What is it? Who's E.L. Thorndike? What's the Law of Effect? B.F. Skinner? What's positive/negative reinforcement? What's escape? Avoidance? What's a DS? What's generalization?)
Operant (instrumental) conditioning (aka reward learning)

1. Operant: Based on learning the relationship between one's actions and their consequences.
2. E.L. Thorndike: Proposed the Law of Effect: if a response is followed by a satisfying consequence, the animal will be more likely to emit the same response in the future; if followed by an annoying consequence, less likely.

3. B.F. Skinner: Rejected Thorndike's stress on mentalistic terms (ex. annoying). Skinner came up with four Op. Conditioning concepts:
i. Positive Reinforcement: Behavior is rewarded.
ii. Negative Reinforcement:
a. Escape: Behavior removes something unpleasant. Increases the probability of the behavior.
b. Avoidance: Behavior prevents something unpleasant. Increases the probability of the behavior.
iii. Punishment: Behavior causes something unpleasant. Decreases the probability of the behavior.
(Well, I think there are two types:
i. positive punishment: behavior produces an aversive stimulus; and
ii. negative punishment: something you like gets taken away.)

4. Discriminative Stimulus: A stimulus condition that indicates the organism's behavior will have consequences.

5. Generalization: We can train an animal to peck at a green light (the SD, i.e. the discriminative stimulus), and then to generalize it to other, similar-colored lights (the closer to green, the stronger the response).
What's a partial reinforcement effect? Tell me the four basic types of partial reinforcement? What's the hardest to extinguish?
What's a partial reinforcement effect: Like gambling. Responses that were only PARTIALLY reinforced are harder to extinguish than continuously reinforced ones.

Tell me the four basic types of partial reinforcement

1. Fixed Ratio: Behavior gets reinforced after a FIXED NUMBER of responses.

FR1 (continuous reinforcement): You get something every time you respond. (FR30 example: $1 per 30 envelopes stuffed.)

2. Variable-Ratio: Reinforced after a VARYING NUMBER of times (ex. slot machine)

VR5: You get a pellet every five times ON AVERAGE.


3. Fixed-Interval: The FIRST RESPONSE after a fixed period of time since the last reinforcement gets reinforced (ex. coming to the office to pick up a check).

FI 45 seconds: you get a pellet for the first response at least 45 seconds after your last pellet.


4. Variable-Interval: Behavior gets reinforced for the first response after a VARYING period has elapsed since the last reinforcement.

VI 54: You get reinforced for the first response an AVERAGE of 54 seconds after your last pellet (the interval varies).


What's the hardest to extinguish? Variable Ratio. VR also has the most rapid response rate.
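The four schedules boil down to simple rules for when a response earns reinforcement. A Python sketch of three of them (function names and numbers are my own illustration, not from these cards):

```python
import random

def fixed_ratio(n):
    """FR n: reinforce every n-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True
        return False
    return respond

def variable_ratio(n, rng):
    """VR n: reinforce each response with probability 1/n, i.e. every n responses ON AVERAGE."""
    return lambda: rng.random() < 1.0 / n

def fixed_interval(seconds):
    """FI: reinforce the first response at least `seconds` after the last reinforcement."""
    last = [0.0]
    def respond(t):
        if t - last[0] >= seconds:
            last[0] = t
            return True
        return False
    return respond

# (A variable-interval schedule would redraw the required wait around an
# average after each reinforcement.)

fr5 = fixed_ratio(5)
hits = [fr5() for _ in range(10)]            # only responses 5 and 10 pay off
vr5 = variable_ratio(5, random.Random(0))
payoffs = sum(vr5() for _ in range(10_000))  # roughly 2,000: 1 in 5 on average
```

Note the split the cards describe: ratio schedules count responses, interval schedules count time since the last pellet.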
Shaping and differential reinforcement
Shaping: For a desired behavior, you reinforce (and then extinguish) steps closer and closer to the behavior you ultimately want.

Shaping is aka differential reinforcement of successive approximations ;@)
Behavior Therapies:

Based on classical conditioning (Flooding, Implosion, Systematic Desensitization, Conditioned Aversion)
Behavior Therapies:

Based on Classical Conditioning:
1. Flooding: Directly experiencing the CS

2. Implosion: Imagining the feared object (the CS).

3. Systematic Desensitization (Joseph Wolpe): The client imagines the feared object while the therapist ensures the client stays relaxed. Builds an anxiety hierarchy. Uses counter-conditioning.

4. Conditioned Aversion: Pairing a CS (tied to the unwanted behavior) with an aversive UCS.

BT's based on Operant Conditioning:
1. Contingency Management: Therapies that attempt to change the clients' behavior by altering the consequences of the behavior.

2. Behavioral contract: A written contract that states the consequences of certain acts.

3. Time-Out: Leaving the scene before you can gain reinforcement from an undesirable behavior.

4. Premack Principle: Using a more preferred activity to reinforce a less preferred activity.
Challenges to the Behaviorists

1. Thorndike's Trial and Error (and Law of Effect)
2. Wolfgang Kohler's Insight
3. Tolman's Cognitive Maps
4. Bandura's observational learning
5. Garcia's Preparedness (biological constraints)
6. The Brelands' Instinctual Drift
7. Taste-aversion studies
Challenges to the Behaviorists:

1. Thorndike (also a behaviorist): Problem solving with trial-and-error.

E.L. Thorndike's Law of Effect and puzzle box: Cats accidentally find the way out of the puzzle box, and get out much faster the next time. It's basic trial-and-error.

2. Wolfgang Kohler (cofounded Gestalt psychology): Said animals could problem-solve by insight.

Kohler said that if animals are given the OPPORTUNITY, they can learn with trial and error AND insight. He had chimps in a hard situation try trial and error, then sit down and think about it, then appear to solve the problem spontaneously (with insight).

3. Edward Tolman's Cognitive Maps: (a mental representation of a physical space). The rats made cognitive maps of their maze, so if their favorite route was blocked, they used their map to make up a new route.

4. Observational Learning (Bandura): Observing others' behavior can influence your behavior

5. Garcia's Preparedness: Animals are prepared to learn certain connections between stimuli (more than they are for others).

Garcia's work re: biological constraints: (inborn predispositions to learn different things in a certain way. Differs per species).

Garcia Effect: Rats. Group 1: sweet water + shock. Group 2: sweet water + nausea. Group 3: bright-noisy water + shock. Group 4: bright-noisy water + nausea.

So: NO CLASSICAL conditioning in groups 1 (sweet + shock) and 4 (bright-noisy + nausea); conditioning happened in groups 2 and 3.


Why? Rats are biologically predisposed to associate illness with something they ingested, and pair sights and sounds with externally induced pain.

People: Chemo patients similarly develop taste aversions and lack of appetite.


6. Brelands' (2 people) Instinctual Drift: Instinctual ways of behaving can OVERRIDE operant-conditioned responses.

Brelands: Biological Constraints in Operant Conditioning. The raccoon was dipping the coin, rubbing it, and NOT putting it in the piggy bank.

Why? Instinctual drift (raccoon was acting like the coin was a crayfish).

7. Taste-Aversion studies (vs. basic classical conditioning):
*Learned taste aversion can happen after JUST ONE trial.
* Learning with taste aversion can happen even if the UCS takes UP TO 24 hours to follow the CS (usually it needs to follow the CS immediately).
Ethology (what's species-typical behavior? Konrad Lorenz? Niko Tinbergen?)
Ethology:

*Species-Specific/Typical Behaviors: Behaviors characteristic of a certain species; they have an instinctual basis.
*Konrad Lorenz: Did the imprinting work. Established Ethology.
*Niko Tinbergen: Brought experimental methods to ethology.
Ethology II (what's a FAP? What triggers it? What's a sign stimulus? What's a releaser? Stickleback experiment? What's a supernormal stimulus?)
Ethology II

1. FAPs (fixed-action patterns): Certain action patterns that are stereotyped and species-typical. They're not unconditioned responses because they're MORE COMPLEX (ex. building a nest, rolling an egg).


FAPs are triggered by:

2. Sign-stimuli: Features of a stimulus that are SUFFICIENT to bring about an FAP.
3. Releasers: they're sign stimuli from ONE ANIMAL to ANOTHER. So, a releaser is an environmental stimulus that sets off a specific behavior.

4. In the stickleback experiment: The RED belly of the invading stickleback was the Sign stimulus and the releaser (because it was the most important element in triggering the aggressive behavior).

5. Supernormal Stimulus (Tinbergen): A stimulus that's even more effective at triggering the FAP than a natural stimulus.
Ethology III

What's an IRM (innate releasing mechanism)? what's a reproductive isolating mechanism? Who's Karl von Frisch? What's current in ethology?
Ethology III

1. Innate Releasing Mechanism (IRM): a proposed mechanism in an animal's nervous system that connects the stimulus with the right response.

The IRM explains why FAPs happen automatically after an animal perceives a sign stimulus, and continue to completion even if the stimulus is removed in the middle of the behavioral sequence.

2. Reproductive Isolating Mechanism: Keeps an animal from mating with closely related species; helps an animal identify its own species (ex. a species-specific call).

3. Karl von Frisch: Honey Bee dances. They communicate the distance and direction of a food source with special movement patterns.

Current ethology: Why does the animal act a certain way? Researchers want to know the evolutionary significance of certain behaviors.
Charles Darwin's Natural Selection
(3 steps? What's reproductive fitness? How's it relate to altruism? How does it contrast with the Theory of Kin Selection? re: Inclusive fitness?)
Charles Darwin's Natural Selection

Darwin: Natural selection is the key to evolution. Premise: Not every member of a species is equally successful at surviving and reproducing. There's variation among a species' individual members, and some of it's genetic.

Step 1: There are genetic differences among members of a species.
Step 2: If a certain genetic variation increases the chances of reproduction, it'll tend to be passed down to the next generation.

If a specific genetic variation DECREASES the chances of reproduction, it'll not be passed down.

Step 3: Over time, more and more members of the species will tend to have the genetic variation that increases their chances of reproduction.

Reproductive Fitness: The number of offspring that live long enough to reproduce.
Altruism: When the animal's behavior decreases its reproductive fitness.

BUT:

Theory of Kin Selection: Animals try to increase their Inclusive Fitness (the number of offspring that'll live to reproduce, AND the number of your relatives that will live to reproduce).
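The three steps can be run as a toy simulation (entirely my own illustration, with made-up numbers): half the population carries a gene variant that doubles the chance of reproducing, and over generations that variant takes over.

```python
import random

def next_generation(pop, rng):
    # Step 2: an individual's chance of contributing to the next
    # generation is proportional to its gene value (its fitness).
    return rng.choices(pop, weights=pop, k=len(pop))

rng = random.Random(1)
# Step 1: genetic variation. Baseline individuals carry 1.0; the variant
# 2.0 makes reproduction twice as likely.
pop = [1.0] * 50 + [2.0] * 50
# Step 3: over generations the advantageous variant spreads.
for _ in range(30):
    pop = next_generation(pop, rng)
share = pop.count(2.0) / len(pop)
print(share)  # close to 1.0 after 30 generations
```

The same sketch makes the altruism puzzle visible: any gene that lowered its carrier's reproductive weight would shrink, which is why kin selection appeals to inclusive fitness.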
Ethology:

E.O. Wilson
Ethology

E.O. Wilson: Sociobiology (studies how social behaviors increase fitness).

Wilson: Believed that GENETICS and the ENVIRONMENT cause behavior.