87 Cards in this Set

  • Front
  • Back
Delay conditioning
Conditioned stimulus begins before the unconditioned stimulus and stays on until the US arrives (forward pairing); most effective form of classical conditioning because it has both predictive value and contiguity
Trace conditioning
Conditioned stimulus is presented and ends before the unconditioned stimulus begins, leaving a gap (forward pairing); not as effective since it relies on the subject's trace memory of the conditioned stimulus
Most important factors in classical conditioning
1) Contiguity: how close the CS and US are temporally
2) Frequency of the pairing: more pairings lead to a stronger connection between the CS and US
3) Predictive value of the CS
Backward conditioning
Unconditioned stimulus is presented before the conditioned stimulus; not effective because there is no predictive value
Autoshaping
In this experiment, the US comes irrespective of the animal's behavior, but when the US is paired with some signal (e.g., a button that lights up), the animal will respond to this signal as if it were the US (e.g., pigeons will peck at a light as if it were food because it seems to signal food)
Inhibition of delay
Inhibition that develops to the early portion of a CS in delay conditioning; early part of a CS signals a period without the US so that the animal does not feel compelled to act immediately, but it will increase the strength of its CR as it reaches the time at which the US is generally presented
Overshadowing
If an animal is presented with two CSs, both of which are equally predictive of the US, it will respond more strongly to the CS that is more biologically relevant (e.g., a tone vs. a light in dogs)
Conditioned emotional response
1) Train a rat to press a lever for food on a variable interval schedule so that the rat will continuously press the lever because it is not sure when the next reinforcer (food) will come
2) Pair a US (usually aversive, like a shock) with a CS (a tone)
3) Repeat the first step; the rat will press the lever with the same rate, but if the tone is played, the rat will slow down its lever pressing
4) The rat's fear can be measured by observing how much it slows down its lever pressing
Stimulus substitution theory
Posited by Pavlov; through repeated pairings of the CS and US, the CS becomes a substitute for the US, so that the UR becomes the CR, and the two are comparable; however, the CR and UR are rarely identical
Acquisition phase (classical conditioning)
Part of an experiment in which the subject first experiences a series of CS+US pairings and during which the CR gradually appears and increases in strength
Asymptote (classical conditioning)
Maximum level of responding that is gradually approached as the learning experiment continues; stronger CS+US lead to higher asymptotes and faster conditioning
Extinction (classical conditioning)
Presenting the CS without the US so that the CR is diminished over time
Spontaneous recovery
Spontaneous reappearance of a CR after an extinction trial even when no CR is recorded at the end of the extinction trial
Inhibition theory of spontaneous recovery
After extinction is complete, the animal has two counteracting associations: the excitatory CS+US association formed during acquisition and the inhibitory CS-US association developed during extinction; inhibitory associations are more fragile and can be weakened with passage of time, explaining the spontaneous recovery
Conditioned inhibition
Any CS that reduces the size of the CR from what it would otherwise be.
For example, if a dog is trained to salivate at the sound of a buzzer (+food), a light flash can be introduced, which signals no food. At first, the dog salivates when the buzzer and light are presented together, but eventually, it will only salivate when the buzzer is sounded (learns to discriminate).
Generalization
Transfer of the effects of conditioning to similar stimuli; put a pigeon on a VI 5-minute schedule when a green light is shown and measure the number of pecks at different wavelengths to compile an excitatory gradient; peak shift can occur, in which responses shift away from the negative stimulus (wavelengths that are not green)
Discrimination
Subjects learn to respond to one stimulus but not a similar stimulus
Rescorla's CER study
Rats were divided into three groups: 1) standard pairing (each CS was followed by a US), 2) partial pairing (only some CSs were followed by a US), and 3) random control (some USs without CSs), and CER was performed with all groups. It was found that suppression of lever-pressing was the highest in the rats that underwent standard pairing because this type of conditioning had strong contingency (predictive value) and contiguity.
Rescorla-Wagner model equation
A mathematical theory of classical conditioning that states that, on each trial, the amount of excitatory or inhibitory conditioning depends on the associative strengths of all the conditioned stimuli that are present and on the intensity of the unconditioned stimulus
ΔV_n = k(λ − V_(n−1))
λ = asymptote of conditioning, V = associative strength (measured through the CR), n = trial number, k = CS salience (0–1; 1 is most salient)
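The update rule can be sketched in a few lines of Python (a minimal illustration; the salience k = 0.3 and asymptote λ = 1 are made-up values, not from the card):

```python
# Rescorla-Wagner update: delta_V = k * (lambda - V_previous)
def rw_trials(v0, k, lam, n_trials):
    """Return associative strength V after each of n_trials."""
    vs, v = [], v0
    for _ in range(n_trials):
        v += k * (lam - v)  # the surprise term (lam - v) shrinks every trial
        vs.append(v)
    return vs

acquisition = rw_trials(v0=0.0, k=0.3, lam=1.0, n_trials=20)  # CS+US pairings
extinction = rw_trials(v0=acquisition[-1], k=0.3, lam=0.0, n_trials=20)  # CS alone
```

During acquisition V climbs toward the asymptote λ = 1 with ever-smaller increments; setting λ = 0 for extinction drives V back toward zero in the same way.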
CS pre-exposure effect
Classical conditioning proceeds more slowly if a CS is repeatedly presented by itself before it is paired with a US, because the subject learns to ignore the CS since it predicts nothing, so the CS+US association takes longer to form; the Rescorla-Wagner model does not predict this effect, since V stays at zero during the CS-alone presentations (λ = 0 and V = 0, so ΔV = 0) and the model therefore expects later conditioning to proceed at the normal rate
Blocking effect
One group of rats (blocking group) underwent a CER experiment in which light flashes were paired with shocks. In the second phase, the blocking group underwent CER with light flashes + tone + shock, while a control group received only this compound phase. In the testing phase, the tone was presented with no shock to measure the strength of the CS. The blocking group did not react to the tone, but the control group did, indicating that prior conditioning with the light blocked later conditioning to the tone, since the tone added no new information about the US.
Basic concepts of the Rescorla-Wagner model
Learning will only occur if the subject is surprised.
Strength of US > strength of subject's expectations = excitatory conditioning (acquisition)
Strength of US < strength of subject's expectations = inhibitory conditioning (extinction)
Strength of US = strength of subject's expectations = no conditioning, asymptote (blocking)
The larger the discrepancy between the strength of the expectation and the strength of the US, the greater the conditioning that occurs while more salient CSs will condition faster than less salient ones.
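Because all conditioned stimuli present on a trial share a single prediction, these rules also produce the blocking effect. A minimal sketch (the salience values and trial counts are illustrative, not from the card):

```python
def rw_compound(vs, ks, lam, n_trials):
    """Update each element of a compound CS; all share the prediction sum(vs)."""
    vs = list(vs)
    for _ in range(n_trials):
        surprise = lam - sum(vs)  # US strength minus the total expectation
        vs = [v + k * surprise for v, k in zip(vs, ks)]
    return vs

# Blocking group: light alone first, then the light+tone compound.
v_light, = rw_compound([0.0], [0.3], lam=1.0, n_trials=30)
v_light, v_tone = rw_compound([v_light, 0.0], [0.3, 0.3], lam=1.0, n_trials=30)

# Control group: the compound only.
c_light, c_tone = rw_compound([0.0, 0.0], [0.3, 0.3], lam=1.0, n_trials=30)
```

The pretrained light already predicts the US, so the surprise term is near zero throughout the compound phase and the tone gains almost no strength (v_tone ≈ 0); in the control group the tone ends up with substantial strength.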
Miller's experiments with rats and autonomic responses
A rat was given curare to paralyze its skeletal muscles. Whenever the rat's heart beat at a specific rate, a pleasurable stimulus (electrical brain stimulation) was given. After a while, the rat shifted its heart rate until it reached that pace. This was also demonstrated with inner-ear activity and intestinal and renal motility.
Logan's experiment with rats and response rate
An alley was set up with a start box and a goal box containing food. If the rat took more than 18 seconds to reach the goal box, it received 4 pellets; if it took less than 18 seconds, it received only one pellet. At first the rat moves slowly because it is exploring its environment, but once it sees that it is rewarded, it becomes enthusiastic and increases its rate of responding (runs faster to the goal box). This decreases its reward, so the rat decreases its enthusiasm (rate of responding) and runs slower until it receives 4 pellets again. This leads to an oscillation in behavior.
Thorndike's law of effect
Thorndike placed animals inside a cage and measured their escape latency. He found that behaviors that did not decrease escape latency became less frequent, while behaviors that decreased escape latency became more frequent. The greater the satisfaction or discomfort experienced by an animal, the more or less likely it is to repeat that behavior. Behavior is governed by its consequences.
Shaping
Method of successive approximations; makes use of a conditioned reinforcer, which is a previously neutral stimulus that has been repeatedly paired with a primary reinforcer; behaviors that approximate the desired behaviors are rewarded.
Free-operant procedure
Operant response (lever-pressing, button-pushing, etc.) can occur at any time and repeatedly for as long as the subject remains within the experimental chamber, which allows a relationship between the response rate and reinforcer to be determined.
Three-term contingency
In the presence of a specific stimulus, the reinforcer will occur if and only if the operant response occurs.
Continuous reinforcement
Schedule of reinforcement in which every occurrence of the operant response is followed by a reinforcer; rare in the real world
Discrimination hypothesis of schedules of reinforcement
In order for a subject's behavior to change once extinction begins, the subject must be able to discriminate the change in reinforcement contingencies; this is difficult to do in schedules of reinforcement other than continuous reinforcement because there is always a time when the reinforcer is not present, so they are resistant to extinction.
Generalization decrement hypothesis of schedules of reinforcement
An animal on a continuous schedule of reinforcement shows decreased responding as the test conditions grow less and less similar to the training conditions (i.e., once the reinforcer no longer follows every operant response), since it never learned to keep responding without reinforcement. Animals on other schedules have learned to respond through unreinforced stretches, so extinction conditions differ little from training; these schedules are therefore resistant to extinction.
Differential reinforcement of low rates schedule
A response is reinforced if and only if a certain amount of time has elapsed since the previous response; responding before this time has elapsed goes unreinforced and restarts the interval
Differential reinforcement of high rates schedule
A certain number of responses must occur within a fixed amount of time
Concurrent schedule
Subject is presented with two or more response alternatives, each associated with its own reinforcement schedule
Chained schedule
Subject must complete the requirement for two or more schedules in a fixed sequence; each schedule is signalled by a different stimulus; strength of responding decreases as a schedule is further and further removed from the primary reinforcer
Multiple schedule
Subject is presented with two or more different schedules, one at a time, and each schedule is signalled by a different stimulus.
Fixed ratio schedule
Reinforcer is delivered after every n responses; the animal shows a postreinforcement pause (no responses), which eventually gives way to an abrupt resumption of responding at a constant, rapid rate until the next reinforcer is delivered (e.g., the piecework method in factories)
Variable ratio schedule
Exact number of required responses is not constant from reinforcer to reinforcer; postreinforcement pauses are brief because after each reinforcer there is a possibility that another reinforcer will be delivered after only a few responses (e.g., gambling)
Fixed interval schedule
The first response after a fixed amount of time has elapsed is reinforced; subjects tend to make many more responses per reinforcer than is required; a postreinforcement pause occurs, but the subject begins responding slowly and responds more rapidly the closer it gets to the reinforcer time (e.g., waiting for a bus, where the operant response = staring down the street, looking for the bus)
Variable interval schedule
The amount of time that must pass before a reinforcer appears varies unpredictably; long pauses after a reinforcer are not advantageous because a reinforcer could appear at any moment; these schedules tend to sustain a steady response rate (e.g., checking the mail)
Why is there a postreinforcement pause on fixed ratio schedules?
1) Fatigue hypothesis: the subject has made many responses and is now fatigued so it must rest
2) Satiation hypothesis: reinforcer causes a slight decrease in the animal's desire for the reinforcer so there is a brief interruption in responding
3) Remaining-responses hypothesis: the subject is farthest from the delivery of the next reinforcer immediately after the occurrence of a reinforcer; therefore, responses that are farthest removed from the reinforcer are the weakest
Peak procedure
Checks an animal's sense of timing; turn on a light for 20 minutes, and halfway through, a peck has a 50% chance of producing food (an FI 10-minute schedule, but only 50% of the time); the pigeon is impulsive and begins pecking around 5-6 minutes even though 10 minutes has not elapsed; pecking increases as the time approaches 10 minutes and decreases the farther the time gets past 10 minutes
Higher order conditioning
Does a second CS help the animal predict the US originally paired with the first CS? Pair CS1+US, then pair CS2+CS1. This works, but a third-order CS does not work as well.
Feature discrimination
A second CS is added to the first CS to signal trials on which the US will or will not occur.
In feature positive, an asterisk is added to a light to signal a VI 5 minute schedule; no asterisk = extinction
In feature negative, asterisk on the light = extinction; no asterisk = VI 5 minute schedule
In the feature-negative schedule, the pigeons pecked whether or not there was an asterisk, but they pecked as far away from the asterisk as possible (didn't want the bad news)
Frustration-aggression hypothesis
If something makes you frustrated, then you will take your aggression out on something else.
Amsel set up an alley with two goal boxes containing food and trained a rat to run through it; on a day when the first goal box had no food, the rat ran faster to the second one because it was frustrated.
Behavioral contrast
Response rate increases in the unchanged component of a multiple schedule when the conditions of reinforcement in the changed component are worsened (reduced).
Reynolds showed that a pigeon would peck at both a red and a green light when each signaled a VI 5-minute schedule. However, when the green light came to signal extinction, the pigeon began pecking faster when the light was red (increased response rate)
Interference account of behavioral contrast
A multiple schedule with a VI 5-minute component with a tone and a VI 5-minute component without the tone; the rat responds at the same rate in both components since both pay off. Change the schedule so that no tone = extinction; the rat then performs other activities during the no-tone component. The increased response rate in the unchanged component has less to do with frustration than with maximizing activity and not wasting energy when responding is pointless.
Additivity theory
Green light indicates reinforcer; when the light is another color, the pigeon will not peck at it since it does not mean food (autoshaping); the meaning of the green light can be degraded by providing the reinforcer when the light is another color; adds Pavlovian behavior to an operant behavior
Problems with Skinnerian theory
1) The schedule of reinforcement can shape behavior: the rate of response on a VR schedule is greater than on an FR schedule, so by Skinner's value theory the value of the VR reinforcer should be greater, but the reinforcers are the same.
2) The vigor of a response is unaffected by the value of a reinforcer over a certain range (asymptote)
Matching law and Herrnstein
Put a pigeon on a concurrent VI 2-minute and VI 4-minute schedule, each with its own button. The pigeon pecks more at the VI 2-minute button because it has a "richer" schedule. Make the VIs the same but give it a choice between hemp and wheat: the pigeon prefers wheat. Then give it a choice between hemp and corn: the pigeon chooses the hemp. Given corn and wheat, it chooses the wheat in a 4:1 ratio (hedonic scaling).
Matching law equation
Behavior to key 1 / behavior to key 2 = reinforcements of 1 / reinforcements of 2
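In symbols, B1/B2 = r1/r2. For the concurrent VI 2-minute / VI 4-minute example, the obtained reinforcement rates are roughly 30 and 15 per hour, so the law predicts a 2:1 allocation of pecks. A quick sketch (the per-hour rates are approximations for illustration, not from the card):

```python
def matching_ratio(r1, r2):
    """Predicted response ratio B1/B2 from reinforcement rates r1, r2."""
    return r1 / r2

ratio = matching_ratio(30, 15)  # VI 2-min vs VI 4-min: ~30 vs ~15 reinforcers/hour
share = 30 / (30 + 15)          # equivalently, key 1 gets 2/3 of all responses
```

The matching law predicts proportional allocation, not exclusive choice: the pigeon still gives a third of its pecks to the leaner key.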
Law of demand
Demand for something drops if its price is increased; eventually, if too much energy must be put in and the reward is too little, the animal stops responding
Elasticity of demand
If the demand for a product drops quickly as its price increases, it has high elasticity, but if the demand for a product drops slowly as the price increases, it has low elasticity; low value products are affected by substitutability whereas high value products are not affected by this.
Engel curves
Value is a function of income level; therefore, you can't assign value to a reinforcer without knowing the income level
Baboons & heroin
Baboons were rewarded with heroin or food depending on which button they pressed, with a set budget of button presses; "rich" baboons chose heroin over food, but when they were made "poor", they chose food over heroin (value is a function of income level).
Giffen good
Anti-matching law good; explains the behavior of the Irish during the potato famine (price of potatoes went up, but the Irish bought more potatoes since this was a less rich schedule and more would make it richer, and they had to defend their body weight and avoid sickness by eating).
Copyist model
The idea that when you produce a reinforcer due to an operant response, you must do the exact same thing to produce it again; shown in Guthrie's experiments with cats that had to hit a stick to exit a box, and in superstitions.
Aversive control
The law of effect is asymmetric because positive punishment does not work as well as positive reinforcement does; escape/avoidance: in the presence of an aversive stimulus, the animal makes a response that escapes it or avoids it
Two-factor theory
The first factor is Pavlovian (fear conditioning to a warning signal) and the second is operant (responding to escape that fear). A dog placed on one side of a shuttle box is shocked and jumps to the other side; it learns to associate the warning signal (the light going out) with the shock and jumps before the shock arrives (avoidance). If the shock is delivered on both sides regardless of responding, the response rate eventually decreases = learned helplessness
Desensitization (Pavlovian psychotherapy)
Developed by Wolpe (one cannot be anxious and relaxed at the same time); patients were taught muscle relaxation, built a hierarchy of fears, and worked through it while relaxing; reciprocal inhibition (imagine the anxiety-provoking situations and work through them until there is no anxiety)
Counterconditioning (Pavlovian psychotherapy)
Pair a stimulus with an aversive stimulus to change it from a positive to a negative valence
Implosion therapy (Pavlovian psychotherapy)
Expose a patient to a feared object all at once; seldom used
Overcorrection (operant psychotherapy)
Go above and beyond a correction of a behavior to the point that it is aversive, and the patient will quit the behavior
Devaluing a reinforcer (operant psychotherapy)
Make a reinforcer unimportant (woman who hoarded towels)
Token economy (operant psychotherapy)
Earn tokens by performing certain behaviors that can then be used to purchase privileges (schizophrenics)
Control systems theory
Deals with goal-directed behaviors; each system has a comparator, which receives reference input (what the value should be) and actual input (what it actually is); the comparator then has an output that will attempt to bring the actual input closer to the reference input; disturbances can also occur and affect the comparator and inputs
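The comparator loop can be sketched as a simple negative-feedback iteration (the gain, set point, and disturbance values below are made up for illustration):

```python
def comparator_step(reference, actual, gain=0.5):
    """Comparator output proportional to the error between reference and actual input."""
    return gain * (reference - actual)

reference = 37.0  # reference input: what the value should be
actual = 33.0     # actual input after a disturbance
for _ in range(20):
    actual += comparator_step(reference, actual)  # output nudges actual toward reference
```

Each pass shrinks the error by the gain factor, so the actual input converges on the reference input; a new disturbance simply restarts the correction.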
Reflexes (innate behavior pattern)
Stereotyped pattern of movement of a part of the body that can be reliably elicited by presenting the appropriate stimulus; mediated by interneurons, sensory neurons, and motor neurons; stretch receptors in the muscles are the comparators
Kinesis (innate behavior pattern; tropism)
The direction of movement is random in relation to some stimulus; wood lice must remain in humid areas; when they are in a dry area, they constantly move around, increasing their chances of locating a humid area, but when they are in a humid area, they decrease their movement, decreasing the possibility that they'll move away from it
Taxis (innate behavior pattern, tropism)
The direction of movement bears some relationship to the location of the stimulus; maggots move away from any bright light source because they have a light-sensitive receptor that can compare the light in various directions
Fixed action patterns (innate behavior pattern)
Behavioral sequences that are a part of the repertoire of all members of the species; experiments have confirmed that the animal's ability to perform the behavior is not learned; in the sequence of behaviors, the behaviors occur in a rigid order regardless of whether they're appropriate; nut-burying behavior of a squirrel or defense response of male three-spined stickleback
Reaction chains (innate behavior pattern)
The progression from one behavior to the next depends on the presence of the appropriate external stimulus; if the stimulus is not present, the chain of behaviors will be interrupted; hermit crabs looking for a new shell; more dependent on external stimuli, so they are more variable but more adaptable than FAPs.
Sign stimulus
Some signal that usually initiates a fixed action pattern even if it is not entirely appropriate
Habituation
Decrease in the strength of a response after repeated presentation of a stimulus that elicits a response.
1) Stimulus specific (distinguishes it from sensory adaptation or muscular fatigue).
2) Occurs whenever a stimulus is repeatedly presented, and the decrements in responding from trial to trial are large at first but get progressively smaller.
3) If the stimulus is withheld for some period of time, the response will recover
4) If habituation disappears after a long interval, it will proceed more rapidly in a second series of stimulus presentations
5) Proceeds more rapidly with a weak stimulus
Opponent-process theory
A subject's response to a stimulus changes simply as a result of repeated presentations of that stimulus; however, with stimulus repetition, some emotional reactions weaken while others are strengthened
Typical pattern of an emotional response
A stimulus produces the sudden appearance of an emotional reaction, which quickly reaches a peak of intensity, which gradually declines to a somewhat lower level (plateau). With the offset of the stimulus, there is a sudden switch to the emotional after-reaction, which is an opposite of the initial emotion. This gradually declines, and the emotional state returns to a neutral state. Arises from the antagonistic effects of the a-process (initial response) and b-process (after-reaction).
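This time course falls out of subtracting a sluggish b-process from an a-process that tracks the stimulus. A minimal sketch (the stimulus duration, gain, and b-process rate are illustrative values):

```python
def opponent_process(stim_on=50, total=100, a_gain=1.0, b_rate=0.05):
    """Net emotional response = a-process minus a slower opponent b-process."""
    net, b = [], 0.0
    for t in range(total):
        a = a_gain if t < stim_on else 0.0  # a-process follows the stimulus
        b += b_rate * (a - b)               # b-process lags behind the a-process
        net.append(a - b)                   # the observed emotional state
    return net

resp = opponent_process()
```

Early on, b is still small, so the net response peaks; as b catches up, the response declines to a plateau; at stimulus offset, a vanishes while b lingers, producing the opposite-signed after-reaction that decays back to neutral.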
Sociobiology
Your body is a support system for your gonads; the point of life is to maximize fitness by increasing genetic representation in the future (having sexual reproductive success)
Last gamete hypothesis
The male leaves the last gamete in reproduction since the egg is already present; therefore, he maximizes his fitness because he is able to leave and impregnate someone else and increase his genetic representation in future generations.
Optimal market value
Ages that women are looking for in men vs. the number of men available at that particular age (women choose men who are older than them because this provides resources while men choose women who are younger because this ensures fertility)
Symmetry in reproductive success
Females prefer symmetry in males since this correlates with health and strength; if two bands of the same color are placed on a bird's legs, it will have greater reproductive success than a bird with two different-colored bands (which lacks symmetry)
Sexual conflict
Men desire to copulate immediately, but women do not, because a woman wants to ensure that she will receive resources and commitment from the male. Because it takes time to invest in a female, the male is discouraged from starting another relationship, since that would take just as long.
Sexual jealousy
Women don't want to have their resources compromised (feel worse when a man is emotionally invested in someone else) and men don't want to be cuckolded because they have invested in the woman already (feel worse when the woman is sexually invested in someone else); men who have poor connections with their mates will have more ejaculate when they return to their mates after a long period of time (sperm competition)
Correlation between divorce rates and rape rates
The higher the divorce rate, the higher the rape rate. Males divorce wives nearing infertility and then pursue fertile women (a male with greater resources can remove such a woman from the available pool). Other males rape to pass on their genes because the pool of fertile women has shrunk.
Parental favoritism
Fathers are more likely to support their genetic offspring with their ex-wife than their non-genetic offspring with a current wife; if men have a lot of money, they will invest in the male offspring more because this gives him huge reproductive success (can give bride's wealth to families with the most fertile daughters); invest some money in daughters so they can be elevated in terms of value (marriage)
Inequity aversion
Monkeys exchanged tokens for food: one monkey exchanged a token for a cucumber and another for a grape, which is more favored. Effort control group vs. food control group (one monkey is the model and the other is the witness): when both are given cucumbers, there is no protest; when the model is given a grape and the witness a cucumber, the witness gets angry (though it may be rejecting the cucumber because it had previously gotten a grape, not because the other monkey got one)
Kahneman & Tversky: Prospect theory (Asian disease problem)
Disease expected to kill 600 people:
Option A: 400 die for sure
Option B: 1/3 probability that no one dies or 2/3 that all die
Option C: 200 lives saved for sure
Option D: 1/3 probability all saved or 2/3 probability no one is saved
Present A & B together then C & D
People prefer option B when it is in the domain of losses (risky choice) but when it is in the domain of gains, people prefer C (risk averse).
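The framing is doing all the work: a quick arithmetic check shows that every option has the same expected outcome, 400 of the 600 expected deaths.

```python
# Expected deaths out of 600 under each option of the Asian disease problem.
ev_a = 400.0                     # Option A: 400 die for sure
ev_b = (1/3) * 0 + (2/3) * 600   # Option B: 1/3 nobody dies, 2/3 all 600 die
ev_c = 600 - 200                 # Option C: 200 saved for sure -> 400 die
ev_d = (1/3) * 0 + (2/3) * 600   # Option D: 1/3 all saved, 2/3 none saved
```

Options A and C (and B and D) describe identical outcomes; only the loss vs. gain wording differs, yet preferences flip.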
Domain of gains vs. domain of losses
When an option is in the domain of losses, people are risk-seeking; however, when an option is in the domain of gains, people are risk-averse (loss aversion)
Certainty effect
People overweigh certainty and wildly overestimate the probability of the improbable = irrationality