2.) I have a three-year-old brother who is always throwing tantrums. My family and I thought that the terrible twos would pass quickly and that my little brother wouldn't be throwing fits at age three, but he still is. He cries when he doesn't get his way, and the crying lasts a long time. That is normal behavior for a three-year-old, though, and I still love …
2.) Positive reinforcement is adding something to a person's environment; it's the addition of something the person likes. An original example of positive reinforcement is a child who finishes her dinner and is rewarded with dessert. Negative reinforcement is taking away something unpleasant; it's the subtraction of something the person doesn't like. An original example of negative reinforcement is your calculus instructor saying that if you get A's on all the exams during the quarter, you won't have to take the cumulative final.
3.) A primary reinforcer is a reinforcer that is naturally rewarding and usually has some connection to biological needs. Three primary reinforcers are food, water, and sexual contact. We don't have to learn to like them. A secondary reinforcer is a reinforcer that is not automatically reinforcing; it acquires its reinforcing power through its association with other reinforcers. Three secondary reinforcers …
On the other hand, with partial reinforcement, only some instances of the response are reinforced.
6.) An interval schedule of reinforcement is different from a ratio schedule because on an interval schedule, reinforcement becomes available after a fixed amount of time has passed. On a fixed ratio schedule, by contrast, a fixed number of responses is required before reinforcement will occur. Fixed interval schedules lead to much lower rates of response than fixed ratio schedules.
7.) A variable schedule of reinforcement is different from a fixed schedule because on a variable schedule, the number of responses required before reinforcement occurs varies (variable ratio schedule), or the amount of time before reinforcement becomes available varies (variable interval schedule). On a fixed schedule, by contrast, a fixed number of responses is required before reinforcement occurs (fixed ratio schedule), or reinforcement becomes available after a fixed amount of time has passed (fixed interval
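The difference between these schedules can be sketched as a short simulation. This is only an illustration, not something from the course material: the function names and parameters are hypothetical, and the fixed-interval function assumes the subject responds as soon as reinforcement becomes available.

```python
import random

def fixed_ratio_rewards(num_responses, n):
    """Fixed ratio: every n-th response is reinforced."""
    return num_responses // n

def fixed_interval_rewards(total_time, interval):
    """Fixed interval: the first response after each interval is reinforced.
    Assumes the subject responds as soon as reinforcement is available."""
    return int(total_time // interval)

def variable_ratio_rewards(num_responses, mean_n, rng):
    """Variable ratio: each response is reinforced with probability 1/mean_n,
    so on average one reinforcer is earned per mean_n responses."""
    return sum(rng.random() < 1 / mean_n for _ in range(num_responses))

rng = random.Random(0)
print(fixed_ratio_rewards(20, 5))          # 4 reinforcers in 20 responses
print(fixed_interval_rewards(60, 15))      # 4 reinforcers in 60 minutes
print(variable_ratio_rewards(20, 5, rng))  # near 4 on average, varies by run
```

On the ratio schedules the subject controls how fast reinforcers arrive (respond faster, earn more), while on the interval schedules extra responses earn nothing until the time elapses, which is why fixed interval schedules produce lower response rates.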