45 Cards in this Set

  • Front
  • Back

Confirmation bias

When a belief is held or introduced, it can create a situation where people tend to spot the hits and ignore the misses with respect to the belief being true. Or: any tendency of thought or action that contributes to a salient proposition seeming more warranted than it is.


Ex: stereotypes around money

Top down expectation bias

What we expect has an impact on what we experience, and the expectation overrides the reality of what is true.


Ex: the McGurk effect, where the visual processing of our eyes overrides our ears to make sense of the situation. If the same sound is played over different lip movements, the sound will be heard differently: "ba" will be heard as "fa".

The hollow face illusion

A top-down expectation bias and another example of perceptual bias. We recognize faces everywhere, so even a rotating mask will appear as an outward-facing face when we see its hollow inside.

Someone suggests that a song played backwards sounds like certain words. At first it just sounded like random noise, but after this suggestion it clearly sounds like those words. This is an example of:

Perceptual bias.

When told to focus on a certain thing, perceptual bias can lead to:

Missing other completely obvious things

A palm reader tells you that you will win the big game. You gain confidence and win. This is an example of what?

Self-fulfilling prophecy. Here there is a process in favour of confirmation, rather than you falsely believing the prediction was confirmed owing to a bias.

A palm reader tells you that you will play better this game. You play normally, but every play seems significant. This is an example of what?

Confirmation bias. If you believe something, then you are likely to treat neutral evidence as confirmation of your belief. Even the mere salience of a superstition (even if you do not believe it) will make misfortune related to the superstition seem significant.

Example of expectation biasing judgement?

-perceptual bias


-confirmation being over-interpreted in "temporarily open cases" (the messiah will return! ... you know ... some day).

Reasoning vs representative cases:

Reasoning is overtaken by representative cases. Even though A alone is more likely than A and B both being true, if the case description mentions a lot about B, many people will rank "A and B" as more likely than just A.
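A minimal worked illustration of the probability rule behind this card (the specific numbers are illustrative assumptions, not from the source): a conjunction can never be more probable than either of its conjuncts, since

\[
P(A \wedge B) = P(A)\,P(B \mid A) \le P(A), \qquad \text{because } P(B \mid A) \le 1.
\]

For example, if P(A) = 0.3 and P(B | A) = 0.5, then P(A ∧ B) = 0.15, which is lower than P(A) alone; ranking "A and B" above A is the conjunction fallacy.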

Framing effects

The way a situation is presented can have a powerful influence on judgements about it.


Ex: baby killers vs people taking back their bodies from a patriarchal system.

Cognitive biases: judgement and interpretation. Biases have especially powerful and ubiquitous effects at psychologically higher levels of processing, leading to:

Judgements about what data really are


Decisions about how to weigh the evidence


Behaviours in seeking evidence


Judgements about the importance of data.

Confirmation biases fit into 2 categories:

Biases towards evidence supporting the belief in question


Biases toward evidence undermining a belief

Biases toward evidence supporting the belief in question can manifest in what ways?

Looking specifically for evidence that supports the belief, such as:


1. Top-down effects of perception: an expectation of or commitment to the truth of some belief can actually shape perceptions that seem to support it


2. Biases in evidential search methods (be that actively searching for supporting evidence OR not searching deeply enough for things that contradict your viewpoint)


3. Structural biases: a search structure that yields a specific result.

Biases toward evidence undermining a belief can manifest in what ways?

1. Evidential neglect


2. Disproportionate criticism (which works well with giving disproportionate credence to the favoured side and leads to moving goalposts)

Fallacy of moving goalposts

Different standards being used for the evaluation of evidence for and against X

Social cognition

The existence of other people in a reasoning context, and the nature of our relations with them, bear on our judgements and inferences in two broad ways.


1. Reasoning about other people


2. Reasoning influenced by them

Why are the number and kind of people around us an enormous influence on the way we reason, problem-solve and make decisions?

-they are the source of much of our information


-Much of our reasoning is about them


-Much of our reasoning about other things is affected by their presence

If we wish to reason well in group contexts over the long term, what pitfalls of thinking in group contexts must we be aware of?

-The flow of information through other people raises problems that we typically overlook.


-our reasoning about, and in the presence of, other people tends to be flawed in a predictable set of ways.

Business, family life and recreation are all mediated by what?

Our relations with the people around us:


-Family members


-Employees and employers


-Business contacts


-Friends, competitors, teammates, etc.

Key factors in unreliable social reasoning (common forms of poor reasoning about others)

Optimistic assessment of oneself


Idealized/oversimplified theorizing


Overemphasis on character rather than context

Fundamental attribution error

Explaining "local behavior in terms of broad character traits which overlooking local situational explanations. (Being cut off on traffic likely means the person is in a rush rather then being an overall bad person. But most people just call them Jerks.) This leads to many lost opportunities. Ex: Jones and Harris 1964. (Castro statements must be given by pro Castro people even tho subjects are told the writer were forced to argue the position given to them)

Optimistic self-assessment

A form of bias: we are quick to think of positive qualities of ourselves and rarely of negative ones. When reasoning about ourselves, we tend to make the error of optimistic self-assessment.

False polarization

Overestimating the differences between one's own viewpoint and that of someone who disagrees, by interpreting the other person's view as closer to the polar opposite than it actually is.

Optimistic self assessment + oversimplification = ?


(How to overcome it?)

Stereotyping and false polarization (we see our own nuances and stereotype the opposition)


One approach to reduce it: take a few minutes to explicitly summarize the opponent's reasoning for holding their view. Write the reasons down or explain them to someone. This improves discussions and facilitates agreement.

How reasoning about others encourages false polarization

-Believing it will be seen as an admission of weakness, we are unwilling to articulate our reservations about the stereotypical view to those we broadly disagree with


-Believing we will be seen as wishy-washy, we are unwilling to articulate them to those we broadly agree with


This leads to moderate views being pressured out of the discussion from both directions by the social forces on both sides, leaving stereotypical views overrepresented.

Explain the fallacy "people at both extremes on the issue disagree with me so I must be doing something right"

Confirmation bias at work.


-Who one considers an extremist is a judgement that can simply follow from your view of yourself as reliable and centrist.


- as long as someone is more extreme you can interpret their dissent to make it seem equivalent to the dissent of everyone on the other side.


-biased definition plus biased interpretation of evidence confirms our preexisting idea that we are centrist, reasonable and moderate even when this may be false.

The bandwagon effect

When all or most people in a group are in agreement, it is much more difficult to hold a dissenting view. This is a problem for belief, not just expression.


Pressure against expressing dissent creates pressure against dissenting belief.

False consensus effect

Overestimating the extent to which others share one's own perception of a situation.


-on matters of taste or preference


-on the interpretation of ambiguous data


-interpreting silence as agreement

Debating strategies

-Make a habit of considering reasons why those who have not committed to your position might silently disagree


-Create an environment in which voicing dissent is permissible; be the first to air contrary opinions, at least as a "devil's advocate"

Leveling and sharpening

Phenomena that jointly shape the content of the message in virtually every social transmission


Leveling: the elements of a message or narrative that are incidental or supporting details get minimized in passing on the report.


Sharpening: the point of the message that is perceived as central becomes emphasized.

Why does assimilation occur in regard to leveling and sharpening?

We convey messages by understanding the point and explaining that point to our audience. This implicates our own biases, which are then added when retelling the story.


So stories change: some things are leveled and others are sharpened.

Coverage

The property of a social context, regarding some particular claim, that makes it reasonable for you to (provisionally) reject the claim on the grounds that if it were true, you would already know it.


Ex: I think I would know that by now


Results from: idealization and optimistic self assessment


(It is easy to overestimate how efficiently our social context transmits information, and how exhaustively the current state of information is distributed through our community.)

Policing

A group context is policed with respect to some claim if it is reasonable for you to (provisionally) accept the claim on the grounds that if it were false, nobody would say it.


Ex: if the claim were false, then 1. it would be a lie, 2. it would easily be disproven, and 3. it would involve negative consequences for the speaker.


People overestimate the force of policing in a social informational context

What is the function of science?

To avoid the individual systematic biases seen in our own reasoning, including in: deductive reasoning, inductive reasoning, data selection, testimony, media, our own perceptions, our own memories, and our own interpretations of data.


-Science provides a context of inquiry in which any competent practitioner may participate. The prospect of momentary individual error is factored out by a requirement of repeatability.


Specifically, how does science fix biases?

It's a set of practices valuable for their effectiveness in minimizing the effects of any one specific error.


Silencing resulting from the false consensus effect, and social pressures against questioning assertions, are put aside.


-There are explicit conventions that favour noting confounds and questioning outcomes


Systematic group bias is limited because of the openness of the practice to anyone who can attain competence in it.

Verifiability

A characteristic/hallmark of science


Science differs from non-scientific areas of human activity because in science we actually check whether our claims are true


But giving a precise definition of what is verifiable is at least difficult and likely impossible

Falsifiability

A characteristic of science


Karl Popper argued that the defining feature of science is falsifiability. Meaning: science differs from pseudoscience in making clear predictions. If the prediction does not come out true, we reject the theory. Unfortunately, even good science does not conform to Popper's strict views.

The scientific method

A characteristic of science


Unfortunately there is not a single unifying strand to the vast group of subjects that we call science that we can identify as the scientific method.

Starting only with data?

A characteristic of science


The idea that you start with data and see what theory fits it best is at best a guideline rather than a definition of science

How then to define science?

A set of discipline specific methods that bear a broad family resemblance plus an appropriate sort of attitude


"Cargo cult" science: the ide that mimicking some of the appearances of scientific practice one would thereby be doing science

Science vs cargo cult science

A kind of scientific integrity/utter honesty that involves bending over backwards.


Ex: while experimenting, one should report everything you think might make it invalid, not only what you think is right about it: other causes that could possibly explain your results, and things you thought of that you've eliminated by some other experiment, and how they worked, so that others can tell they have been eliminated

Imperviousness to countervailing evidence; especially a refusal to specify in advance what data would count as probability-lowering

Hallmark of pseudoscience


-For dubious medical practices (e.g. the tendency to blame the patient's attitude), often this approach is taken by the patient!


-also setting conveniently vague success conditions (wellness)

"Folk plausibility"

-Intuitive ideas: like cures like, or something that causes a symptom in large amounts will cure it in small amounts (e.g. homeopathy is an extreme example)


-Appeals to resentment of scientists: the pleasure of imagining oneself to know some truth that those unimaginative or dogmatic scientists cannot recognize


The spread of crackpottery: recall Elizabeth Nickson's article in the National Post arguing against evolutionary theory

Hallmark of pseudoscience


Q: if modern evolutionary theory is so obviously flawed that every non-specialist can realize it, then why do the specialists overwhelmingly not realize it?


Nickson: scientists resist seeing the problems with evolutionary theory "because they realize that a moral revolution necessarily followed from [this rejection of evolution theory.] If there is an intelligent designer or force and the immortal soul exists; then heaven and hell may exist [and] each human possesses an immortal soul, which might be held to account. This is frightening to materialists."

Other characteristics of pseudoscience

-Positing a conspiracy theory about mainstream science


-Sometimes there are systemic problems with some domain of science: e.g. corporate-sponsored pharmaceutical R&D


-But such arguments must be informed and made carefully, and can rarely implicate some sort of global cover-up by scientists


-Critical thinking about science: its practice, publications and popular representations