22 Cards in this Set

  • Front
  • Back
correlational research
the independent variable is not manipulated by the experimenter; instead, the research relies on finding natural variation in the independent or dependent variables. It is not defined by whether you work with correlation coefficients.
one reason for doing experimental design
to stop variables from correlating with each other
collecting data
questionnaires, archives and official statistics, and observations; psychological scales are often used
analyzing the data:
correlation coefficients
the Spearman rank correlation coefficient (ρ, Greek rho) and the Pearson coefficient (r)
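
As a quick illustration (not part of the cards), both coefficients can be computed with SciPy; the variable names and data below are made up purely for demonstration:

    # Illustrative sketch: Pearson's r and Spearman's rho via SciPy (hypothetical data).
    from scipy.stats import pearsonr, spearmanr

    hours_studied = [2, 4, 5, 7, 9, 10]        # hypothetical independent variable
    exam_score    = [52, 58, 61, 70, 75, 83]   # hypothetical dependent variable

    r, p_r = pearsonr(hours_studied, exam_score)        # linear association
    rho, p_rho = spearmanr(hours_studied, exam_score)   # rank-based association

    print(f"Pearson r = {r:.2f} (p = {p_r:.3f})")
    print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
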
key concept of correlation
variance accounted for (r²), sometimes written as a percentage (e.g. r = 0.50 gives r² = 0.25, i.e. 25% of the variance accounted for)
linear regression
the line that produces the smallest squared error
Y = A + BX
Y = dependent variable
X = independent variable
A = a constant (the intercept)
B = a constant (the slope of the line)
Y' = predicted value of Y
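
A minimal sketch of the least-squares fit, assuming NumPy is available (the X and Y values are hypothetical):

    # Illustrative sketch: fit Y' = A + B*X by least squares with NumPy.
    import numpy as np

    X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # independent variable (hypothetical)
    Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])   # dependent variable (hypothetical)

    B, A = np.polyfit(X, Y, deg=1)    # slope B and intercept A of the best-fitting line
    Y_pred = A + B * X                # predicted values Y'
    sse = np.sum((Y - Y_pred) ** 2)   # the squared error that the fitted line minimizes

    print(f"Y' = {A:.2f} + {B:.2f}X, sum of squared errors = {sse:.3f}")
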
standardized formula
linear regression
the variables have their means subtracted and are then divided by their standard deviations (written with lowercase y and x).
y = βx, where β = r
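
A small check of this claim under the same assumptions (NumPy, hypothetical data): z-scoring X and Y and refitting the line gives a slope equal to Pearson's r:

    # Illustrative sketch: after standardizing, the slope beta equals Pearson's r.
    import numpy as np

    X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    x = (X - X.mean()) / X.std()   # lowercase = standardized scores
    y = (Y - Y.mean()) / Y.std()

    beta = np.polyfit(x, y, deg=1)[0]   # slope of y = beta * x (intercept is ~0)
    r = np.corrcoef(X, Y)[0, 1]         # Pearson r of the raw scores

    print(f"beta = {beta:.4f}, r = {r:.4f}")   # the two values agree
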
multiple regression
better prediction using more than one independent variable (Xi)
Y' = A + B1X1 + B2X2 + ... + BnXn
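
A minimal sketch of solving for A and the B weights with NumPy's least-squares solver, using two hypothetical predictors:

    # Illustrative sketch: multiple regression Y' = A + B1*X1 + B2*X2 (made-up data).
    import numpy as np

    X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    X2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
    Y  = np.array([3.1, 3.9, 7.0, 7.8, 11.2, 11.9])

    # Design matrix: a column of 1s for the constant A, then one column per predictor.
    design = np.column_stack([np.ones_like(X1), X1, X2])
    A, B1, B2 = np.linalg.lstsq(design, Y, rcond=None)[0]

    print(f"Y' = {A:.2f} + {B1:.2f}*X1 + {B2:.2f}*X2")
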
multiple regression
Standardised
y' = β1x1 + β2x2 + ... + βnxn
main reasons to use multiple regression
to separate out the effects of the different independent variables, so you can tell which are causally important; to try to improve internal validity (address the third-variable problem) statistically; and because it can be used in field settings where you have less experimental control.
beta weights
measures of what the correlations would be if all the other variables in the equation were held constant
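
To illustrate the standardised formula and beta weights together (a sketch with the same hypothetical data, assuming NumPy): z-score every variable, then regress, and the resulting slopes are the beta weights:

    # Illustrative sketch: standardized beta weights via z-scoring then least squares.
    import numpy as np

    def zscore(a):
        return (a - a.mean()) / a.std()

    X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    X2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
    Y  = np.array([3.1, 3.9, 7.0, 7.8, 11.2, 11.9])

    x1, x2, y = zscore(X1), zscore(X2), zscore(Y)

    # No constant term is needed: standardized variables all have mean 0.
    design = np.column_stack([x1, x2])
    beta1, beta2 = np.linalg.lstsq(design, y, rcond=None)[0]

    print(f"beta1 = {beta1:.2f}, beta2 = {beta2:.2f}")   # each predictor's unique weight
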
assumptions of multiple regression
1) independence
2) linearity
3) normality
4) nature of independent variables
5) identify key variables
assumptions of multiple regression
1) independence
observations/cases are independent of each other
assumptions of multiple regression
2) linearity
the relationship between the dependent variable and each of the independent variables should actually be linear rather than peaked (curvilinear)
assumptions of multiple regression
3) normality
the dependent variable should be normally distributed
assumptions of multiple regression
4) nature of the independent variables
multiple regression is used when the independent variables are continuous or a mixture of categorical and continuous variables (categorical variables must be coded, e.g. 1/0, and how you code them makes a difference)
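
A sketch of 1/0 (dummy) coding so a categorical predictor can enter the regression alongside a continuous one; the group labels and numbers are hypothetical, and NumPy is assumed:

    # Illustrative sketch: dummy-code a categorical predictor as 1/0.
    import numpy as np

    group = np.array(["control", "treatment", "treatment", "control", "treatment"])
    age   = np.array([21.0, 25.0, 30.0, 35.0, 40.0])
    score = np.array([10.0, 14.0, 15.0, 12.0, 18.0])

    treat = (group == "treatment").astype(float)   # 1 = treatment, 0 = control (reference)

    design = np.column_stack([np.ones_like(age), age, treat])
    A, B_age, B_treat = np.linalg.lstsq(design, score, rcond=None)[0]

    # B_treat is the predicted group difference at a fixed age; picking a different
    # reference category changes how B_treat is read, i.e. how you code makes a difference.
    print(f"score' = {A:.2f} + {B_age:.2f}*age + {B_treat:.2f}*treatment")
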
assumptions of multiple regression
5) identify key variables
in general, beta weights will change every time you add new independent variables or remove old ones; if you leave some out, you will not be measuring the true real-world beta weights
causal modelling
correlation does not equal causation; however, for some correlations the causal direction can only run one way (because of time order)
guidelines for causal modelling
1. time: a cause must come before its effect
2. A doesn't have to be measured before B
3. the direction runs causing → caused, not caused → causing
multiple regression and causal modelling
helps us distinguish between causal scenarios, via significant beta weights
path analysis
the same variable can be both a dependent and an independent variable
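
A sketch of that idea as two regressions in a path X → M → Y, where M is the dependent variable in the first step and an independent variable in the second (simulated hypothetical data; NumPy assumed):

    # Illustrative sketch: a two-step path model X -> M -> Y fitted as two regressions.
    import numpy as np

    rng = np.random.default_rng(0)                  # hypothetical simulated data
    X = rng.normal(size=200)
    M = 0.6 * X + rng.normal(scale=0.5, size=200)   # M depends on X
    Y = 0.7 * M + rng.normal(scale=0.5, size=200)   # Y depends on M

    def slope(pred, dep):
        design = np.column_stack([np.ones_like(pred), pred])
        return np.linalg.lstsq(design, dep, rcond=None)[0][1]

    path_x_to_m = slope(X, M)   # step 1: M is the dependent variable
    path_m_to_y = slope(M, Y)   # step 2: M is now an independent variable

    print(f"X -> M path = {path_x_to_m:.2f}, M -> Y path = {path_m_to_y:.2f}")
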
moderation
e.g. sex → pregnancy, with contraceptive use acting on that link (moderating whether, or how strongly, sex leads to pregnancy)
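
Moderation is commonly modelled as an interaction term in a regression; a minimal sketch under that assumption (NumPy, simulated hypothetical data), where Z plays the moderator role (e.g. contraceptive use) and changes the strength of the X → Y link:

    # Illustrative sketch: moderation as an interaction term.
    # Y' = A + B1*X + B2*Z + B3*(X*Z); a non-zero B3 means Z moderates the X -> Y effect.
    import numpy as np

    rng = np.random.default_rng(1)                            # hypothetical simulated data
    X = rng.normal(size=300)                                  # e.g. exposure (sex)
    Z = rng.integers(0, 2, size=300).astype(float)            # e.g. contraception used (1) or not (0)
    Y = 0.8 * X * (1 - Z) + rng.normal(scale=0.3, size=300)   # X matters only when Z == 0

    design = np.column_stack([np.ones_like(X), X, Z, X * Z])
    A, B1, B2, B3 = np.linalg.lstsq(design, Y, rcond=None)[0]

    print(f"interaction weight B3 = {B3:.2f}")   # strongly negative: Z weakens the X -> Y link
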