38 Cards in this Set
3 things that make up a p-value
|
Sample size, effect size, and distribution.
|
|
P-value
|
A statistical conclusion: the probability of obtaining results at least as extreme as yours if chance alone (the null hypothesis) were operating. It does NOT tell you the size of an effect or the probability that an effect exists.
|
|
Three essential parts of the scientific method for psychological statistics
|
Design, measurement, and analysis.
|
|
Of design, measurement and analysis, which is the most important?
|
Design is the most important. A good analysis can't make up for a flawed design.
|
|
Descriptive statistics
|
Statistics that give you the basic shape of your data. Mean, median, mode, standard deviation, range.
|
|
Importance of descriptive statistics
|
Give you the basic 'shape' or 'feel' of the data. This is important because you need a basic understanding of what your data is like before you can know how to analyze it. But descriptive statistics lack context.
|
|
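As a quick sketch (the scores here are made up for illustration), Python's standard-library `statistics` module covers these basic descriptives:

```python
import statistics

# Hypothetical sample of quiz scores (made-up data)
scores = [4, 7, 7, 8, 9, 10, 12]

mean = statistics.mean(scores)      # arithmetic average
median = statistics.median(scores)  # middle value when sorted
mode = statistics.mode(scores)      # most frequent value
stdev = statistics.stdev(scores)    # sample standard deviation (n - 1)
rng = max(scores) - min(scores)     # range
```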
Covariance
|
A measure of how much two variables change together.
|
|
Range of covariance
|
Negative infinity to positive infinity.
|
|
What can covariance tell you?
|
Before it is standardized, covariance is only able to tell you directionality (whether the relationship is positive or negative).
|
|
Correlation
|
"r". Shows how two variables are related, but does not show the reason for or the nature of that relationship. r^2 is variance explained.
|
|
Range of correlation
|
-1 to +1
|
|
Partial correlation
|
The relationship between two variables, when accounting for another. Limited because it can only look at the relationship between 2 variables.
|
|
Regression
|
A type of statistical analysis built on correlations. Models one criterion (outcome) from a set of predictors, giving a weight for each predictor.
|
|
Limits of regression
|
Regressions require continuous variables, and can't look at group differences.
|
|
Point estimate
|
Every analysis is built on a point estimate: the ratio of good variance (effect) over bad variance (error).
|
|
Relationship between types of variance and p-value
|
Your effect size is good variance, and your distribution is bad variance.
|
|
Relationship between regression, residual and p-value in a regression analysis
|
Regression = effect size
Residual = distribution |
|
How does your sample size (N) affect your F-statistic?
|
N does not affect the F-statistic. It does, however, affect your critical F-value.
|
|
Benefits of regression
|
Can compare more variables than a correlation; can retain unstandardized units; can be used to predict scores.
|
|
Polynomial regression
|
The shape of your data is not always linear; the equation can include an nth-degree polynomial term to create a better fit.
|
|
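A sketch with NumPy (on made-up, exactly quadratic data): `np.polyfit` fits an nth-degree polynomial by least squares, returning coefficients from highest power down.

```python
import numpy as np

# Made-up data following y = 2x^2 + 3x + 1 exactly
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
y = 2 * x**2 + 3 * x + 1

# Degree-2 (quadratic) fit; coefficients come back highest power first
coeffs = np.polyfit(x, y, deg=2)
```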
Continuous-by-continuous interactions
|
When the relationship between X and Y depends on a third variable, Z.
|
|
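One standard way to model such an interaction (a sketch on made-up, noise-free data) is to add the product X*Z as its own predictor in the regression:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
z = rng.normal(size=50)

# Made-up generating model with an x-by-z interaction
y = 1.0 + 2.0 * x + 0.5 * z + 3.0 * x * z

# Design matrix: intercept, x, z, and the product term x*z
X = np.column_stack([np.ones_like(x), x, z, x * z])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
# b recovers the four generating coefficients
```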
Mediator
|
A variable that accounts for (explains) the relationship between X and Y entirely; the effect of X on Y operates through it.
|
|
Moderator
|
A variable that changes the strength or direction of the relationship between X and Y.
|
|
Regression F-value
|
(Mean square regression)/(mean square residual) = F
|
|
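The (mean square regression)/(mean square residual) ratio can be sketched for a simple one-predictor regression on made-up data:

```python
# Made-up data with a strong linear trend plus a little noise
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 3.9, 6.2, 7.8, 10.1, 12.0, 13.8, 16.2]

n = len(x)
mx, my = sum(x) / n, sum(y) / n

# Least-squares slope and intercept
b1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
b0 = my - b1 * mx
yhat = [b0 + b1 * xi for xi in x]

# Sums of squares, degrees of freedom, mean squares, F
ss_reg = sum((yh - my) ** 2 for yh in yhat)
ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
df_reg, df_res = 1, n - 2
f_stat = (ss_reg / df_reg) / (ss_res / df_res)
```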
Validity vector
|
Contained within the correlation matrix, this is the column beneath your 'Y'. Gives the relation between each individual predictor and your dependent variable.
|
|
Correlation matrix
|
The symmetric matrix of all pairwise correlations among your variables; the diagonal is all 1s.
|
|
Model summary (regression)
|
Shows our multiple correlation and r^2 mult.
|
|
r^2 mult
|
The variance in our dependent variable explained by the set of all our predictors.
|
|
Coefficients table
|
Shows the overall coefficients needed for the regression equation, in both standardized and unstandardized form.
|
|
Shape of the line in polynomial regression
|
The slope is not constant; it changes across the values of X.
|
|
Centering
|
When dealing with higher-order terms and interactions, the values of your variable must be centered by subtracting the mean from each score (X - Xbar). This reduces non-essential multicollinearity.
|
|
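A small sketch (made-up predictor values) of why centering helps: raw X covaries strongly with its square, while centered X does not (exactly zero here because the data are symmetric).

```python
# Made-up predictor; centering means x - mean(x)
x = [1, 2, 3, 4, 5, 6, 7, 8, 9]
mx = sum(x) / len(x)
xc = [xi - mx for xi in x]  # centered scores

def cov(a, b):
    """Sample covariance of two equal-length lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / (n - 1)

# Non-essential multicollinearity: raw x vs x^2 is strongly related,
# centered x vs its square is not
cov_raw = cov(x, [xi**2 for xi in x])
cov_centered = cov(xc, [xi**2 for xi in xc])
```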
b0
|
The intercept: our predicted value of Y at the mean of X (when X is centered), or at X = 0 when uncentered.
|
|
Type I error
|
The probability of saying something is meaningful when it isn't (a false positive: rejecting a true null hypothesis).
|
|
Type II error
|
The probability of saying something isn't meaningful when it is (a false negative: failing to reject a false null hypothesis).
|
|
Cronbach alpha
|
A statistical measure of the internal consistency or reliability of a test.
|
|
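A sketch of the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), on made-up item responses:

```python
def sample_var(v):
    """Sample variance (n - 1 denominator)."""
    n = len(v)
    m = sum(v) / n
    return sum((x - m) ** 2 for x in v) / (n - 1)

# Made-up responses: rows are respondents, columns are 3 test items
items = [
    [3, 4, 3],
    [5, 5, 4],
    [1, 2, 2],
    [4, 4, 5],
    [2, 3, 3],
]
k = len(items[0])
item_vars = [sample_var([row[i] for row in items]) for i in range(k)]
totals = [sum(row) for row in items]

alpha = (k / (k - 1)) * (1 - sum(item_vars) / sample_var(totals))
```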
Covariance equation
|
Sxy = Σ[(x - xbar)(y - ybar)] / (n - 1)
|
|
r
|
Pearson correlation: r = Sxy / (Sx * Sy), the covariance divided by the product of the two standard deviations.
|
|
r^2
|
The proportion of variance the two variables share; the percentage of variance in one variable accounted for by the other.
|
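The covariance, r, and r^2 formulas above can be sketched end-to-end on made-up paired scores:

```python
import math

# Made-up paired scores
x = [2, 4, 6, 8, 10]
y = [1, 4, 5, 9, 11]

n = len(x)
mx, my = sum(x) / n, sum(y) / n

# Covariance: sum of cross-products of deviations over n - 1
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)

# Standard deviations
sx = math.sqrt(sum((xi - mx) ** 2 for xi in x) / (n - 1))
sy = math.sqrt(sum((yi - my) ** 2 for yi in y) / (n - 1))

# Pearson r standardizes the covariance; r^2 is variance explained
r = sxy / (sx * sy)
r_squared = r ** 2
```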