25 Cards in this Set
- Front
- Back
Measure of association
|
A general term that refers to a number of bivariate statistical techniques used to measure the strength of a relationship between two variables
|
|
correlation coefficient
|
A statistical measure of the covariation, or association, between two variables measured on at least an interval scale
|
|
covariance
|
The extent to which two variables are associated systematically with each other
|
|
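As a rough illustration (all numbers below are made up for the sketch), covariance and the Pearson correlation coefficient can be computed with NumPy:

import numpy as np

# Hypothetical example data: hours worked and the unemployment rate
x = np.array([35.0, 37.0, 38.5, 40.0, 41.5])
y = np.array([7.2, 6.8, 6.1, 5.5, 5.0])

cov_xy = np.cov(x, y)[0, 1]    # sample covariance: how x and y vary together
r = np.corrcoef(x, y)[0, 1]    # Pearson correlation coefficient, between -1 and 1
print(cov_xy, r)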
Negative (inverse) relationship
|
Covariation in which the association between variables is in the opposite direction. As one goes up, the other goes down
|
|
Does covariation in and of itself establish causality?
|
No. Think of the example of the covariation between ice cream sales and drownings, or between a rooster's crow and the rising sun
|
|
Coefficient of Determination (R^2)
|
A measure obtained by squaring the correlation coefficient; the proportion of the total variance of one variable accounted for by knowing the value of another variable.
Measures the part of the variance of Y that is accounted for by knowing the value of X |
|
In the example about unemployment and hours worked, r = -.635; therefore r^2 = 0.403
How much of the variance in unemployment can be explained by the variance in hours worked? |
About 40% of the variance in unemployment can be explained by the variance in hours worked.
|
|
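A one-line check of that arithmetic (the sample size behind r is not given on the card, so only the squaring is shown):

r = -0.635
r_squared = r ** 2
print(r_squared)   # 0.403..., i.e. about 40% of the variance explained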
Correlation Matrix
|
The standard form for reporting observed correlations among multiple variables. Although any number of variables can be displayed in a correlation matrix, each entry represents the bivariate relationship between a pair of variables.
|
|
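A minimal sketch of producing such a matrix, assuming a pandas DataFrame with made-up column names and values:

import pandas as pd

# Hypothetical data set with three variables
df = pd.DataFrame({
    "hours_worked": [35.0, 37.0, 38.5, 40.0, 41.5],
    "unemployment": [7.2, 6.8, 6.1, 5.5, 5.0],
    "wages": [18.0, 18.5, 19.2, 20.1, 21.0],
})

print(df.corr())   # each entry is the bivariate correlation between a pair of variables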
What is the procedure for determining statistical significance?
|
The procedure for determining statistical significance is the t-test of the significance of a correlation coefficient
|
|
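The usual form of that test uses t = r * sqrt(n - 2) / sqrt(1 - r^2) with n - 2 degrees of freedom. A sketch (the sample size of 30 is an assumption for illustration):

import math
from scipy import stats

def t_for_r(r, n):
    # t statistic for testing H0: the population correlation is zero
    t = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)
    p = 2 * stats.t.sf(abs(t), df=n - 2)   # two-tailed p-value
    return t, p

print(t_for_r(-0.635, 30))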
Simple (Bivariate) Linear Regression
|
A measure of linear association that investigates straight-line relationships between a continuous dependent variable and an independent variable that is usually continuous but can be a categorical dummy variable
|
|
In simple linear regression, what do the following symbols stand for:
Y = (alpha) + (beta)X |
(alpha) represents the Y intercept, or where the line crosses the y-axis.
(beta) is the slope coefficient. The slope is the change in Y associated with a change of one unit in X; it may also be thought of as rise over run. |
|
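A minimal sketch of where (alpha) and (beta) come from in practice, using scipy.stats.linregress on hypothetical data:

from scipy import stats

x = [35.0, 37.0, 38.5, 40.0, 41.5]   # hypothetical values of the independent variable
y = [7.2, 6.8, 6.1, 5.5, 5.0]        # hypothetical values of the dependent variable

result = stats.linregress(x, y)
print(result.intercept)   # alpha: where the fitted line crosses the y-axis
print(result.slope)       # beta: the change in Y for a one-unit change in X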
True/False: Beta provides the strength and direction of the relationship between the independent and dependent variables
|
True
|
|
True/False: (alpha), the Y intercept, is a fixed point that is considered a constant
|
True
|
|
Standardized regression coefficient
|
Estimated coefficient of the strength of the relationship between the independent and dependent variables.
Expressed on a standardized scale, where higher absolute values (between -1 and 1) indicate stronger relationships |
|
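One common way to obtain the standardized coefficient is to z-score both variables before fitting; in the bivariate case the standardized slope equals the Pearson r. A sketch with made-up data:

import numpy as np
from scipy import stats

x = np.array([35.0, 37.0, 38.5, 40.0, 41.5])
y = np.array([7.2, 6.8, 6.1, 5.5, 5.0])

zx = (x - x.mean()) / x.std(ddof=1)   # standardize X
zy = (y - y.mean()) / y.std(ddof=1)   # standardize Y

beta_std = stats.linregress(zx, zy).slope   # standardized slope, between -1 and 1
print(beta_std)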
Raw Regression Estimates (b1)
|
Raw regression weights have the advantage of retaining the scale metric (which is also their key disadvantage). Used if the purpose of the regression analysis is forecasting.
|
|
Standard Regression estimates ((Beta)1)
|
Have the advantage of a constant scale. Should be used when the research is testing an explanatory hypothesis
|
|
Ordinary Least Squares
|
Generates a straight line that minimizes the sum of squared deviations of the actual values from this predicted regression line.
|
|
The ordinary least squares equation means that the predicted value is estimated how?
|
The equation means that the predicted value for any value of X is determined as a function of the estimated slope coefficient times X, plus the estimated intercept coefficient, plus some error
|
|
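For the bivariate case the OLS estimates have closed forms: the slope is the covariance of X and Y divided by the variance of X, and the intercept is the mean of Y minus the slope times the mean of X. A sketch with made-up data:

import numpy as np

x = np.array([35.0, 37.0, 38.5, 40.0, 41.5])
y = np.array([7.2, 6.8, 6.1, 5.5, 5.0])

b = np.cov(x, y)[0, 1] / np.var(x, ddof=1)   # estimated slope coefficient
a = y.mean() - b * x.mean()                  # estimated intercept coefficient

y_hat = a + b * x                    # predicted values on the regression line
residuals = y - y_hat                # the "error" part of the equation
print(a, b, (residuals ** 2).sum())  # OLS minimizes this sum of squared deviations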
Where does the explanatory power of regression lie?
|
The explanatory power of regression lies in hypothesis testing
|
|
What two conditions must be satisfied for the outcome of the hypothesis test?
|
The regression weight must be in the hypothesized direction (positive relationships require a positive coefficient and negative relationships require a negative one).
The t-test associated with the regression weight must be significant |
|
Multiple Regression Analysis
|
An analysis of association in which the effects of two or more independent variables on a single, interval-scaled dependent variable are investigated simultaneously
|
|
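A minimal multiple-regression sketch using NumPy's least-squares solver with two hypothetical independent variables (names and values are invented):

import numpy as np

x1 = np.array([35.0, 37.0, 38.5, 40.0, 41.5, 42.0])   # first independent variable
x2 = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])         # second independent variable
y = np.array([7.2, 6.8, 6.1, 5.5, 5.0, 4.8])          # interval-scaled dependent variable

X = np.column_stack([np.ones_like(x1), x1, x2])   # intercept column plus both predictors
coef, *rest = np.linalg.lstsq(X, y, rcond=None)   # simultaneous estimates of all effects
print(coef)   # [intercept, effect of x1, effect of x2]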
Dummy Variable
|
The way a dichotomous (two-group) independent variable is represented in regression analysis, by assigning 0 to one group and 1 to the other
|
|
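A tiny sketch of dummy coding (the group labels are hypothetical):

# Hypothetical two-group variable
groups = ["control", "treatment", "treatment", "control", "treatment"]
dummy = [1 if g == "treatment" else 0 for g in groups]
print(dummy)   # [0, 1, 1, 0, 1] -- usable as an independent variable in regression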
Partial correlation
|
The correlation between two variables after taking into account the fact that they are correlated with other variables too
|
|
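For a single control variable Z, the first-order partial correlation can be computed from the three bivariate correlations. A sketch (the input correlations are made up):

import math

def partial_corr(r_xy, r_xz, r_yz):
    # correlation of X and Y after controlling for Z
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz ** 2) * (1 - r_yz ** 2))

print(partial_corr(0.60, 0.40, 0.50))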
R^2 in multiple Regression
|
The coefficient of multiple determination in multiple regression indicates the percentage of variation in Y explained by all independent variables
|
|
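R^2 can be computed from the fitted values as 1 minus the residual sum of squares over the total sum of squares. A sketch assuming y and y_hat are NumPy arrays:

def r_squared(y, y_hat):
    # proportion of the variation in y explained by the fitted values y_hat
    ss_total = ((y - y.mean()) ** 2).sum()
    ss_residual = ((y - y_hat) ** 2).sum()
    return 1 - ss_residual / ss_total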
F-Test
|
Tests statistical significance by comparing the variation explained by the regression equation to the residual error variation.
Allows for testing of the relative magnitudes of the sum of squares due to the regression and the error sum of squares |
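In the usual layout, F is the mean square due to regression divided by the mean square error, with k predictors and n observations. A sketch (the sums of squares and sample size below are invented):

def f_statistic(ss_regression, ss_error, n, k):
    # F = (SSR / k) / (SSE / (n - k - 1))
    return (ss_regression / k) / (ss_error / (n - k - 1))

print(f_statistic(ss_regression=40.0, ss_error=10.0, n=30, k=2))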