6 Cards in this Set

  • Front
  • Back
Imagine that we want to draw inferences about the value of multiple R^(2) in the population. Why and when is it necessary to adjust the R^(2) value that we calculate from our sample?
Why is it necessary to adjust R^(2)? R^(2) is a biased estimate of the population squared multiple correlation, rho^(2) (the Greek letter rho is often typed as p, so you may see p^(2)): it tends to overestimate rho^(2). In contrast, adjusted R^(2) is an approximately unbiased estimate of the population squared multiple correlation.

When is it necessary to adjust R^(2)? Adjust R^(2) when you want a good point estimate of rho^(2) in the population. If you are simply hypothesis testing, you can use regular R^(2).
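The shrinkage itself is easy to compute. A minimal sketch using the common Wherry-style adjustment, 1 - (1 - R^2)(n - 1)/(n - p - 1); the R^2, n, and p values below are hypothetical, chosen only to show how much the estimate shrinks with a small sample:

```python
def adjusted_r2(r2, n, p):
    """Shrink sample R^2 toward an approximately unbiased estimate of rho^2.

    r2: sample R^2, n: sample size, p: number of predictors.
    """
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Hypothetical values: R^2 = 0.57 from n = 30 cases and p = 3 predictors.
print(round(adjusted_r2(0.57, 30, 3), 2))  # → 0.52 (down from 0.57)
```

With large n relative to p, the adjustment barely matters; with small samples and many predictors, the shrinkage can be dramatic.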
A multi-national study used GNP (gross national product) per capita, energy use per capita, and TVs per capita to predict infant mortality. None of the 3 predictors accounted for a significant proportion of unique variance in infant mortality (the b weights were not significant), although the R^(2) was 0.57 (p < .001). Explain why these results may have occurred. Include a Venn diagram to illustrate the variance accounted for by the predictors.
High multicollinearity (highly correlated predictors). GNP, energy use, and TVs per capita overlap heavily with one another, so together they account for a large share of the variance in infant mortality (R^(2) = 0.57), but almost none of that variance is unique to any single predictor, so no b weight reaches significance. In the Venn diagram, the three predictor circles overlap almost completely, and the region they share with infant mortality is mostly shared by all three.
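This pattern is easy to reproduce by simulation. A sketch with made-up data (the variable names mirror the study, but the numbers are invented): two predictors are built as near-copies of the first, so the joint fit is good while every individual t statistic is weak.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
gnp = rng.normal(size=n)
# Hypothetical: energy use and TVs per capita track GNP almost perfectly.
energy = gnp + rng.normal(scale=0.1, size=n)
tvs = gnp + rng.normal(scale=0.1, size=n)
mortality = -gnp + rng.normal(scale=1.0, size=n)

X = np.column_stack([np.ones(n), gnp, energy, tvs])
b, *_ = np.linalg.lstsq(X, mortality, rcond=None)
resid = mortality - X @ b
s2 = resid @ resid / (n - X.shape[1])              # residual variance
se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))  # OLS standard errors
t_stats = b / se
r2 = 1 - resid @ resid / np.sum((mortality - mortality.mean()) ** 2)
```

Here r2 comes out sizable, while the standard errors of the three slopes are inflated by the near-duplication, keeping each individual t statistic unimpressive.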
What are some indicators that you might have high multicollinearity among your predictors?
1. Large changes in regression weights when variables are added to or removed from the model.
2. Unexpected nonsignificant regression weights.
3. Valence (+/-) of regression weights doesn't make theoretical sense.
4. Large correlations between predictors (0.70 or higher).
5. Wide confidence intervals for important predictors.
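A common numeric diagnostic alongside these signs is the variance inflation factor, VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing predictor j on the remaining predictors. A small sketch (not from the notes; a standard formula):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of predictor matrix X."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(X)), others])
        coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        resid = X[:, j] - A @ coef
        tss = np.sum((X[:, j] - X[:, j].mean()) ** 2)
        r2_j = 1 - resid @ resid / tss
        out.append(1 / (1 - r2_j))
    return out
```

Uncorrelated predictors give VIF near 1; values around 10 or more are a frequently cited red flag for severe multicollinearity.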
A researcher is trying to predict how fast speed skaters go in the 400 meter race using arm length, years of experience, and age as predictors.

Based on MLR, the researcher found that the squared partial correlation coefficient for age was 0.18. Interpret this value in relation to the variables.
Age accounts for 18% of unique variance

After removing variance due to arm length and years of experience, 18% of the remaining variance in how fast speed skaters go can be explained by age.
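That "remaining variance" reading corresponds to the increment formula: the squared partial correlation is (R^2_full - R^2_reduced) / (1 - R^2_reduced). A sketch; the R^2 inputs below are hypothetical, picked only so the ratio lands near 0.18:

```python
def squared_partial_corr(r2_full, r2_reduced):
    """Squared partial correlation for the predictor added to the model.

    r2_full: R^2 with all predictors (arm length, experience, age);
    r2_reduced: R^2 with the predictor of interest (age) removed.
    The ratio is the share of the *remaining* variance that age explains.
    """
    return (r2_full - r2_reduced) / (1 - r2_reduced)

# Hypothetical: full-model R^2 = 0.47, R^2 without age = 0.3537.
print(round(squared_partial_corr(0.47, 0.3537), 2))  # → 0.18
```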
A researcher is trying to predict how fast speed skaters go in the 400 meter race using arm length, years of experience, and age as predictors.

The multiple R^(2) is 0.47. Interpret this value in relation to the variables that you're studying.
47% of the variance in how fast speed skaters go in a 400 meter race can be accounted for by the optimal linear combination of arm length, years of experience, and age.
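"Optimal linear combination" is exactly what an OLS fit produces: R^2 equals the squared correlation between the observed outcome and the fitted values. A sketch with simulated data (the variable names follow the card; all numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Hypothetical predictors and outcome for the speed-skating example.
arm, exper, age = rng.normal(size=(3, n))
speed = 0.5 * arm + 0.4 * exper - 0.3 * age + rng.normal(size=n)

X = np.column_stack([np.ones(n), arm, exper, age])
b, *_ = np.linalg.lstsq(X, speed, rcond=None)
yhat = X @ b  # the optimal linear combination of the predictors

# R^2 two equivalent ways: squared correlation with the fitted values,
# and 1 - SSE/SST.
r2 = np.corrcoef(speed, yhat)[0, 1] ** 2
r2_alt = 1 - np.sum((speed - yhat) ** 2) / np.sum((speed - speed.mean()) ** 2)
```

The two computations agree for OLS with an intercept, which is why "variance accounted for by the optimal linear combination" and "squared multiple correlation" describe the same quantity.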
NEED TO ADD THE REST OF THE INFO FROM NOTES FOR FEB 18-22