43 Cards in this Set

  • Front
  • Back

Adjusted Predicted Value

A measure of the influence of a given case of data on the model: the predicted value for that case from a model estimated without it.

Adjusted R²

Tells us how much variance in the outcome would be accounted for if the model had been derived from the population from which the sample was taken.
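
A minimal sketch, assuming the standard adjusted R² formula for n cases and k predictors; the function name and the numbers plugged in are purely illustrative:

```python
def adjusted_r_squared(r_squared, n, k):
    """Standard adjusted R-squared for n cases and k predictors."""
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

print(adjusted_r_squared(0.50, n=100, k=3))   # roughly 0.484
```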

Autocorrelation

When the residuals of two observations in a regression model are correlated; that is, when the assumption of independent errors is violated.

b(i)

The unstandardised regression coefficient, which indicates the strength of the relationship between a given predictor and the outcome: the change in the outcome associated with a unit change in that predictor.

β(i)

The standardised regression coefficient, which indicates the same relationship but in standard deviation units: the change in the outcome, in standard deviations, associated with a one standard deviation change in the predictor.

Cook's Distance

A measure of the overall influence of a case on a model. Values greater than 1 may be cause for concern (i.e. indicate an outlier).
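
As a hedged illustration, Cook's distances can be obtained from a fitted statsmodels OLS model; the simulated data and variable names below are made up for the example:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = sm.add_constant(rng.normal(size=(50, 2)))        # intercept + two predictors
y = X @ np.array([2.0, 0.5, -0.3]) + rng.normal(size=50)

fit = sm.OLS(y, X).fit()
cooks_d = fit.get_influence().cooks_distance[0]      # one distance per case
print(np.where(cooks_d > 1)[0])                      # cases that may be cause for concern
```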

Covariance Ratio

A measure of whether a case influences the variance of the parameters in a regression model.

Cross-validation

Assessing the accuracy of a model across different samples.

Deleted residual

A measure of the influence of a particular case of data: the difference between the adjusted predicted value for a case and the value actually observed for that case.

Dummy Variables

A way of recoding a categorical variable with more than two levels into a series of dichotomous (0/1) variables.
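
A small illustration using pandas; the column name and category labels below are hypothetical:

```python
import pandas as pd

df = pd.DataFrame({"group": ["control", "drug_a", "drug_b", "drug_a"]})
# drop_first=True keeps k-1 dummy columns for a k-level variable;
# the dropped level ("control" here) acts as the baseline category.
print(pd.get_dummies(df["group"], prefix="group", drop_first=True))
```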

Durbin-Watson Test

Tests the assumption of independent errors in regression models by checking for correlation between adjacent residuals.
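
A brief sketch using statsmodels' durbin_watson function on simulated data (the data and seed are arbitrary); values near 2 suggest uncorrelated errors:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 2 * x + rng.normal(size=100)

fit = sm.OLS(y, sm.add_constant(x)).fit()
# Values near 2 suggest uncorrelated residuals; values toward 0 or 4
# suggest positive or negative autocorrelation respectively.
print(durbin_watson(fit.resid))
```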

F-ratio

Tests the overall fit of the model in regression & overall differences between group means.

Generalisation

The ability of a model to apply to situations beyond the data from which it was derived.

Goodness of fit

Index of how well the model fits the data.

Hat values

Another name for leverage values: they gauge the influence of the observed value of the outcome variable over the predicted values.

Heteroscedasticity

When the variance of the residuals differs across levels of the predictor variable(s); that is, the assumption of homoscedasticity is violated.

Hierarchical Regression

A regression in which predictors are entered into the model in an order based on prior literature or theory.
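
A rough sketch of the idea with statsmodels, assuming a predictor supported by prior work is entered first and the predictor of current interest is added in a second step; all data and variable names are invented:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
known = rng.normal(size=n)       # predictor already supported by prior literature
novel = rng.normal(size=n)       # predictor of current interest
y = 0.6 * known + 0.3 * novel + rng.normal(size=n)

step1 = sm.OLS(y, sm.add_constant(known)).fit()
step2 = sm.OLS(y, sm.add_constant(np.column_stack([known, novel]))).fit()
print(step2.rsquared - step1.rsquared)   # extra variance explained by the new predictor
```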

Independent errors

The assumption that, for any two observations, the residuals should be uncorrelated.

Leverage statistics

The influence of the observed value of the outcome variable for a case over the predicted values (also known as hat values).

Mahalanobis distances

Measure the influence of a case by examining its distance from the mean(s) of the predictor variable(s).

Mean squares

A measure of average variability: a sum of squares divided by its degrees of freedom.

Model sum of squares

The total amount of variability in the outcome for which the model can account.

Multicollinearity

When two or more predictor variables are very closely (linearly) related.

Multiple R

The multiple correlation coefficient: the correlation between the observed values of the outcome and the values of the outcome predicted by the model.

Multiple regression

A regression with multiple predictor variables.

Ordinary least squares

A method of regression in which the parameters are estimated by minimising the sum of squared residuals (the method of least squares).

Outcome variable

That which is being predicted.

Perfect collinearity

When one predictor is perfectly correlated with another predictor (or with a combination of other predictors).

Predicted value

The value of the outcome that the model predicts from given values of the predictors.


Predictor variable

That which is doing the predicting

Residual

Error. Difference between the value the model predicts and the value observed in the data.

Residual sum of squares

The sum of the squared residuals: the total amount of variability in the outcome that the model cannot explain.

Shrinkage

The loss of predictive power that occurs because a model is derived from a sample rather than from the full population.

simple regression

A regression with one predictor and one outcome.

standardised residuals

The residuals of a model expressed in standard deviation units.

stepwise regression

A regression in which predictors are entered into the model based on a statistical criterion rather than on theory.

studentised deleted residuals

A measure of the influence of a particular case of data: the deleted residual divided by its standard error, which puts cases on a comparable scale.

suppressor effects

When a predictor has a significant effect, but only when another variable is held constant; otherwise the other variable suppresses the relationship.

t-statistic

In regression, used to test whether a regression coefficient is significantly different from 0.

tolerance

A measure of multicollinearity: the reciprocal of the VIF. Values below 0.1 indicate a serious problem.

total sum of squares

A measure of the total variability within a set of observations: the sum of squared deviations from the mean of the outcome.
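
A small worked sketch with made-up numbers showing how the total, model, and residual sums of squares defined in this set relate for an ordinary least squares fit:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([4.0, 7.0, 6.0, 9.0, 10.0])

slope, intercept = np.polyfit(x, y, 1)       # ordinary least squares fit
y_hat = intercept + slope * x

ss_total = np.sum((y - y.mean()) ** 2)       # total sum of squares
ss_model = np.sum((y_hat - y.mean()) ** 2)   # model sum of squares
ss_residual = np.sum((y - y_hat) ** 2)       # residual sum of squares
print(ss_model + ss_residual, ss_total)      # the two match for a least squares fit
print(ss_model / ss_total)                   # R squared
```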

unstandardised residuals

The residuals of a model expressed in the original units of the outcome variable.

variance inflation factor (VIF)

A measure of multicollinearity. Values of 10 or above are a cause for concern.
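
A short sketch computing the VIF, and its reciprocal the tolerance, with statsmodels; the deliberately collinear data are simulated purely for illustration:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.1, size=100)    # deliberately almost collinear with x1
X = sm.add_constant(np.column_stack([x1, x2]))

for i in (1, 2):                             # skip the constant column
    vif = variance_inflation_factor(X, i)
    print(f"VIF = {vif:.1f}, tolerance = {1 / vif:.3f}")
```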