9 Cards in this Set

  • Front
  • Back
The multiple regression model relates a response y to a set of quantitative independent variables. To find least-squares estimates for b0, b1, . . . , and bk in a multiple regression model, we follow the same procedure that we did for a ____________ model in Chapter 11. We obtain a random sample of n observations and find the least-squares prediction equation
ŷ = b̂0 + b̂1x1 + . . . + b̂kxk
by choosing b̂0, b̂1, . . . , b̂k to minimize SS(Residual) = Σi (yi − ŷi)².
linear regression
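The minimization described on this card can be sketched in a few lines of NumPy. The data below are hypothetical, and `np.linalg.lstsq` stands in for the textbook's fitting procedure: it chooses the coefficients that minimize SS(Residual).

```python
import numpy as np

# Hypothetical data: n = 6 observations, k = 2 independent variables.
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0],
              [4.0, 3.0], [5.0, 6.0], [6.0, 5.0]])
y = np.array([3.4, 3.6, 7.4, 7.6, 11.4, 11.6])

# Prepend a column of ones so b0 (the intercept) is estimated too.
X1 = np.column_stack([np.ones(len(y)), X])

# lstsq picks b0_hat, ..., bk_hat to minimize SS(Residual).
b, *_ = np.linalg.lstsq(X1, y, rcond=None)

y_hat = X1 @ b
ss_residual = np.sum((y - y_hat) ** 2)
```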
Although it was easy to write down the solutions for b̂0 and b̂1 in the linear regression model,
y = b0 + b1x + e
we must find the estimates for b0, b1, . . . , bk by solving a set of simultaneous equations, called the _________.
normal equations
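As a sketch of what "solving a set of simultaneous equations" means computationally: stacking the data into a matrix X (first column of ones for the intercept), the normal equations take the form (XᵀX)b = Xᵀy. The data here are made up; `np.linalg.solve` handles the simultaneous solve, and the result matches the least-squares routine.

```python
import numpy as np

# Hypothetical data; the first column of ones corresponds to b0.
X = np.array([[1.0, 1.0, 2.0],
              [1.0, 2.0, 1.0],
              [1.0, 3.0, 4.0],
              [1.0, 4.0, 3.0],
              [1.0, 5.0, 6.0]])
y = np.array([3.0, 4.1, 7.0, 8.1, 11.0])

# Normal equations: (X'X) b = X'y, solved simultaneously for b0, ..., bk.
b = np.linalg.solve(X.T @ X, X.T @ y)

# The generic least-squares routine gives the same answer.
b_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
```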
The coefficient of an independent variable xj in a multiple regression equation does not, in general, equal the coefficient that would apply to that variable in a simple linear regression. In multiple regression, the _________ refers to the effect of changing that xj variable while other independent variables stay constant.
coefficient
In simple linear regression, all other potential independent variables are ignored. If other independent variables are ____________ with xj (and therefore don’t tend to stay constant while xj changes), simple linear regression with xj as the only independent variable captures not only the direct effect of changing xj but also the ______________ of the associated changes in other xs. In multiple regression, by holding the other xs __________, we eliminate that indirect effect.
correlated, indirect effect, constant
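The direct-versus-indirect distinction on this card can be demonstrated with a small simulation (not from the text; the coefficients 1.0, 2.0, and 0.8 below are arbitrary choices). The true partial slope of x1 is 1.0, but the simple-regression slope also picks up the indirect effect through the correlated x2.

```python
import numpy as np

# Simulated example: y depends on x1 and x2,
# and x2 is strongly correlated with x1.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.2 * rng.normal(size=n)       # correlated with x1
y = 1.0 * x1 + 2.0 * x2 + 0.1 * rng.normal(size=n)

# Multiple regression: the coefficient of x1 holds x2 constant (~1.0).
Xm = np.column_stack([np.ones(n), x1, x2])
bm, *_ = np.linalg.lstsq(Xm, y, rcond=None)

# Simple regression on x1 alone absorbs the indirect effect through
# x2 as well, so its slope is roughly 1.0 + 2.0 * 0.8 = 2.6.
Xs = np.column_stack([np.ones(n), x1])
bs, *_ = np.linalg.lstsq(Xs, y, rcond=None)
```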
In addition to estimating the intercept and partial slopes, it is important to estimate the ______________ (σε). The residuals are defined as before, as the difference between the observed value and the predicted value of y:
yi − ŷi = yi − (b̂0 + b̂1xi1 + b̂2xi2 + . . . + b̂kxik)
model standard deviation
The sum of squared residuals, SS(Residual), also called _________, is defined exactly as it sounds. Square the prediction errors and sum the squares:
SS(Residual) = Σi (yi − ŷi)²
= Σi [yi − (b̂0 + b̂1xi1 + b̂2xi2 + . . . + b̂kxik)]²
The df for this sum of squares is n − (k + 1). One df is subtracted for the _________ and 1 df is subtracted for each of the k _______________.
SS(Error), intercept, partial slopes
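Both definitions on this card can be checked numerically. The data below are randomly generated placeholders; only the bookkeeping (squaring, summing, and counting degrees of freedom) matters here.

```python
import numpy as np

# Placeholder data: n = 9 observations, k = 2 independent variables.
n, k = 9, 2
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = rng.normal(size=n)

b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b                      # yi - yi_hat
ss_residual = np.sum(resid ** 2)       # SS(Residual), a.k.a. SS(Error)

# df = n - (k + 1): 1 df for the intercept, 1 for each partial slope.
df = n - (k + 1)
```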
The mean square residual, MS(Residual), also called __________, is the residual sum of squares divided by n - (k + 1). Finally, the estimate of the model standard deviation sε is the square root of MS(Residual).
MS(Error)
Finally, the estimate of the model standard deviation sε is the square root of ____________.
The estimated model standard deviation sε is often referred to as the _________ standard deviation. It may also be called “std dev,” “standard error of estimate,” or “root MSE.”
MS(Residual), residual
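Putting the last few cards together: MS(Residual) divides SS(Residual) by its degrees of freedom, and sε is its square root. The SS(Residual) value below is a hypothetical number chosen only to illustrate the arithmetic.

```python
import math

# Hypothetical value: SS(Residual) from a fit with n = 9, k = 2.
ss_residual = 12.5
n, k = 9, 2
df = n - (k + 1)

ms_residual = ss_residual / df       # MS(Residual), a.k.a. MS(Error)
s_eps = math.sqrt(ms_residual)       # estimated model standard deviation
```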
Model standard deviation:
About 95% of the prediction errors will be within ± 2 standard deviations of the mean (and the mean error is automatically zero):
sε = √[MS(Residual)]
Memorize
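Both claims on this card can be checked by simulation (hypothetical data with normal errors; the coefficients and error sd below are arbitrary): when the model includes an intercept, the residuals average zero automatically, and roughly 95% of them fall within ±2sε.

```python
import numpy as np

# Simulated data: known coefficients plus normal errors (sd = 0.5).
rng = np.random.default_rng(2)
n, k = 500, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=0.5, size=n)

b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
s_eps = np.sqrt(np.sum(resid ** 2) / (n - (k + 1)))

mean_resid = resid.mean()                           # ~0 automatically
frac_within = np.mean(np.abs(resid) <= 2 * s_eps)   # ~0.95
```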