51 Cards in this Set
- Front
- Back
econometrics
|
quantitative measurement and analysis of actual economic and business phenomena
|
|
3 uses of econometrics
|
describe economic reality;
test hypotheses about economic theory;
forecast the future of economic activity |
|
hypothesis testing
|
the evaluation of alternative theories with quantitative evidence
|
|
dependent variable
|
a function of movements in a set of other variables
|
|
independent variable
|
aka explanatory variable; a variable hypothesized to cause movements in the dependent variable.
|
|
beta zero
|
constant/intercept term
|
|
beta 1
|
slope coefficient
|
|
stochastic error term
|
term added to a regression equation to represent all of the variation in Y that cannot be explained by the included Xs;
aka error term; it accounts for exogenous, unexplainable factors in a regression |
|
deterministic/stochastic component
|
deterministic = the expected value E(Y|X), which indicates the relationship between Y and X;
stochastic = the error term, the random variation in Y that the Xs cannot explain |
|
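In symbols, the standard decomposition of the true regression equation (consistent with the cards above):

$$Y_i = \underbrace{\beta_0 + \beta_1 X_i}_{\text{deterministic: } E(Y_i \mid X_i)} + \underbrace{\epsilon_i}_{\text{stochastic}}$$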
estimated regression equation
|
quantified version of the theoretical regression equation
|
|
estimated regression coefficients
|
beta hats
|
|
residual
|
difference between the observed Y and the estimated regression line
|
|
error term
|
difference between the observed Y and the true regression equation (expected value of Y)
|
|
error term differences
|
epsilon (ε) = error term of the true regression equation (unobservable); e or epsilon hat = residual of the estimated regression equation (computable)
|
|
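As equations (standard notation; ε is unobservable in practice, while e can be computed from the estimated equation):

$$\epsilon_i = Y_i - E(Y_i \mid X_i) \qquad\qquad e_i = \hat{\epsilon}_i = Y_i - \hat{Y}_i$$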
cross sectional data
|
observations that represent different individual economic entities (e.g., people, firms, or countries) at the same point in time
|
|
OLS
|
regression technique that calculates beta hats so as to minimize the sum of the squared residuals
|
|
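A minimal sketch of the bivariate OLS formulas; the data values here are purely illustrative:

```python
import numpy as np

# Illustrative data: Y assumed to depend roughly linearly on X.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# OLS beta hats that minimize the sum of squared residuals.
x_dev = X - X.mean()
beta1_hat = (x_dev * (Y - Y.mean())).sum() / (x_dev ** 2).sum()
beta0_hat = Y.mean() - beta1_hat * X.mean()

residuals = Y - (beta0_hat + beta1_hat * X)
print(beta0_hat, beta1_hat, (residuals ** 2).sum())
```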
partial regression coefficient
|
allows the researcher to distinguish the impact of one independent variable on the dependent variable from that of the other independent variables
|
|
multivariate regression coefficient
|
indicates change in the dependent variable associated with a one-unit increase in the independent variable, holding constant all other variables in the equation
|
|
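In equation form (standard multivariate notation):

$$Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + \cdots + \beta_K X_{Ki} + \epsilon_i$$

where each β_k gives the change in Y associated with a one-unit increase in X_k, holding the other Xs constant.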
total sum of squares
|
difference between observed Y value and the average Y value
|
|
explained sum of squares
|
difference between estimated Y value and the average Y value
|
|
residual sum of squares
|
difference between observed Y and the estimated Y value
|
|
how to compute R squared
|
ESS/TSS or 1-(RSS/TSS)
|
|
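A short sketch of the two R-squared formulas, reusing the illustrative data from the OLS sketch above (with an intercept in the equation, TSS = ESS + RSS, so the two formulas agree):

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Fit by OLS so that TSS = ESS + RSS holds exactly.
beta1_hat = ((X - X.mean()) * (Y - Y.mean())).sum() / ((X - X.mean()) ** 2).sum()
beta0_hat = Y.mean() - beta1_hat * X.mean()
Y_hat = beta0_hat + beta1_hat * X

TSS = ((Y - Y.mean()) ** 2).sum()      # total sum of squares
ESS = ((Y_hat - Y.mean()) ** 2).sum()  # explained sum of squares
RSS = ((Y - Y_hat) ** 2).sum()         # residual sum of squares

print(ESS / TSS, 1 - RSS / TSS)        # both give the same R-squared
```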
R squared aka...
|
coefficient of determination
|
|
adjusted R squared
|
measures % of variation of Y around its mean that is explained by the regression equation, adjusted for degrees of freedom.
|
|
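The usual formula, with N observations and K independent variables:

$$\bar{R}^2 = 1 - \frac{RSS / (N - K - 1)}{TSS / (N - 1)}$$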
is it fair to use adjusted R squared as a true indicator of the regression strength
|
no; a strong fit alone does not validate a model, as the water demand example shows
|
|
error term in the true regression equation vs. the estimated regression equation
|
true: epsilon (ε)
estimated: ei, aka epsilon hat i (ε̂i) |
|
what two things does OLS not do
|
(by itself) handle panel data or time-series data
|
|
3 reasons for OLS
|
1. it is relatively easy to use;
2. minimizing the sum of squared residuals is a reasonable goal;
3. the resulting beta hats have a number of useful characteristics |
|
regression analysis
|
technique used to explain a linear relationship between independent and dependent variables
|
|
expected value
|
expected influence that the independent variable has on the dependent variable
|
|
time series
|
regression that studies the relationship between the dependent and independent variables OVER time (observations of the same entity from different time periods)
|
|
degrees of freedom
|
number of observations minus the number of coefficients estimated (including the intercept)
|
|
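As a formula, with a small worked example:

$$df = N - K - 1 \qquad \text{e.g., } N = 30, \ K = 2 \ \Rightarrow \ df = 27$$

(K slope coefficients plus the intercept are estimated.)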
why not always use adjusted R squared
|
theory and previous research are just as important
|
|
Steps of Regression:
1. review literature and develop theoretical model |
-see what other people are doing
-apply previous models, or develop new ones (depending on whether you agree or disagree) |
|
Steps of Regression:
2. specify the model: select independent variables and functional form |
1. specify the independent variables and how they should be measured
2. specify the mathematical (functional) form of the variables
3. specify the type of stochastic error term
steps 1-3 avoid specification error; exclude variables that have little effect; use dummy variables where appropriate |
|
Steps of Regression:
3. hypothesize expected sign of coefficients |
positive or negative
e.g., Qdemand = f(P, Y, Ps) + ε, with the expected sign (+ or −) written above each variable |
|
Steps of Regression:
4. collect, inspect, clean data |
determine units of measurement
look for typographical, conceptual, or definitional errors |
|
Steps of Regression:
5. estimate, evaluate equation |
walk a fine line between going back to look for more data and continuing to adjust the equation until it fits expected theory
|
|
Steps of Regression:
6. document results |
document the estimated equation, N, and adjusted R squared
|
|
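A common reporting format (the numbers here are purely hypothetical):

$$\hat{Y}_i = 103.40 + 6.38 X_i \qquad N = 20, \ \bar{R}^2 = .73$$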
does beta zero mean anything?
|
only when zero is included in the range of the Xi values
|
|
The Classical Assumptions:
1. Regression model is linear, correctly specified, and has an additive error term |
-model is linear in the coefficients (or can be transformed to be)
-has no omitted variables or incorrect functional form
-error term is additive, not multiplied or divided |
|
The Classical Assumptions:
2. error term has zero population mean |
mean of the error distribution is zero
|
|
The Classical Assumptions:
3. all explanatory variables are uncorrelated with the error term |
do not let an error term and an explanatory variable move together
|
|
The Classical Assumptions:
4. observations of the error term are uncorrelated with each other |
observations are independently drawn from each other
|
|
The Classical Assumptions:
5. error term has a constant variance |
constant variance: homoskedastic
non-constant variance (increasing or decreasing): heteroskedastic |
|
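A small sketch contrasting the two cases; the variance parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(1, 100, 200)

# Homoskedastic: every error drawn with the same standard deviation.
eps_homo = rng.normal(0.0, 5.0, size=X.size)

# Heteroskedastic: standard deviation grows with X.
eps_hetero = rng.normal(0.0, 0.5 * X)

# Spread is stable for the first, widens with X for the second.
print(eps_homo[:20].std(), eps_homo[-20:].std())
print(eps_hetero[:20].std(), eps_hetero[-20:].std())
```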
The Classical Assumptions:
6. no explanatory variable is a perfect linear function of any other explanatory variable |
collinearity or multicollinearity between two variables implies that they are really the same variable
(ex: tire sales and the sales tax collected on them, which is a fixed % of sales) |
|
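A quick sketch of why perfect collinearity breaks OLS: the X'X matrix becomes singular, so the normal equations have no unique solution (the tax rate value is hypothetical):

```python
import numpy as np

# x2 is an exact linear function of x1, e.g., a sales tax that is 8% of sales.
x1 = np.array([10.0, 20.0, 30.0, 40.0])
x2 = 0.08 * x1
X = np.column_stack([np.ones(4), x1, x2])

print(np.linalg.matrix_rank(X))   # 2, not 3: one column is redundant
print(np.linalg.det(X.T @ X))     # (numerically) zero: X'X cannot be inverted
```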
The Classical Assumptions:
7. error term is normally distributed |
errors must be drawn independently from a distribution with a mean of zero; as the number of errors gets larger, their distribution approaches a bell-shaped curve (a normal distribution, per the Central Limit Theorem)
|
|
unbiasedness
|
the mean of the estimator's sampling distribution equals the true population value
|
|
biased estimator
|
an estimator whose sampling distribution has a mean that differs from the true population value
|
|
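In symbols, covering both cards:

$$\text{unbiased: } E(\hat{\beta}) = \beta \qquad\qquad \text{biased: } E(\hat{\beta}) \neq \beta$$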
how to decrease variance
|
increase sample size
|
|
mean square error
|
variance + square of the bias
|
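As a formula:

$$MSE(\hat{\beta}) = \text{Var}(\hat{\beta}) + \left[\text{Bias}(\hat{\beta})\right]^2$$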