73 Cards in this Set

  • Front
  • Back
nonexperimental data/observational data/retrospective data
data not obtained through a controlled experiment
experimental data
data collected in laboratory environments
empirical analysis
the use of data to test a theory or estimate a relationship
economic model
a model consisting of mathematical equations that describe various relationships
econometric model
a mathematical model, founded on the theory of the economic model, that resolves its ambiguities so the relationships can be estimated with data
cross-sectional data set
a sample of individuals, households, firms, cities, states, countries, or a variety of other units, taken at a given point in time
random sampling
a sampling scheme in which each observation is drawn independently from the same population; it simplifies the analysis of cross-sectional data
time series data
observations on a variable or several variables over time. Difficult to study because economic observations can rarely be assumed to be independent across time.
data frequency
the interval at which time series data are collected, typically daily, weekly, monthly, quarterly, or annually
pooled cross section
data sets with both cross-sectional and time series features, in which sample size is increased by combining cross sections collected at different points in time
panel data/longitudinal data
a data set consisting of a time series for each cross-sectional member in the data set. These are more difficult to obtain than pooled cross sections.
ceteris paribus
all other relevant factors being held equal; essential for establishing causal relationships, because it isolates the effect of one variable while accounting for the others
simple linear regression model/two-variable linear regression model/bivariate linear regression model
a model in which the dependent variable is a linear function of a single independent variable, plus an error term
other names for "y"
dependent variable, explained variable, response variable, predicted variable, regressand
other names for "x"
independent variable, explanatory variable, control variable, predictor variable, regressor
variable "u"
error term; disturbance
factors other than x that affect y; treated by simple regression analysis as unobserved. When the assumption E(u given x)=E(u) holds, u is mean independent of x.
Beta sub 1
the slope parameter: the change in y associated with a one-unit change in x when the other factors in u are held fixed (the change in u is 0). Of central interest in applied economics.
Beta sub 0
intercept parameter, aka the constant term. Rarely central to analysis.
zero conditional mean assumption
E(u given x)=0.

This follows from combining the assumptions E(u)=0 and E(u given x)=E(u).
population regression function (PRF)
E(y given x)
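
Under the zero conditional mean assumption, the PRF of the simple regression model takes the form (a sketch in standard notation):

E(y \mid x) = \beta_0 + \beta_1 x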
Ordinary Least Squares (answer is a formula in the book)
page 29, estimates 2.17 and 2.19

a method for estimating the parameters of a multiple linear regression model. These estimates are obtained by minimizing the sum of squared residuals
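
Assuming equations 2.17 and 2.19 are the standard simple-regression OLS estimates, they take the form (with n observations (x_i, y_i) and sample means \bar{x}, \bar{y}):

\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}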
fitted value (answer is in book)
page 30, 2.20

The estimated values of the dependent variable when the values of the independent variables for each observation are plugged into the OLS regression line.
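
In the simple regression case this is presumably equation 2.20:

\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i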
residual
the difference between the actual value of the dependent variable and its fitted value
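
In the same notation, the residual for observation i is

\hat{u}_i = y_i - \hat{y}_i = y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i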
first order conditions for the OLS estimates (answer in book)
page 29, 2.14 & 2.15

The set of linear equations used to solve for the OLS estimates
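
Assuming 2.14 and 2.15 are the simple-regression first order conditions, they can be written as

\sum_{i=1}^{n} (y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i) = 0, \qquad \sum_{i=1}^{n} x_i (y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i) = 0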
OLS regression line (answer in book)
page 32, 2.23

The equation relating the predicted value of the dependent variable to the independent variables, where the parameter estimates have been obtained by OLS
sample regression function (SRF)
page 32, 2.23

The equation relating the predicted value of the dependent variable to the independent variables, where the parameter estimates have been obtained by OLS
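
Both of the two preceding cards refer to the same equation (2.23); in the simple regression case it is presumably

\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x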
total sum of squares (SST)
the total sample variation in a dependent variable about its sample average
explained sum of squares (SSE)
the total sample variation of the fitted values in a multiple regression model
residual sum of squares (SSR)/sum of squared residuals
in multiple regression analysis, the sum of the squared OLS residuals across all observations
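
For reference, the three sums of squares in the preceding cards are (standard notation, with \hat{u}_i the OLS residuals):

SST = \sum_{i=1}^{n} (y_i - \bar{y})^2, \quad SSE = \sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2, \quad SSR = \sum_{i=1}^{n} \hat{u}_i^2, \quad \text{with } SST = SSE + SSR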
R-squared/coefficient of determination
R^2 = SSE/SST = 1 - SSR/SST

in a multiple regression model, the proportion of the total sample variation in the dependent variable that is explained by the independent variables
elasticity
the percentage change in one variable given a 1% ceteris paribus increase in another variable
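
As a sketch, the point elasticity of y with respect to x is \frac{dy}{dx} \cdot \frac{x}{y}; in the constant-elasticity (log-log) model the slope coefficient is the elasticity:

\log(y) = \beta_0 + \beta_1 \log(x) + u, \qquad \beta_1 \approx \frac{\%\Delta y}{\%\Delta x}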
heteroskedasticity
the variance of the error term, given the explanatory variables, is not constant.

This violates the homoskedasticity assumption (SLR.5, page 53), which states that the error u has the same variance given any value of the explanatory variable.
error variance/disturbance variance
sigma squared

the variance of the error term in a multiple regression model
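
In symbols (a sketch), the homoskedasticity assumption is Var(u \mid x) = \sigma^2; heteroskedasticity means Var(u \mid x) varies with the explanatory variables.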
degrees of freedom
in multiple regression analysis, the number of observations minus the number of estimated parameters
standard error of the regression (SER)
the natural estimator of sigma; page 58 2.62

in multiple regression analysis, the estimate of the standard deviation of the population error, obtained as the square root of the sum of squared residuals over the degrees of freedom.
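
Assuming the general multiple-regression formulas (the simple-regression case, equation 2.62, has degrees of freedom n - 2):

\hat{\sigma}^2 = \frac{SSR}{n - k - 1}, \qquad SER = \hat{\sigma} = \sqrt{\frac{SSR}{n - k - 1}}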
partial effect
the effect of an explanatory variable on the dependent variable, holding other factors in the regression model fixed
perfect collinearity
in multiple regression, one independent variable is an exact linear function of one or more other independent variables
exogenous explanatory variable
a variable that is uncorrelated with the error term in the model of interest and used to explain variation in the dependent variable
endogenous explanatory variable
an explanatory variable in a multiple regression model that is correlated with the error term, whether because of an omitted variable, measurement error, or simultaneity
inclusion of an irrelevant variable/overspecifying the model
the inclusion of an explanatory variable that has a zero population parameter when estimating an equation by OLS
excluding a relevant variable/underspecifying the model
in multiple regression analysis, leaving out a variable that has a nonzero partial effect on the dependent variable
misspecification analysis
the process of determining likely biases that can arise from omitted variables, measurement error, simultaneity, and other kinds of model misspecification
omitted variable bias
the bias that arises in the OLS estimators when a relevant variable is omitted from the regression
upward bias
the expected value of an estimator is greater than the population parameter value
downward bias
the expected value of an estimator is below the population value of the parameter
biased toward zero
a description of an estimator whose expectation in absolute value is less than the absolute value of the population parameter
Gauss-Markov assumptions
the set of assumptions under which OLS is BLUE
standard deviation of beta hat
pg. 102 3.58

the square root of the variance
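
Assuming the card refers to the usual multiple-regression variance formula, a sketch is

sd(\hat{\beta}_j) = \frac{\sigma}{\sqrt{SST_j (1 - R_j^2)}}

where SST_j is the total sample variation in x_j and R_j^2 is the R-squared from regressing x_j on the other explanatory variables.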
Gauss-Markov Theorem
under the five Gauss-Markov assumptions the OLS estimator is BLUE
normality assumption
the classical linear model assumption that states that the error has a normal distribution, conditional on the explanatory variables
minimum variance unbiased estimators
an estimator with the smallest variance in the class of all unbiased estimators
null hypothesis
in classical hypothesis testing, we take this hypothesis as true and require the data to provide substantial evidence against it
t statistic/t ratio
the statistic used to test a single hypothesis about the parameters in an econometric model
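
For the common null H_0: \beta_j = 0 the t statistic is \hat{\beta}_j / se(\hat{\beta}_j); more generally, for H_0: \beta_j = a_j,

t = \frac{\hat{\beta}_j - a_j}{se(\hat{\beta}_j)}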
alternative hypothesis
the hypothesis against which the null hypothesis is tested
one-sided hypothesis
an alternative hypothesis that states that the parameter is greater than (or less than) the value hypothesized under the null
significance level
the probability of Type I error in hypothesis testing
critical value
in hypothesis testing, the value against which a test statistic is compared to determine whether or not the null hypothesis is rejected
one-tailed test
a hypothesis test against a one-sided alternative
statistically significant
rejecting the null hypothesis that a parameter is equal to zero against the specified alternative, at the chosen significance level
economic significance/practical significance
the practical or economic importance of an estimate, which is measured by its sign and magnitude, as opposed to its statistical significance
exclusion restrictions
restrictions that state that certain variables are excluded from the model or have zero population coefficients
multiple restrictions
more than one restriction on the parameters in an econometric model
multiple hypotheses test/ joint hypotheses test
test of a null hypothesis involving more than one restriction on the parameters
restricted model
in hypothesis testing, the model obtained after imposing all of the restrictions required under the null
F statistic
a statistic used to test multiple hypotheses about the parameters in a multiple regression model
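
In its sum-of-squared-residuals form, the F statistic for testing q restrictions is (subscripts r and ur denote the restricted and unrestricted models):

F = \frac{(SSR_r - SSR_{ur})/q}{SSR_{ur}/(n - k - 1)}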
numerator degrees of freedom
in an F test, the number of restrictions being tested
denominator degrees of freedom
in an F test, the degrees of freedom in the unrestricted model
jointly statistically significant
the null hypothesis that two or more explanatory variables have zero population coefficients is rejected at the chosen significance level
R-squared form of the F statistic
the F statistic for testing exclusion restrictions expressed in terms of the R-squareds from the restricted and unrestricted models
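
Equivalently, in terms of the R-squareds of the restricted and unrestricted models,

F = \frac{(R_{ur}^2 - R_r^2)/q}{(1 - R_{ur}^2)/(n - k - 1)}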
overall significance of the regression
a test of the joint significance of all explanatory variables appearing in a multiple regression equation
asymptotic properties/large sample properties
properties of estimators and test statistics that apply when the sample size grows without bound
consistency
an estimator converges in probability to the correct population value as the sample size grows
attenuation bias
bias in an estimator that is always toward zero; thus, the expected value of an estimator with attenuation bias is less in magnitude than the absolute value of the parameter