Chapter 1 - Key Terms
Causal Effect
a ceteris paribus change in one variable has an effect on another variable
Ceteris Paribus
all other relevant factors are held fixed
Cross-Sectional Data Set
a data set collected by sampling a population at a given point in time
Data Frequency
the interval at which time series data are collected; yearly, quarterly, and monthly are the most common data frequencies
Econometric Model
an equation relating the dependent variable to a set of explanatory variables and unobserved disturbances, where unknown population parameters determine the ceteris paribus effect of each explanatory variable
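For reference, such a model is typically written (in the textbook's standard notation) as
    y = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k + u
where y is the dependent variable, the x_j are explanatory variables, the \beta_j are unknown population parameters, and u is the unobserved disturbance.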
Economic Model
a relationship derived from economic theory or less formal economic reasoning
Empirical Analysis
a study that uses data in a formal econometric analysis to test a theory, estimate a relationship, or determine the effectiveness of a policy
Experimental Data
data that have been obtained by running a controlled experiment
Nonexperimental Data
data that have not been obtained through a controlled experiment
Observational Data
see nonexperimental data: data that have not been obtained through a controlled experiment
Panel Data
a data set constructed from repeated cross sections over time. With a balanced panel, the same units appear in each time period. With an unbalanced panel, some units do not appear in each time period, often due to attrition
Pooled Cross Section
a data configuration where independent cross sections, usually collected at different points in time, are combined to produce a single data set
Random Sampling
a sampling scheme whereby each observation is drawn at random from the population. In particular, no unit is more likely to be selected than any other unit, and each draw is independent of all other draws
Retrospective Data
data collected based on past, rather than current, information
Time Series
data collected over time on one or more variables
Chapter 2
Coefficient of Determination
R-squared: the proportion of variability in a data set that is accounted for by the statistical model
Constant Elasticity Model
a model where the elasticity of the dependent variable, with respect to an explanatory variable, is constant; in multiple regression, both variables appear in logarithmic form
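As a short illustration of the logarithmic form, in the log-log model
    \log(y) = \beta_0 + \beta_1 \log(x) + u
the slope \beta_1 is the constant elasticity of y with respect to x: a 1% increase in x changes y by about \beta_1 percent, ceteris paribus.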
Control Variable
covariate
Degrees of Freedom
in multiple regression analysis, the number of observations minus the number of estimated parameters
Dependent Variable
the variable to be explained in a multiple regression model (and a variety of other models)
Elasticity
the percentage change in one variable given a 1% ceteris paribus increase in another variable
Error Term (Disturbance)
the variable in a simple or multiple regression equation that contains unobserved factors that affect the dependent variable. The error term may also include measurement errors in the observed dependent or independent variables.
Error Variance
the variance of the error term in a multiple regression model
Explained Sum of Squares (SSE)
the total sample variation of the fitted values in a multiple regression model
Explained Variable
dependent variable
Explanatory Variable
in regression analysis, a variable that is used to explain variation in the dependent variable
First Order Conditions
the set of linear equations used to solve for the OLS estimates
Fitted Values
the estimated values of the dependent variable when the values of the independent variables for each observation are plugged into the OLS regression line
Gauss-Markov Assumptions
The set of assumptions (Assumptions MLR.1 through MLR.5 or TS.1 through TS.5) under which OLS is BLUE
Heteroskedasticity
the variance of the error term, given the explanatory variables, is not constant
Homoskedasticity
the errors in a regression model have constant variance conditional on the explanatory variables
Independent Variable
explanatory variable
Intercept Parameter
the parameter in a multiple linear regression model that gives the expected value of the dependent variable when all independent variables equal zero
Mean Independent
the error u in a regression model is mean independent of the explanatory variables if its expected value does not depend on their values; that is, E(u | x_1, ..., x_k) = E(u)
OLS Regression Line
the equation relating the predicted value of the dependent variable to the independent variables, where the parameter estimates have been obtained by OLS
Ordinary Least Squares (OLS)
a method for estimating the parameters of a multiple linear regression model. The ordinary least squares estimates are obtained by minimizing the sum of squared residuals
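The numpy sketch below illustrates this definition on simulated data (a sketch only; all names and parameter values are made up for the example). It obtains the OLS estimates by solving the first order conditions X'X b = X'y:
    import numpy as np

    # Minimal OLS sketch on simulated data (values are made up).
    rng = np.random.default_rng(0)
    n = 100
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)
    u = rng.normal(size=n)                        # unobserved error term
    y = 1.0 + 0.5 * x1 - 0.3 * x2 + u             # population model

    X = np.column_stack([np.ones(n), x1, x2])     # intercept plus regressors
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # first order conditions

    y_hat = X @ beta_hat                          # fitted values
    u_hat = y - y_hat                             # residuals
    ssr = u_hat @ u_hat                           # sum of squared residuals (SSR)
beta_hat minimizes ssr over all candidate coefficient vectors, which is exactly the ordinary least squares criterion.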
Population Regression Function (PRF)
see conditional expectation: the expected or average value of one random variable, called the dependent or explained variable, that depends on the values of one or more other variables, called the independent or explanatory variables
Predicted Variable
see dependent variable
Predictor Variable
explanatory variable
Regressand
dependent variable
Regression through the Origin
regression analysis where the intercept is set to zero; the slopes are obtained by minimizing the sum of squared residuals, as usual
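In the simple regression case, the slope from regression through the origin has the closed form (a standard result, not from the card set)
    \tilde{\beta}_1 = \sum_{i=1}^{n} x_i y_i \Big/ \sum_{i=1}^{n} x_i^2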
Regressor
explanatory variable
Residual
the difference between the actual value and the fitted (or predicted) value; there is a residual for each observation in the sample used to obtain an OLS regression line
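In symbols, for each observation i,
    \hat{u}_i = y_i - \hat{y}_i, \qquad \hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_{i1} + \cdots + \hat{\beta}_k x_{ik}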
Residual Sum of Squares (SSR)
see Sum of Squared Residuals (SSR): in multiple regression analysis, the sum of the squared OLS residuals across all observations
Response Variable
dependent variable
R-Squared
in a multiple regression model, the proportion of the total sample variation in the dependent variable that is explained by the independent variables
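Equivalently, in terms of the sums of squares defined in this set,
    R^2 = SSE/SST = 1 - SSR/SST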
Sample Regression Function (SRF)
see OLS regression line: the equation relating the predicted value of the dependent variable to the independent variables, where the parameter estimates have been obtained by OLS
Semi-elasticity
the percentage change in the dependent variable given a one-unit increase in an independent variable
Simple Linear Regression Model
a model where the dependent variable is a linear function of a single independent variable, plus an error term
Slope Parameter
the coefficient on an independent variable in a multiple regression model
Standard Error of Beta hat 1
an estimate of the standard deviation in the sampling distribution of Beta hat 1
Standard Error of the Regression (SER)
in multiple regression analysis, the estimate of the standard deviation of the population error, obtained as the square root of the sum of squared residuals over the degrees of freedom
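In symbols, with n observations and k slope parameters, the SER and the standard error above are (standard formulas, not from the cards)
    \hat{\sigma} = \sqrt{SSR/(n-k-1)}, \qquad se(\hat{\beta}_1) = \hat{\sigma} \big/ \sqrt{SST_1 (1 - R_1^2)}
where SST_1 is the total sample variation in x_1 and R_1^2 is the R-squared from regressing x_1 on the other explanatory variables.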
Total Sum of Squares (SST)
the total sample variation in a dependent variable about its sample average
Zero Conditional Mean Assumption
a key assumption used in multiple regression analysis that states that, given any values of the explanatory variables, the expected value of the error equals zero (see Assumptions MLR.4 and TS.3)
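In symbols, the assumption states
    E(u \mid x_1, x_2, \ldots, x_k) = 0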
Chapter 3
Best Linear Unbiased Estimator (BLUE)
among all linear unbiased estimators, the estimator with the smallest variance. OLS is BLUE, conditional on the sample values of the explanatory variables, under the Gauss-Markov assumptions
Biased Toward Zero
a description of an estimator whose expectation in absolute value is less than the absolute value of the population parameter
Ceteris Paribus
all other relevant factors are held fixed
Degrees of Freedom (df)
in multiple regression analysis, the number of observations minus the number of estimated parameters
Disturbance
error term
Downward Bias
the expected value of an estimator is below the population value of the parameter
Endogenous Explanatory Variable
an explanatory variable in a multiple regression model that is correlated with the error term, either because of an omitted variable, measurement error, or simultaneity
Error Term
the variable in a simple or multiple regression equation that contains unobserved factors that affect the dependent variable. The error term may also include measurement errors in the observed dependent or independent variables
Excluding a Relevant Variable
in multiple regression analysis, leaving out a variable that has a nonzero partial effect on the dependent variable
Exogenous Explanatory Variable
an explanatory variable that is uncorrelated with the error term
Explained Sum of Squares (SSE)
the total sample variation of the fitted values in a multiple regression model
First Order Conditions
the set of linear equations used to solve for the OLS estimates
Gauss-Markov Assumptions
the set of assumptions (Assumptions MLR.1 through MLR.5 or TS.1 through TS.5) under which OLS is BLUE
Gauss-Markov Theorem
the theorem that states that under the five Gauss-Markov assumptions (for cross-sectional or time series models) the OLS estimator is BLUE (conditional on the sample values of the explanatory variables)
Inclusion of an Irrelevant Variable
the inclusion in a regression model of an explanatory variable that has a zero population parameter when estimating an equation by OLS
Intercept
in the equation of a line, the value of the y variable when the x variable is zero
Micronumerosity
a term introduced by Arthur Goldberger to describe properties of econometric estimators with small sample sizes
Misspecification Analysis
the process of determining likely biases that can arise from omitted variables, measurement error, simultaneity, and other kinds of model misspecification
Multicollinearity
a term that refers to correlation among the independent variables in a multiple regression model; it is usually invoked when some correlations are "large", but an actual magnitude is not well defined
Multiple Linear Regression Model
a model linear in its parameters, where the dependent variable is a function of independent variables plus an error term
Multiple Regression Analysis
a type of analysis that is used to describe estimation of and inference in a multiple linear regression model
OLS Intercept Estimate
the intercept in an OLS regression line
OLS Regression Line
the equation relating the predicted value of the dependent variable to the independent variables, where the parameter estimates have been obtained by OLS
OLS Slope Estimate
A slope in an OLS regression line
Omitted Variable Bias
the bias that arises in the OLS estimators when a relevant variable is omitted from the regression
Ordinary Least Squares
a method for estimating the parameters of a multiple linear regression model. The ordinary least squares estimates are obtained by minimizing the sum of squared residuals
Overspecifying the Model
see inclusion of an irrelevant variable
Partial Effect
the effect of an explanatory variable on the dependent variable, holding other factors in the regression model fixed
Perfect Collinearity
in multiple regression, one independent variable is an exact linear function of one or more other independent variables
Population Model
a model, especially a multiple linear regression model, that describes a population
Residual
the difference between the actual value and fitted (or predicted) value; there is a residual for each observation in the sample used to obtain an OLS regression line
Residual Sum of Squares
see Sum of Squared Residuals - in multiple regression analysis, the sum of the squared OLS residuals across all observations
Sample Regression Function (SRF)
see OLS regression line: the equation relating the predicted value of the dependent variable to the independent variables, where the parameter estimates have been obtained by OLS
Slope Parameter
the coefficient on an independent variable in a multiple regression model
Standard Deviation of Beta hat j
a common measure of spread in the distribution of Beta hat j
Standard Error of Beta hat j
an estimate of the standard deviation in the sampling distribution of Beta hat j
Standard Error of the Regression (SER)
in multiple regression analysis, the estimate of the standard deviation of the population error, obtained as the square root of the sum of squared residuals over the degrees of freedom
Sum of Squared Residuals (SSR)
in multiple regression analysis, the sum of squared OLS residuals across all observations
Total Sum of Squares (SST)
the total sample variation in a dependent variable about its sample average
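The three sums of squares in this set satisfy the standard decomposition
    SST = \sum_{i=1}^{n} (y_i - \bar{y})^2 = SSE + SSR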
True Model
the actual population model relating the dependent variable to the relevant independent variables, plus a disturbance, where the zero conditional mean assumption holds
Underspecifying the Model
see excluding a relevant variable- in multiple regression analysis, leaving out a variable that has a nonzero partial effect on the dependent variable
Upward Bias
the expected value of an estimator is greater than the population parameter value
Variance Inflation Factor (VIF)
in multiple regression analysis under the Gauss-Markov assumptions, the term in the sampling variance affected by correlation among the explanatory variables
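In symbols, under the Gauss-Markov assumptions the sampling variance and the VIF are (standard formulas)
    Var(\hat{\beta}_j) = \sigma^2 \big/ [SST_j (1 - R_j^2)], \qquad VIF_j = 1/(1 - R_j^2)
where R_j^2 is the R-squared from regressing x_j on the other explanatory variables.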
Chapter 4
Alternative Hypothesis
the hypothesis against which the null hypothesis is tested
Classical Linear Model
the multiple linear regression model under the full set of classical linear model assumptions
Classical Linear Model (CLM) Assumptions
the ideal set of assumptions for multiple regression analysis: for cross sectional analysis, Assumptions MLR.1 through MLR.6 and for time series analysis, Assumptions TS.1 through TS.6. The assumptions include linearity in the parameters, no perfect collinearity, the zero conditional mean assumption, homoskedasticity, no serial correlation, and normality of the errors
Confidence Interval (CI)
a rule used to construct a random interval so that a certain percentage of all data sets, determined by the confidence level, yields an interval that contains the population value
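For example, a 95% confidence interval for a slope parameter takes the familiar form
    \hat{\beta}_j \pm c \cdot se(\hat{\beta}_j)
where c is the 97.5th percentile of the relevant t distribution.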
Critical Value
in hypothesis testing, the value against which a test statistic is compared to determine whether or not the null hypothesis is rejected
Denominator Degrees of Freedom
in the F-test, the degrees of freedom in the unrestricted model
Economic Significance
see practical significance
Exclusion Restrictions
restrictions that state that certain variables are excluded from the model (or have zero population coefficients)
F Statistic
a statistic used to test multiple hypotheses about the parameters in a multiple regression model
Joint Hypothesis Test
a test involving more than one restriction on the parameters in a model
Jointly Insignificant
failure to reject, using an F test at a specified significance level, the null hypothesis that all coefficients for a group of explanatory variables are zero
Jointly Statistically Significant
the null hypothesis that two or more variables have zero population coefficients is rejected at the chosen significance level
Minimum Variance Unbiased Estimator
an estimator with the smallest variance in the class of all unbiased estimators
Multiple Hypotheses Test
a test of a null hypothesis involving more than one restriction on the parameters in a model
Multiple Restrictions
more than one restriction on the parameters in an econometric model
Normality Assumption
the classical linear model assumption that states that the error (or dependent variable) has a normal distribution, conditional on the explanatory variables
Null Hypothesis
in classical hypothesis testing, we take this hypothesis as true and require the data to provide substantial evidence against it
Numerator Degrees of Freedom
in the F-test, the number of restrictions being tested
One-Sided Alternative
an alternative hypothesis that states that a parameter is greater than (or less than) the value hypothesized under the null
One-Tailed Test
a hypothesis test against a one-sided alternative
Overall Significance of the Regression
a test of the joint significance of all explanatory variables appearing in a multiple regression equation
p-value
the smallest significance level at which the null hypothesis can be rejected. Equivalently, the largest significance level at which the null hypothesis cannot be rejected
Practical Significance
the practical or economic importance of an estimate, which is measured by its sign and magnitude, as opposed to its statistical significance
R-Squared Form of the F-Statistic
the F statistic for testing exclusion restrictions expressed in terms of the R-squareds from the restricted and unrestricted models
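In symbols, with q restrictions and n - k - 1 degrees of freedom in the unrestricted model,
    F = \frac{(R_{ur}^2 - R_r^2)/q}{(1 - R_{ur}^2)/(n-k-1)}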
Rejection Rule
in hypothesis testing, the rule that determines when the null hypothesis is rejected in favor of the alternative hypothesis
Restricted Model
in hypothesis testing, the model obtained after imposing all the restrictions required under the null
Significance Level
the probability of a Type I error in hypothesis testing
Statistically Insignificant
failure to reject the null hypothesis that a population parameter is equal to zero, at the chosen significance level
Statistically Significant
rejecting the null hypothesis that a parameter is equal to zero against the specified alternative, at the chosen significance level
t-ratio
see t-statistic
t-statistic
the statistic used to test a single hypothesis about the parameters in an econometric model
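For the common null hypothesis H_0: \beta_j = 0, the t statistic takes the standard form
    t = \hat{\beta}_j / se(\hat{\beta}_j)
and the two-sided p-value is the probability that a t-distributed random variable exceeds |t| in absolute value.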
Two-Sided Alternative
an alternative where the population parameter can be either less than or greater than the value stated under the null hypothesis
Two-Tailed Test
a test against a two-sided alternative
Unrestricted Model
in hypothesis testing, the model that has no restrictions placed on its parameters