38 Cards in this Set

  • Front
  • Back

Econometrics

The quantitative measurement and analysis of actual economic and business phenomena; a bridge between abstract theory and real-world activity

Three major uses of econometrics

1) describing economic reality


2) testing hypotheses about economic theory


3) forecasting future economic activity

Description

One of the three uses of econometrics; lets you quantify economic activity by estimating numbers and putting them in equations that previously contained only abstract symbols

Hypothesis testing

One of the three uses of econometrics; the ability to evaluate alternative theories with quantitative evidence

Forecasting

One of the three uses of econometrics; the ability to predict what is likely to happen in the next quarter, year, or further into the future based on what has happened in the past

Regression analysis

a statistical technique that attempts to explain movement in the dependent variable as a function of movements in the independent variables, through the quantification of a single equation

Can regression analysis confirm causality?

No

constant / intercept coefficient

The value of y when x = 0

Slope coefficient

The amount that y will change when x increases by one unit

An equation is linear if...

Plotting the function in terms of x and y generates a straight line

If linear regression techniques are going to be applied to an equation, that equation must be...

Linear

Stochastic error term

A term that is added to a regression equation to introduce all of the variation in y that cannot be explained by the included x's

The four sources of variation in y other than the variation in x

1) omitted minor influences


2) measurement error in y


3) The underlying equation might have a different functional form than the one chosen for the regression (might not be linear)


4) purely random variation

Multivariate linear regression model

A single-equation linear regression model with more than one independent variable

Multivariate regression coefficient

B1, B2, B3


Serves to isolate the impact on y of a change in one variable from the impact of changes in the others; it tells you the effect of one independent variable on the dependent variable, holding the influence of the other variables in the equation constant

Dummy variable

A variable that can only take on two values (1 or 0), used to quantify a concept that is inherently qualitative, such as gender
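A minimal sketch of dummy-variable coding in Python (the names and data here are hypothetical, purely for illustration):

```python
# Hypothetical observations: (name, sex). The qualitative concept "sex"
# becomes a usable 0/1 regressor via a dummy variable.
observations = [("Ann", "F"), ("Bob", "M"), ("Cat", "F")]

# female = 1, male = 0
female_dummy = [1 if sex == "F" else 0 for _, sex in observations]
print(female_dummy)  # [1, 0, 1]
```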

Estimated regression equation

An equation where the coefficients (the B values) are empirical best guesses of the true regression coefficients, and thus actual numbers

Cross-sectional data set

A data set where all of the observations are from the same point in time and represent different individual economic entities (countries, houses) from that same point in time

Specification error

An error in selecting the independent variables and how they should be measured, the mathematical form of the variables, or the assumed properties of the stochastic error term

Outlier

An observation that falls far outside the range of the rest of the observations; checking for outliers is a good way to find data entry errors

Ordinary Least Squares (OLS)

A regression estimation technique that calculates the coefficients so as to minimize the sum of the squared differences between the observed values of the dependent variable and those predicted by the linear equation

Why use ordinary least squares?

1) OLS is relatively easy to use


2) The goal of minimizing the sum of the squared residuals is appropriate from a theoretical point of view (since residuals can be negative as well as positive, the plain sum of residuals could be small even when the equation fits poorly; squaring the residuals fixes this)


3) OLS estimates have a number of useful characteristics (The sum of the residuals is always exactly zero, and OLS can be shown to be the "best" estimator possible under a set of specific assumptions)
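For a single independent variable, the OLS coefficients have a well-known closed form: the slope is the covariation of x and y divided by the variation of x, and the intercept follows from the means. A minimal sketch (the data is made up for illustration):

```python
def ols(x, y):
    """OLS for one independent variable via the closed-form formulas."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # slope = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
    slope = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
             / sum((xi - x_bar) ** 2 for xi in x))
    intercept = y_bar - slope * x_bar
    return intercept, slope

# Perfectly linear data, y = 2 + 3x, so OLS should recover those values.
b0, b1 = ols([1, 2, 3, 4], [5, 8, 11, 14])
print(b0, b1)  # 2.0 3.0
```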

Residual

The difference between the estimated value of the dependent variable and the actual value of the dependent variable

Total Sum of Squares

A measure of the total amount of variation in the dependent variable around its mean; the variation the regression attempts to explain

Components of the total Sum of Squares for Ordinary Least Squares

TSS = ESS + RSS (the explained sum of squares plus the residual sum of squares); also known as the decomposition of variance

Explained sum of squares

The sum of the squared differences between the predicted y and the mean of y; tells you how much of the variation in the dependent variable your model explained

Residual sum of squares

The sum of the squared differences between the actual y and the predicted y; tells you how much of the dependent variable's variation your model did not explain
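The decomposition of variance can be checked numerically. A sketch, assuming the fitted values come from an OLS regression (the numbers below are illustrative: y_hat holds the OLS fitted values for x = 1, 2, 3, 4):

```python
def decompose(y, y_hat):
    """Return (TSS, ESS, RSS) for actual values y and fitted values y_hat."""
    y_bar = sum(y) / len(y)
    tss = sum((yi - y_bar) ** 2 for yi in y)               # total variation around the mean
    ess = sum((fi - y_bar) ** 2 for fi in y_hat)           # variation the model explains
    rss = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))  # variation left unexplained
    return tss, ess, rss

y = [2, 4, 5, 8]
y_hat = [1.9, 3.8, 5.7, 7.6]  # OLS fit (intercept 0, slope 1.9) for x = 1, 2, 3, 4
tss, ess, rss = decompose(y, y_hat)
print(tss, ess + rss)  # equal: for an OLS fit, TSS = ESS + RSS
```

R squared then falls out directly as ESS / TSS.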

Simple correlation coefficient (r)

A measure of the strength and direction of the linear relationship between two variables

If two variables are perfectly positively correlated then r equals =

+1

If two variables are perfectly negatively correlated, then r equals

-1

If two variables are totally uncorrelated, then r equals

0
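The three benchmark values of r above can be verified with the standard formula for the simple correlation coefficient (the data is made up for illustration):

```python
from math import sqrt

def corr(x, y):
    """Simple correlation coefficient r between two variables."""
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    num = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    den = sqrt(sum((xi - x_bar) ** 2 for xi in x)
               * sum((yi - y_bar) ** 2 for yi in y))
    return num / den

print(corr([1, 2, 3], [2, 4, 6]))  # perfectly positively correlated: 1.0
print(corr([1, 2, 3], [6, 4, 2]))  # perfectly negatively correlated: -1.0
```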

Adjusted R squared (R̄²)

R squared adjusted for degrees of freedom

Degrees of freedom

The excess of the number of observations over the number of coefficients to be estimated

Are more degrees of freedom better?

Yes, because when the number of degrees of freedom is large, every positive error is likely to be balanced by a negative error
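One common formula ties adjusted R squared directly to the degrees of freedom: with n observations and k independent variables (so n − k − 1 degrees of freedom), R̄² = 1 − (1 − R²)(n − 1)/(n − k − 1). A sketch, assuming that formula:

```python
def adjusted_r2(r2, n, k):
    """R squared adjusted for degrees of freedom.

    n = number of observations, k = number of independent variables,
    so n - k - 1 degrees of freedom remain.
    """
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Adding variables costs degrees of freedom, so R-bar-squared sits below
# the raw R squared of 0.9 here.
print(adjusted_r2(0.9, 20, 3))
```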

Y hat (written ŷ )

The predicted value of y (the dependent variable) in a regression equation; it can also be interpreted as the estimated average value of y for the given values of the x's

X bar

The mean of the x variable

Y bar

The mean of the y variable