20 Cards in this Set

  • Front
  • Back
stochastic error term
A stochastic error term is a term added to a regression equation to introduce all of the variation in Y that cannot be explained by the included Xs.
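As a worked illustration (notation assumed here, not taken from the card), a simple linear regression with a stochastic error term can be written as
Y_i = \beta_0 + \beta_1 X_i + \epsilon_i ,
where \epsilon_i carries the variation in Y_i that the included X cannot explain.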
cross section
Cross-sectional data refers to data collected by observing many subjects (such as individuals, firms, or countries/regions) at the same point in time, or without regard to differences in time.
Residual
The residual of an observed value is the difference between the observed value and the estimated function value (the fitted value).
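In symbols, assuming \hat{Y}_i denotes the fitted value from the estimated equation, the residual for observation i is
e_i = Y_i - \hat{Y}_i .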
Dummy Variable
A dummy variable is one that takes the value 0 or 1 to indicate the absence or presence of some categorical effect that may be expected to shift the outcome.
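For example (an illustrative assumption, not from the card), a dummy for a binary characteristic could be defined as
D_i = 1 \text{ if observation } i \text{ has the characteristic}, \quad D_i = 0 \text{ otherwise},
and then entered in the regression like any other explanatory variable.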
central limit theorem
The central limit theorem (CLT) states that, given certain conditions, the mean of a sufficiently large number of independent random variables, each with a well-defined mean and well-defined variance, will be approximately normally distributed.
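One common statement, with notation assumed here: if X_1, \ldots, X_n are independent draws with mean \mu and finite variance \sigma^2, then as n grows
\sqrt{n}\,(\bar{X}_n - \mu)/\sigma \;\xrightarrow{d}\; N(0, 1) .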
alternative hypothesis
In statistical hypothesis testing, the alternative hypothesis (also called the maintained or research hypothesis) and the null hypothesis are the two rival hypotheses compared by a statistical hypothesis test; the alternative is the claim supported when the null is rejected.
type II error
A term used in hypothesis testing for the error that occurs when one accepts a null hypothesis that is actually false.
rejection region
The set of values of the test statistic for which the null hypothesis is rejected; if T falls in the rejection region, the null hypothesis is rejected.
p value
The p-value is the probability of obtaining a test statistic at least as extreme as the one that was actually observed, assuming the null hypothesis is true.
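Stated as a formula for the two-sided case (notation assumed): for an observed test statistic t_{obs},
p = \Pr\big(|T| \geq |t_{obs}| \;\big|\; H_0 \text{ true}\big) .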
OLS
Ordinary least squares (OLS) is a linear least squares method for estimating the unknown parameters in a linear regression model by minimizing the sum of squared residuals.
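A minimal sketch of the OLS criterion for the simple regression case, with assumed notation: the estimates \hat{\beta}_0, \hat{\beta}_1 solve
\min_{\beta_0, \beta_1} \sum_{i=1}^{n} \big(Y_i - \beta_0 - \beta_1 X_i\big)^2 ,
which gives \hat{\beta}_1 = \sum_i (X_i - \bar{X})(Y_i - \bar{Y}) \big/ \sum_i (X_i - \bar{X})^2 and \hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}.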
multiple regression
Multiple linear regression models the relationship between several independent variables and a single dependent variable.
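With assumed notation, a multiple regression with K explanatory variables can be written as
Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + \cdots + \beta_K X_{Ki} + \epsilon_i .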
time series
A time series is a sequence of data points, typically measured at successive points in time spaced at uniform time intervals.
true regression line
The regression line implied by the true (population) coefficients, as opposed to the line estimated from the sample data; for linear regression it is a straight line.
estimator
In statistics, an estimator is a rule for calculating an estimate of a given quantity based on observed data: thus the rule (the estimator) and its result (the estimate) are distinguished.
total sum of squares
The sum, over all observations, of the squared differences of each observation from the overall mean.
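In symbols, with \bar{Y} denoting the overall mean of the dependent variable (notation assumed):
TSS = \sum_{i=1}^{n} (Y_i - \bar{Y})^2 ,
and for OLS with an intercept this decomposes as TSS = ESS + RSS (explained plus residual sums of squares, using one common labeling).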
Gauss-Markov theorem
The Gauss-Markov theorem states that in a linear regression model in which the errors have expectation zero, are uncorrelated, and have equal variances, the best linear unbiased estimator of the coefficients is given by the ordinary least squares estimator.
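The theorem's error assumptions, written out with assumed notation, are
E(\epsilon_i) = 0, \quad Var(\epsilon_i) = \sigma^2 \text{ (equal variances)}, \quad Cov(\epsilon_i, \epsilon_j) = 0 \text{ for } i \neq j ,
under which OLS is BLUE (the best linear unbiased estimator).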
null hypothesis
The hypothesis that the things you are testing are not related and that your results are the product of random chance.
type I error
The incorrect rejection of a true null hypothesis.
acceptance region
The set of values of the test statistic for which the null hypothesis is not rejected; if the value of T falls in the acceptance region, the null hypothesis being tested is not rejected.
level of significance
The probability of a false rejection of the null hypothesis in a statistical test; equivalently, the probability of a Type I error.