38 Cards in this Set
- Front
- Back
What are the two kinds of random variable? |
Discrete and continuous. |
|
What is meant by the expected value of a random variable? |
The weighted average of the potential outcomes, weighted by their probabilities. |
|
Give another name for the expected value of a random variable. |
The population mean. |
|
How do you go about calculating the expected value of a function of a discrete random variable, X? |
You apply the transformation to every possible value of X and take the weighted average of those. |
|
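The rule on the card above can be checked numerically. A minimal sketch (the fair-die example is my own, not from the cards): apply g to every possible value of X and take the probability-weighted average.

```python
# Computing E[g(X)] for a discrete random variable: apply the
# transformation g to every value and weight by its probability.
# The fair six-sided die below is an assumed example for illustration.

def expected_value(values, probs, g=lambda x: x):
    """Return E[g(X)] = sum of g(x_i) * p_i over all outcomes."""
    return sum(g(x) * p for x, p in zip(values, probs))

values = [1, 2, 3, 4, 5, 6]   # fair die: X in {1, ..., 6}
probs = [1 / 6] * 6           # each outcome has probability 1/6

e_x = expected_value(values, probs)                      # E(X) = 3.5
e_x2 = expected_value(values, probs, lambda x: x ** 2)   # E(X^2) = 91/6
```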
Give the first expected value rule. |
E(X + Y + Z) = E(X) + E(Y) + E(Z) |
|
Give the second expected value rule. |
E(bX) = b E(X) where b is a constant |
|
Give the third expected value rule. |
E(b) = b where b is a constant |
|
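The three expected value rules can be verified together on a small joint distribution. A quick numeric sketch (the toy distribution and the constant b = 7 are assumptions of mine):

```python
# Numeric check of the three expected value rules on an assumed
# toy joint distribution of (X, Y, Z); probabilities sum to 1.

outcomes = [  # (x, y, z, probability)
    (1, 2, 3, 0.2),
    (2, 0, 1, 0.5),
    (4, 1, 2, 0.3),
]

def E(f):
    """E[f(X, Y, Z)] under the joint distribution above."""
    return sum(f(x, y, z) * p for x, y, z, p in outcomes)

b = 7.0

# Rule 1: E(X + Y + Z) = E(X) + E(Y) + E(Z)
rule1 = (E(lambda x, y, z: x + y + z),
         E(lambda x, y, z: x) + E(lambda x, y, z: y) + E(lambda x, y, z: z))
# Rule 2: E(bX) = b E(X)
rule2 = (E(lambda x, y, z: b * x), b * E(lambda x, y, z: x))
# Rule 3: E(b) = b
rule3 = (E(lambda x, y, z: b), b)
```

Note that rule 1 needs no independence assumption: linearity of expectation holds for any joint distribution.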
Prove expected value rule 2. |
E(bX) = Σ b Xi pi = b Σ Xi pi = b E(X) |
|
Give the formula for the population mean of a discrete random variable. |
μx = Σ xi pi |
|
Give the formula for the population variance of a discrete random variable. |
σx^2 = Σ (xi - μ)^2 pi |
|
Give another, often easier to calculate, formula for the population variance of a discrete random variable. |
σx^2 = E(X^2) - μx^2 |
|
Prove σX^2 = E(X^2) - μX^2 |
σX^2 = E{(X - μX)^2} = E(X^2 - 2μX X + μX^2) = E(X^2) - E(2μX X) + E(μX^2) = E(X^2) - 2μX E(X) + μX^2 = E(X^2) - 2μX^2 + μX^2 = E(X^2) - μX^2 |
|
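The equality just proved is easy to confirm numerically. A sketch (the pmf below is an assumed example): the defining formula Σ (xi - μ)^2 pi and the shortcut E(X^2) - μ^2 give the same answer.

```python
# Checking that the definition of the population variance agrees
# with the shortcut formula E(X^2) - mu^2 on an assumed toy pmf.

values = [0, 1, 2, 3]
probs = [0.1, 0.4, 0.3, 0.2]   # probabilities sum to 1

mu = sum(x * p for x, p in zip(values, probs))                 # population mean
var_def = sum((x - mu) ** 2 * p for x, p in zip(values, probs))  # definition
e_x2 = sum(x ** 2 * p for x, p in zip(values, probs))          # E(X^2)
var_shortcut = e_x2 - mu ** 2                                  # shortcut form
```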
What is the expected value of the disturbance term? |
Zero. |
|
Prove that the expected value of the disturbance term is 0. |
X = μ + u, so u = X - μ. Then E(u) = E(X - μ) = E(X) + E(-μ) = μ - μ = 0 |
|
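A numeric sketch of the result above (the pmf is an assumed example): the disturbance u = X - μ averages to zero by construction.

```python
# E(u) = 0 for the disturbance term u = X - mu, checked on an
# assumed toy pmf; probabilities sum to 1.

values = [2, 5, 9]
probs = [0.5, 0.3, 0.2]

mu = sum(x * p for x, p in zip(values, probs))          # population mean
e_u = sum((x - mu) * p for x, p in zip(values, probs))  # E(X - mu)
```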
The population variance of the disturbance term is equal to... |
...the population variance of the random variable. |
|
Prove that σX^2 = σu^2 |
σX^2 = E{(X - μX)^2} = E(u^2), since u = X - μX. σu^2 = E{(u - μu)^2} = E(u^2), since μu = 0. Hence σX^2 = σu^2. |
|
What is a probability density function? |
A function f(X) describing the relative likelihood of a continuous random variable taking each value; the probability that X falls in a given interval is the integral of f(X) over that interval. |
|
Give the formula for the population covariance of two random variables. |
cov(X,Y) = σXY = E{(X - μX)(Y - μY)} |
|
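The covariance formula can be applied directly to a small joint distribution. A minimal sketch (the joint pmf is an assumed example):

```python
# Population covariance sigma_XY = E[(X - mu_X)(Y - mu_Y)],
# computed from an assumed toy joint pmf; probabilities sum to 1.

joint = [  # (x, y, probability)
    (1, 1, 0.4),
    (1, 3, 0.1),
    (2, 1, 0.1),
    (2, 3, 0.4),
]

mu_x = sum(x * p for x, y, p in joint)   # E(X)
mu_y = sum(y * p for x, y, p in joint)   # E(Y)
cov_xy = sum((x - mu_x) * (y - mu_y) * p for x, y, p in joint)
e_xy = sum(x * y * p for x, y, p in joint)   # E(XY), for the shortcut form
```

As with the variance, there is a shortcut form: cov(X, Y) = E(XY) - μX μY, which the test below also checks.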
Give the formula for the expected value of a continuous random variable. |
E(X) = ∫ X f(X) dx where f(X) is the probability density function and the integration is performed over the range for which X is defined. |
|
Give the formula for the expected value of a function of a continuous random variable. |
E[g(X)] = ∫ g(X) f(X) dx |
|
Give the formula for the population variance of a continuous random variable. |
σX^2 = E{(X - μx)^2} = ∫ (X - μx)^2 f(X) dx |
|
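The continuous-case formulas above can be checked by numerical integration. A sketch, assuming the uniform density on [0, 1] (f(x) = 1), for which E(X) = 1/2 and σX^2 = 1/12 are known closed-form answers; the midpoint rule is my choice of quadrature, not from the cards.

```python
# Approximating E(X) and var(X) for a continuous random variable by
# integrating against its density with a simple midpoint rule.

def integrate(func, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of func over [a, b]."""
    h = (b - a) / n
    return sum(func(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 1.0  # assumed density: uniform on [0, 1]

mu = integrate(lambda x: x * f(x), 0.0, 1.0)              # E(X) = 1/2
var = integrate(lambda x: (x - mu) ** 2 * f(x), 0.0, 1.0)  # var(X) = 1/12
```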
What is the difference between a measure of association and a measure of correlation? |
A measure of association (such as the covariance) depends on the units in which the variables are measured; a measure of correlation controls for the unit of measurement so as to give a comparable level of relatedness (e.g. normalised between -1 and 1). |
|
How do you establish the independence of two random variables? |
X and Y are independent if E[g(X)h(Y)] = E[g(X)] E[h(Y)] for any functions g and h. It follows that, for independent variables, cov(X, Y) = E{(X - μX)(Y - μY)} = E(X - μX) E(Y - μY) = [E(X) - μX][E(Y) - μY] = 0 × 0 = 0. |
|
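The factorisation property can be seen numerically when the joint probabilities are the product of the marginals. A sketch (the marginals and the functions g, h are assumed examples):

```python
# When p(x, y) = p(x) p(y) (independence), E[g(X)h(Y)] factorises
# as E[g(X)] E[h(Y)] for any functions g and h. Marginals assumed.

px = {0: 0.3, 1: 0.7}   # assumed marginal pmf of X
py = {1: 0.6, 4: 0.4}   # assumed marginal pmf of Y

g = lambda x: x ** 2 + 1
h = lambda y: 2 * y

# E[g(X)h(Y)] under the product (independent) joint distribution
e_gh = sum(g(x) * h(y) * px[x] * py[y] for x in px for y in py)
e_g = sum(g(x) * p for x, p in px.items())   # E[g(X)]
e_h = sum(h(y) * p for y, p in py.items())   # E[h(Y)]
```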
Give covariance rule 1. |
If Y = V + W, cov(X, Y) = cov(X, V) + cov(X, W) |
|
Give covariance rule 2. |
If Y = bZ, where b is a constant and Z is a variable, cov(X, Y) = b cov(X, Z) |
|
Give covariance rule 3. |
If Y = b, where b is a constant, cov(X, Y) = 0 |
|
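The three covariance rules can be checked together on one joint distribution. A numeric sketch (the toy joint pmf and the constant b = 3 are assumptions of mine), computing every covariance directly from the definition:

```python
# Numeric check of the three covariance rules, with cov computed
# from the definition E[(f - mu_f)(g - mu_g)]. Joint pmf assumed.

joint = [  # (x, v, w, probability); probabilities sum to 1
    (1, 0, 2, 0.25),
    (2, 1, 1, 0.25),
    (3, 1, 0, 0.30),
    (4, 2, 2, 0.20),
]

def E(f):
    return sum(f(x, v, w) * p for x, v, w, p in joint)

def cov(f, g):
    mf, mg = E(f), E(g)
    return E(lambda x, v, w: (f(x, v, w) - mf) * (g(x, v, w) - mg))

X = lambda x, v, w: x
V = lambda x, v, w: v
W = lambda x, v, w: w
b = 3.0

# Rule 1: cov(X, V + W) = cov(X, V) + cov(X, W)
r1 = (cov(X, lambda x, v, w: v + w), cov(X, V) + cov(X, W))
# Rule 2: cov(X, bV) = b cov(X, V)
r2 = (cov(X, lambda x, v, w: b * v), b * cov(X, V))
# Rule 3: cov(X, b) = 0
r3 = cov(X, lambda x, v, w: b)
```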
Prove covariance rule 1. |
If Y = V + W, μY = μV + μW. cov(X, Y) = E{(X - μX)(Y - μY)} = E{(X - μX)([V + W] - [μV + μW])} = E{(X - μX)(V - μV) + (X - μX)(W - μW)} = E{(X - μX)(V - μV)} + E{(X - μX)(W - μW)} = cov(X, V) + cov(X, W) |
|
Prove covariance rule 2. |
If Y = bZ, μY = bμZ. cov(X, Y) = E{(X - μX)(Y - μY)} = E{(X - μX)(bZ - bμZ)} = b E{(X - μX)(Z - μZ)} = b cov(X, Z) |
|
Prove covariance rule 3. |
If Y = b, μY = b. cov(X, Y) = E{(X - μX)(Y - μY)} = E{(X - μX)(b - b)} = E{0} = 0 |
|
Give variance rule 1. |
If Y = V + W, var(Y) = var(V) + var(W) + 2 cov(V, W) |
|
Give variance rule 2. |
If Y = bZ, var(Y) = b^2 var(Z) |
|
Give variance rule 3. |
If Y = b, where b is a constant, var(Y) = 0 |
|
Give variance rule 4. |
If Y = V + b, where b is a constant, var(Y) = var(V) |
|
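The four variance rules can likewise be checked on one small joint distribution. A numeric sketch (the toy joint pmf and the constant b = 2.5 are assumed), with var and cov computed from their definitions:

```python
# Numeric check of the four variance rules on an assumed toy
# joint pmf of (V, W); probabilities sum to 1.

joint = [  # (v, w, probability)
    (1, 4, 0.2),
    (2, 2, 0.5),
    (5, 1, 0.3),
]

def E(f):
    return sum(f(v, w) * p for v, w, p in joint)

def var(f):
    m = E(f)
    return E(lambda v, w: (f(v, w) - m) ** 2)

def cov(f, g):
    mf, mg = E(f), E(g)
    return E(lambda v, w: (f(v, w) - mf) * (g(v, w) - mg))

V = lambda v, w: v
W = lambda v, w: w
b = 2.5

r1 = (var(lambda v, w: v + w), var(V) + var(W) + 2 * cov(V, W))  # rule 1
r2 = (var(lambda v, w: b * v), b ** 2 * var(V))                  # rule 2
r3 = var(lambda v, w: b)                                         # rule 3: 0
r4 = (var(lambda v, w: v + b), var(V))                           # rule 4
```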
Prove variance rule 1. |
if Y = V + W, var(Y) = cov(Y, Y) = cov(Y, [V+W]) = cov(Y, V) + cov(Y, W) = cov([V+W], V) + cov([V+W], W) = cov(V, V) + 2cov(V, W) + cov(W, W) = var(V) + var(W) + 2cov(V, W) |
|
Prove variance rule 2. |
if Y = bZ, where b is a constant, var(Y) = cov(Y, Y) = cov(bZ, Y) = b cov(Z, Y) = b cov(Z, bZ) = b^2 cov(Z, Z) = b^2 var(Z) |
|
Prove variance rule 3. |
if Y = b, where b is a constant, μb = b, so b - μb = 0 and var(Y) = cov(b, b) = E{(b - μb)^2} = 0 |
|
Prove variance rule 4. |
if Y = V + b, where b is a constant, var(Y) = var(V + b) = var(V) + var(b) + 2 cov(V, b) = var(V) + 0 + 0 = var(V) |
|
Give the formula for the population correlation coefficient. |
ρXY = σXY / √(σX^2 σY^2) |
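A minimal sketch of the formula above (the joint pmf is an assumed example): computing ρXY from the definitions and confirming it is normalised between -1 and 1.

```python
# Population correlation coefficient rho_XY = sigma_XY / sqrt(sigma_X^2 sigma_Y^2),
# computed on an assumed toy joint pmf; probabilities sum to 1.
import math

joint = [  # (x, y, probability)
    (0, 0, 0.4),
    (1, 2, 0.3),
    (2, 5, 0.3),
]

def E(f):
    return sum(f(x, y) * p for x, y, p in joint)

mu_x, mu_y = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - mu_x) ** 2)
var_y = E(lambda x, y: (y - mu_y) ** 2)
cov_xy = E(lambda x, y: (x - mu_x) * (y - mu_y))
rho = cov_xy / math.sqrt(var_x * var_y)   # normalised between -1 and 1
```

Dividing the covariance by the product of the standard deviations strips out the units of X and Y, which is exactly what distinguishes correlation from mere association.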