
31 Cards in this Set

  • Front
  • Back
  • 3rd side (hint)

Random Variable

Random Variable X is any* function X: S → R (from the sample space to the real numbers)



*X must be measurable ({X ≤ t} must belong to F for every t)

Discrete r.v.

A r.v. is discrete if it takes only finitely many or countably infinitely many values x1, x2, ...

Probability Mass Function

p.m.f.


p_X(a) = P(X=a), a ∈ R

Cumulative Distribution Function

c.d.f.


F_X(t) = P(X ≤ t)

Properties of cdf

0 ≤ F_X(t) ≤ 1


F_X(t) is non-decreasing


lim F_X(t) = 0 as t → -∞


lim F_X(t) = 1 as t → +∞


Right-continuity: F_X(t) = F_X(t+0) = lim F_X(t+ε) as ε → 0⁺


but in general F_X(t) ≠ F_X(t-0)

Bounds


Non-decreasing (stairs up)


Right-continuity

pmf vs cdf

cdf as a sum of pmf values: F_X(t) = Σ_{x_i ≤ t} p_X(x_i)


pmf as the jump of the cdf: p_X(t) = F_X(t) - F_X(t-0)
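
A minimal Python sketch of this relationship (the toy pmf is my own example, not from the cards): the cdf is a running sum of the pmf, and the pmf is recovered from the cdf's jumps.

```python
pmf = {1: 0.2, 2: 0.5, 4: 0.3}            # P(X = x) for each support point x

def cdf(t):
    """F_X(t) = P(X <= t): sum of pmf values over support points <= t."""
    return sum(p for x, p in pmf.items() if x <= t)

eps = 1e-9                                 # stand-in for the left limit t - 0
for x in pmf:
    jump = cdf(x) - cdf(x - eps)           # p_X(x) = F_X(x) - F_X(x - 0)
    assert abs(jump - pmf[x]) < 1e-12

print(cdf(2))                              # 0.7 = P(X <= 2)
```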

Bernoulli r.v.

X = 0 or X = 1 (indicator of success)



p_X(a) = P(X=a) = p if a = 1


1-p if a = 0



F_X(a) = 0 for a < 0


1-p for 0 ≤ a < 1


1 for a ≥ 1

Binomial r.v.

A Bernoulli trial repeated n times, independently


Binomial ~ number of successes (1-s)


p_X(a) = P(X=a) = C(n, a) · p^a · (1-p)^(n-a), a = 0, 1, ..., n



Σ_a p_X(a) = 1 (by the binomial theorem)
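
A quick Python check of the last line (the parameters n and p are arbitrary choices of mine): summing the binomial pmf over a = 0..n gives 1, which is the binomial theorem applied to (p + (1-p))^n.

```python
from math import comb

def binom_pmf(a, n, p):
    """C(n, a) * p^a * (1-p)^(n-a), the binomial pmf from the card."""
    return comb(n, a) * p**a * (1 - p)**(n - a)

n, p = 10, 0.3                       # example parameters
total = sum(binom_pmf(a, n, p) for a in range(n + 1))
print(total)                         # ~1.0, i.e. (p + (1-p))^n = 1
```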

Geometric r.v.

Number of Bernoulli trials up to and including the first success



p_X(a) = P(X=a) = p · (1-p)^(a-1), a = 1, 2, ...

Poisson r.v.

Limit of the binomial as n → ∞


with the parameter λ = n·p held fixed



p_X(a) = P(X=a) = λ^a · e^(-λ) / a!, a = 0, 1, 2, ...

Used when the event happens very rarely but the number of experiments is very big
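
A short Python sketch of the limit statement (λ, a, and the values of n are illustrative choices): the Binomial(n, λ/n) pmf approaches the Poisson(λ) pmf as n grows, with λ = n·p held fixed.

```python
from math import comb, exp, factorial

lam, a = 2.0, 3                                # rate and the evaluation point
poisson = lam**a * exp(-lam) / factorial(a)    # Poisson pmf at a

for n in (10, 100, 10_000):
    p = lam / n                                # keep lam = n*p fixed
    binom = comb(n, a) * p**a * (1 - p)**(n - a)
    print(n, binom, poisson)                   # binomial pmf -> Poisson pmf
```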

Conditional distribution

p_{X|Y}(x|y) = P{X=x | Y=y} = P{X=x, Y=y} / P{Y=y}


For a fixed value y this is a pmf: the pmf of the


r.v. {X | Y=y}, which is said to have the conditional distribution
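
A small Python sketch (the joint pmf is a made-up example): the conditional pmf is one "slice" of the joint pmf, renormalized by P{Y=y}.

```python
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}  # P(X=x, Y=y)

def conditional_pmf(y):
    """p_{X|Y}(x|y) = P(X=x, Y=y) / P(Y=y) for each x in the support."""
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)    # P(Y = y)
    return {x: p / p_y for (x, yy), p in joint.items() if yy == y}

print(conditional_pmf(1))    # {0: 0.3/0.7, 1: 0.4/0.7} -- a valid pmf
```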

Expected value

Assume X is a r.v. taking values x1, x2, ... . If Σ |x_i| · p(x_i) < ∞, then the expected value is E(X) = Σ x_i · p(x_i)
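
A minimal Python sketch of the definition (the toy pmf is my own example): E(X) as the probability-weighted sum over the support.

```python
pmf = {-1: 0.25, 0: 0.25, 2: 0.5}               # toy pmf
assert abs(sum(pmf.values()) - 1) < 1e-12       # sanity check: a valid pmf

expected = sum(x * p for x, p in pmf.items())   # E(X) = sum x_i * p(x_i)
print(expected)                                  # -0.25 + 0 + 1.0 = 0.75
```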

Expected values for different distributions

Binomial(n, p): n·p


Bernoulli(p): p


Geometric(p): 1/p


Poisson(λ): λ


Exponential(λ): 1/λ


Normal(μ, σ²): μ

Independent r.v-s

Random variables X and Y are independent if for all values (x, y) the events A := {X=x} and B := {Y=y} are independent

Independence by p.m.f

p_{X,Y}(x, y) = p_X(x) · p_Y(y) for all (x, y)
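
A small Python sketch (the marginals are made up): a joint pmf built as the product of its marginals passes the factorization test at every point.

```python
p_x = {0: 0.4, 1: 0.6}
p_y = {0: 0.3, 1: 0.7}
# Construct a joint pmf that factors, so X and Y are independent by design.
joint = {(x, y): p_x[x] * p_y[y] for x in p_x for y in p_y}

independent = all(
    abs(joint[(x, y)] - p_x[x] * p_y[y]) < 1e-12 for x in p_x for y in p_y
)
print(independent)   # True by construction
```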

Expectation of g(X)

E{Y} = E{g(X)} = Σ g(x_i) · p(x_i)

Linearity of expectation

E{a·X + b·Y} = a·E{X} + b·E{Y}

Total probability rule for expectations

If E_1, ..., E_n form a partition of the sample space, then:


E{X} = E{X|E_1}·P{E_1} + ... + E{X|E_n}·P{E_n}
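
A worked Python check of the rule (the fair-die example is mine): partition a die roll by even/odd faces and recombine the conditional expectations to recover E(X).

```python
faces = [1, 2, 3, 4, 5, 6]                        # fair die, X = face value
e_x = sum(f / 6 for f in faces)                   # E(X) = 3.5

even = [f for f in faces if f % 2 == 0]           # E_1 = {even}, P(E_1) = 1/2
odd = [f for f in faces if f % 2 == 1]            # E_2 = {odd},  P(E_2) = 1/2
# E(X | E_i) is the average over the part (faces are equally likely within it).
total = (sum(even) / len(even)) * 0.5 + (sum(odd) / len(odd)) * 0.5
print(e_x, total)                                 # both 3.5
```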

k_th moment of X

If Σ |x_i|^k · p(x_i) < ∞:



mu_k = E{X^k}

k-th central moment

If Σ |x_i|^k · p(x_i) < ∞:



k-th central moment = E( (X - E(X))^k )

k-th factorial moment of X

If Σ |x_i|^k · p(x_i) < ∞:



mu^(k) = E( X·(X-1)·(X-2)·...·(X-k+1) )

Variance

σ² := Var(X) = E( (X - E(X))² )

A measure of dispersion/spread around the mean

Standard deviation

square root of Var(X)

Properties of variance

E( (X - E(X))^2 ) = E(X^2) - E(X)^2



Var(a·X) = a^2 · Var(X)
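
A short Python sketch verifying both properties on a toy pmf of my choosing.

```python
pmf = {0: 0.2, 1: 0.5, 3: 0.3}                      # toy pmf

def expect(g):
    """E(g(X)) = sum g(x_i) p(x_i) -- the 'expectation of g(X)' card."""
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x)
var = expect(lambda x: (x - mean) ** 2)             # definition of Var(X)
var_shortcut = expect(lambda x: x * x) - mean**2    # E(X^2) - E(X)^2
print(var, var_shortcut)                            # equal

a = 4
var_aX = expect(lambda x: (a * x - a * mean) ** 2)  # Var(aX) directly
print(var_aX, a**2 * var)                           # equal: Var(aX) = a^2 Var(X)
```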

Variances for different distributions

Binomial(n, p): np(1-p)


Bernoulli(p): p(1-p)


Geometric(p): (1-p)/p²


Poisson(λ): λ


Exponential(λ): 1/λ²


Normal(μ, σ²): σ²


Expectation of the product

If X and Y are independent:


E(X·Y) = E(X) · E(Y)

Covariance

Cov(X,Y) = E( (X - E(X)) · (Y - E(Y)) ) = E(X·Y) - E(X)·E(Y)




If Cov(X,Y) ≠ 0:


X and Y are dependent



Cov(X,Y) > 0 means positive correlation:


larger values of X usually correspond to larger values of Y

Properties of Covariance

If X and Y are independent:


Cov(X,Y) = 0



Var(X+Y) = Var(X) + 2·Cov(X,Y) + Var(Y)
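
A small Python sketch (the joint pmf is invented): compute Cov(X,Y) = E(XY) - E(X)E(Y) and check Var(X+Y) = Var(X) + 2·Cov(X,Y) + Var(Y).

```python
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}  # P(X=x, Y=y)

def e(g):
    """E(g(X, Y)) over the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

ex, ey = e(lambda x, y: x), e(lambda x, y: y)
cov = e(lambda x, y: x * y) - ex * ey                # Cov(X, Y)
var_x = e(lambda x, y: x * x) - ex**2
var_y = e(lambda x, y: y * y) - ey**2
var_sum = e(lambda x, y: (x + y) ** 2) - (ex + ey) ** 2
print(var_sum, var_x + 2 * cov + var_y)              # equal
```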

Variance of the sum

If X and Y are independent (more generally, if Cov(X,Y) = 0):



Var(aX+bY) = a^2 · Var(X) + b^2 · Var(Y)

Correlation coefficient

Correlation coefficient between X and Y: Cor(X,Y) = Cov(X,Y) / (σ(X)·σ(Y))

Properties of correlation

|Cor(X, Y)| ≤ 1



Cor(X, Y) = ±1 → Y = aX + b for some constants a ≠ 0, b

(by the Cauchy-Bunyakovsky-Schwarz inequality)
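
A closing Python sketch (reusing the invented joint pmf from the covariance example): the correlation coefficient computed this way lands in [-1, 1], as the Cauchy-Bunyakovsky-Schwarz inequality guarantees.

```python
from math import sqrt

joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}  # P(X=x, Y=y)

def e(g):
    """E(g(X, Y)) over the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

ex, ey = e(lambda x, y: x), e(lambda x, y: y)
cov = e(lambda x, y: x * y) - ex * ey
sd_x = sqrt(e(lambda x, y: x * x) - ex**2)           # sigma(X)
sd_y = sqrt(e(lambda x, y: y * y) - ey**2)           # sigma(Y)
cor = cov / (sd_x * sd_y)                            # Cor(X, Y)
print(cor, abs(cor) <= 1)                            # a value in [-1, 1], True
```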