88 Cards in this Set
 Front
 Back
Expected Value (definition)

the process of averaging when a random variable is involved


Expected Value EX[g(x)] (continuous equation)

EX[g(x)]
= ∫(g(x)*f(x),x,−∞,∞)

Expected Value EX[g(x)] (discrete equation)

EX[g(x)]
= ∑(g(xi)*P(xi),all i)

Mean

same as the expected value but with g(X)=X.
1) the mean is the weighted average of all xi values 2) the mean is the center of gravity of the probability density function

different notations for mean

E[X]
= μX = X̄

Mean (discrete equation)

∑(xi*PX(xi),all i)


Mean (continuous equation)

∫(x*fX(x),x,−∞,∞)


Variance (definition)

the spread of a r.v. around its mean value; the average squared deviation from the mean. A larger variance means a larger spread; a smaller variance means values cluster closer to the mean.


Variance (notation)

var[X]
=sigmaX^2 

Variance (discrete equation)

= E[(X−μX)²]
= ∑((xi−μX)²*P(xi),all i)

4 properties of variance

1) variance is a measure of the spread about the mean value
2) variance is non-negative
3) variance is the moment of inertia of the p.d.f. about the mean
4) if σX²=0 then P(X=μX)=1 (ie X is a constant)

If the probability density function is even-symmetric about x=a

then μX=a since f(a−x)=f(a+x)


example: random variable X is uniformly distributed on the interval [a,b]
find: mean and variance of X
mean:
μX = E[X] = ∫(x*(1/(b−a)),x,a,b) = (a+b)/2 (the midpoint)
variance:
var[X] = σX² = E[(X−μX)²] = ∫((x−μX)²*(1/(b−a)),x,a,b) = (1/12)*(b−a)²
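The uniform-distribution results above can be sanity-checked by Monte Carlo; a minimal Python sketch, where a and b are arbitrary example endpoints (not from the card):

```python
import random

random.seed(1)
a, b = 2.0, 10.0           # example endpoints, assumed for illustration
n = 200_000
samples = [random.uniform(a, b) for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

# theory: mean = (a+b)/2, variance = (b-a)^2 / 12
assert abs(mean - (a + b) / 2) < 0.05
assert abs(var - (b - a) ** 2 / 12) < 0.2
```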

example ***
you are playing a game with a friend, throwing a pair of dice. if the sum of the up faces is a prime number you win the corresponding number of dollars; if it is not a prime number you lose the corresponding number of dollars. how much do you expect to win or lose?
X={2,3,4,...,12}
E[X] = ∑(xi*P(xi),all i)
= (1/36)*2+(2/36)*3−(3/36)*4+(4/36)*5−(5/36)*6+(6/36)*7−(5/36)*8−(4/36)*9−(3/36)*10+(2/36)*11−(1/36)*12
= −1.89 → LOSE
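The alternating signs above are easy to drop; enumerating all 36 equally likely outcomes in Python confirms the expectation (the prime sums are 2, 3, 5, 7, 11):

```python
from fractions import Fraction

primes = {2, 3, 5, 7, 11}
ev = Fraction(0)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        s = d1 + d2
        # win $s on a prime sum, lose $s otherwise
        ev += Fraction(1, 36) * (s if s in primes else -s)

print(float(ev))  # about -1.89 per game: LOSE
```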

example:
given X={2,3,5,6}, f(x)=0.2*δ(x−2)+0.1*δ(x−3)+0.4*δ(x−5)+0.3*δ(x−6)
find: mean and variance
X̄ = ∑(xi*P(xi),all i) = 2*0.2+3*0.1+5*0.4+6*0.3 = 4.5
σ² = E[(X−4.5)²] = ∑((xi−4.5)²*P(xi),all i)
= (2−4.5)²*0.2+(3−4.5)²*0.1+(5−4.5)²*0.4+(6−4.5)²*0.3 = 2.25
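The same sums in Python, straight from the pmf on the card:

```python
xs = [2, 3, 5, 6]
ps = [0.2, 0.1, 0.4, 0.3]

mean = sum(x * p for x, p in zip(xs, ps))
var = sum((x - mean) ** 2 * p for x, p in zip(xs, ps))

print(round(mean, 6), round(var, 6))  # 4.5 2.25
```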

mode (definition)

the value of X where f(x) is maximum. for the normal distribution the mode and the mean are the same quantity


median (definition)

the value xm such that:
FX(xm) = P(X≤xm) = ∫(f(x),x,−∞,xm) = 1/2

Standard Deviation (def and eqn)

positive square root of variance of random variable
sdv[X]=σX=√(σX²) 

moments (around origin)

mk = E[X^k]
= ∫(x^k*f(x),x,−∞,∞) (continuous)
= ∑(xi^k*P(xi),all i) (discrete)

central moments (definition)

moment around mean values


Central Moments (equation)

λk = E[(X−μX)^k]
= ∫((x−μX)^k*f(x),x,−∞,∞) (continuous)
= ∑((xi−μX)^k*P(xi),all i) (discrete)

m1

m1 = μX = mean
the first moment (about the origin) is the mean

m2

E[X²]
=average power =mean squared value 

λ2

=σX²
=variance 

λ3

=E[(X−μX)³]
which is a measure of the asymmetry of f(x) about mean. 

λ3/σX³

normalized third central moment is called skewness or coefficient of skewness


λ3 if p.d.f. is symmetric about x=μX

λ3=0


EX[c]
where c is a constant 
EX[c]=c


E[X1+X2]

E[X1+X2]
=E[X1]+E[X2] 

when does
E[X1*X2] = E[X1]*E[X2]
for two independent r.v. X1 and X2


E[X1*X2] for two independent r.v. X1 and X2

E[X1*X2]=E[X1]*E[X2]


under what conditions does
E[(X1+X2)²]=E[X1²]+E[X2²] 
when they are both independent random variables whose mean values are zero


variance (another equation)

var[X]
= σX² = E[(X−μX)²] = E[X²]−μX²

example
consider two random variables X and Y with the relation Y=aX+b. find: mean and variance of Y in terms of the mean and variance of X
mean:
μY = E[Y] = E[aX+b] = E[aX]+E[b] = a*μX+b
variance:
σY² = E[Y²]−μY²
with E[Y²] = E[(aX+b)²] = a²E[X²]+2abE[X]+b² and μY as above
thus σY² = (a²E[X²]+2abE[X]+b²)−(a*μX+b)² = a²*σX²
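The identities μY = a*μX+b and σY² = a²*σX² can be checked on any small discrete distribution; a Python sketch, where xs, ps, a, b are arbitrary example values:

```python
xs = [2, 3, 5, 6]
ps = [0.2, 0.1, 0.4, 0.3]
a, b = 3.0, -1.0            # example constants for Y = aX + b

mu_x = sum(x * p for x, p in zip(xs, ps))
var_x = sum((x - mu_x) ** 2 * p for x, p in zip(xs, ps))

ys = [a * x + b for x in xs]            # Y inherits the probabilities of X
mu_y = sum(y * p for y, p in zip(ys, ps))
var_y = sum((y - mu_y) ** 2 * p for y, p in zip(ys, ps))

assert abs(mu_y - (a * mu_x + b)) < 1e-9   # mean shifts and scales
assert abs(var_y - a ** 2 * var_x) < 1e-9  # variance only scales, by a^2
```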

consider bernoulli r.v. X which takes the two values 0 or 1 with: P(X=1)=p and P(X=0)=q
find: mean and variance
mean:
μX = E[X] = ∑(xi*P(xi),all i) = 1*p+0*q = p
variance:
σX² = E[X²]−μX² with second moment m2 = E[X²] = ∑(xi²*P(xi),all i) = 1²*p+0²*q = p
thus σX² = p−p² = p*(1−p) = p*q

example:
for binomial random variable X calculate the mean and variance [recall P(X=k)=(n,k)*p^k*q^(n−k) with k=0,1,...,n]
mean:
μX = E[X] = ∑(k*PX(k),k,0,n)
= ∑(k*(n,k)*p^k*q^(n−k),k,0,n)
= ∑(k*{n!/[(n−k)!*k!]}*p^k*q^(n−k),k,0,n)
= n*p*∑({(n−1)!/[(n−k)!*(k−1)!]}*p^(k−1)*q^(n−k),k,1,n)
= n*p*∑((n−1,k−1)*p^(k−1)*q^(n−k),k,1,n)
= n*p*(p+q)^(n−1) = n*p
variance:
m2 = E[X²] = ∑(k²*(n,k)*p^k*q^(n−k),k,0,n) = n*(n−1)*p²+n*p
σX² = m2−m1² = [n*(n−1)*p²+n*p]−[n*p]² = n*p*(1−p) = n*p*q

mean and variance of binomial random variable X

mean=np
variance=npq 

example:
a fair coin is flipped ten times. what are the mean and variance of the number of heads?
mean=5
variance=2.5 
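Summing the binomial pmf directly confirms both the general formulas and the coin answer; n = 10, p = 1/2 matches the fair-coin card:

```python
from math import comb

n, p = 10, 0.5
q = 1 - p
pmf = [comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]

mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum(k**2 * pk for k, pk in enumerate(pmf)) - mean**2

assert abs(mean - n * p) < 1e-9       # np  = 5
assert abs(var - n * p * q) < 1e-9    # npq = 2.5
```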

under what conditions would we use the moments-of-normal formulas

for zero mean gaussian random variable X or for gaussian random variable X with mean μX


moments of normal equation
for r.v. X=N(0,σ) (ie zero mean gaussian r.v. X) 
nth moment mn =
when n is even: (1)(3)…(n−1)*σX^n
when n is odd: 0

moments of normal equation for gaussian random variable X with mean μX and variance σX²

λn = E[(X−μX)^n]
when n is even: (1)(3)…(n−1)*σX^n
when n is odd: 0
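The even/odd pattern above can be verified by numerically integrating x^n against the gaussian density; a sketch assuming an arbitrary example σ:

```python
from math import exp, pi, sqrt

sigma = 1.5                 # example standard deviation

def moment(n, lo=-12.0, hi=12.0, steps=200_000):
    """Midpoint-rule approximation of E[X^n] for X ~ N(0, sigma^2)."""
    dx = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        x = lo + (i + 0.5) * dx
        pdf = exp(-x * x / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))
        total += x ** n * pdf * dx
    return total

# odd moments vanish; even moments equal 1*3*...*(n-1) * sigma^n
assert abs(moment(3)) < 1e-6
assert abs(moment(4) - 3 * sigma ** 4) < 1e-3
```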

Chebychev's Inequality
upper bound
for any arbitrary density function f(x):
P(|X−μX|>k) ≤ σX²/k²
which gives an upper bound on the probability of X deviating from the mean by more than k

Chebychev's Inequality
lower bound
for any arbitrary density function f(x):
P(|X−μX|≤k) ≥ 1−σX²/k², ie 1−P(|X−μX|>k) ≥ 1−σX²/k²
which gives a lower bound on the probability of X lying within k of the mean

example
consider r.v. X with σX²=3. 1) find an upper bound on the probability of X deviating from its mean by more than 2. 2) find a lower bound on the probability of X being within 2 of its mean.
1) P(|X−μX|>2) ≤ σX²/2² → P(|X−μX|>2) ≤ 3/4
2) P(|X−μX|≤2) ≥ 1−σX²/2² → P(|X−μX|≤2) ≥ 1/4
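A concrete check: a uniform r.v. on [−3, 3] has mean 0 and variance 3, matching the card, so its tail probabilities must respect both bounds; a Monte Carlo sketch:

```python
import random

random.seed(7)
n = 100_000
samples = [random.uniform(-3, 3) for _ in range(n)]   # mean 0, variance 3

tail = sum(1 for x in samples if abs(x) > 2) / n      # true value is 1/3

assert tail <= 3 / 4         # upper bound: P(|X - mu| > 2) <= 3/4
assert 1 - tail >= 1 / 4     # lower bound: P(|X - mu| <= 2) >= 1/4
```

Note how loose Chebyshev is here: the actual tail probability is 1/3, well under the 3/4 bound.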

characteristic function
(symbol and equation) 
ΦX(ω)
= E[exp(jωX)]
= ∫(exp(jωx)*f(x),x,−∞,∞) (X is continuous)
= ∑(exp(jωxi)*PX(xi),all xi) (X is discrete)

relation of ΦX(ω) to the probability density function

ΦX(ω) is the "fourier transform" of the probability density function with a sign change on ω:
FX(ω) = F.T.{fX(x)} = ΦX(−ω)

inversion formula (uses)

recovers the p.d.f. from the characteristic function


inversion formula (eqn)

fX(x) = 1/(2*π) * ∫(ΦX(ω)*exp(−jωx),ω,−∞,∞)


Uses of characteristic eqn

1) determination of higher-order moments
2) determination of densities of sums of independent random variables

ΦX(0)

ΦX(0)=1


|ΦX(ω)|

|ΦX(ω)|≤1


euler’s eqn

exp(jωx)=cos(ωx)+j*sin(ωx), so |exp(jωx)|=√(cos²ωx + sin²ωx)=1


if X and Y are two independent r.v. and Z=X+Y then ΦZ(ω)=?

ΦZ(ω)=ΦX(ω)*ΦY(ω)


if Y=aX+b
ΦY(ω)=? 
ΦY(ω)=exp(jωb)ΦX(aω)


under what conditions will
ΦZ(ω)=ΦX(ω)*ΦY(ω) 
Z=X+Y
with X and Y as independent random variables 

moments in terms of characteristic function:
mn=? 
mn = E[X^n] = (1/j^n)*nth_derivative(ΦX(ω),ω)|ω=0


j^n*E[X^n]=?

nth_derivative(ΦX(ω),ω)|ω=0


example:
calculate the c.f. for poisson r.v. X; using the c.f., find the mean and variance of X [recall the poisson pmf: P(X=k)=exp(−b)*b^k/k! with k=0,1,...]
characteristic function:
ΦX(ω) = E[exp(jωX)]
= ∑(exp(jωk)*exp(−b)*b^k/k!,k,0,∞)
= exp[b*(exp(jω)−1)]
mean:
E[X] = (1/j)*∂(ΦX(ω),ω)|ω=0 = b
variance:
E[X²] = (1/j²)*2nd_derivative(ΦX(ω),ω)|ω=0 = b+b²
thus σX² = b+b²−b² = b
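The results E[X]=b and σX²=b can be confirmed by summing the Poisson pmf directly; b is an arbitrary example rate:

```python
from math import exp, factorial

b = 2.5                      # example Poisson parameter
pmf = [exp(-b) * b ** k / factorial(k) for k in range(60)]  # tail is negligible

mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum(k ** 2 * pk for k, pk in enumerate(pmf)) - mean ** 2

assert abs(mean - b) < 1e-9
assert abs(var - b) < 1e-9
```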

example:
find c.f. for standard normal r.v. [ie N(0,1)] 
ΦX(ω)
= E[exp(jωX)]
= ∫(exp(jωx)*f(x),x,−∞,∞)
= (1/√(2*π))*∫(exp(jωx)*exp(−x²/2),x,−∞,∞)
= exp(−ω²/2)
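Since the standard normal density is even, the imaginary part of the integral vanishes and only the cosine term survives; a numerical check that the integral really equals exp(−ω²/2):

```python
from math import cos, exp, pi, sqrt

def cf(w, lo=-10.0, hi=10.0, steps=100_000):
    """Midpoint-rule value of E[cos(wX)] for X ~ N(0,1), the real part of the c.f."""
    dx = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        x = lo + (i + 0.5) * dx
        total += cos(w * x) * exp(-x * x / 2) / sqrt(2 * pi) * dx
    return total

assert abs(cf(1.0) - exp(-0.5)) < 1e-6
assert abs(cf(2.0) - exp(-2.0)) < 1e-6
```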

example:
find c.f. for normal r.v. with mean μ and variance σ² by using the transformation Y=σX+μ
c.f. of Y is:
ΦY(ω) = exp(jωμ)*ΦX(σω) = exp(jωμ)*exp(−σ²ω²/2) = exp(jωμ−ω²σ²/2)

example:
find mean and variance of Y=σX+μ (with X standard normal) using ΦY(ω)
E[Y]
= (1/j)*∂(ΦY(ω),ω)|ω=0 = (1/j)*(jμ−σ²ω)*ΦY(ω)|ω=0 = μ
E[Y²] = (1/j²)*2nd_derivative(ΦY(ω),ω)|ω=0 = σ²+μ²
thus σY² = E[Y²]−{E[Y]}² = σ²+μ²−μ² = σ²

exercise:
find the c.f. for a geometric r.v. with the following probability mass function; calculate mean and variance using the c.f. geometric probability: P(X=k)=p*q^(k−1), k=1,2,...
ΦX(ω)
= ∑(exp(jωk)*p*q^(k−1),k,1,∞)
= p*exp(jω)/(1−q*exp(jω)) since |q*exp(jω)|<1
E[X] = (1/j)*∂(ΦX(ω),ω)|ω=0 = 1/p
σX² = q/p²
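Summing the geometric pmf numerically confirms mean 1/p and variance q/p²; p is an arbitrary example value:

```python
p = 0.3                      # example success probability
q = 1 - p
ks = range(1, 400)           # truncation; q^400 is negligible
pmf = [p * q ** (k - 1) for k in ks]

mean = sum(k * pk for k, pk in zip(ks, pmf))
var = sum(k ** 2 * pk for k, pk in zip(ks, pmf)) - mean ** 2

assert abs(mean - 1 / p) < 1e-9
assert abs(var - q / p ** 2) < 1e-9
```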

expected value of a function of random variables

E[g(x1,…,xn)]
= ∫…∫(g(x1,…,xn)*f(x1,…,xn),x1,−∞,∞,…,xn,−∞,∞)
if g(x1,…,xn) = ∑(ai*xi,i,1,n) then
ḡ = E[g(x1,…,xn)] = ∑(ai*E[xi],i,1,n)
ie the mean value of a weighted sum of r.v.s equals the weighted sum of the mean values

joint moments of two r.v.
mnk=? 
mnk
= E[X^n*Y^k]
= ∫(∫((x^n)*(y^k)*f(x,y),x,−∞,∞),y,−∞,∞)

order of joint moment

n+k


joint moments of two r.v.
m01 
m01 = E[Y] = Ȳ

joint moments of two r.v.
m10 
m10 = E[X] = X̄

joint moments of two r.v.
m11 
m11
= E[XY] = Rxy
= ∫(∫(x*y*f(x,y),x,−∞,∞),y,−∞,∞)

correlation (notation)

m11
= E[XY] = Rxy

uncorrelated (definition)

X and Y are uncorrelated when Rxy = E[X]*E[Y]
(independence is sufficient for this, but not necessary)

orthogonal r.v. (definition)

Rxy=0


if two r.v. are independent they are also ____

if two r.v. are independent they are also uncorrelated. (The converse is not true in general.)


joint central moments (notation)

λnk
= E[(X−μX)^n*(Y−μY)^k]

joint central moments (equation)

λnk
= E[(X−μX)^n*(Y−μY)^k]
= ∫(∫([(x−μX)^n*(y−μY)^k]*f(x,y),x,−∞,∞),y,−∞,∞)

λ20=?
λ02=? 
λ20 = E[(X−μX)²] = σX²
λ02 = E[(Y−μY)²] = σY²

covariance (definition)

the covariance of X and Y is denoted σxy or Cxy and defined as:
Cxy = E[(X−μX)*(Y−μY)] = E[XY]−μX*μY

if Cxy=0

then X and Y are uncorrelated


when X and Y are two independent r.v. the covariance is

Cxy=0 thus the two variables are uncorrelated


λ11

λ11=Cxy


correlation coefficient (definition)

the correlation coefficient ρxy is a measure of the linear dependency of two r.v. X and Y


correlation coefficient
(eqn) 
ρxy=Cxy/(σx*σy)


properties of correlation coefficient (2)

1) |ρxy| ≤ 1
2) if Y=aX+b with a and b as constants, then ρxy=1 if a>0, ρxy=−1 if a<0

relationship of correlation and independence

if two r.v. are independent they are also uncorrelated. However, the converse is not necessarily true.


what does the correlation between r.v. X and Y measure?

measures their tendency to lie in a straight line


example:
consider Y=X² where X is N(0,1). X and Y are dependent. find the covariance.
covariance:
Cxy = E[XY]−μX*μY = E[X³]−0*μY = E[X³] = 0 (because odd moments of a zero-mean gaussian vanish)
thus X and Y are uncorrelated but certainly not independent
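This card is worth simulating: the sample covariance of X and X² hovers near zero even though Y is completely determined by X; a Monte Carlo sketch:

```python
import random

random.seed(3)
n = 200_000
xs = [random.gauss(0, 1) for _ in range(n)]   # X ~ N(0,1)
ys = [x * x for x in xs]                      # Y = X^2, fully dependent on X

mx = sum(xs) / n
my = sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

# Cxy = E[X^3] = 0, so X and Y are uncorrelated despite being dependent
assert abs(cov) < 0.05
```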

conditional expectation
E[g(X)|B]=?
E[g(X)|B]
= ∫(g(x)*f(x|B),x,−∞,∞)
= ∑(g(xi)*P(X=xi|B),all i)

conditional expectation (notation)

E[g(X)|B]
= EX|B[g(X)|B]

conditional mean value

E[X|B]
= ∫(x*f(x|B),x,−∞,∞)
= ∑(xi*P(X=xi|B),all i)