198 Cards in this Set
 Front
 Back
P(A∪B)

P(A) + P(B) - P(A∩B)


P(A∪B) if A and B are disjoint

P(A) + P(B)


example: Draw one card at random from an ordinary deck of playing cards
1) P(Ace or Club) 2) P(Heart or Diamond) 
1) P(Ace ∪ Club)
= P(A) + P(C) - P(A∩C) = 4/52 + 13/52 - 1/52 = 16/52 2) P(H ∪ D) = P(H) + P(D) = 13/52 + 13/52 = 26/52 
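The inclusion-exclusion arithmetic above can be double-checked by brute-force enumeration of a 52-card deck. A quick Python sketch (the deck representation is mine, not from the cards):

```python
from fractions import Fraction

# Build a 52-card deck as (rank, suit) pairs.
ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
suits = ['club', 'diamond', 'heart', 'spade']
deck = [(r, s) for r in ranks for s in suits]

# 1) P(Ace or Club): count cards that are an ace, a club, or both.
ace_or_club = sum(1 for r, s in deck if r == 'A' or s == 'club')
p1 = Fraction(ace_or_club, len(deck))        # 16/52

# 2) P(Heart or Diamond): disjoint suits, so the counts just add.
heart_or_diamond = sum(1 for r, s in deck if s in ('heart', 'diamond'))
p2 = Fraction(heart_or_diamond, len(deck))   # 26/52
```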

P(A')

1 - P(A)


P(A-B)

P(A-B) = P(A∩B')
= P(A) - P(A∩B) 

conditional probability
P(A|B) 
P(A|B)
= P(A∩B)/P(B) 

joint probability
P(A∩B) 
P(A∩B)
= P(A|B)P(B) = P(B|A)P(A) 

total or marginal probability
P(A) 
sum(P(A∩Bi),i,1,n)
= sum(P(A|Bi)P(Bi),i,1,n) 

Bayes' Theorem
P(Bj|A) 
P(Bj|A)
= P(Bj∩A)/P(A) = P(A|Bj)P(Bj)/sum(P(A|Bi)P(Bi),i,1,n) 

Example: Pick a card
1) P(spade) 2) P(spade|black) 3) P(spade|red) 
1) P(spade) = 13/52
2) P(spade|black) = P(black and spade)/P(black) = (13/52)/(26/52) = 1/2 3) P(spade|red) = P(spade and red)/P(red) = 0/(26/52) = 0 

P(A'|B)

P(A'|B)
= 1 - P(A|B) 

P((A∪C)|B)

P((A∪C)|B)
= P(A|B) + P(C|B) - P(A∩C|B) 

disjoint

A∩C = ∅


P(A∩B)

P(A∩B)
= P(A|B)P(B) = P(B|A)P(A) 

Example:
Choose two balls without replacement at random from a box which has 3 white and 2 red balls: 1) prob first ball removed white and second ball red? 
1) P(W1∩R2)
= P(R2|W1)P(W1) = (2/4)*(3/5) = 3/10 
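The chain-rule value (2/4)*(3/5) = 3/10 can be confirmed by enumerating all ordered two-ball draws. A small Python sketch (the box encoding is my own, for illustration):

```python
from fractions import Fraction
from itertools import permutations

# Box with 3 white (W) and 2 red (R) balls; draw two without replacement.
box = ['W', 'W', 'W', 'R', 'R']
draws = list(permutations(box, 2))           # all 5*4 = 20 ordered pairs
favorable = sum(1 for a, b in draws if a == 'W' and b == 'R')
p = Fraction(favorable, len(draws))          # matches (3/5)*(2/4) = 3/10
```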

example:
combined experiment: toss a coin and a die. what is the total sample space? 
S=S1 * S2
={(H,1),(H,2), ..., (H,6),(T,1), ... (T,6)} 

cartesian product:
S1 has n elements S2 has m elements what is the total sample space of S? 
m*n pairs of elements
S = S1 * S2; S is called a Cartesian product 

geometric interpretation:

table to see the possible outcomes of a combined experiment:
      a1     a2     ...   an
b1    a1b1   a2b1   ...   anb1
b2    a1b2   a2b2   ...   anb2
...
bm    a1bm   a2bm   ...   anbm 

example:
number of outcomes if toss a coin 3 times 
2*2*2=8


example:
number of outcomes if you roll a six-sided die three times 
6*6*6 = 216


marginal/total probability
P(A)= 
P(A)
= sum(P(A∩Bi),i,1,n) = sum(P(A|Bi)P(Bi),i,1,n) with Bi∩Bj = ∅, i.e. disjoint 

example:
f=factory, D=defective. given 30% of transistors from f1, 50% of transistors from f2, 20% of transistors from f3, and that 2% of transistors from f1 are D, 4% of transistors from f2 are D, 5% of transistors from f3 are D, what is 1) the probability we have a defective transistor? 2) the probability the defective transistor is from factory 2? 
1) P(D)
= P(D∩f1)+P(D∩f2)+P(D∩f3) = P(D|f1)P(f1) + P(D|f2)P(f2) + P(D|f3)P(f3) = 0.02*0.3 + 0.04*0.5 + 0.05*0.2 = 3.6% 2) P(f2|D) = P(f2∩D)/P(D) = P(D|f2)*P(f2)/P(D) = 0.02/0.036 = 5/9 
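A short Python sketch of the total-probability and Bayes steps on this card (the variable names are mine):

```python
# Total probability and Bayes' theorem for the transistor example.
priors = {'f1': 0.30, 'f2': 0.50, 'f3': 0.20}     # P(fi)
defect = {'f1': 0.02, 'f2': 0.04, 'f3': 0.05}     # P(D|fi)

# P(D) = sum over factories of P(D|fi) P(fi)  ->  0.036
p_d = sum(defect[f] * priors[f] for f in priors)

# P(f2|D) = P(D|f2) P(f2) / P(D)  ->  0.02/0.036 = 5/9
p_f2_given_d = defect['f2'] * priors['f2'] / p_d
```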

example:
choose a fruit (apple or orange) at random from a box chosen at random. P(apple) = ? 
P(apple)
= P(A∩B1)+P(A∩B2)+P(A∩B3) = P(A|B1)P(B1) + P(A|B2)P(B2) + P(A|B3)P(B3) = (3/6)(1/3) + (2/3)(1/3) + (2/6)(1/3) = 1/2 

Bayes' Theorem
P(Bj|A) 
P(Bj|A)
= P(A|Bj)P(Bj)/P(A) 

example:
communication channel (binary) - consider a binary communication channel transmitting data in the form of zeros or ones. define: T1={1 transmitted} T0={0 transmitted} R1={1 received} R0={0 received} given: P(R1|T1)=0.91 P(R0|T0)=0.94 P(T0)=0.45 find: 1) P(R1) 2) P(R0) 3) P(T1|R1) 4) P(T0|R0) 5) P(error) 
1) P(R1)
= P(R1∩T1)+P(R1∩T0) = P(R1|T1)P(T1)+P(R1|T0)P(T0) = 0.5275 2) P(R0) = 1-P(R1) = 0.4725 3) P(T1|R1) = P(T1∩R1)/P(R1) = P(R1|T1)P(T1)/P(R1) = 0.9488 4) P(T0|R0) = P(R0|T0)P(T0)/P(R0) = 0.8952 5) P(error) = P(R1∩T0) + P(R0∩T1) = P(R1|T0)P(T0) + P(R0|T1)P(T1) = 0.0765 
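The five channel quantities follow mechanically from the givens. This Python sketch assumes, as the worked answer implies, that the first given is P(R1|T1)=0.91:

```python
# Binary channel: derive everything from the prior and the two conditionals.
p_t0 = 0.45
p_t1 = 1 - p_t0
p_r1_t1 = 0.91            # assumed reading of the given (consistent with the answer)
p_r0_t0 = 0.94
p_r1_t0 = 1 - p_r0_t0     # crossover probabilities
p_r0_t1 = 1 - p_r1_t1

p_r1 = p_r1_t1 * p_t1 + p_r1_t0 * p_t0        # total probability: 0.5275
p_r0 = 1 - p_r1                               # 0.4725
p_t1_r1 = p_r1_t1 * p_t1 / p_r1               # Bayes: ~0.9488
p_t0_r0 = p_r0_t0 * p_t0 / p_r0               # ~0.8952
p_err = p_r1_t0 * p_t0 + p_r0_t1 * p_t1       # 0.0765
```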

example:
a system consists of two subsystems A1 and A2. Define events: Wi = {Ai working} Wi' = {Ai not working} for i = 1,2. given: P(W1|W2)=0.9 P(W2)=0.8 P(W1)=0.6 find: P(W2|W1') 
P(W2|W1')
= P(W2∩W1')/P(W1') = P(W1'|W2)P(W2)/P(W1') = [1-P(W1|W2)]P(W2)/(1-P(W1)) = (0.1*0.8)/0.4 = 0.2 

Tree Diagrams:

A useful graphical tool for defining the basic outcomes of a combined experiment, particularly when events are dependent.


example:
select a box at random and draw two balls without replacement. given box1 = 4 red, 2 white; box2 = 3 red, 2 white. find the probability both balls are white. 
P(W1∩W2)
= P((B1∩W1∩W2) ∪ (B2∩W1∩W2)) = P(B1∩W1∩W2) + P(B2∩W1∩W2) = P(W2|W1∩B1)P(W1|B1)P(B1) + P(W2|W1∩B2)P(W1|B2)P(B2) = (1/5)(2/6)(1/2) + (1/4)(2/5)(1/2) = 1/12 
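The two-box computation can be sketched in Python by conditioning on the box (the dict encoding is my own):

```python
from fractions import Fraction

# box -> (white balls, total balls); both boxes equally likely.
boxes = {
    'B1': (2, 6),              # 4 red, 2 white
    'B2': (2, 5),              # 3 red, 2 white
}
p_box = Fraction(1, 2)

p_ww = Fraction(0)
for white, total in boxes.values():
    p_w1 = Fraction(white, total)                   # first ball white
    p_w2_given_w1 = Fraction(white - 1, total - 1)  # second white, one white removed
    p_ww += p_w2_given_w1 * p_w1 * p_box            # chain rule term per box
# p_ww should equal 1/12
```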

P(A|B∩C)

P(A|B∩C)
= P(A∩B∩C)/P(B∩C) 

P(A∩B∩C)

P(A∩B∩C)
= P(A|B∩C)P(B∩C) = P(A|B∩C)P(B|C)P(C) 

equations to test independence

1) P(A|B) = P(A)
2) P(A∩B) = P(A)P(B) 

example:
toss a coin twice. P(H2|H1)=? 
P(H2|H1) = P(H2)
independent events 

conditionally independent

P(A∩B|C)
= P(A|C)P(B|C) 

independent events: definition

Two events are called statistically independent if the probability of occurrence of one event is not affected by the occurrence or non-occurrence of the other.


Three events A, B, and C are independent iff:

P(A∩B) = P(A)P(B)
P(A∩C) = P(A)P(C) P(B∩C) = P(B)P(C) P(A∩B∩C) = P(A)P(B)P(C) 

total # eqns required to establish n events are independent

2^n - (n+1)


example:
toss a die events  A ={odd number} B ={even number} C ={one or two} 1) are A and B statistically independent events? 2) are A and C statistically independent events? 
1) P(A∩B) = P(A)P(B)?
P(B)=P(A)=1/2 P(A∩B)=P(∅)=0 P(A∩B) is not equal to P(A)P(B); thus, A and B are not independent 2) (a) P(A∩C) = P(A)P(C)? or (b) P(C|A) = P(C)? P(C)=2/6=1/3 P(A)=1/2 P(C|A)=1/3 P(A∩C)=P({1})=1/6 (a) and (b) are true, so A and C are statistically independent 
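The two independence checks on this card can be verified exactly with fractions. A sketch:

```python
from fractions import Fraction

faces = range(1, 7)
A = {f for f in faces if f % 2 == 1}    # odd number
B = {f for f in faces if f % 2 == 0}    # even number
C = {1, 2}                              # one or two

def p(event):
    """Probability of an event on a fair die."""
    return Fraction(len(event), 6)

ab_independent = p(A & B) == p(A) * p(B)   # disjoint events: 0 != 1/4
ac_independent = p(A & C) == p(A) * p(C)   # 1/6 == (1/2)(1/3)
```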

example:
two relay contacts are activated by a single armature where: A={First contact closed} B={Second contact closed} C={armature is activated}. given P(C)=0.5, P(A|C)=0.9, P(B|C)=0.9, P(A|C')=0.2, P(B|C')=0.1 and A and B are CONDITIONALLY independent. find: 1) P(A∩B|C) 2) P(B) 3) P(A∩B) 
1) P(A∩B|C)
= P(A|C)P(B|C) = 0.81 2) P(B) = P(B|C)P(C)+P(B|C')P(C') = 0.5 3) P(A∩B) = P(A∩B|C)P(C)+P(A∩B|C')P(C') = P(A∩B|C)P(C) + P(A|C')P(B|C')P(C') = 0.415 

if n events are independent of each other, what else can we assume they are independent of?

any one event Ai is independent of any event formed by unions, intersections, and complements of the other events.


Cartesian Product of sets A and B

C=A*B


Number of Elements in Cartesian Product of sets A and B (A has m elements, B has n elements)

# of pairs (ai,bj) = m*n


Permutation: definition

Permutation:
An ORDERED arrangement of k distinct objects taken from n distinct objects without replacement 

Combination: definition

A NON-ORDERED selection of n objects taken k at a time


Difference between Permutation and Combination

Permutation: order matters.
Combination: order does not matter. 

P(n,k)

P(n,k)
= n!/(n-k)! 

P(n,n)

P(n,n)
=n! 

C(n,k)

C(n,k) =
(n choose k) = n!/(k!*(n-k)!) 

Combination: definition

Selection of k distinct objects from a total of n distinct objects when the order of selection is not needed, i.e. the number of ways to divide n objects into two groups of k and n-k elements.


example:
number of ways we can choose two applicants from five different applicants of a1,a2,a3,a4,a5 
C(5,2)
= (5 choose 2) = 5!/(2!3!) = 10 
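Python's math module already implements both counts, so the permutation and combination formulas above can be sanity-checked directly. A quick sketch:

```python
from math import comb, factorial, perm

# P(n,k) = n!/(n-k)!  -- ordered selections
assert perm(5, 2) == factorial(5) // factorial(5 - 2)   # 20

# C(n,k) = n!/(k!(n-k)!) -- unordered selections
n_ways = comb(5, 2)   # choosing 2 applicants out of 5 -> 10
```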

binomial coefficient

(n choose k)
= (n,k) = n!/(k!*(n-k)!) 

binomial expansion

(p+q)^n
= sum((n,k)p^k*q^(n-k),k,0,n) 

generalized combination:
definition 
number of ways to partition n distinct objects into k distinct groups containing n1,n2,...,nk objects respectively, where
sum(ni,i,1,k) = n 

generalized combination:
equation 
C(n,(n1,n2,...,nk))
= n!/(n1!n2!...nk!) 

Bernoulli Trials (aka Binomial Experiment): definition

*n identical subexperiments
*Each subexperiment has two outcomes: success or failure *subexperiments are independent *each subexperiment has identical probabilities 

Bernoulli Trials (aka Binomial Experiment):
equation 
P(success on any trial) = p
P(failure on any trial) = q, with p+q=1 P(number of successes = k) = (n,k)p^k*q^(n-k) 

Bernoulli Trial:
P(k1<=k<=k2) 
Bernoulli Trial:
P(k1<=k<=k2) = sum((n,k)p^k*q^(n-k),k,k1,k2) 

example:
toss a coin three times where P(H)=p and P(T)=q find: 1) P(3H) 2) P(2H and 1T) 3) P(2H at most) 4) P(1H at least) 
1) P(3H)
= (3,3)p^3*q^0 = p^3 2) P(2H and 1T) = P(2H) = (3,2)p^2*q^1 = 3p^2*q 3) P(2H at most) = P(k<=2) = P(0H)+P(1H)+P(2H) = 1-P(3H) = 1-p^3 4) P(1H at least) = P(k>=1) = P(1H)+P(2H)+P(3H) = 1-P(0H) = 1-P(k=0) = 1-(3,0)p^0*q^3 = 1-q^3 

example:
consider a relay is operating 98% of the time under certain conditions. for 10 trials under the same conditions find: 1) P{all trials successful} 2) P{1 failure and 9 success} 3) P{1st 9 success & last fails} 
1) P{all trials successful}
= (10,10)p^10 = 0.98^10 = 0.8171 2) P{1 failure and 9 successes} = (10,9)p^9*q = 0.1667 3) P{1st 9 succeed & last fails} = (0.98)^9*(0.02)^1 = 0.0167 
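A minimal Bernoulli-trials helper reproducing the three relay answers (a sketch; `binom_pmf` is my name, not the course's):

```python
from math import comb

p, q, n = 0.98, 0.02, 10

def binom_pmf(k):
    """P(exactly k successes in n independent trials)."""
    return comb(n, k) * p**k * q**(n - k)

p_all = binom_pmf(10)        # all 10 succeed: 0.98^10
p_one_fail = binom_pmf(9)    # one failure anywhere among the 10 trials
p_last_fails = p**9 * q      # one SPECIFIC ordering: first 9 succeed, last fails
```

Note the difference between parts 2 and 3: part 2 counts all 10 positions the single failure could occupy, so it is exactly 10 times the specific-ordering probability.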

reliability: definition

reliability: definition 
reliability of a system is probability of success of that system. 

elementary systems:
FAi= FAi'= pAi= qAi= R= Q= 
elementary systems:
FAi={Ai open/fails} FAi'={Ai closed/works} pAi=P(FAi') qAi=P(FAi) R=P(success) Q=P(fail) 

elementary systems:
series [A1][A2][A3] R=? Q=? 
R
= P(FA1'∩FA2'∩FA3') = pA1*pA2*pA3 Q = P(FA1 ∪ FA2 ∪ FA3) = 1-R = 1-pA1*pA2*pA3 

elementary systems:
parallel [A1] [A2] [A3] R=? Q=? 
R
= 1-Q = 1-qA1*qA2*qA3 Q = qA1*qA2*qA3 

Real Random Variable: definition

Real Random Variable: definition 
A real, single-valued function defined on a sample space that maps the sample space into a set of real numbers. We represent a random variable by a capital letter and any particular value of the r.v. by a lower-case letter. 

example:
toss a die and define the random variable X as: X(fi)=10i with i=1,2,3,4,5,6 1)P(X=20) 2)P(X=25) 3)P(X<=25) 4)P(15<=X<=35) 
1)P(X=20)
= 1/6 2)P(X=25) = 0 3)P(X<=25) = P(X=10 ∪ X=20) = 1/6 + 1/6 = 2/6 4)P(15<=X<=35) = P(X=20 ∪ X=30) = 2/6 

Discrete Random Variable: definition

Discrete Random Variable: definition 
Can take a discrete (countable) number of values. 

example:
toss 2 coins and let X define the number of heads observed. 1)P(X=1) 2)P(X=0) 3)P(X=2) 
events:
H1H2 (Two heads) H1T2, T1H2 (One head) T1T2 (No heads) 1)P(X=1) = P(H1T2 ∪ T1H2) = 1/4 + 1/4 = 1/2 2)P(X=0) = P(T1T2) = 1/4 3)P(X=2) = P(H1H2) = 1/4 

Continuous Random Variable: definition

Continuous Random Variable: definition 
can take all values within a specified interval. Can NOT result from discrete sample space. 

Probability Mass Function: discrete random variables:
equation 
PX(x)
=P(X=x) =P({Si:X(Si)=x}) 

example:
toss a die and define the random variable X as: X(fi)=10i with i=1,2,3,4,5,6. find the probability mass function 
f(x) = PX(x)
= 1/6 delta(x-10) + 1/6 delta(x-20) + 1/6 delta(x-30) + 1/6 delta(x-40) + 1/6 delta(x-50) + 1/6 delta(x-60) 

Probability Distribution Function (P.D.F)
(aka Cumulative Distribution Function) 
The PDF of the random variable X is defined as:
FX(x) = P(X<=x) = P({Si: X(Si)<=x}). for continuous random variables, the probability of one specific point is zero 

example:
toss a die. X(fi)=10i with i=1,2,3,4,5,6 0) FX(x) 1)FX(0) 2)FX(10) 3)FX(20) 4)FX(30) 5)FX(100) 6)P(x<40) 
0) FX(x) =
1/6*u(x-10) + 1/6*u(x-20) + 1/6*u(x-30) + 1/6*u(x-40) + 1/6*u(x-50) + 1/6*u(x-60) 1)FX(0) = 0 2)FX(10) = P(X<=10) = 1/6 3)FX(20) = P(X<=20) = 2/6 4)FX(30) = P(X<=30) = 3/6 5)FX(100) = P(X<=100) = 6/6 = 1 6) P(X<40) = 3/6 

impulse function/delta function:
1) delta(x) 2) delta(x-a) 
1) delta(x) =
1 when x=0; 0 when x!=0 2) delta(x-a) = 1 when x=a; 0 when x!=a 

unit step function:
1) u(x) 2) u(x-a) 
1) u(x) =
1 when x>=0; 0 when x<0 2) u(x-a) = 1 when x>=a; 0 when x<a 

relation between delta(x) and u(x)

delta(x) = du(x)/dx
and u(x) = integral(delta(z),z,-infinity,x) 

relation between PDF and probability mass function

F(x)=integral(f(x),x)
f(x)=(d/dx)F(x) probability mass function: f(x) probability distribution fu: F(x) 

example:
Toss a coin two times and let X be the number of heads observed 1)find the probability mass fu. 2) find the probability distribution fu 
1) f(x)
= PX(x) = (1/4)delta(x) + (1/4+1/4)delta(x-1) + (1/4)delta(x-2) 2)FX(x) = (1/4)u(x) + (1/4+1/4)u(x-1) + (1/4)u(x-2) 

integral(
f(x)delta(x-x0),x,-infinity,infinity) 
integral(
f(x)delta(x-x0),x,-infinity,infinity) = f(x0) 

short note notation for P(x) and F(x)

PX(x)
= f(x) = sum(P(X=xi)delta(x-xi), all xi) FX(x) = sum(P(X=xi)u(x-xi), all xi<=x) 

Four properties of PDF

1)FX(-infinity)
= P(X<=-infinity) = P(∅) = 0 2)FX(infinity) = P(X<=infinity) = P(S) = 1 3)FX(x) is a nondecreasing fu. of x (nonnegative slope) 4)FX(x)=FX(x+) at points of discontinuity, where F(x+) = lim(F(x+epsilon), epsilon->0) {epsilon>0} 

Probability Measurements from PDF
1)P(x1<X<=x2) 2)P(X=x1) 3)P(x1<=X<=x2) 
1)P(x1<X<=x2)
= FX(x2)-FX(x1) for x1<x2 2)P(X=x1) = F(x1+)-F(x1-) when FX is discontinuous at point x1; = 0 when FX is continuous at point x1 3)P(x1<=X<=x2) = P(X=x1)+P(x1<X<=x2) = FX(x2)-FX(x1-) when FX is discontinuous at point x1; = FX(x2)-FX(x1) when FX is continuous at x1 

Probability Measurements from PDF (for discrete r.v.)
1)P(x1<X<=x2) 2)P(X=x1) 3)P(x1<=X<=x2) 
1)P(x1<X<=x2)
= F(x2)-F(x1) 2)P(X=x1) = FX(x1+)-FX(x1-) 3)P(x1<=X<=x2) = FX(x2)-FX(x1-) 

example:
toss a die where X(fi)=10i where i=1,2,3,4,5,6 1)P(X=30) 2)P(X=25) 3)P(20<X<=40) 4)P(20<=X<=40) 
1)P(X=30)
= FX(30+)-FX(30-) = 3/6 - 2/6 = 1/6 2)P(X=25) = 0 since FX is continuous at 25 3)P(20<X<=40) = FX(40)-FX(20) = 4/6-2/6 = 2/6 4)P(20<=X<=40) = FX(40)-FX(20-) = 4/6-1/6 = 3/6 

probability density function (pdf): eqn

pdf
= fX(x) = dFX(x)/dx = sum(PX(xi)delta(x-xi),i,1,n) 

four properties of pdf

four properties of pdf
1) fX(x)>=0: always a nonnegative fu. since FX has nonnegative slope 2) integral(fX(x),x,-infinity,infinity) = FX(infinity)-FX(-infinity) = 1 3) FX(x) = integral(fX(z),z,-infinity,x) 4) P(x1<=X<=x2) = integral(fX(x),x,x1,x2) = FX(x2)-FX(x1) 

example:
let X be a continuous r.v. with pdf =c*x^2 when 0<=X<=1 =0 elsewhere 1)P(0.5<=X<=0.75) 2)FX(x) 
First find c:
integral(c*x^2,x,0,1) = 1 -> c=3 1)P(0.5<=X<=0.75) = integral(3x^2,x,0.5,0.75) = 0.75^3 - 0.5^3 2)FX(x) = 0 when x<0; = integral(3z^2,z,0,x) = x^3 when 0<=x<=1; = 1 when x>1 
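Once c=3 is known, the CDF on [0,1] is FX(x)=x^3, and interval probabilities can be read straight from it. A sketch:

```python
# Normalize f(x) = c*x^2 on [0,1]: integral = c/3 = 1, so c = 3.
c = 3.0

def F(x):
    """CDF of X: F(x) = x^3 on [0,1]."""
    if x < 0:
        return 0.0
    if x > 1:
        return 1.0
    return x**3

p = F(0.75) - F(0.5)    # P(0.5 <= X <= 0.75) = 0.75^3 - 0.5^3
```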

Binomial distribution:
1)probability mass fu 2)PDF 3)pdf 
X = # successes
n = # trials 1) probability mass fu: P(X=k) = PX(k) = (n,k)p^k*q^(n-k) 2) PDF = FX(x) = sum(PX(k)u(x-k),k,0,n) = sum((n,k)p^k*q^(n-k)*u(x-k),k,0,n) 3) pdf = fX(x) = sum(PX(k)*delta(x-k),k,0,n) 

example:
toss a coin 10 times and find pdf and PDF for #heads 
X = # heads, n = 10
f(x) = sum((10,k)*p^k*q^(10-k)*delta(x-k),k,0,10) F(x) = sum((10,k)*p^k*q^(10-k)*u(x-k),k,0,10) 

Poisson: definition & eqn

If X defines the number of events happening in an interval of length T, then X has a Poisson distribution where:
P(X=k) = PX(k) = e^(-b)*b^k/k! when k=0,1,2,...; = 0 otherwise. b = avg number of events in interval T 

relation binomial and poisson

the binomial prob fu converges to the Poisson when n->infinity and p->0 in such a way that np=b


example:
let X be the # of accidents in one intersection during one week, where the avg # of accidents in one week is reported as 2. 1)write PDF and pdf of X 2)FX(2) 3)P(2<=X<=4) 
1)write PDF and pdf of X
b=2 fX(x) = sum(PX(xi)delta(x-xi), all xi) = sum(e^(-b)*b^k/k!*delta(x-k),k,0,infinity) PDF = FX(x) = sum(e^(-b)*b^k/k!*u(x-k),k,0,infinity) = sum(e^(-2)*2^k/k!*u(x-k),k,0,infinity) 2)FX(2) = sum(e^(-2)*2^k/k!,k,0,2) = 0.677 3)P(2<=X<=4) = sum(e^(-2)*2^k/k!,k,2,4) 
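A Poisson sketch for the accident card with b=2; exact evaluation gives FX(2) = 5e^(-2) ≈ 0.677:

```python
from math import exp, factorial

b = 2.0                                  # average accidents per week

def poisson_pmf(k):
    """P(X = k) for a Poisson r.v. with mean b."""
    return exp(-b) * b**k / factorial(k)

F2 = sum(poisson_pmf(k) for k in range(3))          # FX(2) = P(X <= 2)
p_2_to_4 = sum(poisson_pmf(k) for k in range(2, 5)) # P(2 <= X <= 4)
```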

uniform density function: definition and pdf and PDF eqns

X is equally likely between a and b
pdf: fX(x) = 1/(b-a) when a<=x<=b; = 0 elsewhere. PDF: FX(x) = 0 when x<=a; = (x-a)/(b-a) when a<=x<=b; = 1 when x>=b 

example:
random X uniformly distributed between 2 and 4 1)pdf 2)PDF 
1)fX(x) = 1/(4-2) = 1/2
2)FX(x) = (x-2)/(4-2) = (x-2)/2 

example:
R is a random variable for which all values between 80 and 100 are equally likely. All other values are impossible. Find: 1) P(R between 90 and 95) 2) P(R between 90 and 95 | R between 85 and 95), given f(x)=0.05 when 80<R<100 
1) P(90<R<95)
= (0.05)*(95-90) = 0.25 2) P(90<R<95 | 85<R<95) = P(90<R<95)/P(85<R<95) = [(95-90)*0.05]/[(95-85)*0.05] = 0.5 

exponential pdf

fX(x)
= (1/b)e^(-x/b) when x>=0; = 0 when x<0. FX(x) = 1-e^(-x/b) when x>=0; = 0 when x<0 

applications of exponential pdf

*time between arrivals of planes
*fluctuations in signal strength received by radar from certain types of aircraft 

Normal or Gaussian: eqn

f(x)
= (1/(sqrt(2*Pi)*sigma)) * e^(-(x-mu)^2/(2*sigma^2)) 

Normal or Gaussian: def

1) f(x) is bell shaped and symmetrical around mu
2) f(x) has two parameters: mu=mean sigma=std deviation sigma^2=variance 

Normal or Gaussian: short notation

N(mu,sigma)


Application of Normal or Gaussian:

noise in communication systems


Let X be N(mu,sigma) find:
P(x1<=X<=x2)=? 
P(x1<=X<=x2)
= integral(fX(x),x,x1,x2) = integral([1/sqrt(2*Pi)]*e^(-Z^2/2),Z,Z1,Z2) = FSN(Z2)-FSN(Z1) with Z=(x-mu)/sigma 

Let X be N(mu,sigma) find:
P(x1<=X<=x2)=? using the short way and three properties 
P(x1<=X<=x2)
= FSN(Z2)-FSN(Z1) with Z=(x-mu)/sigma. remember: 1)FSN(-z) = 1-FSN(z) 2)FSN(infinity) = 1 3)FSN(-infinity) = 0 

example:
consider X is N(2,3) find 1) P(1<=X<=5) 2) P(X>=5) 
1) P(1<=X<=5)
= FSN((5-2)/3)-FSN((1-2)/3) = FSN(1)-FSN(-1/3), and from the table = 0.8413-(1-0.6293) = 0.4707 2)P(X>=5) = P(5<=X<infinity) = FSN(infinity)-FSN((5-2)/3) = 1-0.8413 = 0.1587 
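The table lookups above can be replaced by the exact standard-normal CDF via math.erf. (With the exact CDF, part 1 comes out ≈0.472; the card's 0.4707 reflects two-decimal table rounding of FSN(1/3).) A sketch:

```python
from math import erf, sqrt

def fsn(z):
    """Standard normal CDF, expressed through the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 2.0, 3.0                                # X is N(2, 3)
p1 = fsn((5 - mu) / sigma) - fsn((1 - mu) / sigma)  # P(1 <= X <= 5)
p2 = 1 - fsn((5 - mu) / sigma)                      # P(X >= 5)
```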

Joint Probability Mass Fu (Discrete r.v): notation and 2 properties

PXY(x,y) = P(X=x and Y=y)
1)PXY(x,y)>=0 2)sum(sum(PXY(xi,yj), all xi), all yj) = 1 

Joint Distribution Function (JPDF): eqn

A={X<=x} B={Y<=y}
P(A∩B) = FXY(x,y) = P(X<=x and Y<=y) 

Joint Distribution Function (JPDF): 6 properties

1)FXY(-infinity,-infinity) = 0
2)FXY(infinity,infinity) = 1 3)FXY(-infinity,y) = 0, FXY(x,-infinity) = 0 4)FXY(infinity,y) = FY(y), FXY(x,infinity) = FX(x) 5)FXY(x,y) = FXY(x+,y+): continuous toward the higher limit in both arguments 6)nondecreasing in each argument 

example:
find the joint probability density function and joint distribution fu of X and Y associated with the experiment of tossing a pair of dice, where X and Y each represent the number appearing on each die 
PXY(X=i,Y=j)
= PXY(i,j) = 1/36 1)jpdf = fXY(x,y) = sum(sum(PXY(i,j)*delta(x-i)*delta(y-j),i,1,6),j,1,6) 2)JPDF = FXY(m,n) = P(X<=m,Y<=n) = sum(sum(PXY(i,j)*u(m-i)*u(n-j),i,1,6),j,1,6) = m*n/36 

Joint Probability Density Function: (continuous) eqn

fXY(x,y) = [d^2/(dxdy)]FXY(x,y)
where FXY(x,y) = integral(integral(fXY(mu,lambda),mu,-infinity,x),lambda,-infinity,y) 

Joint Probability Density Function: (continuous) properties {2}

1) fXY(x,y) >= 0
always nonnegative 2) integral(integral(fXY(x,y),x,-infinity,infinity),y,-infinity,infinity) = P(S) = 1; the volume under the joint density fu is always equal to one 

Probabilities in terms of jpdf:
1) P(x1<=X<=x2 and y1<=Y<=y2) 2) fXY(x,y) (discrete) 
1) P(x1<=X<=x2 and y1<=Y<=y2)
= integral(integral(fXY(x,y),x,x1,x2),y,y1,y2), i.e. integrating to find volume 2) fXY(x,y) = sum(sum(P(xi,yj)*delta(x-xi)*delta(y-yj),i,1,n),j,1,m) 

example:
consider r.v. X and Y with the following jpdf: fXY(x,y) =kxy when 0<=x<=1 and 0<=y<=1 =0 elsewhere 1)FXY(x,y) 2)P(X<=1/2 and Y<=3/4) 3)P(x<=y) 
first find k:
integral(integral(kxy,x,0,1),y,0,1) = 1 -> k=4 1)FXY(x,y) = integral(integral(4*mu*lambda,mu,0,x),lambda,0,y) = x^2*y^2 2)P(X<=1/2 and Y<=3/4) = FXY(1/2,3/4) = 9/64 3)P(x<=y) = integral(integral(4xy,x,0,y),y,0,1) = 1/2; draw the graph to find the integration limits 

Marginal Distribution Functions:
FX(x)= FY(y)= 
FX(x)=P(X<=x and Y<infinity)
=FXY(x,infinity) FY(y)=P(X<infinity and Y<=y) =FXY(infinity,y) 

Marginal Probability Mass Fu:
PX(xi)= PY(yi)= 
PX(xi)=
sum(PXY(xi,yj),all j) PY(yi)= sum(PXY(xi,yj),all i) 

Marginal Probability Density Fu:
fX(x)= fY(y)= 
fX(x)
= integral(fXY(x,y),y,-infinity,infinity) fY(y) = integral(fXY(x,y),x,-infinity,infinity) 

Marginal Probability Density Fu:
find FX from fX 
FX
= integral(fX(mu),mu,-infinity,x) 

Marginal Probability Density Fu:
find FX from fXY 
FX
= integral(integral(fXY(mu,lambda),lambda,-infinity,infinity),mu,-infinity,x) 

independence of two random variables:
continuous eqns 
independent iff:
fXY(x,y)=fX(x)*fY(y) 

independence of two random variables:
discrete eqns 
independent iff:
PXY(x,y) = PX(x)*PY(y) or FXY(x,y) = FX(x)*FY(y) or f(x|y) = f(x), f(y|x) = f(y) 

example:
fXY(x,y) =8xy when 0<=x<=1 and 0<=y<=x =0 elsewhere 1)P(x<=1/2) 2)f(y) 3)f(x) 4)are X and Y independent? 
1)P(x<=1/2)
= integral(f(x),x,0,1/2) = integral(integral(f(x,y),y,0,x),x,0,1/2) = 1/16 2)f(y) = integral(fXY(x,y),x,y,1) = integral(8xy,x,y,1) = 4y*(1-y^2) 3)f(x) = integral(fXY(x,y),y,0,x) = 4x^3 4)are X and Y independent? NO b/c fXY(x,y) != f(x)f(y) 

example:
X and Y are jointly distributed with FXY(x,y)=(1/6)[x^2*y+x*y^2] 0<=x<=1 0<=y<=2 1)FX(x) 2)FY(y) 3)fXY(x,y) 4)fX(x) 5)fY(y) 6)are X and Y independent? 
1)FX(x)
= FXY(x,2) = (1/3)*[x^2+2x] 2)FY(y) = FXY(1,y) = (1/6)[y+y^2] 3)fXY(x,y) = (d^2/(dxdy))FXY(x,y) = (1/3)(x+y) 4)fX(x) = integral(fXY(x,y),y,0,2) = (2/3)(x+1), or = dFX(x)/dx = (2/3)(x+1) 5)fY(y) = integral(fXY(x,y),x,0,1) = (1/3)(y+1/2) 6)are X and Y independent? NO b/c f(x,y) != f(x)f(y) 

example (discrete):
assume the joint sample space XY has only three possible values: P(1,1)=0.2 P(2,1)=0.3 P(3,3)=0.5 1)jpdf 2)JPDF 3)FX(x) 4)FY(y) 
1)jpdf
= f(x,y) = 0.2*delta(x-1)*delta(y-1) + 0.3*delta(x-2)*delta(y-1) + 0.5*delta(x-3)*delta(y-3) 2)JPDF = F(x,y) = 0.2*u(x-1)*u(y-1) + 0.3*u(x-2)*u(y-1) + 0.5*u(x-3)*u(y-3) 3)FX(x) = FXY(x,infinity) = 0.2*u(x-1) + 0.3*u(x-2) + 0.5*u(x-3) 4)FY(y) = FXY(infinity,y) = 0.2*u(y-1) + 0.3*u(y-1) + 0.5*u(y-3) 

Conditional Distribution Function:
FX(x|B)=? 
FX(x|B)
= P(X<=x and B)/P(B) 

Conditional Distribution Function:
properties {4} 
1)FX(-infinity|B) = 0
2)FX(infinity|B) = 1 3)nondecreasing fu of x 4)FX(x|B) = FX(x+|B) at points of discontinuity 

Conditional Density Function:
fX(x|B)=? 
fX(x|B)
= dFX(x|B)/dx 

Conditional Density Function:
properties {2} 
1) fX(x|B) >= 0
density fu always nonnegative 2)integral(fX(x|B),x,-infinity,infinity) = 1; area under density fu always equal to one 

FX(x|B)
in terms of fX(x|B) 
FX(x|B)
= integral(fX(mu|B),mu,-infinity,x) 

P(x1<=X<=x2|B)
in terms of fX(x|B) 
P(x1<=X<=x2|B)
= integral(fX(x|B),x,x1,x2) 

Conditional Distribution and Density functions:
5 cases of defining event B 
case 1:
B={x1<=X<=x2} case 2: B={X<=x1} case 3: B={y1<=Y<=y2} case 4: B={Y=y} case 5: B={Y<=y} 

case 1:
B={x1<=X<=x2} FX(x|B)=? fX(x|B)=? 
case 1:
B={x1<=X<=x2} FX(x|B) = 0 when x<x1; = [FX(x)-FX(x1)]/P(B) when x1<=x<x2; = 1 when x>=x2. fX(x|B) = fX(x)/P(B) when x1<=x<=x2; = 0 elsewhere. where P(B) = integral(f(x),x,x1,x2) 

case 2:
B={X<=x1} FX(x|B)=? fX(x|B)=? 
case 2:
B={X<=x1} FX(x|B) = F(x)/P(B) when x<x1; = 1 when x>=x1. fX(x|B) = f(x)/P(B) when x<x1; = 0 when x>=x1 

case 3: interval conditioning on Y
B={y1<=Y<=y2} FX(xB)=? fX(xB)=? 
case 3:
B={y1<=Y<=y2} FX(x|B) = P(X<=x and y1<=Y<=y2)/P(B) = [integral(integral(f(mu,y),mu,-infinity,x),y,y1,y2)]/[integral(f(y),y,y1,y2)] fX(x|B) = [integral(f(x,y),y,y1,y2)]/P(B) = [integral(f(x,y),y,y1,y2)]/[integral(f(y),y,y1,y2)] 

case 4: point conditioning
B={Y=y} FX(xB)=? fX(xB)=? 
case 4:
B={Y=y} FX(x|B) = [integral(fXY(mu,y),mu,-infinity,x)]/fY(y) fX(x|B) = fX|Y(x|y) = f(x|Y=y) = f(x,y)/f(y) 

case 5:
B={Y<=y} FX(xB)=? fX(xB)=? 
F(x|B)
= F(x,y)/F(y) f(x|B) = (1/F(y))*(dF(x,y)/dx) 

example:
consider two r.v. with jpdf defined as: fXY(x,y)=k(x+y) with 0<=x<=1 and 0<=y<=2 determine: 1)f(xy=1) 2)F(xy=1) 3)f(yB) where B={1<=Y<=2} 4)f(xy<=1) 5)F(xy<=1) 
first find k:
integral(integral(k*(x+y),x,0,1),y,0,2) = 1 -> k=1/3 1)f(x|y=1) = f(x,1)/f(1), where f(y) = integral((1/3)(x+y),x,0,1) = (1/3)((1/2)+y), so f(x|y) = (x+y)/((1/2)+y) -> f(x|y=1) = (2/3)(x+1) 2)F(x|y=1) = integral(f(mu|y),mu,0,x) = integral([mu+y]/[(1/2)+y],mu,0,x) = [0.5x^2+xy]/[0.5+y] -> F(x|y=1) = [0.5x^2+x]/[1.5] 3)f(y|B) where B={1<=Y<=2} = f(y)/P(1<=Y<=2) = (1/4)+(1/2)y, where P(1<=Y<=2) = integral(f(y),y,1,2) = 2/3 4)f(x|y<=1) = (d/dx)F(x|y<=1) = x+1/2 5)F(x|y<=1) = FXY(x,1)/FY(1), where FXY(x,y) = integral(integral(fXY(mu,lambda),mu,0,x),lambda,0,y) -> F(x|y<=1) = (x^2+x)/2 

how to get marginal from conditional jpdf:
f(x) P(xi) 
X is continuous:
f(x) = integral(f(x|y)f(y),y,-infinity,infinity) X is discrete: P(xi) = sum(P(xi|yj)PY(yj), all j) 

example: let Y=1,2,3 depend on the number of coins tossed. let X be the number of heads. given:
PY(1)=1/4, PY(2)=1/4, PY(3)=1/2, and that the coins are fair coins, also i=0,1,2,3 find PX(i)= 1)PX(0) 2)PX(2) 
1) step 1: find conditionals
P(0|1)=1/2 P(0|2)=1/4 P(0|3)=1/8 step 2: PX(0) = sum(P(0|j)*PY(j),j,1,3) = (1/2)(1/4)+(1/4)(1/4)+(1/8)(1/2) = 1/4 2) step 1: find conditionals P(2|2)=1/4 P(2|3)=3/8 hint: can also do these using the binomial -> P(2|3) = (3,2)(1/2)^2*(1/2)^1 step 2: PX(2) = sum(P(2|j)*PY(j),j,1,3) = 4/16 
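The conditional-to-marginal sum on this card can be computed exactly (a sketch; the function names are mine):

```python
from fractions import Fraction
from math import comb

# Y = number of fair coins tossed; X = number of heads.
p_y = {1: Fraction(1, 4), 2: Fraction(1, 4), 3: Fraction(1, 2)}

def p_x_given_y(k, n):
    """Binomial PMF with p = 1/2: P(X = k | Y = n)."""
    if k > n:
        return Fraction(0)
    return Fraction(comb(n, k), 2**n)

def p_x(k):
    """Marginal: P(X = k) = sum over n of P(X=k|Y=n) P(Y=n)."""
    return sum(p_x_given_y(k, n) * p_y[n] for n in p_y)
```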

random vectors:
(using n=4) 
PDF:
FX1X2X3X4(x1,x2,x3,x4) = P{(X1<=x1, X2<=x2, X3<=x3, X4<=x4)} pdf: fX1...Xn = [d^n/(dx1...dxn)]*[FX1...Xn(x1,...,xn)] fX1X2(x1,x2) = integral(integral(fX1X2X3X4(x1,x2,x3,x4),x3,-infinity,infinity),x4,-infinity,infinity) 

random vectors:
conditional density fX1X2|X3X4(x1,x2|x3,x4) 
fX1X2|X3X4(x1,x2|x3,x4)
= fX1X2X3X4(x1,x2,x3,x4)/fX3X4(x3,x4) 

example:
given F(x1,x2,x3) and f(x1,x2,x3) find marginals F(x1) f(x1) 
F(x1) = F(x1,infinity,infinity)
f(x1) = integral(integral(f(x1,x2,x3),x2,-infinity,infinity),x3,-infinity,infinity) 

example:
given f(x1,x2,x3) find f(x1|x2,x3) 
f(x1|x2,x3)
= f(x1,x2,x3)/f(x2,x3) 

independent random variables:
if A={X<=x} and B={Y<=y} what eqn determines the two are independent? 
FXY(x,y) = FX(x)FY(y)
or fXY(x,y) = fX(x)fY(y), f(x|y) = f(x) 

example:
given X and Y are two independent r.v. and f(x)=2x 0<=x<=1 f(y)=2y 0<=y<=1 1)f(x,y) 
1)f(x,y)
=f(x)f(y) =4xy 0<=x<=1 0<=y<=1 

example:
given X and Y are two independent r.v. and f(x)=2x 0<=x<=1 f(y)=2y 0<=y<=1 2)P{max(x,y)<=1/2} 3)P{min(x,y)<=1/2} 
2)P{max(x,y)<=1/2}
to find max: graph x=y. max(x,y) = x when x>y; = y when x<y. P(max(x,y)<=1/2) = integral(integral(4xy,x,0,1/2),y,0,1/2) = 1/16 3)P{min(x,y)<=1/2} min(x,y) = y when x>y; = x when x<y. P(min(x,y)<=1/2) = 1-P(min(x,y)>1/2) = 1-integral(integral(f(x,y),x),y) = 1-integral(integral(4xy,x,1/2,1),y,1/2,1) = 1-9/16 = 7/16 
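Because X and Y are independent with the common CDF F(t)=t^2 on [0,1], the max and min probabilities also follow directly from F: P(max<=t) = F(t)^2 and P(min<=t) = 1-(1-F(t))^2. A sketch of that shortcut:

```python
# Each variable has pdf f(t) = 2t on [0,1], hence CDF F(t) = t^2.
def F(t):
    return t * t

t = 0.5
p_max = F(t) ** 2                 # P(max(X,Y) <= 1/2) = (1/4)^2 = 1/16
p_min = 1 - (1 - F(t)) ** 2       # P(min(X,Y) <= 1/2) = 1 - (3/4)^2 = 7/16
```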