59 Cards in this Set

As a fish swimming in waters leaves no mark behind,
So a free soul moves in the world unnoticed by others.
What is the formula for the Probability of A given B?
P( A | B ) = P( A and B ) / P( B ).

Given independence of A and B, P( A and B ) = P(A) * P(B), so the formula becomes:

[ P(A=m) * P(B=n) ] / P(B=n) = P(A=m)
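The general rule P(A | B) = P(A and B) / P(B) can be checked on a small finite example. A minimal Python sketch (the dice events A and B are made up for illustration; note they are NOT independent, so P(A | B) is not simply P(A)):

```python
from fractions import Fraction

# Two fair dice; the events are hypothetical examples.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
A = {o for o in outcomes if o[0] == 6}          # first die shows 6
B = {o for o in outcomes if o[0] + o[1] >= 10}  # sum is at least 10

def p(event):
    # probability of an event in an equally likely outcome space
    return Fraction(len(event), len(outcomes))

# General rule: P(A | B) = P(A and B) / P(B)
p_a_given_b = p(A & B) / p(B)
print(p_a_given_b, p(A))  # these differ because A and B are not independent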
What is the multiplication rule for independent events?
P(AB) = P(A)P(B)
What is the formula for the Binomial Distribution?
(n C k) p^k q^(n-k)

Where:

n = number of trials (the size of the overall group)
k = number of successes, or what you are looking for
p = Probability or Rate of success
q = 1 - p, the Probability or Rate of failure
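A quick sketch of that formula using Python's built-in math.comb (the coin-toss numbers are just an illustration):

```python
from math import comb

def binomial_pmf(n, k, p):
    # (n C k) * p^k * q^(n-k), with q = 1 - p
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Illustration: probability of exactly 3 heads in 5 fair coin tosses
print(binomial_pmf(5, 3, 0.5))  # 0.3125
```

As a sanity check, summing the pmf over k = 0..n should give 1.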
What is the binomial expansion?
(p+q)^n = Sum from k=0 to n of (n C k) p^k q^n-k
When is mu = np?
When X has a Binomial (n, p) distribution.
What are the two parameters of the normal curve?
Mu (the mean) and sigma (the standard deviation).
What is the range of x on the normal curve?
(-infinity, infinity)
What is the equation for the normal curve in x? (w/o normalizing the values)
y = ( 1 / (sigma * sqrt(2 pi)) ) * e^( -(1/2) * ((x - mu) / sigma)^2 )

Where mu is the mean and sigma is the SD
What is the equation for the normal curve in z using normalized (standardized) values?
y = ( 1 / sqrt(2 pi) ) * e^( -(1/2) * z^2 )

After standardizing, mu = 0 and sigma = 1, so they drop out of the formula.
What is the formula for standardizing a value?
Z = ( x - mu ) / sigma

sigma = SD

In other words, you standardize a variable so that you can use the Normal Table and values of Phi.

Take your variable, subtract the Mean (mu), then divide by the Standard Deviation (sigma)
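A minimal sketch of standardizing and then looking up Phi, using math.erf in place of a printed Normal Table (the mean 100 / SD 15 numbers are hypothetical):

```python
from math import erf, sqrt

def phi(z):
    # Standard normal CDF: Phi(z) = (1 + erf(z / sqrt(2))) / 2
    return (1 + erf(z / sqrt(2))) / 2

def standardize(x, mu, sigma):
    # Z = (x - mu) / sigma
    return (x - mu) / sigma

# Hypothetical example: X has mean 100 and SD 15; what is P(X < 130)?
z = standardize(130, 100, 15)
print(z, phi(z))
```

phi(z) gives the area to the left of z, exactly what the Normal Table provides.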
What area of the curve does your CDF give you?
The area to the left of z under the standard normal curve.
What is the equation for Poisson approx?
P( k successes ) ≈ L^k * e^(-L) / k!

Where L = lambda
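A quick numeric check of the Poisson approximation to the binomial, assuming many trials with a small success rate (n = 1000 and p = 0.003, so lambda = np = 3; both numbers are made up for illustration):

```python
from math import comb, exp, factorial

n, p = 1000, 0.003   # hypothetical: many trials, small success rate
lam = n * p          # lambda = np = 3

def binom(k):
    # exact binomial probability of k successes
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson(k):
    # Poisson approximation: lambda^k * e^(-lambda) / k!
    return lam**k * exp(-lam) / factorial(k)

for k in range(4):
    print(k, round(binom(k), 5), round(poisson(k), 5))
```

The two columns should agree to about three decimal places for small k.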
How can the mean of a probability distribution be defined?
Over a finite set of numerical values x, it is the average of the values x weighted by their probabilities.
What is the definition of expectation of a random variable X?

Also called the mean.
average of all possible values of X, weighted by their probabilities.

Sum over all x, x * P( X = x)
What is the addition rule for expectation for any two random variables X and Y?
E(X +Y) = E(X) + E(Y)

No matter whether X and Y are independent or not.
What is the definition for the expectation of an indicator?
The expectation of the indicator of an event is the probability of the event.

E(I sub A) = P(A)
What is the rule for multiplication of random variables X and Y?
IF they are Independent, then;

E(XY) = E(X)*E(Y)

But we need independence for this to occur, otherwise you cannot use this method.
What kind of distributions can you have a CDF for?
ANY KIND!!!!
What can you do with a CDF?
Calculate the probability of any event determined by X.
Because probabilities must be nonnegative, what is a characteristic of the CDF?
It must be a nondecreasing function of x.
How do you get the density function from the CDF?
Differentiate the CDF.
How do you get the CDF from the density function?
Integrate the density function.
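Both directions can be checked numerically on an Exponential example (lambda = 2 is an arbitrary choice): differentiating the CDF, here by a central difference, recovers the density.

```python
import math

lam = 2.0  # arbitrary rate parameter for the illustration

def cdf(x):
    # Exponential(lam) CDF: F(x) = 1 - e^(-lam * x) for x >= 0
    return 1 - math.exp(-lam * x)

def density(x):
    # f(x) = F'(x) = lam * e^(-lam * x)
    return lam * math.exp(-lam * x)

# Numerical derivative of the CDF should match the density.
x, h = 0.7, 1e-6
numeric_density = (cdf(x + h) - cdf(x - h)) / (2 * h)
print(numeric_density, density(x))
```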
How do you find the Max of a collection of ind. rand. var?
Look at it as an inequality:

If the max value of the set is less than or equal to, say x, then that can only happen if the entire set is less than or equal to x.
How do you find the Min of a collection of ind. rand. var?
Look at this as an inequality:

The min of the collection of variables is greater than a limit x only if all the variables in that set are greater than x.
What do you get when you add two ind rand variables together?
A new density function.
How can you find the new density function formed from adding two ind rand variables?
You can integrate the product of their density functions over the correct area and then differentiate to get the density, or you can plug both density functions into the convolution formula.
When working with joint densities of X and Y, you might not always have to get the distribution, sometimes getting the expectation is enough. How can you express the expectation of a joint function of X and Y?
E(X*Y) = double integral of: x*y*f(x,y) dxdy

=E(X)*E(Y) <--- IF X and Y are independent!

Any other type of relationship, i.e. E( X + Y) , E( X / Y ), E ( sqrt( X^2 + Y^2 )) can also be expressed in that integral form using the density functions for the variables.
What would be the CDF method of finding the distribution of Z = g( X , Y )?
You'd want to FIND the CDF P(Z <= z) by integrating the PRODUCT of the DENSITY functions f_X(x) and f_Y(y) over the region in the xy-plane where g(x,y) <= z.

Then, after doing that, you have the CDF, so if you differentiate it you get the new density function of Z that you were looking for.
What is the addition rule for expectation?

Are there any necessary conditions for this rule to hold?
E( X + Y ) = E(X) + E(Y)

No, even if X and Y are dependent variables you can always do this.
What distribution does the phrase "n independent Bernoulli (p)" make you think of?
Binomial ( n, p )

n = number of trials

p = rate or probability

Expectation for the variable = n*p
What is the density convolution formula?
f_{X+Y}(z) = integral of f( x, z-x ) dx

This integral is from negative infinity to infinity.

Substitute (z - x) for every y in the density function for Y; make sure x is plugged in as the variable for the density function for X.

You'll end up with: integral of f_X(x) * f_Y(z - x) dx

NOTE: If X and Y are non-negative, then the lower limit of integration in the convolution formula can be changed to zero, and the upper limit to z, since f_Y(z - x) = 0 for all x > z.
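A numeric sketch of the convolution formula for two independent Uniform(0,1) variables, whose sum is known to have the triangular density f(z) = z on [0,1] and 2 - z on [1,2] (the Riemann-sum integrator is a stand-in for doing the integral by hand):

```python
def f_uniform(x):
    # Uniform(0, 1) density
    return 1.0 if 0 <= x <= 1 else 0.0

def density_of_sum(z, steps=100_000):
    # Convolution: f_{X+Y}(z) = integral of f_X(x) * f_Y(z - x) dx,
    # approximated by a Riemann sum over [0, 1] (f_X vanishes elsewhere).
    dx = 1.0 / steps
    return sum(f_uniform(i * dx) * f_uniform(z - i * dx)
               for i in range(steps)) * dx

# Triangular density predicts 0.5 at both z = 0.5 and z = 1.5
print(density_of_sum(0.5), density_of_sum(1.5))
```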
What is the convolution formula a special case of?
It is a special case of the formula for the density of X + Y when f( x, y ) = f_X(x) * f_Y(y) by INDEPENDENCE.

This formula leads to a new density: the density of the sum of the two random variables, which are assumed to be independent.
For ind. normal variables X and Y, what is the relationship for the density functions of X / Y and Y / X and what is the integral for them?
f_{X/Y}(z) = f_{Y/X}(z) by symmetry between X and Y.

Integral from negative infinity to infinity of: |x| * f_{X,Y}( x, xz ) dx

Plug in your appropriate density functions and variables in their proper places and change the bounds of integration according to the range of your density functions.
So, we have two uniformly distributed variables over a region, what is the formula for the marginal density of f x (x) ?
Well, since we have a DENSITY that we are looking for, we will be using an integral.

For f x (x) we'll have integral from negative infinity to infinity of: f(x,y) dy

So, the MARGINAL DENSITY for f x(x) of two joint variables is an INTEGRAL with respect to Y.

Just remember that for marginal density it will be an integral of f( x,y ) but integrated with respect to the opposite variable of the marginal density that you are looking for.
What is the formula for the marginal density of f y(y) if X and Y are two uniformly distributed joint variables?
Integral from negative infinity to infinity of: f( x,y ) dx <-- Notice that this is w/ respect to x and you are looking for the marginal density of f y(y).

So, this is for the joint density function - if you don't have it already you may have to get it by the CDF or Convolution method. CDF meaning integrate the product of the two density functions then differentiate back down for density, Convolution meaning to plug into the Convolution formula and integrate.

Remember: Take the integral with respect to the variable opposite of the variable for the marginal density that you are looking for.
How can you check if uniformly dist variables X and Y are independent or not?
If f( x, y ) = f_X(x) * f_Y(y), then yes, they are independent.

If that equation is not satisfied, then those variables in question are NOT independent.
Say we have X and Y independent exponentially distributed rand variables with respective parameters lambda and mu.....

What exactly does that mean and how can you calculate P ( X < Y ) ?
If we are given two ind. exp. rand. variables with parameters, then each variable can be represented by its own density function.

So, then their joint density is the PRODUCT of those two functions. (By independence).

Then, P( X < Y ) is going to be found by integrating a double integral of this product over the appropriate area.
If you are asked what the probability is that a standard normal variable has value between a and b, what is the procedure to solve this?
Since it is given as standard normal you only have to use the Normal Table and subtract Phi(a) from Phi(b) since the table will give you everything to the left, so Phi(b) - Phi(a) gives the prob that your stand. norm. variable has value between a and b.
If you are given that X has density f(x) = c/x^4 for x > 1 and f(x) = 0 otherwise, then how can you find out what c is, if c is a constant?
If the density function is zero for all values of x other than x > 1, then we know that the area under the function from 1 to infinity must add up to 1. Integrate c/x^4 from 1 to infinity, set the result equal to 1 and solve for c.
X has density f(x) = 3/x^4 for x > 1 and 0 otherwise. How can you find the expectation of the variable X?

What about the Variance of X?
E(X) is easy enough with an explicit density function; we'll just use the definition and integrate x times the density over the region where the density is not 0.

So, integrate from 1 to infinity: x * 3/x^4 dx.

Variance can be calculated the same way using the definition: E(X^2) = integral from 1 to infinity of x^2 * 3/x^4 dx, then finish the formula Var(X) = E(X^2) - (E(X))^2.
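Checking those integrals numerically (the crude midpoint integrator and the cutoff at x = 1000 are just for illustration; evaluating the integrals by hand gives E(X) = 3/2, E(X^2) = 3, and Var(X) = 3/4):

```python
def integrate(f, a, b, steps=200_000):
    # crude midpoint Riemann sum
    dx = (b - a) / steps
    return sum(f(a + (i + 0.5) * dx) for i in range(steps)) * dx

density = lambda x: 3 / x**4  # f(x) = 3/x^4 for x > 1

ex = integrate(lambda x: x * density(x), 1, 1000)       # E(X),   hand answer 3/2
ex2 = integrate(lambda x: x * x * density(x), 1, 1000)  # E(X^2), hand answer 3
var = ex2 - ex**2                                       # Var(X), hand answer 3/4
print(ex, ex2, var)
```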
If you are given variable X with prob dens f(x) = 1/(1+x)^2 for x > 0 and f(x) = 0 otherwise, how would you go about finding P ( X > 3 ) ?
We have the density function already, so we can just integrate it from 3 to infinity; the resulting number is P ( X > 3 ).
If you are given a variable X with P( X > 3 ) = 1/4, and then you have X1, X2, X3, and X4 each with the same dist as X and independent, how could you find the chance that exactly 2 of those four variables are greater than 3?
Well, they all have identical probabilities and are all independent events - this makes a simple success or failure type of situation and hence binomial with n = 4 and p = 1/4.

So we have (4 Ch 2)*(1/4)^2 * (3/4)^2
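Evaluating that expression exactly:

```python
from math import comb
from fractions import Fraction

p = Fraction(1, 4)
# (4 C 2) * (1/4)^2 * (3/4)^2
prob = comb(4, 2) * p**2 * (1 - p)**2
print(prob)  # 27/128
```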
When finding probabilities and expectations for variables with a given density, what is the mechanical difference between E(X) and P( X > some value ) ?
From the def of expectation, we know that when integrating for E(X) we first multiply the density by the variable to the first power.

When finding the probability, we just integrate the density over the appropriate area; no tweaks are needed.
The probability of a random time falling in any interval can be found from what?
The survival function - this pertains to Exponential and Gamma distributions.
What is the formula for the survival function of the exponential function?

How about Mean and Standard Deviation?
P( X > t ) = e^(-Lt) for t >= 0, where L = lambda.

Both the Mean and the Standard Deviation will be 1/L for the Exponential Distribution.
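A simulation sketch of the mean = SD = 1/L fact, using Python's random.expovariate (L = 2 and the sample size are arbitrary choices):

```python
import random
import statistics

random.seed(0)
L = 2.0  # arbitrary lambda
sample = [random.expovariate(L) for _ in range(200_000)]

# Both should come out close to 1/L = 0.5
print(statistics.mean(sample), statistics.stdev(sample))
```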
What do we generally call E[ X^n ] ?
The nth moment of X
What is the formula for the moment generating function?
M_X(t) = Sum from n = 0 to infinity of: E( X^n ) * ( t^n / n! )

Which by linearity is equal to E[ e^(tX) ]
What does M_X^(n)(0), the nth derivative of the MGF evaluated at 0, equal?
E [ X^n ]

By Taylor's Theorem.
For moment generating function M_X(t) = E[ e^(tX) ], what does the sum formula look like and how do you FIND the expectation from it?
You can just use the definition:

Sum from k = 1 to whatever you are given (since it is discrete) of: e^(kt) * P[ X = k ]

You should get an exponential polynomial in terms of t.

Like (1/3) * e^(3t) + (4/5) * e^(5t) + ... which is your moment generating function M_X(t); the first derivative of this function evaluated at 0 is E[ X^1 ], the second derivative at zero is E[ X^2 ], the second moment of X, and it goes on in that fashion.
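The derivative-at-zero recipe can be sketched numerically for a small made-up pmf (the probabilities 0.2 / 0.5 / 0.3 are hypothetical):

```python
import math

# A hypothetical pmf: P(X=1) = 0.2, P(X=2) = 0.5, P(X=3) = 0.3
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

def M(t):
    # M_X(t) = sum over k of e^(kt) * P(X = k)
    return sum(math.exp(k * t) * p for k, p in pmf.items())

# M'(0) (here a central difference) should equal E[X] computed directly.
h = 1e-5
first_moment = (M(h) - M(-h)) / (2 * h)
expected = sum(k * p for k, p in pmf.items())  # 0.2 + 1.0 + 0.9 = 2.1
print(first_moment, expected)
```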
Say you have a variable X with density function f(x) = 5e^(-5x) for x > 0 and zero otherwise. What is M_X(t)?
We know that M_X(t) = E[ e^(tX) ]

So, take an integral from x = 0 to infinity of e^(tx) * the density function.

So we have: int. from x = 0 to infinity of e^(tx) * 5e^(-5x) dx, which works out to 5 / (5 - t) for t < 5.
Say that you're given that M_X(t) = 5 / (5 - t) and you want to find a formula for E[ X^n ]. How might you go about that?
Take the definition, and tweak it into a recognizable sum formula, THEN tweak that sum formula into t^n / n! with a coefficient; whatever the coefficient of t^n / n! is will be E[ X^n ].

For this one in particular, divide top and bottom by 5 to get 1 / (1 - t/5), a geometric series: the sum from n = 0 to infinity of (t/5)^n. Each term can be rewritten as (n! / 5^n) * (t^n / n!), so your desired coefficient, and hence E[ X^n ], is n! / 5^n.
Say that X is uniform on [ a, b ], how can you find a formula for E [ x^n ] ?
Since x is uniform, set up the integral from a to b of: 1 / (b -a) * x^n dx

Then integrate that, and whatever you get is E [ x^n ].
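That integral has the closed form E[ X^n ] = (b^(n+1) - a^(n+1)) / ((n+1)(b-a)); a small sketch:

```python
from fractions import Fraction

def uniform_moment(a, b, n):
    # E[X^n] = integral from a to b of x^n / (b - a) dx
    #        = (b^(n+1) - a^(n+1)) / ((n + 1) * (b - a))
    return Fraction(b**(n + 1) - a**(n + 1), (n + 1) * (b - a))

# For X uniform on [0, 1], E[X^n] = 1/(n + 1)
print(uniform_moment(0, 1, 1), uniform_moment(0, 1, 2))
```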
How can you transform a probability generating function into a moment generating function?
Probability Generating Func = G x(t)

Moment Generating Func = M x(t)

Prob uses "z" and Moment uses "e^t"

Prob = sum of z^k * P [ x = k ]

Moment = sum of e^kt * P [ x = k ]
Say that you have X as a discrete variable with G_X(z) = z / (4 - 3z)^2.

What is this and how can you find E[ (X) falling 3 ] ?

(That "falling 3" denotes the falling factorial X(X-1)(X-2), so you want a FACTORIAL moment, btw.)
This is a Probability Generating Function.

We are looking for the third factorial moment of X.

This can be found by taking the third derivative of the function and evaluating it at "1".

That's it.
Say that you have x as a discrete variable with G x(z) = z / (4 - 3z)^2.

How can you find E [ X^3 ] ?
We are given a probability generating function and we want to get to the moment generating function.

This can be done by replacing "z" with "e^t" and then for X^3 you take the third derivative of that function and evaluate it at zero.
Say that you have x as a discrete variable with G x(z) = z / (4 - 3z)^2.

What is this thing?
This is a probability generating function.
Say that you have x as a discrete variable with G x(z) = z / (4 - 3z)^2.

How can you find P [ X = 3 ] ?
Remember what the PGF is, it's a power series whose coefficients are probabilities.

Here, we need the coefficient of z^3 in the power series for this function.

By Taylor's Theorem, that coefficient should be the third derivative of G at 0, divided by 3 factorial.
How does the method of tailsums work?
Use an inequality. Say you are looking for the min of something, or you want to know P[ X < 4 ] - for that particular case, you can just add up the probabilities for X = 1, 2, and 3.

Don't forget that when using tail sums for an expectation, E(X) = sum from k = 1 to infinity of P( X >= k ), you must start your series at k = 1; including zero would count things too many times.

Find them, add them , done.