50 Cards in this Set
- Front
- Back
Variance
|
(1/(n-1))*sum(xi-x)^2
|
|
Standard Deviation
|
square root of variance
|
|
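The two formulas above can be computed directly; a minimal sketch with made-up data (the function names are my own, not from the deck):

```python
# Sample variance (1/(n-1)) * sum (xi - xbar)^2 and its square root,
# computed from the definitions on the cards above.
import math

def sample_variance(xs):
    n = len(xs)
    xbar = sum(xs) / n
    # divide the sum of squared deviations by n-1, not n
    return sum((x - xbar) ** 2 for x in xs) / (n - 1)

def sample_sd(xs):
    # standard deviation is the square root of the variance
    return math.sqrt(sample_variance(xs))

data = [2, 4, 4, 4, 5, 5, 7, 9]  # made-up example values
print(sample_variance(data))  # 32/7 = 4.571428571428571
print(sample_sd(data))
```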
Five-Number Summary
|
Minimum, Q1, Median, Q3, Maximum
|
|
Density function
|
describes a density curve in functional form (defined over all possible values the variable can take)
|
|
Normal Distribution
|
1) has a "bell shape"
2) mean = median = mode
3) symmetric
4) inflection points at m+sd and m-sd |
|
Rule for Normal Distribution
|
68-95-99.7
|
|
Z-Score
|
z= (x-m)/sd
|
|
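Standardizing with z = (x-m)/sd is a one-liner; a small sketch (the mean 100 and sd 15 are made-up example values):

```python
# z-score: how many standard deviations x lies from the mean
def z_score(x, mean, sd):
    return (x - mean) / sd

print(z_score(130, 100, 15))  # (130-100)/15 = 2.0
```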
Standard Normal Distribution
|
Normal distribution N(0,1) with mean 0 and standard deviation 1. If a variable x has any Normal distribution with mean M and standard deviation sd, the standardized variable z = (x-M)/sd has the standard Normal distribution.
|
|
Covariance
|
S2XY = (1/(n-1)) * sum (xi - x)(yi - y)
|
|
Correlation
|
Cov(X,Y)/SxSy
|
|
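The sample covariance and correlation cards can be checked numerically; a minimal sketch with made-up data (note Cov(X,X) is just the variance, which gives sx and sy):

```python
import math

def sample_covariance(xs, ys):
    # (1/(n-1)) * sum (xi - xbar)(yi - ybar)
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    return sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / (n - 1)

def sample_correlation(xs, ys):
    # r = Cov(X,Y) / (sx * sy); Cov(X,X) is the sample variance
    sx = math.sqrt(sample_covariance(xs, xs))
    sy = math.sqrt(sample_covariance(ys, ys))
    return sample_covariance(xs, ys) / (sx * sy)

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]  # perfectly linear in xs, so r should be 1
print(sample_correlation(xs, ys))
```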
Least-Squares Regression Line
|
y-hat = a + bx, with slope b = r*(sy/sx) and intercept a = ybar - b*xbar
|
|
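The least-squares slope and intercept can be computed from sums of squares; a small sketch with made-up data (the slope Sxy/Sxx used here is algebraically the same as r*(sy/sx)):

```python
def lsrl(xs, ys):
    # slope b = Sxy / Sxx, intercept a = ybar - b * xbar
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = ybar - b * xbar
    return a, b

a, b = lsrl([1, 2, 3, 4], [2, 3, 5, 6])
print(a, b)  # intercept 0.5, slope 1.4
```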
Lurking variable
|
variable that is not among the explanatory or response variables yet may influence the interpretation of relationships among variables
|
|
Simpson's Paradox
|
an association or comparison that holds for all of several groups can reverse direction when the data are combined to form a single group
|
|
Simple Random Sample
|
consists of n individuals from the population, chosen in such a way that every set of n individuals has an equal chance to be the sample actually selected
|
|
Sampling Distribution
|
distribution of values taken by the statistic in all possible samples of the same size from the same population
|
|
Probability Rules
|
1. The probability of any event A satisfies 0 <= P(A) <= 1
2. If S is the sample space in a probability model, P(S) = 1
3. The complement of any event A is the event that A does not occur, written A^C; P(A does not occur) = 1 - P(A)
4. Two events A & B are disjoint if they have no outcomes in common and therefore cannot occur simultaneously; then P(A or B) = P(A) + P(B) |
|
Rules for Means
|
1. If x is a random variable and a and b are fixed numbers, then M(a+bx) = a + b*Mx
2. If x and y are random variables, then M(x+y) = Mx + My |
|
Variance for Discrete and Continuous Variables
|
V(X) = E(X^2) - M^2, where M = E(X)
|
|
Rules for Variances
|
1. V(a+bx)=b^2V(x)
2. If X & Y are independent: V(x+y) = V(x) + V(y) and V(x-y) = V(x) + V(y)
3. If X & Y have correlation p: V(x+y) = V(x) + V(y) + 2p*sd(x)*sd(y) and V(x-y) = V(x) + V(y) - 2p*sd(x)*sd(y) |
|
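Rule 1, V(a+bX) = b^2 V(X), can be verified exactly on a small discrete random variable; the values, probabilities, and constants below are made up for illustration:

```python
# Check V(a + bX) = b^2 * V(X) on a discrete random variable.
def mean(values, probs):
    return sum(v * p for v, p in zip(values, probs))

def variance(values, probs):
    m = mean(values, probs)
    return sum(p * (v - m) ** 2 for v, p in zip(values, probs))

vals = [0, 1, 2]
probs = [0.2, 0.5, 0.3]
a, b = 3, 2

vx = variance(vals, probs)                              # V(X)
v_shifted = variance([a + b * v for v in vals], probs)  # V(a + bX)
print(vx, v_shifted)  # the second equals b^2 times the first
```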
Covariance of Two Random Variables
|
Cov(X,Y) = E(XY) - Mx*My
|
|
Correlation Between X&Y
|
Cov(X,Y)/(sdX*sdY)
|
|
Joint Probability Function
|
f(x0,y0) = P(X=x0 and Y=y0)
|
|
Marginal Probability Function
|
fX(x0) = P(X=x0) = sum over yi of f(x0, yi)
|
|
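Summing a joint pmf over one variable to get the marginal can be shown on a tiny table; the joint probabilities below are made up:

```python
# Marginal pmf from a joint pmf table: fX(x0) = sum over y of f(x0, y).
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

def marginal_x(joint, x0):
    # add up every joint probability whose first coordinate is x0
    return sum(p for (x, y), p in joint.items() if x == x0)

print(marginal_x(joint, 0))  # 0.1 + 0.2
print(marginal_x(joint, 1))  # 0.3 + 0.4
```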
Joint Probability Function for Continuous Variables
|
To find P(x1<X<x2 and y1<Y<y2),
take the double integral of f(x,y) dy dx, with x from x1 to x2 and y from y1 to y2 |
|
Law of Large Numbers
|
Draw independent observations at random from any population with finite mean m. As the number of observations drawn increases, the mean of the observed values gets closer and closer to the mean of the population.
|
|
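The law of large numbers is easy to see by simulation; a sketch using die rolls (population mean 3.5), with a fixed seed so the run is reproducible:

```python
# Running mean of many fair die rolls drifts toward the
# population mean 3.5 as the number of observations grows.
import random

random.seed(0)
n = 100_000
total = 0
for _ in range(n):
    total += random.randint(1, 6)  # one die roll
running_mean = total / n
print(running_mean)  # close to 3.5 for large n
```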
Mean and Standard Deviation of a Sample Mean
|
mean is m and standard deviation is sd/sqrt(n)
|
|
Central Limit Theorem
|
When n is large, the sampling distribution of the sample mean is approximately Normal, with mean m and standard deviation sd/sqrt(n)
|
|
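The CLT can also be checked by simulation; a sketch drawing sample means from a skewed exponential population with mean 1 (sample size, repetitions, and seed are made-up choices):

```python
# Means of n=30 draws from an exponential(1) population cluster
# near the population mean 1, with spread near 1/sqrt(30) ~ 0.183.
import math
import random

random.seed(1)
n, reps = 30, 5000
means = [sum(random.expovariate(1.0) for _ in range(n)) / n
         for _ in range(reps)]
grand_mean = sum(means) / reps
spread = math.sqrt(sum((m - grand_mean) ** 2 for m in means) / (reps - 1))
print(grand_mean, spread)
```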
General Addition Probability Rule
|
For any two events
P(A or B) = P(A)+P(B)-P(A and B) |
|
Multiplication Rule of Independent Events
|
P(A and B) = P(A)P(B)
|
|
Definition of Conditional Probability
|
The conditional probability of B given A is P(B|A) = P(A&B)/P(A)
|
|
General Multiplication Rule for Two Events
|
P(A&B)=P(A)P(B|A)
Two events A&B are independent if P(B|A)=P(B) |
|
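Conditional probability can be computed by counting outcomes; a sketch with two fair dice (the events are made-up examples): P(sum = 8 | first die = 3) = P(A and B)/P(A).

```python
# All 36 equally likely outcomes of rolling two fair dice
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]

a = [o for o in outcomes if o[0] == 3]        # A: first die shows 3
a_and_b = [o for o in a if o[0] + o[1] == 8]  # A and B: also sum to 8
p_b_given_a = len(a_and_b) / len(a)           # P(B|A) = P(A&B)/P(A)
print(p_b_given_a)  # only (3,5) works, so 1/6
```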
Binomial Setting
|
1. There are a fixed number n of observations
2. The n observations are all independent
3. Each observation is either a success or a failure
4. The probability of success is the same for each observation |
|
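Under the setting above, the count of successes has pmf P(X=k) = C(n,k) p^k (1-p)^(n-k); a minimal sketch (the n=5, p=0.5 values are made up):

```python
# Binomial pmf: C(n, k) * p^k * (1-p)^(n-k)
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

print(binom_pmf(2, 5, 0.5))  # 10 * (1/32) = 0.3125
```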
Poisson Setting
|
1. The number of successes that occur in any unit of measure is independent of the number of successes that occur in any nonoverlapping unit of measure
2. The probability that a success will occur in a unit of measure is the same for all units of equal size, and is proportional to the size of the unit
3. The probability that two or more successes will occur in a unit approaches zero as the size of the unit becomes smaller |
|
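In this setting the count of successes per unit has pmf P(X=k) = e^(-mu) mu^k / k!, where mu is the mean count per unit; a small sketch (mu = 2 is a made-up value):

```python
# Poisson pmf: e^(-mu) * mu^k / k!
from math import exp, factorial

def poisson_pmf(k, mu):
    return exp(-mu) * mu**k / factorial(k)

print(poisson_pmf(0, 2))  # e^-2, roughly 0.1353
```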
Geometric Distribution
|
Distribution of the number of trials needed to get the first success.
|
|
Negative Binomial Distribution
|
Distribution of the number of trials needed to get r successes.
|
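Both pmfs follow from the binomial setting: geometric P(X=k) = (1-p)^(k-1) p, and negative binomial P(X=k) = C(k-1, r-1) p^r (1-p)^(k-r); a sketch with a made-up p = 0.5 (note r = 1 reduces to the geometric case):

```python
from math import comb

def geometric_pmf(k, p):
    # k-1 failures followed by one success
    return (1 - p) ** (k - 1) * p

def neg_binom_pmf(k, r, p):
    # r-1 successes somewhere in the first k-1 trials, then a success
    return comb(k - 1, r - 1) * p**r * (1 - p) ** (k - r)

print(geometric_pmf(3, 0.5))     # (1/4)*(1/2) = 0.125
print(neg_binom_pmf(3, 1, 0.5))  # same as the geometric case
```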