46 Cards in this Set

  • Front
  • Back
'Goodness of fit' chi-square tells us...
how representative the sample is [compares the overall sample distribution to the overall population distribution]
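A minimal sketch of this card in Python, using hypothetical counts (the data are made up for illustration):

```python
# Goodness-of-fit chi-square: does the sample's category distribution
# match an assumed population distribution?
from scipy.stats import chisquare

observed = [18, 22, 30, 30]   # hypothetical sample counts per category
expected = [25, 25, 25, 25]   # counts implied by the population distribution
stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p:.4f}")
# if p < .05, reject H0 that the sample matches the population distribution
```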
standardized residuals tells us...
how close individual values are to expected values (breaks down chi square)
if standardized residuals are more than _______ in magnitude, they are significant
1.96 [treated like z-scores]
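The residual calculation from the two cards above, sketched with the same made-up counts; each cell gets a z-like value, and values beyond ±1.96 would be flagged:

```python
# Standardized residuals break the goodness-of-fit chi-square down cell by cell.
import numpy as np

observed = np.array([18, 22, 30, 30])   # hypothetical counts
expected = np.array([25, 25, 25, 25])
std_resid = (observed - expected) / np.sqrt(expected)  # treated like z-scores
significant = np.abs(std_resid) > 1.96
print(std_resid.round(2), significant)
```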
the Kolmogorov-Smirnov test is related to chi-square, but it is appropriate for what type of variables?
ordinal
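A sketch of a one-sample Kolmogorov-Smirnov test with simulated data (assumption: comparing a sample against a named reference distribution, here the standard normal):

```python
# One-sample K-S test: compares the sample's empirical CDF to a reference CDF.
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(0)
sample = rng.normal(loc=0, scale=1, size=200)  # simulated data
stat, p = kstest(sample, "norm")
print(f"D = {stat:.3f}, p = {p:.3f}")
# if p > .05, no evidence the sample differs from the reference distribution
```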
you can use these tools to locate blunders and outliers
frequency distributions and histograms
cross tabulations use ______ scale variables.
nominal
in chi-square tests, the "sig" number tells us...
the probability you will be wrong if you reject Ho. [if less than alpha .05, then reject]
two variables are statistically independent if...
knowledge of one would offer no information to the identity of the other
the null hypothesis for statistical independence is..
Ho: the variables are statistically independent.
cross tab is used to...
find associations between 2 variables (not necessarily causation)
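The cross-tab cards above, sketched with a hypothetical 2x2 table of counts; the chi-square test of independence gives the "sig" value the earlier card describes:

```python
# Cross-tabulation of two nominal variables + chi-square test of independence.
import numpy as np
from scipy.stats import chi2_contingency

#                     bought  didn't buy
crosstab = np.array([[30,     20],    # saw ad
                     [15,     35]])   # no ad  (hypothetical counts)
stat, p, dof, expected = chi2_contingency(crosstab)
print(f"chi-square = {stat:.2f}, df = {dof}, p = {p:.4f}")
# if p < .05, reject H0 that the variables are statistically independent
```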
for continuous variables, the probability that the true mean = some exact number is...
zero!
the alternative hypothesis for ONE-WAY anova is...
h1: at least 2 of the means are different
a t-test is used to compare 2 means, ANOVA is used to compare...
3+ means
for ANOVA, if h0 is true, the ratio of 'between treatment' and 'within treatment' variance should be close to...
1
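The one-way ANOVA cards sketched on three made-up groups; the F statistic is the between/within ratio the card above describes:

```python
# One-way ANOVA: compares 3+ group means via the between/within variance ratio.
from scipy.stats import f_oneway

group_a = [4, 5, 6, 5]   # hypothetical responses per treatment
group_b = [5, 6, 5, 6]
group_c = [6, 7, 6, 7]
f_stat, p = f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p:.4f}")
# F near 1 supports H0; if p < .05, at least 2 of the means differ
```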
TWO-WAY anova tests...
effects of 2 or more factors (independent variables) on a dependent variable
factors are also called...
independent variables
response variables are also called...
dependent variables
treatments are also called
levels (ie - different responses for independent variables)
what are the null hypotheses for ANOVA, where A is one factor, and B is another factor?
1) h0: no differences between treatment 1A and treatment 2A,
2) h0: no differences between treatment 1B and treatment 2B,
3) h0: no interaction between A and B
univariate anova checks..
the INTERACTION effect
(if A is one factor and B is another factor, it tests the sig of A*B)
True or False: When using regression, the dependent variable is on the X axis.
False - dependent variable is on the Y axis, independent variable is on the X axis
When using regression, what is the null hypothesis when testing significance of independent variables?
There is no linear relationship between the independent and dependent variables.
When using regression, what is the alternative hypothesis when testing significance of independent variables?
There is a linear relationship between the independent and dependent variables.
What is the null hypothesis when testing the significance of the intercept in regression?
Ho: The intercept = 0.
What is the alternative hypothesis when testing the significance of the intercept in regression?
Ha: The intercept is not zero.
R-square is also called...
Coefficient of Determination
What does the coefficient of determination tell us, and what is its name in an SPSS output?
Tells us the % of variation in the dependent variable you can explain with the given independent variables [the regression model's ability to predict]. Name in SPSS output = R-Square!
the formula for R-square in simple terms is...
explained variation / total variation
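The regression cards sketched with made-up (x, y) data; `rvalue ** 2` is the coefficient of determination, and `pvalue` tests the "no linear relationship" null hypothesis from the earlier cards:

```python
# Simple linear regression: dependent y on the Y axis, independent x on the X axis.
from scipy.stats import linregress

x = [1, 2, 3, 4, 5]                  # hypothetical data
y = [2.1, 3.9, 6.2, 7.8, 10.1]
fit = linregress(x, y)
r_square = fit.rvalue ** 2           # explained variation / total variation
print(f"slope = {fit.slope:.2f}, intercept = {fit.intercept:.2f}, R^2 = {r_square:.3f}")
# fit.pvalue tests H0: no linear relationship (slope = 0)
```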
correlation measures...
the degree to which there is an association between two or more variables
True or False: in correlation analysis, if r = 0, there is an absence of linear association
True.
True or False: in correlation analysis, if r = 0, there is no association.
False -- there may be a quadratic or cubic association (but r = 0 says there is no LINEAR association)
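A quick illustration of this card: a perfect quadratic relationship still gives r of about 0, because Pearson's r only measures linear association (data chosen for symmetry):

```python
# y is a deterministic function of x, yet Pearson's r is ~0.
import numpy as np
from scipy.stats import pearsonr

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = x ** 2                    # perfect association, but not linear
r, p = pearsonr(x, y)
print(f"r = {r:.3f}")         # approximately 0
```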
Abdominal abscesses and peritonitis, pelvic inflammatory disease, diarrhea (ETBF strains) primarily in children 1-5yrs, and inflammatory bowel disease (ETBF strains in adults)
B. fragilis
What is multicollinearity?
When there are correlations amongst independent (predictor) variables.
What is the rule of thumb for reducing multicollinearity?
r between Xs cannot be larger than the largest r between X and Y
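The rule of thumb above, checked on made-up data where one predictor is nearly a multiple of the other:

```python
# Multicollinearity check: compare r between predictors to the largest r
# between any predictor and the dependent variable.
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.1, 3.9, 6.2, 8.1, 9.8])   # nearly 2 * x1 (hypothetical)
y  = np.array([3.0, 5.0, 4.0, 6.0, 7.0])

r_x1x2 = np.corrcoef(x1, x2)[0, 1]
r_xy = max(abs(np.corrcoef(x1, y)[0, 1]), abs(np.corrcoef(x2, y)[0, 1]))
print(f"r(x1,x2) = {r_x1x2:.3f}, largest r(X,y) = {r_xy:.3f}")
if r_x1x2 > r_xy:
    print("rule of thumb violated: likely multicollinearity")
```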
What is conjoint measurement used for?
understand consumer trade-offs [between price and quantity/quality of attributes] ; importance of attributes ; implications for design
Discriminant analysis is used when the ________ variable is _______.
dependent (Y) ; categorical
this analysis is used to classify individuals into one of two or more alternative groups on the basis of a set of measurements
Discriminant
the following function is considered what type of analysis: nonmetric dependent variable = f(metric independent variables)
discriminant
the following function is considered what type of analysis: metric dependent variable = f(metric independent variables)
regression
the following function is considered what type of analysis: metric dependent variable = f(nonmetric -dummy- independent variables)
conjoint (think attitude scores)
Factor analysis deals with data reduction in relation to the effect of _______ amongst variables.
multicollinearity
If testing correlations between factor A and factor B in factor analysis, what does "-1" in the "pearson correlation" signify?
Factor A and B are perfectly NEGATIVELY correlated [a Pearson correlation near 0 would signify no correlation]
When testing correlations between factor A and factor B in factor analysis, and the correlation is close to 1, what does this mean?
Factor A and B are correlated
Cluster analysis is very closely related to...
market segmentation! (grouping people with similar interests together)
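A small segmentation sketch on made-up 2-feature data; hierarchical clustering (one of several possible clustering methods) groups similar respondents together:

```python
# Cluster analysis for segmentation: cut a hierarchical tree into 2 groups.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# two obvious "segments" of respondents (hypothetical data)
data = np.array([[1.0, 1.0], [1.2, 0.9], [0.9, 1.1],
                 [8.0, 8.0], [8.1, 7.9], [7.9, 8.2]])
tree = linkage(data, method="ward")
labels = fcluster(tree, t=2, criterion="maxclust")  # assign each row to a segment
print(labels)
```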
multidimensional scaling is commonly used with
perceptual mapping (measuring relative distances)