78 Cards in this Set

Statistics
the science concerned with the collection, organization, analysis, interpretation, and presentation of data
Experiment
a process that results in some outcome.
Outcome
a result that we observe
sample space
The collection of all possible outcomes of an experiment
Probability
the likelihood that an outcome occurs
Probability Properties
Label the n outcomes in a sample space as O1, O2, …, On, where Oi represents the ith outcome. Then the probability of each outcome satisfies 0 ≤ P(Oi) ≤ 1, and the probabilities over all outcomes sum to 1: P(O1) + P(O2) + … + P(On) = 1.
Event
a collection of one or more outcomes from a sample space
Calculating Probabilities
Rule 1: The probability of any event is the sum of the probabilities of the outcomes that compose that event.

Rule 2: The probability of the complement of any event A is P(Ac) = 1 – P(A).

Rule 3: If events A and B are mutually exclusive, then P(A or B) = P(A) + P(B).

Rule 4: If two events A and B are not mutually exclusive, then P(A or B) = P(A) + P(B) – P(A and B).
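A minimal Python check of Rules 2–4, using made-up event probabilities (all numeric values here are illustrative, not from the cards):

```python
# Illustrative event probabilities (not from the cards).
p_a = 0.30
p_b = 0.45
p_a_and_b = 0.10  # A and B overlap, so they are not mutually exclusive

# Rule 2: complement
p_a_complement = 1 - p_a  # 0.70

# Rule 4: general addition rule
p_a_or_b = p_a + p_b - p_a_and_b  # 0.65

# Rule 3: special case where the events are mutually exclusive
p_c, p_d = 0.20, 0.15
p_c_or_d = p_c + p_d  # 0.35
```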
Conditional probability
the probability of occurrence of one event A, given that another event B is known to be true or have already occurred

P(A|B) = P(A and B)/P(B)
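A quick numeric sketch of the conditional probability formula, with hypothetical values for P(B) and P(A and B):

```python
# Hypothetical values for P(B) and P(A and B).
p_b = 0.40
p_a_and_b = 0.10

# P(A|B) = P(A and B) / P(B)
p_a_given_b = p_a_and_b / p_b  # 0.25
```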
random variable, X
a numerical description of the outcome of an experiment. Formally, a random variable is a function that assigns a numerical value to every possible outcome in a sample space.
probability distribution, f(x),
is a characterization of the possible values that a random variable may assume along with the probability of assuming these values.
cumulative distribution function, F(x),
specifies the probability that the random variable X will assume a value less than or equal to a specified value, x, denoted as P(X ≤ x).
binomial distribution
the probability of obtaining exactly x “successes” in a sequence of n identical experiments, called trials.
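The binomial probability of exactly x successes is C(n, x)·p^x·(1−p)^(n−x); a small Python sketch using only the standard library (the coin-flip numbers are illustrative):

```python
from math import comb

def binomial_pmf(x, n, p):
    """Probability of exactly x successes in n independent trials,
    each with success probability p."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Illustrative: probability of exactly 2 heads in 4 fair coin flips
prob = binomial_pmf(2, 4, 0.5)  # 6 * 0.5**4 = 0.375
```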
Poisson Distribution
Allows the sample size to become very large and the probability of success or failure to become very small while the expected value remains constant
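The Poisson probability of x events with expected value λ is λ^x·e^(−λ)/x!; a minimal sketch (the rate λ = 2 is illustrative):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """Probability of exactly x events when the expected count is lam."""
    return lam**x * exp(-lam) / factorial(x)

# Illustrative: probability of zero arrivals when 2 are expected
prob = poisson_pmf(0, 2.0)  # e**-2, about 0.1353
```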
Probability Density Function
A curve that characterizes outcomes of a continuous random variable, and is described by a mathematical function f(x).

Probabilities are only defined over intervals.
standard normal distribution
If a normal random variable has a mean μ = 0 and a standard deviation σ = 1,

It is represented by z
Calculating Normal Probabilities
z = (x − μ)/σ
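Putting the z-transformation together with the standard normal CDF, which is computable from the error function in Python's standard library (the mean and standard deviation below are illustrative):

```python
from math import erf, sqrt

def standard_normal_cdf(z):
    """P(Z <= z) for the standard normal, via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Illustrative: P(X <= 110) for X ~ Normal(mu=100, sigma=10)
mu, sigma, x = 100.0, 10.0, 110.0
z = (x - mu) / sigma  # 1.0
prob = standard_normal_cdf(z)  # about 0.8413
```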
Exponential Distribution
models the time between randomly occurring events, such as the time to or between failures of mechanical or electrical components.
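For an exponential distribution with rate λ, P(T ≤ t) = 1 − e^(−λt); a short sketch with an illustrative failure rate:

```python
from math import exp

def exponential_cdf(t, lam):
    """P(T <= t) when events occur at mean rate lam per unit time."""
    return 1 - exp(-lam * t)

# Illustrative: one failure per 1,000 hours on average (lam = 0.001);
# probability of failing within the first 500 hours
prob = exponential_cdf(500, 0.001)  # about 0.3935
```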
Simple Random Sampling
Every item in the population has an equal probability of being selected.
Stratified Sampling
The population is partitioned into groups, or strata, and a sample is selected from each group
Systematic Sampling
Every nth (4th, 5th, etc.) item is selected.
Cluster Sampling
A population is partitioned into groups (clusters) and a sample of clusters is selected. Either all elements in the chosen clusters are included in the sample or a random sample is taken from each of them.
Judgment Sampling
Expert opinion is used to determine the sample.
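To make the contrast between the first and third methods concrete, a small Python sketch of simple random versus systematic sampling over a hypothetical population of 100 items (the population and sample sizes are illustrative):

```python
import random

population = list(range(1, 101))  # hypothetical population of 100 items
random.seed(1)  # fixed seed so the sketch is repeatable

# Simple random sampling: every item equally likely to be chosen
srs = random.sample(population, 10)

# Systematic sampling: every nth item after a random start
n = 10
start = random.randrange(n)
systematic = population[start::n]
```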
Sampling Error
occurs naturally and results from the fact that a sample may not always be representative of the population, no matter how carefully it is selected.

The only way to reduce sampling error is to take a larger sample from the population.
Systematic Error
results from poor sample design and can be reduced or eliminated by careful planning of the sampling study
Population
a complete set or collection of objects of interest
Sample
a subset of objects taken from the population
Mode
the value that occurs most frequently
Range
the simplest measure of dispersion, computed as the difference between the maximum and minimum values in the data set
Variance
a measure of dispersion that depends on all the data. The larger the variance, the more the data are "spread out" from the mean, and the more variability one can expect in the observations
Skewness
the lack of symmetry of data

If the mean and median are equal, then the distribution is symmetrical.

If the mean is greater than the median, then the distribution is skewed to the right; if the mean is less than the median, it is skewed to the left.
Coefficient of skewness (CS)
If CS is positive, the distribution of values is positively skewed; if negative, it is negatively skewed

The closer CS is to zero, the less the degree of skewness
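One common form of the coefficient of skewness is the third standardized moment; textbooks vary in the exact divisor, so treat this Python sketch as illustrative:

```python
def coefficient_of_skewness(data):
    """Third standardized moment (population form); one common
    definition of CS -- textbooks vary in the exact divisor."""
    n = len(data)
    mean = sum(data) / n
    m2 = sum((x - mean) ** 2 for x in data) / n
    m3 = sum((x - mean) ** 3 for x in data) / n
    return m3 / m2 ** 1.5

# A few large values pull the mean above the median: CS > 0
cs = coefficient_of_skewness([1, 2, 2, 3, 3, 3, 10])
```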
Kurtosis
refers to the peakedness of a histogram
Sampling Distribution
the distribution of a statistic for all possible samples of a fixed size
Confidence Intervals
An interval estimate of a population parameter that also specifies the likelihood that the interval contains the true population parameter
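A sketch of a 95% confidence interval for a mean when σ is known, using the usual interval x̄ ± z·σ/√n (all numbers are illustrative):

```python
from math import sqrt

# Illustrative 95% CI for a population mean with known sigma;
# 1.96 is the standard normal 95% critical value.
mean, sigma, n = 50.0, 8.0, 64
z = 1.96
margin = z * sigma / sqrt(n)          # 1.96 * 8 / 8 = 1.96
ci = (mean - margin, mean + margin)   # (48.04, 51.96)
```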
Idea Generation (Product Development Phase 1)
New or redesigned product ideas should incorporate customer needs and expectations
Preliminary Concept Development (Product Development Phase 2)
New ideas are studied for feasibility
Product/Process Development (Product Development Phase 3)
If an idea survives the concept stage, the actual design process begins by evaluating design alternatives and determining engineering specifications for all materials, components, and parts.
Full Scale Production (Product Development Phase 4)
Once the design is approved and the production process has been set up, the company releases the product to manufacturing or service delivery teams
Market Introduction (Product Development Phase 5)
The product is distributed to customers
Market Evaluation (Product Development Phase 6)
Ongoing product development process that relies on market evaluation and customer feedback
Concurrent Engineering
a process in which all major functions involved with bringing a product to market are continuously involved with product development from conception through sales
Design for Six Sigma (DFSS)
represents a structured approach to product development and a set of tools and methodologies for ensuring that goods and services will meet customer needs and achieve performance objectives, and that the processes used to make and deliver them achieve high levels of quality.
DFSS four principal activities
Concept development
Detailed design
Design optimization
Design verification
Concept development
the process of applying scientific, engineering, and business knowledge to produce a basic functional design that meets both customer needs and manufacturing or service delivery requirements.
Innovation
involves the adoption of an idea, process, technology, product, or business model that is either new or new to its proposed application.
Theory of Inventive Problem Solving (TRIZ)
Developed by a Russian patent clerk who studied thousands of submissions, and observed patterns of innovation common to the evolution of scientific and technical advances.
He recognized that these concepts could be taught, and he developed some 200 exercises to foster creative problem solving.
Axiomatic design
based on the premise that good design is governed by laws similar to those in natural science.
Independence Axiom
good design occurs when the functional requirements of the design are independent of one another.
Information Axiom
good design corresponds to minimum complexity.
Quality Function Deployment (QFD)
is a planning process to guide the design, manufacturing, and marketing of goods by integrating the voice of the customer throughout the organization.
Four Linked Houses of Quality
Customer Requirements
Technical Requirements
Component Characteristics
Process Operations
Quality Control Plan
Nominal dimensions
the ideal dimension or target value that manufacturing seeks to meet
Tolerance
the permissible variation in a dimension, recognizing the difficulty of meeting a target consistently
tolerance design
involves determining the permissible variation in a dimension
Taguchi Loss Function
Measures quality as the variation from the target value of a design specification, and then translates that variation into an economic "loss function" that expresses the cost of variation in monetary terms
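The loss function is commonly written L(x) = k(x − T)², where T is the target and k is a cost coefficient; a small sketch with illustrative dollar and dimension figures:

```python
def taguchi_loss(x, target, k):
    """Quadratic Taguchi loss: cost grows with the squared
    deviation from the target value."""
    return k * (x - target) ** 2

# Illustrative: if a 0.5 mm deviation costs $2.00, then k = 2 / 0.5**2
k = 2.00 / 0.5 ** 2  # 8.0
loss = taguchi_loss(10.25, 10.0, k)  # 8 * 0.0625 = 0.5
```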
Reliability
the probability that a product, piece of equipment, or system performs its intended function for a stated period of time under specified operating conditions.
Key elements of reliability

Probability
Intended function (performance)
Time
Operating conditions
Functional failure
failure that occurs at the start of product life due to manufacturing or material defects
Reliability failure
failure after some period of use
Inherent reliability
predicted reliability determined by the design of the product or process.
Achieved reliability
actual reliability observed during use.
Failure Rate
Number of failures/Total unit operating hours
mean time to failure (MTTF).
For items that must be replaced when a failure occurs, the reciprocal of the failure rate
mean time between failures (MTBF)
For repairable items
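The failure rate and MTTF definitions above reduce to simple arithmetic; a sketch with illustrative test data:

```python
# Illustrative test data: 10 units run 1,000 hours each, 4 failures
failures = 4
total_unit_hours = 10 * 1000

failure_rate = failures / total_unit_hours  # 0.0004 failures per hour

# For non-repairable items, MTTF is the reciprocal of the failure rate
mttf = 1 / failure_rate  # 2,500 hours
```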
Product Life Characteristics Curve
shows the instantaneous failure rate at any point in time (referred to as a "bathtub" curve)
Infant Mortality Period
early failure period
reliability function, R(T),
characterizes the probability of survival to time T.
Series system
all components must function or the system will fail.
Parallel system
uses redundancy. The system will successfully operate as long as one component functions
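Under a constant failure rate λ, R(T) = e^(−λT); series systems multiply component reliabilities, while parallel systems fail only if every component fails. A Python sketch with illustrative two-component systems:

```python
from math import exp

def reliability(t, lam):
    """R(t) = e**(-lam * t): survival probability under a
    constant failure rate lam."""
    return exp(-lam * t)

def series_reliability(rs):
    """Series system: every component must survive."""
    total = 1.0
    for r in rs:
        total *= r
    return total

def parallel_reliability(rs):
    """Parallel (redundant) system: fails only if all components fail."""
    fail = 1.0
    for r in rs:
        fail *= 1 - r
    return 1 - fail

# Illustrative two-component systems, each component at 0.9
series = series_reliability([0.9, 0.9])      # 0.81
parallel = parallel_reliability([0.9, 0.9])  # 0.99
```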
Robust design
designing goods and services that are insensitive to variation in manufacturing processes and when consumers use them
Design failure mode and effects analysis (DFMEA)
identification of all the ways in which a failure can occur, to estimate the effect and seriousness of the failure, and to recommend corrective design actions.
Fault Tree Analysis
a method to describe combinations of conditions or events that can lead to a failure.
Design for Manufacturability
the process of designing a product for efficient production at the highest level of quality
Design for Excellence
an emerging concept that includes many design-related initiatives such as concurrent engineering, design for manufacturability, design for assembly, design for environment, and other "design for" approaches
Life testing
run devices until failure occurs
Accelerated life testing
overstress devices to reduce time to failure
Highly accelerated life testing
focused on discovering latent defects that would not otherwise be found through conventional methods.