229 Cards in this Set
What is Total Productive Maintenance (TPM)? Module 22

TPM = Preventive Maintenance + Cost Improvement + Operator Involvement + Maintenance & Operator Teams.
A process for preventing breakdowns. The goal is to maximize equipment effectiveness and availability. TPM is part of everyone's job (i.e., daily inspections, cleaning, repairs, replacing worn parts, periodic inspections, procedures, and records). Maximizes equipment effectiveness. Focuses on the entire life cycle. Coordinates all departments. Involves all employees. Is implemented through teams.

What is Overall Equipment Effectiveness (OEE) and what is the formula? Module 22

OEE is a mathematical score that indicates the overall effectiveness of an individual machine or group of machines.
Availability losses: breakdowns/failures. Performance losses: minor stoppages/reduced speed. Quality losses: defects and rework, startup and yield loss. Formula: OEE = Percent Equipment Availability x Performance Rate x Quality Rate x 100.
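The OEE formula on this card can be sketched in a few lines of Python; the rates below are hypothetical example numbers, not course data:

```python
def oee(availability, performance, quality):
    # OEE = Percent Equipment Availability x Performance Rate x Quality Rate x 100
    # Each input is a rate between 0 and 1.
    return availability * performance * quality * 100

# Hypothetical machine: 90% available, 95% performance rate, 99% quality rate
score = oee(0.90, 0.95, 0.99)  # ~84.6
```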

What is a Pull System? Module 22

No one upstream should produce a good or service until the downstream customer asks for it.
Pull systems are simplified visual information systems designed to respond to changes in demand simply, quickly, and accurately. Information systems to achieve pull production. Most utilize "Kanban", a tag-like card that communicates product info. Kanban is the vehicle to obtain a pull system linking subsequent operations. PUSH: anticipated usage, large lots, high inventories, waste, management by expediting, poor communication. PULL: actual consumption, small lots, low inventories, waste reduction, management by sight, better communication.

What is Kanban? Module 22

A Japanese word that means "signboard". A signal authorizing production or delivery of required materials, initiated by consumption (a downstream process should pull products from the upstream process); ex: the mail flag on a mailbox.
A system that coordinates products/materials/transaction services to create "pull" through a series of operations using cards or placards. Production Kanban (license to "make"): signals the supplier to produce a specific quantity. Withdrawal Kanban (license to "take"): defines the quantity the customer may withdraw.

What factors impact inventory levels and what do all the factors EQUAL that drive inventory? Module 22

Factors that impact inventory levels (inventory is a waste, but the need is to calculate where and how much you need, not guess at it):
Variability of production demand rate. Long changeover times. Long product family turnover rate ("wheel time"). Machine downtime. Schedule increases/decreases. Large process or transfer batch sizes. Process variability. Poor quality. Long cycle times. Bottleneck operations. All of these = LONG REPLENISHMENT TIME. Inventory (min) = Total cycle time / TAKT.

How do you calculate Inventory (min)? Module 22

Total Inventory = (ADD x R) + (Service Factor x Sigma).
ADD = Average Daily Demand. R = Replenishment Time (processing time + wait time + transport time). Service Factor = based on the Service Level decision. Sigma = calculated based on demand variability. Inventory Buffer: a management decision about the acceptable level of exposure to stockout (what do we need for pull?). Service Factor: the statistical probability of meeting demand at any given point.
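The total-inventory formula above can be sketched in Python; all the input numbers below are hypothetical:

```python
def total_inventory(add, r, service_factor, sigma):
    # (ADD x R): average demand covered during the replenishment time.
    # (Service Factor x Sigma): safety stock for demand variability.
    return (add * r) + (service_factor * sigma)

# Hypothetical: 100 units/day demand, 3-day replenishment,
# 95% service level (factor 1.65), demand sigma of 20 units
total_inventory(100, 3, 1.65, 20)  # 333.0 units
```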

How do you calculate Total Inventory? Module 22

Total Inventory = (ADD x R) + (Service Factor x Sigma).
ADD = Average Daily Demand. R = Replenishment Time (processing + wait + transport times). Service Factor = based on the Service Level decision. Sigma = calculated based on demand variability.
Service Level % / Service Factor: 80 / 0.84; 90 / 1.28; 95 / 1.65; 97.5 / 1.96; 99 / 2.33; 99.9 / 3.08.

What is Production Sequence Analysis? Module 22

Changing sequence of steps to improve the process.


Describe Cycle Time (CT), Throughput (TH), and Work in Process (WIP) and how is it calculated? Module 22

Little's Law: WIP = TH * CT
 CT: Time from when a job is released into routing to when it exits.  TH: Qty of good product produced in a process per unit time.  WIP: Inventory between start and end points of a product routing. 
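Little's Law from this card as a one-line sketch; the example numbers are hypothetical:

```python
def wip_from_littles_law(th, ct):
    # Little's Law: WIP = TH * CT
    return th * ct

# Hypothetical line: throughput of 5 units/hour, 8-hour cycle time
wip_from_littles_law(5, 8)  # 40 units of WIP
```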

What are some metrics to describe quality for variable data? Module 16

 Mean, Std Deviation.
 Defects per million (DPM).  Sigma Level.  Sigma Capability.  Cp and/or Cpk. 

What is the difference between variable and attribute data? Module 16

 Variable data is continuous (measurement data).
 Attribute data records just the presence or absence of something (good/bad, pass/fail). 

What are some metrics to describe quality for attribute data? Module 16

 First Pass Yield (FPY).
 Rolled throughput yield (RTY).  % "good" or "bad".  Defects per unit (DPU).  Defects per million opportunities (DPMO).  Sigma Capability. 

What is "probability" and how is it measured? Module 16

A measure of the likelihood (or relative frequency) of something happening.
Measured on a scale from 0 to 1 (0% to 100%).

What is a binomial (attribute data) distribution used for? Module 16

 Situations where there are 2 outcomes (pass/fail, heads/tails).


What is the normal distribution used for? Module 16

 To calculate probabilities of meeting or not meeting a specification limit when distribution of data is bell shaped.
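That probability calculation can be sketched with the standard normal CDF, written via the error function; the spec limit and process numbers below are hypothetical:

```python
import math

def prob_above(usl, mean, sigma):
    # P(X > USL) for a normal distribution, via the error function:
    # Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    z = (usl - mean) / sigma
    return 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical process: mean 70, sigma 2, upper spec limit 74 (z = 2)
prob_above(74, 70, 2)  # ~0.0228 fraction out of spec
```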


What is Poisson (attribute data) distribution used for? Module 16

When studying the number of occurrences of something per unit interval (breakdowns per hour, defects per unit).


Why do you need to transform nonnormal data distributions sometimes? How can you do it? Module 16

To make the Cpk metric reliable. Trial and error, but once the transformation is found, it will probably work the next time.


What do you do if you can not transform a distribution to make it look normal? Module 16

 Use Defects per million (dpm).


What are some important things to remember about correlation? Module 16

Correlation is fairly strong when r > .7 or r < -.7. Just because 2 variables are highly correlated, there is not necessarily a cause and effect relationship (we need to design an experiment to control for other factors before we can conclude cause and effect).


What is regression used for? Module 16

 To explore the relationship between variables.
 To come up with the "best fitting" model in order to make predictions. 

What criteria are used to find the "best fitting" line? Module 16

Least squares regression... the sum of the squared deviations about the line is minimized.
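Least squares can be sketched directly from that definition; the data below is a toy example, not from the course:

```python
def fit_line(xs, ys):
    # Least squares slope and intercept: minimizes the sum of
    # squared deviations about the fitted line.
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum(
        (x - xbar) ** 2 for x in xs
    )
    intercept = ybar - slope * xbar
    return slope, intercept

fit_line([1, 2, 3, 4], [2, 4, 6, 8])  # (2.0, 0.0) for perfectly linear data
```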


What is extrapolation? Module 16

Using a prediction equation beyond the range of data that was used to generate it. Care must be taken when extrapolating because there is no guarantee that a prediction equation will continue to fit well beyond the range of the data we have.


How do we determine whether a predictor variable is important / useful in making predictions?

The p-value in the regression output tells us the importance of a particular variable. When confirming, how close should we be to the predicted value? Within +/- 3 standard errors.


What does R2 measure? Module 17

 It measures the strength of the model, the proportion of variation that has been explained (quantifies variation that can be explained by our model).


What is adjusted R2? Module 17

Adjusts for too many insignificant terms in a model or low sample size. Want R2 and adjusted R2 to be fairly close (.9).


What is the standard error used for? Module 17

Can be used as an estimate of standard deviation when there is nothing significant in the s-hat model.


What are some measures for assessing the goodness of fit or our regression models?

 Rsquared, adjusted Rsquared, error.


What is a linear regression line? Module 17

It is the line that best represents the data with the least amount of variation.
y = mx + b. m = slope = rise/run = (change in y)/(change in x). b = where the line intercepts the y-axis. y = dependent variable (output). x = independent variable (input).

What is the Linear Regression line a.k.a. prediction line? Module 17

E = residual (how far the dots are from the line; variance from the regression line).
SS residual = sum of squares residual, a measure of variation that is not explained by the model. Take the distance of each point from the line and square the numbers to get all positive numbers.

What are (2) reasons to square numbers? Module 17

All numbers become positive.
Distance to the line is magnified (magnifying the distance to the line makes outliers easy to identify).

How do you determine which effects are significant in yhat models? Module 20

Look at the p-values. P-values less than .05 indicate significance (p-values greater than .10 indicate insignificance).


How do you determine which effects are significant in an s-hat model (2 levels) when there are no p-values? Module 20

Use the rule of thumb: coefficients that are greater than or equal to roughly half the constant are considered significant.


What are some tools you can use for selecting the right design? Module 20

DOE KISS (computer aided mode) and the flow chart on the inside front cover of the DOE text.


Why is a sample size important? Module 20

It helps us minimize our chances of making a mistake (missing an important factor or accidentally thinking something is important when it really isn't).


How do I know what sample sizes to use when designing experiments? Module 20

Recommendations are built into DOE KISS. Follow KISS guidelines for choosing an experimental design.
Table in Appendix M (we should consult this if we have to deviate from the recommendation in the software).

How do we measure orthogonality? Module 20

Tolerance (TOL) values.
Measured on a scale from 0 to 1. TOL = 1 indicates perfect orthogonality. ROT: when modeling, we want tolerance (TOL) values to all be greater than .5, with at least half above .7.

What values for Tolerance (TOL) are desired? Module 20

TOL = 1 indicates perfect orthogonality. When modeling, we want all TOL values greater than .5, with at least half above .7 (for trustworthy p-values and coefficients).


What are the rules of thumb for choosing an experimental design. Module 20

K | Design | Runs x Reps = Experiments
2 | 2^2 FF | 4 x 9 = 36
3 | 2^3 FF | 8 x 5 = 40
4 | 2^4 FF | 16 x 3 = 48
5 | 2^5 FF | 32 x 2 = 64
5 | 2^(5-1) Fract | 16 x 3 = 48
6-11 | Screening design

How do you determine statistical significance for TERMS in the yhat model? Module 20

P-value < .05: term is significant, place in y-hat (black in spreadsheet).
P-value > .10: term is insignificant, leave out of the y-hat model (red in spreadsheet). .05 < p-value < .10 is a gray zone where you will typically decide to place the term in y-hat (blue in spreadsheet).

How do you determine statistical significance for TERMS in the shat model? Module 20

For s-hat models from 2-level designs: half effects for any term should be >= s-bar/2 (i.e., the absolute value of the term coefficient should be >= 1/2 of the constant).


How do you interpret R2, Adj R2, and Reasons for Low R2? Module 20

R2 measures the strength of the model, 0 <= R2 <= 1.
Adj R2 adjusts R2 down for small sample size and/or for too many insignificant terms in the model. Reasons for low R2: 1. Poor PF/CE/CNX/SOPs. 2. Low and high X settings close together. 3. X's not related to y. 4. Wrong model, i.e., a linear fit when the true relationship is nonlinear (may have to use a different level).

How do you interpret Tolerance (TOL) values from statistical output? Module 20

Tolerance is the proportion of orthogonality (testing factors independently) for each effect. All tolerance values should be >= .5 for meaningful coefficients, and at least half should be > .7.


What is an IPO diagram? Module 21

 Graphical representation of a process, showing the inputs and outputs...good communication tool.


What is PF/CE/CNX/SOP and why is it important? Module 21

 Process flow / Cause & Effect/ Constant, Noise, Experimental / Standard Operating Procedures.
 #1 tool for reducing variation...a must on all projects...you will fail to get good data without doing this. 

Describe the basic 6 Sigma Roadmap and Master Strategy and describe which statistical tools you could use. Module 21

Define:
What's important to the customer? Business? Right project? Get a team formed.
Tools used for attribute (qualitative) & variable (quantitative) data = VSM, VOC/HOQ, SIPOC, PF/CE/CNX/SOP.
Measure:
What does the current process look like (baseline)? How well are we doing? What about our measurement process; is it accurate, repeatable, and reproducible? What is our current process capability?
Tools used for attribute (qualitative) data = PF/CE/CNX/SOP, Pareto, Control Charts, Run Charts.
Tools used for variable (quantitative) data = PF/CE/CNX/SOP, Histograms, Control Charts, Run Charts, Box Plots, Cp, Cpk, DPMO, MSA.
Analyze:
What are the key input and output variables in the process? What is the Y = f(X) relationship? Define the performance gaps. What are some of the problems and the root cause(s) of those problems? What are some potential solutions to investigate?
Tools used for attribute (qualitative) data = PF/CE/CNX/SOP, Pareto, Control Charts, Run Charts, Scatter Diagram, Voting, IPO Matrix, First Pass Yield (FPY), % "good" or "bad", Defects per million opportunities (DPMO), Sigma Capability.
Tools used for variable (quantitative) data = PF/CE/CNX/SOP, Histograms, Control Charts, Run Charts, Scatter Diagram, Voting, IPO Matrix, Box Plots, Cp, Cpk, DPMO, Hypothesis Test, DOE.
Improve:
Make improvements to the process. Test out the solutions. Reassess the capability of the new process. Define new process flows, SOPs, etc.
Tools used for attribute (qualitative) data = PF/CE/CNX/SOP, Pareto, Control Charts, Run Charts, Scatter Diagram, Chi-square test, Test of Proportion, Binomial distribution, Poisson distribution.
Tools used for variable (quantitative) data = PF/CE/CNX/SOP, Histograms, Run Charts, Scatter Diagram, Control Charts, Box Plots, Cp, Cpk, DPMO, Hypothesis Test, DOE, t-Test, F-Test, ANOVA, Defects per million (DPM), Sigma Level, Sigma Capability, Capability Index (Potential) (Cp) and/or Capability Index (Actual) (Cpk), Normal Distribution.
Control:
How successful were we? Quantify the gains, especially the financial gains. Put a plan in place to hold the gains. Communicate the results. Celebrate with the team.
Tools = Control Plan, SOPs, Control Charts, Project Storyboard.

For the Binomial, Poisson, and Normal Distributions, what kind of data is each? Module 16

Binomial Distribution is ATTRIBUTE DATA (p-chart) and is used to show the # of DEFECTIVES.
Poisson Distribution is ATTRIBUTE DATA and is used for DEFECTS. Normal Distribution is VARIABLE DATA (continuous); use an X-bar R chart.

What is a binomial distribution (attribute data) used for? Module 16

 Situations where there are 2 outcomes (pass/fail, heads/tails, defective/not defective, male/female, success/failure).


What is Poisson distribution used for? Module 16

When studying the number of occurrences of something per unit interval (breakdowns per hour, defects per unit).


What is the normal distribution? Module 16

To calculate probabilities of meeting or not meeting a specification limit when the distribution of data is bell shaped.


Why do you need to transform nonnormal data distributions sometimes? How can you do it? Module 16

To make the Cpk metric reliable.
Trial and error, but once the transformation is found, it will probably work the next time.

What do you do if you can't transform a distribution to make it look normal? Module 16

 Use Defects per Million (dpm).


Important things to remember about correlation? Module 16

Correlation is fairly strong when r > .7 or r < -.7.
 Just because 2 variables are highly correlated, there is not necessarily a cause and effect relationship (we need to design an experiment to control for other factors before we can conclude cause and effect). 

What is regression? Module 16

A statistical technique for determining the mathematical relationship between the output of a process and one or more input variables.


What criteria are used to find the "best fitting" line? Module 16

 Least squares regression.
The sum of the squared deviations about the line is minimized.

What is extrapolation? Module 16

Using a prediction equation beyond the range of data that was used to generate it.
 Care must be taken when extrapolating because there is no guarantee that a prediction equation will continue to fit well beyond the range of the data we have. 

How do we determine whether a predictor variable is important / useful in making predictions? Module 16

The p-value in the regression output tells us the importance of a particular variable.
When confirming, how close should we be to the predicted value? Within +/- 3 standard errors.

What does R2 measure? Module 16

It measures the strength of the model; the proportion of variation that has been explained (quantifies variation that can be explained by your model).


What is adjusted R2? Module 16

Adjusts for too many insignificant terms in a model or low sample size.
Want R2 and Adjusted R2 to be fairly close (.9).

What is Probability and the Rule of Thumb? Module 16

A measure of the likelihood (or relative frequency) of an event happening.
Measured on a scale from 0 to 1 (or from 0% to 100%). Probability of 0: it will not happen. Probability of 1: it will happen. Probability of .5: 50/50 (coin toss).

What are the 5 elements of Lean Six Sigma Roadmap? Module 14

 Define: What's important to the customer? Business? Right project? Team?
 Measure: How well are we doing? What does the current process look like?  Analyze: Define the performance gap. What are the root causes of problems?  Improve: Make improvement to the process.  Control: Measure success and hold the gains. 

What are the 5 graphical analysis methods for data and what they are used for? Module 14

Pareto diagrams (prioritization).
Histograms (visual picture of the shape of the distribution of data). Box plots (visual picture of a distribution; especially useful for side-by-side comparisons). Run charts (show trends in data over time). Scatter diagrams (show the relationship between 2 variables).

What is PF/CE/CNX/SOP and why is it important? Module 14

Process Flow / Cause & Effect / Constant, Noise, Experimental / Standard Operating Procedures.
 #1 tool for reducing variation...a must on all projects...you will fail to get good data without doing this. 

Lean focuses on eliminating nonvalue added steps. What are the 8 ways to reduce wastes? Module 15

DOWNTIME
 Defects.  Over Production.  Waiting.  Non Utilized Talent.  Transportation.  Inventory.  Motion.  Extra Processing. (DOWNTIME) Six Sigma: focuses on reducing variation from the remaining steps (Shift & Squeeze) Simplify and Perfect? 

For Value Added vs. Non-Value Added: what map illustrates those activities? Module 15

Time Value Map: a time-scaled graph that illustrates the amount of active and inactive time spent during one complete cycle of a process. (Current processes usually have a small amount of value-added (5%) activity; the non-value added (95%) activities are waste.)
Used to illustrate the presence of delay and wasted time. Used during current state analysis to highlight issues related to queue time or poor flow.

What are the 8 steps to create a Time Value map? Module 15

1. Identify value added and non-value added activities.
2. Determine the typical amount of time spent to complete each step.
3. Determine queue time between steps.
4. Calculate the cumulative time required to complete the process and create an axis representing total cycle time.
5. Place bars above and below the time axis, proportional to the duration of each value added and non-value added process step.
6. Draw rework loops and indicate frequency or yield impact.
7. Determine the relative proportions of queue time and active time as well as value added and non-value added time.
8. Create stacked bar graphs to summarize the data.

When and how do you use Time Value Maps? Module 15

Used to illustrate the presence of delay and wasted time.
Used during current state analysis to highlight issues related to queue time or poor flow. Commonly used in conjunction with stacked bar graphs to show the cumulative amounts of active and inactive time as well as the relative proportions of value added and non-value added activities. Commonly applied to either administrative or production oriented processes.

For Value Stream Mapping, who said: "Put a good person in a bad system and the bad system wins, no contest."? Module 15

Deming


What is Value Stream? Module 15

A sequence of all activities required to convert raw material into finished goods delivered to your customer.
A primary focus is TIME, Product and/or Service Flow, and Information Flow, which should move quickly in all directions.

What are the Value Stream components? Module 15

 Suppliers.
 Design.  Procure.  Make.  Sell.  Customers. 

What are the steps to create value stream maps? (Note: sometimes it helps to start at the end, where the customer receives.) Module 15

Identify the customer's requirements. Identify the supplier and the information needed from the supplier (material).
Perform an upstream walk. Record info in data boxes. Add information flow data. Add timeline information.

What are the benefits for Value Stream maps? Module 15

 Helps visualize entire process.
 Links flow of material with flow of information.  Points out sources of wastes.  Highlights which operations are plant constraints or pacemakers.  Helps visualize how improvements will translate into product flow. 

For Value Stream maps, we are looking for WASTE. Where do you want to reduce waste? Module 15

 Mismatch to actual customer demand.
 Low VA ratio.  Complicated planning.  Long cycle time.  Resources not correctly allocated.  Downtime.  Constraints.  Inventory.  Push production.  Large batches. 

Six Sigma focuses on reducing __________ from the remaining value added steps. Module 15

variation


Lean focuses on eliminating _____ ______ _____ _____and activities in a process. Module 15

nonvalue added steps


What are the 3 steps or activities you want to identify in a process to reduce variation and eliminate non-value added steps? Module 15

 Value Added Activities.
 NonValue Added BUT Required Activities.  NonValue Added Activities. 

What is a Time Value Map? Module 15

A time-SCALED graph that illustrates the amount of active and inactive time spent during one complete cycle of a process. Note: drawn to SCALE.


What is the definition of a Value Stream? Module 15

The entire set of activities performed to transform products and services into what is required by the CUSTOMER.
A primary focus is TIME. Product and/or Service Flow (left to right). Information Flow: quickly in all directions (both ways).

What is the best way to document a Value Stream map? Module 15

 Do it by hand.
 Use a large sheet of paper or flipchart so the diagram can be saved. 

For Value Stream maps, Step 5 is Add Timeline Information. What is Process Lead Time & Inventory Lead Time? Module 15

Process Lead Time: usually the sum of all cycle times.
Inventory Lead Time: the time from when the customer orders to when material works through the entire system (sum of all cycle times and waiting times). Inventory Days of Supply = AMOUNT ON HAND divided by DAILY CONSUMPTION.

What are some characteristics of the Binomial (attribute data)? You use it to find the distribution. Module 16

The experiment consists of a SEQUENCE of n Bernoulli trials. A Bernoulli trial is itself an experiment in which there are only two possible outcomes, which we call success or failure.
The n TRIALS are IDENTICAL, i.e., the same trial performed under identical conditions. The TRIALS are INDEPENDENT: the outcome of any one trial neither influences nor is influenced by the outcome of any other trial. The probability of SUCCESS on every trial is the SAME; call that PROBABILITY p.

What are the Binomial Distribution parameters in the tool and what is the equation for Population Average, Population Variance and what are the shape rules? Module 16

Parameters:
n = number of trials. p = probability of success on any trial (this number is provided by the CUSTOMER). q = 1 - p. The two parameters n and p uniquely determine the particular binomial distribution out of an entire family of binomial distributions.
Population Average (Expected): E(X) = n * p.
Population Variance: V(X) = sigma^2 = n * p * q.
Shapes: if p < .5, then skewed right; if p > .5, then skewed left; if p = .5, then symmetric. p <= .5 is a rare occurrence.
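The binomial mean, variance, and PMF on this card can be checked with a short sketch; n and p below are hypothetical:

```python
import math

def binom_pmf(n, p, x):
    # P(X = x) = C(n, x) * p^x * q^(n - x), with q = 1 - p
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 10, 0.1               # hypothetical: 10 trials, 10% success probability
mean = n * p                 # E(X) = np = 1.0
variance = n * p * (1 - p)   # V(X) = npq = 0.9
p_zero = binom_pmf(n, p, 0)  # ~0.349: probability of zero successes
```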

What are the parameters for Poisson (attribute data)? Module 16

Parameters:
x = number of occurrences PER unit INTERVAL (time or space). Lambda = average number of occurrences PER UNIT INTERVAL (note: this interval MUST match the interval given in the definition of the Random Variable (RV)).
Population Average: E(X) = Lambda (an estimate of the population average for Poisson data is the sample average number of occurrences, x-bar).
Population Variance: V(X) = Lambda (an estimate of the population variance for Poisson data is also the sample average number of occurrences, x-bar).
Shape: skewed right.
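The Poisson PMF implied by these parameters can be sketched as follows; the lambda value is a hypothetical example:

```python
import math

def poisson_pmf(lam, x):
    # P(X = x) = e^(-lambda) * lambda^x / x!
    # lam = average occurrences per unit interval; E(X) = V(X) = lam
    return math.exp(-lam) * lam**x / math.factorial(x)

# Hypothetical: average of 2 breakdowns per hour
poisson_pmf(2, 0)  # ~0.135: chance of an hour with zero breakdowns
```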

What is PMF? Module 16

Probability Mass Function.
P(X = x) = 

How do you know when it is a Poisson? Module 16

TEST for the PER: occurrences are counted per unit interval.


What are the Normal Distribution's parameters, population average, population variance, and shape? Module 16

Parameters:
X = normal distribution; Gaussian; BELL SHAPED. M = mean (where the center of the distribution is). sigma = standard deviation (determines how narrow or flattened the distribution is).
Population Average: E(X) = M (an estimate of the population average is x-bar, the sample average).
Population Variance: V(X) = sigma^2 (an estimate of the population variance is s^2, the sample variance).
Shape: symmetric about M.

For the Normal Distribution, how do you read: X ~ N(70, 2^2)?
Module 16
N stands for Normal Distribution.
70 is a mean of 70. 2 is a standard deviation of 2.

For the Normal Distribution, who do the specifications come from? Module 16

From the customer.


What is difference between variable & attribute data? Module 16

Variable data is continuous (measurement data) such as time, temperature, call volume, speed, etc.
Attribute data records just the presence or absence of something, such as good/bad, pass/fail, yes/no, female/male.

For attribute (discrete) data which graph could you run? Module 16

Graph =
 Pareto.  Runchart.  Control Chart. 

What are some metrics to describe quality for ATTRIBUTE data? Module 16

DPU = Defects Per Unit.
DPMO = Defects Per Million Opportunities. Sigma Capability. First Pass Yield (FPY). Average FPY. Rolled Throughput Yield (RTY).
Note: DPU = a measure of quality for discrete or attribute data. DPMO = a measure of quality for discrete or attribute data which is used for benchmarking purposes. A six sigma process will have no more than 3.4 defects per million opportunities.
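DPU and DPMO from this card as a sketch; the defect counts are hypothetical:

```python
def dpu(defects, units):
    # Defects per unit
    return defects / units

def dpmo(defects, units, opportunities_per_unit):
    # Defects per million opportunities (used for benchmarking)
    return defects / (units * opportunities_per_unit) * 1_000_000

# Hypothetical: 34 defects over 1,000 units, 10 opportunities per unit
dpu(34, 1000)       # 0.034 defects per unit
dpmo(34, 1000, 10)  # 3400.0 defects per million opportunities
```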

What kind of graphs do you use for attribute (discrete) data? Module 16

GRAPHS 
 Pareto diagrams.  Run Charts.  Control Charts. 


What are some metrics to describe quality for VARIABLE data? Module 16

Metric, Measure =
Mean, Standard Deviation. Defects per million (DPM). Sigma Level. Sigma Capability. Capability Index (Potential) (Cp) and/or Capability Index (Actual) (Cpk).

What are some graphs to describe quality for VARIABLE data? Module 16

Graphs =
 Histogram.  Run Chart.  Control Chart.  Scatter Plot.  Box Plot. 

How do you compute sigma level or sigma capability regardless of the type of data (attribute or variable). Module 16

Attribute (discrete):
Find DPM. Sigma Capability = .8406 + sqrt(29.37 - 2.221 * ln(dpm)).
Variable (continuous):
Is it NORMAL? YES: Sigma Level = min[(USL - y-bar) / sigma, (y-bar - LSL) / sigma]. NO: can we TRANSFORM to NORMAL? No: find DPM; if you cannot transform, treat as attribute data. Yes: transform (what would you use first? LOG), and you can compute Cp and Cpk using the transformed data; then Sigma Level = min[(USL - y-bar) / sigma, (y-bar - LSL) / sigma].
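The attribute-data branch can be sketched as below. The constants are my reconstruction of the approximation garbled on this card (0.8406 + sqrt(29.37 - 2.221*ln(dpm))), so treat the exact form as an assumption; it does reproduce the usual benchmark of ~6 sigma at 3.4 DPM:

```python
import math

def sigma_capability(dpm):
    # Sigma capability from defects per million, per the card's
    # (reconstructed) approximation formula.
    return 0.8406 + math.sqrt(29.37 - 2.221 * math.log(dpm))

sigma_capability(3.4)  # ~6.0: 3.4 DPM corresponds to a six sigma process
```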

What terminology is involved in SIMPLE LINEAR REGRESSION? Module 16

 Intercept.
 Slope.  Prediction Equation.  Residual.  Rsquared. 

What is F LOF? Module 16

F test for Lack Of Fit. < 6: NOT GOOD; you are missing a significant term in the model, and the model does not fit the data.
Add a term to the model to correct, e.g., a higher order (quadratic) term.

What are the two (2) categories of Kanbans? Module 22

Production Kanban (license to "make"): signals the supplier to produce a specific quantity.
Withdrawal Kanban (license to "take"): defines the quantity the customer may withdraw. Two-bin Kanban: the empty bin is the signal.

What is Flow? Module 22

Flow is the continuous, progressive adding of value: it starts at receipt of the customer request and ends at delivery to the customer (fewest # of steps with no interruptions).
A value stream map adds a time value for each step and looks at non-value added time to remove.

What is Little's Law and how is it calculated? Module 22

Little's Law: WIP = TH * CT. This allows us to focus on certain items.
WIP: Work In Process. TH: Throughput. CT: Cycle Time.
Focus on TH / ability to meet demand: TH = WIP / CT. To increase throughput, decrease CT; you could increase WIP, but that is not preferred.
Focus on CT reduction: CT = WIP / TH. Reduce WIP or increase TH.
Focus on WIP: WIP = CT * TH. Reduce cycle time or reduce TH (the preference is to reduce CT).
CT: order entry to ship (move time, queue time, setup time, and process time). TH: quantity of good product produced in a process per unit time. WIP: inventory between the start and end points of a product routing.

What is hypothesis testing? Module 23

A statistical procedure for testing a claim and helping us make good decisions in light of variation.
Typical questions include: Has the average shifted? Has the standard deviation shifted? Are two (2) proportions really different?

What tests (and rules of thumb) are used to determine if there is a significant shift in average? Module 23

A quick test is the Tukey End Count Test. Here, total end counts of at least seven (7) indicate a significant shift in the average with at least 95% confidence.
A formal (exact) statistical test is the t-test. As a rule of thumb, p-values less than .05 (or our desired level of significance) indicate a significant shift in the average.
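The Tukey end count can be sketched as a simple counting routine. This is my interpretation of the quick test, assuming the two samples do not completely overlap; the data is hypothetical:

```python
def tukey_end_count(a, b):
    # Count values of the lower sample below every value of the other
    # sample, plus values of the higher sample above every value of the
    # lower one. A total end count >= 7 suggests a shifted average
    # (~95% confidence), per the rule of thumb on this card.
    low, high = (a, b) if min(a) < min(b) else (b, a)
    end_low = sum(1 for v in low if v < min(high))
    end_high = sum(1 for v in high if v > max(low))
    return end_low + end_high

# Hypothetical before/after samples with a clear shift
tukey_end_count([1, 2, 3, 4], [5, 6, 7, 8])  # 8 -> significant shift
```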

What tests ( and rules of thumb) are used to determine if there is a significant shift in standard deviation? Module 23

A quick test is the s(max) / s(min) test (sometimes called the "e" test). Here, if the ratio is greater than 2.72, we can assume with at least 95% confidence that there has been a significant shift in standard deviation.
A formal (exact) statistical test is the F-test. As a rule of thumb, p-values less than .05 (or our desired level of significance) indicate a significant shift in the standard deviation.

What is a pvalue and how is it used? Module 23

 A p-value is used to determine whether a significant change has occurred in a hypothesis test.
 The p-value is calculated from our data and represents the probability of obtaining the sample results we did if the null hypothesis were true.  If the chance of obtaining our sample results under the null hypothesis is very small (generally less than .05), then we conclude in favor of the alternate hypothesis.  The p-value is our risk of making a Type 1 error. It provides the exact level of significance of a hypothesis test.  If the p-value is < .05 (red): not the same, not normal (highly significant). H0 has to go. 

What is the Chi-square test for independence used for? Module 23

 The chi-square test for independence is used for categorical (classified) data. It tests whether two variables are independent, or whether there is some relationship between the two (2).
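The chi-square statistic for a two-way table can be computed directly from the observed and expected counts. A minimal sketch (my own implementation; the 3.84 critical value for df = 1 at 95% confidence is a standard table value, stated here as an assumption rather than computed):

```python
def chi_square_stat(table):
    """Chi-square statistic for a two-way table of observed counts.

    Expected count for cell (i, j) = row_total * col_total / grand_total;
    the statistic sums (observed - expected)^2 / expected over all cells.
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_totals[i] * col_totals[j] / grand
            stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical 2x2 classification; df = (2-1)*(2-1) = 1, so compare
# the statistic to the 3.84 critical value for 95% confidence.
table = [[30, 10],
         [10, 30]]
stat = chi_square_stat(table)
print(stat > 3.84)  # True: reject independence
```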


How do you perform a single sample hypothesis test? Module 23

By using a confidence interval (CI).


Why do you need to use a paired test instead of a typical t-test? Module 23

Individual items in two (2) samples are sometimes paired. Perform a confidence interval on the deltas and see if zero is in the CI or not.
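The "confidence interval on the deltas" check above can be sketched with the standard library. This is an illustrative sketch only: the t critical value must be supplied by the caller from a t table (there is no t distribution in the stdlib), and the data is made up.

```python
import math
import statistics

def paired_ci(before, after, t_crit):
    """95% CI for the mean of the paired deltas (before - after).

    t_crit is the two-sided t critical value for n-1 degrees of
    freedom, supplied by the caller from a t table.
    """
    deltas = [b - a for b, a in zip(before, after)]
    n = len(deltas)
    mean = statistics.mean(deltas)
    half = t_crit * statistics.stdev(deltas) / math.sqrt(n)
    return mean - half, mean + half

# Illustrative paired data: each "after" is roughly 2 units below
# its "before", so zero should fall outside the CI.
before = [10.0, 12.0, 11.0, 13.0, 12.5]
after  = [ 8.1,  9.8,  9.2, 11.1, 10.4]
lo, hi = paired_ci(before, after, t_crit=2.776)  # t(.975, df=4)
print(lo > 0)  # True: zero is not in the CI, a real difference
```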


What is residual? e=residual Module 24

 Difference between the actual and predicted values.
 The actual y-value minus the predicted y-value. The mean of the residuals is always zero.  e = residual 
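The "mean of the residuals is always zero" property can be verified on a small least-squares fit. A minimal sketch (my own closed-form simple linear regression; the data is made up):

```python
import statistics

def fit_line(xs, ys):
    """Least-squares slope and intercept for a simple linear model."""
    xbar, ybar = statistics.mean(xs), statistics.mean(ys)
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    b1 = sxy / sxx
    b0 = ybar - b1 * xbar
    return b0, b1

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
b0, b1 = fit_line(xs, ys)

# Residual = actual y minus predicted y; least squares forces the
# residuals to sum (and therefore average) to zero.
residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
print(abs(sum(residuals)) < 1e-9)  # True
```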

What can residual plotting help us do? Module 24

Identify outliers and/or potential problems with the model fit.


What are some other diagnostics for assessing the regression model? Module 24

 Studentized residuals and R-Studentized residuals help identify outliers.
 Leverage and Cook's D help identify potential influential points. 

How can we determine if there is a serious lack of fit with our regression models? Module 24

 Residual plots.
 F-test for lack of fit in DOE PRO. 

What is Studentized Residual? Module 24

This is the residual divided by its standard deviation. It has an approximate N(0,1) distribution for larger data sets. It is used to detect outliers: you look for absolute values > 3.


What is Leverage? Module 24

 Identifies points which can exhibit an unusually high degree of influence.
 This occurs when points lie outside the normal range of the input data.  This is a diagnostic of the X's only.  A value larger than 2p/n is considered significant, where p = total number of terms in the model (including the constant) and n = total number of observations. 
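For simple linear regression the leverage values have a well-known closed form, which makes the 2p/n rule easy to demonstrate. A minimal sketch (my own code; the data is made up, with one x point far outside the normal input range):

```python
import statistics

def leverages(xs):
    """Hat-matrix diagonals for simple linear regression:
    h_i = 1/n + (x_i - xbar)^2 / Sxx."""
    n = len(xs)
    xbar = statistics.mean(xs)
    sxx = sum((x - xbar) ** 2 for x in xs)
    return [1 / n + (x - xbar) ** 2 / sxx for x in xs]

# The last x value lies well outside the range of the others,
# so it carries high leverage.
xs = [1.0, 2.0, 3.0, 4.0, 20.0]
h = leverages(xs)
p, n = 2, len(xs)       # p = terms in the model (constant + slope)
cutoff = 2 * p / n      # the 2p/n rule of thumb from the card
print(h[-1] > cutoff)   # True: the outlying x point is flagged
```

As a side check, the leverage values always sum to p, which the test below exploits.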

What is Cook's D? Module 24

 This statistic is used to detect both outliers and influential points.
 The statistic has an approximate F(p, n-p) distribution, so values bigger than 2.0 will usually be significant. 

What are the Regression Diagnostics? Module 24

 Outliers: unusual observations (Y). Check to ensure these points represent valid data. They do not necessarily impact the regression fit. Diagnostics: Studentized and R-Studentized residuals.
 High-Leverage Observations: unusual observations (X). These have a potential influence on the regression fit and should be checked very carefully. Diagnostics: leverage values.  Highly Influential Observations: without these observations, the regression fit would be significantly altered. Diagnostics: Cook's D > 2. 

What are some notes on Regression Diagnostics? Module 24

 Diagnostics alert us to potential problems.
 Diagnostics signal things we should investigate more closely.  Diagnostics are not meant to be a data-cleansing machine.  For unusual observations: try to find a cause. If the data point is invalid, then correct it or delete it. If the data point appears valid, then consider collecting additional data near that point. 

What does Tolerance (TOL) measure, what values are desired, and is it used for variable (continuous) data? Module 25

 It measures orthogonality.
 We would like to see values of 1 (perfect orthogonality), but no lower than .5 (in the case of historical data). 

When analyzing historical data, what are some things we should remember? Module 25

 Always check for outliers.
 Data is often messy.  May need to eliminate some of the variables.  Always code or standardize the factor settings.  Always check the TOL values: want all values above .5, and at least half above .70. You would standardize if values are not in this range.  Low tolerance means you cannot trust the p-values or coefficients. Delete the highest-order term first, then rerun the regression.  Each time you remove terms, your p-values and coefficients change. 

Why is orthogonality important? Module 25

 It allows us to estimate the effects independently.


What are two (2) methods to improving orthogonality when modeling? Module 25

 Autocoding (-1 to +1).
 Standardizing (subtracting off the mean and dividing by the standard deviation). 

Name at least three (3) sources for continued learning. Module 27

 Project work.
 Organization's Black Belts, Master Black Belts, and Champions.  Professional societies. 

What are some training options for Black Belts, beyond Black Belt training, to expand knowledge? Module 27

 Train the trainer.
 Design for Six Sigma (product, transactional, software).  Process simulation (WB Modeler).  Lean Principles. 

What is HTT? Module 27

 HTT stands for High Throughput Testing.
 It is a technique for testing combinations of many factors and levels to save testing time and dollars. HTT example: A = 3 levels, B = 4 levels, C = 5 levels. HTT = multiply the levels of the 2 highest factors, so in this case 4 X 5 = 20 runs.  For a full-factorial DOE, multiply all of them: 3 X 4 X 5 = 60 runs. 
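The run-count arithmetic in the card's example can be captured in two tiny helpers (an illustrative sketch; the function names are my own):

```python
from math import prod

def htt_runs(levels):
    """HTT run count: multiply the levels of the two largest factors."""
    top_two = sorted(levels)[-2:]
    return top_two[0] * top_two[1]

def full_factorial_runs(levels):
    """Full-factorial run count: multiply the levels of every factor."""
    return prod(levels)

# The card's example: A = 3 levels, B = 4 levels, C = 5 levels.
levels = [3, 4, 5]
print(htt_runs(levels))             # 20
print(full_factorial_runs(levels))  # 60
```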

What is the purpose of DFSS? Module 27

 To utilize special Six Sigma tools like Expected Value Analysis, Parameter Design, and Tolerance Allocation to reduce design and development time, improve quality, increase market share, and reduce cost.


Name the two discrete (attribute) and the continuous (variable) DISTRIBUTIONS. Module 16

 Two (2) Discrete Distributions (Attribute):
BINOMIAL (proportion defective). CHARACTERISTICS: 1. Two possible outcomes: success or failure; defective or not defective; male or female. 2. Trials are identical. 3. Trials are independent (the outcome of one doesn't influence the outcome of another). 4. The probability of success on every trial is the same; we call that probability p. BINOMIAL RANDOM VARIABLE X: expected value "X" = number of successes out of n trials, "X" = (0, 1, ..., n). Parameters: n = number of trials; p = probability of success on any trial; q = 1 - p. E(X) = mu = np. Use SPC XL > Discrete Distributions > Binomial.
POISSON (number of occurrences per unit interval of time or space). Each unit can have 0, 1, or multiple errors, defects, or some other type of measured occurrence. Examples: number of speeding tickets issued in a certain county per week; number of disk drive failures per month for a particular kind of disk drive; number of calls arriving at an emergency dispatch station per hour; number of flaws per square yard in a certain type of fabric; number of typos per page in a technical book. ***Skewed right (how much). ***Convert to the same time interval. ***Mean and variance are both Lambda in the Poisson distribution.
CONTINUOUS DISTRIBUTION (VARIABLE): Normal Random Variable X. X = normally distributed, Gaussian, bell shaped. Mu = mean. Sigma = standard deviation. Population average is E(X) = mu. Population variance is V(X) = sigma squared. Z = (X - mu) / sigma. Use Z tables to look up. Use SPC XL > Continuous > Normal. 

What are the ways to finding the probability? Module 16

Calculate probabilities of meeting or not meeting a spec limit when the distribution of the data is bell shaped.
 Use SPC XL > Continuous Distribution > Inverse Normal to find "X" given a probability. 

How do you compute sigma level or sigma capability regardless of the type of data? Module 16

 Is it normal? Run a histogram; if the KS p-value is > .05, the data is normal.
 If not normal, the data must be transformed: logarithms first; second, the 4th root (in Excel: highlight the cell, then ^ 0.25 using shift-6 for the caret; no parentheses needed).  ROT: If max / min > 10, you will have non-normal data and need to transform the data. 1. Apply the log function to the data; do a histogram and Cpk (the upper spec limit needs converting to log too). 2. If the data is still non-normal, apply another transformation technique. ROT: If the data set contains several values = 0, then apply the 4th root. If the data cannot be transformed, then find dpm and look up the sigma capability in the chart. 

Why do we need to transform a non-normal data distribution? How can you do it? Module 16

 This is to make the Cpk metric reliable.
 Trial and error; but once the transformation is found, it will probably work the next time also. 

What is the blended approach for DOE? Module 18

 Taking the best of the best from Taguchi, Shainin, and classical statisticians, and putting it together with a strong emphasis on KISS.


What is the purpose of DOE? Module 18

 A strategy for data collection where we test multiple inputs, rather than just changing one variable at a time.
 We want to learn which factors affect the average and/or standard deviation, as well as which have no effect. 

What is an interaction? Module 18

A combination effect, where the effect of one factor on the response depends on the setting of another.


What is the effect of aliasing? Module 18

 Aliasing occurs when two (2) columns in a design matrix are identical.
 When this happens, we cannot separate the effects of the variables, impairing our learning. 

What are the three (3) main methods for experimentation and which is preferred? Module 18

Yesterday's methods (traditional):
 One factor at a time.  Best guess.  All possible combinations (full factorial). Today's approach:  DOE is preferred, so that we can gain the maximum knowledge from the minimum resources: full factorial plus other selected designs. 

What is the problem with One Factor At a Time (OFAT)? Module 18

 We cannot learn about interactions.


What are an orthogonal design's advantages? Module 18

A design (test) matrix which has both vertical and horizontal balance.
 It allows us to independently estimate the effects of the factors.  Makes terms independent.  Puts them on the same scale.  Gives more accurate p-values.  Makes it easy to select the most important effects.  Helps to see if there are any significant interactions. 

What is a screening design used for? Module 18

 A screening design is used to look at a large number of factors (generally six (6) or more) to determine which have the biggest effect, so that follow-up experimentation can be done.
 It helps us prioritize our efforts.  NOTE: for modeling, k = 5 or fewer. 

Why is it important to take replicates? Module 18

So that we can estimate the process variability at different settings and obtain Shat models.


What process do we follow when designing experiments? Module 18

 The 12-step process.
 It gives us a roadmap and questions to answer as we proceed through the plan, design, conduct, analyze, and confirm stages of experimentation. 

When should resources be placed when doing DOE's? Module 18

 Up front, in the planning stage.
 Important considerations include the customer's requirements, the objective of the experiment, the schedule and budget, and the measurement process that will be used. 

What are some graphical ways to analyze DOE data? Module 18

 Marginal means plot.
 Pareto of effects.  Interaction plots.  Contour plots. 

What are other ways to analyze the data? Module 18

 Building Yhat and Shat prediction equations.
 Statistical tests of significance. 

What type of processes can DOE be applied to? Module 18

 All processes (admin, marketing, financial, design, manufacturing, sales, etc.).
 As long as we can define the inputs and the outputs. 

What does R2 measure? Module 18

 R2 measures the proportion of variation in the data (Y values) that is explained by the model (X's).
 R2 is a metric ranging from 0 to 1 (or 0% to 100%). 
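The definition above translates directly into code as R2 = 1 - SSE/SST. A minimal sketch (my own helper; the data is made up):

```python
import statistics

def r_squared(actual, predicted):
    """R^2 = 1 - SSE/SST: the share of variation in the Y values
    explained by the model's predictions."""
    ybar = statistics.mean(actual)
    sse = sum((y, p) and (y - p) ** 2 for y, p in zip(actual, predicted))
    sst = sum((y - ybar) ** 2 for y in actual)
    return 1 - sse / sst

actual    = [2.0, 4.0, 6.0, 8.0]
predicted = [2.2, 3.9, 6.1, 7.8]
print(round(r_squared(actual, predicted), 3))

# A perfect fit explains all of the variation:
print(r_squared(actual, actual))  # 1.0
```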

What is Adjusted R2? Module 18

 It adjusts for too many insignificant terms in a model or a low sample size.
 You want R2 and Adjusted R2 to be fairly close (ratio of .9 or more). 

How do we determine whether a predictor variable is important/useful in making predictions? Module 18

 The pvalue in the regression output tells us the importance of a particular variable.


When confirming, how close should we be to the predicted value? Module 18

 Within +/- 3 standard deviations.


What is standard error used for? Module 18

It can be used as an estimate of sigma when there is nothing significant in the Shat model.


Why is coding important? Module 18

 Coding improves the orthogonality of the design.
 If orthogonality is a problem (TOL values less than .5), then we cannot trust the p-values or coefficients that are reported. 

What are the five (5) main steps in DOE? Module 18

 Plan.
 Design.  Conduct.  Analyze.  Confirm. 

What are the two (2) main uses (strategies) for DOE? Module 18

 Screening.
 Modeling. 

What can we use as a guide when planning for DOE? Module 18

The twelve (12) step process.


Why do we do Design of Experiments (DOE's)? Module 18

 It is an efficient way to collect data.
 A methodology to relate input factors to output factors: y = f(x).  Tells us how to squeeze and/or shift our process.  Allows one to test multiple input factors simultaneously in the most efficient manner possible (can see interaction effects).  Used in the ANALYZE and IMPROVE phases.  Can be used to determine acceptable tolerances on the inputs.  Uses a blended approach. 

What is a designed experiment? Module 18

 Purposeful changes of the inputs (factors) in order to observe corresponding changes in the output (response).


Name what you need in a Design of Experiment (DOE). Module 18

 Number of levels for each factor (low and high settings).
 Number of factors (runs).  Design name (e.g. Full Factorial).  Number of reps (repetitions). Note: for 2 factors, need 9 reps; for 3 factors, need 5 reps; for 4 factors, need 3 reps. 

What are the strategies for experimentation? Module 18

 Screening: used to find the significant few input factors; if you are not sure of the critical input factors, screen. ROT: if there are 6 or more input factors, screen to find the significant few (brainstorm, PF/CE/CNX/SOP).
 Modeling: fewer than 6 factors, or you already know your significant few. 

What are the objectives of an experimental design? Module 18

 Obtain the maximum amount of information using a minimum amount of resources.
 Determine which factors (inputs) SHIFT the AVERAGE response, which SHIFT the VARIABILITY, and which have NO effect.  Build empirical models relating the response of interest to the input factors.  Find factor settings that optimize the response and minimize the cost.  Validate the results. 

What are some orthogonal or nearly orthogonal designs? Module 18

Orthogonal designs test input factors independently (each run has a different combination of input factors), giving independent estimation of each factor's effect on the response.
 Full factorials (for small numbers of factors, up to 4).  Fractional factorials.  Taguchi designs. 

What is the simple definition of two-level orthogonal designs?

 2 levels (low setting -1; high setting +1).
 2 to the 3rd power = a 2-level, 3-factor design. 

What is a Coded Matrix? Module 18

Uses the low and high settings for linear modeling with interactions, testing all outcomes.
Responses measure the outputs (y's). We code the matrix for 3 reasons: 1. Different units. 2. Different scales. 3. To evaluate the interactions. Vertical balance: when the sum of each coded matrix column = 0. 

Why use the plotted Pareto? Module 18

To see which factors had the most effect.


What are some assumptions for the prediction equation? Module 18

 Orthogonal design: testing all factors independently.
 Orthogonal coding (coded matrix of +1's and -1's).  2 levels for each factor. 

When assessing the regression model fit, what are the critical elements of the regression table? Module 18

 P-value: P(2 tail) tells which predictor variables are significant. ROT: if p < .05 (turns RED; BLUE is moderately significant), the predictor variable is highly significant.
 R-squared value: a measure of "goodness of fit" on a scale from 0 to 1; it measures the proportion of variation that is explained by the regression model. 1 - R2 is the proportion of variation in the data not explained by the model.  Adjusted R-squared value (more statistically correct): a modified measure of R-squared that adjusts for sample size and/or too many terms in the model; a more realistic measure of the explanatory value of the model. ROT: want the adjusted R-squared value close to the R-squared value (ideally, not less than 90% of R-squared), i.e. Adjusted R-squared / R-squared = .9 or greater.  Standard Error (estimate): measures the variation (std dev) about the regression line; used when there is no Shat model available (when you only have 1 rep): s = sqrt(sum((yi - yhat)^2) / (n - 1)). 

What is aliasing? Module 18

 When 2 factors or interactions have identical or opposite settings throughout the design matrix.


What is confounded? Module 18

 When 2 factors or interactions have partially or fully identical or opposite settings throughout the design matrix.


What is an Interaction? Module 18

 An interaction is a synergistic (or combination) effect. Two variables are said to interact when the effect that one variable has on the output depends on the setting of the other variable.
Example: two parallel lines have NO interaction; two lines that are NOT parallel have an interaction. 

What is a full factorial design and when should it be used? Module 19

 Looks at all possible combinations of the factor settings.
 Should be used when the number of factors is small, generally 4 or fewer. 
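The full factorial design matrix for 2-level factors is just every combination of the coded -1/+1 settings, which also makes its vertical balance easy to verify. A minimal sketch (my own code, not from the card):

```python
from itertools import product

def full_factorial(k):
    """All 2^k runs of a 2-level design, coded -1/+1 per factor."""
    return [list(run) for run in product([-1, 1], repeat=k)]

design = full_factorial(3)
print(len(design))  # 8 runs for 3 factors

# Vertical balance (a property of orthogonal designs): every coded
# column sums to zero.
col_sums = [sum(col) for col in zip(*design)]
print(col_sums)  # [0, 0, 0]
```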

What is a fractional factorial design and when should it be used? Module 19

 A subset of the full factorial design.
 Should be used when k = 5 factors, or when the number of runs needs to be reduced and there is some prior knowledge about interactions. 

What is a screening design used for and which one is recommended? Module 19

Objectives:
 Screening is used when there are a large number of factors (>= 6) to consider and you want to quickly find the most important ones to focus on.  The Taguchi L12 design is the recommended screening design: it has partial confounding of the interactions, rather than perfect confounding.  Advantages: identifies factors that shift sigma or xbar using few runs.  Disadvantages: no information on interactions, due to aliasing and partial confounding. 

What is the resolution of a design? Module 19

 It is a measure of the power of the design in terms of the seriousness of aliasing.
 The higher the resolution, the better the design (in general) and the less aliasing. Resolution V is good for building models. 

What are some situations where the standard factorial designs won't work? Module 19

 When the settings for a factor depend on another factor (nested design). The KISS approach says to split this into 2 experiments.
 When there is a constraint, such as the total of all components must sum to 100 (mixture design). The KISS approach says to eliminate a variable by forming ratios. 

What is a screening design used for? Module 19

 A screening design is used to look at a large number of factors (generally four or more) to determine which have the biggest effect so that follow-up experimentation can be done. It helps us prioritize our efforts.


What is a full factorial design and when should it be used? Module 19

 Looks at all possible combinations of the factor settings. Should be used when the number of factors is small, generally 3 or fewer.


What is the difference between a qualitative and quantitative factor? Module 19

 Qualitative factors have settings that are categorical (blue/green, Bill/Sue).
 Quantitative factors have settings that are numerical measurement data (20 degrees/40 degrees, 3.4/3.6/3.7). 

What are Screening Designs' objectives, advantages, and disadvantages? Module 19

 Objective: help identify the vital few input factors.
 Advantages: finds the input factors that affect the output (variation).  Disadvantages: by cutting down to a few runs, we lose insight into the interactions (how they affect the output).  12-Run Hadamard (Plackett-Burman): diagonals are positive.  Taguchi L12: can look at main factors and 2-way interactions; the 1st row is all negative. When the few are identified, you can then run a full factorial with the vital few. 

How do you run the DOE Non-Computer Aided? Module 19

DOE PRO > Create Design > Non-Computer Aided.
 Always 2 levels for us.  Enter factors: name all factors with their low and high settings.  1 response: enter the response name.  Replication: default. 

How do you run Marginal Mean Plot and Multiple Response Regression in DOE? Module 19

 DOE PRO > Analyze Design > Marginal Means Plot.
 Select the Shat and Yhat models.  DOE PRO > Analyze Design > Multiple Response Regression.  R2: want > .9.  Adjusted R2: want as close to R2 as possible.  F: want greater than 6 = strong model. 

What is Full Factorial Designs (2 Levels)? Module 19

 Objective: test as many factors as we can (all possible combinations).
 Advantages: gain insight into all interactions (evaluate all linear effects and all interactions).  Disadvantages: the more factors you have, the more runs you need; too many runs when the number of X's is > 4. If k >= 5, it is costly and becomes complicated. 

In a regression table, there are no p-values in Shat; what do you do? Module 19

 The degrees of freedom are used up.
 If you had 8 runs and df = 7, then you have none left, since the coefficient counts as a df.  ROT: see if there are any coefficients whose absolute value is >= sbar/2.  Any that are would be significant factors that you would leave in the model; rerun the model after removing the insignificant factors. 

What is Degree of Freedom (df)? Module 19

 These get used up for the Yhat.
 Look at the Pareto and determine which factors are the most significant and go with those. 

DOE can be used to find ________ _________ as well as to possibly verify suspected settings that will improve the process. Module 19

Optimal Settings


___________ is in the graphs and ___________ tool in DOE. Module 19

Optimizer and Optimization.


How do you build a Fraction of the Full Factorial? Module 19

 The full factorial for 2 to the 4th power = 16 runs.
 A 1/2 fraction will have n = 2 to the 4th / 2 = 2 to the (4-1) power = 8 runs.  Want to avoid this one due to aliasing and confounding with the 2-way interactions.  2 to the (5-1) power is ideal: aliasing doesn't happen with the 2-way interactions. 

What are the objectives, advantages, and disadvantages of a Fractional Factorial Design (2 levels)? Module 19

 Objective: test an orthogonal subset of the full factorial.
 n = 2 to the (k-q) power: q = 1 => 1/2 fraction; q = 2 => 1/4 fraction.  Advantages: time and money; saves runs.  Disadvantages: the aliasing that occurs, which results in lost information on some or all interactions.  Note: do not use Res III or IV. 

What is the rule of Hierarchy? Module 19

 If you have a main factor that is part of a significant interaction, the main factor remains (even if in blue or black in regression table).


What are some reasons for NonConfirmatory results (way off from what you predicted)? Module 19

 Experimental discipline.
 Measurement error.  Other errors.  Too much variation in the response.  Aliased effects.  Inadequate model (2-level designs assume linear relationships; you could have a polynomial relationship).  Something changed (the process changed after data collection).  Resolution V: 2-ways aliased with 3-ways, which is OK.  Primary DOE screening: Taguchi L12 (fewer than 12 runs is not recommended due to aliasing).  Resolution: what level of aliasing is in the model; it measures the degree of aliasing. 

What is the numerical (tabular) analysis? Module 17

SPC XL > Analysis Tools > Multiple Regression.


How do you run a Graphical Analysis? Module 17

SPC XL > Analysis Diagrams > Scatter Plots.


What is the Rule of Thumb for R2?

r | R2 | Strength
>= .95 | >= .9 | Strong
.84 to .95 | .7 to .9 | Moderately Strong
.71 to .84 | .5 to .7 | Weak
.55 to .71 | .3 to .5 | Weaker
< .55 | < .3 | Noise 

Why DOE? Module 18

Show how best (in the eyes of the customer) to systematically change (go faster, reduce cost) the input factors that are thought to influence the output.


What are Geometric designs used for?

Modeling.


What are NONGeometric designs used for?

Screening


For a Marginal Means Plot, identify the factor that causes the most variation and affects Yhat.

Say operator is the largest: you will want to reduce it. It is the factor that plays the most significant role affecting Yhat.


What does your Yhat do?

Shift


What does Shat do?

Squeeze


What is the simple DOE augmentation to possibly reduce the number of runs?

Probability of significance:
 Primary: A or B (2 factors A, B = 4 runs), or C (3 factors A, B, C = 8 runs), or D (4 factors A, B, C, D = 16 runs).  2-way (20%): AB, AC, BC, AD, CD.  3-way (1%): ABC, BCD, ACD.  4-way: ABCD. 

For the Rule of Hierarchy, if you have an A, B, C interaction that is significant, what do you have to leave in the model?

A, B, C


When there are NO p-values, what graph can you run? Module 19

Pareto


What is Total Lead Time? Module 22

Total Lead Time (the time from the start of a process until the customer receives the product or service) is always greater than cycle time (the time to perform a task within the process).


For Hypothesis Test, what are some questions you could ask and which hypothesis test could you run? Module 23

For two-tail tests:
 Are the means of two samples different? t-test (mean); Tukey quick test.  Are the standard deviations of two samples different? F-test (std dev); "e" ratio quick test.  Are the proportions of two samples different? Test of Proportions.  Is there a dependence between two ways to classify data? Independence test: Chi-square (attribute). One-tail tests (new tests covered in this module):  Is the mean of ONE SAMPLE of data different from a standard value? One-sample test (compare to a given value), e.g. the sample mean of people (BA1) that scored 80% or higher.  Are the means of TWO SAMPLES different, where individual values in one sample are paired with individual values in the other sample? Paired t-test (two-sample data set: the same person compared on old/new data).  Is there any difference between the means of multiple samples? ANOVA (Analysis of Variance), variable data; use when there are more than two samples, e.g. comparing the mean of the afternoon class to the mean of the morning class. 

What is RStudentized Residual?

 Similar to the Studentized Residual, except that the standard deviation is calculated without the ith data point. For most data sets, an absolute value of the R-Studentized Residual > 3.5 will give a significance level of 0.05 in finding an outlier or an influential point.


What is Historical Data Option? Module 25

 Non-Standard DOE:
 Factor settings mis-set (e.g. set at 164 instead of 160).  Some runs not completed.  Data from other than a DOE: "happens to have" data, best guesses, or other trials that have been recorded. 

How do you perform the Historical Data Option in DOE PRO? Module 25

 Create the design matrix.
 Specify the interaction terms (if any) to model.  Specify the coding for factor settings.  Specify the number of reps and input the data.  Analyze the data as usual, paying particular attention to the "TOL" (remember orthogonality) values. 

When do you Code or Standardize X's? Module 25

Coding:
 Sets the lowest value to "-1".
 Sets the highest value to "+1".
 Does a linear transformation for all values in between.
 Example:
Actual X / Coded X:
10 (lowest) / -1.0
20 / 0.0
25 / 0.5
30 (highest) / +1.0
Note:
 Coded values don't SUM to zero.
 "C" stands for Coded data on the Historical Data sheet.
Standardizing:
 The mean (X-bar) and standard deviation (S) are used to normalize the values.
 Equation: Standardized x = (actual value − X-bar of actuals) divided by S of actuals, i.e., (x − x̄) / Sx.

How many steps for Historical Data Analysis Strategy? Module 25

9 Steps:
 Steps 1 & 2: Get rid of outliers. Use summary stats, histograms and/or stem-and-leaf plots for all variables, and scatter plots.
 Step 3: Check for intercorrelations between independent variables (i.e., multicollinearity). Look at the scatter plots and correlation matrix. Remove or reduce the multicollinearity by centering the data (coding or standardizing  leave in factors that are collinear, since removing them will reduce the model), removing variables, etc. Center data using DOE PRO > Historical Data by either standardizing or coding the input variables. If variables are correlated, they are not independent.
 Step 4: Use DOE PRO to build the regression model.
 Step 5: Examine p-values, tolerances, R2, adjusted R2, etc. Identify outliers, remove them, and rerun Step 4.
 Step 6: Examine the model for insignificant terms. Stepwise, remove terms from the model that have high p-values and low tolerances, and perform the regression analysis again on the new model.
 Step 7: Repeat Steps 5 and 6 until the "best" possible model is obtained (higher R2 and higher F value).
 Step 8: Test for adequacy of the model. Examine the residual plot and test for lack of fit (F LOF). Remove any outliers and go back to Step 4.
 Step 9: Perform confirmation. Use half of the data (more like an 80/20 split) for building the model, and save the other half for cross-validation (confirmation) purposes.

What is a discrete-event simulation software package? Module 27

WB Modeler, or DMADV.


What problems can cause a LOW R Squared or the model to not confirm? Module 24

 Problem 1: Experimental discipline (inadequate SOPs, SOPs not followed). FIX: Better use of PF/CE/CNX/SOP.
 Problem 2: Data recording or input errors. FIX: Better use of PF/CE/CNX/SOP.
 Problem 3: Inputs not related to output. FIX: Better use of PF/CE/CNX/SOP.
 Problem 4: Setup variation. FIX: Better use of PF/CE/CNX/SOP.
 Problem 5: Significant factor(s) not in the model. FIX: Better use of PF/CE/CNX/SOP.
 Problem 6: Too much variation in inputs. FIX: Better SOP for inputs (consider modeling with DFSS Master).
 Problem 7: Poor measurement system. FIX: Perform MSA and fix the measurement system.
 Problem 8: Aliased effects. FIX: Use a full factorial or higher-resolution design (should not be a problem with the recommended Res V designs).
 Problem 9: Highly significant factor in the s-hat model causing low R squared in the y-hat model. FIX: Use the s-hat model to decrease variation.
 Problem 10: Lack of Fit (LOF). FIX: Use a 3-level design (the reason the CCD design is recommended); add quadratic terms.
 Problem 11: Low and high settings of a factor are too close. FIX: Increase the spread between settings.
 Problem 12: An input is blocked with a noise factor. FIX: Better randomization.

For DOE, which factors are vertically and horizontally balanced?

 Main factors A, B, C = vertical balance.
 Interactions AB, AC, BC, ABC = horizontal balance.

When designing a DOE experiment what do you need to know?

1. # of levels for each factor.
2. # of factors.
3. Design name (e.g., Full Factorial).
4. # of reps.
NOTES:
 A 2-level design with k factors has 2^k runs (e.g., 2 to the 3rd for k = 3); k = # of factors.
 Orthogonality = vertical and horizontal balance; columns are independent.
 n = (levels)^k = # of runs; n − 1 = # of terms.
 y-hat model => affects the mean  need to shift.
 s-hat model => affects the standard deviation  need to squeeze.
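The balance and orthogonality notes above can be checked with a short sketch: build a 2^3 full-factorial design matrix in coded -1/+1 units (stdlib only) and verify that every column sums to zero and that distinct columns have zero dot product.

```python
# Sketch: 2^3 full-factorial design matrix and its orthogonality (stdlib only).
from itertools import product

factors = ["A", "B", "C"]
runs = list(product([-1, 1], repeat=len(factors)))       # 2^3 = 8 runs

# One coded column per main factor; interaction columns are elementwise products.
cols = {f: [r[i] for r in runs] for i, f in enumerate(factors)}
cols["AB"] = [a * b for a, b in zip(cols["A"], cols["B"])]

# Balance: each column has equal numbers of -1 and +1, so it sums to zero.
print(all(sum(c) == 0 for c in cols.values()))           # → True
# Orthogonality (independence): dot product of two different columns is zero.
print(sum(x * y for x, y in zip(cols["A"], cols["AB"]))) # → 0
```

This zero-dot-product property is exactly why effects in a full factorial can be estimated independently of one another.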

What is MSA?

Measurement System Analysis; synonym: GR&R (Gage Repeatability & Reproducibility).
