38 Cards in this Set
- Front
- Back
Human factors
|
Designing things people use in order to enhance performance and minimize errors.
|
|
Human error
|
Inappropriate or undesirable human decision or behavior that reduces, or has the potential for reducing, effectiveness, safety or system performance.
(undesirable effect of an action/decision) |
|
Error
|
* an action or lack of action that violates or exceeds some tolerance limits of the system
* defined in terms of system requirements and capabilities
* does not imply anything about the human |
|
______ is cited as one of the most common causes of accidents.
|
Human error
|
|
Errors of omission
|
failure to perform required action
|
|
errors of commission
|
incorrect action is performed
|
|
Subclasses of commission (4)
|
* Sequence errors- step performed out of sequence
* Timing errors- actions performed at the wrong time
* Selection errors- the incorrect control is manipulated
* Quantitative errors- too much or too little of the appropriate control manipulation is made |
|
Discrete-Action Classifications
|
* Errors of commission
* Errors of omission |
|
Intentional Classification
|
errors can also be classified according to whether or not the action was performed (or omitted) as intended
* slips
* mistakes |
|
Slips
|
occur through a failure in the execution of an action
* deviation from the intended action often provides immediate feedback about the error |
|
Mistakes
|
arise from errors in planning/decision
* feedback about the error is delayed and more difficult to detect (mistakes are more serious than slips) |
|
Why do people make mistakes?
|
* Inadequate workspace and layout
* Poor environmental conditions
* Inadequate human engineering design
* Inadequate training
* Poor supervision
* Stress
* Lack of knowledge or skill |
|
Why not just automate to reduce risk?
(machines can only do what they are programmed to do) |
* Flexibility- people can do more things than machines
* Adaptability- people can handle under-defined problems or new situations better
* Ingenuity- people can solve under-defined problems better
* Thinking- people can go beyond data, can see possibilities or challenges |
|
Flexibility
|
people can do more things than machines
|
|
Adaptability
|
people can handle under-defined problems or new situations better
|
|
Ingenuity
|
people can solve under-defined problems better
|
|
Thinking
|
people can go beyond data, can see possibilities or challenges
|
|
How to avoid or reduce risk
|
* build higher quality components and systems
* redundancies to back up the system
* compensation mechanisms that mitigate failures
* escapes to get away from the consequences of error
* estimate the likelihood of potential human/machine failure and design for it
* recommend improvements in human-machine interaction |
|
How to deal with human error
|
* Selection- person's capabilities and skills
* Training- can reduce errors, but is expensive and not always effective long term
* Design
  - Exclusion designs- impossible to commit the error
  - Prevention designs- difficult to commit the error
  - Fail-safe designs- reduces the consequences of the error |
|
What is often the most cost effective approach?
|
Designing to reduce errors
|
|
HEP
|
Human Error Probability
HEP = (# of errors) / (# of error opportunities) |
|
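The HEP formula on the card above can be sketched as a small Python function (an illustrative example, not part of the original card set; the function name and sample numbers are made up):

```python
# Human Error Probability:
# HEP = (# of errors) / (# of error opportunities)
def human_error_probability(errors: int, opportunities: int) -> float:
    """Return the observed HEP for a task."""
    if opportunities <= 0:
        raise ValueError("need at least one error opportunity")
    return errors / opportunities

# e.g., 3 errors observed across 1000 attempts at the same task
print(human_error_probability(3, 1000))  # 0.003
```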
THERP
|
Technique for Human Error Rate Prediction
|
|
What is an accident?
|
* unexpected, unintentional act, chance
* unanticipated event which damages the system and/or the individual or affects the accomplishment of the system mission or the individual's task |
|
What did Suchman do?
|
(1961) produced a list of indicators of the accidental nature of an event
1. Low degree of expectedness 2. Low degree of avoidably 3. Low degree of intention |
|
Critical-incident technique
|
observed unsafe acts or near-miss accidents are described in detail
|
|
Limitations of accident databases
|
* countermeasures to reduce future accidents rarely reported
* focus tends to be on reporting accidents in a format suitable for computer analysis
* trends can be identified, but most accidents result from a complex chain of events that is not adequately described
* untrained individuals collect the data
* focus on the injury, not on the cause
* not all accidents are reported |
|
Accident liability ______ when job demands exceed worker capabilities.
|
increases
|
|
Adjustment to stress theory
|
accidents will be higher in situations where the level of stress exceeds the capacity of the people to meet it
|
|
Arousal-alertness theory
|
accidents are more likely to occur when arousal is either too low or too high
|
|
Describe Swiss cheese model
|
Slices of cheese represent defenses, barriers, and safeguards.
Holes represent failures and latent conditions. If the holes line up, an accident can occur. |
|
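The Swiss cheese model above can be illustrated with a tiny Monte Carlo sketch (an assumption-laden example, not from the card set): treat each defensive layer as independently having a "hole" with some probability, so an accident occurs only when the holes in every layer line up.

```python
import random

def accident_occurs(layer_failure_probs, rng):
    # An accident gets through only if EVERY layer's hole lines up.
    return all(rng.random() < p for p in layer_failure_probs)

def estimate_accident_rate(layer_failure_probs, trials=100_000, seed=0):
    rng = random.Random(seed)
    hits = sum(accident_occurs(layer_failure_probs, rng) for _ in range(trials))
    return hits / trials

# Four layers, each failing 10% of the time: expected accident rate
# is about 0.1**4 = 0.0001, far lower than any single layer's failure rate.
rate = estimate_accident_rate([0.1, 0.1, 0.1, 0.1])
```

The point of the sketch is the model's key claim: stacking imperfect, independent defenses drives the joint failure probability down multiplicatively.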
Errors (HFACS)
|
represent the mental or physical activities of individuals that fail to achieve their intended outcome
|
|
Violations (HFACS)
|
refer to the willful disregard for the rules and regulations
|
|
How to reduce accidents by altering behavior
(management type of improvements) |
* Procedural checklists- list of steps to be executed, memory aid
* Training- helps people acquire safe behavior practices, transferability
* Feedback- reinforces training, maintains change after the "intervention"
* Contingency reinforcement strategies- encourage repetition of specific behaviors by providing rewards (reinforcement) when they are observed
* Incentive programs- incentives for achieving certain safety records |
|
Resilience
|
intrinsic ability of an organization or system to maintain or regain a stable state, which allows it to continue safe operations after a serious event and/or in the presence of continuous stress
|
|
Resilience Engineering
|
enables organizations to create processes that are robust yet flexible, to monitor and revise risk models, and to use resources proactively in the face of disruptions or ongoing production and economic pressures
|
|
Qualities of a resilient system
|
* Anticipation- knowing what to expect
* Focus- knowing what to look for
* Feedback- knowing what to respond to
* Response- knowing what to do |
|
Risk-important steps
|
procedure steps or actions that expose products, services, or assets to the potential for harm
|
|
Critical step
|
any action that, if performed improperly, will trigger immediate, irreversible harm
|