This system is known as Just Culture, a concept developed by James Reason. Just Culture is based on the idea that accidents and safety violations result from deficiencies in the system rather than from irresponsible workers (Wood, 2003). To illustrate, in the 1960s the United States Air Force began taking oil samples from engines to better predict wear and determine engine service life (Wood, 2003). The sample bottles the Air Force used were small, and in the case of the J-57 engine, a bottle fit perfectly into the standpipe at the bottom of the oil reservoir, which, if blocked, would prevent oil from reaching the engine (Wood, 2003). If this occurred, it would eventually result in catastrophic engine failure. If a mechanic accidentally dropped a sample jar into the reservoir tank, the jar would rattle around and eventually lodge in the standpipe, killing the engine. Sometimes this happened on the next flight; sometimes it happened only after multiple flights, so determining who had dropped the jar into the tank was not always an easy task (Wood, 2003). Eventually, these failures occurred often enough to prompt an investigation. The Air Force discovered that organizations that did not have a problem with failed engines would thank, and sometimes even reward, technicians who reported that they had inadvertently dropped a jar into the reservoir (Wood, 2003). Conversely, organizations that were having a problem with failed engines would punish mechanics who accidentally dropped a jar into the tank. Because mechanics feared reprisal for making a mistake, in many cases they did not report dropping a jar into the tank, which would eventually lead to engine failure (Wood, 2003). This was a scenario in which knowledge of the problem was more important than punishing the person who made the mistake.