Human error model
Models of accident causation are models used for the risk analysis and risk management of human systems. Since the 1990s they have gained widespread acceptance and use in healthcare, in the aviation safety industry, and in emergency service organizations. One such model, the Swiss cheese model described below, is sometimes called the cumulative act effect.
Swiss Cheese Model
Reason proposed what is referred to as the “Swiss Cheese Model” of system failure. Every step in a process has the potential for failure, to varying degrees. The ideal system is analogous to a stack of slices of Swiss cheese. Consider the holes to be opportunities for a process to fail, and each of the slices as “defensive layers” in the process. An error may allow a problem to pass through a hole in one layer, but in the next layer the holes are in different places, and the problem should be caught. Each layer is a defense against potential error impacting the outcome.
For a catastrophic error to occur, the holes must align at every step in the process, allowing all defenses to be defeated. If the layers are set up with all the holes lined up, the system is inherently flawed: a problem at the beginning can progress all the way through and adversely affect the outcome. Each slice of cheese is an opportunity to stop an error, so the more defensive layers there are, and the fewer and smaller their holes, the more likely errors are to be caught and stopped.
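The layered-defense argument can be made quantitative with a toy calculation. In the sketch below, which is my own illustration rather than part of Reason's model, each layer is treated as an independent chance of failing to catch an error, so the probability of a catastrophic outcome is the product of the per-layer "hole" probabilities; the specific probabilities are made-up numbers.

```python
def accident_probability(hole_probs):
    """Probability that an error passes through every defensive layer,
    i.e. that the holes in all the cheese slices line up.
    Each entry in hole_probs is the (assumed independent) chance that
    one layer fails to catch the error."""
    p = 1.0
    for hole in hole_probs:
        p *= hole  # the error survives this layer only through a hole
    return p

# Three layers, each missing the error 10% of the time: roughly 0.001.
three_layers = [0.1, 0.1, 0.1]
print(accident_probability(three_layers))

# Adding a fourth layer, or shrinking the holes, lowers the risk further.
print(accident_probability(three_layers + [0.1]))
print(accident_probability([0.05, 0.05, 0.05]))
```

The independence assumption is the optimistic case; in real systems, common causes can correlate the holes across layers, which is exactly the situation the model warns against.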
It is generally accepted that most air accidents are related to human error, while mechanical failures have decreased enormously with the invention of new high-technology equipment (Hawkins, 1987).
Furthermore, from a human-factors perspective, every individual involved in aviation, whether in operations or in a supporting role, has individual capabilities and limitations. Many countries therefore strive to secure safety through training based on the interactions between the SHELL model's components (Hawkins, 1987).
Most importantly, the SHELL model emphasizes the interfaces between the person (the central Liveware) and the other four components, rather than the components themselves. Conversely, the model does not cover interfaces that lie outside human factors, such as Hardware-Hardware or Environment-Software (Reinhart, 1996).
In the SHELL model, each person (the central Liveware) interacts with the other four components, and the different interactions between the person and each component represent distinct human-factors concerns; the theory holds that a mismatch between the central Liveware and any of the other four components is a source of human error ("Marine", 2000).
Hardware: The Hardware comprises the physical, non-human resources of aviation, such as equipment, tools, aircraft, workspaces and buildings.
Software: The Software comprises all non-physical resources needed for organizational operation, such as organizational policies, rules, procedures, manuals and placards.
Environment: The Environment includes not only the physical conditions in which people work, such as climate, temperature, vibration and noise, but also socio-political and economic factors.
Liveware: The Liveware includes factors such as teamwork, communication, leadership and norms.
Central Liveware: The Liveware at the centre of the SHELL model is the person, characterized by human elements such as knowledge, attitudes, culture and stress. This central Liveware is regarded as the core of the model, and the other components are matched to it as the central figure (Hawkins, 1987).
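The model's restriction to interfaces involving the central Liveware can be expressed as a simple check. This is my own illustrative sketch, not part of the SHELL literature; the function name and representation are assumptions for illustration only.

```python
# The five SHELL components; the central Liveware (the person) anchors
# every interface the model covers.
CENTRAL = "Liveware"
COMPONENTS = {"Software", "Hardware", "Environment", "Liveware"}

def is_shell_interface(a, b):
    """True only for the four interfaces the model covers:
    L-S, L-H, L-E and L-L. Pairings that exclude the person,
    such as Hardware-Hardware, fall outside the model."""
    return a in COMPONENTS and b in COMPONENTS and CENTRAL in (a, b)

print(is_shell_interface("Liveware", "Hardware"))     # True: the L-H interface
print(is_shell_interface("Liveware", "Liveware"))     # True: the L-L interface
print(is_shell_interface("Hardware", "Hardware"))     # False: outside the model
print(is_shell_interface("Environment", "Software"))  # False: outside the model
```

The check mirrors Reinhart's (1996) point quoted above: interfaces that do not include the person are not human-factors interfaces and so are out of scope for SHELL.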
Firstly, the interaction between the Liveware and the Hardware (the L-H interface) is usually called the man-machine system. A simple example is the design of aircraft seating to suit passengers' comfort in flight.
Hawkins (1987) argued that the design of controls and displays, which falls under the L-H interface, should be matched to human characteristics and conveniences in order to minimize the possibility of L-H errors. Errors originating from deficiencies in this interface are commonly seen when human factors specialists consider only the design of the in-flight controls and displays themselves (Hawkins, 1987).
Research in the 1940s provided a classic instance of an L-H interface error: the old three-pointer altimeter, which caused common errors throughout aviation. Displays should therefore present information in a form that lets people carry out their tasks, drawing on knowledge of human behaviour and of how people process information, make decisions and act on them, in order to minimize the occurrence of such errors.
A fatal air accident on February 6, 1996, in which a Boeing B-757-225 crashed into the Atlantic Ocean, is an apt example of how an L-H error can cause an air accident. The aircraft suffered an incorrect airspeed indication, which eventually led to a fatal disaster in which all 189 people aboard were killed (Kebabjian, 2005).
The second interface in the SHELL model is the interaction between the Liveware and the Software. Because the Software comprises intangible resources, unlike the Hardware, errors in the L-S interface are more difficult to resolve than L-H errors.
Deficiencies in the conceptual aspects of warning systems fall under the L-S interface; likewise, an irrational indexing system in operating manuals can lead to delays or errors when people seek vital information.
For example, some early checklists carried no written responses for specific changes in the situation, and pilots at the time did not work through them properly. To reduce L-S errors, Hawkins (1987) proposed standard operating procedures (SOPs), while noting that SOPs cannot cover every possible condition and must leave room for some flexibility.
The Korean Air accident of August 6, 1997 was one of the worst air disasters of recent times, claiming the lives of 229 people: 215 passengers and 14 crew members. The disaster was caused in part by a software defect that impaired the MSAW (Minimum Safe Altitude Warning System) (Kebabjian, 2005).
The third interface in the SHELL model is the interaction between the Liveware and the Environment (L-E). Efforts to address L-E errors are well illustrated by flight equipment that helps overcome the obstacles of flight, such as the helmet against noise, the flying suit against cold, and goggles against the effects of altitude.
Additionally, the L-E interface concerns the organizational, regulatory and social aspects of the environment in the aviation field, such as employee morale and organizational health. Hawkins (1987) especially emphasized three environmental factors that can cause L-E errors: noise, heat and vibration. He also noted that these errors can be minimized by optimally controlling those three factors, as much successful research has shown.
More recently, on February 3, 2005, an air accident related to the L-E interface occurred. A Boeing B-737-200 was unable to land at its destination, Kabul, because of a blizzard; while attempting to divert to a safer landing site, it crashed into a mountain and exploded, causing 104 casualties (Kebabjian, 2005).
Finally, the last interface in the SHELL model is the interaction between Liveware and Liveware. The L-L interface relates to leadership, crew cooperation and personality interaction, and human factors experts have ascertained that L-L problems, such as errors within teamwork, have caused a great many accidents.
Hawkins (1987) also suggested possible solutions for L-L errors: CRM (Cockpit/Crew Resource Management), TRM (Team Resource Management) and LOFT (Line Oriented Flight Training). He held that such training programs, by fostering better cooperation and communication among crew members, would bring considerable reductions in the occurrence of L-L errors.
On July 31, 1992, an Airbus A310-304 bound for Kathmandu crashed and burst into flames, in an accident regarded as an example of an L-L error. Accident reports found confusing communications between the flight control tower and the crew members in the aircraft, and those miscommunications ultimately led to the fatal accident (Kebabjian, 2005).