Pathological state of aviation safety. Learnings from recent SpiceJet incidents.

Is there a formula or method which can be used to predict accidents? A spate of runway excursions, a tail strike, and finally an unfortunate accident claiming the life of a SpiceJet technician make the question pressing.

The “accident pyramid” was depicted by H. W. Heinrich in the second edition of his book Industrial Accident Prevention: A Scientific Approach. Note the last sentence: “Moral — prevent the accidents and the injuries will take care of themselves”.

F. Bird’s work revealed the following ratios in the accidents reported to the insurance company:

For every reported major injury (resulting in a fatality, disability, lost time or medical treatment), there were 9.8 reported minor injuries (requiring only first aid). For the 95 companies that further analyzed major injuries in their reporting, the ratio was one lost time injury per 15 medical treatment injuries.
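Bird's ratios can be read as simple scaling rules. The sketch below turns them into two small helper functions; the input counts are hypothetical, chosen only to show how the pyramid scales from its apex.

```python
# Toy projection using the ratios Bird reported: 1 major injury : 9.8 minor
# (first-aid-only) injuries, and 1 lost-time injury : 15 medical-treatment
# injuries in the 95-company subset. Input counts are illustrative only.

MINOR_PER_MAJOR = 9.8
MEDICAL_PER_LOST_TIME = 15

def implied_minor_injuries(major_injuries: int) -> int:
    """Minor (first-aid-only) injuries implied by Bird's 1 : 9.8 ratio."""
    return round(major_injuries * MINOR_PER_MAJOR)

def implied_medical_treatments(lost_time_injuries: int) -> int:
    """Medical-treatment injuries implied by the 1 : 15 subset ratio."""
    return lost_time_injuries * MEDICAL_PER_LOST_TIME

print(implied_minor_injuries(10))      # → 98
print(implied_medical_treatments(3))   # → 45
```

The point of the pyramid is the direction of inference: a large base of reported minor events is the early-warning signal for the rare major injury at the apex.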

The safest way to operate in any industry is to eliminate all hazards. Unfortunately, this is a virtually impossible task, since the cost of such an operation would be prohibitive. The effort required to run such a system would make it unviable in every sense. A Safety Management System (SMS) is a systematic approach to managing safety, including the necessary organizational structures, accountabilities, policies and procedures. As per ICAO requirements, service providers are responsible for establishing an SMS, which is accepted and overseen by their State.

Prior to implementation of the ICAO Annex 19 SMS, the regulations were more prescriptive. One could either perform an action or not; it was black or white. With the implementation of the SMS, actions can be performed if the risk generated is acceptable in terms of the product of probability and severity. Therefore, if the SMS program is fully implemented, risk-based decisions can be taken. The SMS program keeps track of the risk to ensure that it does not exceed the limits set by the system, through a process of feedback and periodic review. Training of personnel and effective reporting through a healthy safety culture make the system more robust and effective.

A typical safety risk analysis of SpiceJet operations would reveal the risk level as 5C due to the high frequency of major occurrences. This would typically entail immediate mitigating action by the operator/regulator or restriction of operations. These measures are required to ensure that the operator gets their act together and that the safety management system is working and, more importantly, effective.

Note: The above table is not based on actual data but indicative of a likely scenario.
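The risk index 5C combines a probability row (5 = frequent) with a severity column (C = major) on an ICAO-style risk assessment matrix. Below is a minimal sketch of such a lookup; the tolerability boundaries are illustrative assumptions in the style of ICAO Doc 9859, not actual regulatory values or SpiceJet data.

```python
# Sketch of an ICAO-style risk-matrix lookup. Probability is scored 1-5
# (5 = frequent), severity A-E (A = catastrophic). The region sets below
# are illustrative, not taken from any regulation.

PROBABILITY = {5: "Frequent", 4: "Occasional", 3: "Remote",
               2: "Improbable", 1: "Extremely improbable"}
SEVERITY = {"A": "Catastrophic", "B": "Hazardous", "C": "Major",
            "D": "Minor", "E": "Negligible"}

# Illustrative tolerability regions: indices in INTOLERABLE demand immediate
# mitigation or restriction of operations; ACCEPTABLE needs no further action;
# everything else is tolerable only with mitigation and review.
INTOLERABLE = {"5A", "5B", "5C", "4A", "4B", "3A"}
ACCEPTABLE = {"3E", "2D", "2E", "1C", "1D", "1E"}

def assess(probability: int, severity: str) -> tuple[str, str]:
    """Return the risk index (e.g. '5C') and its tolerability region."""
    index = f"{probability}{severity}"
    if index in INTOLERABLE:
        region = "intolerable: mitigate immediately or restrict operations"
    elif index in ACCEPTABLE:
        region = "acceptable as-is"
    else:
        region = "tolerable: acceptable with mitigation and review"
    return index, region

print(assess(5, "C"))  # → ('5C', 'intolerable: mitigate immediately or restrict operations')
```

Under these assumed boundaries, a 5C assessment lands squarely in the intolerable region, which is why the article argues for immediate action by the operator or regulator.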

The SMS process actively looks for hazards, evaluates the level of risk they pose and implements control strategies. A review process determines the effectiveness of the controls and sets the further strategy. This ensures that hazards do not drop out of sight and that new or latent hazards posing a risk are trapped by the mitigation process. When looking for hazards, it is important that all available data is used. This is one of the key principles of high reliability organizations (HRO). An HRO is reluctant to simplify data: the more data available, the better the chances of predicting or identifying a hazard, and the more accurately the effectiveness of the control strategy can be measured. A good and thorough investigation provides key findings and determines the root causes of an occurrence, which in turn helps prevent similar occurrences in the future.

Most errors are attributed to human factors. Humans, by their very nature, make mistakes; therefore, it should come as no surprise that human error has been implicated in a variety of occupational accidents, including 70% to 80% of those in civil and military aviation (O’Hare, Wiggins, Batt, & Morrison, 1994; Wiegmann & Shappell, 1999; Yacavone, 1993).

Unsafe Acts


Decision errors: These “thinking” errors represent conscious, goal-intended behavior that proceeds as designed, yet the plan proves inadequate or inappropriate for the situation. These errors typically manifest as poorly executed procedures, improper choices, or simply the misinterpretation and/or misuse of relevant information.

Skill-based errors: Highly practiced behavior that occurs with little or no conscious thought. These “doing” errors frequently appear as breakdowns in visual scan patterns, inadvertent activation/deactivation of switches, forgotten intentions, and omitted items in checklists. Even the manner or technique with which one performs a task is included.

Perceptual errors: These errors arise when sensory input is degraded, as is often the case when flying at night, in poor weather, or in otherwise visually impoverished environments. Faced with acting on imperfect or incomplete information, aircrew run the risk of misjudging distances, altitude, and descent rates, as well as of responding incorrectly to a variety of visual/vestibular illusions.


Routine violations: Often referred to as “bending the rules,” this type of violation tends to be habitual by nature and is often enabled by a system of supervision and management that tolerates such departures from the rules.

Exceptional violations: Isolated departures from authority, neither typical of the individual nor condoned by management.

The question which arises is: if the root cause of most errors/violations is the human being, isn’t it better to fix the cause rather than create engineering controls for every error the human makes? It is important to determine why normal procedures failed to work, or why standard operating procedures failed to trap or prevent the error. It is more fruitful to concentrate our effort on revising the SOPs and retraining humans to ensure behavioral control.

The hierarchy of controls ranks controls from most effective to least effective. Elimination of the hazard is the best means, but may not be prudent for the bottom line or the financial viability of the enterprise. Replacing the hazard may not be possible every time, since it may be an essential part of the business: the aircraft itself may be a hazard, or other equipment used onboard or attached to the aircraft may be a hazard that cannot be replaced. Isolation can be effective for activities such as keeping passengers away while refueling the aircraft, yet there are also procedures to ensure safety with passengers on board while refueling is carried out, so that commercial interests are not compromised without lowering safety standards. Administrative controls are about changing the way people work. They are among the least effective controls but the most challenging to implement.

Organizations must ensure that a safety culture prevails. A safety culture is the way people do their business without any oversight. The aim of any organization is to establish a generative safety culture, where safety comes naturally, as against a pathological culture, where people don’t care as long as they are not caught. An organization with a highly effective safety culture will spend less time finding ways to reduce errors and losses. It will need the fewest controls for error prevention and will therefore be more financially and commercially efficient.

Safety culture maturity model

Implementing a safety culture is a long-term strategy with several stages. Quality and safety both pay in the long run, but one has to stay invested in them to reap the benefits. Losses are prevented at every stage, and hazards are actively identified and their risks mitigated.
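The hierarchy of controls discussed above is, at heart, an ordered list. The sketch below encodes it that way, attaching the aviation examples from the text to each rung; note that the fifth rung, PPE, is an assumption added from the standard NIOSH hierarchy and is not discussed in the article itself.

```python
# The hierarchy of controls as an ordered list, most to least effective.
# Descriptions paraphrase the article; the PPE rung is added from the
# standard NIOSH hierarchy and is an assumption, not part of the source text.

HIERARCHY = [
    ("Elimination", "remove the hazard entirely (often cost-prohibitive)"),
    ("Substitution", "replace the hazard (rarely possible when it is the business itself)"),
    ("Isolation", "separate people from the hazard, e.g. passengers during refueling"),
    ("Administrative", "change the way people work via SOPs, training, culture"),
    ("PPE", "protect the individual worker (assumed standard fifth rung)"),
]

def rank(control: str) -> int:
    """Return 1 for the most effective control, larger numbers for weaker ones."""
    for position, (name, _description) in enumerate(HIERARCHY, start=1):
        if name.lower().startswith(control.lower()):
            return position
    raise ValueError(f"unknown control: {control}")

print(rank("Elimination"))     # → 1
print(rank("Administrative"))  # → 4
```

Ranking controls this way makes the article's trade-off explicit: the controls cheapest to reach for (administrative) sit near the bottom of the list, while the most effective ones (elimination, substitution) are the hardest to afford.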