Archives of Personal Papers ex libris Ludwig Benner, Jr.
Last updated on July 12, 2012
Originally Posted 23 Jan 1998
PREVENTING FLIGHT CREW ERRORS: PRIMARY DATA MUST DRIVE ANALYSES

Ludwig Benner, Jr., P.E.
Ludwig Benner & Associates
12101 Toreador Lane
Oakton, VA 22124
(703) 758-4800

Ira J. Rimson, P.E.
The Validata Corporation
8223 Mosquero NE
Albuquerque, NM 87109
(As of 07/01/96)
Two recent major transport crashes have again focused public and professional interest on human factors. Both occurred within highly safety-sensitive organizations. Why were the relevant human behaviors exhibited in these accidents not identified and changed before the crashes?
The authors draw on research by Professor Diane Vaughan, reported in her book The Challenger Launch Decision (February 1996), to commend a new approach to the Office of System Safety's Flight Crew Accident and Incident Human Factors project. Vaughan's work persuasively relates how reliance on the findings of the prior Presidential and Congressional investigations tainted her early views of the Challenger launch decision. Once she discovered that those earlier investigations could not answer basic questions about the launch decision, she reopened and completed the investigation by working with participants in the launch decision process, and was "turned around" by the new decision and action data once they were placed in the context of the time.
The authors conclude that OSS should get investigations of mishaps "turned around" by changing its project goals, approaches and methods, and propose specific actions.
Since the June 1995 FAA Office of System Safety (OSS) Workshop on Flight Crew Accident and Incident Human Factors, two major transport aircraft crashes have focused public and professional attention on human factors. Both the American Airlines Flight 965 crash en route to Cali, Colombia, and the U.S. Air Force T-43A (USAF B737) crash en route to Dubrovnik, Croatia, occurred within highly safety-sensitive organizations with reputations for strong commitments to human performance improvement. Both mishaps are currently under investigation by experienced investigative authorities. It is not yet clear whether these investigations can or will produce greater insights than previous investigations, whose deficiencies in human factors data development prompted the OSS's initiatives and this workshop.
About 10 years ago, the National Aeronautics and Space Administration (NASA) was generally recognized as one of the most highly safety-sensitive organizations in the world. Its reputation for leadership in risk assessment and safety decision making was severely undermined by the Challenger shuttle accident in 1986. The Challenger accident was investigated by both a Presidential Commission and a Congressional Committee. Those investigations created a documentary record that became the basis for the historically accepted explanation of the event: production pressures and managerial wrongdoing.[1] Challenger is remembered as a technical failure to which the NASA organization contributed. The House Committee's findings blamed individuals, suggesting they were unqualified for their positions, thus indicating the possibility that the decision to launch was the result of management incompetence.[2]
A book published in February 1996 (Vaughan) is relevant and instructive for this OSS Workshop. Vaughan's initial review of the public records from the Challenger investigations led her to believe that the launch decision resulted from organizational misconduct - an area of specific scholarly interest for her. But as she tried to understand the process that led to the launch decision, she discovered that the data developed by the two investigations were inadequate to answer a fundamental question: why was the decision to launch made? This question, and questions about connections between the economic strain at NASA, rule violations, and decision making about the Solid Rocket Booster, had not been asked during the investigations. Vaughan also discovered that organizational and managerial issues influencing the mental sets of the participants had been similarly ignored.
To find out why the launch decision was made, she had to reopen the investigation and complete it herself by conducting additional interviews with the people who participated in the decisions. Using the new data obtained from these original sources, she discovered that her initial impressions and interpretations - and those of the Presidential Commission and House Committee - were wrong. She describes in detail how she reached her conclusions. Her discovery stood the other investigations' findings on their heads.
Does this reversal of the conventional wisdom hold a message for the investigation and human factors communities? You bet it does! If investigations under such illustrious authorities as the President and Congress of the United States can uncritically accept the assumptions and mind-sets which led to their fallacious conclusions and prevention initiatives, surely we should not expect anything better from the common day-to-day investigative efforts of traditional authorities.
Once these interactions and their contextual world view were understood, Vaughan was able to identify the underlying problems and corresponding prevention initiatives. In the NASA/Challenger aftermath these included (1) organizational acceptance of normalized deviance; (2) organizational and managerial cultures of production; and (3) structural secrecy. Vaughan not only identified these problems, she demonstrated how their correction would have prevented the decisions which led to the disastrous outcome.
Vaughan's "Lessons Learned" can well be adopted as object lessons for this OSS initiative.
We addressed the need for new investigation paradigms in our paper for the 1995 Workshop.[7] We recommend the following specific positions be adopted by the Workshop.
identify statistically inferred associations, the ethics of which seem dubious in this area, given the cost of additional accidents in human lives and suffering.
The key to reducing human error, that is, erroneous actions or decisions, is to change human behavior: the actions and decisions themselves. That requires identifying the behavior, actions, and decisions to be changed, and their intended replacements. Undesired behavior can be identified only from original source data about specific actions and decisions, properly documented and analyzed to recognize effective prevention strategies.
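To make "properly documented" concrete, consider a minimal sketch, assuming each documented observation records one actor, one specific action or decision, a time, and the original source that supports it. The record layout, the hypothetical entries, and the simple time-gap test below are illustrative assumptions only, not a prescribed format from the MES guides (Benner 1994); the gap test merely echoes the "progressive" testing of data described in footnote [9], which identifies where additional primary data are still needed.

    from dataclasses import dataclass

    @dataclass
    class EventRecord:
        """One documented building block: who did what, when, per which source."""
        actor: str    # person or system that acted
        action: str   # one specific action or decision
        time: float   # seconds from an arbitrary reference point
        source: str   # original source: interview, recorder transcript, log

    def ordered_by_actor(events):
        """Group records by actor, with each actor's records ordered in time."""
        by_actor = {}
        for ev in sorted(events, key=lambda e: (e.actor, e.time)):
            by_actor.setdefault(ev.actor, []).append(ev)
        return by_actor

    def data_still_needed(events, max_gap):
        """Flag time gaps in any actor's record larger than max_gap,
        marking places where further primary data may still be needed."""
        gaps = []
        for actor, evs in ordered_by_actor(events).items():
            for earlier, later in zip(evs, evs[1:]):
                if later.time - earlier.time > max_gap:
                    gaps.append((actor, earlier.action, later.action))
        return gaps

    # Hypothetical records for illustration only
    records = [
        EventRecord("Captain", "accepts clearance", 2.0, "CVR transcript"),
        EventRecord("First Officer", "selects approach waypoint", 10.0, "FDR/CVR"),
        EventRecord("Captain", "begins descent", 95.0, "FDR data"),
    ]
    print(data_still_needed(records, max_gap=60.0))  # -> gap in the Captain's record

Whatever the representation, the essential point is that prevention analysis operates on discrete, attributed actions and decisions traceable to original sources, rather than on aggregated error categories.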
References
Benner, Ludwig Jr.; 10 MES Investigation Guides. Ludwig Benner & Associates, Oakton, VA, 1979; 2nd Edition, 1994.
Benner, Ludwig Jr. & Ira J. Rimson; "Quality Management for Accident Investigators". forum 24:3, October 1991 (Part 1); 25:1, March 1992 (Part 2). International Society of Air Safety Investigators (ISASI), Sterling, VA.
Federal Aviation Administration, Office of System Safety; Proceedings of Workshop on Flight Crew Accident and Incident Human Factors, June 21-23, 1995.
Hendrick, Kingsley & L. Benner, Jr.; Investigating Accidents with STEP. Marcel Dekker, New York, 1987.
Hendrick, Kingsley, L. Benner & R. Lawton; "A Methodological Approach to the Search for Indirect Human Elements in Accident Investigations." Proceedings of the Fourth International Symposium on Aviation Psychology (R. S. Jensen, Ed.), The Ohio State University, Columbus, April 29, 1987.
Reason, James T.; Human Error. Cambridge University Press, 1990.
Rimson, Ira J.; "Standards for the Conduct of Aircraft Accident Investigations". forum 23:4, February 1991. ISASI, Sterling, VA.
Vaughan, Diane; The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. University of Chicago Press, Chicago, IL, 1996.
Footnotes
[1] See Vaughan, p. 8: "Unraveling the history of the decision making ... in its televised hearing, the Commission laid the groundwork for what became the historically accepted explanation of the Challenger launch decision: production pressure and managerial wrongdoing."
[2] Ibid., p. 11, for differences between the Commission and Committee findings. Neither emphasized the technical difficulties involved in reaching the decisions in terms the public could recognize.
[3] Ibid., p. 73, about situating controversial actions in the "stream of actions" in which they occurred. See also pp. 243-247 for discussion of the decision stream context for examining decisions.
[4] Ibid., p. 393: "Retrospection corrects history, altering the past to make it consistent with the present, implying that errors should have been anticipated." Compare to J. Reason's comments about "falling prey to the fundamental attribution error (blaming people and ignoring situational factors)" and "... the retrospective observer should be aware of the beam of hindsight in his own eye." (Reason, p. 216)
[5] Ibid., p. 393: "Understanding organizational failure depends on ... going beyond secondary sources, relying instead on personal expertise based on original sources that reveal the complexity, the culture of the task environment, and the meanings of actions to insiders at the time."
[6] See OSS 1995 Workshop Proceedings, Wise and Wise, A-105, for discussion of a priori questions in the context of conducting the investigation of decision making.
[7] See OSS 1995 Workshop Proceedings, Benner & Rimson, A-21-22. Vaughan's work shows how those recommendations might be implemented.
[8] See Benner & Rimson 1992 for an example of non-statistical validation of a description of what happened, including decisions and actions by decision makers.
[9] See ethnographic research methods (Vaughan), or STEP (Hendrick and Benner), or events overlays (Hendrick, Benner and Lawton), or multilinear events sequencing methods (Benner, 1994). The last provides for "progressive" testing of data as they are acquired during an investigation, to determine what data are still needed or whether the explanation of what happened is sufficient.
[10] See OSS 1995 Workshop Proceedings, p. 7, reporting the need for narrative descriptions of events and access via a narrative search capability.
[11] A personal communication with NASDAC staff disclosed that an obstacle to data integration by NASDAC has been inconsistency of terms across reports processed.