Professionalism/The Aviation Safety Reporting System

The ASRS is a confidential self-reporting system that provides a valuable database of safety incidents in the aviation industry. On one hand, it incentivizes professional behavior by giving employees a platform to self-report safety incidents. On the other hand, incentivizing reporting through immunity from prosecution arguably lowers the expected standard of professionalism. This chapter explores the dilemma that professionals face when using incident reporting systems like the ASRS.

History of formation

On December 1, 1974, TWA Flight 514 crashed into a Virginia mountaintop while descending toward Dulles International Airport near Washington, D.C., killing all passengers and crew. Six weeks earlier, a United Airlines flight had narrowly escaped the same fate. Unfortunately, this safety information was shared only internally within United Airlines and was not relayed to other airlines.[1] The TWA 514 disaster highlighted the benefits of an industry-wide database of safety incidents, and the ASRS was institutionalized in 1976.

How it works

Figure 1: How the ASRS works

Figure 1 illustrates how the ASRS works. Any employee in the aviation industry can report a mistake or unprofessional behavior to the ASRS. The ASRS is administered by NASA, which functions as an independent third party. NASA de-identifies the reports and compiles them into a publicly searchable database.[2] This information is used by the aviation industry and by government agencies such as the Federal Aviation Administration and the National Transportation Safety Board to improve safety by issuing alerts, making policy changes,[3] and communicating safety trends to the aviation community through newsletters. In return, the reporting employee is granted immunity from prosecution.[3] If an independent investigation later suspects the employee of negligence or professional misconduct, the employee can invoke this immunity, provided an ASRS report was filed. The immunity can be exercised only once every five years.
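The two mechanics described above, de-identification before publication and a waiver of sanction that can be claimed at most once every five years, can be summarized in a short sketch. The Python sketch below is a minimal, hypothetical model for illustration only; the field names (such as reporter_id) and the five-year window measured in days are assumptions, not the actual NASA or FAA implementation.

from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class Report:
    """A raw report as submitted; identity is known only to the administrator."""
    reporter_id: str
    narrative: str
    event_date: date


@dataclass
class DeidentifiedReport:
    """What enters the publicly searchable database: no identifying fields."""
    narrative: str
    event_date: date


def deidentify(report: Report) -> DeidentifiedReport:
    # Drop the reporter's identity before the report is published.
    return DeidentifiedReport(narrative=report.narrative, event_date=report.event_date)


class ImmunityLedger:
    """Tracks when each reporter last exercised immunity (at most once per five years)."""
    WINDOW = timedelta(days=5 * 365)

    def __init__(self):
        self._last_claim = {}

    def claim(self, reporter_id: str, today: date) -> bool:
        last = self._last_claim.get(reporter_id)
        if last is not None and today - last < self.WINDOW:
            return False  # immunity already used within the five-year window
        self._last_claim[reporter_id] = today
        return True


if __name__ == "__main__":
    ledger = ImmunityLedger()
    raw = Report("pilot-123", "Altitude deviation during descent.", date(2013, 3, 1))
    public_entry = deidentify(raw)                       # what the database would hold
    print(public_entry)
    print(ledger.claim("pilot-123", date(2013, 4, 1)))   # True: first claim
    print(ledger.claim("pilot-123", date(2014, 4, 1)))   # False: within five years

In this toy model, a second claim inside the five-year window is rejected, mirroring the once-per-five-years rule described above; everything else about the real system (report screening, analysis, alerting) is outside its scope.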

Professionalism Dilemma of the ASRS

On February 12, 2009, Colgan Air Flight 3407 crashed into a house in Clarence Center, New York, killing 50 people. An investigation by the National Transportation Safety Board found that the accident was caused by the pilots’ failure to respond properly to the stall warnings, a failure linked to pilot fatigue.[4] Jackson and Earl’s study found that 75% of surveyed pilots reported experiencing severe fatigue while flying.[5] The study also reported that 21% of incidents reported to the ASRS are related to fatigue. A fatigued pilot can report to the ASRS and gain immunity from any consequences. There is thus an incentive to report and gain immunity rather than admit guilt and face the consequences. This is contrary to the conduct of a professional, who takes full responsibility for his or her actions. From this perspective, the immunity clause resembles a ‘get out of jail free’ card. The ASRS is a utilitarian system that values information about safety incidents more than the punishment of unprofessional behavior.

Similar Systems

Healthcare

Legislation in more than 22 states requires healthcare organizations to maintain incident reporting systems.[6] One example is the Patient Safety Reporting System (PSRS) used at Veterans Affairs (VA) medical centers. The PSRS was modeled after the ASRS, and NASA was contracted to develop the system for the VA.[7] These systems collect valuable information on medical near misses and supplement well-established mandatory reporting systems focused on preventable deaths and serious injuries.[8]

Medical professionals face the same professional dilemma as aviation staff in reporting safety events. However, healthcare reporting systems do not achieve the same level of reporting as the ASRS. Studies conducted in the United States indicate that, despite mandatory reporting requirements, underreporting of adverse events may be as high as 96 percent.[9] Cullen et al. reported that only 6% of hospital adverse drug events were reported through the hospital incident reporting system.[10] Similar healthcare quality assurance programs based on voluntary reporting identify only a small fraction of critical events.[11] While most physicians agree on the need for a reliable means of reporting medical error, many are reluctant to participate in such programs.[12]

UVa Honor System

The Honor System at the University of Virginia (U.Va.) operates similarly to incident reporting systems in aviation and healthcare. It relies on peer reporting of academic dishonesty and facilitates student self-reporting through the conscientious retraction and informed retraction mechanisms in return for a less severe penalty. While the UVa Honor System may be an effective deterrent to cheating, surveys indicate that it does not encourage students to hold themselves and others accountable by reporting academic dishonesty. An honor survey conducted in 2012 found that 64% of students would not report others for honor offenses because they are uneasy with the single sanction.[13] Students perceive a “widespread distaste for accusing one’s classmates” of honor violations,[14] and some analogize the honor system to a “hit squad”.[15]

Factors Influencing Self Reporting

Incentives and Tradeoffs

The alignment of individual and institutional incentives plays a critical role in self-reporting systems. Systems like the ASRS are designed to encourage employees at any level to report incidents or mishaps. Employees receive confidentiality and immunity in exchange for providing valuable first-hand information to management. This contrasts with a culture in which employees withhold information about safety breaches because they fear that admission will be detrimental to their careers.

However, such systems involve a major tradeoff. By providing immunity in exchange for cooperation, the aviation system forgoes its right to punish employees who have erred. Consequently, employees may become less prudent if they feel the consequences of their mistakes are less severe. Organizations that use incident reporting systems like the ASRS thus value the acquisition of information more than the punishment of unprofessional behavior, because they can use that information to mitigate the causes of accidents.

The success of confidential reporting systems like the ASRS depends on the type of mishap being reported (e.g., human-factors errors versus premeditated or careless acts) and the severity of the punishment avoided through immunity. Reporting in the medical profession is much lower than in aviation, and the lack of immunity could explain why hospitals are less successful with incident reporting systems. Harper and Helmreich show that immunity could increase reporting.[16] Fewer than a third of hospital reporting systems let doctors, nurses, and others recount mistakes anonymously and promise immunity to those who identify themselves.[17] Likewise, university students have little incentive to report others or themselves. Most students will not admit guilt because the single-sanction punishment, expulsion, is so severe.

If you say 'I'm guilty', then you can say goodbye to the University.

—University of Virginia Honor Counsel Conor O'Boyle[18]

Shift in responsibility

The ASRS redistributes responsibility for safety to a non-affiliated third party appointed to collect reported information, analyze it, and take steps to address the problems it reveals. Dr. Charles Billings, one of the NASA founders of the ASRS, states that “the use of this knowledge must be the responsibility of the system stakeholders”[1] rather than being left to individual employees to act on. Because the aviation system operates with little margin for safety error, and administrators cannot confirm that each pilot will take decisive action to address an issue observed on the job, this responsibility is taken out of the hands of individual pilots. The ASRS ensures that reports are collected in a standardized manner.

The lack of reliance on employees may be attributable to doubt about individuals’ ethical fortitude or to the limits of their capacity within the company structure. Past incidents, including the Applegate Memo and the Challenger launch decision, respectively, illustrate the most common reasons effective incident reporting fails: (A) workers report only to the extent that they can shift responsibility to another body and avoid liability, or (B) workers choose not to take self-directed action because of a lack of motivation, misaligned motives, or pressure from within a hierarchical structure. The ASRS and other incident reporting systems thus provide a path of least resistance for these reports to reach the NASA analysis group and be addressed properly. This is similar to the shift of responsibility for controlling academic dishonesty from faculty and administrators to students through the honor code.[19]

Organizational factors: culture and hierarchy

Air Traffic Controller in EHAM TOWER, Schiphol Airport 2006

Organizational factors play a large role in encouraging or discouraging incident reporting. Many aviation organizations embrace a ‘just culture’ that facilitates accountability through fair treatment.[20] While this culture does not guarantee complete freedom from prosecution, it supports the belief that accidents happen and that people and systems are fallible.[21] These cultures in turn shape how errors are perceived and how forthcoming people are in reporting incidents. For example, to ease the stigma of admitting mistakes when air traffic controllers bring two planes too close together, the FAA changed the categorization of such events from “operational errors” to “operational incidents”.[22]

However, the culture of health care is not as forgiving of errors by clinicians, especially physicians. Neither the public nor physicians themselves are tolerant of medical error; indeed, physicians generally do not feel comfortable talking about their errors,[23] nor can colleagues generally be expected to offer support.[24] As Leape wrote,

...In everyday hospital practice, the message is [...] clear: mistakes are unacceptable. Physicians are expected to function without error, an expectation that physicians translate into the need to be infallible. [...] This kind of thinking lies behind a common reaction by physicians: How can there be an error without negligence?

—Leape [9]

The actions of others, especially senior employees in an organization, set a precedent for junior employees to follow. Junior doctors say they rarely see their seniors report or act on errors, whether their own or those of others.[25] Unwarranted administrative intrusions into practice also inhibit error acknowledgement,[26] as they reinforce the hierarchy within organizations and create an environment that is adversarial to admitting mistakes. Research shows that pilots were less likely than surgeons and anaesthetists to deny the effects of fatigue on performance,[27] highlighting the potential impact of cultural differences between aviation and medicine on error acknowledgement.

Utilitarian vs Professional Conduct

The most detrimental error is failing to learn from an error.

—Reason, J. T.[28]

From a utilitarian perspective, incident reporting systems like the ASRS are mechanisms for gathering sensitive information that would otherwise be difficult to collect in industry. By aggregating a large amount of this information for analysis, system managers can make informed decisions that facilitate productive changes to organizations and industries. It is to this end that a utilitarian justifies the methods of data collection used in incident reporting systems.

However, if a professional is defined as one who exercises expert judgement and takes personal responsibility for the consequences, then integrity alone should suffice to motivate self-directed action to mitigate any negative outcomes of one’s actions. Incident reporting systems like the ASRS that rely on incentives to drive reporting assume a lower expectation of professional conduct in favor of centralized action.

Conclusion

Confidential self-reporting systems like the ASRS enable the collection of valuable information that might otherwise go unreported. However, by providing immunity, the ASRS permits some professional misconduct to go unpunished. Immunity is an incentive for non-heroic subordinates to report safety incidents. While an ideal professional like Frances Oldham Kelsey would report any safety incident even if no immunity was provided, proponents of these confidential reporting systems recognize that not all professionals can be held to the same ideal standard. Significant cultural barriers might also prevent reporting and organizational learning. By expecting a lower standard of professional behavior and adopting a utilitarian approach, reporting systems like the ASRS encourage non-heroic subordinates to report incidents for systemic improvements to organizations and industries.

References

  1. NASA. ASRS: The Case for Confidential Incident Reporting Systems. NASA Pub. 60. Retrieved from http://asrs.arc.nasa.gov/docs/rs/60_Case_for_Confidential_Incident_Reporting.pdf
  2. NASA. (2013). Publicly searchable ASRS database. http://asrs.arc.nasa.gov/search/reportsets.html
  3. NASA. (2011). ASRS program briefing. http://asrs.arc.nasa.gov/docs/ASRS_ProgramBriefing2011.pdf
  4. Hradecky, S. (2010). Crash: Colgan DH8D at Buffalo on Feb 12th 2009, impacted home while on approach. Retrieved from http://avherald.com/h?article=414f3dbd/0037&opt=0
  5. Jackson & Earl. (2006). Prevalence of fatigue among commercial pilots [3]
  6. Pronovost P., Morlock L., Sexton J., Miller M., Holzmueller C., Thompson D., Lubomski L., Wu A. Improving the Value of Patient Safety Reporting Systems. Retrieved from http://www.ahrq.gov/downloads/pub/advances2/vol1/advances-pronovost_95.pdf
  7. NASA (2004). The Patient Safety Reporting System, Program overview. Retrieved from http://psrs.arc.nasa.gov/program_overview.htm
  8. Gambino R, Mallon O. (1991). Near misses - an untapped database to find root causes. Lab Report,13:41–44.
  9. Leape L. (1994). Error in medicine. JAMA, 272:1851-7.
  10. Cullen D. J., Bates D.W., Small S. D., Cooper J. B., Nemeskal A. R., Leape L. L. (1995). The incident reporting system does not detect adverse drug events: A problem for quality improvement. Jt Comm J Qual Improv, 21:541-8.
  11. Sanborn K.V., Castro J., Kuroda M., Thys D. M. (1996). Detection of intraoperative incidents by electronic scanning of computerized anesthesia records: Comparison with voluntary reporting. Anesthesiology, 85:977-87.
  12. Robinson A. R., Hohmann K. B., Rifkin J. I. et al. (2002). Physician and public opinions on quality of health care and the problem of medical errors. Arch Intern Med., 162(19):2186–90.
  13. The University of Virginia Magazine (2013). Honor Committee Proposes Changes to System. Retrieved from http://uvamagazine.org/only_online/article/honor_committee_proposes_change_to_system#.UWdo8KKG3Io
  14. Bok, D. (1990). Universities and the Future of America. Durham, N.C.: Duke University Press
  15. McCabe, D. L. & Trevino, L. K. (1993). Academic dishonesty: Honor codes and other contextual influences. Journal of Higher Education, 522-538.
  16. Harper & Helmreich. (2005). Identifying Barriers to the Success of a Reporting System. Retrieved from http://www.ncbi.nlm.nih.gov/books/NBK20544/
  17. O’Reilly, K. (2009). Hospital error-reporting systems falling short. Retrieved from http://www.amednews.com/article/20090212/profession/302129998/8/
  18. NBC29 (2013). UVA students to vote on honor system changes. Retrieved from http://www.nbc29.com/story/20679475/uva-students-to-vote-on-honor-system-changes
  19. Bowers, W. J. (1964). Student Dishonesty and Its Control in College. New York: Bureau of Applied Social Research, Columbia University.
  20. Gill, G. K., & Shergill, G. S. (2004). Perceptions of safety management and safety culture in the aviation industry in New Zealand. Journal of Air Transport Management, 10, 233-239. Retrieved from http://hfskyway.faa.gov/(A(gPImTtVtzgEkAAAAM2IwM2Y5NDMtODVkYy00ZWViLWJhNGQtYzY5N2NmOGZjZGJk25-tfzAsoS9i4M8Pi0i03m9P8VM1))/HFTest/Bibliography%20of%20Publications%5CHuman%20Factor%20Maintenance%5CPerceptions%20of%20safety%20management%20and%20safety%20culture%20in%20the%20aviation%20industry%20in%20New%20Zealand.pdf
  21. Scott, R. (2011). Do you operate in a “just culture”? Retrieved from https://www.iaftp.org/2011/11/do-you-operate-in-a-%E2%80%9Cjust-culture%E2%80%9D/
  22. Ahlers, M. M. (2012). FAA says new ‘safety culture’ will stress solutions, not blame. Retrieved from http://www.cnn.com/2012/03/14/travel/faa-nonpunitive-reporting
  23. Newman M. (1996). The emotional impact of mistakes on family physicians. Arch Fam Med; 5:71-5.
  24. Ely J. (1996). Physicians' Mistakes: Will your colleagues offer support? Arch Fam Med; 5:76-7.
  25. Lawton R., Parker D. (2002). Barriers to incident reporting in a healthcare system. Qual Saf Health Care, 11:15–18.
  26. Waring J. (2005). Beyond blame: cultural barriers to medical reporting. Soc Sci Med, 60:1927–1935.
  27. Sexton J, Thomas E., Helmreich R. (2000). Error, stress, and teamwork in medicine and aviation: cross sectional surveys. BMJ, 320. Retrieved from http://www.bmj.com/content/320/7237/745.abstract?ijkey=089b05c47228169635bbae42155fe7e647ab7e39&keytype2=tf_ipsecsha
  28. Reason, J. T. (1990). Human error. New York: Cambridge University Press