
Journal of Ergonomics
Open Access

ISSN: 2165-7556


Research Article - (2021) Volume 11, Issue 5

Human Factors in Aviation and Healthcare: Best Practices, Safety Culture and the Way Ahead for Patient Safety

Philip G. Fatolitis* and Anthony J. Masalonis
 
*Correspondence: Philip G. Fatolitis, Spectrum Software Technology, Atlantic City International Airport, New Jersey, USA, Tel: +1 850-525-4495


Abstract

Error resilience is an important aspect of design and operations in complex sociotechnical systems, especially those that involve the potential for risk to human life. Many healthcare organizations have promoted or successfully adopted human factors approaches that have been historically applied in aviation in order to improve patient safety and healthcare provider performance. However, the integration of aviation human factors approaches has not been seamless in some healthcare scenarios. Here, the authors explore prominent human factors issues in the two industries in terms of challenges and successes in conducting human factors research and in applying its best practices.

Keywords

Information technology; Safety management system; Safety data; Organizational culture

Introduction

Complex sociotechnical systems are defined by the interdependence of social and technical aspects of an organization in the accomplishment of goals. As “systems of systems,” and to achieve optimal performance, organizations must consider both aspects in system design and operations. Both the aviation and healthcare industries have been identified as being composed of sociotechnical systems [1,2]. Aircraft operations, air traffic control and maintenance are a few of the domains in aviation that must interact effectively to safely and efficiently execute the industry’s mission of transporting people and goods. Professionals from many disciplines need to cooperate in support of this larger overall mission. Each individual exerts effort in optimizing their own outcomes and in coping with system elements in a way that might not be preferable to other operators. For example, airlines are motivated to optimize their own performance (e.g., schedule adherence) while air traffic management personnel’s goals include optimizing system efficiency and fairness to all airlines and other aircraft operators. Operators engaged in these dynamic tasks employ a wide range of procedures and technologies.

Similarly, in the healthcare setting, professionals are challenged with providing patient care using complicated procedures and advanced technologies, in settings where conflicts can arise with respect to the needs and preferences of the patient, organizational priorities and standards of care. Critical human factors performance demands involving communication, decision-making, appropriate procedure application, information technology usability and workload are inherent to both industries and deserve appropriate consideration in the design and operations of both.

Both industries involve highly trained, specialized professionals who must make decisions rapidly and effectively, often as a team and in matters affecting safety and risk to human life. These professions attract, acculturate and retain individuals who demonstrate a high degree of competency in knowledge, skills, judgment and decision-making, and who exhibit the high confidence in themselves and their decisions that their tasks require. At times, this confidence comes at the expense of questioning one’s own judgments or those of colleagues who are, or are perceived to be, senior. Nevertheless, even when expertise is applied or appropriate questioning does occur, the competency of all parties concerned does not preclude error. Evidence shows that even seasoned experts can commit errors, as factors such as competing task demands, ambiguity and organizational pressures adversely influence the cognition and performance of both individuals and teams [3].

This work, by authors with long careers in aviation human factors who have recently applied their skills in the healthcare domain, explores the parallels and differences in the two industries in terms of the challenges and successes in conducting human factors research and applying its best practices. We also suggest ways in which the industries can learn from each other to improve safety and performance in sociotechnical systems. The bulk of this analysis focuses on application of best practices in the aviation industry to the healthcare domain, rather than the other way around. We note that orienting this editorial in that direction may be primarily a consequence of the authors’ experience and knowledge; there may be many areas where aviation safety research and practice can be informed by successes in the healthcare industry.

Medical and Aviation Error in the U.S.

Human error is “the failure of planned actions to achieve their desired ends without the intervention of some unforeseen event” [4]. Errors generally emerge from a combination of active and latent factors. Active failures are defined as errors of commission or omission by operators. Latent conditions create an environment where active failures are more likely to occur and can exist throughout an organization. Latent factors can present in a matter of seconds or may exist over the span of years. Undesired outcomes are the result of the interaction between latent and active failures, exemplified by the well-known Swiss cheese model of human error [4].
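The Swiss cheese model lends itself to a simple quantitative reading: an adverse outcome requires an error trajectory to find a “hole” in every defensive layer at once. The following is a minimal sketch of that idea, under a strong (and simplifying) assumption that layers fail independently; all layer probabilities are invented for illustration and do not come from the cited sources.

```python
# Illustrative sketch of the Swiss cheese model: an adverse outcome occurs
# only when an active failure passes through a hole in every defensive
# layer. Probabilities below are hypothetical and assume independence
# between layers, a simplification of Reason's qualitative model.

def outcome_probability(layer_hole_probs):
    """Probability that a single error trajectory penetrates all layers."""
    p = 1.0
    for hole_prob in layer_hole_probs:
        p *= hole_prob
    return p

# Hypothetical defenses: training, checklists, supervision, automation alerts
layers = [0.10, 0.05, 0.20, 0.08]
print(f"P(adverse outcome | error) = {outcome_probability(layers):.6f}")

# A latent condition (e.g., degraded supervision) widens one hole:
layers_degraded = [0.10, 0.05, 0.60, 0.08]
print(f"With a latent weakness:      {outcome_probability(layers_degraded):.6f}")
```

The sketch makes the model’s core point visible: no single layer has to be perfect, but a latent weakness in any one layer multiplies the probability that an active failure reaches the patient or aircraft.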

While many practices in both industries are aimed at maintaining and improving safety, incidents involving injury or loss of life do occur, and human error is all too often a primary or contributing causal factor. Identifying and mitigating latent and active system faults that facilitate error occurrence is a crucial goal for human factors/ergonomics professionals and all those seeking to improve system performance and safety.

In their oft-cited 2000 report, “To Err is Human: Building a Safer Health System,” the Institute of Medicine (IOM) defined medical error, following James Reason [5], as “the failure of a planned action to be completed as intended or the use of a wrong plan to achieve an aim” [6]. Based on an examination of historical data, the report estimated that preventable medical errors cause 44,000 to 98,000 deaths annually in the U.S., at an associated financial cost of $17 to $29 billion per year. According to the report, errors commonly observed during healthcare provision include adverse drug events, improper transfusions, surgical injuries and wrong-site surgery, restraint-related injuries or death, falls, burns, pressure ulcers and mistaken patient identity [6]. High error rates with serious consequences were most likely to occur in intensive care units, operating rooms and emergency departments [6]. Further, the report indicated that medical errors are also costly in terms of patients’ lost trust and satisfaction in the healthcare system and in diminished job satisfaction for health professionals [6].

Research conducted subsequent to the IOM report has produced varying estimates of medical error-related mortality in the U.S. A 2016 BMJ (formerly British Medical Journal) report estimated, based on a meta-analysis of international literature, that medical error causes more than 250,000 inpatient deaths in the U.S. annually [7]. The report argued that mortality was underestimated in the IOM report primarily because the International Classification of Disease (ICD) codes used for death certificate documentation lack well-defined human factors-related entries. A 2020 meta-analysis estimated 22,165 preventable deaths annually, of which 7,150 occurred in patients with a life expectancy greater than three months [8]. According to this report, most medical errors involved poor monitoring or management of medical conditions, diagnostic errors and errors related to surgery and procedures. However, this study included reports from Canada and Europe from which U.S. mortality was extrapolated. Further, increased organizational oversight subsequent to publication of the IOM results may have influenced the data examined in the 2020 report.
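One reason these national estimates diverge so widely is that they are, at bottom, a per-admission rate extrapolated over an exposure base; small differences in the measured rate scale into enormous differences in the headline figure. The sketch below illustrates that arithmetic only; both inputs are hypothetical placeholders, not figures from the studies cited above.

```python
# Back-of-the-envelope extrapolation of the kind underlying the mortality
# estimates discussed above:
#   deaths/year = (preventable-death rate per admission) x (annual admissions)
# Both inputs are hypothetical placeholders, not data from [6], [7] or [8].
annual_admissions = 35_000_000        # hypothetical U.S. inpatient admissions

for rate in (0.0002, 0.001, 0.007):   # hypothetical per-admission death rates
    print(f"rate {rate:.2%} -> {rate * annual_admissions:,.0f} deaths/yr")
```

A shift of well under one percentage point in the assumed rate moves the national estimate by an order of magnitude, which is why differences in case definitions, ICD coding and source populations produce such different totals.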

Aircraft incidents are also responsible for far too many deaths and injuries, as well as property damage and loss. The National Transportation Safety Board’s (NTSB) accident statistics report for 2019 tabulated 1,303 civil aviation accidents in the U.S., combining general aviation, commuter and on-demand carriers, and air carriers [9]. These included 248 accidents resulting in 444 fatalities. Just as with medical errors, it is not always possible to determine whether casualties were the result of human error or were preventable, though every aviation casualty may be considered “preventable.” Furthermore, there is some consensus that the majority of aviation accidents result from human error, with the percentage of incidents attributable to human factors estimated at 60%, and perhaps closer to 80% [10,11].

Regardless of variation in the reported estimates, human error is usually a causal factor in adverse events in both medicine and aviation. In the medical milieu, the prevalence of incorrect diagnoses, procedure execution errors, prescription errors and treatment delays may be underestimated because investigation and reporting of both active and latent failures may not be encouraged or rewarded by organizational cultures. Making rational, data-based and cultural changes in the U.S. healthcare system can help to improve provider performance, prevent errors and improve patient outcomes.

Applying Human Factors Practices in Aviation to Healthcare

The intent of applying human factors principles and methods in healthcare has been to identify and decrease the incidence of error-related patient harm and to improve organizational and provider performance. Although progress has been made, improvements are needed. The healthcare domain has adapted various human factors and ergonomics-related approaches from aviation with varying degrees of success due to differences between the industries and the way in which the application takes place [12]. Broad categories of areas where human factors insights from the aviation industry have been shown to be effective in healthcare include:

Work processes: Standardization and interoperability, checklists [13]

Data: Obtaining research participants and incident data [14]

Cultural: Crew Resource Management (teamwork, training and open communication) [15,16]

We now consider each of these in turn.

Interoperability

An area of significant relevance is standardization and interoperability of Health Information Technology (HIT) systems. Centralized data repositories, similar to those used by the Federal Aviation Administration (FAA) and other aviation organizations, can make data available to both providers and investigators whose aim is to improve patient safety. The IOM report indicates that the likelihood of medical error increases when patients see multiple providers in different settings, partly because providers may not have access to complete information [6]. This can result in ambiguity with respect to a patient’s current state, or in decreased provider situation awareness. Electronic Health Records (EHRs) have become ubiquitous throughout healthcare and, as a result, HIT usability and effectiveness continue to receive a great deal of attention in the research and healthcare provider communities. HIT offers the potential to standardize workflows and to provide solutions that increase patient safety in areas related to adverse pharmacological events, diagnoses, transfusions and adherence to evidence-based care [17]. Just as an aviator requires access to real-time, valid information to maintain safe and efficient flight, it is critical for healthcare providers to have access to relevant information at the right time, using systems that employ user-centered, context-dependent design.
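To make the standardization point concrete, the sketch below shows one shape a de-identified safety-event record might take in an ASRS-style central repository for healthcare. Every field name here is a hypothetical illustration for this article, not an existing HIT, EHR or FHIR schema.

```python
# A minimal sketch of a standardized, de-identified safety-event record,
# of the kind an ASRS-style central healthcare repository might collect.
# All field names are hypothetical illustrations, not an existing standard.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SafetyEventReport:
    event_type: str                 # e.g., "medication", "wrong-site surgery"
    care_setting: str               # e.g., "ICU", "OR", "ED"
    active_failures: List[str] = field(default_factory=list)
    latent_conditions: List[str] = field(default_factory=list)
    harm_level: str = "none"        # e.g., "none", "temporary", "permanent", "death"
    narrative: str = ""             # free-text account, already de-identified

# Example report: structured fields support aggregate analysis across
# institutions; the narrative preserves context for investigators.
report = SafetyEventReport(
    event_type="medication",
    care_setting="ICU",
    active_failures=["dose miscalculation"],
    latent_conditions=["look-alike drug labels", "high workload"],
    harm_level="temporary",
    narrative="Infusion rate set from the wrong order; caught at handoff.",
)
```

The design choice worth noting is the explicit separation of active failures from latent conditions, mirroring the error taxonomy discussed earlier, so that aggregate queries can surface systemic weaknesses rather than only individual mistakes.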

Availability of Safety Data

The aviation industry has a well-developed, enterprise-wide, interoperable infrastructure that supports open and honest error reporting (e.g., Safety Management Systems (SMS) and the Aviation Safety Reporting System (ASRS)). The FAA and the NTSB share responsibility for a thorough, well-developed crash investigation process in which both active and latent causal factors are actively sought. In the healthcare domain, this kind of infrastructure is less evident. The Food and Drug Administration (FDA) approves technologies that include pharmaceutical and imaging devices; however, technologies such as EHRs and other software do not fall within its purview. The FAA, on the other hand, has established regulations and well-defined guidance with respect to approved technologies.

A compounding factor is that most HIT systems are proprietary and operate independently. Although the data derived from HIT could support medical error investigations, there are no regulatory requirements to share them, and vendors are sometimes unwilling to make data available for research purposes. Further, implementation of medical safety management and reporting systems may introduce new and unanticipated errors. As is the case with EHRs, research is needed into the safe design, implementation and surveillance of technologies intended to investigate, report and reduce the rate of medical errors [17].

Similarly, although the FDA and individual healthcare systems have made some efforts to approximate safety-relevant methods and systems employed by the aviation industry, there is no standardized, nationwide, integrated system dedicated to investigating and reporting medical errors or to related process improvement. For example, aviation mishap data are readily available via FAA platforms, but few data sets of observations or self-reports relating to medical error are publicly available, making the topic more difficult for patient safety researchers to address [18].

In many cases, legal liability makes the implementation of medical error reporting systems difficult. Indeed, there have been calls to enact or amend laws to facilitate organizational practices that enable a safety culture, such as mandatory but confidential reporting systems, and to modify the fault-based approach to medical liability [19]. Liability may cause organizations and individuals to be reluctant to implement or use error reporting systems. Along these lines, stringent privacy (Health Insurance Portability and Accountability Act, HIPAA) and patient consent requirements can reduce the quality and quantity of data available for safety-related analyses. The same requirements can restrict safety investigators’ and researchers’ ability to conduct in-person interviews or to make direct observations in clinical settings.
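One way confidential reporting systems reconcile data sharing with privacy obligations is to scrub identifiers from free-text narratives before they leave the institution. The toy pass below sketches that idea only; the patterns are invented for illustration, and simple pattern matching of this sort falls far short of actual HIPAA de-identification requirements (e.g., the Safe Harbor method’s full list of identifier categories).

```python
# Toy illustration of redacting obvious identifiers from an incident
# narrative before it enters a shared repository. NOT sufficient for
# real HIPAA de-identification; it only sketches the concept.
import re

REDACTION_PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),            # SSN-like numbers
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),     # calendar dates
    (re.compile(r"\b[A-Z][a-z]+ [A-Z][a-z]+, (MD|RN)\b"), "[PROVIDER]"),
    (re.compile(r"\bMRN[: ]?\d+\b"), "[MRN]"),                  # record numbers
]

def redact(narrative: str) -> str:
    """Replace recognizable identifiers with generic placeholder labels."""
    for pattern, label in REDACTION_PATTERNS:
        narrative = pattern.sub(label, narrative)
    return narrative

print(redact("On 3/14/2021 Jane Smith, RN noted MRN 12345 received a double dose."))
# -> "On [DATE] [PROVIDER] noted [MRN] received a double dose."
```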

Challenges that affect research and incident investigation in both industries include low operator availability, coupled with the difficulty of facilitating collaboration and coordination across disciplines and organizations; both can make data collection difficult. For example, the collection of patient safety data is constrained not only by HIPAA requirements, but also by access to providers who can select, provide and discuss the information most relevant to a given research or investigation effort. In addition, qualitative research with providers, involving discussion of experiences with certain technology-based tools and elicitation of business processes with their associated pain points and success stories, is a key element in the healthcare research toolset. Likewise, objective real-time study of interaction with other actors (colleagues, patients, caregivers) and tools, in a real or simulated environment, is an important source of useful information. Limited availability of healthcare professionals slows these types of efforts or limits sample sizes.

In comparison, aviation personnel are in some ways more readily recruited for research participation; operators are just as willing to support research activities but face fewer barriers. General aviation pilots are relatively numerous and usually willing to participate in studies out of a personal interest in improving aviation safety, especially if compensated, and unlike commercial pilots, employment considerations are not likely to hinder them from serving as study subjects. Airline pilots and their management are also willing to support activities that show promise for improving the system. In the U.S., air traffic controllers are employees (or, at some small facilities, contractors) of the FAA and in many cases can participate in research while on duty. Like aircraft operators and individual pilots, both controllers and their management are keenly interested in aviation safety and are often willing to participate in research activities. For controllers and for commercial pilots employed by airlines and other aircraft operating companies, however, potential participants are often constrained by a lack of free time due to work schedules, and in some cases union scrutiny of research protocols can slow or stop the execution of studies.

In both industries, there are no simple answers to the challenges, though the authors have found in aviation research that establishing ongoing relationships with user groups and stakeholders helps with recruitment. This is also possible in medical settings, especially where hospitals and medical centers are operated by or closely affiliated with universities.

Safety Culture

A fundamental concept in human factors is that human error is unavoidable in any system. However, organizations that design error-resilient systems have the capability to recognize and prevent errors and to mitigate the undesired outcome of errors that cannot be prevented [20]. Safety culture is arguably the most important aspect of error resilience in any organization and is characterized by the information provided in Table 1.

Culture element | Description
Informed | Those who manage and operate the system have current knowledge about the human, technical, organizational and environmental factors that determine the safety of the system as a whole.
Reporting | The organization cultivates an atmosphere where people are prepared to report errors and near misses without fear of reprisal.
Learning | The organization possesses the willingness and competence to draw the right conclusions from its safety information/management system, and the will to implement major reforms.
Just | An atmosphere of trust in which people are encouraged or rewarded for providing essential safety-related information.
Flexible | A culture in which the organization can rapidly reconfigure in the face of high-tempo or high-risk operations, with the will and capability to shift from a hierarchical to a flatter mode.

Table 1: Safety culture elements according to James Reason [20].

The robust safety culture in the aviation industry has driven the development of technologies and processes that have resulted in very safe operations. Creating a just culture, detecting close calls and latent failures and implementing organizational improvements all depend on a healthy safety culture. Organizations with safe, error-resilient cultures actively seek to improve policies, procedures and tools that optimize operators’ ability to efficiently incorporate safe working practices.

Discussion

The Joint Commission defines safety culture in healthcare as knowledge, attitudes, behaviors and beliefs that staff share regarding the primary importance of patient well-being and care, supported by systems and structures that reinforce a focus on patient safety [21]. Implementation of processes and technologies that enable the different aspects of safety culture elements can mitigate many of the identified roadblocks to improving patient safety. For example, anonymity and legal protections for providers and organizations can help to improve the quality and quantity of reported safety data.

Another manifestation of safety culture in the aviation domain is the speed with which identified hazards are communicated: when hazards or potential sources of error are identified, the FAA has a number of outlets available to rapidly and widely disseminate safety information to personnel (e.g., Notices to Airmen (NOTAM), Aircraft Safety Alerts/Safety Alerts for Operators (SAFO)). Corresponding systems do not yet exist in the medical domain but should be explored.

Teamwork is an important aspect of safety culture, and is as important to patient safety as it is to safety of flight. In commercial aviation, CRM is introduced early during training and is continuously applied, reinforced and evaluated. As in aviation, medical teamwork not only engages critical issues of safety culture and communication; it has also been demonstrated to be associated with patient well-being. For example, after a hospital system established a comprehensive patient safety/high-reliability program, a longitudinal study found that safety and teamwork climate improved and that these factors were associated with decreased patient harm and mortality [22]. However, some challenges have been identified in optimizing medical team performance: high throughput of different team members in some hospital units; varying provider work schedules and shift lengths; patients for whom a given provider is responsible being distributed over a number of different units within a hospital, or among different hospitals; and professionals who work from a central location (such as pharmacists) interacting with a number of different customers from different units with unique procedures, needs and subcultures [23].

Finally, training programs designed to increase interdisciplinary team experiences can help to overcome medical team performance challenges [24]. Best practices from the aviation industry that have been identified in the design of such training programs take a multifunctional (systems) approach that integrates traditional organizational divisions and facilitates open communication, accountability and the creation and maintenance of interdisciplinary teams [25].

Conclusion

Incorporating a standard set of human factors best practices into the design and operation of healthcare technologies and procedures can help to improve healthcare safety culture and, ultimately, patient safety. Translation of human factors principles and methods from aviation settings to healthcare applications has been successful in some instances; prominent among these is the recognition that organizational culture plays a significant role in patient safety. For healthcare to derive maximum benefit from the lessons learned in aviation, human factors principles should be recognized, maintained and emphasized. The value of applying human factors principles and methods is clear in the identification, investigation and prevention of human error. It has been suggested that integrated safety systems that recognize the complexities of social, technical and cultural processes are needed for healthcare organizations to learn from the past and improve the future. As the application of human factors expertise to medical systems along these lines increases, so will organizational and provider performance, as well as patient safety and public trust in the healthcare system.

References

  1. Harris D, Stanton NA. Aviation as a system of systems: Preface to the special issue of human factors in aviation. Ergonomics. 2010;53(2):145-148.
  2. Effken JA. Different lenses, improved outcomes: A new approach to the analysis and design of healthcare information systems. Int J Med Inf. 2002;65(1):59-74.
  3. Dismukes KR, Berman BA, Loukopoulos L. The limits of expertise: Rethinking pilot error and the causes of airline accidents. Routledge. 2007.
  4. Reason J. Managing the risks of organizational accidents. Ashgate. 1997.
  5. Donaldson MS, Corrigan JM, Kohn LT, editors. To err is human: Building a safer health system. Institute of Medicine, Committee on Quality of Health Care in America. National Academies Press. 2000.
  6. Makary MA, Daniel M. Medical error: The third leading cause of death in the US. BMJ. 2016;353:i2139.
  7. Rodwin BA, Bilan VP, Merchant NB, Steffens CG, Grimshaw AA, Bastian LA, et al. Rate of preventable mortality in hospitalized patients: A systematic review and meta-analysis. J Gen Intern Med. 2020;35(7):2099-2106.
  8. Shappell SA, Wiegmann DA. U.S. naval aviation mishaps: Differences between single- and dual-piloted aircraft. Aviat Space Environ Med. 1996;67(1):65-69.
  9. Rankin W. Maintenance Error Decision Aid (MEDA) investigation process. Aero. 2007;2(26):15-21.
  10. Mitchell I, Schuster A, Smith K, Pronovost P, Wu A. Patient safety incident reporting: A qualitative study of thoughts and perceptions of experts 15 years after ‘To Err is Human’. BMJ Qual Saf. 2016;25(2):92-99.
  11. Clebone A, Strupp KM, Whitney G, Anderson MR, Hottle J, Fehr J, et al. Development and usability testing of the Society for Pediatric Anesthesia Pedi Crisis mobile application. Anesth Analg. 2019;129(6):1635-1644.
  12. Diller T, Helmrich G, Dunning S, Cox S, Buchanan A, Shappell S. The Human Factors Analysis Classification System (HFACS) applied to health care. Am J Med Qual. 2013;29(3):181-190.
  13. Savage C, Gaffney FA, Hussain-Alkhateeb L, Ackheim PO, Henricson G, Antoniadou I, et al. Safer paediatric surgical teams: A 5-year evaluation of crew resource management implementation and outcomes. Int J Qual Health Care. 2017;29(6):853-860.
  14. Hettinger AZ, Hall KH, Fitall E. Updates in the role of health IT in patient safety. PSNet, Agency for Healthcare Research and Quality. 2009.
  15. Applying human factors and usability engineering to medical devices. Food and Drug Administration. 2016.
  16. Guillod O. Medical error disclosure and patient safety: Legal aspects. J Public Health Res. 2013;2(3):31.
  17. Reason J. Achieving a safe culture: Theory and practice. Work Stress. 1998;12(3):293-306.
  18. Berry JC, Davis JT, Bartman T, Hafer CC, Lieb LM, Khan N, et al. Improved safety culture and teamwork climate are associated with decreases in patient harm and hospital mortality across a hospital system. J Patient Saf. 2016;16(2):130-136.
  19. Blegen MA, Sehgal NL, Alldredge BK, Gearhart S, Auerbach AA, Wachter RM. Republished paper: Improving safety culture on adult medical units through multidisciplinary teamwork and communication interventions: The TOPS Project. Postgrad Med J. 2010;86(1022):729-733.
  20. Hamman WR. The complexity of team training: What we have learned from aviation and its applications to medicine. Qual Saf Health Care. 2004;13(Suppl 1):i72-i79.
  21. Macrae C. Errors and near misses: What health care could learn from aviation. AHRQ PSNet. 2016.

Author Info

Philip G. Fatolitis* and Anthony J. Masalonis
 
Spectrum Software Technology, Atlantic City International Airport, New Jersey, USA
 

Citation: Fatolitis PG, Masalonis AJ (2021) Human Factors in Aviation and Healthcare: Best Practices, Safety Culture and the Way Ahead for Patient Safety. J Ergonomics. 11:289.

Received: 01-Oct-2021; Accepted: 15-Oct-2021; Published: 22-Oct-2021; DOI: 10.35248/2165-7556.21.11.289

Copyright: © 2021 Fatolitis PG, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
