[Image: Doctor touching modern virtual screen interface]
Perspectives

To Err Is Human; Designing for Safety Is Essential: Why Human Factors Matter in Health Care

Summary

  • The realization of safety requires the intentional design of systems that optimize the expertise and performance of the dedicated health care workforce.

Safety scientists and experts have long known that humans are imperfect and prone to error. Despite this reality, efforts to address safety have largely focused on perfecting human performance at the sharp end of care, often without corresponding attention to building a safer health system. The term “error” is used interchangeably with “accident” and “harm,” which locates the causes of harm in human behavior and frames the solution as achieving zero human errors, rather than addressing the deeper systemic roots of error and the defenses that prevent harm.

The Roots of Human Factors and Safety

Human factors engineering has long been a critical component of system design, integrated into the aviation, military, and nuclear industries since the mid-1900s. The groundbreaking research of anesthesia safety pioneer Jeffrey Cooper and pediatric surgeon Lucian Leape, for whom IHI’s patient safety think tank, the Lucian Leape Institute, is named, revealed critical factors associated with adverse health care events, including cognitive overload, poor design of equipment and systems, and miscommunication, and it underscored the vital importance of system design. Cooper, Leape, and others embraced experts from outside health care in the quest to make health care safer. For example, James Reason, an expert in human error and accident theory, reinforced that safety is a continually emerging property of a dynamic system, that errors are common and normal, that accidents should be viewed as systemic failures rather than the fault of an individual, and that solutions must address the intentional design and continuous improvement of systems.

In 1999, the Institute of Medicine report To Err Is Human served as a rallying cry for the patient safety movement. It raised awareness with the public and across health care about accidental harms in medicine, which were deemed largely preventable. The report called for system-level changes, including integrating human factors into the design and improvement of safer health systems and learning from other industries.

Enabling Human Excellence in Health Care

Despite these recommendations, improvements in safety have been incremental and fragile. Critics note the medicalization and bureaucratization of health care safety, the heavy focus on eliminating risk (often through standardization, checklists, and protocols) rather than intelligently managing it, and missed opportunities to properly leverage lessons learned from other industries. They also note that health care’s embrace of safety scientists and human factors engineers has been limited, and that knowledge translated from other industries has at times been oversimplified.

The misinterpretation of safety science in general, and of human factors in particular (well detailed in Wears and Sutcliffe’s Still Not Safe), has hampered progress in patient safety. The prevention of medical accidents and the resulting harm is complex. We cannot simply count errors and fix the most proximal cause. Safety science involves balancing risks and resources against existential human challenges by enabling and amplifying the expertise of the dedicated people who create safety within health care, rather than treating them as weak or deficient.

The Need for Skilled Action

Recognition of unacceptable failures in health care safety, pandemic-related setbacks in recent years, and clarion calls from patient and family activists and the safety community have spurred renewed attention to safety and the importance of human factors engineering. This is especially true considering the broad benefits of integrating human factors and systems engineering expertise into the highly complex realities of health care delivery. In 2010, the US Food and Drug Administration (FDA) began to require human factors certification for new devices. Throughout the following decade, systems engineering models and human factors professionals found some traction in health systems. 

Recent calls to assess, engineer, and improve the systems of health care, sparked by the Safer Together National Action Plan, the Centers for Medicare and Medicaid Services (CMS) Patient Safety Structural Measure, and the Department of Health and Human Services National Action Alliance for Patient and Workforce Safety, further reinforce that health care faces a critical decision and action point. Yet the field faces a dearth of human factors and resilience engineering skills, a shortage that cannot easily be addressed even if every human factors professional in the world shifted their focus to health care.

The time has come to rethink and recommit to how we build, validate, and integrate human factors knowledge and competencies into the daily work of health care to design and continuously improve safe systems. We have an opportunity — and an urgent need — to support existing and emerging leaders in this work.

Read part two of this series to learn more: Why Every Health Care Organization Needs Human Factors Expertise, and How to Get There.

Ken Catchpole, PhD, CErgHF, is SmartState Endowed Chair in Clinical Practice and Human Factors at the Medical University of South Carolina.

Patricia McGaffigan, MS, RN, CPPS, is an IHI Senior Advisor for Safety and President, Certification Board for Professionals in Patient Safety.  

Photo by rawpixel.com 


