Perilous Intersection

When Amsterdam Airport Schiphol in the Netherlands revamped its men’s restrooms, the architects installed small, Euro-style urinals: a surefire way to throw urine off target. To solve this problem, the black outline of a fly was etched in the porcelain near each urinal’s drain. Users’ aim improved and spillage was reduced by 80%. “They try to power blast it away,” says Sanjay Saint, MD, hospitalist and professor of internal medicine at the Ann Arbor VA Medical Center, University of Michigan. “By the time they might realize that the fly isn’t going anywhere, the men are done and walking away.”

It’s a guy thing, sure. It also is an example of a human factors intervention. “Science teaches us that implementing a design for a machine or device that elicits an instinctive reaction from someone using it is a clear-cut way to avoid error,” Dr. Saint explains.

What It Is and Why It’s Important

Human factors (HF), or human factors engineering (HFE), sometimes also called usability engineering or systems-based practice, refers to the study of human abilities and characteristics as they affect the design and smooth operation of equipment, systems, and jobs.1 HF is the basic science underlying much of patient safety practice. For instance, the current recommendation that hospitals standardize equipment, such as ventilators, programmable IV pumps, and defibrillators, is an example of making tasks human friendly. The use of cognitive psychology and biomechanics to develop and improve software and hand tools is another example of HF principles at work.

In general, HF examines the component tasks of an activity across three domains: physical and environmental factors, cognitive factors (skill demands and mental workload), and organizational factors. Each task is assessed in terms of the necessary interactions among the individual, the work environment, the device/system design, and the associated team dynamics.
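
For readers who like to see the structure, here is a minimal sketch, in TypeScript, of how a task analysis might record findings across those three domains. The field names and example entries are hypothetical; no standard HF schema is implied.

```typescript
// Illustrative sketch only: hypothetical fields, no standard HF schema.
interface TaskAssessment {
  task: string;                    // the component task under review
  physicalEnvironmental: string[]; // e.g., lighting, noise, interruptions
  cognitive: string[];             // e.g., skill demands, mental workload
  organizational: string[];        // e.g., policies, staffing, team dynamics
}

// A worked example for one bedside task.
const assessment: TaskAssessment = {
  task: "program heparin infusion",
  physicalEnvironmental: ["dim night lighting", "frequent interruptions"],
  cognitive: ["weight-based dose calculation", "unfamiliar pump menu"],
  organizational: ["unclear double-check policy", "variable staffing"],
};

console.log(`${assessment.task}: ${assessment.cognitive.length} cognitive factors noted`);
```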

The use of HF in healthcare is not new: for roughly four decades, researchers have emphasized its key role in safe medical design, healthcare facility operations, and patient safety processes. HF helps organizations deepen analyses of adverse events and develop effective solutions.2 HF principles inform the design of labeling, warnings and alarms, software programs, information displays, paper forms, process and activity flow, workplace design, cognitive aids, decision support systems, and policies and protocols.

Human Factors in Hospital Medicine

As the area medical officer with the Schumacher Group’s hospital medicine division in Lafayette, La., David Grace, MD, considered human factors when tweaking designs of simple paper or software templates. His sense of what human factors encompasses prompted him to address some cognitive pitfalls to help prevent error and oversight on standardized “old-fashioned paper” progress notes. “While most docs know how to take care of patients and what patients need given an acute condition,” he says, “in the heat of battle, little things get overlooked. I created little prompts to remind docs what every patient needs.”

Dr. Grace also realized that when hospitalists review patient charts, their mindset is typically to hunt for problems, things like unstable vital signs. Yet by the time they return to their notes, hospitalists occasionally forget what they found. “Now, at the top of the progress notes, we have a box marked Problem List, with a space for jotting them down as they go,” he explains. Dr. Grace credits the new checklist with a direct increase in patient satisfaction rates.

He also tackled standardizing reminders for important care procedures. “We all know DVT prophylaxis needs to be done, but it’s easy to overlook when considering the patient’s other problems,” Dr. Grace says. The group’s medical record software template has a single mouse-click to indicate the bundle has been initiated. “Our compliance with DVT prophylaxis has increased dramatically,” he says.—AS

For hospitalists, human factors knowledge is most useful in process improvement, says John Gosbee, MD, MS, a human factors engineering and healthcare specialist at the University of Michigan. Dr. Gosbee, who has worked with hospitalists in Ann Arbor and around the country, originally studied aerospace medicine, pursued a subspecialty in occupational medicine, and from 1988 to 1992 worked at NASA designing space hospitals. In the dozens of lectures and workshops he has conducted, he has learned that many physicians resist learning about HF. At first they protest, claiming they “didn’t go to medical school to become engineers” or “weren’t hired to have you tell us we need to be some kind of designer or computer-science software evaluator.”

Dr. Gosbee couldn’t agree more. But after the a-ha! moment, usually an interactive exercise in which hospitalists see firsthand how a poor system design obstructs safety and process flow, they open up to the HF mindset. Once on board, hospitalists are quick to translate the theory to their own practices, identifying potential vulnerabilities and risks.

Manufacturers of healthcare equipment and systems don’t want to hear from “safety geeks,” Dr. Gosbee says; the companies want to hear from front-line providers who regularly use the products. “Hospitalists are in a great position to provide that input because they see what happens across a broad swath of hospital settings,” he says, “and they could amalgamate the fact that everyone across specialties is having some trouble with this computer screen or new infusion device.”

Dr. Gosbee’s first-hand knowledge and experience solving hospitalist issues with HF techniques evolved into a teaching career. He says the university administration supports his belief in the practicality of HF lessons, and he now works as the lead instructor for a majority of the university’s medical residents.

“Human factors engineering is an efficient way to flip people’s brains around 180 degrees toward systems thinking,” Dr. Gosbee explains, “which is required if the organization wants to become a high-reliability organization.”3

Examples in Medicine

Russ Cucina, MD, MS, hospitalist at the University of California San Francisco Medical Center, describes a practical example of human factors engineering in a simple, widely used design. When cars ran on leaded gasoline, the design of the leaded gas pump nozzle precluded it from being inserted into an unleaded gas tank. “Even though one was clearly labeled leaded and the other unleaded, human beings are bad at catching those things, especially when they’re in a hurry and under stress,” says Dr. Cucina, whose research includes clinical human-computer interaction science with an emphasis on human factors and patient safety.

A similar concept is missing from the Swan-Ganz catheter design. The three ports (proximal, middle, and distal) connecting the catheter to the ICU monitor all have the same shape, making it easy to connect one or more to the wrong port. “You’d think the manufacturers would shape the connectors in a way that would preclude incorrect connections,” Dr. Cucina says, “but that has not been done. We leave it to the vigilance of the bedside nurse or intensivist or hospitalist to hook these up correctly, rather than redesigning them so that it cannot be done incorrectly.”

One way to think about human factors engineering is the familiar image of forcing “a round peg into a square hole.” In the hospital setting, a round peg forced into a square hole is an error. HF design removes the option: round pegs fit only round holes, square pegs only square ones. “Were you to apply human factors to the Swan-Ganz catheter port connectors,” Dr. Cucina says, “you’d have round into the round hole, square into the square, and triangular into the triangular. You’d have no choice but to do the right thing.”
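
The shaped-connector idea translates directly to software, where a type system can play the role of the differently shaped ports. What follows is a purely illustrative TypeScript sketch; the port and channel names are hypothetical, not any device maker’s actual interface. The point is that a mismatched “shape” becomes a compile-time error instead of a bedside hazard.

```typescript
// Illustrative sketch only: hypothetical names, no real device API.
// Each connector gets its own incompatible type, the software
// equivalent of a physically distinct connector shape.
type ProximalPort = { readonly kind: "proximal" };
type MiddlePort = { readonly kind: "middle" };
type DistalPort = { readonly kind: "distal" };

// Each monitor channel accepts exactly one connector "shape."
function connectProximalChannel(port: ProximalPort): void {
  console.log(`${port.kind} port connected to proximal channel`);
}
function connectDistalChannel(port: DistalPort): void {
  console.log(`${port.kind} port connected to distal channel`);
}

const proximal: ProximalPort = { kind: "proximal" };
const distal: DistalPort = { kind: "distal" };

connectProximalChannel(proximal); // OK: the shapes match
// connectProximalChannel(distal); // compile error: the wrong shape won't fit
```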

Efforts to implement systems that anticipate and minimize the chance of human error, such as computerized physician order entry and patient bar coding, are attempts to design out the instances where round pegs can still be forced into square holes.

Take-Home Messages

Human factors design is an accessible topic with intuitive content. Educating oneself, even a little bit, about human factors could go a long way to inform the individual hospitalist’s thinking about systems.

—Russ Cucina, MD, MS, hospitalist, University of California San Francisco Medical Center

Given the complexity of the care that we deliver, it is no longer realistic to think that, if you’re smart and conscientious and try hard, things will be OK. All hospitalists will be involved in some sort of bad outcome. It behooves us to accept that approach and design systems that are failsafe.

—Janet Nagamine, MD, hospitalist, Kaiser Permanente, Santa Clara, Calif.

There are some patient safety problems that lend themselves to an epidemiologic approach, such as rates of infection, for instance, where we can see we’ve done something to improve upon those rates. The human factors/ergonomics approach is complementary to that approach. Human factors concepts help us design interventions to prevent those rare errors, for which we don’t have rates or readily obtainable rates. The need is not for one approach or the other. We need both.

—Sanjay Saint, MD, hospitalist, professor of internal medicine, Ann Arbor VA Medical Center, University of Michigan, Ann Arbor, Mich.

Hospitalists can hone a human factors mindset with attention to three areas. First, improve your philosophical and attitudinal view toward what you’re trying to redesign. Second, understand the underlying methodology of the systems people are troubleshooting in your wards and committees. Third, explore what HF has found about what works and what doesn’t in patient safety.

Hospitalists are also the recipients of new devices, tools, and technologies for patient care. As members of review committees and procurement committees, hospitalists are asked for input. Knowledge of the nuts and bolts of human factors science will give that input some foundation.

—John Gosbee, MD, MS, human factors engineering and healthcare specialist, University of Michigan, Ann Arbor, Mich.

HF Projects in Motion

A number of hospitalists around the country have used, or are using, HF in projects and studies to reduce human error.

Culture change: In the early 2000s, Janet Nagamine, MD, a hospitalist with Kaiser Permanente in Santa Clara, Calif., and her colleagues took human factors concepts to front-line ICU staff. The human factors training provided a framework to reinforce three basic concepts: all humans make errors; processes can be designed to reduce the possibility of error; and processes can be designed so errors are detected and corrected before causing injury.4 “My colleagues and I knew that the punitive, ‘shame-and-blame’ culture around mistakes and errors was preventing us from identifying problems and moving forward with solutions,” Dr. Nagamine says.

A former ICU nurse and current chair of SHM’s Hospital Quality and Patient Safety (HQPS) Committee, Dr. Nagamine first became involved in HF when she realized how many patients suffered adverse events stemming from poorly designed medical systems. “Some of my most respected mentors were involved in these kinds of cases, and I knew eventually that would be me,” she says. It was a disturbing reality. During her medical training it was drilled into her head that smart, diligent doctors would be successful. “But bad things happen in medicine; it’s part of what we do,” she says. “Rather than deny that things will inevitably go wrong, I wanted to study safety science and reliable system design.” She asked herself, how can we prevent the same mistakes from happening to competent people who practice in poorly designed systems? “The patterns are there,” she says. “You can train your eyes to look for vulnerabilities and patterns, then find the solutions.”

After she started looking at adverse events as system failures, rather than solely personal failures, she engaged the staff to redesign systems. She introduced HF concepts and provided an infrastructure to make it safe to report and discuss problems. The project included a new medication error reporting system and the creation of departmental patient safety teams. A palpable culture change developed when front-line staff and managers became empowered to find solutions working side-by-side with the quality and risk management departments.

The result? A dramatic increase in medication-error and near-miss reporting: from eight reports per quarter in 2000 to 200 reports per quarter by 2001.

To sum up the essence of Dr. Nagamine’s project, she invokes her favorite quotation from systems expert James Reason: “We can’t change the human condition, but we can change the conditions under which humans work.”1,5

Bar coding workarounds: Hospitalist Tosha Wetterneck, MD, and her colleagues at the University of Wisconsin School of Medicine and Public Health focused their HF-trained eyes on medication errors.5 The team applied HF concepts as part of a study of bar-coded medication administration systems (BCMAs). Ideally, BCMAs help confirm the five rights of medication administration: the right patient, drug, dose, route, and time. The study authors found that hospital staff had developed 46 workarounds in place of proper BCMA use, and they identified six potential errors associated with each workaround. Furthermore, nurses were overriding BCMA alerts for 4.2% of patients charted and for 10.3% of total medications.
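
To make the five-rights cross-check concrete, here is a hedged sketch of the verification a BCMA scan is meant to enforce. The record shapes, field names, and the one-hour timing window are assumptions for illustration, not any vendor’s actual interface.

```typescript
// Illustrative sketch only: hypothetical record shapes and policy window,
// not a real BCMA vendor API.
interface MedicationOrder {
  patientId: string;
  drugCode: string;
  doseMg: number;
  route: "PO" | "IV" | "SC";
  scheduledTime: Date;
}

interface ScanEvent {
  scannedPatientId: string; // read from the wristband barcode
  scannedDrugCode: string;  // read from the medication package barcode
  doseMg: number;
  route: "PO" | "IV" | "SC";
  scanTime: Date;
}

const TIME_WINDOW_MS = 60 * 60 * 1000; // assumed +/- 1-hour administration window

// Returns the list of violated "rights"; an empty list means all five check out.
function verifyFiveRights(order: MedicationOrder, scan: ScanEvent): string[] {
  const violations: string[] = [];
  if (scan.scannedPatientId !== order.patientId) violations.push("wrong patient");
  if (scan.scannedDrugCode !== order.drugCode) violations.push("wrong drug");
  if (scan.doseMg !== order.doseMg) violations.push("wrong dose");
  if (scan.route !== order.route) violations.push("wrong route");
  if (Math.abs(scan.scanTime.getTime() - order.scheduledTime.getTime()) > TIME_WINDOW_MS) {
    violations.push("wrong time");
  }
  return violations;
}
```

A workaround such as scanning a duplicate barcode kept at the nursing station, rather than the wristband on the patient, satisfies the software while defeating exactly this check, which is why the study treats workarounds as signals of deeper design problems.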

By creating an exhaustive template, the study authors broke BCMA workarounds down to their finest task components. They learned that many workarounds stemmed from difficulties with the technology itself and from interactions between BCMA technology and environmental, technical, process, workload, training, and policy factors. The data also show BCMAs retain an important role in preventing error: in one year, almost 24,000 BCMA alerts led users to change their action rather than override the alert. “These causes (and related workarounds) are neither rare nor secret,” the authors write. “They are hiding in plain sight.”1,5

Dr. Wetterneck is part of the Systems Engineering Initiative for Patient Safety (SEIPS), an interdisciplinary research group located within the Center for Quality and Productivity Improvement in the College of Engineering at the University of Wisconsin-Madison.6,7 SEIPS uses HF principles to study the safety and quality of healthcare systems.

Congestive heart failure order sets: Researchers in another study incorporated HF science into their review of clinical practice guideline use for congestive heart failure (CHF). Reingold and Kulstad studied the impact of HF design elements on order set utilization and compliance with guideline recommendations.8

Using a retrospective medical record review of adult patients admitted from the emergency department with CHF, the study measured acuity and clinical practice guideline (CPG) parameters before and after the new order set was introduced. Across 87 patients before and 84 after, attention to HF design elements significantly improved order set utilization and CPG compliance.

Infusion device programming: In another instance, a multidisciplinary research team applied HF design principles to two common nursing procedures: programming an insulin infusion and programming a heparin infusion.9,10 The team developed an HF usability checklist, which revealed systematic error-provoking conditions in both tasks.

The good news: the pitfalls were easily remedied.

Not only did the researchers subsequently commit to modifying training procedures and redesigning preprinted orders, they took the bigger step of providing feedback to the manufacturer and committing to incorporate usability testing into future procurement of medical devices. TH

Andrea M. Sattinger is a medical writer based in North Carolina and a frequent contributor to The Hospitalist.

References

1. Gosbee JW. Conclusion: you need human factors engineering expertise to see design hazards that are hiding in “plain sight!” Jt Comm J Qual Saf. 2004;30(12):696-700.

2. Gosbee J. Introduction to the human factors engineering series. Jt Comm J Qual Saf. 2004;30(4):215-219.

3. Reason J. Human error: models and management. BMJ. 2000;320(7237):768-770.

4. Etchells E, Juurlink D, Levinson W. Medication errors: the human factor. CMAJ. 2008;178(1):63-64.

5. Koppel R, Wetterneck T, Telles JL, Karsh BT. Workarounds to barcode medication administration systems: their occurrences, causes, and threats to patient safety. J Am Med Inform Assoc. 2008;15(4):408-423.

6. SEIPS model. http://cqpi.engr.wisc.edu/seips_home/. Accessed Dec. 20, 2008.

7. Carayon P, Schoofs Hundt A, Karsh BT, et al. Work system design for patient safety: the SEIPS model. Qual Saf Health Care. 2006;15 Suppl 1:i50-i58.

8. Reingold S, Kulstad E. Impact of human factor design on the use of order sets in the treatment of congestive heart failure. Acad Emerg Med. 2007;14(11):1097-1105.

9. Etchells E, Bailey C, Biason R, et al. Human factors in action: getting “pumped” at a nursing usability laboratory. Healthc Q. 2006;9 Spec No:69-74.

10. Carayon P, Wetterneck T, Schoofs Hundt A, et al. Observing nurse interaction with infusion pump technologies. In: Henriksen K, Battles J, Lewin D, eds. Advances in Patient Safety: From Research to Implementation. Rockville, Md.: Agency for Healthcare Research and Quality; Feb. 2005, AHRQ Publication No. 05-0021-2.

Issue
The Hospitalist - 2009(01)