Affiliations: Department of Pediatrics, Primary Children's Hospital, Intermountain Healthcare, University of Utah School of Medicine, Salt Lake City, Utah; Department of Neurology, University of Utah School of Medicine, Salt Lake City, Utah

Glenn Rosenbluth, MD

Communicating Effectively With Hospitalized Patients and Families During the COVID-19 Pandemic


For parents of children with medical complexity (CMC), bringing a child to the hospital for needed expertise, equipment, and support is necessarily accompanied by a loss of power, freedom, and control. Two of our authors (K.L., P.M.) are parents of CMC—patients affectionately known as “frequent flyers” at their local hospitals. When health needs arise, these experienced parents quickly identify what can be managed at home and what needs a higher level of care. The autonomy and security that accompany this parental expertise have been diminished by, and in some cases even lost in, the COVID-19 pandemic. In particular, one of the most obvious changes to patients’ and families’ roles in inpatient care has been in communication practices, including changes to patient- and family-centered rounding that result from current isolation procedures and visitation policies. Over the past few months, we’ve learned a tremendous amount from providers and caregivers of hospitalized patients; in this article, we share some of what they’ve taught us.

Before we continue, we take a humble pause. The process of writing this piece spanned weeks during which certain areas of the world were overwhelmed. Our perspective has been informed by others who shared their experiences, and as a result, our health systems are more prepared. We offer this perspective recognizing the importance of learning from others and feeling a sense of gratitude to the providers and patients on the front lines.

CHANGING CIRCUMSTANCES OF CARE

As a group of parents, nurses, physicians, educators, and researchers who have spent the last 10 years studying how to communicate more effectively in the healthcare setting,1,2 we find ourselves in uncharted territory. Even now, we are engaged in an ongoing mentored implementation program examining the effects of a communication bundle on patient- and family-centered rounds (PFCRs) at 21 teaching hospitals across North America (the SHM I-PASS SCORE Study).3 COVID-19 has put that study on hold, and we have taken a step back to reassess the most basic communication needs of patients and families under any circumstances.

Even within our study group, our family advisors have been on the front lines as patients and caregivers. One author (P.M.) shared a recent experience that she and her son, John Michael, had:

“My son [who has autoimmune hepatitis and associated conditions] began coughing and had an intense sinus headache. As his symptoms continued, our concern steadily grew: Could we push through at home or would we have to go in [to the hospital] to seek care? My mind raced. We faced this decision many times, but never with the overwhelming threat of COVID-19 in the equation. My son, who is able to recognize troublesome symptoms, was afraid his sinuses were infected and decided that we should go in. My heart sank.”

Now, amid the COVID-19 pandemic, patients like John Michael, who are accustomed to the healthcare setting, are “terrified with this additional concern of just being safe in the hospital,” a member of our Family Advisory Council reported. Another member added, “We recognize this extends to the providers as well, who maintain great care despite their own family and personal safety concerns.” Although families affirmed the necessity of the enhanced isolation procedures and strict visitation policies, they also highlighted the effects of these changes on usual communication practices, including PFCRs.

CORE VALUES DURING COVID-19

In response to these sentiments, we reached out to all of our family advisors, as well as other team members, for suggestions on how healthcare teams could help patients and families best manage their hospital experiences in the setting of COVID-19. Additionally, we asked our physician and nursing colleagues across health systems about current inpatient unit adaptations. Their suggestions and adaptations reinforced and directly aligned with some of the core values of family engagement and patient- and family-centered care,4 namely, (1) prioritizing communication, (2) maintaining active engagement with patients and families, and (3) enhancing communication with technology.

Prioritizing Communication

Timely and clear communication can help providers manage the expectations of patients and families, build their confidence, and reduce their feelings of anxiety and vulnerability. Almost universally, families acknowledged the importance of infection control and physical distancing measures while fearing that decreased entry into rooms would lead to decreased communication. “Since COVID-19 is contagious, families will want to see every precaution taken … but in a way that doesn’t cut off communication and leave an already sick and scared child and their family feeling emotionally isolated in a scary situation,” an Advisory Council member recounted. Importantly, one parent shared that hearing about personal protective equipment conservation could amplify stress because of fear that their child wouldn’t be protected. These perspectives remind us that families may be experiencing heightened sensitivity and vulnerability during this pandemic.

Maintaining Active Engagement With Patients and Families

PFCRs continue to be an ideal setting for providers, patients, and families to communicate and build shared understanding, as well as to build rapport and connection through human interaction. Maintaining rounding structures, when possible, reinforces familiarity with roles and expectations among both patients who have been hospitalized in the past and those hospitalized for the first time. Adapting rounds may be as simple as opening the door during walk-rounds to invite caregiver participation while maintaining physical distancing. With large rounding teams, more substantial workflow changes may be necessary.

Beyond PFCRs, patients and family members can be further engaged through tasks and responsibilities between rounds. Examples include recording patient symptoms (eg, work of breathing) or actions (eg, how much water their child drinks). These activities may give patients and caregivers who feel helpless and anxious a greater sense of control while allowing them to make helpful contributions to medical care.

Parents also saw value in reinforcing the message that patients and families are experts about themselves and their loved ones. Healthcare teams can invite their insights, questions, and concerns to show respect for that expertise. Doing so builds trust and fosters a sense of togetherness and teamwork. Across the board, families stressed the value of family engagement and communication under ideal conditions, and even more so in this time of upheaval.

Enhancing Communication With Technology

Many hospitals are leveraging technology to promote communication by integrating workstations on wheels and tablets with videoconferencing software (eg, Zoom, Skype) and by adding communication via email and phone. Although fewer team members are entering rooms, rounding teams are still including the voices of pharmacists, nutritionists, social workers, primary care physicians, and caregivers who are unable to be at the bedside.

These alternative communication methods may provide patients with more comfortable avenues for participating in their own care, even beyond the pandemic. Children, in particular, may have strong opinions about their care but may not be comfortable speaking up in front of providers whom they don’t know well. Telehealth, whiteboards, email, and limiting the number of providers in the room might create a more approachable environment for these patients even under routine conditions.

CONCLUSION

Patients, families, nurses, physicians, and other team members all feel the current stress on our healthcare system. As we continue to change workflows, alignment with principles of family engagement and patient- and family-centered care4 remains a priority for all involved. Prioritizing effective communication, maintaining engagement with patients and families, and using technology in new ways will all help us maintain high standards of care in both typical and completely atypical settings, such as during this pandemic. Nothing captures the benefits of effective communication better than P.M.’s description of John Michael’s experience during his hospitalization:

“Although usually an expedited triage patient, we spent hours in the ER among other ill and anxious patients. Ultimately, John Michael tested positive for influenza A. We spent 5 days in the hospital on droplet protection.

“The staff was amazing! The doctors and nurses communicated with us every step of the way. They made us aware of extra precautions and explained limitations, like not being able to go in the nutrition room or only having the doctors come in once midday. Whenever they did use [personal protective equipment] and come in, the nurses and team kept a safe distance but made sure to connect with John Michael, talking about what was on TV, what his favorite teams are, asking about his sisters, and always asking if we needed anything or if there was anything they could do. I am grateful for the kind, compassionate, and professional people who continue to care for our children under the intense danger and overwhelming magnitude of COVID-19.”

Disclosures

Dr Landrigan has served as a paid consultant to the Midwest Lighting Institute to help study the effect of blue light on health care provider performance and safety. He has consulted with and holds equity in the I-PASS Institute, which seeks to train institutions in best handoff practices and aid in their implementation. Dr Landrigan has received consulting fees from the Missouri Hospital Association/Executive Speakers Bureau for consulting on I-PASS. In addition, he has received monetary awards, honoraria, and travel reimbursement from multiple academic and professional organizations for teaching and consulting on sleep deprivation, physician performance, handoffs, and safety and has served as an expert witness in cases regarding patient safety and sleep deprivation. Drs Spector and Baird have also consulted with and hold equity in the I-PASS Institute. Dr Baird has consulted with the I-PASS Patient Safety Institute. Dr Patel holds equity/stock options in and has consulted for the I-PASS Patient Safety Institute. Dr Rosenbluth previously consulted with the I-PASS Patient Safety Institute, but not within the past 36 months. The other authors have no conflicts of interest or external support other than the existing PCORI funding for the Society of Hospital Medicine I-PASS SCORE study.

Disclaimer

The I-PASS Patient Safety Institute did not provide support to any authors for this work.

References

1. Starmer AJ, Spector ND, Srivastava R, et al. Changes in medical errors after implementation of a handoff program. N Engl J Med. 2014;371(19):1803-1812. https://doi.org/10.1056/nejmsa1405556.
2. Khan A, Spector ND, Baird JD, et al. Patient safety after implementation of a coproduced family centered communication programme: multicenter before and after intervention study. BMJ. 2018;363:k4764. https://doi.org/10.1136/bmj.k4764.
3. Patient-Centered Outcomes Research Institute. Helping Children’s Hospitals Use a Program to Improve Communication with Families. December 27, 2019. https://www.pcori.org/research-results/2018/helping-childrens-hospitals-use-program-improve-communication-families. Accessed March 26, 2020.
4. Institute for Patient- and Family-Centered Care (IPFCC). PFCC and COVID-19. https://www.ipfcc.org/bestpractices/covid-19/index.html. Accessed April 10, 2020.

Journal of Hospital Medicine. 2020;15(7):440-442. Published Online First June 17, 2020.


Correspondence: Glenn Rosenbluth, MD; Email: glenn.rosenbluth@ucsf.edu; Telephone: 415-476-9185; Twitter: @grosenbluth

© 2020 Society of Hospital Medicine

Related Article

Improving Handoffs: Teaching beyond “Watch One, Do One”


In this issue of the Journal of Hospital Medicine, Lee et al.1 describe a randomized trial to assess the effectiveness of 4 different approaches to teaching handoffs, with the goal of improving process measures related to interns’ handoffs. The Society of Hospital Medicine (SHM), The Joint Commission (TJC), the Accreditation Council for Graduate Medical Education (ACGME), and others have all emphasized the importance of high-quality handoffs as an essential component of safe patient care.2-4 The ACGME specifically requires that all institutions that sponsor ACGME-accredited programs provide both structure and monitoring for handoffs, and the SHM complements this requirement with evidence-based handoff guidelines.

Lee’s team trained 4 groups of residents in handoffs using 4 different hour-long sessions, each with a different focus and educational format. A control group received a 1-hour didactic, which they had already heard; an I-PASS–based training group included role plays; and Policy Mandate and PDSA (Plan, Do, Study, Act) groups included group discussions. The prioritization of content in the sessions varied considerably among the groups, and the results should be interpreted within the context of the variation in both delivery and content.

Consistent with the focus of each intervention, the I-PASS–based training group had the greatest improvement in transfer of patient information, the policy mandate training group (focused on specific tasks) had the greatest improvement in task accountability, and the PDSA training group (focused on intern-driven improvements) had the greatest improvement in personal responsibility. The control group, which received the repeated 60-minute didactic, did not show significant improvement in any domain. The lack of improvement in the control group doesn’t imply that the content wasn’t valuable, just that repetition didn’t add anything beyond baseline. One takeaway from the primary results of this study is that residents are likely to practice and improve what they are taught, and therefore faculty should teach them purposefully. If residents aren’t taught handoff skills, they are unlikely to master them.

The interventions used in this study are neither mutually exclusive nor duplicative. In their conclusions, the authors described the potential for a curriculum that includes elements from all 3 interventions. One could certainly imagine a handoff training program that includes elements of the I-PASS handoff bundle with role plays, additional emphasis on personal responsibility for specific tasks, and a focus on PDSA cycles of improvement for handoff processes. This could likely be accomplished efficiently, perhaps adding only an hour to the existing 1-hour trainings. Evidence from the I-PASS study5 suggests that improving handoffs can decrease medical errors by 21% and adverse events by 30%; this certainly seems worth the time.

Checklist-based observation tools can provide valuable data to assess handoffs.6 Lee’s study used a checklist based on TJC recommendations, and the 17 checklist elements overlapped somewhat with the SHM guidelines,2 providing some evidence for content validity. The dependent variable was the total number of checklist items included in handoffs, a methodology that assumes all handoff elements are equally important (eg, gender is weighted equally to if-then plans). This checklist also has a large proportion of items related to 2-way and closed-loop communication and therefore places heavy weight on this component of handoffs. Adapting this checklist into an assessment tool would require additional validity evidence but could make it a very useful instrument for completing handoff assessments and providing meaningful feedback.

The ideal data collection instrument would include outcome measures in addition to process measures. Improvements in outcome measures, such as medical errors and adverse events, are more difficult to document but provide more valuable data about the impact of curricula. In designing new hybrid curricula, it will be extremely important to focus on the outcomes that reflect the greatest impact on patient safety.

Finally, this study reminds us that the delivery modes of curricula are important factors in learning. The control group received an exclusively didactic presentation that they had heard before, while the other 3 groups had interactive components, including role plays and group discussions. The improvements in different domains with different training formats provide evidence for the complementary nature of these approaches. Interactive curricula involving role plays, simulations, and small-group discussions are more resource-intensive than simple didactics, but they are also likely to be more impactful.

Teaching and assessing the quality of handoffs is critical to the safe practice of medicine. New ACGME duty hour requirements, which began in July, allow for increased flexibility, permitting longer shifts with shorter breaks.7 Regardless of the shift and call schedules programs design for their trainees, safe handoffs are essential. The strategies described here may be useful for helping institutions improve patient safety through better handoffs. This study adds to the bulk of data demonstrating that handoffs are a skill that should be both taught and assessed during residency training.

References

1. Lee SH, Terndrup C, Phan PH, et al. A randomized cohort controlled trial to compare intern sign-out training interventions. J Hosp Med. 2017;12(12):979-983.
2. Arora VM, Manjarrez E, Dressler DD, Basaviah P, Halasyamani L, Kripalani S. Hospitalist handoffs: a systematic review and task force recommendations. J Hosp Med. 2009;4(7):433-440.
3. Accreditation Council for Graduate Medical Education. Common Program Requirements. 2017. https://www.acgmecommon.org/2017_requirements. Accessed November 10, 2017.
4. The Joint Commission. Improving Transitions of Care: Hand-off Communications. 2013. http://www.centerfortransforminghealthcare.org/tst_hoc.aspx. Accessed November 10, 2017.
5. Starmer AJ, Spector ND, Srivastava R, et al. Changes in medical errors after implementation of a handoff program. N Engl J Med. 2014;371(19):1803-1812.
6. Feraco AM, Starmer AJ, Sectish TC, Spector ND, West DC, Landrigan CP. Reliability of verbal handoff assessment and handoff quality before and after implementation of a resident handoff bundle. Acad Pediatr. 2016;16(6):524-531.
7. Accreditation Council for Graduate Medical Education. Common Program Requirements. 2017. https://www.acgmecommon.org/2017_requirements. Accessed June 12, 2017.

Journal of Hospital Medicine. 2017;12(12):1022-1023.


Divisions of Pediatric Hospital Medicine and Medical Education, Department of Pediatrics, UCSF Benioff Children’s Hospital, University of California, San Francisco, San Francisco, California

Correspondence: Glenn Rosenbluth, MD, Department of Pediatrics, 550 16th Street, 5th floor, San Francisco, CA 94143-0110; Telephone: 415-476-9180; Fax: 415-476-4009; E-mail: glenn.rosenbluth@ucsf.edu

Received: June 13, 2017; Accepted: June 19, 2017

© 2017 Society of Hospital Medicine. DOI 10.12788/jhm.2849
Adapting this checklist into an assessment tool would require additional validity evidence but could make it a very useful tool for completing handoff assessments and providing meaningful feedback.<br/><br/>The ideal data collection instrument would also include outcome measures, in addition to process measures. Improvements in outcome measures such as medical errors and adverse events, are more difficult to document but also provide more valuable data about the impact of curricula. In designing new hybrid curricula, it will be extremely important to focus on those outcomes that reflect the greatest impact on patient safety.<br/><br/>Finally, this study reminds us that the delivery modes of curricula are important factors in learning. The control group received an exclusively didactic presentation that they had heard before, while the other 3 groups had interactive components including role plays and group discussions. The improvements in different domains with different training formats provide evidence for the complementary nature. Interactive curricula involving role plays, simulations, and small-group discussions are more resource-intense than simple didactics, but they are also likely to be more impactful.<br/><br/>Teaching and assessing the quality of handoffs is critical to the safe practice of medicine. New ACGME duty hour requirements, which began in July, will allow for increased flexibility allowing longer shifts with shorter breaks.<sup>7</sup> Regardless of the shift/call schedules programs design for their trainees, safe handoffs are essential. The strategies described here may be useful for helping institutions improve patient safety through better handoffs. This study adds to the bulk of data demonstrating that handoffs are a skill that should be both taught and assessed during residency training.</p> <p class="references">1. Lee SH, Terndrup C, Phan PH, et al. A Randomized Cohort Controlled Trial to Compare Intern Sign-Out Training Interventions. <i>J Hosp Med</i>. 2017;12(12):979-983.<br/><br/>2. Arora VM, Manjarrez E, Dressler DD, Basaviah P, Halasyamani L, Kripalani S. Hospitalist handoffs: a systematic review and task force recommendations. <i>J Hosp Med</i>. 2009;4(7):433-440.<br/><br/>3. Accreditation Council for Graduate Medical Education. Common Program Requirements. 2017. https://www.acgmecommon.org/2017_requirements Accessed November 10, 2017.<br/><br/>4. The Joint Commission. Improving Transitions of Care: Hand-off Communications. 2013; http://www.centerfortransforminghealthcare.org/tst_hoc.aspx. Accessed November 10, 2017.<br/><br/>5. Starmer AJ, Spector ND, Srivastava R, et al. Changes in medical errors after implementation of a handoff program. <i>N Engl J Med</i>. 2014;371(19):1803-1812.<br/><br/>6. Feraco AM, Starmer AJ, Sectish TC, Spector ND, West DC, Landrigan CP. Reliability of Verbal Handoff Assessment and Handoff Quality Before and After Implementation of a Resident Handoff Bundle. <i>Acad Pediatr</i>. 2016;16(6):524-531.<br/><br/>7. Accreditation Council for Continuing Medical Education. Common Program Requirements. 2017; https://www.acgmecommon.org/2017_requirements. Accessed on June 12, 2017.</p> </itemContent> </newsItem> </itemSet></root>
Article Source

© 2017 Society of Hospital Medicine

Disallow All Ads
Correspondence Location
Glenn Rosenbluth, MD, Department of Pediatrics, 550 16th Street, 5th floor, San Francisco, CA 94143-0110; Telephone: 415-476-9180; Fax: 415-476-4009; E-mail: glenn.rosenbluth@ucsf.edu
Content Gating
Gated (full article locked unless allowed per User)
Alternative CME
Disqus Comments
Default
Gating Strategy
First Peek Free
Article PDF Media

Association Between Ordering Patterns and Shift-Based Care in General Pediatrics Inpatients

Duty-hour restrictions were implemented by the Accreditation Council for Graduate Medical Education (ACGME) in 2003 in response to data showing that sleep deprivation was correlated with serious medical errors.[1] In 2011, the ACGME imposed more explicit restrictions on the number of hours worked and on maximum shift length.[2] These requirements have necessitated a transition from a traditional q4 call model for interns to one in which shifts are limited to a maximum of 16 hours.

Studies of interns working these shorter shifts have had varied results, and comprehensive reviews have failed to demonstrate consistent improvements.[3, 4, 5] Studies of shift‐length limitation initially suggested improvements in patient safety (decreased length of stay,[6, 7] cost of hospitalization,[6] medication errors,[7] serious medical errors,[8] and intensive care unit [ICU] admissions[9]) and resident quality of life.[10] However, other recent studies have reported an increased number of self‐reported medical errors[11] and either did not detect change[12] or reported perceived decreases[13] in quality of care and continuity of care.

We previously reported decreased length of stay and decreased cost of hospitalization in pediatric inpatients cared for in a day/night shift-based care model.[6] A hypothesized reason for those improvements is that the restructured care model led to increased active clinical management during both day and night hours. Here we report the findings of a retrospective analysis investigating this hypothesis.

PATIENTS AND METHODS

Study Population

We reviewed the charts of pediatric patients admitted to University of California, San Francisco Benioff Children's Hospital, a 175-bed tertiary care facility, over a 2-year period between September 15, 2007 and September 15, 2008 (preintervention) and September 16, 2008 and September 16, 2009 (postintervention). During this study period, our hospital was still dependent on paper orders. Admission order sets were preprinted paper forms that were unchanged for the study period. Using International Classification of Diseases, 9th Revision coding, we identified patients on the general pediatrics service with 1 of 6 common diagnoses: dehydration, community-acquired pneumonia, aspiration pneumonia, upper respiratory infection, asthma, and bronchiolitis. These diagnoses were chosen because it was hypothesized that their length of inpatient stay could be impacted by active clinical management. We excluded patients admitted to the ICU or transferred between services.

A list of medical record numbers (MRNs) corresponding to admissions for 1 of the 6 above diagnoses during the pre- and postintervention periods was compiled. MRNs were randomized and then sequentially reviewed until 50 admissions in each time period were obtained. After data collection was completed, we noted that 2 patients had been in the ICU for part of their hospitalization; these were excluded, leaving 48 preintervention admissions and 50 postintervention admissions for analysis.
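As a rough illustration of this sampling procedure, the sketch below shuffles the list of eligible MRNs and reviews them sequentially until the target number of included admissions is reached. This is a minimal sketch, not the authors' actual workflow; the record structure and the exclusion fields are hypothetical.

```python
import random

def sample_admissions(eligible_mrns, chart_lookup, target=50, seed=42):
    """Shuffle eligible MRNs and review charts in random order until the
    target number of included admissions is reached.

    eligible_mrns: list of MRNs for admissions with a qualifying diagnosis
    chart_lookup: dict mapping MRN -> chart record (hypothetical structure)
    """
    mrns = list(eligible_mrns)
    random.Random(seed).shuffle(mrns)  # randomize the review order
    included = []
    for mrn in mrns:
        chart = chart_lookup[mrn]
        # Hypothetical exclusion check: ICU stays and service transfers are skipped
        if chart.get("icu_stay") or chart.get("transferred_between_services"):
            continue
        included.append(mrn)
        if len(included) == target:
            break
    return included
```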

Intervention

During the preintervention period, patients were cared for by interns who took call every sixth night (duty periods up to 30 hours), with cross‐coverage of patients on multiple teams. Cross‐coverage was defined as coverage of patients cared for during nonconsecutive shifts and for whom residents did not participate in attending rounds. Noncall shifts were typically 10 to 11 hours. They were supervised by senior residents who took call every fourth or fifth night and who provided similar cross‐coverage.

During the postintervention period, interns worked day and night shifts of 13 hours (1 hour overlap time between shifts for handoffs), with increased night staffing to eliminate intern‐level cross‐coverage of multiple teams and maintain interns as the primary providers. Interns covered the same team for 5 to 7 consecutive days on either the day or night shifts. Interns remained on the same teams when they switched from day shifts to night shifts to preserve continuity. There were some 24‐hour shifts for senior residents on weekends. Senior residents maintained supervisory responsibility for all patients (both hospitalist teams and a subspecialty team). They also worked 7 consecutive nights.

There were changes in the staffing ratios associated with the change to day and night teams (Table 1, Figure 1). In the preintervention period, general pediatrics patients were covered by a single hospitalist and cohorted on a single team (team A), which also covered several groups of subspecialty patients with subspecialty attendings. The team consisted of 2 interns and 1 senior resident, who shared extended (30-hour) call in a cycle with 2 other inpatient teams. In the postintervention period, general pediatrics patients were split between 2 teams (teams D and E) and mixed with subspecialty patients. Hospitalists continued to serve as the attendings, and these hospitalists also covered specialty patients with subspecialists in consulting roles. The teams consisted of 3 interns on the day shift and 1 on the night shift. There was 1 senior resident per team on the day shift and a single senior resident covering all teams at night.

Table 1. Team Composition Before and After Intervention

Preintervention
  Team A: General pediatrics, pulmonary, adolescent
  Team B: GI/liver, neurology, endocrine
  Team C: Renal, rheumatology
  Team members*: 2 interns (q6 call); 1 senior resident (q5 call)
  Night-shift coverage*: 1 intern and 1 senior resident together covered all 3 teams
  Intern cross-coverage of other teams: Nights/clinic afternoons
  Length of night shift: 30 hours

Postintervention
  Team D: General pediatrics, mixed specialty
  Team E: General pediatrics, mixed specialty
  Team F: Liver, renal
  Team members*: 4 interns (3 on day shift/1 on night shift); 1 senior resident
  Night-shift coverage*: 1 night intern per team (teams D/E) working 7 consecutive night shifts; 1 supervising night resident covering all 3 teams
  Intern cross-coverage of other teams: None
  Length of night shift: 13 hours

NOTE: Abbreviations: GI, gastrointestinal. *Refers only to general pediatrics patient coverage (teams A, D, and E).
Figure 1. Team staffing before and after the intervention. Abbreviations: PGY2, postgraduate year 2.

There was no change in the paper‐order system, the electronic health record, timing of the morning blood draw, use of new facilities for patient care, or protocol for emergency department admission. Concomitant with the restructuring, most subspecialty patients were consolidated onto the hospitalist service, necessitating creation of a second hospitalist team. However, patients admitted with the diagnoses identified above would have been on the hospitalist service before and after the restructuring.

Data Collection/Analysis

We reviewed specific classes of orders and categorized them by type (respiratory medication, oxygen, intravenous [IV] fluids, diet, monitoring, and activity), by time of day (day vs night shift), and by whether they represented an escalation or de-escalation of care. De-escalation of care was defined as an order that decreased the intensity of care, such as weaning a patient off nebulized albuterol or decreasing IV fluids. Orders placed between 07:00 and 18:00 were considered day-shift orders, and orders placed between 18:01 and 06:59 were classified as night-shift orders. Only orders falling into 1 of the aforementioned categories were recorded. Admission order sets were not included. Initially, charts were reviewed by both investigators together; after comparing results for 10 charts to ensure consistency of methodology and criteria, the remaining charts were reviewed by 1 of the study investigators.
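As a rough illustration of this classification scheme, the following sketch assigns each order to a day or night shift based on its timestamp and tags it as an escalation or de-escalation. The order fields (order_type, order_time, direction) are hypothetical placeholders; the shift boundaries follow the definitions in the text.

```python
from datetime import datetime, time
from typing import Optional

DAY_START = time(7, 0)   # 07:00, start of day shift
DAY_END = time(18, 0)    # 18:00, end of day shift (inclusive per the study definition)

TRACKED_TYPES = {"respiratory_medication", "oxygen", "iv_fluids",
                 "diet", "monitoring", "activity"}

def classify_shift(order_time: datetime) -> str:
    """Return 'day' for orders placed between 07:00 and 18:00, else 'night'."""
    t = order_time.time()
    return "day" if DAY_START <= t <= DAY_END else "night"

def classify_order(order: dict) -> Optional[dict]:
    """Classify one order record; returns None for order types that were not tracked.

    `order` is a hypothetical record with keys 'order_type', 'order_time',
    and 'direction' ('escalation' or 'de-escalation').
    """
    if order["order_type"] not in TRACKED_TYPES:
        return None  # only the six tracked order categories were recorded
    return {
        "order_type": order["order_type"],
        "shift": classify_shift(order["order_time"]),
        "direction": order["direction"],
    }

# Example: a 2:15 AM order decreasing IV fluids counts as a night-shift de-escalation.
example = {"order_type": "iv_fluids",
           "order_time": datetime(2008, 3, 1, 2, 15),
           "direction": "de-escalation"}
print(classify_order(example))
```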

To compare demographics, diagnoses, and ordering patterns, t tests and χ2 tests were used (SAS version 9.2 [SAS Institute, Cary, NC]; Stata version 13.1 [StataCorp, College Station, TX]). Multivariate gamma models (SAS version 9.2 [SAS Institute]) that adjusted for clustering at the attending level and for patient age were used to compare severity of illness before and after the intervention. This study was approved by the University of California, San Francisco Committee on Human Research.
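For readers who want to reproduce this type of comparison, a minimal sketch of the two basic tests is shown below using SciPy rather than the SAS/Stata packages used in the study. The continuous values are made up for illustration; the insurance counts are taken from Table 2. Note that SciPy's chi-square routine applies a continuity correction to 2x2 tables by default, so its P value may differ slightly from the one reported.

```python
import numpy as np
from scipy import stats

# Illustrative continuous variable (e.g., age in years), compared with a two-sample t test
pre_age = np.array([2.1, 4.8, 6.3, 1.0, 9.5, 3.2])
post_age = np.array([5.5, 7.1, 2.4, 8.0, 4.9, 6.6])
t_stat, t_p = stats.ttest_ind(pre_age, post_age)

# Categorical variable (insurance type), compared with a chi-square test
# Rows: preintervention, postintervention; columns: private, public (counts from Table 2)
insurance_counts = np.array([[13, 35],
                             [26, 24]])
chi2, chi_p, dof, expected = stats.chi2_contingency(insurance_counts)

print(f"t test: t = {t_stat:.2f}, P = {t_p:.3f}")
print(f"chi-square: chi2 = {chi2:.2f}, P = {chi_p:.4f}")
```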

RESULTS

We analyzed data for 48 admissions preintervention and 50 postintervention. With the exception of insurance type, there was no difference in baseline demographics, diagnoses, or severity of illness between the groups (Table 2). Within the order classes above, we identified 212 orders preintervention and 231 orders postintervention.

Table 2. Patient Demographics and Diagnoses
Preintervention, n = 48, N (%)   Postintervention, n = 50, N (%)   P Value
  • NOTE: Abbreviations: ED, emergency department; NH, non‐Hispanic; SD, standard deviation; URI, upper respiratory infection.

Age, y, mean (SD) 4.8 (4.6) 5.5 (4.7) 0.4474
Race/ethnicity 0.1953
NH white 12 (25.0%) 9 (18.0%)
NH black 11 (22.9%) 7 (14.0%)
Hispanic 16 (33.3%) 13 (26.0%)
Asian 6 (12.5%) 10 (20.0%)
Other 3 (6.3%) 10 (20.0%)
Missing 0 1 (2.0%)
Gender 0.6577
Female 19 (39.6%) 22 (44.0%)
Male 29 (60.4%) 28 (56.0%)
Primary language 0.2601
English 38 (79.2%) 45 (90.0%)
Spanish 9 (18.8%) 5 (10.0%)
Other 1 (2.1%) 0
Insurance 0.0118
Private 13 (27.1%) 26 (52.0%)
Medical 35 (72.9%) 24 (48.0%)
Other 0 0
Admit source 0.6581
Referral 20 (41.7%) 18 (36.0%)
ED 26 (54.2%) 31 (62.0%)
Transfer 2 (4.2%) 1 (2.0%)
Severity of illness 0.1926
Minor 15 (31.3%) 24 (48.0%)
Moderate 23 (47.9%) 16 (32.0%)
Severe 10 (20.8%) 10 (20.0%)
Extreme 0 0
Diagnoses 0.562
Asthma 21 19
Bronchiolitis 2 4
Pneumonia 17 19
Dehydration 6 7
URI 0 1
Aspiration pneumonia 2 0

After the intervention, there was a statistically significant increase in the average number of orders written within the first 12 hours (pre: 0.58 orders vs post: 1.12 orders, P = 0.009) and the first 24 hours (pre: 1.52 vs post: 2.38, P = 0.004) following admission, not including the admission order set (Table 3). The fraction of orders written at night was not significantly different (27% preintervention vs 33% postintervention, P = 0.149). The fraction of admissions occurring on the day shift compared with the night shift did not change (P = 0.72). There was no difference in the ratio of de-escalation to escalation orders written during the night (Table 3).
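The primary outcome, mean number of orders written within a fixed window after admission, amounts to a simple grouping calculation. The sketch below, using pandas with hypothetical column names (admission_id, admit_time, order_time, period), shows one way to derive the 12- and 24-hour means compared in Table 3; it is an illustration of the calculation, not the authors' code.

```python
import pandas as pd

def mean_orders_within(orders: pd.DataFrame, admissions: pd.DataFrame,
                       hours: int) -> pd.Series:
    """Mean number of tracked orders per admission within `hours` of admission,
    reported separately for each study period.

    orders: one row per order, columns ['admission_id', 'order_time']
    admissions: one row per admission, columns
                ['admission_id', 'admit_time', 'period']  # 'pre' or 'post'
    Admission order sets are assumed to have been excluded already.
    """
    merged = orders.merge(admissions, on="admission_id")
    elapsed = merged["order_time"] - merged["admit_time"]
    in_window = merged[elapsed <= pd.Timedelta(hours=hours)]

    # Count early orders per admission, filling 0 for admissions with none
    counts = (in_window.groupby("admission_id").size()
              .reindex(admissions["admission_id"], fill_value=0))
    per_admission = admissions.set_index("admission_id").assign(n_orders=counts)
    return per_admission.groupby("period")["n_orders"].mean()

# Usage: mean_orders_within(orders_df, admissions_df, hours=12)
# would yield values comparable to the 0.58 (pre) and 1.12 (post) reported in Table 3.
```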

Table 3. Distribution of Orders
Preintervention, 48 Admissions   Postintervention, 50 Admissions   P Value
  • NOTE: *Excludes admission order set.

Total no. of orders 212 231
Mean no. of orders per admission 4.42 4.62
Day shift orders, n (%) 155 (73) 155 (67) 0.149
Night shift orders, n (%) 57 (27) 76 (33)
Mean no. of orders within first 12 hours* 0.58 1.12 0.009
Mean no. of orders within first 24 hours* 1.52 2.38 0.004
Night shift escalation orders (%) 27 (47) 33 (43) 0.491
Night shift de‐escalation orders (%) 30 (53) 43 (57)

DISCUSSION

In this study, we demonstrate increased patient care management early in the hospitalization, measured by the mean number of orders written per patient in the first 12 and 24 hours after admission, following a transition from a call schedule with extended (>16-hour) shifts to one with shorter shifts compliant with current ACGME duty-hour restrictions and an explicit focus on greater continuity of care. We did not detect a change in the proportion of total orders written on the night shift compared with the day shift. Earlier active medical management, such as weaning nebulized albuterol or supplemental oxygen, can speed the time to discharge.[14]

Our failure to detect a significant change in the proportion or type of orders written at night may have been due to our small sample size. Anecdotally, after the intervention, medical students reported noticing a difference between our service, in which night teams are expected to advance care, and other services at our institution, in which nights are a time to focus on "putting out fires." This had not been reported to us prior to the intervention and likely reflects the overall approach to patient care taken by residents working a night shift as part of a longitudinal care team.

This study builds on previous findings that demonstrated lower costs and shorter length of stay after implementing a schedule based on day and night teams.[7] The reasons for such improvements are likely multifactorial. In our model, which was purposefully designed to create night-team continuity and minimize cross-coverage, residents likely felt a greater sense of responsibility for, and familiarity with, their patients[15] and therefore felt more comfortable advancing care. Not only were interns likely better rested, but the patient-to-provider ratio was also lower than in the preintervention model. Increases in staffing are often necessary to eliminate cross-coverage while maintaining safe, 24-hour care. These findings suggest that increased costs from additional staffing may be at least partially offset by more active patient management early in the hospitalization, which has the potential to shorten hospital stays.

There are several limitations to our research. We studied a small sample, including a subset of general pediatrics diagnoses that are amenable to active management, which limits generalizability. We did not calculate a physician-to-patient ratio because this was not possible with the retrospective data we collected. Staffing ratios likely improved, and we consider that part of the overall improvements in staffing that may have contributed to the observed changes in ordering patterns. Although intern-level cross-coverage was eliminated, the senior resident continued to cover multiple teams overnight; this senior resident covered the same 3 teams for 7 consecutive nights. The addition of a hospitalist team, with subspecialists placed in consultant roles, may have contributed to the increase in active management, though our study population did not include subspecialty patients. There was a difference in insurance status between the 2 groups; this was unlikely to affect resident physician practices because insurance information is not routinely discussed in the course of patient care. In the context of the ongoing debate about duty-hour restrictions, it will be important for future studies to elucidate whether sleep or other variables are the primary contributors to this finding. Our data are derived solely from 1 inpatient service at a single academic medical center; however, we believe there are lessons that may be applied to other settings.

CONCLUSION

A staffing model with improved nighttime resident coverage was associated with a greater number of orders written early in the hospitalization, suggesting more active management of clinical problems to advance care.

Acknowledgements

The authors thank Dr. I. Elaine Allen, John Kornak, and Dr. Derek Pappas for assistance with biostatistics, and Dr. Diana Bojorquez and Dr. Derek Pappas for assistance with review of the manuscript and creation of the figures.

Disclosures: None of the authors have financial relationships or other conflicts of interest to disclose. No external funding was secured for this study. Dr. Auerbach was supported by grant K24HL098372 during the course of this study. This project was supported by the National Center for Advancing Translational Sciences, National Institutes of Health (NIH), through University of California, San Francisco Clinical and Translational Sciences Institute grant UL1 TR000004. Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the NIH. Dr. Rosenbluth had access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

References
  1. Philibert I, Friedmann P, Williams WT. New requirements for resident duty hours. JAMA. 2002;288(9):1112-1114.
  2. Accreditation Council for Graduate Medical Education. Common program requirements. 2011. Available at: http://www.acgme.org/acgmeweb/Portals/0/PDFs/Common_Program_Requirements_07012011[2].pdf. Accessed November 28, 2011.
  3. Fletcher KE, Reed DA, Arora VM. Patient safety, resident education and resident well-being following implementation of the 2003 ACGME duty hour rules. J Gen Intern Med. 2011;26(8):907-919.
  4. Ahmed N, Devitt KS, Keshet I, et al. A systematic review of the effects of resident duty hour restrictions in surgery: impact on resident wellness, training, and patient outcomes. Ann Surg. 2014;259(6):1041-1053.
  5. Philibert I, Nasca T, Brigham T, Shapiro J. Duty-hour limits and patient care and resident outcomes: can high-quality studies offer insight into complex relationships? Annu Rev Med. 2013;64:467-483.
  6. Rosenbluth G, Fiore DM, Maselli JH, Vittinghoff E, Wilson SD, Auerbach AD. Association between adaptations to ACGME duty hour requirements, length of stay, and costs. Sleep. 2013;36(2):245-248.
  7. Gottlieb DJ, Parenti CM, Peterson CA, Lofgren RP. Effect of a change in house staff work schedule on resource utilization and patient care. Arch Intern Med. 1991;151(10):2065-2070.
  8. Landrigan CP, Rothschild JM, Cronin JW, et al. Effect of reducing interns' work hours on serious medical errors in intensive care units. N Engl J Med. 2004;351(18):1838-1848.
  9. Horwitz LI, Kosiborod M, Lin Z, Krumholz HM. Changes in outcomes for internal medicine inpatients after work-hour regulations. Ann Intern Med. 2007;147(2):97-103.
  10. Levine AC, Adusumilli J, Landrigan CP. Effects of reducing or eliminating resident work shifts over 16 hours: a systematic review. Sleep. 2010;33(8):1043-1053.
  11. Sen S, Kranzler HR, Didwania AK, et al. Effects of the 2011 duty hour reforms on interns and their patients: a prospective longitudinal cohort study. JAMA Intern Med. 2013;173(8):657-662; discussion 663.
  12. McCoy CP, Halvorsen AJ, Loftus CG, McDonald FS, Oxentenko AS. Effect of 16-hour duty periods on patient care and resident education. Mayo Clin Proc. 2011;86(3):192-196.
  13. Desai SV, Feldman L, Brown L, et al. Effect of the 2011 vs 2003 duty hour regulation-compliant models on sleep duration, trainee education, and continuity of patient care among internal medicine house staff: a randomized trial. JAMA Intern Med. 2013;173(8):649-655.
  14. Johnson KB, Blaisdell CJ, Walker A, Eggleston P. Effectiveness of a clinical pathway for inpatient asthma management. Pediatrics. 2000;106(5):1006-1012.
  15. Burgis JC, Lockspeiser TM, Stumpf EC, Wilson SD. Resident perceptions of autonomy in a complex tertiary care environment improve when supervised by hospitalists. Hosp Pediatr. 2012;2(4):228-234.
Article PDF
Issue
Journal of Hospital Medicine - 11(3)
Publications
Page Number
210-214
Sections
Files
Files
Article PDF
Article PDF

Duty‐hour restrictions were implemented by the Accreditation Council for Graduate Medical Education (ACGME) in 2003 in response to data showing that sleep deprivation was correlated with serious medical errors.[1] In 2011, the ACGME required more explicit restrictions in the number of hours worked and the maximal shift length.[2] These requirements have necessitated a transition from a traditional q4 call model for interns to one in which shifts are limited to a maximum of 16 hours.

Studies of interns working these shorter shifts have had varied results, and comprehensive reviews have failed to demonstrate consistent improvements.[3, 4, 5] Studies of shift‐length limitation initially suggested improvements in patient safety (decreased length of stay,[6, 7] cost of hospitalization,[6] medication errors,[7] serious medical errors,[8] and intensive care unit [ICU] admissions[9]) and resident quality of life.[10] However, other recent studies have reported an increased number of self‐reported medical errors[11] and either did not detect change[12] or reported perceived decreases[13] in quality of care and continuity of care.

We previously reported decreased length of stay and decreased cost of hospitalization in pediatric inpatients cared for in a day/night‐shiftbased care model.[6] An hypothesized reason for those care improvements is the restructured care model led to increased active clinical management during both day and night hours. Here we report the findings of a retrospective analysis to investigate this hypothesis.

PATIENTS AND METHODS

Study Population

We reviewed the charts of pediatric patients admitted to University of California, San Francisco Benioff Children's Hospital, a 175‐bed tertiary care facility, over a 2‐year period between September 15, 2007 and September 15, 2008 (preintervention) and September 16, 2008 and September 16, 2009 (postintervention). During this study period, our hospital was still dependent on paper orders. Admission order sets were preprinted paper forms that were unchanged for the study period. Using International Classification of Diseases, 9th Revision coding, we identified patients on the general pediatrics service with 1 of 6 common diagnosesdehydration, community‐acquired pneumonia, aspiration pneumonia, upper respiratory infection, asthma, and bronchiolitis. These diagnoses were chosen because it was hypothesized that their length of inpatient stay could be impacted by active clinical management. We excluded patients admitted to the ICU or transferred between services.

A list of medical record numbers (MRNs) corresponding to admissions for 1 of the 6 above diagnoses during the pre‐ and postintervention periods was compiled. MRNs were randomized and then sequentially reviewed until 50 admissions in each time period were obtained. After data collection was completed, we noted that 2 patients had been in the ICU for part of their hospitalization, and these were excluded, leaving 48 admissions from prior to the intervention and 50 admissions from after intervention who were examined.

Intervention

During the preintervention period, patients were cared for by interns who took call every sixth night (duty periods up to 30 hours), with cross‐coverage of patients on multiple teams. Cross‐coverage was defined as coverage of patients cared for during nonconsecutive shifts and for whom residents did not participate in attending rounds. Noncall shifts were typically 10 to 11 hours. They were supervised by senior residents who took call every fourth or fifth night and who provided similar cross‐coverage.

During the postintervention period, interns worked day and night shifts of 13 hours (1 hour overlap time between shifts for handoffs), with increased night staffing to eliminate intern‐level cross‐coverage of multiple teams and maintain interns as the primary providers. Interns covered the same team for 5 to 7 consecutive days on either the day or night shifts. Interns remained on the same teams when they switched from day shifts to night shifts to preserve continuity. There were some 24‐hour shifts for senior residents on weekends. Senior residents maintained supervisory responsibility for all patients (both hospitalist teams and a subspecialty team). They also worked 7 consecutive nights.

There were changes in the staffing ratios associated with the change to day and night teams (Table 1, Figure 1). In the preintervention period, general pediatrics patients were covered by a single hospitalist and cohorted on a single team (team A), which also covered several groups of subspecialty patients with subspecialty attendings. The team consisted of 2 interns and 1 senior resident, who shared extended (30‐hour) call in a cycle with 2 other inpatient teams. In the postintervention period, general pediatrics patients were split between 2 teams (teams D and E) and mixed with subspecialty patients. Hospitalist continued to be the attendings, and these hospitalists also covered specialty patients with subspecialists in consulting roles. The teams consisted of 3 interns on the day shift, and 1 on the night shift. There was 1 senior resident per team on day shift, and a single senior resident covering all teams at night.

Team Composition Before and After Intervention
Preintervention Postintervention
  • Refers to only to general pediatrics patient coverage Teams A, D, and E.

  • NOTE: Abbreviations: GI, gastrointestinal. *Refers to only to general pediatrics patient coverageteams A, D, and E.

General Pediatrics Team A Team B Team C Team D Team E Team F
Patient Distribution General Pediatrics GI/Liver Renal General Pediatrics General Pediatrics Liver
Pulmonary Neurology Rheumatology Mixed Specialty Mixed Specialty Renal
Adolescent Endocrine
Team membersa 2 interns (q6 call) 4 interns (3 on day shift/1 on night shift)
1 senior resident (q5 call) 1 senior resident
Night‐shift coveragea 1 intern and 1 senior resident together covered all 3 teams. 1 night intern per team (teams D/E) working 7 consecutive night shifts
1 supervising night resident covering all 3 teams
Intern cross‐coverage of other teams Nights/clinic afternoons None
Length of night shift 30 hours 13 hours
jhm2507-fig-0001-m.png
Team staffing before and after the intervention. Abbreviations: PGY2, postgraduate year 2.

There was no change in the paper‐order system, the electronic health record, timing of the morning blood draw, use of new facilities for patient care, or protocol for emergency department admission. Concomitant with the restructuring, most subspecialty patients were consolidated onto the hospitalist service, necessitating creation of a second hospitalist team. However, patients admitted with the diagnoses identified above would have been on the hospitalist service before and after the restructuring.

Data Collection/Analysis

We reviewed specific classes of orders and categorized by type: respiratory medication, oxygen, intravenous (IV) fluids, diet, monitoring, and activity, time of day (day vs night‐shift), and whether they were an escalation or de‐escalation of care. De‐escalation of care was defined as orders that decreased patient care such as weaning a patient off nebulized albuterol or decreasing their IV fluids. Orders between 07:00 to 18:00 were considered day‐shift orders and between 18:01 and 06:59 were classified as night‐shift orders. Only orders falling into 1 of the aforementioned categories were recorded. Admission order sets were not included. Initially, charts were reviewed by both investigators together; after comparing results for 10 charts to ensure consistency of methodology and criteria, the remaining charts were reviewed by 1 of the study investigators.

To compare demographics, diagnoses, and ordering patterns, t tests and 2 (SAS version 9.2 [SAS Institute, Cary, NC], Stata version 13.1 [StataCorp, College Station, TX]) were used. Multivariate gamma models (SAS version 9.2 [SAS Institute]) that adjusted for clustering at the attending level and patient age were used to compare severity of illness before and after the intervention. This study was approved by the University of California, San Francisco Committee on Human Research.

RESULTS

We analyzed data for 48 admissions preintervention and 50 postintervention. With the exception of insurance type, there was no difference in baseline demographics, diagnoses, or severity of illness between the groups (Table 2). Within the order classes above, we identified 212 orders preintervention and 231 orders postintervention.

Patient Demographics and Diagnoses
Preintervention,n = 48, N (%) Postintervention, n = 50, N (%) P Value
  • NOTE: Abbreviations: ED, emergency department; NH, non‐Hispanic; SD, standard deviation; URI, upper respiratory infection.

Age, y, mean (SD) 4.8 (4.6) 5.5 (4.7) 0.4474
Race/ethnicity 0.1953
NH white 12 (25.0%) 9 (18.0%)
NH black 11 (22.9%) 7 (14.0%)
Hispanic 16 (33.3%) 13 (26.0%)
Asian 6 (12.5%) 10 (20.0%)
Other 3 (6.3%) 10 (20.0%)
Missing 0 1 (2.0%)
Gender 0.6577
Female 19 (39.6%) 22 (44.0%)
Male 29 (60.4%) 28 (56.0%)
Primary language 0.2601
English 38 (79.2%) 45 (90.0%)
Spanish 9 (18.8%) 5 (10.0%)
Other 1 (2.1%) 0
Insurance 0.0118
Private 13 (27.1%) 26 (52.0%)
Medical 35 (72.9%) 24 (48.0%)
Other 0 0
Admit source 0.6581
Referral 20 (41.7%) 18 (36.0%)
ED 26 (54.2%) 31 (62.0%)
Transfer 2 (4.2%) 1 (2.0%)
Severity of illness 0.1926
Minor 15 (31.3%) 24 (48.0%)
Moderate 23 (47.9%) 16 (32.0%)
Severe 10 (20.8%) 10 (20.0%)
Extreme 0 0
Diagnoses 0.562
Asthma 21 19
Bronchiolitis 2 4
Pneumonia 17 19
Dehydration 6 7
URI 0 1
Aspiration pneumonia 2 0

After the intervention, there was a statistically significant increase in the average number of orders written within the first 12 hours (pre: 0.58 orders vs post: 1.12, P = 0.009) and 24 hours (pre: 1.52 vs post: 2.38, P = 0.004) following admission (Table 3), not including the admission order set. The fraction of orders written at night was not significantly different (27% at night preintervention, 33% postintervention, P = 0.149). The fraction of admissions on the day shift compared to the night shift did not change (P = 0.72). There was no difference in the ratio of de‐escalation to escalation orders written during the night (Table 2).

Distribution of Orders
Preintervention, 48 Admissions Postintervention, 50 Admissions P Value
  • NOTE: *Excludes admission order set.

Total no. of orders 212 231
Mean no. of orders per admission 4.42 4.62
Day shift orders, n (%) 155 (73) 155 (67) 0.149
Night shift orders, n (%) 57 (27) 76 (33)
Mean no. of orders within first 12 hours* 0.58 1.12 0.009
Mean no. of orders within first 24 hours* 1.52 2.38 0.004
Night shift escalation orders (%) 27 (47) 33 (43) 0.491
Night shift de‐escalation orders (%) 30 (53) 43 (57)

DISCUSSION

In this study, we demonstrate increased patient care management early in the hospitalization, measured in this study by the mean number of orders written per patient in the first 12 and 24 hours after admission, after transition from a call schedule with extended (>16 hours) shifts to one with shorter shifts compliant with current ACGME duty‐hour restrictions and an explicit focus on greater continuity of care. We did not detect a change in the proportion of total orders written on the night shift compared to the day shift. Earlier active medical management, such as weaning nebulized albuterol or supplemental oxygen, can speed the time to discharge.[14]

Our failure to detect a significant change in the proportion or type of orders written at night may have been due to our small sample size. Anecdotally, after the intervention, medical students reported to us that they noticed a difference between our service, in which we expect night teams to advance care, and other services at our institution, in which nights are a time to focus on putting out fires. This was not something that had been reported to us prior. It is likely reflective of the overall approach to patient care taken by residents working a night shift as part of a longitudinal care team.

This study builds on previous findings that demonstrated lower costs and shorter length of stay after implementing a schedule based on day and night teams.[7] The reasons for such improvements are likely multifactorial. In our model, which was purposefully designed to create night‐team continuity and minimize cross‐coverage, it is likely that residents also felt a greater sense of responsibility for and familiarity with the patients[15] and therefore felt more comfortable advancing care. Not only were interns likely better rested, the patient‐to‐provider ratio was also lower than in the preintervention model. Increases in staffing are often necessary to eliminate cross‐coverage while maintaining safe, 24‐hour care. These findings suggest that increases in cost from additional staffing may be at least partially offset by more active patient management early in the hospitalization, which has the potential to lead to shorter hospital stays.

There are several limitations to our research. We studied a small sample, including a subset of general pediatrics diagnoses that are amenable to active management, limiting generalizability. We did not calculate a physician‐to‐patient ratio because this was not possible with the retrospective data we collected. Staffing ratios likely improved, and we consider that part of the overall improvements in staffing that may have contributed to the observed changes in ordering patterns. Although intern‐level cross‐coverage was eliminated, the senior resident continued to cover multiple teams overnight. This senior covered the same 3 teams for 7 consecutive nights. The addition of a hospitalist team, with subspecialists being placed in consultant roles, may have contributed to the increase in active management, though our study population did not include subspecialty patients. There was a difference in insurance status between the 2 groups. This was unlikely to affect resident physician practices as insurance information is not routinely discussed in the course of patient care. In the context of the ongoing debate about duty‐hour restrictions, it will be important for future studies to elucidate whether sleep or other variables are the primary contributors to this finding. Our data are derived solely from 1 inpatient service at a single academic medical center; however, we do feel there are lessons that may be applied to other settings.

CONCLUSION

A coverage system with improved nighttime resident coverage was associated with a greater number of orders written early in the hospitalization, suggesting more active management of clinical problems to advance care.

Acknowledgements

The authors thank Dr. I. Elaine Allen, John Kornak, and Dr. Derek Pappas for assistance with biostatistics, and Dr. Diana Bojorquez and Dr. Derek Pappas for assistance with review of the manuscript and creation of the figures.

Disclosures: None of the authors have financial relationships or other conflicts of interest to disclose. No external funding was secured for this study. Dr. Auerbach was supported by grant K24HL098372 during the course of this study. This project was supported by the National Center for Advancing Translational Sciences, National Institutes of Health (NIH), through University of California San FranciscoClinical and Translational Sciences Institute grant UL1 TR000004. Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the NIH. Dr. Rosenbluth had access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Duty‐hour restrictions were implemented by the Accreditation Council for Graduate Medical Education (ACGME) in 2003 in response to data showing that sleep deprivation was correlated with serious medical errors.[1] In 2011, the ACGME required more explicit restrictions in the number of hours worked and the maximal shift length.[2] These requirements have necessitated a transition from a traditional q4 call model for interns to one in which shifts are limited to a maximum of 16 hours.

Studies of interns working these shorter shifts have had varied results, and comprehensive reviews have failed to demonstrate consistent improvements.[3, 4, 5] Studies of shift‐length limitation initially suggested improvements in patient safety (decreased length of stay,[6, 7] cost of hospitalization,[6] medication errors,[7] serious medical errors,[8] and intensive care unit [ICU] admissions[9]) and resident quality of life.[10] However, other recent studies have reported an increased number of self‐reported medical errors[11] and either did not detect change[12] or reported perceived decreases[13] in quality of care and continuity of care.

We previously reported decreased length of stay and decreased cost of hospitalization in pediatric inpatients cared for in a day/night‐shiftbased care model.[6] An hypothesized reason for those care improvements is the restructured care model led to increased active clinical management during both day and night hours. Here we report the findings of a retrospective analysis to investigate this hypothesis.

PATIENTS AND METHODS

Study Population

We reviewed the charts of pediatric patients admitted to University of California, San Francisco Benioff Children's Hospital, a 175‐bed tertiary care facility, over a 2‐year period between September 15, 2007 and September 15, 2008 (preintervention) and September 16, 2008 and September 16, 2009 (postintervention). During this study period, our hospital was still dependent on paper orders. Admission order sets were preprinted paper forms that were unchanged for the study period. Using International Classification of Diseases, 9th Revision coding, we identified patients on the general pediatrics service with 1 of 6 common diagnosesdehydration, community‐acquired pneumonia, aspiration pneumonia, upper respiratory infection, asthma, and bronchiolitis. These diagnoses were chosen because it was hypothesized that their length of inpatient stay could be impacted by active clinical management. We excluded patients admitted to the ICU or transferred between services.

A list of medical record numbers (MRNs) corresponding to admissions for 1 of the 6 above diagnoses during the pre‐ and postintervention periods was compiled. MRNs were randomized and then sequentially reviewed until 50 admissions in each time period were obtained. After data collection was completed, we noted that 2 patients had been in the ICU for part of their hospitalization, and these were excluded, leaving 48 admissions from prior to the intervention and 50 admissions from after intervention who were examined.

Intervention

During the preintervention period, patients were cared for by interns who took call every sixth night (duty periods up to 30 hours), with cross‐coverage of patients on multiple teams. Cross‐coverage was defined as coverage of patients cared for during nonconsecutive shifts and for whom residents did not participate in attending rounds. Noncall shifts were typically 10 to 11 hours. They were supervised by senior residents who took call every fourth or fifth night and who provided similar cross‐coverage.

During the postintervention period, interns worked day and night shifts of 13 hours (1 hour overlap time between shifts for handoffs), with increased night staffing to eliminate intern‐level cross‐coverage of multiple teams and maintain interns as the primary providers. Interns covered the same team for 5 to 7 consecutive days on either the day or night shifts. Interns remained on the same teams when they switched from day shifts to night shifts to preserve continuity. There were some 24‐hour shifts for senior residents on weekends. Senior residents maintained supervisory responsibility for all patients (both hospitalist teams and a subspecialty team). They also worked 7 consecutive nights.

There were changes in the staffing ratios associated with the change to day and night teams (Table 1, Figure 1). In the preintervention period, general pediatrics patients were covered by a single hospitalist and cohorted on a single team (team A), which also covered several groups of subspecialty patients with subspecialty attendings. The team consisted of 2 interns and 1 senior resident, who shared extended (30‐hour) call in a cycle with 2 other inpatient teams. In the postintervention period, general pediatrics patients were split between 2 teams (teams D and E) and mixed with subspecialty patients. Hospitalist continued to be the attendings, and these hospitalists also covered specialty patients with subspecialists in consulting roles. The teams consisted of 3 interns on the day shift, and 1 on the night shift. There was 1 senior resident per team on day shift, and a single senior resident covering all teams at night.

Team Composition Before and After Intervention
Preintervention Postintervention
  • Refers to only to general pediatrics patient coverage Teams A, D, and E.

  • NOTE: Abbreviations: GI, gastrointestinal. *Refers to only to general pediatrics patient coverageteams A, D, and E.

General Pediatrics Team A Team B Team C Team D Team E Team F
Patient Distribution General Pediatrics GI/Liver Renal General Pediatrics General Pediatrics Liver
Pulmonary Neurology Rheumatology Mixed Specialty Mixed Specialty Renal
Adolescent Endocrine
Team membersa 2 interns (q6 call) 4 interns (3 on day shift/1 on night shift)
1 senior resident (q5 call) 1 senior resident
Night‐shift coveragea 1 intern and 1 senior resident together covered all 3 teams. 1 night intern per team (teams D/E) working 7 consecutive night shifts
1 supervising night resident covering all 3 teams
Intern cross‐coverage of other teams Nights/clinic afternoons None
Length of night shift 30 hours 13 hours
jhm2507-fig-0001-m.png
Team staffing before and after the intervention. Abbreviations: PGY2, postgraduate year 2.

There was no change in the paper‐order system, the electronic health record, timing of the morning blood draw, use of new facilities for patient care, or protocol for emergency department admission. Concomitant with the restructuring, most subspecialty patients were consolidated onto the hospitalist service, necessitating creation of a second hospitalist team. However, patients admitted with the diagnoses identified above would have been on the hospitalist service before and after the restructuring.

Data Collection/Analysis

We reviewed specific classes of orders and categorized by type: respiratory medication, oxygen, intravenous (IV) fluids, diet, monitoring, and activity, time of day (day vs night‐shift), and whether they were an escalation or de‐escalation of care. De‐escalation of care was defined as orders that decreased patient care such as weaning a patient off nebulized albuterol or decreasing their IV fluids. Orders between 07:00 to 18:00 were considered day‐shift orders and between 18:01 and 06:59 were classified as night‐shift orders. Only orders falling into 1 of the aforementioned categories were recorded. Admission order sets were not included. Initially, charts were reviewed by both investigators together; after comparing results for 10 charts to ensure consistency of methodology and criteria, the remaining charts were reviewed by 1 of the study investigators.

To compare demographics, diagnoses, and ordering patterns, t tests and 2 (SAS version 9.2 [SAS Institute, Cary, NC], Stata version 13.1 [StataCorp, College Station, TX]) were used. Multivariate gamma models (SAS version 9.2 [SAS Institute]) that adjusted for clustering at the attending level and patient age were used to compare severity of illness before and after the intervention. This study was approved by the University of California, San Francisco Committee on Human Research.

RESULTS

We analyzed data for 48 admissions preintervention and 50 postintervention. With the exception of insurance type, there was no difference in baseline demographics, diagnoses, or severity of illness between the groups (Table 2). Within the order classes above, we identified 212 orders preintervention and 231 orders postintervention.

Patient Demographics and Diagnoses
Preintervention,n = 48, N (%) Postintervention, n = 50, N (%) P Value
  • NOTE: Abbreviations: ED, emergency department; NH, non‐Hispanic; SD, standard deviation; URI, upper respiratory infection.

Age, y, mean (SD) 4.8 (4.6) 5.5 (4.7) 0.4474
Race/ethnicity 0.1953
NH white 12 (25.0%) 9 (18.0%)
NH black 11 (22.9%) 7 (14.0%)
Hispanic 16 (33.3%) 13 (26.0%)
Asian 6 (12.5%) 10 (20.0%)
Other 3 (6.3%) 10 (20.0%)
Missing 0 1 (2.0%)
Gender 0.6577
Female 19 (39.6%) 22 (44.0%)
Male 29 (60.4%) 28 (56.0%)
Primary language 0.2601
English 38 (79.2%) 45 (90.0%)
Spanish 9 (18.8%) 5 (10.0%)
Other 1 (2.1%) 0
Insurance 0.0118
Private 13 (27.1%) 26 (52.0%)
Medical 35 (72.9%) 24 (48.0%)
Other 0 0
Admit source 0.6581
Referral 20 (41.7%) 18 (36.0%)
ED 26 (54.2%) 31 (62.0%)
Transfer 2 (4.2%) 1 (2.0%)
Severity of illness 0.1926
Minor 15 (31.3%) 24 (48.0%)
Moderate 23 (47.9%) 16 (32.0%)
Severe 10 (20.8%) 10 (20.0%)
Extreme 0 0
Diagnoses 0.562
Asthma 21 19
Bronchiolitis 2 4
Pneumonia 17 19
Dehydration 6 7
URI 0 1
Aspiration pneumonia 2 0

After the intervention, there was a statistically significant increase in the average number of orders written within the first 12 hours (pre: 0.58 orders vs post: 1.12, P = 0.009) and 24 hours (pre: 1.52 vs post: 2.38, P = 0.004) following admission (Table 3), not including the admission order set. The fraction of orders written at night was not significantly different (27% at night preintervention, 33% postintervention, P = 0.149). The fraction of admissions on the day shift compared to the night shift did not change (P = 0.72). There was no difference in the ratio of de‐escalation to escalation orders written during the night (Table 2).

Distribution of Orders
Preintervention, 48 Admissions Postintervention, 50 Admissions P Value
  • NOTE: *Excludes admission order set.

Total no. of orders 212 231
Mean no. of orders per admission 4.42 4.62
Day shift orders, n (%) 155 (73) 155 (67) 0.149
Night shift orders, n (%) 57 (27) 76 (33)
Mean no. of orders within first 12 hours* 0.58 1.12 0.009
Mean no. of orders within first 24 hours* 1.52 2.38 0.004
Night shift escalation orders (%) 27 (47) 33 (43) 0.491
Night shift de‐escalation orders (%) 30 (53) 43 (57)

DISCUSSION

In this study, we demonstrate increased patient care management early in the hospitalization, measured in this study by the mean number of orders written per patient in the first 12 and 24 hours after admission, after transition from a call schedule with extended (>16 hours) shifts to one with shorter shifts compliant with current ACGME duty‐hour restrictions and an explicit focus on greater continuity of care. We did not detect a change in the proportion of total orders written on the night shift compared to the day shift. Earlier active medical management, such as weaning nebulized albuterol or supplemental oxygen, can speed the time to discharge.[14]

Our failure to detect a significant change in the proportion or type of orders written at night may have been due to our small sample size. Anecdotally, after the intervention, medical students reported to us that they noticed a difference between our service, in which we expect night teams to advance care, and other services at our institution, in which nights are a time to focus on putting out fires. This was not something that had been reported to us prior. It is likely reflective of the overall approach to patient care taken by residents working a night shift as part of a longitudinal care team.

This study builds on previous findings that demonstrated lower costs and shorter length of stay after implementing a schedule based on day and night teams.[7] The reasons for such improvements are likely multifactorial. In our model, which was purposefully designed to create night‐team continuity and minimize cross‐coverage, it is likely that residents also felt a greater sense of responsibility for and familiarity with the patients[15] and therefore felt more comfortable advancing care. Not only were interns likely better rested, but the patient‐to‐provider ratio was also lower than in the preintervention model. Increases in staffing are often necessary to eliminate cross‐coverage while maintaining safe, 24‐hour care. These findings suggest that increases in cost from additional staffing may be at least partially offset by more active patient management early in the hospitalization, which has the potential to lead to shorter hospital stays.

There are several limitations to our research. We studied a small sample, including a subset of general pediatrics diagnoses that are amenable to active management, limiting generalizability. We did not calculate a physician‐to‐patient ratio because this was not possible with the retrospective data we collected. Staffing ratios likely improved, and we consider this part of the overall improvement in staffing that may have contributed to the observed changes in ordering patterns. Although intern‐level cross‐coverage was eliminated, the senior resident continued to cover multiple teams overnight; this senior covered the same 3 teams for 7 consecutive nights. The addition of a hospitalist team, with subspecialists placed in consultant roles, may have contributed to the increase in active management, though our study population did not include subspecialty patients. There was a difference in insurance status between the 2 groups; this was unlikely to affect resident physician practices, as insurance information is not routinely discussed in the course of patient care. In the context of the ongoing debate about duty‐hour restrictions, it will be important for future studies to elucidate whether sleep or other variables are the primary contributors to this finding. Our data are derived solely from 1 inpatient service at a single academic medical center; however, we believe there are lessons that may be applied to other settings.

CONCLUSION

A coverage system with improved nighttime resident continuity was associated with a greater number of orders written early in the hospitalization, suggesting more active management of clinical problems to advance care.

Acknowledgements

The authors thank Dr. I. Elaine Allen, John Kornak, and Dr. Derek Pappas for assistance with biostatistics, and Dr. Diana Bojorquez and Dr. Derek Pappas for assistance with review of the manuscript and creation of the figures.

Disclosures: None of the authors have financial relationships or other conflicts of interest to disclose. No external funding was secured for this study. Dr. Auerbach was supported by grant K24HL098372 during the course of this study. This project was supported by the National Center for Advancing Translational Sciences, National Institutes of Health (NIH), through University of California San Francisco Clinical and Translational Sciences Institute grant UL1 TR000004. Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the NIH. Dr. Rosenbluth had access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

References
  1. Philibert I, Friedmann P, Williams WT. New requirements for resident duty hours. JAMA. 2002;288(9):1112-1114.
  2. Accreditation Council for Graduate Medical Education. Common program requirements. 2011. Available at: http://www.acgme.org/acgmeweb/Portals/0/PDFs/Common_Program_Requirements_07012011[2].pdf. Accessed November 28, 2011.
  3. Fletcher KE, Reed DA, Arora VM. Patient safety, resident education and resident well‐being following implementation of the 2003 ACGME duty hour rules. J Gen Intern Med. 2011;26(8):907-919.
  4. Ahmed N, Devitt KS, Keshet I, et al. A systematic review of the effects of resident duty hour restrictions in surgery: impact on resident wellness, training, and patient outcomes. Ann Surg. 2014;259(6):1041-1053.
  5. Philibert I, Nasca T, Brigham T, Shapiro J. Duty‐hour limits and patient care and resident outcomes: can high‐quality studies offer insight into complex relationships? Annu Rev Med. 2013;64:467-483.
  6. Rosenbluth G, Fiore DM, Maselli JH, Vittinghoff E, Wilson SD, Auerbach AD. Association between adaptations to ACGME duty hour requirements, length of stay, and costs. Sleep. 2013;36(2):245-248.
  7. Gottlieb DJ, Parenti CM, Peterson CA, Lofgren RP. Effect of a change in house staff work schedule on resource utilization and patient care. Arch Intern Med. 1991;151(10):2065-2070.
  8. Landrigan CP, Rothschild JM, Cronin JW, et al. Effect of reducing interns' work hours on serious medical errors in intensive care units. N Engl J Med. 2004;351(18):1838-1848.
  9. Horwitz LI, Kosiborod M, Lin Z, Krumholz HM. Changes in outcomes for internal medicine inpatients after work‐hour regulations. Ann Intern Med. 2007;147(2):97-103.
  10. Levine AC, Adusumilli J, Landrigan CP. Effects of reducing or eliminating resident work shifts over 16 hours: a systematic review. Sleep. 2010;33(8):1043-1053.
  11. Sen S, Kranzler HR, Didwania AK, et al. Effects of the 2011 duty hour reforms on interns and their patients: a prospective longitudinal cohort study. JAMA Intern Med. 2013;173(8):657-662; discussion 663.
  12. McCoy CP, Halvorsen AJ, Loftus CG, McDonald FS, Oxentenko AS. Effect of 16‐hour duty periods on patient care and resident education. Mayo Clin Proc. 2011;86(3):192-196.
  13. Desai SV, Feldman L, Brown L, et al. Effect of the 2011 vs 2003 duty hour regulation‐compliant models on sleep duration, trainee education, and continuity of patient care among internal medicine house staff: a randomized trial. JAMA Intern Med. 2013;173(8):649-655.
  14. Johnson KB, Blaisdell CJ, Walker A, Eggleston P. Effectiveness of a clinical pathway for inpatient asthma management. Pediatrics. 2000;106(5):1006-1012.
  15. Burgis JC, Lockspeiser TM, Stumpf EC, Wilson SD. Resident perceptions of autonomy in a complex tertiary care environment improve when supervised by hospitalists. Hosp Pediatr. 2012;2(4):228-234.
Issue
Journal of Hospital Medicine - 11(3)
Page Number
210-214
Display Headline
Association between ordering patterns and shift‐based care in general pediatrics inpatients
Article Source
© 2015 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Glenn Rosenbluth, MD, Department of Pediatrics, 505 Parnassus Ave, M‐691, San Francisco, CA 94143‐0110; Telephone: 415‐476‐9185; Fax: 415‐476‐4009; E‐mail: rosenbluthg@peds.ucsf.edu

Variation in Printed Handoff Documents

Article Type
Changed
Tue, 05/16/2017 - 23:11
Display Headline
Variation in printed handoff documents: Results and recommendations from a multicenter needs assessment

Handoffs among hospital providers are highly error prone and can result in serious morbidity and mortality. Best practices for verbal handoffs have been described[1, 2, 3, 4] and include conducting verbal handoffs face to face, providing opportunities for questions, having the receiver perform a readback, as well as specific content recommendations including action items. Far less research has focused on best practices for printed handoff documents,[5, 6] despite the routine use of written handoff tools as a reference by on‐call physicians.[7, 8] Erroneous or outdated information on the written handoff can mislead on‐call providers, potentially leading to serious medical errors.

In their most basic form, printed handoff documents list patients for whom a provider is responsible. Typically, they also contain demographic information, reason for hospital admission, and a task list for each patient. They may also contain more detailed information on patient history, hospital course, and/or care plan, and may vary among specialties.[9] They come in various forms, ranging from index cards with handwritten notes, to word‐processor or spreadsheet documents, to printed documents that are autopopulated from the electronic health record (EHR).[2] Importantly, printed handoff documents supplement the verbal handoff by allowing receivers to follow along as patients are presented. The concurrent use of written and verbal handoffs may improve retention of clinical information as compared with either alone.[10, 11]

The Joint Commission requires an institutional approach to patient handoffs.[12] The requirements state that handoff communication solutions should take a standardized form, but they do not provide details regarding what data elements should be included in printed or verbal handoffs. The Accreditation Council for Graduate Medical Education Common Program Requirements likewise require that residents become competent in patient handoffs[13] but do not provide specific details or measurement tools. Absent widely accepted guidelines, decisions regarding which elements to include in printed handoff documents are currently made at an individual or institutional level.

The I‐PASS study is a federally funded multi‐institutional project that demonstrated a decrease in medical errors and preventable adverse events after implementation of a standardized resident handoff bundle.[14, 15] The I‐PASS Study Group developed a bundle of handoff interventions, beginning with a handoff and teamwork training program (based in part on TeamSTEPPS [Team Strategies and Tools to Enhance Performance and Patient Safety]),[16] a novel verbal mnemonic, I‐PASS (Illness Severity, Patient Summary, Action List, Situation Awareness and Contingency Planning, and Synthesis by Receiver),[17] and changes to the verbal handoff process, in addition to several other elements.

We hypothesized that developing a standardized printed handoff template would reinforce the handoff training and enhance the value of the verbal handoff process changes. Given the paucity of data on best printed handoff practices, however, we first conducted a needs assessment to identify which data elements were currently contained in printed handoffs across sites, and to allow an expert panel to make recommendations for best practices.

METHODS

I‐PASS Study sites included 9 pediatric residency programs at academic medical centers from across North America. Programs were identified through professional networks and invited to participate. The nonintensive care unit hospitalist services at these medical centers are primarily staffed by residents and medical students with attending supervision. At 1 site, nurse practitioners also participate in care. Additional details about study sites can be found in the study descriptions previously published.[14, 15] All sites received local institutional review board approval.

We began by inviting members of the I‐PASS Education Executive Committee (EEC)[14] to build a collective, comprehensive list of possible data elements for printed handoff documents. This committee included pediatric residency program directors, pediatric hospitalists, education researchers, health services researchers, and patient safety experts. We obtained sample handoff documents from pediatric hospitalist services at each of 9 institutions in the United States and Canada (with protected health information redacted). We reviewed these sample handoff documents to characterize their format and to determine what discrete data elements appeared in each site's printed handoff document. Presence or absence of each data element across sites was tabulated. We also queried sites to determine the feasibility of including elements that were not presently included.

Subsequently, I‐PASS site investigators led structured group interviews at participating sites to gather additional information about handoff practices at each site. These structured group interviews included diverse representation from residents, faculty, and residency program leadership, as well as hospitalists and medical students, to ensure the comprehensive acquisition of information regarding site‐specific characteristics. Each group provided answers to a standardized set of open‐ended questions that addressed current practices, handoff education, simulation use, team structure, and the nature of current written handoff tools, if applicable, at each site. One member of the structured group interview served as a scribe and created a document that summarized the content of the structured group interview meeting and answers to the standardized questions.

Consensus on Content

The initial data collection also included a multivote process[18] of the full I‐PASS EEC to help prioritize data elements. Committee members brainstormed a list of all possible data elements for a printed handoff document. Each member (n=14) was given 10 votes to distribute among the elements. Committee members could assign more than 1 vote to an element to emphasize its importance.
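As a concrete illustration of the multivote tally described above, the short sketch below aggregates hypothetical ballots in which each member distributes 10 votes, possibly stacking several on one element. The members, candidate elements, and vote counts are invented for illustration and are not the committee's actual votes.

```python
# Hypothetical multivote tally; the ballots below are invented for illustration.
from collections import Counter

ballots = {
    "member_01": {"illness severity": 4, "action items": 3, "patient summary": 3},
    "member_02": {"patient summary": 5, "contingency planning": 3, "medications": 2},
    "member_03": {"action items": 4, "medications": 4, "code status": 2},
}

tally = Counter()
for member, votes in ballots.items():
    # Each committee member distributes exactly 10 votes among the elements.
    assert sum(votes.values()) == 10, f"{member} must cast exactly 10 votes"
    tally.update(votes)

# Elements ranked by total votes, highest priority first.
for element, total in tally.most_common():
    print(f"{element}: {total}")
```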

The results of this process as well as the current data elements included in each printed handoff tool were reviewed by a subgroup of the I‐PASS EEC. These expert panel members participated in a series of conference calls during which they tabulated categorical information, reviewed narrative comments, discussed existing evidence, and conducted simple content analysis to identify areas of concordance or discordance. Areas of discordance were discussed by the committee. Disagreements were resolved with group consensus with attention to published evidence or best practices, if available.

Elements were divided into those that were essential (unanimous consensus, no conflicting literature) and those that were recommended (majority supported inclusion of the element, no conflicting literature). Ratings were assigned using the American College of Cardiology/American Heart Association framework for practice guidelines,[19] in which each element is assigned a classification (I=effective, II=conflicting evidence/opinion, III=not effective) and a level of evidence to support that classification (A=multiple large randomized controlled trials, B=single randomized trial or nonrandomized studies, C=expert consensus).

The expert panel reached consensus, through active discussion, on a list of data elements that should be included in an ideal printed handoff document. Elements were chosen based on perceived importance, with attention to published best practices[1, 16] and the multivoting results. In making recommendations, consideration was given to whether data elements could be electronically imported into the printed handoff document from the EHR, or whether they would be entered manually. The potential for serious medical errors due to possible errors in manual entry of data was an important aspect of recommendations made. The list of candidate elements was then reviewed by a larger group of investigators from the I‐PASS Education Executive Committee and Coordinating Council for additional input.

The panel asked site investigators from each participating hospital to gather data on the feasibility of redesigning the printed handoff at that hospital to include each recommended element. Site investigators reported whether each element was already included, possible to include but not included currently, or not currently possible to include within that site's printed handoff tool. Site investigators also reported how data elements were populated in their handoff documents, with options including: (1) autopopulated from administrative data (eg, pharmacy‐entered medication list, demographic data entered by admitting office), (2) autoimported from physicians' free‐text entries elsewhere in the EHR (eg, progress notes), (3) free text entered specifically for the printed handoff, or (4) not applicable (element cannot be included).
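The sketch below shows one way the site reports described above could be tabulated into counts like those in Table 1 (number of sites including each element, plus counts by data-source category). The site names, elements, and entries are hypothetical; only the source categories, collapsed as in Table 1, come from the study's questionnaire.

```python
# Hypothetical tabulation of per-site handoff-element reports into Table 1-style counts.
from collections import Counter, defaultdict

# Source categories as collapsed in Table 1 (administrative data and free-text
# EHR imports are grouped together as "autoimported").
SOURCES = ("autoimported", "manually entered", "not applicable")

# Each (invented) site reports, for every element, whether it currently appears
# on the printed handoff and how it is, or would be, populated.
site_reports = {
    "site_A": {"name": ("included", "autoimported"), "illness severity": ("not included", "manually entered")},
    "site_B": {"name": ("included", "autoimported"), "illness severity": ("included", "manually entered")},
    "site_C": {"name": ("included", "manually entered"), "illness severity": ("not included", "not applicable")},
}

included = Counter()
source_counts = defaultdict(Counter)
for site, report in site_reports.items():
    for element, (status, source) in report.items():
        if status == "included":
            included[element] += 1
        source_counts[element][source] += 1

for element, sources in source_counts.items():
    breakdown = ", ".join(f"{s}: {sources[s]}" for s in SOURCES)
    print(f"{element} - included at {included[element]} site(s); {breakdown}")
```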

RESULTS

Nine programs (100%) provided data on the structure and contents of their printed handoff documents. We found wide variation in structure across the 9 sites. Three sites used a word‐processor–based document that required manual entry of all data elements. The other 6 institutions had a direct link with the EHR to enable autopopulation of between 10 and 20 elements on the printed handoff document.

The content of written handoff documents, as well as the sources of data included in them (current or anticipated), likewise varied substantially across sites (Table 1). Only 4 data elements (name, age, weight, and a list of medications) were universally included at all 9 sites. Among the 6 institutions that linked the printed handoff to the EHR, there was also substantial variation in which elements were autoimported. Only 7 elements were universally autoimported at these 6 sites: patient name, medical record number, room number, weight, date of birth, age, and date of admission. Two elements from the original brainstorming (emergency contact and primary language) were not included in any site's documents at the time of the assessment.

Results of Initial Needs Assessment, With Current and Potential Future Inclusion of Data Elements in Printed Handoff Documents at Nine Study Sites
Data Element | Sites With Data Element Included at Initial Needs Assessment (Out of Nine Sites) | Autoimported* | Manually Entered† | Not Applicable‡
  • NOTE: The last three columns give the data source (current or anticipated) at the nine sites. *Includes administrative data and free text entered into other electronic health record fields. †Manually entered directly into printed handoff document. ‡Data field could not be included due to institutional limitations.

Name | 9 | 6 | 3 | 0
Medical record number | 8 | 6 | 3 | 0
Room number | 8 | 6 | 3 | 0
Allergies | 6 | 4 | 5 | 0
Weight | 9 | 6 | 3 | 0
Age | 9 | 6 | 3 | 0
Date of birth | 6 | 6 | 3 | 0
Admission date | 8 | 6 | 3 | 0
Attending name | 5 | 4 | 5 | 0
Team/service | 7 | 4 | 5 | 0
Illness severity | 1 | 0 | 9 | 0
Patient summary | 8 | 0 | 9 | 0
Action items | 8 | 0 | 9 | 0
Situation monitoring/contingency plan | 5 | 0 | 9 | 0
Medication name | 9 | 4 | 5 | 0
Medication name and dose/route/frequency | 4 | 4 | 5 | 0
Code status | 2 | 2 | 7 | 0
Labs | 6 | 5 | 4 | 0
Access | 2 | 2 | 7 | 0
Ins/outs | 2 | 4 | 4 | 1
Primary language | 0 | 3 | 6 | 0
Vital signs | 3 | 4 | 4 | 1
Emergency contact | 0 | 2 | 7 | 0
Primary care provider | 4 | 4 | 5 | 0
Nine institutions (100%) conducted structured group interviews, ranging in size from 4 to 27 individuals (median, 5 participants). The summary documents from each site's interview were provided to the authors, who tabulated categorical information, reviewed narrative comments to understand current institutional practices, and conducted simple content analysis to identify areas of concordance or discordance, particularly with respect to data elements and EHR usage. Based on the results of the printed handoff document review and the structured group interviews, with additional perspectives provided by the I‐PASS EEC, the expert panel reached consensus on a list of 23 elements that should be included in printed handoff documents: 15 essential data elements and 8 additional recommended elements (Table 2).

Rating of Essential and Recommended Data Elements for Printed Handoff Template*
  • NOTE: Abbreviations: I‐PASS, illness severity, patient summary, action list, situation awareness and contingency planning, and synthesis by receiver. *Utilizing American College of Cardiology Foundation and American Heart Association framework for practice guidelines: classification (I=effective, IIa=conflicting evidence/opinion but weight is in favor of usefulness/efficacy, IIb=usefulness/efficacy less well established by evidence/opinion, III=not effective) and level of evidence to support classification (A=multiple large randomized controlled trials, B=single randomized trial or nonrandomized studies, C=expert consensus). For illness severity, the I‐PASS categorization of stable/watcher/unstable is preferred, but other categorizations are acceptable. Labs refers to common or patient‐specific labs.

Essential Elements
Patient identifiers
Patient name (class I, level of evidence C)
Medical record number (class I, level of evidence C)
Date of birth (class I, level of evidence C)
Hospital service identifiers
Attending name (class I, level of evidence C)
Team/service (class I, level of evidence C)
Room number (class I, level of evidence C)
Admission date (class I, level of evidence C)
Age (class I, level of evidence C)
Weight (class I, level of evidence C)
Illness severity (class I, level of evidence B)[20, 21]
Patient summary (class I, level of evidence B)[21, 22]
Action items (class I, level of evidence B)[21, 22]
Situation awareness/contingency planning (class I, level of evidence B)[21, 22]
Allergies (class I, level of evidence C)
Medications
Autopopulation of medications (class I, level of evidence B)[22, 23, 24]
Free‐text entry of medications (class IIa, level of evidence C)
Recommended Elements
Primary language (class IIa, level of evidence C)
Emergency contact (class IIa, level of evidence C)
Primary care provider (class IIa, level of evidence C)
Code status (class IIb, level of evidence C)
Labs (class IIa, level of evidence C)
Access (class IIa, level of evidence C)
Ins/outs (class IIa, level of evidence C)
Vital signs (class IIa, level of evidence C)

Evidence ratings[19] of these elements are included. Several elements are classified as I‐B (effective, nonrandomized studies) based on either studies of individual elements or more than 1 study of bundled elements from which findings could reasonably be extrapolated. These include illness severity,[20, 21] patient summary,[21, 22] action items[21, 22] (to‐do lists), situation awareness and contingency plan,[21, 22] and medications,[22, 23, 24] with attention to importing from the EHR. Medications entered as free text were classified as IIa‐C because of the risk and potential significance of errors; in particular, there was concern that transcription errors, errors of omission, or errors of commission could potentially lead to patient harm. The remaining essential elements are classified as I‐C (effective, expert consensus). Of note, date of birth was specifically included as a patient identifier, distinct from age, which was felt to be useful as a descriptor (often within a one‐liner or as part of the patient summary).

The 8 recommended elements were those for which there was not unanimous agreement on inclusion, but the majority of the panel felt they should be included. These elements were classified as IIa‐C, with 1 exception: code status, which generated significant controversy. After extensive discussion and consideration of safety, supervision, educational, and pediatric‐specific considerations, all members of the group agreed to categorize it as a recommended element; it is classified as IIb‐C.

All members of the group agreed that data elements should be directly imported from the EHR whenever possible. Finally, members agreed that the elements that make up the I‐PASS mnemonic (illness severity, patient summary, action items, situation awareness/contingency planning) should be listed in that order whenever possible. A sample I‐PASS‐compliant printed handoff document is shown in Figure 1.
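To make the recommended layout concrete, the sketch below assembles one patient's printed handoff entry with the essential elements from Table 2, listing the I‐PASS elements in mnemonic order and ending with a printed "Synthesis by receiver" reminder. The field names and the sample patient are hypothetical; this is a minimal illustration under those assumptions, not the template used in the I‐PASS study or any particular EHR.

```python
# Hypothetical printed-handoff entry built from the essential elements in Table 2.
from dataclasses import dataclass, field

@dataclass
class HandoffEntry:
    name: str
    mrn: str
    dob: str
    age: str
    weight_kg: float
    attending: str
    team: str
    room: str
    admit_date: str
    allergies: str
    illness_severity: str                      # stable / watcher / unstable
    patient_summary: str
    action_items: list = field(default_factory=list)
    contingency_plan: str = ""

    def render(self) -> str:
        # I-PASS elements appear in mnemonic order after the identifiers.
        lines = [
            f"{self.name}  MRN {self.mrn}  DOB {self.dob}  Age {self.age}  Wt {self.weight_kg} kg",
            f"Team {self.team} / Attending {self.attending}  Room {self.room}  Admitted {self.admit_date}",
            f"Allergies: {self.allergies}",
            f"Illness severity: {self.illness_severity}",
            f"Patient summary: {self.patient_summary}",
            "Action items: " + "; ".join(self.action_items),
            f"Situation awareness/contingency: {self.contingency_plan}",
            "Synthesis by receiver",           # text reminder only; never prepopulated
        ]
        return "\n".join(lines)

entry = HandoffEntry(
    name="Doe, Jane", mrn="0000000", dob="2012-01-01", age="3 y", weight_kg=14.2,
    attending="Dr. Example", team="Ward A", room="512", admit_date="2015-02-01",
    allergies="NKDA", illness_severity="watcher",
    patient_summary="3-year-old with asthma exacerbation, weaning albuterol.",
    action_items=["Wean albuterol to q3h if stable", "Follow up chest radiograph read"],
    contingency_plan="If increased work of breathing, restart continuous albuterol and call senior.",
)
print(entry.render())
```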

Figure 1. Sample screenshot of an I‐PASS–compliant handoff report. Abbreviations: I‐PASS, illness severity, patient summary, action list, situation awareness and contingency planning, and synthesis by receiver.

DISCUSSION

We identified substantial variability in the structure and content of printed handoff documents used by 9 pediatric hospitalist teaching services, reflecting a lack of standardization. We found that institutional printed handoff documents shared some demographic elements (eg, name, room, medical record number) but also varied in clinical content (eg, vital signs, lab tests, code status). Our expert panel developed a list of 15 essential and 8 recommended data elements for printed handoff documents. Although this is a large number of fields, the majority of the essential fields were already included by most sites, and many are basic demographic identifiers. Illness severity was the one essential field not routinely included; however, including this type of overview is consistently recommended,[2, 4] supported by evidence,[20, 21] and contributes to building a shared mental model.[16] We recommend the categories of stable/watcher/unstable.[17]

Several prior single‐center studies have found that introducing a printed handoff document can lead to improvements in workflow, communication, and patient safety. In an early study, Petersen et al.[25] showed an association between use of a computerized sign‐out program and reduced odds of preventable adverse events during periods of cross‐coverage. Wayne et al.[26] reported fewer perceived inaccuracies in handoff documents as well as improved clarity at the time of transfer, supporting the role for standardization. Van Eaton et al.[27] demonstrated rapid uptake and desirability of a computerized handoff document, which combined autoimportation of information from an EHR with resident‐entered patient details, reflecting the importance of both data sources. In addition, they demonstrated improvements in both the rounding and sign‐out processes.[28]

Two studies specifically reported increased use of particular fields after implementation. Payne et al. implemented a Web‐based handoff tool and documented significant increases in the number of handoffs containing problem lists, medication lists, and code status, accompanied by perceived improvements in quality of handoffs and fewer near‐miss events.[24] Starmer et al. found that introduction of a resident handoff bundle that included a printed handoff tool led to a reduction in medical errors and adverse events.[22] The study group using the tool populated 11 data elements more often after implementation, and introduction of this printed handoff tool in particular was associated with reductions in written handoff miscommunications. Neither of these studies included subanalysis to indicate which data elements may have been most important.

In contrast to previous single‐institution studies, our recommendations for a printed handoff template come from evaluations of tools and discussions with front‐line providers across 9 institutions. Our list overlaps substantially with the data elements recommended by Van Eaton et al.[27] However, several of our elements, including weight, ins/outs, primary language, emergency contact information, and primary care provider, do not appear in published templates. Other published handoff tools have been highly specialized (eg, for cardiac intensive care) or included many fewer data elements than our group felt were essential. These differences may reflect the unique aspects of caring for pediatric patients (eg, the need for weights) and the absence of defined protocols for many pediatric conditions. In addition, the level of detail needed for contingency planning may vary between teaching and nonteaching services.

Resident physicians may provide valuable information in the development of standardized handoff documents. Clark et al.,[29] at Virginia Mason Medical Center, utilized resident‐driven continuous quality improvement processes, including real‐time feedback, to implement an electronic template. They found that engagement of both senior leaders and front‐line users was an important component of their success in uptake. Our study utilized residents as essential members of structured group interviews to ensure that front‐line users' needs were represented as recommendations for a printed handoff tool template were developed.

As previously described,[17] our study group had identified several key data elements that should be included in verbal handoffs: illness severity, a patient summary, a discrete action list, situation awareness/contingency planning, and a synthesis by receiver. With consideration of the multivoting results as well as known best practices,[1, 4, 12] the expert panel for this study agreed that each of these elements should also be highlighted in the printed template to ensure consistency between the printed document and the verbal handoff, and to have each reinforce the other. On the printed handoff tool, the final S in the I‐PASS mnemonic (synthesis by receiver) cannot be prepopulated, but considering the importance of this step,[16, 30, 31, 32] it should be printed as "synthesis by receiver" to serve as a text reminder to both givers and receivers.

The panel also felt, however, that the printed handoff document should provide additional background information not routinely included in a verbal handoff. It should serve as a reference tool both at the time of verbal handoff and throughout the day and night, and therefore should include more comprehensive information than is necessary or appropriate to convey during the verbal handoff. We identified 10 data elements that are essential in a printed handoff document in addition to the I‐PASS elements (Table 2).

Patient demographic data elements, as well as team assignments and attending physician, were uniformly supported for inclusion. The medication list was viewed as essential; however, the panel also recognized the potential for medical errors due to inaccuracies in the medication list. In particular, there was concern that including all fields of a medication order (drug, dose, route, frequency) would result in handoffs containing a high proportion of inaccurate information, particularly for complex patients whose medication regimens may vary over the course of hospitalization. Therefore, the panel agreed that if medication lists were entered manually, only the medication name should be included, so as not to perpetuate inaccurate or potentially harmful information. If medication lists were autoimported from an EHR, they should include drug name, dose, route, and frequency if possible.
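A minimal sketch of the panel's medication rule described above: full dose, route, and frequency are shown only when the list is autoimported from the EHR, while manually entered lists carry the drug name alone to avoid propagating transcription errors. The medication fields and sample values are hypothetical examples.

```python
# Hypothetical illustration of the panel's medication-display recommendation.
def format_medication(med: dict, autoimported: bool) -> str:
    if autoimported:
        # Autoimported from the EHR: include drug name, dose, route, and frequency.
        return f"{med['name']} {med['dose']} {med['route']} {med['frequency']}"
    # Manually entered: medication name only, to avoid transcribed dosing errors.
    return med["name"]

albuterol = {"name": "albuterol", "dose": "2.5 mg", "route": "nebulized", "frequency": "q2h"}
print(format_medication(albuterol, autoimported=True))   # -> albuterol 2.5 mg nebulized q2h
print(format_medication(albuterol, autoimported=False))  # -> albuterol
```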

In the I‐PASS study,[15] all institutions implemented printed handoff documents that included fields for the essential data elements. After implementation, there was a significant increase in completion of all essential fields. Although there is limited evidence to support any individual data element, increased usage of these elements was associated with the overall study finding of decreased rates of medical errors and preventable adverse events.

EHRs have the potential to help standardize printed handoff documents[5, 6, 33, 34, 35]; all participants in our study agreed that printed handoff documents should ideally be linked with the EHR and should autoimport data wherever appropriate. Manually populated (eg, word processor‐ or spreadsheet‐based) handoff tools have important limitations, particularly related to the potential for typographical errors as well as accidental omission of data fields, and lead to unnecessary duplication of work (eg, re‐entering data already included in a progress note) that can waste providers' time. It was also acknowledged that word processor‐ or spreadsheet‐based documents may have flexibility that is lacking in EHR‐based handoff documents. For example, formatting can more easily be adjusted to increase the number of patients per printed page. As technology advances, printed documents may be phased out in favor of EHR‐based on‐screen reports, which by their nature would be more accurate due to real‐time autoupdates.

In making recommendations about essential versus recommended items for inclusion in the printed handoff template, the only data element that generated controversy among our experts was code status. Some felt that it should be included as an essential element, whereas others did not. We believe this debate was unique to our practice in pediatric hospital ward settings, as codes in most pediatric ward settings are rare. Among the concerns expressed about including code status for all patients was that residents might assume patients were full code without verifying; the potential inaccuracy created by this might have severe implications. Alternatively, residents might feel obligated to have code discussions with all patients regardless of severity of illness, which may be inappropriate in a pediatric population. Several educators expressed concerns about trainees having unsupervised code‐status conversations with families of pediatric patients. Conversely, although codes are rare in pediatric ward settings, concerns were raised that not including code status could be problematic during these rare but critically important events. Other fields, such as weight, might have less relevance for an adult population in which emergency drug doses are standardized.

Limitations

Our study has several limitations. We collected data only from hospitalist services at pediatric sites. It is likely that providers in other specialties would have specific data elements they felt were essential (eg, postoperative day, code status). Our methodology was expert consensus based, driven by data collection from sites that were already participating in the I‐PASS study. Although the I‐PASS study demonstrated decreased rates of medical errors and preventable adverse events with inclusion of these data elements as part of a bundle, future research will be required to evaluate whether some of these items are more important than others in improving written communication and, ultimately, patient safety. In spite of these limitations, our work represents an important starting point for the development of standards for written handoff documents, particularly those generated from EHRs.

CONCLUSIONS

In this article we describe the results of a needs assessment that informed expert consensus‐based recommendations for data elements to include in a printed handoff document. We recommend that pediatric programs include the elements identified as part of a standardized written handoff tool. Although many of these elements are also applicable to other specialties, future work should be conducted to adapt the printed handoff document elements described here for use in other specialties and settings. Future studies should validate the importance of these elements by examining how their inclusion affects the quality of written handoffs and, ultimately, patient safety.

Acknowledgements

Members of the I‐PASS Study Education Executive Committee who contributed to this manuscript include: Boston Children's Hospital/Harvard Medical School (primary site) (Christopher P. Landrigan, MD, MPH, Elizabeth L. Noble, BA. Theodore C. Sectish, MD. Lisa L. Tse, BA). Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine (Jennifer K. O'Toole, MD, MEd). Doernbecher Children's Hospital/Oregon Health and Science University (Amy J. Starmer, MD, MPH). Hospital for Sick Children/University of Toronto (Zia Bismilla, MD. Maitreya Coffey, MD). Lucile Packard Children's Hospital/Stanford University (Lauren A. Destino, MD. Jennifer L. Everhart, MD. Shilpa J. Patel, MD [currently at Kapi'olani Children's Hospital/University of Hawai'i School of Medicine]). National Capital Consortium (Jennifer H. Hepps, MD. Joseph O. Lopreiato, MD, MPH. Clifton E. Yu, MD). Primary Children's Medical Center/University of Utah (James F. Bale, Jr., MD. Adam T. Stevenson, MD). St. Louis Children's Hospital/Washington University (F. Sessions Cole, MD). St. Christopher's Hospital for Children/Drexel University College of Medicine (Sharon Calaman, MD. Nancy D. Spector, MD). Benioff Children's Hospital/University of California San Francisco School of Medicine (Glenn Rosenbluth, MD. Daniel C. West, MD).

Additional I‐PASS Study Group members who contributed to this manuscript include April D. Allen, MPA, MA (Heller School for Social Policy and Management, Brandeis University, previously affiliated with Boston Children's Hospital), Madelyn D. Kahana, MD (The Children's Hospital at Montefiore/Albert Einstein College of Medicine, previously affiliated with Lucile Packard Children's Hospital/Stanford University), Robert S. McGregor, MD (Akron Children's Hospital/Northeast Ohio Medical University, previously affiliated with St. Christopher's Hospital for Children/Drexel University), and John S. Webster, MD, MBA, MS (Webster Healthcare Consulting Inc., formerly of the Department of Defense).

Members of the I‐PASS Study Group include individuals from the institutions listed below as follows: Boston Children's Hospital/Harvard Medical School (primary site): April D. Allen, MPA, MA (currently at Heller School for Social Policy and Management, Brandeis University), Angela M. Feraco, MD, Christopher P. Landrigan, MD, MPH, Elizabeth L. Noble, BA, Theodore C. Sectish, MD, Lisa L. Tse, BA. Brigham and Women's Hospital (data coordinating center): Anuj K. Dalal, MD, Carol A. Keohane, BSN, RN, Stuart Lipsitz, PhD, Jeffrey M. Rothschild, MD, MPH, Matt F. Wien, BS, Catherine S. Yoon, MS, Katherine R. Zigmont, BSN, RN. Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine: Javier Gonzalez del Rey, MD, MEd, Jennifer K. O'Toole, MD, MEd, Lauren G. Solan, MD. Doernbecher Children's Hospital/Oregon Health and Science University: Megan E. Aylor, MD, Amy J. Starmer, MD, MPH, Windy Stevenson, MD, Tamara Wagner, MD. Hospital for Sick Children/University of Toronto: Zia Bismilla, MD, Maitreya Coffey, MD, Sanjay Mahant, MD, MSc. Lucile Packard Children's Hospital/Stanford University: Rebecca L. Blankenburg, MD, MPH, Lauren A. Destino, MD, Jennifer L. Everhart, MD, Madelyn Kahana, MD, Shilpa J. Patel, MD (currently at Kapi'olani Children's Hospital/University of Hawaii School of Medicine). National Capital Consortium: Jennifer H. Hepps, MD, Joseph O. Lopreiato, MD, MPH, Clifton E. Yu, MD. Primary Children's Hospital/University of Utah: James F. Bale, Jr., MD, Jaime Blank Spackman, MSHS, CCRP, Rajendu Srivastava, MD, FRCP(C), MPH, Adam Stevenson, MD. St. Louis Children's Hospital/Washington University: Kevin Barton, MD, Kathleen Berchelmann, MD, F. Sessions Cole, MD, Christine Hrach, MD, Kyle S. Schultz, MD, Michael P. Turmelle, MD, Andrew J. White, MD. St. Christopher's Hospital for Children/Drexel University: Sharon Calaman, MD, Bronwyn D. Carlson, MD, Robert S. McGregor, MD (currently at Akron Children's Hospital/Northeast Ohio Medical University), Vahideh Nilforoshan, MD, Nancy D. Spector, MD. and Benioff Children's Hospital/University of California San Francisco School of Medicine: Glenn Rosenbluth, MD, Daniel C. West, MD. Dorene Balmer, PhD, RD, Carol L. Carraccio, MD, MA, Laura Degnon, CAE, and David McDonald, and Alan Schwartz PhD serve the I‐PASS Study Group as part of the IIPE. Karen M. Wilson, MD, MPH serves the I‐PASS Study Group as part of the advisory board from the PRIS Executive Council. John Webster served the I‐PASS Study Group and Education Executive Committee as a representative from TeamSTEPPS.

Disclosures: The I‐PASS Study was primarily supported by the US Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation (1R18AE000029‐01). The opinions and conclusions expressed herein are solely those of the author(s) and should not be construed as representing the opinions or policy of any agency of the federal government. Developed with input from the Initiative for Innovation in Pediatric Education and the Pediatric Research in Inpatient Settings Network (supported by the Children's Hospital Association, the Academic Pediatric Association, the American Academy of Pediatrics, and the Society of Hospital Medicine). A.J.S. was supported by the Agency for Healthcare Research and Quality/Oregon Comparative Effectiveness Research K12 Program (1K12HS019456‐01). Additional funding for the I‐PASS Study was provided by the Medical Research Foundation of Oregon, Physician Services Incorporated Foundation (Ontario, Canada), and Pfizer (unrestricted medical education grant to N.D.S.). C.P.L. and A.J.S. were supported by the Oregon Comparative Effectiveness Research K12 Program (1K12HS019456 from the Agency for Healthcare Research and Quality). A.J.S. was also supported by the Medical Research Foundation of Oregon. The authors report no conflicts of interest.

References
  1. Patterson ES, Roth EM, Woods DD, Chow R, Gomes JO. Handoff strategies in settings with high consequences for failure: lessons for health care operations. Int J Qual Health Care. 2004;16(2):125-132.
  2. Vidyarthi AR, Arora V, Schnipper JL, Wall SD, Wachter RM. Managing discontinuity in academic medical centers: strategies for a safe and effective resident sign‐out. J Hosp Med. 2006;1(4):257-266.
  3. Horwitz LI, Moin T, Green ML. Development and implementation of an oral sign‐out skills curriculum. J Gen Intern Med. 2007;22(10):1470-1474.
  4. Arora VM, Manjarrez E, Dressler DD, Basaviah P, Halasyamani L, Kripalani S. Hospitalist handoffs: a systematic review and task force recommendations. J Hosp Med. 2009;4(7):433-440.
  5. Abraham J, Kannampallil T, Patel VL. A systematic review of the literature on the evaluation of handoff tools: implications for research and practice. J Am Med Inform Assoc. 2014;21(1):154-162.
  6. Li P, Ali S, Tang C, Ghali WA, Stelfox HT. Review of computerized physician handoff tools for improving the quality of patient care. J Hosp Med. 2013;8(8):456-463.
  7. McSweeney ME, Landrigan CP, Jiang H, Starmer A, Lightdale JR. Answering questions on call: pediatric resident physicians' use of handoffs and other resources. J Hosp Med. 2013;8(6):328-333.
  8. Fogerty RL, Schoenfeld A, Salim Al‐Damluji M, Horwitz LI. Effectiveness of written hospitalist sign‐outs in answering overnight inquiries. J Hosp Med. 2013;8(11):609-614.
  9. Schoenfeld AR, Salim Al‐Damluji M, Horwitz LI. Sign‐out snapshot: cross‐sectional evaluation of written sign‐outs among specialties. BMJ Qual Saf. 2014;23(1):66-72.
  10. Bhabra G, Mackeith S, Monteiro P, Pothier DD. An experimental comparison of handover methods. Ann R Coll Surg Engl. 2007;89(3):298-300.
  11. Pothier D, Monteiro P, Mooktiar M, Shaw A. Pilot study to show the loss of important data in nursing handover. Br J Nurs. 2005;14(20):1090-1093.
  12. The Joint Commission. Hospital Accreditation Standards 2015. Joint Commission Resources; 2015:PC.02.02.01.
  13. Accreditation Council for Graduate Medical Education. Common Program Requirements. 2013. Available at: http://acgme.org/acgmeweb/tabid/429/ProgramandInstitutionalAccreditation/CommonProgramRequirements.aspx. Accessed May 11, 2015.
  14. Sectish TC, Starmer AJ, Landrigan CP, Spector ND. Establishing a multisite education and research project requires leadership, expertise, collaboration, and an important aim. Pediatrics. 2010;126(4):619-622.
  15. Starmer AJ, Spector ND, Srivastava R, et al. Changes in medical errors after implementation of a handoff program. N Engl J Med. 2014;371(19):1803-1812.
  16. US Department of Health and Human Services. Agency for Healthcare Research and Quality. TeamSTEPPS website. Available at: http://teamstepps.ahrq.gov/. Accessed July 12, 2013.
  17. Starmer AJ, Spector ND, Srivastava R, Allen AD, Landrigan CP, Sectish TC. I‐PASS, a mnemonic to standardize verbal handoffs. Pediatrics. 2012;129(2):201-204.
  18. Scholtes P, Joiner B, Streibel B. The Team Handbook. 3rd ed. Middleton, WI: Oriel STAT A MATRIX; 2010.
  19. ACC/AHA Task Force on Practice Guidelines. Methodology Manual and Policies From the ACCF/AHA Task Force on Practice Guidelines. Available at: http://my.americanheart.org/idc/groups/ahamah‐public/@wcm/@sop/documents/downloadable/ucm_319826.pdf. Published June 2010. Accessed January 11, 2015.
  20. Naessens JM, Campbell CR, Shah N, et al. Effect of illness severity and comorbidity on patient safety and adverse events. Am J Med Qual. 2012;27(1):48-57.
  21. Horwitz LI, Moin T, Krumholz HM, Wang L, Bradley EH. Consequences of inadequate sign‐out for patient care. Arch Intern Med. 2008;168(16):1755-1760.
  22. Starmer AJ, Sectish TC, Simon DW, et al. Rates of medical errors and preventable adverse events among hospitalized children following implementation of a resident handoff bundle. JAMA. 2013;310(21):2262-2270.
  23. Arora V, Kao J, Lovinger D, Seiden SC, Meltzer D. Medication discrepancies in resident sign‐outs and their potential to harm. J Gen Intern Med. 2007;22(12):1751-1755.
  24. Payne CE, Stein JM, Leong T, Dressler DD. Avoiding handover fumbles: a controlled trial of a structured handover tool versus traditional handover methods. BMJ Qual Saf. 2012;21(11):925-932.
  25. Petersen LA, Orav EJ, Teich JM, O'Neil AC, Brennan TA. Using a computerized sign‐out program to improve continuity of inpatient care and prevent adverse events. Jt Comm J Qual Improv. 1998;24(2):77-87.
  26. Wayne JD, Tyagi R, Reinhardt G, et al. Simple standardized patient handoff system that increases accuracy and completeness. J Surg Educ. 2008;65(6):476-485.
  27. Van Eaton EG, Horvath KD, Lober WB, Pellegrini CA. Organizing the transfer of patient care information: the development of a computerized resident sign‐out system. Surgery. 2004;136(1):5-13.
  28. Van Eaton EG, Horvath KD, Lober WB, Rossini AJ, Pellegrini CA. A randomized, controlled trial evaluating the impact of a computerized rounding and sign‐out system on continuity of care and resident work hours. J Am Coll Surg. 2005;200(4):538-545.
  29. Clark CJ, Sindell SL, Koehler RP. Template for success: using a resident‐designed sign‐out template in the handover of patient care. J Surg Educ. 2011;68(1):52-57.
  30. Boyd M, Cumin D, Lombard B, Torrie J, Civil N, Weller J. Read‐back improves information transfer in simulated clinical crises. BMJ Qual Saf. 2014;23(12):989-993.
  31. Chang VY, Arora VM, Lev‐Ari S, D'Arcy M, Keysar B. Interns overestimate the effectiveness of their hand‐off communication. Pediatrics. 2010;125(3):491-496.
  32. Barenfanger J, Sautter RL, Lang DL, Collins SM, Hacek DM, Peterson LR. Improving patient safety by repeating (read‐back) telephone reports of critical information. Am J Clin Pathol. 2004;121(6):801-803.
  33. Collins SA, Stein DM, Vawdrey DK, Stetson PD, Bakken S. Content overlap in nurse and physician handoff artifacts and the potential role of electronic health records: a systematic review. J Biomed Inform. 2011;44(4):704-712.
  34. Laxmisan A, McCoy AB, Wright A, Sittig DF. Clinical summarization capabilities of commercially‐available and internally‐developed electronic health records. Appl Clin Inform. 2012;3(1):80-93.
  35. Hunt S, Staggers N. An analysis and recommendations for multidisciplinary computerized handoff applications in hospitals. AMIA Annu Symp Proc. 2011;2011:588-597.
Issue
Journal of Hospital Medicine - 10(8)
Page Number
517-524
Medication name9450
Medication name and dose/route/frequency4450
Code status2270
Labs6540
Access2270
Ins/outs2441
Primary language0360
Vital signs3441
Emergency contact0270
Primary care provider4450

Nine institutions (100%) conducted structured group interviews, ranging in size from 4 to 27 individuals with a median of 5 participants. The documents containing information from each site were provided to the authors. The authors then tabulated categorical information, reviewed narrative comments to understand current institutional practices, and conducted simple content analysis to identify areas of concordance or discordance, particularly with respect to data elements and EHR usage. Based on the results of the printed handoff document review and structured group interviews, with additional perspectives provided by the I‐PASS EEC, the expert panel came to consensus on a list of 23 elements that should be included in printed handoff documents, including 15 essential data elements and 8 additional recommended elements (Table 2).

Rating of Essential and Recommended Data Elements for Printed Handoff Template*
  • NOTE: Abbreviations: I‐PASS, illness severity, patient summary, action list, situation awareness and contingency planning, and synthesis by receiver. *Utilizing American College of Cardiology Foundation and American Heart Association framework for practice guidelines: classification (I=effective, IIa=conflicting evidence/opinion but weight is in favor of usefulness/efficacy, IIb=usefulness/efficacy less well established by evidence/opinion, III=not effective) and level of evidence to support classification (A=multiple large randomized controlled trials, B=single randomized trial or nonrandomized studies, C=expert consensus). Preferably using the I‐PASS categorization of stable/watcher/unstable, but other categorization okay. Refers to common or patient‐specific labs.

Essential Elements
Patient identifiers
Patient name (class I, level of evidence C)
Medical record number (class I, level of evidence C)
Date of birth (class I, level of evidence C)
Hospital service identifiers
Attending name (class I, level of evidence C)
Team/service (class I, level of evidence C)
Room number (class I, level of evidence C)
Admission date (class I, level of evidence C)
Age (class I, level of evidence C)
Weight (class I, level of evidence C)
Illness severity (class I, level of evidence B)[20, 21]
Patient summary (class I, level of evidence B)[21, 22]
Action items (class I, level of evidence B) [21, 22]
Situation awareness/contingency planning (class I, level of evidence B) [21, 22]
Allergies (class I, level of evidence C)
Medications
Autopopulation of medications (class I, level of evidence B)[22, 23, 24]
Free‐text entry of medications (class IIa, level of evidence C)
Recommended elements
Primary language (class IIa, level of evidence C)
Emergency contact (class IIa, level of evidence C)
Primary care provider (class IIa, level of evidence C)
Code status (class IIb, level of evidence C)
Labs (class IIa, level of evidence C)
Access (class IIa, level of evidence C)
Ins/outs (class IIa, level of evidence C)
Vital signs (class IIa, level of evidence C)

Evidence ratings[19] of these elements are included. Several elements are classified as I‐B (effective, nonrandomized studies) based on either studies of individual elements, or greater than 1 study of bundled elements that could reasonably be extrapolated. These include Illness severity,[20, 21] patient summary,[21, 22] action items[21, 22] (to do lists), situation awareness and contingency plan,[21, 22] and medications[22, 23, 24] with attention to importing from the EHR. Medications entered as free text were classified as IIa‐C because of risk and potential significance of errors; in particular there was concern that transcription errors, errors of omission, or errors of commission could potentially lead to patient harms. The remaining essential elements are classified as I‐C (effective, expert consensus). Of note, date of birth was specifically included as a patient identifier, distinct from age, which was felt to be useful as a descriptor (often within a one‐liner or as part of the patient summary).

The 8 recommended elements were elements for which there was not unanimous agreement on inclusion, but the majority of the panel felt they should be included. These elements were classified as IIa‐C, with 1 exception. Code status generated significant controversy among the group. After extensive discussion among the group and consideration of safety, supervision, educational, and pediatric‐specific considerations, all members of the group agreed on the categorization as a recommended element; it is classified as IIb‐C.

All members of the group agreed that data elements should be directly imported from the EHR whenever possible. Finally, members agreed that the elements that make up the I‐PASS mnemonic (illness severity, patient summary, action items, situation awareness/contingency planning) should be listed in that order whenever possible. A sample I‐PASS‐compliant printed handoff document is shown Figure 1.

jhm2380-fig-0001-m.png
Sample screenshot of an I‐PASS–compliant handoff report. Abbreviations: I‐PASS, illness severity, patient summary, action list, situation awareness and contingency planning, and synthesis by receiver.

DISCUSSION

We identified substantial variability in the structure and content of printed handoff documents used by 9 pediatric hospitalist teaching services, reflective of a lack of standardization. We found that institutional printed handoff documents shared some demographic elements (eg, name, room, medical record number) but also varied in clinical content (eg, vital signs, lab tests, code status). Our expert panel developed a list of 15 essential and 8 recommended data elements for printed handoff documents. Although this is a large number of fields, the majority of the essential fields were already included by most sites, and many are basic demographic identifiers. Illness severity is the 1 essential field that was not routinely included; however, including this type of overview is consistently recommended[2, 4] and supported by evidence,[20, 21] and contributes to building a shared mental model.[16] We recommend the categories of stable/watcher/unstable.[17]

Several prior single‐center studies have found that introducing a printed handoff document can lead to improvements in workflow, communication, and patient safety. In an early study, Petersen et al.[25] showed an association between use of a computerized sign‐out program and reduced odds of preventable adverse events during periods of cross‐coverage. Wayne et al.[26] reported fewer perceived inaccuracies in handoff documents as well as improved clarity at the time of transfer, supporting the role for standardization. Van Eaton et al.[27] demonstrated rapid uptake and desirability of a computerized handoff document, which combined autoimportation of information from an EHR with resident‐entered patient details, reflecting the importance of both data sources. In addition, they demonstrated improvements in both the rounding and sign‐out processes.[28]

Two studies specifically reported the increased use of specific fields after implementation. Payne et al. implemented a Web‐based handoff tool and documented significant increases in the number of handoffs containing problem lists, medication lists, and code status, accompanied by perceived improvements in quality of handoffs and fewer near‐miss events.[24] Starmer et al. found that introduction of a resident handoff bundle that included a printed handoff tool led to reduction in medical errors and adverse events.[22] The study group using the tool populated 11 data elements more often after implementation, and introduction of this printed handoff tool in particular was associated with reductions in written handoff miscommunications. Neither of these studies included subanalysis to indicate which data elements may have been most important.

In contrast to previous single‐institution studies, our recommendations for a printed handoff template come from evaluations of tools and discussions with front line providers across 9 institutions. We had substantial overlap with data elements recommended by Van Eaton et al.[27] However, there were several areas in which we did not have overlap with published templates including weight, ins/outs, primary language, emergency contact information, or primary care provider. Other published handoff tools have been highly specialized (eg, for cardiac intensive care) or included many fewer data elements than our group felt were essential. These differences may reflect the unique aspects of caring for pediatric patients (eg, need for weights) and the absence of defined protocols for many pediatric conditions. In addition, the level of detail needed for contingency planning may vary between teaching and nonteaching services.

Resident physicians may provide valuable information in the development of standardized handoff documents. Clark et al.,[29] at Virginia Mason University, utilized resident‐driven continuous quality improvement processes including real‐time feedback to implement an electronic template. They found that engagement of both senior leaders and front‐line users was an important component of their success in uptake. Our study utilized residents as essential members of structured group interviews to ensure that front‐line users' needs were represented as recommendations for a printed handoff tool template were developed.

As previously described,[17] our study group had identified several key data elements that should be included in verbal handoffs: illness severity, a patient summary, a discrete action list, situation awareness/contingency planning, and a synthesis by receiver. With consideration of the multivoting results as well as known best practices,[1, 4, 12] the expert panel for this study agreed that each of these elements should also be highlighted in the printed template to ensure consistency between the printed document and the verbal handoff, and to have each reinforce the other. On the printed handoff tool, the final S in the I‐PASS mnemonic (synthesis by receiver) cannot be prepopulated, but considering the importance of this step,[16, 30, 31, 32] it should be printed as synthesis by receiver to serve as a text‐reminder to both givers and receivers.

The panel also felt, however, that the printed handoff document should provide additional background information not routinely included in a verbal handoff. It should serve as a reference tool both at the time of verbal handoff and throughout the day and night, and therefore should include more comprehensive information than is necessary or appropriate to convey during the verbal handoff. We identified 10 data elements that are essential in a printed handoff document in addition to the I‐PASS elements (Table 2).

Patient demographic data elements, as well as team assignments and attending physician, were uniformly supported for inclusion. The medication list was viewed as essential; however, the panel also recognized the potential for medical errors due to inaccuracies in the medication list. In particular, there was concern that including all fields of a medication order (drug, dose, route, frequency) would result in handoffs containing a high proportion of inaccurate information, particularly for complex patients whose medication regimens may vary over the course of hospitalization. Therefore, the panel agreed that if medication lists were entered manually, then only the medication name should be included as they did not wish to perpetuate inaccurate or potentially harmful information. If medication lists were autoimported from an EHR, then they should include drug name, dose, route, and frequency if possible.

In the I‐PASS study,[15] all institutions implemented printed handoff documents that included fields for the essential data elements. After implementation, there was a significant increase in completion of all essential fields. Although there is limited evidence to support any individual data element, increased usage of these elements was associated with the overall study finding of decreased rates of medical errors and preventable adverse events.

EHRs have the potential to help standardize printed handoff documents[5, 6, 33, 34, 35]; all participants in our study agreed that printed handoff documents should ideally be linked with the EHR and should autoimport data wherever appropriate. Manually populated (eg, word processor‐ or spreadsheet‐based) handoff tools have important limitations, particularly related to the potential for typographical errors as well as accidental omission of data fields, and lead to unnecessary duplication of work (eg, re‐entering data already included in a progress note) that can waste providers' time. It was also acknowledged that word processor‐ or spreadsheet‐based documents may have flexibility that is lacking in EHR‐based handoff documents. For example, formatting can more easily be adjusted to increase the number of patients per printed page. As technology advances, printed documents may be phased out in favor of EHR‐based on‐screen reports, which by their nature would be more accurate due to real‐time autoupdates.

In making recommendations about essential versus recommended items for inclusion in the printed handoff template, the only data element that generated controversy among our experts was code status. Some felt that it should be included as an essential element, whereas others did not. We believe that this was unique to our practice in pediatric hospital ward settings, as codes in most pediatric ward settings are rare. Among the concerns expressed with including code status for all patients were that residents might assume patients were full‐code without verifying. The potential inaccuracy created by this might have severe implications. Alternatively, residents might feel obligated to have code discussions with all patients regardless of severity of illness, which may be inappropriate in a pediatric population. Several educators expressed concerns about trainees having unsupervised code‐status conversations with families of pediatric patients. Conversely, although codes are rare in pediatric ward settings, concerns were raised that not including code status could be problematic during these rare but critically important events. Other fields, such as weight, might have less relevance for an adult population in which emergency drug doses are standardized.

Limitations

Our study has several limitations. We only collected data from hospitalist services at pediatric sites. It is likely that providers in other specialties would have specific data elements they felt were essential (eg, postoperative day, code status). Our methodology was expert consensus based, driven by data collection from sites that were already participating in the I‐PASS study. Although the I‐PASS study demonstrated decreased rates of medical errors and preventable adverse events with inclusion of these data elements as part of a bundle, future research will be required to evaluate whether some of these items are more important than others in improving written communication and ultimately patient safety. In spite of these limitations, our work represents an important starting point for the development of standards for written handoff documents that should be used in patient handoffs, particularly those generated from EHRs.

CONCLUSIONS

In this article we describe the results of a needs assessment that informed expert consensus‐based recommendations for data elements to include in a printed handoff document. We recommend that pediatric programs include the elements identified as part of a standardized written handoff tool. Although many of these elements are also applicable to other specialties, future work should be conducted to adapt the printed handoff document elements described here for use in other specialties and settings. Future studies should work to validate the importance of these elements, studying the manner in which their inclusion affects the quality of written handoffs, and ultimately patient safety.

Acknowledgements

Members of the I‐PASS Study Education Executive Committee who contributed to this manuscript include: Boston Children's Hospital/Harvard Medical School (primary site) (Christopher P. Landrigan, MD, MPH, Elizabeth L. Noble, BA. Theodore C. Sectish, MD. Lisa L. Tse, BA). Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine (Jennifer K. O'Toole, MD, MEd). Doernbecher Children's Hospital/Oregon Health and Science University (Amy J. Starmer, MD, MPH). Hospital for Sick Children/University of Toronto (Zia Bismilla, MD. Maitreya Coffey, MD). Lucile Packard Children's Hospital/Stanford University (Lauren A. Destino, MD. Jennifer L. Everhart, MD. Shilpa J. Patel, MD [currently at Kapi'olani Children's Hospital/University of Hawai'i School of Medicine]). National Capital Consortium (Jennifer H. Hepps, MD. Joseph O. Lopreiato, MD, MPH. Clifton E. Yu, MD). Primary Children's Medical Center/University of Utah (James F. Bale, Jr., MD. Adam T. Stevenson, MD). St. Louis Children's Hospital/Washington University (F. Sessions Cole, MD). St. Christopher's Hospital for Children/Drexel University College of Medicine (Sharon Calaman, MD. Nancy D. Spector, MD). Benioff Children's Hospital/University of California San Francisco School of Medicine (Glenn Rosenbluth, MD. Daniel C. West, MD).

Additional I‐PASS Study Group members who contributed to this manuscript include April D. Allen, MPA, MA (Heller School for Social Policy and Management, Brandeis University, previously affiliated with Boston Children's Hospital), Madelyn D. Kahana, MD (The Children's Hospital at Montefiore/Albert Einstein College of Medicine, previously affiliated with Lucile Packard Children's Hospital/Stanford University), Robert S. McGregor, MD (Akron Children's Hospital/Northeast Ohio Medical University, previously affiliated with St. Christopher's Hospital for Children/Drexel University), and John S. Webster, MD, MBA, MS (Webster Healthcare Consulting Inc., formerly of the Department of Defense).

Members of the I‐PASS Study Group include individuals from the institutions listed below as follows: Boston Children's Hospital/Harvard Medical School (primary site): April D. Allen, MPA, MA (currently at Heller School for Social Policy and Management, Brandeis University), Angela M. Feraco, MD, Christopher P. Landrigan, MD, MPH, Elizabeth L. Noble, BA, Theodore C. Sectish, MD, Lisa L. Tse, BA. Brigham and Women's Hospital (data coordinating center): Anuj K. Dalal, MD, Carol A. Keohane, BSN, RN, Stuart Lipsitz, PhD, Jeffrey M. Rothschild, MD, MPH, Matt F. Wien, BS, Catherine S. Yoon, MS, Katherine R. Zigmont, BSN, RN. Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine: Javier Gonzalez del Rey, MD, MEd, Jennifer K. O'Toole, MD, MEd, Lauren G. Solan, MD. Doernbecher Children's Hospital/Oregon Health and Science University: Megan E. Aylor, MD, Amy J. Starmer, MD, MPH, Windy Stevenson, MD, Tamara Wagner, MD. Hospital for Sick Children/University of Toronto: Zia Bismilla, MD, Maitreya Coffey, MD, Sanjay Mahant, MD, MSc. Lucile Packard Children's Hospital/Stanford University: Rebecca L. Blankenburg, MD, MPH, Lauren A. Destino, MD, Jennifer L. Everhart, MD, Madelyn Kahana, MD, Shilpa J. Patel, MD (currently at Kapi'olani Children's Hospital/University of Hawaii School of Medicine). National Capital Consortium: Jennifer H. Hepps, MD, Joseph O. Lopreiato, MD, MPH, Clifton E. Yu, MD. Primary Children's Hospital/University of Utah: James F. Bale, Jr., MD, Jaime Blank Spackman, MSHS, CCRP, Rajendu Srivastava, MD, FRCP(C), MPH, Adam Stevenson, MD. St. Louis Children's Hospital/Washington University: Kevin Barton, MD, Kathleen Berchelmann, MD, F. Sessions Cole, MD, Christine Hrach, MD, Kyle S. Schultz, MD, Michael P. Turmelle, MD, Andrew J. White, MD. St. Christopher's Hospital for Children/Drexel University: Sharon Calaman, MD, Bronwyn D. Carlson, MD, Robert S. McGregor, MD (currently at Akron Children's Hospital/Northeast Ohio Medical University), Vahideh Nilforoshan, MD, Nancy D. Spector, MD. and Benioff Children's Hospital/University of California San Francisco School of Medicine: Glenn Rosenbluth, MD, Daniel C. West, MD. Dorene Balmer, PhD, RD, Carol L. Carraccio, MD, MA, Laura Degnon, CAE, and David McDonald, and Alan Schwartz PhD serve the I‐PASS Study Group as part of the IIPE. Karen M. Wilson, MD, MPH serves the I‐PASS Study Group as part of the advisory board from the PRIS Executive Council. John Webster served the I‐PASS Study Group and Education Executive Committee as a representative from TeamSTEPPS.

Disclosures: The I‐PASS Study was primarily supported by the US Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation (1R18AE000029‐01). The opinions and conclusions expressed herein are solely those of the author(s) and should not be constructed as representing the opinions or policy of any agency of the federal government. Developed with input from the Initiative for Innovation in Pediatric Education and the Pediatric Research in Inpatient Settings Network (supported by the Children's Hospital Association, the Academic Pediatric Association, the American Academy of Pediatrics, and the Society of Hospital Medicine). A. J. S. was supported by the Agency for Healthcare Research and Quality/Oregon Comparative Effectiveness Research K12 Program (1K12HS019456‐01). Additional funding for the I‐PASS Study was provided by the Medical Research Foundation of Oregon, Physician Services Incorporated Foundation (Ontario, Canada), and Pfizer (unrestricted medical education grant to N.D.S.). C.P.L, A.J.S. were supported by the Oregon Comparative Effectiveness Research K12 Program (1K12HS019456 from the Agency for Healthcare Research and Quality). A.J.S. was also supported by the Medical Research Foundation of Oregon. The authors report no conflicts of interest.

Handoffs among hospital providers are highly error prone and can result in serious morbidity and mortality. Best practices for verbal handoffs have been described[1, 2, 3, 4] and include conducting verbal handoffs face to face, providing opportunities for questions, having the receiver perform a readback, as well as specific content recommendations including action items. Far less research has focused on best practices for printed handoff documents,[5, 6] despite the routine use of written handoff tools as a reference by on‐call physicians.[7, 8] Erroneous or outdated information on the written handoff can mislead on‐call providers, potentially leading to serious medical errors.

In their most basic form, printed handoff documents list patients for whom a provider is responsible. Typically, they also contain demographic information, reason for hospital admission, and a task list for each patient. They may also contain more detailed information on patient history, hospital course, and/or care plan, and may vary among specialties.[9] They come in various forms, ranging from index cards with handwritten notes, to word‐processor or spreadsheet documents, to printed documents that are autopopulated from the electronic health record (EHR).[2] Importantly, printed handoff documents supplement the verbal handoff by allowing receivers to follow along as patients are presented. The concurrent use of written and verbal handoffs may improve retention of clinical information as compared with either alone.[10, 11]

The Joint Commission requires an institutional approach to patient handoffs.[12] The requirements state that handoff communication solutions should take a standardized form, but they do not provide details regarding what data elements should be included in printed or verbal handoffs. Accreditation Council for Graduate Medical Education Common Program Requirements likewise require that residents must become competent in patient handoffs[13] but do not provide specific details or measurement tools. Absent widely accepted guidelines, decisions regarding which elements to include in printed handoff documents are currently made at an individual or institutional level.

The I‐PASS study is a federally funded multi‐institutional project that demonstrated a decrease in medical errors and preventable adverse events after implementation of a standardized resident handoff bundle.[14, 15] The I‐PASS Study Group developed a bundle of handoff interventions that included a handoff and teamwork training program (based in part on TeamSTEPPS [Team Strategies and Tools to Enhance Performance and Patient Safety]),[16] a novel verbal mnemonic, I‐PASS (Illness Severity, Patient Summary, Action List, Situation Awareness and Contingency Planning, and Synthesis by Receiver),[17] and changes to the verbal handoff process, among several other elements.

We hypothesized that developing a standardized printed handoff template would reinforce the handoff training and enhance the value of the verbal handoff process changes. Given the paucity of data on best printed handoff practices, however, we first conducted a needs assessment to identify which data elements were currently contained in printed handoffs across sites, and to allow an expert panel to make recommendations for best practices.

METHODS

I‐PASS Study sites included 9 pediatric residency programs at academic medical centers from across North America. Programs were identified through professional networks and invited to participate. The nonintensive care unit hospitalist services at these medical centers are primarily staffed by residents and medical students with attending supervision. At 1 site, nurse practitioners also participate in care. Additional details about study sites can be found in the study descriptions previously published.[14, 15] All sites received local institutional review board approval.

We began by inviting members of the I‐PASS Education Executive Committee (EEC)[14] to build a collective, comprehensive list of possible data elements for printed handoff documents. This committee included pediatric residency program directors, pediatric hospitalists, education researchers, health services researchers, and patient safety experts. We obtained sample handoff documents from pediatric hospitalist services at each of 9 institutions in the United States and Canada (with protected health information redacted). We reviewed these sample handoff documents to characterize their format and to determine what discrete data elements appeared in each site's printed handoff document. Presence or absence of each data element across sites was tabulated. We also queried sites to determine the feasibility of including elements that were not presently included.

Subsequently, I‐PASS site investigators led structured group interviews at participating sites to gather additional information about handoff practices at each site. These structured group interviews included diverse representation from residents, faculty, and residency program leadership, as well as hospitalists and medical students, to ensure the comprehensive acquisition of information regarding site‐specific characteristics. Each group provided answers to a standardized set of open‐ended questions that addressed current practices, handoff education, simulation use, team structure, and the nature of current written handoff tools, if applicable, at each site. One member of the structured group interview served as a scribe and created a document that summarized the content of the structured group interview meeting and answers to the standardized questions.

Consensus on Content

The initial data collection also included a multivote process[18] of the full I‐PASS EEC to help prioritize data elements. Committee members brainstormed a list of all possible data elements for a printed handoff document. Each member (n=14) was given 10 votes to distribute among the elements. Committee members could assign more than 1 vote to an element to emphasize its importance.

The results of this process as well as the current data elements included in each printed handoff tool were reviewed by a subgroup of the I‐PASS EEC. These expert panel members participated in a series of conference calls during which they tabulated categorical information, reviewed narrative comments, discussed existing evidence, and conducted simple content analysis to identify areas of concordance or discordance. Areas of discordance were discussed by the committee. Disagreements were resolved with group consensus with attention to published evidence or best practices, if available.

Elements were divided into those that were essential (unanimous consensus, no conflicting literature) and those that were recommended (majority supported inclusion of the element, no conflicting literature). Ratings were assigned using the American College of Cardiology/American Heart Association framework for practice guidelines,[19] in which each element is assigned a classification (I=effective, II=conflicting evidence/opinion, III=not effective) and a level of evidence to support that classification (A=multiple large randomized controlled trials, B=single randomized trial or nonrandomized studies, C=expert consensus).
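
As an illustrative aside (not part of the study methods), this rating scheme can be represented as a small data model. The sketch below uses hypothetical names of our own and follows the expanded classification (I/IIa/IIb/III) that appears in Table 2; the example ratings mirror that table, but the code itself is not drawn from the study.

```python
# Illustrative sketch only: representing an element's ACC/AHA-style rating.
# Class and level definitions paraphrase the framework described in the text.
from dataclasses import dataclass
from enum import Enum


class EffectClass(Enum):
    I = "effective"
    IIA = "conflicting evidence/opinion, weight favors usefulness"
    IIB = "usefulness/efficacy less well established"
    III = "not effective"


class EvidenceLevel(Enum):
    A = "multiple large randomized controlled trials"
    B = "single randomized trial or nonrandomized studies"
    C = "expert consensus"


@dataclass
class ElementRating:
    element: str
    effect_class: EffectClass
    evidence: EvidenceLevel


# Example ratings mirroring Table 2
ratings = [
    ElementRating("Illness severity", EffectClass.I, EvidenceLevel.B),
    ElementRating("Code status", EffectClass.IIB, EvidenceLevel.C),
]
for r in ratings:
    print(f"{r.element}: class {r.effect_class.name}, level {r.evidence.name}")
```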

The expert panel reached consensus, through active discussion, on a list of data elements that should be included in an ideal printed handoff document. Elements were chosen based on perceived importance, with attention to published best practices[1, 16] and the multivoting results. In making recommendations, consideration was given to whether data elements could be electronically imported into the printed handoff document from the EHR or whether they would be entered manually; the potential for serious medical errors arising from manual data entry was an important consideration in the recommendations made. The list of candidate elements was then reviewed by a larger group of investigators from the I‐PASS Education Executive Committee and Coordinating Council for additional input.

The panel asked site investigators from each participating hospital to gather data on the feasibility of redesigning the printed handoff at that hospital to include each recommended element. Site investigators reported whether each element was already included, possible to include but not included currently, or not currently possible to include within that site's printed handoff tool. Site investigators also reported how data elements were populated in their handoff documents, with options including: (1) autopopulated from administrative data (eg, pharmacy‐entered medication list, demographic data entered by admitting office), (2) autoimported from physicians' free‐text entries elsewhere in the EHR (eg, progress notes), (3) free text entered specifically for the printed handoff, or (4) not applicable (element cannot be included).
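
For readers who build such feasibility surveys into software, the four population categories can be modeled directly. The following is a minimal, hypothetical sketch under our own naming assumptions; the example elements are illustrative and not taken from any site's actual tool.

```python
# Hypothetical sketch of the four ways a data element can be populated in a
# printed handoff, as categorized in the feasibility survey.
from dataclasses import dataclass
from enum import Enum


class DataSource(Enum):
    AUTO_ADMINISTRATIVE = "autopopulated from administrative data"
    AUTO_FREE_TEXT = "autoimported from free-text EHR entries"
    MANUAL = "free text entered specifically for the printed handoff"
    NOT_APPLICABLE = "element cannot be included"


@dataclass
class ElementFeasibility:
    element: str
    currently_included: bool
    source: DataSource


# Example report for one hypothetical site
site_report = [
    ElementFeasibility("Weight", True, DataSource.AUTO_ADMINISTRATIVE),
    ElementFeasibility("Patient summary", True, DataSource.MANUAL),
    ElementFeasibility("Emergency contact", False, DataSource.NOT_APPLICABLE),
]

# Count how many reported elements would rely on manual entry
manual = sum(r.source is DataSource.MANUAL for r in site_report)
print(f"Manually entered elements: {manual}")
```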

RESULTS

Nine programs (100%) provided data on the structure and contents of their printed handoff documents. We found wide variation in structure across the 9 sites. Three sites used a word processor‐based document that required manual entry of all data elements. The other 6 institutions had a direct link with the EHR to enable autopopulation of between 10 and 20 elements on the printed handoff document.

The content of written handoff documents, as well as the sources of data included in them (present or future), likewise varied substantially across sites (Table 1). Only 4 data elements (name, age, weight, and a list of medications) were universally included at all 9 sites. Among the 6 institutions that linked the printed handoff to the EHR, there was also substantial variation in which elements were autoimported. Only 7 elements were universally autoimported at these 6 sites: patient name, medical record number, room number, weight, date of birth, age, and date of admission. Two elements from the original brainstorming were not presently included in any sites' documents (emergency contact and primary language).

Table 1. Results of Initial Needs Assessment, With Current and Potential Future Inclusion of Data Elements in Printed Handoff Documents at Nine Study Sites

| Data Element | Sites With Element Included at Initial Needs Assessment (of 9) | Autoimported* | Manually Entered† | Not Applicable‡ |
| --- | --- | --- | --- | --- |
| Name | 9 | 6 | 3 | 0 |
| Medical record number | 8 | 6 | 3 | 0 |
| Room number | 8 | 6 | 3 | 0 |
| Allergies | 6 | 4 | 5 | 0 |
| Weight | 9 | 6 | 3 | 0 |
| Age | 9 | 6 | 3 | 0 |
| Date of birth | 6 | 6 | 3 | 0 |
| Admission date | 8 | 6 | 3 | 0 |
| Attending name | 5 | 4 | 5 | 0 |
| Team/service | 7 | 4 | 5 | 0 |
| Illness severity | 1 | 0 | 9 | 0 |
| Patient summary | 8 | 0 | 9 | 0 |
| Action items | 8 | 0 | 9 | 0 |
| Situation monitoring/contingency plan | 5 | 0 | 9 | 0 |
| Medication name | 9 | 4 | 5 | 0 |
| Medication name and dose/route/frequency | 4 | 4 | 5 | 0 |
| Code status | 2 | 2 | 7 | 0 |
| Labs | 6 | 5 | 4 | 0 |
| Access | 2 | 2 | 7 | 0 |
| Ins/outs | 2 | 4 | 4 | 1 |
| Primary language | 0 | 3 | 6 | 0 |
| Vital signs | 3 | 4 | 4 | 1 |
| Emergency contact | 0 | 2 | 7 | 0 |
| Primary care provider | 4 | 4 | 5 | 0 |

NOTE: The three rightmost columns give the data source (current or anticipated). *Includes administrative data and free text entered into other electronic health record fields. †Manually entered directly into the printed handoff document. ‡Data field could not be included due to institutional limitations.

Nine institutions (100%) conducted structured group interviews, ranging in size from 4 to 27 individuals (median, 5 participants). The summary documents from each site were provided to the authors, who then tabulated categorical information, reviewed narrative comments to understand current institutional practices, and conducted simple content analysis to identify areas of concordance or discordance, particularly with respect to data elements and EHR usage. Based on the results of the printed handoff document review and structured group interviews, with additional perspectives provided by the I‐PASS EEC, the expert panel came to consensus on a list of 23 elements that should be included in printed handoff documents: 15 essential data elements and 8 additional recommended elements (Table 2).

Table 2. Rating of Essential and Recommended Data Elements for Printed Handoff Template*
  • NOTE: Abbreviations: I‐PASS, illness severity, patient summary, action list, situation awareness and contingency planning, and synthesis by receiver. *Utilizing the American College of Cardiology Foundation and American Heart Association framework for practice guidelines: classification (I=effective, IIa=conflicting evidence/opinion but weight is in favor of usefulness/efficacy, IIb=usefulness/efficacy less well established by evidence/opinion, III=not effective) and level of evidence to support classification (A=multiple large randomized controlled trials, B=single randomized trial or nonrandomized studies, C=expert consensus). †Preferably using the I‐PASS categorization of stable/watcher/unstable, but other categorizations are acceptable. ‡Refers to common or patient‐specific labs.

Essential Elements
Patient identifiers
Patient name (class I, level of evidence C)
Medical record number (class I, level of evidence C)
Date of birth (class I, level of evidence C)
Hospital service identifiers
Attending name (class I, level of evidence C)
Team/service (class I, level of evidence C)
Room number (class I, level of evidence C)
Admission date (class I, level of evidence C)
Age (class I, level of evidence C)
Weight (class I, level of evidence C)
Illness severity† (class I, level of evidence B)[20, 21]
Patient summary (class I, level of evidence B)[21, 22]
Action items (class I, level of evidence B) [21, 22]
Situation awareness/contingency planning (class I, level of evidence B) [21, 22]
Allergies (class I, level of evidence C)
Medications
Autopopulation of medications (class I, level of evidence B)[22, 23, 24]
Free‐text entry of medications (class IIa, level of evidence C)
Recommended Elements
Primary language (class IIa, level of evidence C)
Emergency contact (class IIa, level of evidence C)
Primary care provider (class IIa, level of evidence C)
Code status (class IIb, level of evidence C)
Labs‡ (class IIa, level of evidence C)
Access (class IIa, level of evidence C)
Ins/outs (class IIa, level of evidence C)
Vital signs (class IIa, level of evidence C)

Evidence ratings[19] of these elements are included. Several elements are classified as I‐B (effective, nonrandomized studies) based on either studies of individual elements or more than 1 study of bundled elements that could reasonably be extrapolated. These include illness severity,[20, 21] patient summary,[21, 22] action items[21, 22] (to‐do lists), situation awareness and contingency planning,[21, 22] and medications[22, 23, 24] with attention to importing from the EHR. Medications entered as free text were classified as IIa‐C because of the risk and potential significance of errors; in particular, there was concern that transcription errors, errors of omission, or errors of commission could lead to patient harm. The remaining essential elements are classified as I‐C (effective, expert consensus). Of note, date of birth was specifically included as a patient identifier, distinct from age, which was felt to be useful as a descriptor (often within a one‐liner or as part of the patient summary).

The 8 recommended elements were those for which there was not unanimous agreement on inclusion, but the majority of the panel felt they should be included. These elements were classified as IIa‐C, with 1 exception. Code status generated significant controversy among the group. After extensive discussion weighing safety, supervision, educational, and pediatric‐specific considerations, all members of the group agreed on its categorization as a recommended element; it is classified as IIb‐C.

All members of the group agreed that data elements should be directly imported from the EHR whenever possible. Finally, members agreed that the elements that make up the I‐PASS mnemonic (illness severity, patient summary, action items, situation awareness/contingency planning) should be listed in that order whenever possible. A sample I‐PASS‐compliant printed handoff document is shown in Figure 1.

Figure 1. Sample screenshot of an I‐PASS–compliant handoff report. Abbreviations: I‐PASS, illness severity, patient summary, action list, situation awareness and contingency planning, and synthesis by receiver.
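
To make the ordering requirement concrete, the sketch below assembles one patient's printed entry with the I‐PASS elements in mnemonic order and a fixed, never-prefilled line for synthesis by receiver. It is a hypothetical illustration with invented field names and fictional data, not the study's template (which is shown in Figure 1).

```python
# Minimal sketch (hypothetical field names, fictional data) of rendering one
# patient's entry on a printed handoff, with the I-PASS elements kept in
# mnemonic order and a reminder line for the receiver's synthesis.
def render_patient_entry(patient: dict) -> str:
    lines = [
        f"{patient['name']}  MRN {patient['mrn']}  DOB {patient['dob']}",
        f"Attending: {patient['attending']}  Team: {patient['team']}  Room: {patient['room']}",
        f"Admitted: {patient['admit_date']}  Age: {patient['age']}  Weight: {patient['weight_kg']} kg",
        f"Allergies: {patient['allergies']}",
        # I-PASS elements, in mnemonic order
        f"Illness severity: {patient['illness_severity']}",  # stable / watcher / unstable
        f"Patient summary: {patient['summary']}",
        "Action items: " + "; ".join(patient["action_items"]),
        f"Situation awareness/contingency: {patient['contingency']}",
        "Synthesis by receiver: ____________________",  # reminder only; never prefilled
    ]
    return "\n".join(lines)


# Fictional example for demonstration only
example = {
    "name": "Test Patient", "mrn": "000000", "dob": "2015-01-01",
    "attending": "Dr. Example", "team": "Blue", "room": "401",
    "admit_date": "2025-01-02", "age": "10 y", "weight_kg": 32,
    "allergies": "NKDA", "illness_severity": "watcher",
    "summary": "Asthma exacerbation, improving on q4h albuterol.",
    "action_items": ["Wean albuterol overnight if stable", "Recheck oxygen requirement"],
    "contingency": "If increased work of breathing, escalate to q2h albuterol and call senior.",
}
print(render_patient_entry(example))
```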

DISCUSSION

We identified substantial variability in the structure and content of printed handoff documents used by 9 pediatric hospitalist teaching services, reflective of a lack of standardization. We found that institutional printed handoff documents shared some demographic elements (eg, name, room, medical record number) but also varied in clinical content (eg, vital signs, lab tests, code status). Our expert panel developed a list of 15 essential and 8 recommended data elements for printed handoff documents. Although this is a large number of fields, the majority of the essential fields were already included by most sites, and many are basic demographic identifiers. Illness severity is the 1 essential field that was not routinely included; however, including this type of overview is consistently recommended[2, 4] and supported by evidence,[20, 21] and contributes to building a shared mental model.[16] We recommend the categories of stable/watcher/unstable.[17]

Several prior single‐center studies have found that introducing a printed handoff document can lead to improvements in workflow, communication, and patient safety. In an early study, Petersen et al.[25] showed an association between use of a computerized sign‐out program and reduced odds of preventable adverse events during periods of cross‐coverage. Wayne et al.[26] reported fewer perceived inaccuracies in handoff documents as well as improved clarity at the time of transfer, supporting the role for standardization. Van Eaton et al.[27] demonstrated rapid uptake and desirability of a computerized handoff document, which combined autoimportation of information from an EHR with resident‐entered patient details, reflecting the importance of both data sources. In addition, they demonstrated improvements in both the rounding and sign‐out processes.[28]

Two studies reported increased use of specific fields after implementation. Payne et al. implemented a Web‐based handoff tool and documented significant increases in the number of handoffs containing problem lists, medication lists, and code status, accompanied by perceived improvements in the quality of handoffs and fewer near‐miss events.[24] Starmer et al. found that introduction of a resident handoff bundle that included a printed handoff tool led to a reduction in medical errors and adverse events.[22] The study group using the tool populated 11 data elements more often after implementation, and introduction of this printed handoff tool in particular was associated with reductions in written handoff miscommunications. Neither study included a subanalysis to indicate which data elements may have been most important.

In contrast to previous single‐institution studies, our recommendations for a printed handoff template come from evaluations of tools and discussions with front‐line providers across 9 institutions. Our recommendations overlap substantially with the data elements recommended by Van Eaton et al.[27] However, several elements we recommend, including weight, ins/outs, primary language, emergency contact information, and primary care provider, do not appear in published templates. Other published handoff tools have been highly specialized (eg, for cardiac intensive care) or included many fewer data elements than our group felt were essential. These differences may reflect the unique aspects of caring for pediatric patients (eg, the need for weights) and the absence of defined protocols for many pediatric conditions. In addition, the level of detail needed for contingency planning may vary between teaching and nonteaching services.

Resident physicians may provide valuable information in the development of standardized handoff documents. Clark et al.,[29] at Virginia Mason University, utilized resident‐driven continuous quality improvement processes, including real‐time feedback, to implement an electronic template. They found that engagement of both senior leaders and front‐line users was an important component of their success in uptake. Our study included residents as essential members of the structured group interviews to ensure that front‐line users' needs were represented as the recommendations for a printed handoff template were developed.

As previously described,[17] our study group had identified several key data elements that should be included in verbal handoffs: illness severity, a patient summary, a discrete action list, situation awareness/contingency planning, and a synthesis by receiver. With consideration of the multivoting results as well as known best practices,[1, 4, 12] the expert panel for this study agreed that each of these elements should also be highlighted in the printed template to ensure consistency between the printed document and the verbal handoff, and to have each reinforce the other. On the printed handoff tool, the final S in the I‐PASS mnemonic (synthesis by receiver) cannot be prepopulated, but considering the importance of this step,[16, 30, 31, 32] it should be printed as "synthesis by receiver" to serve as a text reminder to both givers and receivers.

The panel also felt, however, that the printed handoff document should provide additional background information not routinely included in a verbal handoff. It should serve as a reference tool both at the time of verbal handoff and throughout the day and night, and therefore should include more comprehensive information than is necessary or appropriate to convey during the verbal handoff. We identified 10 data elements that are essential in a printed handoff document in addition to the I‐PASS elements (Table 2).

Patient demographic data elements, as well as team assignments and attending physician, were uniformly supported for inclusion. The medication list was viewed as essential; however, the panel also recognized the potential for medical errors due to inaccuracies in the medication list. In particular, there was concern that including all fields of a medication order (drug, dose, route, frequency) would result in handoffs containing a high proportion of inaccurate information, particularly for complex patients whose medication regimens may vary over the course of hospitalization. Therefore, the panel agreed that if medication lists were entered manually, only the medication name should be included, so as not to perpetuate inaccurate or potentially harmful information. If medication lists were autoimported from an EHR, they should include drug name, dose, route, and frequency if possible.
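
The panel's medication-display rule reduces to a simple conditional: show the full order only when it is autoimported from the EHR, otherwise show the drug name alone. A hypothetical sketch follows; the field names and example medication are ours, not from the study.

```python
# Hypothetical sketch of the panel's medication-display rule: include
# dose/route/frequency only when the list is autoimported from the EHR;
# for manually maintained lists, show the drug name alone to avoid
# propagating transcription errors.
from dataclasses import dataclass


@dataclass
class Medication:
    name: str
    dose: str = ""
    route: str = ""
    frequency: str = ""


def format_medication(med: Medication, autoimported: bool) -> str:
    if autoimported:
        return f"{med.name} {med.dose} {med.route} {med.frequency}".strip()
    return med.name  # manual entry: name only


amox = Medication("amoxicillin", "500 mg", "PO", "q8h")
print(format_medication(amox, autoimported=True))   # amoxicillin 500 mg PO q8h
print(format_medication(amox, autoimported=False))  # amoxicillin
```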

In the I‐PASS study,[15] all institutions implemented printed handoff documents that included fields for the essential data elements. After implementation, there was a significant increase in completion of all essential fields. Although there is limited evidence to support any individual data element, increased usage of these elements was associated with the overall study finding of decreased rates of medical errors and preventable adverse events.

EHRs have the potential to help standardize printed handoff documents[5, 6, 33, 34, 35]; all participants in our study agreed that printed handoff documents should ideally be linked with the EHR and should autoimport data wherever appropriate. Manually populated (eg, word processor‐ or spreadsheet‐based) handoff tools have important limitations, particularly related to the potential for typographical errors as well as accidental omission of data fields, and lead to unnecessary duplication of work (eg, re‐entering data already included in a progress note) that can waste providers' time. It was also acknowledged that word processor‐ or spreadsheet‐based documents may have flexibility that is lacking in EHR‐based handoff documents. For example, formatting can more easily be adjusted to increase the number of patients per printed page. As technology advances, printed documents may be phased out in favor of EHR‐based on‐screen reports, which by their nature would be more accurate due to real‐time autoupdates.

In making recommendations about essential versus recommended items for inclusion in the printed handoff template, the only data element that generated controversy among our experts was code status. Some felt that it should be included as an essential element, whereas others did not. We believe that this was unique to our practice in pediatric hospital ward settings, as codes in most pediatric ward settings are rare. Among the concerns expressed about including code status for all patients was that residents might assume patients were full code without verifying; the potential inaccuracy created by this might have severe implications. Alternatively, residents might feel obligated to have code discussions with all patients regardless of severity of illness, which may be inappropriate in a pediatric population. Several educators expressed concerns about trainees having unsupervised code‐status conversations with families of pediatric patients. Conversely, although codes are rare in pediatric ward settings, concerns were raised that not including code status could be problematic during these rare but critically important events. Other fields, such as weight, might have less relevance for an adult population in which emergency drug doses are standardized.

Limitations

Our study has several limitations. We collected data only from hospitalist services at pediatric sites. It is likely that providers in other specialties would have specific data elements they felt were essential (eg, postoperative day, code status). Our methodology was expert consensus based, driven by data collection from sites that were already participating in the I‐PASS study. Although the I‐PASS study demonstrated decreased rates of medical errors and preventable adverse events with inclusion of these data elements as part of a bundle, future research will be required to evaluate whether some of these items are more important than others in improving written communication and, ultimately, patient safety. In spite of these limitations, our work represents an important starting point for the development of standards for written handoff documents, particularly those generated from EHRs.

CONCLUSIONS

In this article we describe the results of a needs assessment that informed expert consensus‐based recommendations for data elements to include in a printed handoff document. We recommend that pediatric programs include the identified elements as part of a standardized written handoff tool. Although many of these elements are also applicable to other specialties, future work should adapt the printed handoff document elements described here for use in other specialties and settings. Future studies should validate the importance of these elements, examining how their inclusion affects the quality of written handoffs and, ultimately, patient safety.

Acknowledgements

Members of the I‐PASS Study Education Executive Committee who contributed to this manuscript include: Boston Children's Hospital/Harvard Medical School (primary site) (Christopher P. Landrigan, MD, MPH, Elizabeth L. Noble, BA. Theodore C. Sectish, MD. Lisa L. Tse, BA). Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine (Jennifer K. O'Toole, MD, MEd). Doernbecher Children's Hospital/Oregon Health and Science University (Amy J. Starmer, MD, MPH). Hospital for Sick Children/University of Toronto (Zia Bismilla, MD. Maitreya Coffey, MD). Lucile Packard Children's Hospital/Stanford University (Lauren A. Destino, MD. Jennifer L. Everhart, MD. Shilpa J. Patel, MD [currently at Kapi'olani Children's Hospital/University of Hawai'i School of Medicine]). National Capital Consortium (Jennifer H. Hepps, MD. Joseph O. Lopreiato, MD, MPH. Clifton E. Yu, MD). Primary Children's Medical Center/University of Utah (James F. Bale, Jr., MD. Adam T. Stevenson, MD). St. Louis Children's Hospital/Washington University (F. Sessions Cole, MD). St. Christopher's Hospital for Children/Drexel University College of Medicine (Sharon Calaman, MD. Nancy D. Spector, MD). Benioff Children's Hospital/University of California San Francisco School of Medicine (Glenn Rosenbluth, MD. Daniel C. West, MD).

Additional I‐PASS Study Group members who contributed to this manuscript include April D. Allen, MPA, MA (Heller School for Social Policy and Management, Brandeis University, previously affiliated with Boston Children's Hospital), Madelyn D. Kahana, MD (The Children's Hospital at Montefiore/Albert Einstein College of Medicine, previously affiliated with Lucile Packard Children's Hospital/Stanford University), Robert S. McGregor, MD (Akron Children's Hospital/Northeast Ohio Medical University, previously affiliated with St. Christopher's Hospital for Children/Drexel University), and John S. Webster, MD, MBA, MS (Webster Healthcare Consulting Inc., formerly of the Department of Defense).

Members of the I‐PASS Study Group include individuals from the institutions listed below as follows: Boston Children's Hospital/Harvard Medical School (primary site): April D. Allen, MPA, MA (currently at Heller School for Social Policy and Management, Brandeis University), Angela M. Feraco, MD, Christopher P. Landrigan, MD, MPH, Elizabeth L. Noble, BA, Theodore C. Sectish, MD, Lisa L. Tse, BA. Brigham and Women's Hospital (data coordinating center): Anuj K. Dalal, MD, Carol A. Keohane, BSN, RN, Stuart Lipsitz, PhD, Jeffrey M. Rothschild, MD, MPH, Matt F. Wien, BS, Catherine S. Yoon, MS, Katherine R. Zigmont, BSN, RN. Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine: Javier Gonzalez del Rey, MD, MEd, Jennifer K. O'Toole, MD, MEd, Lauren G. Solan, MD. Doernbecher Children's Hospital/Oregon Health and Science University: Megan E. Aylor, MD, Amy J. Starmer, MD, MPH, Windy Stevenson, MD, Tamara Wagner, MD. Hospital for Sick Children/University of Toronto: Zia Bismilla, MD, Maitreya Coffey, MD, Sanjay Mahant, MD, MSc. Lucile Packard Children's Hospital/Stanford University: Rebecca L. Blankenburg, MD, MPH, Lauren A. Destino, MD, Jennifer L. Everhart, MD, Madelyn Kahana, MD, Shilpa J. Patel, MD (currently at Kapi'olani Children's Hospital/University of Hawaii School of Medicine). National Capital Consortium: Jennifer H. Hepps, MD, Joseph O. Lopreiato, MD, MPH, Clifton E. Yu, MD. Primary Children's Hospital/University of Utah: James F. Bale, Jr., MD, Jaime Blank Spackman, MSHS, CCRP, Rajendu Srivastava, MD, FRCP(C), MPH, Adam Stevenson, MD. St. Louis Children's Hospital/Washington University: Kevin Barton, MD, Kathleen Berchelmann, MD, F. Sessions Cole, MD, Christine Hrach, MD, Kyle S. Schultz, MD, Michael P. Turmelle, MD, Andrew J. White, MD. St. Christopher's Hospital for Children/Drexel University: Sharon Calaman, MD, Bronwyn D. Carlson, MD, Robert S. McGregor, MD (currently at Akron Children's Hospital/Northeast Ohio Medical University), Vahideh Nilforoshan, MD, Nancy D. Spector, MD. and Benioff Children's Hospital/University of California San Francisco School of Medicine: Glenn Rosenbluth, MD, Daniel C. West, MD. Dorene Balmer, PhD, RD, Carol L. Carraccio, MD, MA, Laura Degnon, CAE, and David McDonald, and Alan Schwartz PhD serve the I‐PASS Study Group as part of the IIPE. Karen M. Wilson, MD, MPH serves the I‐PASS Study Group as part of the advisory board from the PRIS Executive Council. John Webster served the I‐PASS Study Group and Education Executive Committee as a representative from TeamSTEPPS.

Disclosures: The I‐PASS Study was primarily supported by the US Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation (1R18AE000029‐01). The opinions and conclusions expressed herein are solely those of the author(s) and should not be construed as representing the opinions or policy of any agency of the federal government. Developed with input from the Initiative for Innovation in Pediatric Education and the Pediatric Research in Inpatient Settings Network (supported by the Children's Hospital Association, the Academic Pediatric Association, the American Academy of Pediatrics, and the Society of Hospital Medicine). A.J.S. was supported by the Agency for Healthcare Research and Quality/Oregon Comparative Effectiveness Research K12 Program (1K12HS019456‐01). Additional funding for the I‐PASS Study was provided by the Medical Research Foundation of Oregon, Physician Services Incorporated Foundation (Ontario, Canada), and Pfizer (unrestricted medical education grant to N.D.S.). C.P.L. and A.J.S. were supported by the Oregon Comparative Effectiveness Research K12 Program (1K12HS019456 from the Agency for Healthcare Research and Quality). A.J.S. was also supported by the Medical Research Foundation of Oregon. The authors report no conflicts of interest.

Issue
Journal of Hospital Medicine - 10(8)
Page Number
517-524
Article Type
Display Headline
Variation in printed handoff documents: Results and recommendations from a multicenter needs assessment
Article Source
© 2015 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Glenn Rosenbluth, MD, Department of Pediatrics, 550 16th Street, 5th Floor, San Francisco, CA 94143‐0110; Telephone: 415‐476‐9185; Fax: 415‐476‐4009; E‐mail: rosenbluthg@peds.ucsf.edu