The effectiveness of a bundled intervention to improve resident progress notes in an electronic health record
Improving Notes in the EHR

Leigh Anne Bakel, MD
Section of Hospital Medicine, Children's Hospital Colorado, and Department of Pediatrics, University of Colorado School of Medicine

Several advantages of documenting in an electronic health record (EHR) have been described.[1, 2, 3, 4, 5] There has been, however, an unanticipated decline in certain aspects of documentation quality after EHR implementation,[6, 7, 8] for example, the overinclusion of data (note clutter) and inappropriate use of copy-paste.[6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17]

The objectives of this pilot study were to examine the effectiveness of an intervention bundle designed to improve resident progress notes written in an EHR (Epic Systems Corp., Verona, WI) and to establish the reliability of an audit tool used to assess the notes. Prior to this intervention, we provided no formal education for our residents about documentation in the EHR and had no policy governing format or content. The institutional review board at the University of Wisconsin approved this study.

METHODS

The Intervention Bundle

A multidisciplinary task force developed a set of Best Practice Guidelines for Writing Progress Notes in the EHR (see Supporting Information, Appendix 1, in the online version of this article). The guidelines were designed to promote cognitive review of data, reduce note clutter, promote synthesis of data, and discourage copy-paste. For example, rather than including multiple sets of data, the guidelines recommended either the phrase "Vital signs from the last 24 hours have been reviewed and are pertinent for" or a link that included minimum/maximum values. We next developed a note template aligned with these guidelines (see Supporting Information, Appendix 2, in the online version of this article) using features and links that already existed within the EHR. Interns received classroom teaching about the best practices and instruction in use of the template.

Study Design

The study was a retrospective pre-/postintervention comparison. An audit tool designed to assess compliance with the guidelines was used to score 25 progress notes written by pediatric interns in August 2010 and August 2011, during the pre- and postintervention periods, respectively (see Supporting Information, Appendix 3, in the online version of this article).

Progress notes were eligible based on the following criteria: (1) written on any day subsequent to the admission date, (2) written by a pediatric intern, and (3) progress note from the previous day available for comparison. It was not required that 2 consecutive notes be written by the same resident. Eligible notes were identified using a computer‐generated report, reviewed by a study member to ensure eligibility, and assigned a number.

Notes were scored on a scale of 0 to 17, with individual questions scored from 0 to 1 or from 0 to 2. The questions related to inappropriate copy-paste (questions 2, 9, 10) and the question related to discrete diagnostic language for abnormal labs (question 11) were weighted more heavily in the tool, as compliance with these components of the guideline was felt to be of greater importance. Several questions within the audit tool refer to clutter, which we defined as any additional data not endorsed by the guidelines or not explicitly stated as relevant to the patient's care for that day.
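The arithmetic of the scoring scheme can be sketched as follows. The exact item weights are specified in the paper's Appendix 3; the assumption here, inferred from the text, is that items 2, 9, 10, and 11 are scored 0-2 and the remaining nine items 0-1, which yields the stated maximum of 4 × 2 + 9 × 1 = 17.

```python
# Illustrative scoring of the 13-item audit tool (weights are an assumption
# inferred from the text; the authoritative rubric is in Appendix 3).
DOUBLE_WEIGHTED = {2, 9, 10, 11}  # items assumed to be scored 0-2

def total_score(responses: dict) -> int:
    """responses maps item number (1-13) to a raw score:
    0/1 for single-weight items, 0/1/2 for double-weight items."""
    for item, score in responses.items():
        max_score = 2 if item in DOUBLE_WEIGHTED else 1
        if not 0 <= score <= max_score:
            raise ValueError(f"item {item}: score {score} out of range 0-{max_score}")
    return sum(responses.values())

# A fully compliant note reaches the maximum of 17:
perfect = {i: (2 if i in DOUBLE_WEIGHTED else 1) for i in range(1, 14)}
print(total_score(perfect))  # 17
```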

Raters were trained to score notes through practice sessions, during which they all scored the same note and compared findings. To rectify inter-rater scoring discrepancies identified during these sessions, a reference manual was created to assist raters in scoring notes (see Supporting Information, Appendix 4, in the online version of this article). Each preintervention note was then systematically assigned to 2 of 4 raters (a physician and 3 staff from health information management). Each rater scored the note individually without discussion. Inter-rater reliability was excellent, with kappa indices ranging from 88% to 100% across the 13 questions; each note in the postintervention period was therefore assigned to only 1 rater. Total and individual question scores were sent to the statistician for analysis.
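As a hedged illustration of the per-question agreement statistic, Cohen's kappa for two raters can be computed from scratch; the ratings below are invented for illustration, not the study data.

```python
# Cohen's kappa: chance-corrected agreement between two raters on the same
# set of notes. Assumes the two raters' marginal distributions are independent.
from collections import Counter

def cohen_kappa(rater1, rater2):
    assert len(rater1) == len(rater2)
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement if each rater scored independently at their own rates
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in c1) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical example: 25 notes, the raters disagree on one
r1 = [1] * 20 + [0] * 5
r2 = [1] * 20 + [0] * 4 + [1]
print(round(cohen_kappa(r1, r2), 2))  # 0.86
```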

Statistical Analysis

Inter‐rater reliability of the audit tool was evaluated by calculating the intraclass correlation (ICC) coefficient using a multilevel random intercept model to account for the rater effect.[18] The study was powered to detect an anticipated ICC of at least 0.75 at the 1‐sided 0.05 significance level, assuming a null hypothesis that the ICC is 0.4 or less. The total score was summarized in terms of means and standard deviation. Individual item responses were summarized using percentages and compared between the pre‐ and postintervention assessment using the Fisher exact test. The analysis of response patterns for individual item scores was considered exploratory. The Benjamini‐Hochberg false discovery rate method was utilized to control the false‐positive rate when comparing individual item scores.[19] All P values were 2‐sided and considered statistically significant at <0.05. Statistical analyses were conducted using SAS software version 9.2 (SAS Institute Inc., Cary, NC).
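The per-item comparison described above can be sketched in code: a two-sided Fisher exact test on each item's 2x2 table (yes/no by pre/post), followed by a Benjamini-Hochberg adjustment across the 13 items. The counts below are reconstructed from Table 1 percentages with N = 25 per period; the BH helper is a from-scratch illustration, not the authors' SAS code.

```python
# Fisher exact test per item, then Benjamini-Hochberg FDR adjustment.
from scipy.stats import fisher_exact

def bh_adjust(pvals):
    """Benjamini-Hochberg adjusted p-values (step-up, enforcing monotonicity)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    prev = 1.0
    for rank, i in reversed(list(enumerate(order, start=1))):
        prev = min(prev, pvals[i] * m / rank)
        adjusted[i] = prev
    return adjusted

# Item 1 (note header): 0/25 "yes" pre, 17/25 "yes" post (68%)
table = [[0, 17], [25, 8]]  # rows: yes/no; columns: pre/post
_, p = fisher_exact(table, alternative="two-sided")
print(p < 0.0001)  # True, consistent with Table 1
```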

RESULTS

The ICC was 0.96 (95% confidence interval: 0.91‐0.98), indicating an excellent level of inter‐rater reliability. There was a significant improvement in the total score (see Supporting Information, Appendix 5, in the online version of this article) between the preintervention (mean 9.72, standard deviation [SD] 1.52) and postintervention (mean 11.72, SD 1.62) periods (P<0.0001).
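An ICC of this kind can be estimated as the between-note share of total variance in a random-intercept model. The sketch below is a simplified version (omitting a separate rater term) fit to synthetic two-rater data, not the study data; names and simulation parameters are invented for illustration.

```python
# ICC from a multilevel random-intercept model: each note gets a random
# intercept, and the residual absorbs within-note (rater) disagreement.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
note_effect = rng.normal(0, 2.0, 25)  # between-note variability dominates
rows = []
for note in range(25):
    for rater in range(2):  # two raters per preintervention note
        rows.append({"note": note,
                     "score": 10 + note_effect[note] + rng.normal(0, 0.5)})
df = pd.DataFrame(rows)

model = smf.mixedlm("score ~ 1", df, groups=df["note"]).fit()
var_note = float(model.cov_re.iloc[0, 0])  # between-note variance
var_resid = float(model.scale)             # residual (rater) variance
icc = var_note / (var_note + var_resid)
print(round(icc, 2))
```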

Table 1 shows the percentage of yes responses to each individual item in the pre‐ and postintervention periods. Our intervention had a significant impact on reducing vital sign clutter (4% preintervention, 84% postintervention, P<0.0001) and other visual clutter within the note (0% preintervention, 28% postintervention, P=0.0035). We did not observe a significant impact on the reduction of input/output or lab clutter. There was no significant difference observed in the inclusion of the medication list. No significant improvements were seen in questions related to copy‐paste. The intervention had no significant impact on areas with an already high baseline performance: newly written interval histories, newly written physical exams, newly written plans, and the inclusion of discrete diagnostic language for abnormal labs.

Comparison of Percentage of Yes Responses Between Pre‐ and Postintervention for Each Question
Question Preintervention, N=25* Postintervention, N=25 P Value
  • NOTE: *Percentages calculated from the first rater. Adjusted P value (for evaluating multiple items) using the Benjamini‐Hochberg false discovery rate method.

1. Does the note header include the name of the service, author, and training level of the author? 0% 68% <0.0001
2. Does it appear that the subjective/interval history section of the note was newly written? (ie, not copied in its entirety from the previous note) 100% 96% 0.9999
3. Is the vital sign section noncluttered? 4% 84% <0.0001
4. Is the entire medication list included in the note? 96% 96% 0.9999
5. Is the intake/output section noncluttered? 0% 16% 0.3076
6. Does it appear that the physical exam was newly written? (ie, not copied in its entirety from the previous note) 80% 68% 0.9103
7. Is the lab section noncluttered? 64% 44% 0.5125
8. Is the imaging section noncluttered? 100% 100% 0.9999
9. Does it appear that the assessment was newly written? 48% 28% 0.5121
Partially copied 48% 52% 0.9999
10. Does it appear that the plan was newly written or partially copied with new information added? 88% 96% 0.9477
11. If the assessment includes abnormal lab values, is there also an accompanying diagnosis? (eg, inclusion of patient has hemoglobin of 6.2, also includes diagnosis of anemia) 96% 96% 0.9999
12. Is additional visual clutter prevented by excluding other objective data found elsewhere in the chart? 0% 28% 0.0035
13. Is the author's name and contact information (pager, cell) included at the bottom of the note? 0% 72% <0.0001

DISCUSSION

Principal Findings

Improvements in electronic note writing, particularly in reducing note clutter, were achieved after the implementation of a bundled intervention. Because the intervention is a bundle, we cannot definitively identify which component had the greatest impact. Given the improvements seen in some areas with very low baseline performance, we hypothesize that these are most attributable to the creation of a compliant note template that (1) guided authors in using data links that were less cluttered and (2) eliminated the use of unnecessary links (eg, pain scores and daily weights). The lack of similar improvement in reducing input/output and lab clutter may reflect the fact that, even with template changes suggesting a more narrative approach to these components, residents still felt compelled to use data links. Because our EHR does not easily allow for the inclusion of individual data elements, such as the output of a specific drain or a hemoglobin value as opposed to a complete blood count, residents continued to use links that included more data than necessary. Although the differences were not statistically significant, we observed declines in the proportion of notes containing a physical exam not copied in its entirety from the previous day and in the proportion containing an entirely new assessment. These findings may be attributable to the small sample of authors, a few of whom in the postintervention period were particularly prone to using copy-paste.

Relationship to Other Evidence

The observed decline in quality of provider documentation after implementation of the EHR has led to a robust discussion in the literature about what really constitutes a quality provider note.[7, 8, 9, 10, 20] The absence of a defined gold standard makes research in this area challenging. It is our observation that when physicians refer to a decline in quality documentation in the EHR, they are frequently referring to the fact that electronically generated notes are often unattractive, difficult to read, and seem to lack clinical narrative.

Several publications have attempted to define note quality. Payne et al. described physical characteristics of electronically generated notes that were deemed more attractive to a reader, including a large proportion of narrative free text.[15] Hanson performed a qualitative study to describe outpatient clinical notes from the perspective of multiple stakeholders, resulting in a description of the characteristics of a quality note.[21] This formed the basis for the QNOTE, a validated tool to measure the quality of outpatient notes.[22] Similar work has not been done to rigorously define quality for inpatient documentation. Stetson et al. did develop an instrument, the Physician Documentation Quality Instrument (PDQI-9), to assess inpatient notes across 9 attributes; however, its validation relied on a gold standard of general impression scores from 7 physician leaders.[23, 24]

Although these tools aim to address overall note quality, an advantage provided by our audit tool is that it directly addresses the problems most attributable to documenting in an EHR, namely note clutter and copy‐paste. A second advantage is that clinicians and nonclinicians can score notes objectively. The QNOTE and PDQI‐9 still rely on subjective assessment and require that the evaluator be a clinician.

There has also been little published about how to achieve notes of high quality. In 2013, Shoolin et al. published a consensus statement from the Association of Medical Directors of Information Systems outlining guidelines for inpatient EHR documentation.[25] Optimal strategies for implementing such guidelines, however, and the overall impact such an implementation would have on note writing, have not previously been studied. This study therefore adds to the existing body of literature by providing an example of an intervention that may lead to improvements in note writing.

Limitations

Our study has several limitations. The sample size of notes and authors was small. The short duration of the study and the assessment of notes soon after the intervention prevented an assessment of whether improvements were sustained over time.

Unfortunately, we were not evaluating the same group of interns in the pre‐ and postintervention periods. Interns were chosen as subjects as there was an existing opportunity to do large group training during new intern orientation. Furthermore, we were concerned that more note‐writing experience alone would influence the outcome if we examined the same interns later in the year.

The audit tool was also a first attempt at measuring compliance with the guidelines. Determination of an optimal score/weight for each item requires further investigation as part of a larger scale validation study. In addition, the cognitive review and synthesis of data encouraged in our guideline were more difficult to measure using the audit tool, as they require some clinical knowledge about the patient and an assessment of the author's medical decision making. We do not assert, therefore, that compliance with the guidelines or a higher total score necessarily translates into overall note quality, as we recognize these limitations of the tool.

Future Directions

In conclusion, this report is a first effort to improve the quality of note writing in the EHR. Much more work is necessary, particularly in improving the clinical narrative and reducing inappropriate copy-paste. The examination of other interventions, such as structured feedback to the note author, whether by way of a validated scoring tool and/or narrative comments, is a logical next step for investigation.

ACKNOWLEDGEMENTS

The authors acknowledge and appreciate the support of Joel Buchanan, MD, Ellen Wald, MD, and Ann Boyer, MD, for their contributions to this study and manuscript preparation. We also acknowledge the members of the auditing team: Linda Brickert, Jane Duckert, and Jeannine Strunk.

Disclosure: Nothing to report.

References
  1. Tang PC, LaRosa MP, Gorden SM. Use of computer-based records, completeness of documentation, and appropriateness of documented clinical decisions. J Am Med Inform Assoc. 1999;6(3):245-251.
  2. Amarasingham R, Moore BJ, Tabak YP, et al. An automated model to identify heart failure patients at risk for 30-day readmission or death using electronic medical record data. Med Care. 2010;48(11):981-988.
  3. Amarasingham R, Plantinga L, Diener-West M, Gaskin DJ, Powe NR. Clinical information technologies and inpatient outcomes: a multiple hospital study. Arch Intern Med. 2009;169(2):108-114.
  4. Makam AN, Nguyen OK, Moore B, Ma Y, Amarasingham R. Identifying patients with diabetes and the earliest date of diagnosis in real time: an electronic health record case-finding algorithm. BMC Med Inform Decis Mak. 2013;13:81.
  5. Poon EG, Wright A, Simon SR, et al. Relationship between use of electronic health record features and health care quality: results of a statewide survey. Med Care. 2010;48(3):203-209.
  6. Embi PJ, Yackel TR, Logan JR, Bowen JL, Cooney TG, Gorman PN. Impacts of computerized physician documentation in a teaching hospital: perceptions of faculty and resident physicians. J Am Med Inform Assoc. 2004;11(4):300-309.
  7. Hartzband P, Groopman J. Off the record—avoiding the pitfalls of going electronic. N Engl J Med. 2008;358(16):1656-1658.
  8. Hirschtick RE. A piece of my mind. Copy-and-paste. JAMA. 2006;295(20):2335-2336.
  9. Siegler EL, Adelman R. Copy and paste: a remediable hazard of electronic health records. Am J Med. 2009;122(6):495-496.
  10. O'Donnell HC, Kaushal R, Barron Y, Callahan MA, Adelman RD, Siegler EL. Physicians' attitudes towards copy and pasting in electronic note writing. J Gen Intern Med. 2009;24(1):63-68.
  11. Cimino JJ. Improving the electronic health record—are clinicians getting what they wished for? JAMA. 2013;309(10):991-992.
  12. Thielke S, Hammond K, Helbig S. Copying and pasting of examinations within the electronic medical record. Int J Med Inform. 2007;76(suppl 1):S122-S128.
  13. Siegler EL. The evolving medical record. Ann Intern Med. 2010;153(10):671-677.
  14. Weir CR, Hurdle JF, Felgar MA, Hoffman JM, Roth B, Nebeker JR. Direct text entry in electronic progress notes. An evaluation of input errors. Methods Inf Med. 2003;42(1):61-67.
  15. Payne TH, Patel R, Beahan S, Zehner J. The physical attractiveness of electronic physician notes. AMIA Annu Symp Proc. 2010;2010:622-626.
  16. Yackel TR, Embi PJ. Copy-and-paste-and-paste. JAMA. 2006;296(19):2315; author reply 2315-2316.
  17. Hammond KW, Helbig ST, Benson CC, Brathwaite-Sketoe BM. Are electronic medical records trustworthy? Observations on copying, pasting and duplication. AMIA Annu Symp Proc. 2003:269-273.
  18. Raudenbush S, Bryk AS. Hierarchical Linear Models: Applications and Data Analysis Methods. 2nd ed. Thousand Oaks, CA: Sage; 2002.
  19. Benjamini Y, Hochberg Y. Controlling the false discovery rate: a practical and powerful approach for multiple testing. J R Stat Soc Series B Stat Methodol. 1995;57(1):289-300.
  20. Sheehy AM, Weissburg DJ, Dean SM. The role of copy-and-paste in the hospital electronic health record. JAMA Intern Med. 2014;174(8):1217-1218.
  21. Hanson JL, Stephens MB, Pangaro LN, Gimbel RW. Quality of outpatient clinical notes: a stakeholder definition derived through qualitative research. BMC Health Serv Res. 2012;12:407.
  22. Burke HB, Hoang A, Becher D, et al. QNOTE: an instrument for measuring the quality of EHR clinical notes. J Am Med Inform Assoc. 2014;21(5):910-916.
  23. Stetson PD, Bakken S, Wrenn JO, Siegler EL. Assessing electronic note quality using the physician documentation quality instrument (PDQI-9). Appl Clin Inform. 2012;3(2):164-174.
  24. Stetson PD, Morrison FP, Bakken S, Johnson SB. Preliminary development of the physician documentation quality instrument. J Am Med Inform Assoc. 2008;15(4):534-541.
  25. Shoolin J, Ozeran L, Hamann C, Bria W. Association of Medical Directors of Information Systems consensus on inpatient electronic health record documentation. Appl Clin Inform. 2013;4(2):293-303.
Issue
Journal of Hospital Medicine - 10(2)
Page Number
104-107

There are described advantages to documenting in an electronic health record (EHR).[1, 2, 3, 4, 5] There has been, however, an unanticipated decline in certain aspects of documentation quality after implementing EHRs,[6, 7, 8] for example, the overinclusion of data (note clutter) and inappropriate use of copy‐paste.[6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17]

The objectives of this pilot study were to examine the effectiveness of an intervention bundle designed to improve resident progress notes written in an EHR (Epic Systems Corp., Verona, WI) and to establish the reliability of an audit tool used to assess the notes. Prior to this intervention, we provided no formal education for our residents about documentation in the EHR and had no policy governing format or content. The institutional review board at the University of Wisconsin approved this study.

METHODS

The Intervention Bundle

A multidisciplinary task force developed a set of Best Practice Guidelines for Writing Progress Notes in the EHR (see Supporting Information, Appendix 1, in the online version of this article). They were designed to promote cognitive review of data, reduce note clutter, promote synthesis of data, and discourage copy‐paste. For example, the guidelines recommended either the phrase, Vital signs from the last 24 hours have been reviewed and are pertinent for or a link that included minimum/maximum values rather than including multiple sets of data. We next developed a note template aligned with these guidelines (see Supporting Information, Appendix 2, in the online version of this article) using features and links that already existed within the EHR. Interns received classroom teaching about the best practices and instruction in use of the template.

Study Design

The study was a retrospective pre‐/postintervention. An audit tool designed to assess compliance with the guidelines was used to score 25 progress notes written by pediatric interns in August 2010 and August 2011 during the pre‐ and postintervention periods, respectively (see Supporting Information, Appendix 3, in the online version of this article).

Progress notes were eligible based on the following criteria: (1) written on any day subsequent to the admission date, (2) written by a pediatric intern, and (3) progress note from the previous day available for comparison. It was not required that 2 consecutive notes be written by the same resident. Eligible notes were identified using a computer‐generated report, reviewed by a study member to ensure eligibility, and assigned a number.

Notes were scored on a scale of 0 to 17, with each question having a range of possible scores from 0 to 2. Some questions related to inappropriate copy‐paste (questions 2, 9, 10) and a question related to discrete diagnostic language for abnormal labs (question 11) were weighted more heavily in the tool, as compliance with these components of the guideline was felt to be of greater importance. Several questions within the audit tool refer to clutter. We defined clutter as any additional data not endorsed by the guidelines or not explicitly stated as relevant to the patient's care for that day.

Raters were trained to score notes through practice sessions, during which they all scored the same note and compared findings. To rectify inter‐rater scoring discrepancies identified during these sessions, a reference manual was created to assist raters in scoring notes (see Supporting Information, Appendix 4, in the online version of this article). Each preintervention note was then systematically assigned to 2 raters, comprised of a physician and 3 staff from health information management. Each rater scored the note individually without discussion. The inter‐rater reliability was determined to be excellent, with kappa indices ranging from 88% to 100% for the 13 questions; each note in the postintervention period was therefore assigned to only 1 rater. Total and individual questions' scores were sent to the statistician for analysis.

Statistical Analysis

Inter‐rater reliability of the audit tool was evaluated by calculating the intraclass correlation (ICC) coefficient using a multilevel random intercept model to account for the rater effect.[18] The study was powered to detect an anticipated ICC of at least 0.75 at the 1‐sided 0.05 significance level, assuming a null hypothesis that the ICC is 0.4 or less. The total score was summarized in terms of means and standard deviation. Individual item responses were summarized using percentages and compared between the pre‐ and postintervention assessment using the Fisher exact test. The analysis of response patterns for individual item scores was considered exploratory. The Benjamini‐Hochberg false discovery rate method was utilized to control the false‐positive rate when comparing individual item scores.[19] All P values were 2‐sided and considered statistically significant at <0.05. Statistical analyses were conducted using SAS software version 9.2 (SAS Institute Inc., Cary, NC).

RESULTS

The ICC was 0.96 (95% confidence interval: 0.91‐0.98), indicating an excellent level of inter‐rater reliability. There was a significant improvement in the total score (see Supporting Information, Appendix 5, in the online version of this article) between the preintervention (mean 9.72, standard deviation [SD] 1.52) and postintervention (mean 11.72, SD 1.62) periods (P<0.0001).

Table 1 shows the percentage of yes responses to each individual item in the pre‐ and postintervention periods. Our intervention had a significant impact on reducing vital sign clutter (4% preintervention, 84% postintervention, P<0.0001) and other visual clutter within the note (0% preintervention, 28% postintervention, P=0.0035). We did not observe a significant impact on the reduction of input/output or lab clutter. There was no significant difference observed in the inclusion of the medication list. No significant improvements were seen in questions related to copy‐paste. The intervention had no significant impact on areas with an already high baseline performance: newly written interval histories, newly written physical exams, newly written plans, and the inclusion of discrete diagnostic language for abnormal labs.

Comparison of Percentage of Yes Responses Between Pre‐ and Postintervention for Each Question
Question Preintervention, N=25* Postintervention, N=25 P Value
  • NOTE: *Percentages calculated from the first rater. Adjusted P value (for evaluating multiple items) using the Benjamini‐Hochberg false discovery rate method.

1. Does the note header include the name of the service, author, and training level of the author? 0% 68% <0.0001
2. Does it appear that the subjective/emnterval history section of the note was newly written? (ie, not copied in its entirety from the previous note) 100% 96% 0.9999
3. Is the vital sign section noncluttered? 4% 84% <0.0001
4. Is the entire medication list included in the note? 96% 96% 0.9999
5. Is the intake/output section noncluttered? 0% 16% 0.3076
6. Does it appear that the physical exam was newly written? (ie, not copied in its entirety from the previous note) 80% 68% 0.9103
7. Is the lab section noncluttered? 64% 44% 0.5125
8. Is the imaging section noncluttered? 100% 100% 0.9999
9. Does it appear that the assessment was newly written? 48% 28% 0.5121
48% partial 52% partial 0.9999
10. Does it appear that the plan was newly written or partially copied with new information added? 88% 96% 0.9477
11. If the assessment includes abnormal lab values, is there also an accompanying diagnosis? (eg, inclusion of patient has hemoglobin of 6.2, also includes diagnosis of anemia) 96% 96% 0.9999
12. Is additional visual clutter prevented by excluding other objective data found elsewhere in the chart? 0% 28% 0.0035
13. Is the author's name and contact information (pager, cell) included at the bottom of the note? 0% 72% <0.0001

DISCUSSION

Principal Findings

Improvements in electronic note writing, particularly in reducing note clutter, were achieved after the implementation of a bundled intervention. Because the intervention is a bundle, we cannot definitively identify which component had the greatest impact. Given the improvements seen in some areas with very low baseline performance, we hypothesize that these are most attributable to the creation of a compliant note template that (1) guided authors in using data links that were less cluttered and (2) eliminated the use of unnecessary links (eg, pain scores and daily weights). The lack of similar improvements in reducing input/output and lab clutter may be due to the fact that even with changes to the template suggesting a more narrative approach to these components, residents still felt compelled to use data links. Because our EHR does not easily allow for the inclusion of individual data elements, such as specific drain output or hemoglobin as opposed to a complete blood count, residents continued to use links that included more data than necessary. Although not significant findings, there was an observed decline in the proportion of notes containing a physical exam not entirely copied from the previous day and containing an assessment that was entirely new. These findings may be attributable to having a small sample of authors, a few of whom in the postintervention period were particularly prone to using copy‐paste.

Relationship to Other Evidence

The observed decline in quality of provider documentation after implementation of the EHR has led to a robust discussion in the literature about what really constitutes a quality provider note.[7, 8, 9, 10, 20] The absence of a defined gold standard makes research in this area challenging. It is our observation that when physicians refer to a decline in quality documentation in the EHR, they are frequently referring to the fact that electronically generated notes are often unattractive, difficult to read, and seem to lack clinical narrative.

Several publications have attempted to define note quality. Payne et al. described physical characteristics of electronically generated notes that were deemed more attractive to a reader, including a large proportion of narrative free text.[15] Hanson performed a qualitative study to describe outpatient clinical notes from the perspective of multiple stakeholders, resulting in a description of the characteristics of a quality note.[21] This formed the basis for the QNOTE, a validated tool to measure the quality of outpatient notes.[22] Similar work has not been done to rigorously define quality for inpatient documentation. Stetson did develop an instrument, the Physician Documentation Quality Instrument (PDQI‐9) to assess inpatient notes across 9 attributes; however, the validation method relied on a gold standard of a general impression score of 7 physician leaders.[23, 24]

Although these tools aim to address overall note quality, an advantage provided by our audit tool is that it directly addresses the problems most attributable to documenting in an EHR, namely note clutter and copy‐paste. A second advantage is that clinicians and nonclinicians can score notes objectively. The QNOTE and PDQI‐9 still rely on subjective assessment and require that the evaluator be a clinician.

There are described advantages to documenting in an electronic health record (EHR).[1, 2, 3, 4, 5] There has been, however, an unanticipated decline in certain aspects of documentation quality after implementing EHRs,[6, 7, 8] for example, the overinclusion of data (note clutter) and inappropriate use of copy‐paste.[6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17]

The objectives of this pilot study were to examine the effectiveness of an intervention bundle designed to improve resident progress notes written in an EHR (Epic Systems Corp., Verona, WI) and to establish the reliability of an audit tool used to assess the notes. Prior to this intervention, we provided no formal education for our residents about documentation in the EHR and had no policy governing format or content. The institutional review board at the University of Wisconsin approved this study.

METHODS

The Intervention Bundle

A multidisciplinary task force developed a set of Best Practice Guidelines for Writing Progress Notes in the EHR (see Supporting Information, Appendix 1, in the online version of this article). They were designed to promote cognitive review of data, reduce note clutter, promote synthesis of data, and discourage copy‐paste. For example, the guidelines recommended either the phrase, "Vital signs from the last 24 hours have been reviewed and are pertinent for," or a link that included minimum/maximum values, rather than including multiple sets of data. We next developed a note template aligned with these guidelines (see Supporting Information, Appendix 2, in the online version of this article) using features and links that already existed within the EHR. Interns received classroom teaching about the best practices and instruction in use of the template.

Study Design

This was a retrospective pre‐/postintervention study. An audit tool designed to assess compliance with the guidelines was used to score 25 progress notes written by pediatric interns in August 2010 and August 2011 during the pre‐ and postintervention periods, respectively (see Supporting Information, Appendix 3, in the online version of this article).

Progress notes were eligible based on the following criteria: (1) written on any day subsequent to the admission date, (2) written by a pediatric intern, and (3) progress note from the previous day available for comparison. It was not required that 2 consecutive notes be written by the same resident. Eligible notes were identified using a computer‐generated report, reviewed by a study member to ensure eligibility, and assigned a number.

Notes were scored on a scale of 0 to 17, with individual questions scored from 0 to 2 points. Questions related to inappropriate copy‐paste (questions 2, 9, and 10) and a question related to discrete diagnostic language for abnormal labs (question 11) were weighted more heavily in the tool, as compliance with these components of the guideline was felt to be of greater importance. Several questions within the audit tool refer to clutter, which we defined as any additional data not endorsed by the guidelines or not explicitly stated as relevant to the patient's care for that day.
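The weighting scheme can be illustrated with a short sketch. The exact per-item weights below are an assumption for illustration only (the 4 weighted questions scored 0-2 and the remaining 9 scored 0-1, which sums to the stated maximum of 17):

```python
# Hypothetical sketch of the audit-tool scoring described in the text.
# Item weights are assumed: questions 2, 9, 10, and 11 score 0-2; others 0-1.
WEIGHTED = {2, 9, 10, 11}  # copy-paste items and the diagnostic-language item

def max_item_score(question: int) -> int:
    """Maximum points available for a given question (1-13)."""
    return 2 if question in WEIGHTED else 1

def total_score(item_scores: dict[int, int]) -> int:
    """Sum item scores across all 13 questions, clamping each to its range."""
    total = 0
    for q in range(1, 14):
        score = item_scores.get(q, 0)
        total += max(0, min(score, max_item_score(q)))
    return total

# Maximum possible score: 9 unweighted items + 4 weighted items x 2 points = 17
assert sum(max_item_score(q) for q in range(1, 14)) == 17
```

Under this assumed scheme, a note meeting every guideline item would score 17 and a fully noncompliant note would score 0.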

Raters were trained to score notes through practice sessions during which they all scored the same note and compared findings. To rectify inter‐rater scoring discrepancies identified during these sessions, a reference manual was created to assist raters in scoring notes (see Supporting Information, Appendix 4, in the online version of this article). Each preintervention note was then systematically assigned to 2 of the 4 raters, a group comprising a physician and 3 staff members from health information management. Each rater scored the note individually without discussion. Inter‐rater reliability was determined to be excellent, with kappa indices ranging from 88% to 100% across the 13 questions; each note in the postintervention period was therefore assigned to only 1 rater. Total and individual question scores were sent to the statistician for analysis.
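The kappa indices cited above correct raw agreement for chance. A minimal, standard-library sketch of Cohen's kappa for a single binary audit question (the rater responses shown are illustrative, not study data):

```python
from collections import Counter

def cohens_kappa(rater_a: list[str], rater_b: list[str]) -> float:
    """Cohen's kappa for two raters scoring the same items (categorical labels)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over labels of the product of marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative example: two raters answering one yes/no audit question.
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes"]
b = ["yes", "yes", "no", "yes", "no", "yes", "yes", "yes"]
# cohens_kappa(a, b) == 5/7, i.e. about 0.71, despite 7/8 raw agreement
```

Note how chance correction pulls kappa well below the raw 88% agreement, which is why kappa is preferred for reliability reporting.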

Statistical Analysis

Inter‐rater reliability of the audit tool was evaluated by calculating the intraclass correlation (ICC) coefficient using a multilevel random intercept model to account for the rater effect.[18] The study was powered to detect an anticipated ICC of at least 0.75 at the 1‐sided 0.05 significance level, assuming a null hypothesis that the ICC is 0.4 or less. The total score was summarized in terms of means and standard deviation. Individual item responses were summarized using percentages and compared between the pre‐ and postintervention assessment using the Fisher exact test. The analysis of response patterns for individual item scores was considered exploratory. The Benjamini‐Hochberg false discovery rate method was utilized to control the false‐positive rate when comparing individual item scores.[19] All P values were 2‐sided and considered statistically significant at <0.05. Statistical analyses were conducted using SAS software version 9.2 (SAS Institute Inc., Cary, NC).
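The Benjamini‐Hochberg procedure ranks the item-level P values and compares each to a rank-scaled threshold, rejecting all hypotheses up to the largest rank that passes. A minimal sketch (the P values shown are illustrative, not the study's):

```python
def benjamini_hochberg(p_values: list[float], alpha: float = 0.05) -> list[bool]:
    """Return, per hypothesis, whether it is rejected at FDR level alpha."""
    m = len(p_values)
    # Sort P values ascending, remembering each value's original position.
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    # Find the largest rank k such that p_(k) <= (k/m) * alpha ...
    threshold_rank = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * alpha:
            threshold_rank = rank
    # ... and reject every hypothesis at or below that rank.
    for rank, idx in enumerate(order, start=1):
        if rank <= threshold_rank:
            reject[idx] = True
    return reject

# Illustrative item-level P values (not the study's actual results).
p = [0.0001, 0.0035, 0.30, 0.51, 0.91]
# benjamini_hochberg(p) -> [True, True, False, False, False]
```

In practice a statistical package (eg, the SAS MULTTEST procedure used with this study's software, or statsmodels in Python) would compute adjusted P values directly; the sketch above only shows the rejection logic.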

RESULTS

The ICC was 0.96 (95% confidence interval: 0.91‐0.98), indicating an excellent level of inter‐rater reliability. There was a significant improvement in the total score (see Supporting Information, Appendix 5, in the online version of this article) between the preintervention (mean 9.72, standard deviation [SD] 1.52) and postintervention (mean 11.72, SD 1.62) periods (P<0.0001).

Table 1 shows the percentage of yes responses to each individual item in the pre‐ and postintervention periods. Our intervention had a significant impact on reducing vital sign clutter (4% preintervention, 84% postintervention, P<0.0001) and other visual clutter within the note (0% preintervention, 28% postintervention, P=0.0035). We did not observe a significant impact on the reduction of input/output or lab clutter. There was no significant difference observed in the inclusion of the medication list. No significant improvements were seen in questions related to copy‐paste. The intervention had no significant impact on areas with an already high baseline performance: newly written interval histories, newly written physical exams, newly written plans, and the inclusion of discrete diagnostic language for abnormal labs.

Table 1. Comparison of Percentage of Yes Responses Between Pre‐ and Postintervention for Each Question

| Question | Preintervention, N=25* | Postintervention, N=25 | P Value† |
| --- | --- | --- | --- |
| 1. Does the note header include the name of the service, author, and training level of the author? | 0% | 68% | <0.0001 |
| 2. Does it appear that the subjective/interval history section of the note was newly written? (ie, not copied in its entirety from the previous note) | 100% | 96% | 0.9999 |
| 3. Is the vital sign section noncluttered? | 4% | 84% | <0.0001 |
| 4. Is the entire medication list included in the note? | 96% | 96% | 0.9999 |
| 5. Is the intake/output section noncluttered? | 0% | 16% | 0.3076 |
| 6. Does it appear that the physical exam was newly written? (ie, not copied in its entirety from the previous note) | 80% | 68% | 0.9103 |
| 7. Is the lab section noncluttered? | 64% | 44% | 0.5125 |
| 8. Is the imaging section noncluttered? | 100% | 100% | 0.9999 |
| 9. Does it appear that the assessment was newly written? (entirely new) | 48% | 28% | 0.5121 |
| 9. (partially copied) | 48% | 52% | 0.9999 |
| 10. Does it appear that the plan was newly written or partially copied with new information added? | 88% | 96% | 0.9477 |
| 11. If the assessment includes abnormal lab values, is there also an accompanying diagnosis? (eg, "patient has hemoglobin of 6.2" also includes diagnosis of anemia) | 96% | 96% | 0.9999 |
| 12. Is additional visual clutter prevented by excluding other objective data found elsewhere in the chart? | 0% | 28% | 0.0035 |
| 13. Is the author's name and contact information (pager, cell) included at the bottom of the note? | 0% | 72% | <0.0001 |

NOTE: *Percentages calculated from the first rater. †Adjusted P values (for evaluating multiple items) using the Benjamini‐Hochberg false discovery rate method.

DISCUSSION

Principal Findings

Improvements in electronic note writing, particularly in reducing note clutter, were achieved after the implementation of a bundled intervention. Because the intervention was a bundle, we cannot definitively identify which component had the greatest impact. Given the improvements seen in some areas with very low baseline performance, we hypothesize that these are most attributable to the creation of a compliant note template that (1) guided authors in using data links that were less cluttered and (2) eliminated the use of unnecessary links (eg, pain scores and daily weights). The lack of similar improvements in reducing input/output and lab clutter may reflect the fact that, even with changes to the template suggesting a more narrative approach to these components, residents still felt compelled to use data links. Because our EHR does not easily allow for the inclusion of individual data elements, such as a specific drain output or a hemoglobin as opposed to a complete blood count, residents continued to use links that included more data than necessary. Although not statistically significant, we observed declines in the proportion of notes containing a physical exam that was not entirely copied from the previous day and in the proportion containing an entirely new assessment. These findings may be attributable to having a small sample of authors, a few of whom in the postintervention period were particularly prone to using copy‐paste.

Relationship to Other Evidence

The observed decline in quality of provider documentation after implementation of the EHR has led to a robust discussion in the literature about what really constitutes a quality provider note.[7, 8, 9, 10, 20] The absence of a defined gold standard makes research in this area challenging. It is our observation that when physicians refer to a decline in quality documentation in the EHR, they are frequently referring to the fact that electronically generated notes are often unattractive, difficult to read, and seem to lack clinical narrative.

Several publications have attempted to define note quality. Payne et al. described physical characteristics of electronically generated notes that were deemed more attractive to a reader, including a large proportion of narrative free text.[15] Hanson et al. performed a qualitative study describing outpatient clinical notes from the perspective of multiple stakeholders, resulting in a description of the characteristics of a quality note.[21] This formed the basis for the QNOTE, a validated tool to measure the quality of outpatient notes.[22] Similar work has not been done to rigorously define quality for inpatient documentation. Stetson et al. developed an instrument, the Physician Documentation Quality Instrument (PDQI‐9), to assess inpatient notes across 9 attributes; however, the validation method relied on a gold standard of a general impression score from 7 physician leaders.[23, 24]

Although these tools aim to address overall note quality, an advantage provided by our audit tool is that it directly addresses the problems most attributable to documenting in an EHR, namely note clutter and copy‐paste. A second advantage is that clinicians and nonclinicians can score notes objectively. The QNOTE and PDQI‐9 still rely on subjective assessment and require that the evaluator be a clinician.

There has also been little published about how to achieve notes of high quality. In 2013, Shoolin et al. published a consensus statement from the Association of Medical Directors of Information Systems outlining guidelines for inpatient EHR documentation.[25] Optimal strategies for implementing such guidelines, however, and the overall impact such an implementation would have on note writing have not previously been studied. This study, therefore, adds to the existing body of literature by providing an example of an intervention that may lead to improvements in note writing.

Limitations

Our study has several limitations. The sample size of notes and authors was small. The short duration of the study and the assessment of notes soon after the intervention prevented an assessment of whether improvements were sustained over time.

We also did not evaluate the same group of interns in the pre‐ and postintervention periods. Interns were chosen as subjects because there was an existing opportunity to provide large‐group training during new intern orientation. Furthermore, we were concerned that additional note‐writing experience alone would influence the outcome if we examined the same interns later in the year.

The audit tool was also a first attempt at measuring compliance with the guidelines. Determination of an optimal score/weight for each item requires further investigation as part of a larger scale validation study. In addition, the cognitive review and synthesis of data encouraged in our guideline were more difficult to measure using the audit tool, as they require some clinical knowledge about the patient and an assessment of the author's medical decision making. We do not assert, therefore, that compliance with the guidelines or a higher total score necessarily translates into overall note quality, as we recognize these limitations of the tool.

Future Directions

In conclusion, this report describes a first effort to improve the quality of note writing in the EHR. Much more work is necessary, particularly in improving the clinical narrative and reducing inappropriate copy‐paste. The examination of other interventions, such as structured feedback to the note author, whether by way of a validated scoring tool and/or narrative comments, is a logical next step for investigation.

ACKNOWLEDGEMENTS

The authors acknowledge and appreciate the support of Joel Buchanan, MD, Ellen Wald, MD, and Ann Boyer, MD, for their contributions to this study and manuscript preparation. We also acknowledge the members of the auditing team: Linda Brickert, Jane Duckert, and Jeannine Strunk.

Disclosure: Nothing to report.

References
  1. Tang PC, LaRosa MP, Gorden SM. Use of computer‐based records, completeness of documentation, and appropriateness of documented clinical decisions. J Am Med Inform Assoc. 1999;6(3):245-251.
  2. Amarasingham R, Moore BJ, Tabak YP, et al. An automated model to identify heart failure patients at risk for 30‐day readmission or death using electronic medical record data. Med Care. 2010;48(11):981-988.
  3. Amarasingham R, Plantinga L, Diener‐West M, Gaskin DJ, Powe NR. Clinical information technologies and inpatient outcomes: a multiple hospital study. Arch Intern Med. 2009;169(2):108-114.
  4. Makam AN, Nguyen OK, Moore B, Ma Y, Amarasingham R. Identifying patients with diabetes and the earliest date of diagnosis in real time: an electronic health record case‐finding algorithm. BMC Med Inform Decis Mak. 2013;13:81.
  5. Poon EG, Wright A, Simon SR, et al. Relationship between use of electronic health record features and health care quality: results of a statewide survey. Med Care. 2010;48(3):203-209.
  6. Embi PJ, Yackel TR, Logan JR, Bowen JL, Cooney TG, Gorman PN. Impacts of computerized physician documentation in a teaching hospital: perceptions of faculty and resident physicians. J Am Med Inform Assoc. 2004;11(4):300-309.
  7. Hartzband P, Groopman J. Off the record—avoiding the pitfalls of going electronic. N Engl J Med. 2008;358(16):1656-1658.
  8. Hirschtick RE. A piece of my mind. Copy‐and‐paste. JAMA. 2006;295(20):2335-2336.
  9. Siegler EL, Adelman R. Copy and paste: a remediable hazard of electronic health records. Am J Med. 2009;122(6):495-496.
  10. O'Donnell HC, Kaushal R, Barron Y, Callahan MA, Adelman RD, Siegler EL. Physicians' attitudes towards copy and pasting in electronic note writing. J Gen Intern Med. 2009;24(1):63-68.
  11. Cimino JJ. Improving the electronic health record—are clinicians getting what they wished for? JAMA. 2013;309(10):991-992.
  12. Thielke S, Hammond K, Helbig S. Copying and pasting of examinations within the electronic medical record. Int J Med Inform. 2007;76(suppl 1):S122-S128.
  13. Siegler EL. The evolving medical record. Ann Intern Med. 2010;153(10):671-677.
  14. Weir CR, Hurdle JF, Felgar MA, Hoffman JM, Roth B, Nebeker JR. Direct text entry in electronic progress notes. An evaluation of input errors. Methods Inf Med. 2003;42(1):61-67.
  15. Payne TH, Patel R, Beahan S, Zehner J. The physical attractiveness of electronic physician notes. AMIA Annu Symp Proc. 2010;2010:622-626.
  16. Yackel TR, Embi PJ. Copy‐and‐paste‐and‐paste. JAMA. 2006;296(19):2315; author reply 2315-2316.
  17. Hammond KW, Helbig ST, Benson CC, Brathwaite‐Sketoe BM. Are electronic medical records trustworthy? Observations on copying, pasting and duplication. AMIA Annu Symp Proc. 2003:269-273.
  18. Raudenbush SW, Bryk AS. Hierarchical Linear Models: Applications and Data Analysis Methods. 2nd ed. Thousand Oaks, CA: Sage; 2002.
  19. Benjamini Y, Hochberg Y. Controlling the false discovery rate: a practical and powerful approach to multiple testing. J R Stat Soc Series B Stat Methodol. 1995;57(1):289-300.
  20. Sheehy AM, Weissburg DJ, Dean SM. The role of copy‐and‐paste in the hospital electronic health record. JAMA Intern Med. 2014;174(8):1217-1218.
  21. Hanson JL, Stephens MB, Pangaro LN, Gimbel RW. Quality of outpatient clinical notes: a stakeholder definition derived through qualitative research. BMC Health Serv Res. 2012;12:407.
  22. Burke HB, Hoang A, Becher D, et al. QNOTE: an instrument for measuring the quality of EHR clinical notes. J Am Med Inform Assoc. 2014;21(5):910-916.
  23. Stetson PD, Bakken S, Wrenn JO, Siegler EL. Assessing electronic note quality using the physician documentation quality instrument (PDQI‐9). Appl Clin Inform. 2012;3(2):164-174.
  24. Stetson PD, Morrison FP, Bakken S, Johnson SB. Preliminary development of the physician documentation quality instrument. J Am Med Inform Assoc. 2008;15(4):534-541.
  25. Shoolin J, Ozeran L, Hamann C, Bria W. Association of Medical Directors of Information Systems consensus on inpatient electronic health record documentation. Appl Clin Inform. 2013;4(2):293-303.
Issue
Journal of Hospital Medicine - 10(2)
Page Number
104-107
Display Headline
The effectiveness of a bundled intervention to improve resident progress notes in an electronic health record
Article Source
© 2014 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Shannon M. Dean, MD, 600 Highland Avenue H4/410 (MC 4108), Madison, WI 53792; Telephone: 608‐265‐5545; Fax: 608‐265‐8074; E‐mail: sdean@uwhealth.org

Discharge Planning Tool in the EHR

Development of a discharge readiness report within the electronic health record—A discharge planning tool

According to the American Academy of Pediatrics clinical report on physicians' roles in coordinating care of hospitalized children, there are several important components of hospital discharge planning.[1] Foremost is that discharge planning should begin, and discharge criteria should be set, at the time of hospital admission. This allows for optimal engagement of parents and providers in the effort to adequately prepare patients for the transition to home.

As pediatric inpatients become increasingly complex,[2] adequately preparing families for the transition to home becomes more challenging.[3] There are myriad issues to address, and the burden of this preparation effort falls on multiple individuals beyond the bedside nurse and physician. Large multidisciplinary teams often play a significant role in the discharge of medically complex children.[4] Several challenges may hinder the team's ability to effectively navigate the discharge process, such as financial or insurance‐related issues, language differences, or geographic barriers. Patient and family anxieties may also complicate the transition to home.[5]

The challenges of a multidisciplinary approach to discharge planning are further magnified by the limitations of the electronic health record (EHR). The EHR is well designed to record individual encounters, but poorly designed to coordinate longitudinal care across settings.[6] Although multidisciplinary providers may spend significant and well‐intentioned energy to facilitate hospital discharge, their efforts may go unseen or be duplicative.

We developed a discharge readiness report (DRR) for the EHR, an integrated summary of discharge‐related issues organized into a highly visible and easily accessible report. The development of the discharge planning tool was the first step in a larger quality improvement (QI) initiative aimed at improving the efficiency, effectiveness, and safety of hospital discharge. Our team recognized that improving the flow and visibility of information between disciplines was the first step toward accomplishing this larger aim. Health information technology offers an important opportunity for the improvement of patient safety and care transitions[7]; therefore, we leveraged the EHR to create an integrated discharge report. We used QI methods to understand our hospital's discharge processes, examined potential pitfalls in interdisciplinary communication, determined relevant information to include in the report, and optimized ways to display the data. To our knowledge, this use of the EHR is novel. The objectives of this article were to describe our team's development and implementation strategies, as well as challenges encountered, in the design of this electronic discharge planning tool.

METHODS

Setting

Children's Hospital Colorado is a 413‐bed freestanding tertiary care teaching hospital with over 13,000 inpatient admissions annually and an average patient length of stay of 5.7 days. We were the first children's hospital to fully implement a single EHR (Epic Systems, Verona, WI) in 2006. This discharge improvement initiative emerged from our hospital's involvement in the Children's Hospital Association Discharge Collaborative between October 2011 and October 2012. We were 1 of 12 participating hospitals and developed several different projects within the framework of the initiative.

Improvement Team

Our multidisciplinary project team included hospitalist physicians, case managers, social workers, respiratory therapists, pharmacists, medical interpreters, process improvement specialists, clinical application specialists whose daily role is management of our hospital's EHR software, and resident liaisons whose daily role is working with residents to facilitate care coordination.

Ethics

The project was determined to be QI work by the Children's Hospital Colorado Organizational Research Risk and Quality Improvement Review Panel.

Understanding the Problem

To understand the perspectives of each discipline involved in discharge planning, the lead hospitalist physician and a process improvement specialist interviewed key representatives from each group. Key informant interviews were conducted with hospitalist physicians, case managers, nurses, social workers, resident liaisons, respiratory therapists, pharmacists, medical interpreters, and residents. We inquired about their informational needs, their methods for obtaining relevant information, and whether the information was currently documented in the EHR. We then used process mapping to learn each discipline's workflow related to discharge planning. Finally, we gathered key stakeholders together for a group session where discharge planning was mapped using the example of a patient admitted with asthma. From this session, we created a detailed multidisciplinary swim lane process map: a flowchart displaying the sequence of events in the overall discharge process, grouped visually by placing the events in lanes. Each lane represented a discipline involved in patient discharge, and the arrows between lanes showed how information was passed between the various disciplines. Using this diagram, the team was able to fully understand provider interdependence in discharge planning and the longitudinal timing of discharge‐related tasks during the patient's hospitalization.

We learned that: (1) discharge planning is complex, and there were often multiple provider types involved in the discharge of a single patient; (2) communication and coordination between the multitude of providers was often suboptimal; and (3) many of the tasks related to discharge were left to the last minute, resulting in unnecessary delays. Underlying these problems was a clear lack of organized and visible discharge planning information within the EHR.

There were many examples of obscure and siloed discharge processes. Physicians were aware of discharge criteria but did not document these criteria for others to see. Case management assessments of home health needs were conveyed verbally to other team members, creating the potential for omissions, mistakes, or delays in appropriate home health planning. Social workers helped families to navigate financial hurdles (eg, assistance with payments for prescription medications); however, the presence of financial or insurance problems was not readily apparent to front‐line clinicians making discharge decisions. Other factors with potential significance for discharge planning, such as English‐language proficiency or a family's geographic distance from the hospital, were buried in disparate flow sheets or reports and not available or apparent to all health team members. There were also clear examples of discharge‐related tasks occurring at the end of hospitalization that could easily have been completed earlier in the admission, such as identifying a primary care provider (PCP), scheduling follow‐up appointments, and completing work/school excuses, because of a lack of care team awareness that these items were needed.

Planning the Intervention

Based on our learning, we developed a key driver diagram (Figure 1). Our aim was to create a DRR that organized important discharge‐related information into 1 easily accessible report. Key drivers that were identified as relevant to the content of the DRR included: barriers to discharge, discharge criteria, home care, postdischarge care, and last minute delays. We also identified secondary drivers related to the design of the DRR. We hypothesized that addressing the secondary drivers would be essential to end user adoption of the tool. The secondary drivers included: accessibility, relevance, ease of updating, automation, and readability.

Figure 1
Key driver diagram. This improvement tool is read from left to right and begins with our aim (left side of the diagram). We used key informant interviews and process mapping to identify the key drivers (center) that affect this aim. We then brainstormed changes or interventions (right side) to address the key drivers. Abbreviations: DRR, discharge readiness report; EHR, electronic health record; PCP, primary care provider.

With the swim lane diagram as well as our primary and secondary drivers in mind, we created a mock DRR on paper. We conducted multiple patient discharge simulations with representatives from all disciplines, walking through each step of a patient hospitalization from registration to discharge. This allowed us to map out how preexisting, yet disparate, EHR data could be channeled into 1 report. A few changes were made to processes involving data collection and documentation to facilitate timely transfer of information to the report. For example, questions addressing potential barriers to discharge and whether a school/work excuse was needed were added to the admission nursing assessment.

We then moved the paper DRR to the electronic environment. Data elements that were pulled automatically into the report included: potential barriers to discharge collected during nursing intake, case management information on home care needs, discharge criteria entered by resident and attending physicians, PCP, home pharmacy, follow‐up appointments, school/work excuse information gathered by resident liaisons, and active patient problems drawn from the problem list section. These data were organized into 4 distinct domains within the final DRR: potential barriers, transitional care, home care, and discharge criteria (Table 1).

Discharge Readiness Report Domains
Discharge Readiness Report Domain Example Content
  • NOTE: Abbreviations: PCP, primary care provider.

Potential barriers to discharge Geographic location of the family, whether patient lives in more than 1 household, primary spoken language, financial or insurance concern, and need for work/subhool excuses
Transitional care PCP and home pharmacy information, follow‐up ambulatory and imaging appointments, and care team communications with the PCP
Home care Planned discharge date/time and home care needs assessments such as needs for special equipment or skilled home nursing
Discharge criteria Clinical, social, or other care coordination conditions for discharge

Additional features potentially important to discharge planning were also incorporated into the report based on end user feedback. These included hyperlinks to discharge orders, home oxygen prescriptions, and the after‐visit summary for families, and the patient's home care company (if present). To facilitate discharge and transitional care related communication between the primary team and subspecialty teams, consults involved during the hospitalization were included on the report. As home care arrangements often involve care for active lines and drains, they were added to the report (Figure 2).

Figure 2
Discharge readiness report. Illustration of the report in the electronic health record. Abbreviations: GI, gastrointestinal; GT, gastrostomy tube; ID, infectious disease; O2, oxygen; PCP, primary care provider; PIV, peripheral venous catheter; RLL, right lowerlobe; RMY, Rocky Mountain Youth; RT, respiratory therapy; UTI, urinary tract infection. © 2014 Epic Systems Corporation. Used with permission.

Implementation

The report was activated within the EHR in June 2012. The team focused initial promotion and education efforts on medical floors. Education was widely disseminated via email and in‐person presentations.

The DRR was incorporated into daily CCRs for medical patients in July 2012. These multidisciplinary rounds occurred after medical‐team bedside rounds, focusing on care coordination and discharge planning. For each patient discussed, the DRR was projected onto a large screen, allowing all team members to view and discuss relevant discharge information. A process improvement (PI) specialist attended CCRs daily for several months, educating participants and monitoring use of the DRR. The PI specialist solicited feedback on ways to improve the DRR, and timed rounds to measure whether use of the DRR prolonged CCRs.

In the first weeks postimplementation, the use of the DRR prolonged rounds by as much as 1 minute per patient. Based on direct observation, the team focused interventions on barriers to the efficient use of the report during CCRs including: the need to scroll through the report, which was not visible on 1 screen; the need to navigate between patients; the need to quickly update the report based on discussion; and the need to update discharge criteria (Figure 3).

Figure 3
Description of interventions and tests of change to address increased time to complete care coordination rounds (CCRs) postimplementation of the discharge readiness report (DRR). Abbreviations: EHR, electronic health record.

RESULTS

Creation of the final DRR required significant time and effort and was the culmination of a uniquely collaborative effort between clinicians, ancillary staff, and information technology specialists (Figure 4). The report is used consistently for all general medical and medical subspecialty patients during CCRs. After interventions were implemented to improve the efficiency of using the DRR during CCRs, the use of the DRR did not prolong CCRs. Members of the care team acknowledge that all sections of the report are populated and accurate. Though end users have commented on their use of the report outside of CCRs, we have not been able to formally measure this.

Figure 4
Time spent and key steps used to design and implement the discharge readiness report (DRR). Abbreviations: CCR, care coordination rounds; EHR, electronic health record; PDSA, Plan Do Study Act.

We have noticed a shift in the focus of discussion since implementation of the DRR. Prior to this initiative, care teams at our institution did not regularly discuss discharge criteria during bedside or CCRs. The phrase discharge criteria has now become part of our shared language.

Informally, the DRR appears to have reduced inefficiency and the potential for communication error. The practice of writing notes on printed patient lists to be used to sign‐out or communicate to other team members not in attendance at CCRs has largely disappeared.

The DRR has proven to be adaptable across patient units, and can be tailored to the specific transitional care needs of a given patient population. At discharge institution, the DRR has been modified for, and has taken on a prominent role in, the discharge planning of highly complex populations such as rehabilitation and ventilated patients.

DISCUSSION

Discharge planning is a multifaceted, multidisciplinary process that should begin at the time of hospital admission. Safe patient transition depends on efficient discharge processes and effective communication across settings.[8] Although not well studied in the inpatient setting, care process variability can result in inefficient patient flow and increased stress among staff.[9] Patients and families may experience confusion, coping difficulties, and increased readmission due to ineffective discharge planning.[10] These potential pitfalls highlight the need for healthcare providers to develop patient‐centered, systematic approaches to improving the discharge process.[11]

To our knowledge, this is the first description of a discharge planning tool for the EHR in the pediatric setting. Our discharge report is centralized, easily accessible by all members of the care team, and includes important patient‐specific discharge‐related information that be used to focus discussion and streamline multidisciplinary discharge planning rounds.

We anticipate that the report will allow the entire healthcare team to function more efficiently, decrease discharge‐related delays and failures based on communication roadblocks, and improve family and caregiver satisfaction with the discharge process. We are currently testing these hypotheses and evaluating several implementation strategies in an ongoing research study. Assuming positive impact, we plan to spread the use of the DRR to all inpatient care areas at our hospital, and potentially to other hospitals.

The limitations of this QI project are consistent with other initiatives to improve care. The challenges we encounter at our freestanding tertiary care teaching hospital with regard to effective discharge planning and multidisciplinary communication may not be generalizable to other nonteaching or community hospitals, and the DRR may not be useful in other settings. Though the report is now a part of our EHR, the most impactful implementation strategies remain to be determined. The report and related changes represent significant diversion from years of deeply ingrained workflows for some providers, and we encountered some resistance from staff during the early stages of implementation. The most important of which was that some team members are uncomfortable with technology and prefer to use paper. Most of this initial resistance was overcome by implementing changes to improve the ease of use of the report (Figure 3). Though input from end users and key stakeholders has been incorporated throughout this initiative, more work is needed to measure end user adoption and satisfaction with the report.

CONCLUSION

High‐quality hospital discharge planning requires an increasingly multidisciplinary approach. The EHR can be leveraged to improve transparency and interdisciplinary communication around the discharge process. An integrated summary of discharge‐related issues, organized into 1 highly visible and easily accessible report in the EHR has the potential to improve care transitions.

Disclosure

Nothing to report.

References
  1. Lye PS. Clinical report—physicians' roles in coordinating care of hospitalized children. Pediatrics. 2010;126:829-832.
  2. Burns KH, Casey PH, Lyle RE, Bird TM, Fussell JJ, Robbins JM. Increasing prevalence of medically complex children in US hospitals. Pediatrics. 2010;126:638-646.
  3. Srivastava R, Stone BL, Murphy NA. Hospitalist care of the medically complex child. Pediatr Clin North Am. 2005;52:1165-1187, x.
  4. Bakewell-Sachs S, Porth S. Discharge planning and home care of the technology-dependent infant. J Obstet Gynecol Neonatal Nurs. 1995;24:77-83.
  5. Proctor EK, Morrow-Howell N, Kitchen A, Wang YT. Pediatric discharge planning: complications, efficiency, and adequacy. Soc Work Health Care. 1995;22:1-18.
  6. Samal L, Dykes PC, Greenberg J, et al. The current capabilities of health information technology to support care transitions. AMIA Annu Symp Proc. 2013;2013:1231.
  7. Walsh C, Siegler EL, Cheston E, et al. Provider-to-provider electronic communication in the era of meaningful use: a review of the evidence. J Hosp Med. 2013;8:589-597.
  8. Kripalani S, Jackson AT, Schnipper JL, Coleman EA. Promoting effective transitions of care at hospital discharge: a review of key issues for hospitalists. J Hosp Med. 2007;2:314-323.
  9. Kyriacou DN, Ricketts V, Dyne PL, McCollough MD, Talan DA. A 5-year time study analysis of emergency department patient care efficiency. Ann Emerg Med. 1999;34:326-335.
  10. Horwitz LI, Moriarty JP, Chen C, et al. Quality of discharge practices and patient understanding at an academic medical center. JAMA Intern Med. 2013;173(18):1715-1722.
  11. Tsilimingras D, Bates DW. Addressing postdischarge adverse events: a neglected area. Jt Comm J Qual Patient Saf. 2008;34:85-97.
Journal of Hospital Medicine. 9(8):533-539.

According to the American Academy of Pediatrics clinical report on physicians' roles in coordinating care of hospitalized children, there are several important components of hospital discharge planning.[1] Foremost is that discharge planning should begin, and discharge criteria should be set, at the time of hospital admission. This allows for optimal engagement of parents and providers in the effort to adequately prepare patients for the transition to home.

As pediatric inpatients become increasingly complex,[2] adequately preparing families for the transition to home becomes more challenging.[3] There are myriad issues to address, and the burden of this preparation effort falls on multiple individuals beyond the bedside nurse and physician. Large multidisciplinary teams often play a significant role in the discharge of medically complex children.[4] Several challenges may hinder the team's ability to effectively navigate the discharge process, such as financial or insurance-related issues, language differences, or geographic barriers. Patient and family anxieties may also complicate the transition to home.[5]

The challenges of a multidisciplinary approach to discharge planning are further magnified by the limitations of the electronic health record (EHR). The EHR is well designed to record individual encounters, but poorly designed to coordinate longitudinal care across settings.[6] Although multidisciplinary providers may spend significant and well‐intentioned energy to facilitate hospital discharge, their efforts may go unseen or be duplicative.

We developed a discharge readiness report (DRR) for the EHR, an integrated summary of discharge-related issues organized into a highly visible and easily accessible report. The development of this discharge planning tool was the first step in a larger quality improvement (QI) initiative aimed at improving the efficiency, effectiveness, and safety of hospital discharge. Our team recognized that improving the flow and visibility of information between disciplines was the first step toward accomplishing this larger aim. Health information technology offers an important opportunity for the improvement of patient safety and care transitions[7]; therefore, we leveraged the EHR to create an integrated discharge report. We used QI methods to understand our hospital's discharge processes, examined potential pitfalls in interdisciplinary communication, determined relevant information to include in the report, and optimized ways to display the data. To our knowledge, this use of the EHR is novel. The objectives of this article are to describe our team's development and implementation strategies, as well as challenges encountered, in the design of this electronic discharge planning tool.

METHODS

Setting

Children's Hospital Colorado is a 413-bed freestanding tertiary care teaching hospital with over 13,000 inpatient admissions annually and an average patient length of stay of 5.7 days. In 2006, we became the first children's hospital to fully implement a single EHR (Epic Systems Corporation, Verona, WI). This discharge improvement initiative emerged from our hospital's involvement in the Children's Hospital Association Discharge Collaborative between October 2011 and October 2012. We were 1 of 12 participating hospitals and developed several different projects within the framework of the initiative.

Improvement Team

Our multidisciplinary project team included hospitalist physicians, case managers, social workers, respiratory therapists, pharmacists, medical interpreters, process improvement specialists, clinical application specialists whose daily role is management of our hospital's EHR software, and resident liaisons whose daily role is working with residents to facilitate care coordination.

Ethics

The project was determined to be QI work by the Children's Hospital Colorado Organizational Research Risk and Quality Improvement Review Panel.

Understanding the Problem

To understand the perspectives of each discipline involved in discharge planning, the lead hospitalist physician and a process improvement specialist interviewed key representatives from each group. Key informant interviews were conducted with hospitalist physicians, case managers, nurses, social workers, resident liaisons, respiratory therapists, pharmacists, medical interpreters, and residents. We inquired about their informational needs, their methods for obtaining relevant information, and whether the information was currently documented in the EHR. We then used process mapping to learn each discipline's workflow related to discharge planning. Finally, we gathered key stakeholders together for a group session where discharge planning was mapped using the example of a patient admitted with asthma. From this session, we created a detailed multidisciplinary swim lane process map: a flowchart displaying the sequence of events in the overall discharge process, grouped visually by placing the events in lanes. Each lane represented a discipline involved in patient discharge, and the arrows between lanes showed how information is passed between the various disciplines. Using this diagram, the team was able to fully understand provider interdependence in discharge planning and the longitudinal timing of discharge-related tasks during the patient's hospitalization.
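The swim lane structure described above can be represented in a small sketch: each event is tagged with the lane (discipline) responsible for it, and a handoff arrow exists wherever two consecutive events change lanes. This is purely illustrative; the lane and step names below are hypothetical and do not reproduce the team's actual map.

```python
# Hypothetical swim lane process map for a single admission. Each step
# records its lane (discipline); a handoff occurs whenever two consecutive
# steps belong to different lanes. Lane and step names are invented.
steps = [
    ("Registration", "Collect PCP and insurance information"),
    ("Nursing", "Admission assessment, including discharge barriers"),
    ("Physician", "Set discharge criteria at admission"),
    ("Case management", "Assess home health needs"),
    ("Social work", "Address financial/insurance hurdles"),
    ("Physician", "Confirm criteria met; write discharge order"),
    ("Nursing", "Discharge teaching and after-visit summary"),
]

def handoffs(step_list):
    """Return (from_lane, to_lane) pairs where information must cross lanes."""
    return [(a[0], b[0]) for a, b in zip(step_list, step_list[1:]) if a[0] != b[0]]

for src, dst in handoffs(steps):
    print(f"{src} -> {dst}")
```

Counting the cross-lane arrows this way is one simple measure of how much interdisciplinary communication a discharge depends on.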

We learned that: (1) discharge planning is complex, and there were often multiple provider types involved in the discharge of a single patient; (2) communication and coordination between the multitude of providers was often suboptimal; and (3) many of the tasks related to discharge were left to the last minute, resulting in unnecessary delays. Underlying these problems was a clear lack of organized and visible discharge planning information within the EHR.

There were many examples of obscure and siloed discharge processes. Physicians were aware of discharge criteria but did not document these criteria for others to see. Case management assessments of home health needs were conveyed verbally to other team members, creating the potential for omissions, mistakes, or delays in appropriate home health planning. Social workers helped families navigate financial hurdles (eg, assistance with payments for prescription medications); however, the presence of financial or insurance problems was not readily apparent to front-line clinicians making discharge decisions. Other factors with potential significance for discharge planning, such as English-language proficiency or a family's geographic distance from the hospital, were buried in disparate flow sheets or reports and were not apparent to all health team members. There were also clear examples of discharge-related tasks occurring at the end of hospitalization that could easily have been completed earlier in the admission, such as identifying a primary care provider (PCP), scheduling follow-up appointments, and completing work/school excuses; these tasks were delayed because the care team was unaware that they were needed.

Planning the Intervention

Based on our learning, we developed a key driver diagram (Figure 1). Our aim was to create a DRR that organized important discharge-related information into 1 easily accessible report. Key drivers identified as relevant to the content of the DRR included: barriers to discharge, discharge criteria, home care, postdischarge care, and last-minute delays. We also identified secondary drivers related to the design of the DRR. We hypothesized that addressing the secondary drivers would be essential to end user adoption of the tool. The secondary drivers included: accessibility, relevance, ease of updating, automation, and readability.

Figure 1
Key driver diagram. This improvement tool is read from left to right and begins with our aim (left side of the diagram). We used key informant interviews and process mapping to identify the key drivers (center) that affect this aim. We then brainstormed changes or interventions (right side) to address the key drivers. Abbreviations: DRR, discharge readiness report; EHR, electronic health record; PCP, primary care provider.

With the swim lane diagram as well as our primary and secondary drivers in mind, we created a mock DRR on paper. We conducted multiple patient discharge simulations with representatives from all disciplines, walking through each step of a patient hospitalization from registration to discharge. This allowed us to map out how preexisting, yet disparate, EHR data could be channeled into 1 report. A few changes were made to processes involving data collection and documentation to facilitate timely transfer of information to the report. For example, questions addressing potential barriers to discharge and whether a school/work excuse was needed were added to the admission nursing assessment.

We then moved the paper DRR to the electronic environment. Data elements that were pulled automatically into the report included: potential barriers to discharge collected during nursing intake, case management information on home care needs, discharge criteria entered by resident and attending physicians, PCP, home pharmacy, follow‐up appointments, school/work excuse information gathered by resident liaisons, and active patient problems drawn from the problem list section. These data were organized into 4 distinct domains within the final DRR: potential barriers, transitional care, home care, and discharge criteria (Table 1).

Table 1. Discharge Readiness Report Domains

Potential barriers to discharge: geographic location of the family, whether the patient lives in more than 1 household, primary spoken language, financial or insurance concerns, and need for work/school excuses
Transitional care: PCP and home pharmacy information, follow-up ambulatory and imaging appointments, and care team communications with the PCP
Home care: planned discharge date/time and home care needs assessments, such as needs for special equipment or skilled home nursing
Discharge criteria: clinical, social, or other care coordination conditions for discharge

NOTE: Abbreviations: PCP, primary care provider.
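The data flow described above, in which disparate, preexisting EHR elements are channeled into the report's four domains, can be sketched as follows. This is a minimal illustration, not Epic's actual data model: all field names, source structures, and the builder function are assumptions made for the example.

```python
# Illustrative sketch: assembling the four DRR domains from separate
# (hypothetical) EHR data sources. Field and parameter names are invented
# and do not reflect the actual EHR schema described in the article.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DischargeReadinessReport:
    potential_barriers: Dict[str, str] = field(default_factory=dict)
    transitional_care: Dict[str, str] = field(default_factory=dict)
    home_care: List[str] = field(default_factory=list)
    discharge_criteria: List[str] = field(default_factory=list)

def build_drr(nursing_intake: dict, case_mgmt: dict,
              physician_criteria: list, registration: dict) -> DischargeReadinessReport:
    """Channel disparate, preexisting data into one report."""
    report = DischargeReadinessReport()
    # Barriers captured during the admission nursing assessment
    report.potential_barriers["primary_language"] = nursing_intake.get("language", "unknown")
    report.potential_barriers["work_school_excuse_needed"] = nursing_intake.get("excuse_needed", "unknown")
    # Transitional care: PCP and home pharmacy from registration data
    report.transitional_care["pcp"] = registration.get("pcp", "not identified")
    report.transitional_care["home_pharmacy"] = registration.get("pharmacy", "not identified")
    # Home care needs from the case management assessment
    report.home_care = case_mgmt.get("needs", [])
    # Criteria entered by resident and attending physicians
    report.discharge_criteria = list(physician_criteria)
    return report
```

The design point the sketch captures is that the report writes nothing new: every field is pulled automatically from documentation that some discipline already enters, so keeping the report current imposes no extra charting burden.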

Additional features potentially important to discharge planning were also incorporated into the report based on end user feedback. These included hyperlinks to discharge orders, home oxygen prescriptions, the after-visit summary for families, and the patient's home care company (if present). To facilitate communication about discharge and transitional care between the primary team and subspecialty teams, the consult services involved during the hospitalization were included on the report. Because home care arrangements often involve care for active lines and drains, these were also added to the report (Figure 2).

Figure 2
Discharge readiness report. Illustration of the report in the electronic health record. Abbreviations: GI, gastrointestinal; GT, gastrostomy tube; ID, infectious disease; O2, oxygen; PCP, primary care provider; PIV, peripheral venous catheter; RLL, right lower lobe; RMY, Rocky Mountain Youth; RT, respiratory therapy; UTI, urinary tract infection. © 2014 Epic Systems Corporation. Used with permission.

Implementation

The report was activated within the EHR in June 2012. The team focused initial promotion and education efforts on medical floors. Education was widely disseminated via email and in‐person presentations.

The DRR was incorporated into daily care coordination rounds (CCRs) for medical patients in July 2012. These multidisciplinary rounds occurred after medical-team bedside rounds and focused on care coordination and discharge planning. For each patient discussed, the DRR was projected onto a large screen, allowing all team members to view and discuss relevant discharge information. A process improvement (PI) specialist attended CCRs daily for several months, educating participants and monitoring use of the DRR. The PI specialist solicited feedback on ways to improve the DRR and timed rounds to measure whether use of the DRR prolonged CCRs.

In the first weeks postimplementation, the use of the DRR prolonged rounds by as much as 1 minute per patient. Based on direct observation, the team focused interventions on barriers to the efficient use of the report during CCRs including: the need to scroll through the report, which was not visible on 1 screen; the need to navigate between patients; the need to quickly update the report based on discussion; and the need to update discharge criteria (Figure 3).

Figure 3
Description of interventions and tests of change to address increased time to complete care coordination rounds (CCRs) postimplementation of the discharge readiness report (DRR). Abbreviations: EHR, electronic health record.

RESULTS

Creation of the final DRR required significant time and effort and was the culmination of a uniquely collaborative effort between clinicians, ancillary staff, and information technology specialists (Figure 4). The report is used consistently for all general medical and medical subspecialty patients during CCRs. After interventions were implemented to improve the efficiency of using the DRR during CCRs, the use of the DRR did not prolong CCRs. Members of the care team acknowledge that all sections of the report are populated and accurate. Though end users have commented on their use of the report outside of CCRs, we have not been able to formally measure this.

Figure 4
Time spent and key steps used to design and implement the discharge readiness report (DRR). Abbreviations: CCR, care coordination rounds; EHR, electronic health record; PDSA, Plan Do Study Act.

We have noticed a shift in the focus of discussion since implementation of the DRR. Prior to this initiative, care teams at our institution did not regularly discuss discharge criteria during bedside rounds or CCRs. The phrase "discharge criteria" has now become part of our shared language.

Informally, the DRR appears to have reduced inefficiency and the potential for communication error. The practice of writing notes on printed patient lists, used to sign out or to communicate with team members not in attendance at CCRs, has largely disappeared.

The DRR has proven to be adaptable across patient units, and can be tailored to the specific transitional care needs of a given patient population. At our institution, the DRR has been modified for, and has taken on a prominent role in, the discharge planning of highly complex populations such as rehabilitation and ventilated patients.

DISCUSSION

Discharge planning is a multifaceted, multidisciplinary process that should begin at the time of hospital admission. Safe patient transition depends on efficient discharge processes and effective communication across settings.[8] Although not well studied in the inpatient setting, care process variability can result in inefficient patient flow and increased stress among staff.[9] Patients and families may experience confusion, coping difficulties, and increased readmission due to ineffective discharge planning.[10] These potential pitfalls highlight the need for healthcare providers to develop patient‐centered, systematic approaches to improving the discharge process.[11]

To our knowledge, this is the first description of a discharge planning tool for the EHR in the pediatric setting. Our discharge report is centralized, easily accessible by all members of the care team, and includes important patient-specific discharge-related information that can be used to focus discussion and streamline multidisciplinary discharge planning rounds.

We anticipate that the report will allow the entire healthcare team to function more efficiently, decrease discharge‐related delays and failures based on communication roadblocks, and improve family and caregiver satisfaction with the discharge process. We are currently testing these hypotheses and evaluating several implementation strategies in an ongoing research study. Assuming positive impact, we plan to spread the use of the DRR to all inpatient care areas at our hospital, and potentially to other hospitals.

The limitations of this QI project are consistent with those of other initiatives to improve care. The challenges we encounter at our freestanding tertiary care teaching hospital with regard to effective discharge planning and multidisciplinary communication may not be generalizable to nonteaching or community hospitals, and the DRR may not be useful in other settings. Though the report is now a part of our EHR, the most impactful implementation strategies remain to be determined. The report and related changes represent a significant departure from years of deeply ingrained workflows for some providers, and we encountered some resistance from staff during the early stages of implementation. The most important source of resistance was that some team members were uncomfortable with technology and preferred to use paper. Most of this initial resistance was overcome by implementing changes to improve the ease of use of the report (Figure 3). Though input from end users and key stakeholders has been incorporated throughout this initiative, more work is needed to measure end user adoption of and satisfaction with the report.

CONCLUSION

High-quality hospital discharge planning requires an increasingly multidisciplinary approach. The EHR can be leveraged to improve transparency and interdisciplinary communication around the discharge process. An integrated summary of discharge-related issues, organized into 1 highly visible and easily accessible report in the EHR, has the potential to improve care transitions.

Disclosure

Nothing to report.

According to the American Academy of Pediatrics clinical report on physicians' roles in coordinating care of hospitalized children, there are several important components of hospital discharge planning.[1] Foremost is that discharge planning should begin, and discharge criteria should be set, at the time of hospital admission. This allows for optimal engagement of parents and providers in the effort to adequately prepare patients for the transition to home.

As pediatric inpatients become increasingly complex,[2] adequately preparing families for the transition to home becomes more challenging.[3] There are a myriad of issues to address and the burden of this preparation effort falls on multiple individuals other than the bedside nurse and physician. Large multidisciplinary teams often play a significant role in the discharge of medically complex children.[4] Several challenges may hinder the team's ability to effectively navigate the discharge process such as financial or insurance‐related issues, language differences, or geographic barriers. Patient and family anxieties may also complicate the transition to home.[5]

The challenges of a multidisciplinary approach to discharge planning are further magnified by the limitations of the electronic health record (EHR). The EHR is well designed to record individual encounters, but poorly designed to coordinate longitudinal care across settings.[6] Although multidisciplinary providers may spend significant and well‐intentioned energy to facilitate hospital discharge, their efforts may go unseen or be duplicative.

We developed a discharge readiness report (DRR) for the EHR, an integrated summary of discharge‐related issues, organized into a highly visible and easily accessible report. The development of the discharge planning tool was the first step in a larger quality improvement (QI) initiative aimed at improving the efficiency, effectiveness, and safety of hospital discharge. Our team recognized that improving the flow and visibility of information between disciplines was the first step toward accomplishing this larger aim. Health information technology offers an important opportunity for the improvement of patient safety and care transitions7; therefore, we leveraged the EHR to create an integrated discharge report. We used QI methods to understand our hospital's discharge processes, examined potential pitfalls in interdisciplinary communication, determined relevant information to include in the report, and optimized ways to display the data. To our knowledge, this use of the EHR is novel. The objectives of this article were to describe our team's development and implementation strategies, as well as challenges encountered, in the design of this electronic discharge planning tool.

METHODS

Setting

Children's Hospital Colorado is a 413‐bed freestanding tertiary care teaching hospital with over 13,000 inpatient admissions annually and an average patient length of stay of 5.7 days. We were the first children's hospital to fully implement a single EHR (Epic Systems, Madison, WI) in 2006. This discharge improvement initiative emerged from our hospital's involvement in the Children's Hospital Association Discharge Collaborative between October 2011 and October 2012. We were 1 of 12 participating hospitals and developed several different projects within the framework of the initiative.

Improvement Team

Our multidisciplinary project team included hospitalist physicians, case managers, social workers, respiratory therapists, pharmacists, medical interpreters, process improvement specialists, clinical application specialists whose daily role is management of our hospital's EHR software, and resident liaisons whose daily role is working with residents to facilitate care coordination.

Ethics

The project was determined to be QI work by the Children's Hospital Colorado Organizational Research Risk and Quality Improvement Review Panel.

Understanding the Problem

To understand the perspectives of each discipline involved in discharge planning, the lead hospitalist physician and a process improvement specialist interviewed key representatives from each group. Key informant interviews were conducted with hospitalist physicians, case managers, nurses, social workers, resident liaisons, respiratory therapists, pharmacists, medical interpreters, and residents. We inquired about their informational needs, their methods for obtaining relevant information, and whether the information was currently documented in the EHR. We then used process mapping to learn each discipline's workflow related to discharge planning. Finally, we gathered key stakeholders together for a group session in which discharge planning was mapped using the example of a patient admitted with asthma. From this session, we created a detailed multidisciplinary swim lane process map, a flowchart displaying the sequence of events in the overall discharge process, grouped visually by placing the events in lanes. Each lane represented a discipline involved in patient discharge, and the arrows between lanes showed how information passed between the various disciplines. Using this diagram, the team was able to fully understand provider interdependence in discharge planning and the longitudinal timing of discharge‐related tasks during the patient's hospitalization.
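A swim lane map of this kind can be captured as plain data: lanes of ordered tasks per discipline, plus the arrows that pass information between lanes. The sketch below is a hypothetical illustration only; the task names and handoffs are assumptions, not the actual map.

```python
# Hypothetical sketch of a swim-lane process map: each lane is a discipline
# with its ordered discharge tasks, and each handoff is an arrow showing
# information passing between lanes. Task names are illustrative.
lanes = {
    "physician": ["document discharge criteria", "write discharge orders"],
    "case_manager": ["assess home health needs", "arrange home equipment"],
    "nurse": ["screen for discharge barriers", "complete family teaching"],
}

# Handoffs between lanes: (from_lane, after_task, to_lane).
handoffs = [
    ("physician", "document discharge criteria", "case_manager"),
    ("case_manager", "assess home health needs", "nurse"),
]

def lanes_downstream_of(lane):
    """Disciplines that receive information directly from `lane`."""
    return {to for frm, _task, to in handoffs if frm == lane}
```

A structure like this makes provider interdependence queryable, for example, which disciplines depend on information produced by the physician lane.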

We learned that: (1) discharge planning is complex, and there were often multiple provider types involved in the discharge of a single patient; (2) communication and coordination between the multitude of providers was often suboptimal; and (3) many of the tasks related to discharge were left to the last minute, resulting in unnecessary delays. Underlying these problems was a clear lack of organized and visible discharge planning information within the EHR.

There were many examples of obscure and siloed discharge processes. Physicians were aware of discharge criteria, but did not document these criteria for others to see. Case management assessments of home health needs were conveyed verbally to other team members, creating the potential for omissions, mistakes, or delays in appropriate home health planning. Social workers helped families to navigate financial hurdles (eg, assistance with payments for prescription medications). However, the presence of financial or insurance problems was not readily apparent to front‐line clinicians making discharge decisions. Other factors with potential significance for discharge planning, such as English‐language proficiency or a family's geographic distance from the hospital, were buried in disparate flow sheets or reports and not available or apparent to all health team members. There were also clear examples of discharge‐related tasks occurring at the end of hospitalization that could easily have been completed earlier in the admission, such as identifying a primary care provider (PCP), scheduling follow‐up appointments, and completing work/school excuses, because the care team was unaware that these items were needed.

Planning the Intervention

Based on what we learned, we developed a key driver diagram (Figure 1). Our aim was to create a DRR that organized important discharge‐related information into 1 easily accessible report. Key drivers that were identified as relevant to the content of the DRR included: barriers to discharge, discharge criteria, home care, postdischarge care, and last‐minute delays. We also identified secondary drivers related to the design of the DRR. We hypothesized that addressing the secondary drivers would be essential to end user adoption of the tool. The secondary drivers included: accessibility, relevance, ease of updating, automation, and readability.

Figure 1
Key driver diagram. This improvement tool is read from left to right and begins with our aim (left side of the diagram). We used key informant interviews and process mapping to identify the key drivers (center) that affect this aim. We then brainstormed changes or interventions (right side) to address the key drivers. Abbreviations: DRR, discharge readiness report; EHR, electronic health record; PCP, primary care provider.

With the swim lane diagram as well as our primary and secondary drivers in mind, we created a mock DRR on paper. We conducted multiple patient discharge simulations with representatives from all disciplines, walking through each step of a patient hospitalization from registration to discharge. This allowed us to map out how preexisting, yet disparate, EHR data could be channeled into 1 report. A few changes were made to processes involving data collection and documentation to facilitate timely transfer of information to the report. For example, questions addressing potential barriers to discharge and whether a school/work excuse was needed were added to the admission nursing assessment.

We then moved the paper DRR to the electronic environment. Data elements that were pulled automatically into the report included: potential barriers to discharge collected during nursing intake, case management information on home care needs, discharge criteria entered by resident and attending physicians, PCP, home pharmacy, follow‐up appointments, school/work excuse information gathered by resident liaisons, and active patient problems drawn from the problem list section. These data were organized into 4 distinct domains within the final DRR: potential barriers, transitional care, home care, and discharge criteria (Table 1).

Table 1. Discharge Readiness Report Domains

Potential barriers to discharge: geographic location of the family, whether the patient lives in more than 1 household, primary spoken language, financial or insurance concerns, and need for work/school excuses
Transitional care: PCP and home pharmacy information, follow‐up ambulatory and imaging appointments, and care team communications with the PCP
Home care: planned discharge date/time and home care needs assessments, such as needs for special equipment or skilled home nursing
Discharge criteria: clinical, social, or other care coordination conditions for discharge

NOTE: Abbreviations: PCP, primary care provider.
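The 4 domains above can be pictured as a simple data structure. The sketch below is a minimal illustration under stated assumptions: the field names, the `(criterion, met)` representation, and the readiness check are hypothetical and do not reflect the actual EHR schema or any hospital logic.

```python
from dataclasses import dataclass, field

# Hypothetical model of the report's 4 domains. Field names and the
# readiness check are illustrative assumptions, not the EHR schema.
@dataclass
class DischargeReadinessReport:
    potential_barriers: list = field(default_factory=list)   # e.g., language, finances
    transitional_care: dict = field(default_factory=dict)    # PCP, pharmacy, follow-ups
    home_care: dict = field(default_factory=dict)            # equipment, home nursing
    discharge_criteria: list = field(default_factory=list)   # (criterion, met) pairs

    def is_ready(self) -> bool:
        """Crude check: no open barriers and every discharge criterion met."""
        return not self.potential_barriers and all(
            met for _criterion, met in self.discharge_criteria
        )
```

For example, a report with no outstanding barriers and all criteria marked met would evaluate as ready, while a single open barrier would not.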

Additional features potentially important to discharge planning were also incorporated into the report based on end user feedback. These included hyperlinks to discharge orders, home oxygen prescriptions, the after‐visit summary for families, and the patient's home care company (if present). To facilitate discharge and transitional care related communication between the primary team and subspecialty teams, consults involved during the hospitalization were included on the report. As home care arrangements often involve care for active lines and drains, these were also added to the report (Figure 2).

Figure 2
Discharge readiness report. Illustration of the report in the electronic health record. Abbreviations: GI, gastrointestinal; GT, gastrostomy tube; ID, infectious disease; O2, oxygen; PCP, primary care provider; PIV, peripheral venous catheter; RLL, right lower lobe; RMY, Rocky Mountain Youth; RT, respiratory therapy; UTI, urinary tract infection. © 2014 Epic Systems Corporation. Used with permission.

Implementation

The report was activated within the EHR in June 2012. The team focused initial promotion and education efforts on medical floors. Education was widely disseminated via email and in‐person presentations.

The DRR was incorporated into daily care coordination rounds (CCRs) for medical patients in July 2012. These multidisciplinary rounds occurred after medical‐team bedside rounds, focusing on care coordination and discharge planning. For each patient discussed, the DRR was projected onto a large screen, allowing all team members to view and discuss relevant discharge information. A process improvement (PI) specialist attended CCRs daily for several months, educating participants and monitoring use of the DRR. The PI specialist solicited feedback on ways to improve the DRR, and timed rounds to measure whether use of the DRR prolonged CCRs.

In the first weeks postimplementation, the use of the DRR prolonged rounds by as much as 1 minute per patient. Based on direct observation, the team focused interventions on barriers to the efficient use of the report during CCRs including: the need to scroll through the report, which was not visible on 1 screen; the need to navigate between patients; the need to quickly update the report based on discussion; and the need to update discharge criteria (Figure 3).
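The timing observations above amount to a simple before/after comparison of per‐patient discussion time. The sketch below illustrates one way such a comparison could be computed; the minute values are made up for illustration and do not reproduce the observed rounds data.

```python
from statistics import mean

# Illustrative per-patient CCR discussion times (minutes); these values are
# hypothetical and are not the study's measurements.
pre_minutes_per_patient = [3.5, 4.0, 4.2, 3.8]    # before usability fixes
post_minutes_per_patient = [3.0, 3.1, 2.9, 3.2]   # after usability fixes

def mean_delta(pre, post):
    """Change in mean minutes per patient (negative = rounds got shorter)."""
    return mean(post) - mean(pre)
```

With these illustrative numbers, the delta is negative, i.e., rounds shortened after the usability interventions.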

Figure 3
Description of interventions and tests of change to address increased time to complete care coordination rounds (CCRs) postimplementation of the discharge readiness report (DRR). Abbreviations: EHR, electronic health record.

RESULTS

Creation of the final DRR required significant time and effort and was the culmination of a uniquely collaborative effort between clinicians, ancillary staff, and information technology specialists (Figure 4). The report is used consistently for all general medical and medical subspecialty patients during CCRs. After interventions were implemented to improve the efficiency of using the DRR during CCRs, the use of the DRR did not prolong CCRs. Members of the care team confirm that all sections of the report are populated and accurate. Though end users have commented on their use of the report outside of CCRs, we have not been able to formally measure this.

Figure 4
Time spent and key steps used to design and implement the discharge readiness report (DRR). Abbreviations: CCR, care coordination rounds; EHR, electronic health record; PDSA, Plan Do Study Act.

We have noticed a shift in the focus of discussion since implementation of the DRR. Prior to this initiative, care teams at our institution did not regularly discuss discharge criteria during bedside rounds or CCRs. The phrase "discharge criteria" has now become part of our shared language.

Informally, the DRR appears to have reduced inefficiency and the potential for communication error. The practice of writing notes on printed patient lists, used for sign‐out or to communicate with team members not in attendance at CCRs, has largely disappeared.

The DRR has proven to be adaptable across patient units, and can be tailored to the specific transitional care needs of a given patient population. At our institution, the DRR has been modified for, and has taken on a prominent role in, the discharge planning of highly complex populations such as rehabilitation and ventilated patients.

DISCUSSION

Discharge planning is a multifaceted, multidisciplinary process that should begin at the time of hospital admission. Safe patient transition depends on efficient discharge processes and effective communication across settings.[8] Although not well studied in the inpatient setting, care process variability can result in inefficient patient flow and increased stress among staff.[9] Patients and families may experience confusion, coping difficulties, and increased readmission due to ineffective discharge planning.[10] These potential pitfalls highlight the need for healthcare providers to develop patient‐centered, systematic approaches to improving the discharge process.[11]

To our knowledge, this is the first description of a discharge planning tool for the EHR in the pediatric setting. Our discharge report is centralized, easily accessible by all members of the care team, and includes important patient‐specific discharge‐related information that can be used to focus discussion and streamline multidisciplinary discharge planning rounds.

We anticipate that the report will allow the entire healthcare team to function more efficiently, decrease discharge‐related delays and failures based on communication roadblocks, and improve family and caregiver satisfaction with the discharge process. We are currently testing these hypotheses and evaluating several implementation strategies in an ongoing research study. Assuming positive impact, we plan to spread the use of the DRR to all inpatient care areas at our hospital, and potentially to other hospitals.

The limitations of this QI project are consistent with other initiatives to improve care. The challenges we encounter at our freestanding tertiary care teaching hospital with regard to effective discharge planning and multidisciplinary communication may not be generalizable to other nonteaching or community hospitals, and the DRR may not be useful in other settings. Though the report is now a part of our EHR, the most impactful implementation strategies remain to be determined. The report and related changes represent a significant diversion from years of deeply ingrained workflows for some providers, and we encountered some resistance from staff during the early stages of implementation, most notably from team members who were uncomfortable with technology and preferred to use paper. Most of this initial resistance was overcome by implementing changes to improve the ease of use of the report (Figure 3). Though input from end users and key stakeholders has been incorporated throughout this initiative, more work is needed to measure end user adoption of and satisfaction with the report.

CONCLUSION

High‐quality hospital discharge planning requires an increasingly multidisciplinary approach. The EHR can be leveraged to improve transparency and interdisciplinary communication around the discharge process. An integrated summary of discharge‐related issues, organized into 1 highly visible and easily accessible report in the EHR, has the potential to improve care transitions.

Disclosure

Nothing to report.

References
  1. Lye PS. Clinical report—physicians' roles in coordinating care of hospitalized children. Pediatrics. 2010;126:829–832.
  2. Burns KH, Casey PH, Lyle RE, Bird TM, Fussell JJ, Robbins JM. Increasing prevalence of medically complex children in US hospitals. Pediatrics. 2010;126:638–646.
  3. Srivastava R, Stone BL, Murphy NA. Hospitalist care of the medically complex child. Pediatr Clin North Am. 2005;52:1165–1187, x.
  4. Bakewell‐Sachs S, Porth S. Discharge planning and home care of the technology‐dependent infant. J Obstet Gynecol Neonatal Nurs. 1995;24:77–83.
  5. Proctor EK, Morrow‐Howell N, Kitchen A, Wang YT. Pediatric discharge planning: complications, efficiency, and adequacy. Soc Work Health Care. 1995;22:1–18.
  6. Samal L, Dykes PC, Greenberg J, et al. The current capabilities of health information technology to support care transitions. AMIA Annu Symp Proc. 2013;2013:1231.
  7. Walsh C, Siegler EL, Cheston E, et al. Provider‐to‐provider electronic communication in the era of meaningful use: a review of the evidence. J Hosp Med. 2013;8:589–597.
  8. Kripalani S, Jackson AT, Schnipper JL, Coleman EA. Promoting effective transitions of care at hospital discharge: a review of key issues for hospitalists. J Hosp Med. 2007;2:314–323.
  9. Kyriacou DN, Ricketts V, Dyne PL, McCollough MD, Talan DA. A 5‐year time study analysis of emergency department patient care efficiency. Ann Emerg Med. 1999;34:326–335.
  10. Horwitz LI, Moriarty JP, Chen C, et al. Quality of discharge practices and patient understanding at an academic medical center. JAMA Intern Med. 2013;173(18):1715–1722.
  11. Tsilimingras D, Bates DW. Addressing postdischarge adverse events: a neglected area. Jt Comm J Qual Patient Saf. 2008;34:85–97.
Issue
Journal of Hospital Medicine - 9(8)
Page Number
533-539
Display Headline
Development of a discharge readiness report within the electronic health record—A discharge planning tool
Article Source
© 2014 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Amy Tyler, MD, 13123 East 16th Avenue, Mail Stop 302, Anschutz Medical Campus, Aurora, CO 80045; Telephone: 720‐777‐2794; Fax: 720‐777‐7873; E‐mail: amy.tyler@childrenscolorado.org