The effect of simulation-based mastery learning on thoracentesis referral patterns

Diane B. Wayne, MD
Department of Medicine, Northwestern University, Chicago, Illinois
dwayne@northwestern.edu

Internal medicine (IM) residents and hospitalist physicians commonly perform bedside thoracenteses for both diagnostic and therapeutic purposes.[1] The American Board of Internal Medicine requires only that certification candidates understand the indications, complications, and management of thoracenteses.[2] A disconnect between clinical practice patterns and board requirements may increase patient risk because poorly trained physicians are more likely to cause complications.[3] National practice patterns show that many thoracenteses are referred to interventional radiology (IR).[4] However, research links performance of bedside procedures to reduced hospital length of stay and lower costs, without increasing risk of complications.[1, 5, 6]

Simulation‐based education offers a controlled environment where trainees improve procedural knowledge and skills without patient harm.[7] Simulation‐based mastery learning (SBML) is a rigorous form of competency‐based education that improves clinical skills and reduces iatrogenic complications and healthcare costs.[5, 6, 8] SBML also is an effective method to boost thoracentesis skills among IM residents.[9] However, there are no data to show that thoracentesis skills acquired in the simulation laboratory transfer to clinical environments and affect referral patterns.

We hypothesized that a thoracentesis SBML intervention would improve skills and increase procedural self‐confidence while reducing procedure referrals. This study aimed to (1) assess the effect of thoracentesis SBML on a cohort of IM residents' simulated skills and (2) compare traditionally trained (nonSBML‐trained) residents, SBML‐trained residents, and hospitalist physicians regarding procedure referral patterns, self‐confidence, procedure experience, and reasons for referral.

METHODS AND MATERIALS

Study Design

We surveyed physicians about thoracenteses performed on patients cared for by postgraduate year (PGY)‐2 and PGY‐3 IM residents and hospitalist physicians at Northwestern Memorial Hospital (NMH) from December 2012 to May 2015. NMH is an 896‐bed, tertiary academic medical center, located in Chicago, Illinois. A random sample of IM residents participated in a thoracentesis SBML intervention, whereas hospitalist physicians did not. We compared referral patterns, self‐confidence, procedure experience, and reasons for referral between traditionally trained residents, SBML‐trained residents, and hospitalist physicians. The Northwestern University Institutional Review Board approved this study, and all study participants provided informed consent.

At NMH, resident‐staffed services include general IM and nonintensive care subspecialty medical services. There are also 2 nonteaching floors staffed by hospitalist attending physicians without residents. Thoracenteses performed on these services can either be done at the bedside or referred to pulmonary medicine or IR. The majority of thoracenteses performed by pulmonary medicine occur at the patients' bedside, and the patients also receive a clinical consultation. IR procedures are done in the IR suite without additional clinical consultation.

Procedure

One hundred sixty residents were available for training over the study period. We randomly selected 20% of the approximately 20 PGY‐2 and PGY‐3 IM residents assigned to the NMH medicine services each month to participate in SBML thoracentesis training before their rotation. Randomly selected residents were required to undergo SBML training but were not required to participate in the study. This selection process was repeated before every rotation during the study period. This randomized wait‐list control method allowed residents to serve as controls if not initially selected for training and remain eligible for SBML training in subsequent rotations.

Intervention

The SBML intervention used a pretest/post-test design, as described elsewhere.[9] Residents completed a clinical skills pretest on a thoracentesis simulator using a previously published 26-item checklist.[9] Following the pretest, residents participated in two 1-hour training sessions including a lecture, a video, and deliberate practice on the simulator with feedback from an expert instructor. Finally, residents completed a clinical skills post-test using the same checklist within 1 week of training (but on a different day) and were required to meet or exceed an 84.3% minimum passing score (MPS). The entire training, including pre- and post-tests, took approximately 3 hours to complete, and residents received an additional 1-hour refresher training every 6 months for up to a year after the original training. We compared pre- and post-test checklist scores to evaluate skills improvement.
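The scoring logic implied by the 26-item, equally weighted checklist and the 84.3% MPS can be sketched as follows (the function names are illustrative, not from the study):

```python
# Sketch of mastery-learning checklist scoring, assuming the published
# 26-item checklist with equal item weights and the 84.3% MPS.
NUM_ITEMS = 26
MPS = 84.3  # minimum passing score, percent

def checklist_score(items_correct: int) -> float:
    """Percentage score for an equally weighted checklist (1 = done correctly)."""
    return 100.0 * items_correct / NUM_ITEMS

def meets_mps(items_correct: int) -> bool:
    """True if the trainee meets or exceeds the minimum passing score."""
    return checklist_score(items_correct) >= MPS

# 22 of 26 items (~84.6%) is the smallest integer count that passes;
# 21 of 26 (~80.8%) falls short and would trigger retraining and retest.
print(meets_mps(22), meets_mps(21))  # -> True False
```

Under this scheme a resident scoring below the MPS repeats deliberate practice and retests, which matches the retest pathway reported in the Results.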

Thoracentesis Patient Identification

The NMH electronic health record (EHR) was used to identify medical service inpatients who underwent a thoracentesis during the study period. NMH clinicians must place an EHR order for procedure kits, consults, and laboratory analysis of thoracentesis fluid. We developed a real‐time query of NMH's EHR that identified all patients with electronic orders for thoracenteses and monitored this daily.
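The daily order-based surveillance described above can be illustrated with a toy query; the actual NMH system and schema are not described in the paper, so the table, column, and order-type names below are invented for the sketch:

```python
# Illustrative sketch only: the study monitored a real-time EHR query for
# thoracentesis-related orders (procedure kits, consults, fluid analysis).
# The schema and order names here are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE orders (
    patient_id TEXT, order_type TEXT, order_date TEXT)""")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("p1", "thoracentesis kit", "2013-01-05"),
     ("p2", "pleural fluid analysis", "2013-01-05"),
     ("p3", "chest x-ray", "2013-01-05")])

# Daily query: any order suggesting a thoracentesis occurred or is planned.
rows = conn.execute(
    """SELECT DISTINCT patient_id FROM orders
       WHERE order_type IN ('thoracentesis kit', 'IR consult',
                            'pulmonary consult', 'pleural fluid analysis')
         AND order_date = ?""", ("2013-01-05",)).fetchall()
print(sorted(r[0] for r in rows))  # -> ['p1', 'p2']
```

Each flagged patient would then prompt a survey of the responsible resident or hospitalist, as described in the next section.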

Physician Surveys

After each thoracentesis, we surveyed the PGY-2 or PGY-3 resident or hospitalist caring for the patient about the procedure. A research coordinator, blinded to whether the resident had received SBML, administered the surveys face-to-face Monday through Friday during normal business hours; procedures performed from Friday evening through Sunday were surveyed on Monday. Residents were not considered SBML-trained until they met or exceeded the MPS on the simulated skills checklist at post-test. Survey questions asked physicians who performed the procedure, their procedural self-confidence, and the total number of thoracenteses performed in their career. For referred procedures, physicians were asked to rate reasons for referral, including lack of confidence, work hour restrictions (residents only), and low reimbursement rates.[10] There was also an option to add other reasons.

Measurement

The thoracentesis skills checklist documented all required steps for an evidence‐based thoracentesis. Each task received equal weight (0 = done incorrectly/not done, 1 = done correctly).[9] For physician surveys, self‐confidence about performing the procedure was rated on a scale of 0 = not confident to 100 = very confident. Reasons for referral were scored on a Likert scale 1 to 5 (1 = not at all important, 5 = very important). Other reasons for referral were categorized.

Statistical Analysis

The clinical skills pre- and post-test checklist scores were compared using the Wilcoxon matched-pairs signed-rank test. Physician survey data were compared between procedure performers using the χ² test, independent t test, analysis of variance (ANOVA), or Kruskal-Wallis test, depending on data properties. Referral reasons measured on the Likert scale were averaged, and differences between physician groups were evaluated using ANOVA. Counts of other reasons for referral were compared using the χ² test. We performed all statistical analyses using IBM SPSS Statistics version 23 (IBM Corp., Armonk, NY).
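The analysis choices map directly onto standard library routines. A minimal sketch using SciPy in place of SPSS, with toy data rather than the study's data:

```python
# Toy illustration of the study's test choices, using SciPy rather than SPSS.
from scipy import stats

# Paired, non-normal checklist scores: Wilcoxon matched-pairs signed-rank test.
pre  = [57, 62, 48, 70, 55, 60]
post = [96, 92, 96, 100, 96, 92]
w, p_wilcoxon = stats.wilcoxon(pre, post)

# Counts (bedside vs. referred) across three physician groups: chi-square test.
table = [[26, 156], [32, 113], [1, 144]]
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

# Continuous self-confidence ratings across three groups: one-way ANOVA,
# with Kruskal-Wallis as the non-parametric alternative.
g1, g2, g3 = [40, 45, 50], [65, 70, 72], [50, 55, 60]
f, p_anova = stats.f_oneway(g1, g2, g3)
h, p_kw = stats.kruskal(g1, g2, g3)

print(p_wilcoxon < 0.05, dof)  # -> True 2
```

A 3 × 2 contingency table yields 2 degrees of freedom, and with all six post-test scores above pretest the exact Wilcoxon p-value is significant even at this small toy sample size.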

RESULTS

Thoracentesis Clinical Skills

One hundred twelve (70%) residents were randomized to SBML, and all completed the protocol. Median pretest scores were 57.6% (interquartile range [IQR], 43.3–76.9), and median final post-test mastery scores were 96.2% (IQR, 96.2–100.0; P < 0.001). Twenty-three residents (21.0%) failed to meet the MPS at initial post-test but met the MPS on retest after <1 hour of additional training.

Physician Surveys

The EHR query identified 474 procedures eligible for physician surveys. One hundred twenty-two residents and 51 hospitalist physicians completed surveys for 472 procedures (99.6%): 182 procedures were performed on patients cared for by traditionally trained residents, 145 by SBML-trained residents, and 145 by hospitalist physicians. As shown in Table 1, 413 (88%) of all procedures were referred to another service. Traditionally trained residents were more likely to refer to IR compared to SBML-trained residents or hospitalist physicians. SBML-trained residents were more likely to perform bedside procedures, whereas hospitalist physicians were most likely to refer to pulmonary medicine. SBML-trained residents were most confident in their procedural skills, despite hospitalist physicians having performed more actual procedures.

Characteristics of 472 Thoracentesis Procedures Described on Surveys of Traditionally Trained Residents, SBML‐Trained Residents, and Hospitalist Physicians
Traditionally Trained Resident Surveys, n = 182 SBML‐Trained Resident Surveys, n = 145 Hospitalist Physician Surveys, n = 145 P Value
  • NOTE: Abbreviations: IQR, interquartile range; IR, interventional radiology; SBML, simulation‐based mastery learning; SD, standard deviation. *Scale of 0 = not at all confident to 100 = very confident.

Bedside procedures, no. (%) 26 (14.3%) 32 (22.1%) 1 (0.7%) <0.001
IR procedures, no. (%) 119 (65.4%) 74 (51.0%) 82 (56.6%) 0.029
Pulmonary procedures, no. (%) 37 (20.3%) 39 (26.9%) 62 (42.8%) <0.001
Procedure self‐confidence, mean (SD)* 43.6 (28.66) 68.2 (25.17) 55.7 (31.17) <0.001
Experience performing actual procedures, median (IQR) 1 (1–3) 2 (1–3.5) 10 (4–25) <0.001

Traditionally trained residents were most likely to rate low confidence as a reason for referring thoracenteses (Table 2). Hospitalist physicians were more likely to cite lack of time to perform the procedure themselves. Other reasons differed across groups. SBML-trained residents were more likely to refer because of attending preference, whereas traditionally trained residents were most likely to refer because of high-risk/technically difficult cases.

Reasons Provided for Referral of 413 Thoracentesis Procedures Between Traditionally Trained Residents, SBML‐Trained Residents, and Hospitalist Physicians
Traditionally Trained Residents, n = 156 SBML‐Trained Residents, n = 113 Hospitalist Physicians, n = 144 P Value
  • NOTE: Abbreviations: IR, interventional radiology; SBML, simulation‐based mastery learning; SD, standard deviation. *Mean score on a 5‐point Likert scale (1 = not at all important, 5 = very important). Some expected counts are less than 5; the χ² test may be invalid.

Lack of confidence to perform procedure, mean (SD)* 3.46 (1.32) 2.52 (1.45) 2.89 (1.60) <0.001
Work hour restrictions, mean (SD)* 2.05 (1.37) 1.50 (1.11) n/a 0.001
Low reimbursement, mean (SD)* 1.02 (0.12) 1.0 (0) 1.22 (0.69) <0.001
Other reasons for referral, no. (%)
Attending preference 8 (5.1%) 11 (9.7%) 3 (2.1%) 0.025
Don't know how 6 (3.8%) 0 0 0.007
Failed bedside 0 2 (1.8%) 0 0.07
High risk/technically difficult case 24 (15.4%) 12 (10.6%) 5 (3.5%) 0.003
IR or pulmonary patient 5 (3.2%) 2 (1.8%) 4 (2.8%) 0.77
Other IR procedure taking place 11 (7.1%) 9 (8.0%) 4 (2.8%) 0.13
Patient preference 2 (1.3%) 7 (6.2%) 2 (1.4%) 0.024
Time 9 (5.8%) 7 (6.2%) 29 (20.1%) <0.001

DISCUSSION

This study confirms earlier research showing that thoracentesis SBML improves residents' clinical skills, but it is the first to use a randomized study design.[9] Use of the mastery model in health professions education ensures that all learners are competent to provide patient care, including performing invasive procedures. Such rigorous education yields downstream translational outcomes, including safety profiles comparable to those of experts.[1, 6]

This study also shows that SBML-trained residents displayed higher self-confidence and performed significantly more bedside procedures than traditionally trained residents and more experienced hospitalist physicians. Although the Society of Hospital Medicine considers thoracentesis skills a core competency for hospitalist physicians,[11] we speculate that some hospitalist physicians had not performed a thoracentesis in years. A recent national survey showed that only 44% of hospitalist physicians had performed at least 1 thoracentesis within the past year.[10] Research also documents a cultural shift toward referring procedures to specialty services such as IR, with referrals increasing by over 900% in the past 2 decades.[4] Our results provide novel information about procedure referrals because we show that SBML yields translational outcomes by improving skills and self-confidence that influence referral patterns. SBML-trained residents performed almost a quarter of procedures at the bedside. Although this represents only an 8% absolute difference in bedside procedures compared to traditionally trained residents, training a large number of residents with SBML would shift a meaningful number of procedures to the patient bedside. According to University HealthSystem Consortium data, approximately 35,325 thoracenteses are performed yearly in US teaching hospitals.[1] Shifting even 8% of these procedures to the bedside would result in significant clinical benefit and cost savings. Reduced referrals mean more bedside procedures, which are safe, cost-effective, and highly satisfying to patients.[1, 12, 13] Further study is required to determine the impact on referral patterns after providing SBML training to attending physicians.
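The scale of the projected shift follows from simple arithmetic on the figures above:

```python
# Back-of-the-envelope arithmetic for the bedside shift described above.
annual_thoracenteses = 35_325  # yearly total in US teaching hospitals, per ref [1]
absolute_shift = 0.08          # 8% absolute increase in bedside procedures

moved_to_bedside = round(annual_thoracenteses * absolute_shift)
print(moved_to_bedside)  # -> 2826
```

That is, an 8% absolute shift corresponds to roughly 2,800 additional bedside procedures per year nationally.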

Our study also provides information about the rationale for procedure referrals. Earlier work speculates that financial incentives, training, and time may explain high procedure referral rates.[10] One report on IM residents noted an 87% IR referral rate for thoracentesis and confirmed that both training and time were major reasons.[14] Hospitalist physicians reported lack of time as the major factor leading to procedural referrals, which is problematic because bedside procedures yield similar clinical outcomes at lower costs.[1, 12] Attending preference also prevented 11 potential bedside procedures in the SBML-trained group. Schedule adjustments and SBML training of hospitalist physicians should be considered, because bundled payments under the Affordable Care Act may favor shifting to the higher-value approach of bedside thoracenteses.[15]

Our study has several limitations. First, we performed surveys at only 1 institution, and the results may not be generalizable. Second, we relied on an electronic query to alert us to thoracenteses; the query may have missed procedures that were unsuccessful or lacked EHR orders. Third, physicians may have been surveyed more than once for the same or different patients, and their opinions may have shifted over time. Fourth, some reasons for referral, such as lack of time, had to be written in as free-text responses and were not asked explicitly, which could have resulted in under-reporting. Finally, we did not assess the clinical outcomes of thoracenteses in this study, although earlier work shows that residents who complete SBML have safety outcomes similar to those of IR.[1, 6]

In summary, IM residents who complete thoracentesis SBML demonstrate improved clinical skills and are more likely to perform bedside procedures. In an era of bundled payments, rethinking current care models to promote cost‐effective care is necessary. We believe providing additional education, training, and support to hospitalist physicians to promote bedside procedures is a promising strategy that warrants further study.

Acknowledgements

The authors acknowledge Drs. Douglas Vaughan and Kevin O'Leary for their support and encouragement of this work. The authors also thank the internal medicine residents at Northwestern for their dedication to patient care.

Disclosures: This project was supported by grant R18HS021202‐01 from the Agency for Healthcare Research and Quality (AHRQ). AHRQ had no role in the preparation, review, or approval of the manuscript. Trial Registration: ClinicalTrials.gov NCT01898247 (https://clinicaltrials.gov/ct2/show/NCT01898247?term=thoracentesis+and+simulation&rank=1). The authors report no conflicts of interest.

References
  1. Kozmic SE, Wayne DB, Feinglass J, Hohmann SF, Barsuk JH. Thoracentesis procedures at university hospitals: comparing outcomes by specialty. Jt Comm J Qual Patient Saf. 2015;42(1):34–40.
  2. American Board of Internal Medicine. Internal medicine policies. Available at: http://www.abim.org/certification/policies/internal‐medicine‐subspecialty‐policies/internal‐medicine.aspx. Accessed March 9, 2016.
  3. Gordon CE, Feller‐Kopman D, Balk EM, Smetana GW. Pneumothorax following thoracentesis: a systematic review and meta‐analysis. Arch Intern Med. 2010;170(4):332–339.
  4. Duszak R, Chatterjee AR, Schneider DA. National fluid shifts: fifteen‐year trends in paracentesis and thoracentesis procedures. J Am Coll Radiol. 2010;7(11):859–864.
  5. Barsuk JH, Cohen ER, Feinglass J, et al. Cost savings of performing paracentesis procedures at the bedside after simulation‐based education. Simul Healthc. 2014;9(5):312–318.
  6. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Clinical outcomes after bedside and interventional radiology paracentesis procedures. Am J Med. 2013;126(4):349–356.
  7. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282(9):861–866.
  8. Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter‐related bloodstream infection after simulation‐based education for residents in a medical intensive care unit. Simul Healthc. 2010;5(2):98–102.
  9. Wayne DB, Barsuk JH, O'Leary KJ, Fudala MJ, McGaghie WC. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3(1):48–54.
  10. Thakkar R, Wright SM, Alguire P, Wigton RS, Boonyasai RT. Procedures performed by hospitalist and non‐hospitalist general internists. J Gen Intern Med. 2010;25(5):448–452.
  11. Dressler DD, Pistoria MJ, Budnitz TL, McKean SC, Amin AN. Core competencies in hospital medicine: development and methodology. J Hosp Med. 2006;1(suppl 1):48–56.
  12. Barsuk JH, Feinglass J, Kozmic SE, Hohmann SF, Ganger D, Wayne DB. Specialties performing paracentesis procedures at university hospitals: implications for training and certification. J Hosp Med. 2014;9(3):162–168.
  13. Barsuk JH, Kozmic SE, Scher J, Feinglass J, Hoyer A, Wayne DB. Are we providing patient‐centered care? Preferences about paracentesis and thoracentesis procedures. Patient Exp J. 2014;1(2):94–103. Available at: http://pxjournal.org/cgi/viewcontent.cgi?article=1024
Journal of Hospital Medicine. 2016;11(11):792-795.


Acknowledgements

The authors acknowledge Drs. Douglas Vaughan and Kevin O'Leary for their support and encouragement of this work. The authors also thank the internal medicine residents at Northwestern for their dedication to patient care.

Disclosures: This project was supported by grant R18HS021202‐01 from the Agency for Healthcare Research and Quality (AHRQ). AHRQ had no role in the preparation, review, or approval of the manuscript. Trial Registration: ClinicalTrials.gov NCT01898247 (https://clinicaltrials.gov/ct2/show/NCT01898247?term=thoracentesis+and+simulation& rank=1). The authors report no conflicts of interest.

Internal medicine (IM) residents and hospitalist physicians commonly conduct bedside thoracenteses for both diagnostic and therapeutic purposes.[1] The American Board of Internal Medicine only requires that certification candidates understand the indications, complications, and management of thoracenteses.[2] A disconnect between clinical practice patterns and board requirements may increase patient risk because poorly trained physicians are more likely to cause complications.[3] National practice patterns show that many thoracenteses are referred to interventional radiology (IR).[4] However, research links performance of bedside procedures to reduced hospital length of stay and lower costs, without increasing risk of complications.[1, 5, 6]

Simulation‐based education offers a controlled environment where trainees improve procedural knowledge and skills without patient harm.[7] Simulation‐based mastery learning (SBML) is a rigorous form of competency‐based education that improves clinical skills and reduces iatrogenic complications and healthcare costs.[5, 6, 8] SBML also is an effective method to boost thoracentesis skills among IM residents.[9] However, there are no data to show that thoracentesis skills acquired in the simulation laboratory transfer to clinical environments and affect referral patterns.

We hypothesized that a thoracentesis SBML intervention would improve skills and increase procedural self‐confidence while reducing procedure referrals. This study aimed to (1) assess the effect of thoracentesis SBML on a cohort of IM residents' simulated skills and (2) compare traditionally trained (nonSBML‐trained) residents, SBML‐trained residents, and hospitalist physicians regarding procedure referral patterns, self‐confidence, procedure experience, and reasons for referral.

METHODS AND MATERIALS

Study Design

We surveyed physicians about thoracenteses performed on patients cared for by postgraduate year (PGY)‐2 and PGY‐3 IM residents and hospitalist physicians at Northwestern Memorial Hospital (NMH) from December 2012 to May 2015. NMH is an 896‐bed, tertiary academic medical center, located in Chicago, Illinois. A random sample of IM residents participated in a thoracentesis SBML intervention, whereas hospitalist physicians did not. We compared referral patterns, self‐confidence, procedure experience, and reasons for referral between traditionally trained residents, SBML‐trained residents, and hospitalist physicians. The Northwestern University Institutional Review Board approved this study, and all study participants provided informed consent.

At NMH, resident‐staffed services include general IM and nonintensive care subspecialty medical services. There are also 2 nonteaching floors staffed by hospitalist attending physicians without residents. Thoracenteses performed on these services can either be done at the bedside or referred to pulmonary medicine or IR. The majority of thoracenteses performed by pulmonary medicine occur at the patients' bedside, and the patients also receive a clinical consultation. IR procedures are done in the IR suite without additional clinical consultation.

Procedure

One hundred sixty residents were available for training over the study period. We randomly selected 20% of the approximately 20 PGY‐2 and PGY‐3 IM residents assigned to the NMH medicine services each month to participate in SBML thoracentesis training before their rotation. Randomly selected residents were required to undergo SBML training but were not required to participate in the study. This selection process was repeated before every rotation during the study period. This randomized wait‐list control method allowed residents to serve as controls if not initially selected for training and remain eligible for SBML training in subsequent rotations.

Intervention

The SBML intervention used a pretest/post‐test design, as described elsewhere.[9] Residents completed a clinical skills pretest on a thoracentesis simulator using a previously published 26‐item checklist.[9] Following the pretest, residents participated in two 1‐hour training sessions including a lecture, a video, and deliberate practice on the simulator with feedback from an expert instructor. Finally, residents completed a clinical skills post‐test using the checklist within 1 week of training (but on a different day) and were required to meet or exceed an 84.3% minimum passing score (MPS). The entire training, including pre‐ and post‐tests, took approximately 3 hours to complete, and residents were given an additional 1‐hour refresher training every 6 months for up to a year after original training. We compared pre‐ and post‐test checklist scores to evaluate skills improvement.
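
The mastery standard is a simple threshold on the checklist percent score. The sketch below uses the item count and MPS reported in the text; the function and variable names are ours for illustration:

```python
import math

CHECKLIST_ITEMS = 26   # items on the published thoracentesis skills checklist
MPS = 0.843            # minimum passing score reported in the study

def passes_mastery(items_correct: int) -> bool:
    """Each checklist item scores 1 if done correctly, 0 otherwise;
    a learner passes when the percent score meets or exceeds the MPS."""
    return items_correct / CHECKLIST_ITEMS >= MPS

# Smallest whole number of correct items that satisfies the MPS:
min_items = math.ceil(MPS * CHECKLIST_ITEMS)  # 22 of 26 items
```

Residents who scored below this threshold at post‐test repeated deliberate practice and retested until they met it.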

Thoracentesis Patient Identification

The NMH electronic health record (EHR) was used to identify medical service inpatients who underwent a thoracentesis during the study period. NMH clinicians must place an EHR order for procedure kits, consults, and laboratory analysis of thoracentesis fluid. We developed a real‐time query of NMH's EHR that identified all patients with electronic orders for thoracenteses and monitored this daily.

Physician Surveys

After each thoracentesis, we surveyed the PGY‐2 or PGY‐3 resident or hospitalist caring for the patient about the procedure. A research coordinator, blind to whether the resident received SBML, performed the surveys face‐to‐face on Monday to Friday during normal business hours. Residents were not considered SBML‐trained until they met or exceeded the MPS on the simulated skills checklist at post‐test. Surveys occurred on Monday for procedures performed on Friday evening through Sunday. Survey questions asked physicians about who performed the procedure, their procedural self‐confidence, and total number of thoracenteses performed in their career. For referred procedures, physicians were asked about reasons for referral including lack of confidence, work hour restrictions (residents only), and low reimbursement rates.[10] There was also an option to add other reasons.

Measurement

The thoracentesis skills checklist documented all required steps for an evidence‐based thoracentesis. Each task received equal weight (0 = done incorrectly/not done, 1 = done correctly).[9] For physician surveys, self‐confidence about performing the procedure was rated on a scale of 0 = not confident to 100 = very confident. Reasons for referral were scored on a Likert scale 1 to 5 (1 = not at all important, 5 = very important). Other reasons for referral were categorized.

Statistical Analysis

The clinical skills pre‐ and post‐test checklist scores were compared using the Wilcoxon matched‐pairs signed‐rank test. Physician survey data were compared between procedure performers using the χ² test, independent t test, analysis of variance (ANOVA), or Kruskal‐Wallis test, depending on data properties. Reasons for referral measured on the Likert scale were averaged, and differences between physician groups were evaluated using ANOVA. Counts of other reasons for referral were compared using the χ² test. We performed all statistical analyses using IBM SPSS Statistics version 23 (IBM Corp., Armonk, NY).
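
To illustrate, the same battery of tests can be run with SciPy; the paired scores and group samples below are synthetic stand‐ins (the study itself used SPSS), while the 3×3 contingency table uses the referral counts from Table 1:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Paired pre/post checklist scores (synthetic): Wilcoxon matched-pairs test
pre = rng.uniform(40, 80, size=15)
post = np.clip(pre + rng.uniform(10, 40, size=15), None, 100.0)
w_stat, w_p = stats.wilcoxon(pre, post)

# Referral destinations by physician group (counts from Table 1): chi-squared test
table = np.array([[26, 119, 37],   # traditionally trained residents
                  [32, 74, 39],    # SBML-trained residents
                  [1, 82, 62]])    # hospitalist physicians
chi2, chi_p, dof, expected = stats.chi2_contingency(table)

# Self-confidence across the 3 groups (synthetic): ANOVA, or Kruskal-Wallis if skewed
g1 = rng.normal(44, 28, 50)
g2 = rng.normal(68, 25, 50)
g3 = rng.normal(56, 31, 50)
f_stat, anova_p = stats.f_oneway(g1, g2, g3)
k_stat, kw_p = stats.kruskal(g1, g2, g3)
```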

RESULTS

Thoracentesis Clinical Skills

One hundred twelve (70%) residents were randomized to SBML, and all completed the protocol. Median pretest scores were 57.6% (interquartile range [IQR], 43.3-76.9), and final post‐test mastery scores were 96.2% (IQR, 96.2-100.0; P < 0.001). Twenty‐three residents (21.0%) failed to meet the MPS at initial post‐test, but met the MPS on retest after <1 hour of additional training.

Physician Surveys

The EHR query identified 474 procedures eligible for physician surveys. One hundred twenty‐two residents and 51 hospitalist physicians completed surveys for 472 procedures (99.6%): 182 performed on patients cared for by traditionally trained residents, 145 by SBML‐trained residents, and 145 by hospitalist physicians. As shown in Table 1, 413 (88%) of all procedures were referred to another service. Traditionally trained residents were more likely to refer to IR than SBML‐trained residents or hospitalist physicians. SBML‐trained residents were more likely to perform bedside procedures, whereas hospitalist physicians were most likely to refer to pulmonary medicine. SBML‐trained residents were the most confident in their procedural skills, despite hospitalist physicians having performed more actual procedures.

Table 1. Characteristics of 472 Thoracentesis Procedures Described on Surveys of Traditionally Trained Residents, SBML‐Trained Residents, and Hospitalist Physicians

 | Traditionally Trained Resident Surveys, n = 182 | SBML‐Trained Resident Surveys, n = 145 | Hospitalist Physician Surveys, n = 145 | P Value
Bedside procedures, no. (%) | 26 (14.3%) | 32 (22.1%) | 1 (0.7%) | <0.001
IR procedures, no. (%) | 119 (65.4%) | 74 (51.0%) | 82 (56.6%) | 0.029
Pulmonary procedures, no. (%) | 37 (20.3%) | 39 (26.9%) | 62 (42.8%) | <0.001
Procedure self‐confidence, mean (SD)* | 43.6 (28.66) | 68.2 (25.17) | 55.7 (31.17) | <0.001
Experience performing actual procedures, median (IQR) | 1 (1-3) | 2 (1-3.5) | 10 (4-25) | <0.001

NOTE: Abbreviations: IQR, interquartile range; IR, interventional radiology; SBML, simulation‐based mastery learning; SD, standard deviation. *Scale of 0 = not at all confident to 100 = very confident.

Traditionally trained residents were most likely to rate low confidence as a reason for referring thoracenteses (Table 2). Hospitalist physicians were more likely to cite lack of time to perform the procedure themselves. Other reasons differed across groups: SBML‐trained residents were more likely to refer because of attending preference, whereas traditionally trained residents were most likely to refer because of high‐risk or technically difficult cases.

Table 2. Reasons Provided for Referral of 413 Thoracentesis Procedures Between Traditionally Trained Residents, SBML‐Trained Residents, and Hospitalist Physicians

 | Traditionally Trained Residents, n = 156 | SBML‐Trained Residents, n = 113 | Hospitalist Physicians, n = 144 | P Value
Lack of confidence to perform procedure, mean (SD)* | 3.46 (1.32) | 2.52 (1.45) | 2.89 (1.60) | <0.001
Work hour restrictions, mean (SD)* | 2.05 (1.37) | 1.50 (1.11) | n/a | 0.001
Low reimbursement, mean (SD)* | 1.02 (0.12) | 1.0 (0) | 1.22 (0.69) | <0.001
Other reasons for referral, no. (%)
Attending preference | 8 (5.1%) | 11 (9.7%) | 3 (2.1%) | 0.025
Don't know how | 6 (3.8%) | 0 | 0 | 0.007
Failed bedside | 0 | 2 (1.8%) | 0 | 0.07
High risk/technically difficult case | 24 (15.4%) | 12 (10.6%) | 5 (3.5%) | 0.003
IR or pulmonary patient | 5 (3.2%) | 2 (1.8%) | 4 (2.8%) | 0.77
Other IR procedure taking place | 11 (7.1%) | 9 (8.0%) | 4 (2.8%) | 0.13
Patient preference | 2 (1.3%) | 7 (6.2%) | 2 (3.5%) | 0.024
Time | 9 (5.8%) | 7 (6.2%) | 29 (20.1%) | <0.001

NOTE: Abbreviations: IR, interventional radiology; SBML, simulation‐based mastery learning; SD, standard deviation. *Mean score on a 5‐point Likert scale (1 = not at all important, 5 = very important). Some expected counts are less than 5; the χ² test may be invalid.

DISCUSSION

This study confirms earlier research showing that thoracentesis SBML improves residents' clinical skills, but is the first to use a randomized study design.[9] Use of the mastery model in health professions education ensures that all learners are competent to provide patient care including performing invasive procedures. Such rigorous education yields downstream translational outcomes including safety profiles comparable to experts.[1, 6]

This study also shows that SBML‐trained residents displayed higher self‐confidence and performed significantly more bedside procedures than traditionally trained residents and more experienced hospitalist physicians. Although the Society of Hospital Medicine considers thoracentesis skills a core competency for hospitalist physicians,[11] we speculate that some hospitalist physicians had not performed a thoracentesis in years. A recent national survey showed that only 44% of hospitalist physicians performed at least 1 thoracentesis within the past year.[10] Research also shows a cultural shift in medicine toward referring procedures to specialty services such as IR, with referrals rising by over 900% in the past 2 decades.[4] Our results provide novel information about procedure referrals because we show that SBML yields translational outcomes by improving skills and self‐confidence in ways that influence referral patterns. SBML‐trained residents performed almost a quarter of procedures at the bedside. Although this represents only an 8% absolute difference in bedside procedures compared to traditionally trained residents, if a large number of residents are trained using SBML, this results in a meaningful number of procedures shifted to the patient bedside. According to University HealthSystem Consortium data, approximately 35,325 thoracenteses are performed yearly in US teaching hospitals.[1] Shifting even 8% of these procedures to the bedside would result in significant clinical benefit and cost savings, because bedside procedures are safe, cost‐effective, and highly satisfying to patients.[1, 12, 13] Further study is required to determine the impact on referral patterns of providing SBML training to attending physicians.
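
The projected shift is simple arithmetic, sketched here with the figures quoted above:

```python
ANNUAL_THORACENTESES = 35_325  # yearly procedures at US teaching hospitals (UHC data)
ABSOLUTE_DIFFERENCE = 0.08     # bedside rate: 22.1% (SBML-trained) - 14.3% (traditional), rounded

procedures_shifted = ANNUAL_THORACENTESES * ABSOLUTE_DIFFERENCE
print(round(procedures_shifted))  # about 2,826 procedures per year moved to the bedside
```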

Our study also provides information about the rationale for procedure referrals. Earlier work speculates that financial incentives, training, and time may explain high procedure referral rates.[10] One report on IM residents noted an 87% IR referral rate for thoracentesis and confirmed that both training and time were major reasons.[14] Hospitalist physicians in our study reported lack of time as the major factor leading to procedural referrals, which is problematic because bedside procedures yield similar clinical outcomes at lower costs.[1, 12] Attending preference also prevented 11 additional bedside procedures in the SBML‐trained group. Schedule adjustments and SBML training of hospitalist physicians should be considered, because bundled payments under the Affordable Care Act may favor shifting to the higher‐value approach of bedside thoracentesis.[15]

Our study has several limitations. First, we surveyed physicians at only 1 institution, and the results may not be generalizable. Second, we relied on an electronic query to alert us to thoracenteses; the query may have missed procedures that were unsuccessful or did not have EHR orders entered. Third, physicians may have been surveyed more than once for different or the same patient(s), and their opinions may have shifted over time. Fourth, some reasons for referral, such as lack of time, had to be written in as free‐text survey responses rather than being specifically asked about, which could have resulted in under‐reporting. Finally, we did not assess the clinical outcomes of thoracenteses in this study, although earlier work shows that residents who complete SBML have safety outcomes similar to IR.[1, 6]

In summary, IM residents who complete thoracentesis SBML demonstrate improved clinical skills and are more likely to perform bedside procedures. In an era of bundled payments, rethinking current care models to promote cost‐effective care is necessary. We believe providing additional education, training, and support to hospitalist physicians to promote bedside procedures is a promising strategy that warrants further study.

Acknowledgements

The authors acknowledge Drs. Douglas Vaughan and Kevin O'Leary for their support and encouragement of this work. The authors also thank the internal medicine residents at Northwestern for their dedication to patient care.

Disclosures: This project was supported by grant R18HS021202‐01 from the Agency for Healthcare Research and Quality (AHRQ). AHRQ had no role in the preparation, review, or approval of the manuscript. Trial Registration: ClinicalTrials.gov NCT01898247 (https://clinicaltrials.gov/ct2/show/NCT01898247?term=thoracentesis+and+simulation& rank=1). The authors report no conflicts of interest.

References
  1. Kozmic SE, Wayne DB, Feinglass J, Hohmann SF, Barsuk JH. Thoracentesis procedures at university hospitals: comparing outcomes by specialty. Jt Comm J Qual Patient Saf. 2015;42(1):34-40.
  2. American Board of Internal Medicine. Internal medicine policies. Available at: http://www.abim.org/certification/policies/internal‐medicine‐subspecialty‐policies/internal‐medicine.aspx. Accessed March 9, 2016.
  3. Gordon CE, Feller‐Kopman D, Balk EM, Smetana GW. Pneumothorax following thoracentesis: a systematic review and meta‐analysis. Arch Intern Med. 2010;170(4):332-339.
  4. Duszak R, Chatterjee AR, Schneider DA. National fluid shifts: fifteen‐year trends in paracentesis and thoracentesis procedures. J Am Coll Radiol. 2010;7(11):859-864.
  5. Barsuk JH, Cohen ER, Feinglass J, et al. Cost savings of performing paracentesis procedures at the bedside after simulation‐based education. Simul Healthc. 2014;9(5):312-318.
  6. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Clinical outcomes after bedside and interventional radiology paracentesis procedures. Am J Med. 2013;126(4):349-356.
  7. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282(9):861-866.
  8. Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter‐related bloodstream infection after simulation‐based education for residents in a medical intensive care unit. Simul Healthc. 2010;5(2):98-102.
  9. Wayne DB, Barsuk JH, O'Leary KJ, Fudala MJ, McGaghie WC. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3(1):48-54.
  10. Thakkar R, Wright SM, Alguire P, Wigton RS, Boonyasai RT. Procedures performed by hospitalist and non‐hospitalist general internists. J Gen Intern Med. 2010;25(5):448-452.
  11. Dressler DD, Pistoria MJ, Budnitz TL, McKean SC, Amin AN. Core competencies in hospital medicine: development and methodology. J Hosp Med. 2006;1(suppl 1):48-56.
  12. Barsuk JH, Feinglass J, Kozmic SE, Hohmann SF, Ganger D, Wayne DB. Specialties performing paracentesis procedures at university hospitals: implications for training and certification. J Hosp Med. 2014;9(3):162-168.
  13. Barsuk JH, Kozmic SE, Scher J, Feinglass J, Hoyer A, Wayne DB. Are we providing patient‐centered care? Preferences about paracentesis and thoracentesis procedures. Patient Exp J. 2014;1(2):94-103. Available at: http://pxjournal.org/cgi/viewcontent.cgi?article=1024
Issue
Journal of Hospital Medicine - 11(11)
Page Number
792-795
Display Headline
The effect of simulation‐based mastery learning on thoracentesis referral patterns
Article Source
© 2016 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Jeffrey H. Barsuk, MD, Division of Hospital Medicine; 211 E. Ontario Street, Suite 717, Chicago, IL 60611; Telephone: 312‐926‐3680; Fax: 312‐926‐4588; E‐mail: jbarsuk@nm.org
Specialties Performing Paracentesis

Display Headline
Specialties performing paracentesis procedures at university hospitals: Implications for training and certification

Cirrhosis affects up to 3% of the population and is 1 of the 10 most common causes of death in the United States.[1, 2, 3, 4] Paracentesis procedures are frequently performed in patients with liver disease and ascites for diagnostic and/or therapeutic purposes. These procedures can be performed safely by trained clinicians at the bedside or referred to interventional radiology (IR).[2, 3, 4]

National practice patterns show that paracentesis procedures are increasingly referred to IR rather than performed at the bedside by internal medicine or gastroenterology clinicians.[5, 6, 7] In fact, a recent study of Medicare beneficiaries showed that inpatient and outpatient paracentesis procedures performed by radiologists increased by 964% from 1993 to 2008.[7] Reasons for the decline in bedside procedures include the increased availability of IR, lack of sufficient reimbursement, and the time required to perform paracentesis procedures.[5, 6, 7, 8] Surveys of internal medicine and family medicine residents and gastroenterology fellows show trainees often lack the confidence and experience needed to perform the procedure safely.[9, 10, 11] Additionally, many clinicians do not have expertise with ultrasound use and may not have access to necessary equipment.

Inconsistent certification requirements may also impact the competence and experience of physicians to perform paracentesis procedures. Internal medicine residents are no longer required by the American Board of Internal Medicine (ABIM) to demonstrate competency in procedures such as paracentesis for certification.[12] However, the Accreditation Council for Graduate Medical Education (ACGME) requirements state that internal medicine programs must offer residents the opportunity to demonstrate competence in the performance of procedures such as paracentesis, thoracentesis, and central venous catheter insertion.[13] The American Board of Family Medicine (ABFM) does not outline specific procedural competence for initial certification.[14] The ACGME states that family medicine residents must receive training to perform those clinical procedures required for their future practices but allows each program to determine which procedures to require.[15] Due to this uncertainty, practicing hospitalists are likely to have variable training and competence in bedside procedures such as paracentesis.

We previously showed that internal medicine residents rotating on the hepatology service of an academic medical center performed 59% of paracentesis procedures at the bedside.[16] These findings contrast with national data showing that 74% of paracentesis procedures performed on Medicare beneficiaries were done by radiologists.[7] However, that study was limited to Medicare beneficiaries and included ambulatory patients, so it may not reflect practice patterns at university hospitals.[7] In addition to uncertainty about who performs this procedure in inpatient settings, little is known about the effect of specialty on postparacentesis clinical outcomes.[16, 17]

The current study had 3 aims: (1) evaluate which clinical specialties perform paracentesis procedures at university hospitals; (2) model patient characteristics associated with procedures performed at the bedside versus those referred to IR; and (3) among patients with a similar likelihood of IR referral, evaluate length of stay (LOS) and hospital costs of patients undergoing procedures performed by different specialties.

METHODS

We performed an observational administrative database review of patients who underwent paracentesis procedures in hospitals participating in the University HealthSystem Consortium (UHC) Clinical Database from January 2010 through December 2012. UHC is an alliance of 120 nonprofit academic medical centers and their 290 affiliated hospitals. UHC maintains databases containing clinical, operational, financial, and patient safety data from affiliated hospitals. Using the UHC database, we described the characteristics of all patients who underwent paracentesis procedures by clinical specialty performing the procedure. We then modeled the effects of patient characteristics on decision‐making about IR referral. Finally, among patients with a homogeneous predicted probability of IR referral, we compared LOS and direct costs by specialty performing the procedure. The Northwestern University institutional review board approved this study.

Procedure

We queried the UHC database for all patients over the age of 18 years who underwent paracentesis procedures (International Classification of Diseases, Ninth Revision [ICD‐9] procedure code 54.91) and had at least 1 diagnosis code of liver disease (571.x). We excluded patients admitted to obstetrics. The query included patient and clinical characteristics such as admission, discharge, and procedure dates; age; gender; procedure provider specialty; and intensive care unit (ICU) stay. We also obtained all ICD‐9 codes associated with the admission, including obesity, severe liver disease, coagulation disorders, blood loss anemia, hyponatremia, hypotension, thrombocytopenia, liver transplant before or during the admission, awaiting liver transplant, and complications of liver transplant. We used ICD‐9 codes to calculate patients' Charlson scores[18, 19] to assess severity of illness on admission.
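
As an illustration of how diagnosis codes roll up into a severity score, the sketch below applies Charlson‐style weights keyed by ICD‐9 code prefix. The reduced mapping and the prefixes shown are ours for demonstration only; the study used the full validated index[18, 19]:

```python
# Illustrative subset of Charlson comorbidity weights, keyed by ICD-9 code prefix.
# The real index covers many more categories and uses validated code lists.
CHARLSON_WEIGHTS = {
    "410": 1,  # myocardial infarction
    "428": 1,  # congestive heart failure
    "571": 3,  # moderate/severe liver disease (simplified here)
    "585": 2,  # renal disease
    "197": 6,  # metastatic solid tumor
}

def charlson_score(icd9_codes):
    """Count each comorbidity category at most once and sum its weight."""
    score, seen = 0, set()
    for code in icd9_codes:
        for prefix, weight in CHARLSON_WEIGHTS.items():
            if code.startswith(prefix) and prefix not in seen:
                seen.add(prefix)
                score += weight
    return score
```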

LOS and total direct hospital costs were compared among patients with a paracentesis performed by a single clinical group and among patients with a similar predicted probability of IR referral. UHC generates direct cost estimates by applying Medicare Cost Report ratios of cost to charges with the labor cost further adjusted by the respective area wage index. Hospital costs were not available from 8.3% of UHC hospitals. We therefore based cost estimates on nonmissing data.
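
The cost derivation described above can be sketched as follows. The function name and example inputs are hypothetical, and UHC's exact formula is not published in this article; the sketch only shows the stated idea of applying a cost‐to‐charge ratio and wage‐adjusting the labor portion:

```python
def estimate_direct_cost(charges, cost_to_charge_ratio, labor_share, area_wage_index):
    """Apply a Medicare cost-to-charge ratio to billed charges, then scale
    only the labor portion of the resulting cost by the local area wage index."""
    base_cost = charges * cost_to_charge_ratio
    labor = base_cost * labor_share * area_wage_index
    nonlabor = base_cost * (1 - labor_share)
    return labor + nonlabor

# Hypothetical example: $10,000 in charges, 0.30 ratio, 60% labor, wage index 1.10
cost = estimate_direct_cost(10_000, 0.30, 0.60, 1.10)
```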

Paracentesis provider specialties were divided into 6 general categories: (1) IR (interventional and diagnostic radiology); (2) medicine (family medicine, general medicine, and hospital medicine); (3) subspecialty medicine (infectious disease, cardiology, nephrology, hematology/oncology, endocrinology, pulmonary, and geriatrics); (4) gastroenterology/hepatology (gastroenterology, hepatology, and transplant medicine); (5) general surgery (general surgery and transplant surgery); and (6) all other (included unclassified specialties). We present patient characteristics categorized by these specialty groups and for admissions in which multiple specialties performed procedures.

Study Design

To analyze an individual patient's likelihood of IR referral, we needed to restrict our sample to discharges where only 1 clinical specialty performed a paracentesis. Therefore, we excluded hybrid discharges with procedures performed by more than 1 specialty in a single admission as well as discharges with procedures performed by all other specialties. To compare LOS and direct cost outcomes, and to minimize selection bias among exclusively IR‐treated patients, we excluded hospitals without procedures done by both IR and medicine.

We modeled referral to IR as a function of patients' demographic and clinical variables, which we believed would affect the probability of referral. We then examined the IR referral model predicted probabilities (propensity score).[20] Finally, we examined mean differences in LOS and direct costs among discharges with a single clinical specialty group, while using the predicted probability of referral as a filter to compare these outcomes by specialty. We further tested specialty differences in LOS and direct costs controlling for demographic and clinical variables.

Statistical Analysis

To test the significance of differences between demographic and clinical characteristics of patients across specialties, we used chi-square tests for categorical variables and analysis of variance or the Kruskal‐Wallis rank test for continuous variables. Random effects logistic regression, which adjusts standard errors for clustering by hospital, was used to model the likelihood of referral to IR. Independent variables included patient age, gender, obesity, coagulation disorders, blood loss anemia, hyponatremia, hypotension, thrombocytopenia, liver transplant before hospitalization, liver transplant during hospitalization, awaiting transplant, complications of liver transplant, ICU stay, Charlson score, and number of paracentesis procedures performed during the admission. Predicted probabilities derived from this IR referral model were used to investigate selection bias in our subsequent analyses of LOS and costs.[20]
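For illustration, a logistic model's predicted probability is the inverse logit of a linear predictor whose coefficients are the logs of the odds ratios. The sketch below uses a handful of ORs reported in Table 3; the intercept is an assumption, since the paper does not report one, and the hospital-level random intercept is omitted.

```python
import math

# Log-odds coefficients are ln(odds ratio); ORs taken from Table 3.
COEFS = {
    "male": math.log(0.89),
    "obesity_bmi_40_plus": math.log(1.25),
    "icu_care": math.log(0.39),
    "coagulation_disorders": math.log(0.68),
}
INTERCEPT = -0.2  # assumed baseline log-odds; not reported in the paper

def predicted_probability(patient):
    """Inverse logit of the linear predictor for a dict of 0/1 indicators."""
    z = INTERCEPT + sum(coef * patient.get(name, 0) for name, coef in COEFS.items())
    return 1.0 / (1.0 + math.exp(-z))

with_icu = predicted_probability({"male": 1, "icu_care": 1})
without_icu = predicted_probability({"male": 1})
```

Because the ICU coefficient is strongly negative (OR 0.39), an ICU stay lowers the predicted probability of IR referral, mirroring the post hoc finding reported in the Results.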

We used random effects multiple linear regression to test the association of procedure specialty with hospital LOS and total direct costs, controlling for the same independent variables listed above. Analyses were conducted using both actual LOS in days and Medicare costs. We also performed a log transformation of LOS and costs to account for rightward skew. Because the two sets of results were virtually identical, we present only the untransformed LOS and cost results. We used SAS version 9 (SAS Institute Inc., Cary, NC) to extract data from the UHC Clinical Database. We performed all statistical analyses using Stata version 12 (StataCorp LP, College Station, TX).
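Why log-transform? For right-skewed outcomes such as LOS, a few very long stays dominate the arithmetic mean; analyzing logs (equivalently, the geometric mean) damps that influence. A small illustration with made-up data:

```python
import math
import statistics

# Illustrative right-skewed LOS sample: mostly short stays plus one outlier.
los_days = [2, 3, 3, 4, 4, 5, 5, 6, 7, 30]

arithmetic_mean = statistics.mean(los_days)                   # pulled up by the outlier
mean_log = statistics.mean(math.log(d) for d in los_days)
geometric_mean = math.exp(mean_log)                           # back-transformed "typical" stay
```

Here the arithmetic mean is 6.9 days, while the geometric mean is roughly 5 days, closer to the bulk of the stays.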

RESULTS

Procedure and Discharge Level Results

There were 97,577 paracentesis procedures performed during 70,862 hospital admissions in 204 UHC hospitals during the study period. Table 1 shows the specific specialty groups for each procedure. The all other category consisted of 17,558 procedures performed by other subspecialties, including 9,434 with unknown specialty. Twenty‐nine percent of procedures were performed by IR versus 27% by medicine, 11% by gastroenterology/hepatology, and 11% by subspecialty medicine.

Table 1. Frequencies of 97,577 Paracentesis Procedures by Specialty Within Specialty Groups

| Specialty group | No. | % |
| --- | --- | --- |
| Interventional radiology | 28,414 | 29.1 |
| Medicine | 26,031 | 26.7 |
|   Family medicine | 1,026 | 1.1 |
|   General medicine | 21,787 | 22.3 |
|   Hospitalist | 3,218 | 3.3 |
| Subspecialty medicine | 10,558 | 10.8 |
|   Infectious disease | 848 | 0.9 |
|   Nephrology | 615 | 0.6 |
|   Cardiology | 991 | 1.0 |
|   Hematology/oncology | 795 | 0.8 |
|   Endocrinology | 359 | 0.4 |
|   Pulmonology | 6,605 | 6.8 |
|   Geriatrics | 345 | 0.4 |
| Gastroenterology/hepatology | 11,143 | 11.4 |
|   Transplant medicine | 99 | 0.1 |
|   Hepatology | 874 | 0.9 |
|   Gastroenterology | 10,170 | 10.4 |
| General surgery | 3,873 | 4.0 |
|   Transplant surgery | 2,146 | 2.2 |
|   General surgery | 1,727 | 1.8 |
| All other | 17,558 | 18.0 |
|   Specialty unknown | 9,434 | 9.7 |

Table 2 presents patient characteristics for 70,862 hospital discharges with paracentesis procedures grouped by whether single or multiple specialties performed procedures. Patient characteristics were significantly different across specialty groups. Medicine, subspecialty medicine, and gastroenterology/hepatology patients were younger, more likely to be male, and more likely to have severe liver disease, coagulation disorders, hypotension, and hyponatremia than IR patients.

Table 2. Characteristics of Patients Discharged by Medical Specialty Groups Performing Paracentesis Procedures at University HealthSystem Consortium Hospitals (N=70,862 Discharges From 2010-2012 in 204 Hospitals)

| Characteristic | All Discharges (N=70,862) | Interventional Radiology (n=9,348) | Medicine (n=13,789) | Subspecialty Medicine (n=5,085) | Gastroenterology/Hepatology (n=6,664) | General Surgery (n=1,891) | All Other (n=7,912) | Multiple Specialties (n=26,173) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Age 18-49 y (%) | 25.4 | 22.5 | 27.6 | 24.9 | 23.5 | 20.8 | 25.5 | 26.1 |
| Age 50-59 y (%) | 39.8 | 39.8 | 40.9 | 39.4 | 41.5 | 40.3 | 40.0 | 38.7 |
| Age 60-69 y (%) | 24.7 | 24.9 | 21.6 | 24.7 | 26.5 | 30.0 | 23.6 | 25.8 |
| Age 70+ y (%) | 10.1 | 12.9 | 9.9 | 11.1 | 8.4 | 8.9 | 11.0 | 9.4 |
| Male (%) | 65.5 | 64.2 | 67.6 | 67.5 | 65.7 | 66.6 | 65.7 | 64.2 |
| Severe liver disease (%)(a) | 73.7 | 65.3 | 67.8 | 71.0 | 75.3 | 66.6 | 67.6 | 82.1 |
| Obesity (BMI 40+) (%) | 6.3 | 6.1 | 5.3 | 5.7 | 5.1 | 5.8 | 5.2 | 7.6 |
| Any intensive care unit stay (%) | 31.0 | 10.9 | 16.8 | 50.5 | 16.9 | 36.7 | 22.3 | 47.8 |
| Coagulation disorders (%) | 24.3 | 14.8 | 20.2 | 29.9 | 16.1 | 19.0 | 17.8 | 33.1 |
| Blood loss anemia (%) | 3.4 | 1.3 | 2.8 | 2.7 | 2.7 | 1.9 | 2.1 | 5.2 |
| Hyponatremia (%) | 29.9 | 27.1 | 29.2 | 28.9 | 28.0 | 26.6 | 27.3 | 33.1 |
| Hypotension (%) | 9.8 | 7.0 | 8.0 | 11.0 | 7.7 | 10.5 | 8.1 | 12.4 |
| Thrombocytopenia (%) | 29.6 | 24.6 | 28.3 | 32.5 | 22.1 | 21.5 | 24.0 | 35.8 |
| Complication of transplant (%) | 3.3 | 2.1 | 1.1 | 2.4 | 4.0 | 10.3 | 2.7 | 4.7 |
| Awaiting liver transplant (%) | 7.6 | 6.4 | 4.0 | 5.4 | 12.8 | 16.0 | 7.8 | 8.2 |
| Prior liver transplant (%) | 0.5 | 0.8 | 0.3 | 0.3 | 0.7 | 0.7 | 0.4 | 0.6 |
| Liver transplant procedure (%) | 2.7 | 0.0 | 0.0 | 0.3 | 0.4 | 15.6 | 1.6 | 5.6 |
| Mean Charlson score (SD) | 4.51 (2.17) | 4.28 (2.26) | 4.16 (2.17) | 4.72 (2.30) | 4.30 (1.98) | 4.26 (2.22) | 4.36 (2.30) | 4.84 (2.07) |
| Mean paracentesis procedures per discharge (SD) | 1.38 (0.88) | 1.21 (0.56) | 1.26 (0.66) | 1.30 (0.76) | 1.31 (0.70) | 1.28 (0.78) | 1.22 (0.61) | 1.58 (1.13) |

NOTE: All patient characteristic comparisons across all specialty groups, P<0.0001. Abbreviations: BMI, body mass index; SD, standard deviation.

(a) International Classification of Diseases, 9th Revision codes 572.2-582.8, 456.0-456.2x.

IR Referral Model

We first excluded 6,030/70,862 discharges (8.5%) from 59 hospitals without both IR and medicine procedures. We then further excluded 24,986/70,862 (35.3%) discharges with procedures performed by multiple specialties during the same admission. Finally, we excluded 5,555/70,862 (7.8%) of discharges with procedure specialty coded as all other. Therefore, 34,291 (48.4%) discharges (43,337/97,577 procedures; 44.4%) from 145 UHC hospitals with paracentesis procedures performed by a single clinical specialty group remained for the IR referral analysis sample. Among admissions in which multiple specialties performed paracentesis within the same admission, 3,128/26,606 admissions with any IR procedure (11.8%) had a different specialty ascribed to the first, second, or third paracentesis before a subsequent IR procedure.
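The sample-construction arithmetic above can be verified directly:

```python
# Exclusion cascade reported in the Results (counts from the text).
total = 70_862
excluded = {
    "hospitals without both IR and medicine": 6_030,
    "multiple specialties in one admission": 24_986,
    "procedure specialty coded 'all other'": 5_555,
}

remaining = total - sum(excluded.values())   # analysis discharges
share = 100 * remaining / total              # percentage of all discharges retained
```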

Model results (Table 3) indicate that patients who were obese (odds ratio [OR]: 1.25; 95% confidence interval [CI]: 1.10‐1.43) or who underwent a liver transplant during the admission (OR: 2.03; 95% CI: 1.40‐2.95) were more likely to be referred to IR. In contrast, male patients (OR: 0.89; 95% CI: 0.83‐0.95) and patients who required an ICU stay (OR: 0.39; 95% CI: 0.36‐0.43) were less likely to have IR procedures. Other patient factors reducing the likelihood of IR referral included characteristics associated with higher severity of illness (coagulation disorders, hyponatremia, hypotension, and thrombocytopenia).
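A useful sanity check when reading such results: a 95% CI for an odds ratio is symmetric on the log scale, so the point estimate should sit near the geometric mean of the CI bounds. Checking the reported obesity estimate:

```python
import math

def geometric_midpoint(lower: float, upper: float) -> float:
    """Point estimate implied by a log-symmetric 95% confidence interval."""
    return math.sqrt(lower * upper)

# Obesity: OR 1.25 (95% CI 1.10-1.43), as reported in the text and Table 3.
implied_or = geometric_midpoint(1.10, 1.43)
```

The implied point estimate (about 1.254) matches the reported OR of 1.25 to rounding, as expected.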

Table 3. Random Effects Logistic Regression Model of Likelihood of Interventional Radiology Paracentesis (N=34,291 Discharges From 145 Hospitals)

| Variable | Odds Ratio | 95% CI |
| --- | --- | --- |
| Age 18-49 y | Reference | |
| Age 50-59 y | 1.05 | 0.97-1.14 |
| Age 60-69 y | 1.12 | 1.02-1.22 |
| Age 70+ y | 1.11 | 0.99-1.24 |
| Male | 0.89 | 0.83-0.95 |
| Obesity, BMI 40+ | 1.25 | 1.10-1.43 |
| ICU care | 0.39 | 0.36-0.43 |
| Coagulation disorders | 0.68 | 0.63-0.75 |
| Blood loss anemia | 0.52 | 0.41-0.66 |
| Hyponatremia | 0.85 | 0.80-0.92 |
| Hypotension | 0.83 | 0.74-0.93 |
| Thrombocytopenia | 0.94 | 0.87-1.01 |
| Prior liver transplant | 0.08 | 0.03-0.23 |
| Awaiting liver transplant | 0.86 | 0.76-0.98 |
| Complication of liver transplant | 1.07 | 0.88-1.31 |
| Liver transplant procedure | 2.03 | 1.40-2.95 |
| Charlson score | 1.00 | 0.99-1.01 |
| Number of paracentesis procedures | 0.90 | 0.85-0.95 |

NOTE: Abbreviations: BMI, body mass index; CI, confidence interval; ICU, intensive care unit.

Predicted Probabilities of IR Referral

Figure 1 presents the distribution of predicted probabilities for IR referral. Predicted probabilities were low overall, with very few patients having an equal chance of referral, the standard often used in comparative effectiveness analyses from observational data. Figure 1 indicates that IR referral probabilities were clustered in an unusual bimodal distribution. The cluster on the left, which centers around a 15% predicted probability of IR referral, consists of discharges with patient characteristics that were associated with a very low chance of an IR paracentesis. We therefore used this distribution to conduct comparative analyses of admission outcomes between clinical specialty groups, choosing to examine patients with a 20% or greater chance of IR referral.

Figure 1
Distribution of predicted probability of interventional radiology paracentesis. Discharges with paracentesis procedures from 145 University HealthSystem Consortium hospitals performed by a single specialty group (n = 34,291).

Post hoc analysis revealed that the biggest factor driving low predicted probability of IR referral was whether patients experienced an ICU stay at any time during hospitalization. Among the discharges with a predicted probability ≥0.2 (n=26,615 discharges), there were only 87 discharges with ICU stays (0.3%). For the discharges with predicted probability <0.2 (n=7,676), 91.9% (n=7,055) had an ICU admission. We therefore used a threshold of 0.2 or greater to present the most comparable LOS and direct cost differences.
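The threshold choice can be reproduced from the reported counts:

```python
# ICU mix on either side of the 0.2 predicted-probability threshold,
# using the discharge counts reported in the Results.
high_prob = {"n": 26_615, "icu": 87}     # predicted probability >= 0.2
low_prob = {"n": 7_676, "icu": 7_055}    # predicted probability < 0.2

high_icu_pct = 100 * high_prob["icu"] / high_prob["n"]   # share with an ICU stay
low_icu_pct = 100 * low_prob["icu"] / low_prob["n"]
```

The two strata sum to the 34,291 analysis discharges, and ICU stays are nearly absent above the threshold (0.3%) but nearly universal below it (91.9%).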

LOS and Cost Comparisons by Specialty

Mean LOS and hospital direct costs by specialty for our final analysis sample can be found in Table 4; differences between specialties were significant (P<0.0001). Patients undergoing IR procedures had equivalent LOS and costs to medicine patients, but lower LOS and costs than other clinical specialty groups. Random effects linear regression showed that neither medicine nor gastroenterology/hepatology patients had significantly different LOS from IR patients, but subspecialty medicine was associated with 0.89 additional days and general surgery with 1.47 additional days (both P<0.0001; R2=0.10). In the direct cost regression model, medicine patients were associated with $1308 lower costs and gastroenterology/hepatology patients with $803 lower costs than IR patients (both P=0.0001), whereas subspecialty medicine and general surgery had higher direct costs per discharge of $1886 and $3039, respectively (both P<0.0001, R2=0.19). Older age, obesity, coagulopathy, hyponatremia, hypotension, thrombocytopenia, liver transplant status, ICU care, higher Charlson score, and higher number of paracentesis procedures performed were all significantly associated with higher LOS and hospital costs in these linear models.

Table 4. Length of Stay and Total Hospital Direct Costs for Paracentesis Procedure Discharges Performed by a Single Specialty Group (Interventional Radiology Referral Probability ≥0.2)

| Outcome | All Admissions | Interventional Radiology | Medicine | Medicine Subspecialties | Gastroenterology/Hepatology | General Surgery |
| --- | --- | --- | --- | --- | --- | --- |
| Mean length of stay, d (SD) | 5.57 (5.63), n=26,615 | 5.20 (4.72), n=7,677 | 5.59 (5.85), n=10,413 | 6.28 (6.47), n=2,210 | 5.54 (5.31), n=5,182 | 6.67 (8.16), n=1,133 |
| Mean total direct cost, $ (SD)(a) | 11,447 (12,247), n=24,408 | 10,975 (9,723), n=7,265 | 10,517 (10,895), n=8,965 | 13,705 (16,591), n=2,064 | 12,000 (11,712), n=5,031 | 15,448 (23,807), n=1,083 |

NOTE: Length of stay and direct cost comparisons across all specialty groups, P<0.0001. Data not adjusted for patient characteristics. Abbreviations: SD, standard deviation.

(a) Total direct costs were missing for 8.3% of discharges; the cost row reflects discharges with nonmissing cost data.

DISCUSSION

This study showed that internal medicine‐ and family medicine‐trained clinicians perform approximately half of the inpatient paracentesis procedures at university hospitals and their affiliates. This confirms findings from our earlier single‐institution study[16] but contrasts with previously published reports involving Medicare data. The earlier report, using Medicare claims and including ambulatory procedures, revealed that primary care physicians and gastroenterologists only performed approximately 10% of US paracentesis procedures in 2008.[7] Our findings suggest that practices are different at university hospitals, where patients with severe liver disease often seek care. Because we used the UHC database, it was not possible to determine if the clinicians who performed paracentesis procedures in this study were internal medicine or family medicine residents, fellows, or attending physicians. However, findings from our own institution show that the vast majority of bedside paracentesis procedures are performed by internal medicine residents.[16]

Our findings have implications for certification of internal medicine and family medicine trainees. In 2008, the ABIM removed the requirement that internal medicine residents demonstrate competency in paracentesis.[12] This decision was informed by a lack of standardized methods to determine procedural competency and published surveys showing that internal medicine and hospitalist physicians rarely performed bedside procedures.[5, 6] Despite this policy change, our findings show that current clinical practice at university hospitals does not reflect national practice patterns or certification requirements, because many internal medicine‐ and family medicine‐trained clinicians still perform paracentesis procedures. This is concerning because internal medicine and family medicine trainees report variable confidence, experience, expertise, and supervision regarding performance of invasive procedures.[9, 10, 21, 22, 23, 24] Furthermore, earlier research also demonstrates that graduating residents and fellows are not able to competently perform common bedside procedures such as thoracentesis, temporary hemodialysis catheter insertion, and lumbar puncture.[25, 26, 27]

The American Association for the Study of Liver Diseases (AASLD) recommends that trained clinicians perform paracentesis procedures.[3, 4] However, the AASLD provides no definition for how training should occur. Because competency in this procedure is not specifically required by the ABIM, ABFM, or ACGME, a paradoxical situation occurs in which internal medicine and family medicine residents, and internal medicine‐trained fellows and faculty continue to perform paracentesis procedures on highly complex patients, but are no longer required to be competent to do so.

In earlier research we showed that simulation‐based mastery learning (SBML) was an effective method to boost internal medicine residents' paracentesis skills.[28] In SBML, all trainees must meet or exceed a minimum passing score on a simulated procedure before performing one on an actual patient.[29] This approach improves clinical care and outcomes in procedures such as central venous catheter insertion[30, 31] and advanced cardiac life support.[32] SBML‐trained residents also performed safe paracentesis procedures with shorter hospital LOS, fewer ICU transfers, and fewer blood product transfusions than IR procedures.[16] Based on the results of this study, AASLD guidelines regarding training, and our experience with SBML, we recommend that all clinicians complete paracentesis SBML training before performing procedures on patients.

Using our propensity model we identified patient characteristics that were associated with IR referral. Patients with a liver transplant were more likely to be cared for in IR. This may be due to a belief that postoperative procedures are anatomically more complex or because surgical trainees do not commonly perform this procedure. The current study confirms findings from earlier work that obese and female patients are more likely to be referred to IR.[16] IR referral of obese patients is likely to occur because paracentesis procedures are technically more difficult. We have no explanation why female patients were more likely to be referred to IR, because most decisions appear to be discretionary. Prospective studies are needed to determine evidence‐based recommendations regarding paracentesis procedure location. Patients with more comorbidities (eg, ICU stay, awaiting liver transplant, coagulation disorders) were more likely to undergo bedside procedures. The complexity of patients undergoing bedside paracentesis procedures reinforces the need for rigorous skill assessment for clinicians who perform them because complications such as intraperitoneal bleeding can be fatal.

Finally, we showed that LOS was similar but hospital direct costs were $800 to $1300 lower for patients whose paracentesis procedure was performed by medicine or gastroenterology/hepatology compared to IR. Medical subspecialties and surgery procedures were more expensive than IR, consistent with the higher LOS seen in these groups. IR procedures add costs due to facility charges for space, personnel, and equipment.[33] At our institution, the hospital cost of an IR paracentesis in 2012 was $361. If we use this figure, and assume costs are similar across university hospitals, the resultant cost savings would be $10,257,454 (for the procedure alone) if all procedures referred to IR in this 2‐year study were instead performed at the bedside. This estimate is approximate because it does not consider factors such as cost of clinician staffing models, which may differ across UHC hospitals. As hospitals look to reduce costs, potential savings due to appropriate use of bedside and IR procedures should be considered. This is especially important because there is no evidence that the extra expense of IR procedures is justified due to improved patient outcomes.
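The savings figure follows directly from the IR procedure volume in Table 1 and the stated $361 per-procedure cost, assuming, as the authors note, that this single-institution cost generalizes across university hospitals:

```python
# Back-of-the-envelope savings estimate from the Discussion.
ir_procedures = 28_414      # IR paracenteses over the study period (Table 1)
ir_procedure_cost = 361     # 2012 hospital cost of one IR paracentesis, USD

savings = ir_procedures * ir_procedure_cost   # procedure cost alone
```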

This study has several limitations. First, this was an observational study. Although the database was large, we were limited by coding accuracy and could not control for all potential confounding factors such as Model for End‐Stage Liver Disease score,[34, 35] other specific laboratory values, amount of ascites fluid removed, or bedside procedure failures later referred to IR. However, we do know that only a small number of second, third, or fourth procedures were subsequently referred to IR after earlier ones were performed at the bedside. Additionally, the UHC database does not include patient‐specific data, and therefore we could not adjust for multiple visits or procedures by the same patient. Second, we were unable to determine the level of teaching involvement at each UHC‐affiliated hospital. Community hospitals where attendings managed most of the patients without trainees could not be differentiated from university hospitals where trainees were involved in most patients' care. Third, we did not have specialty information for 9,434 (9.7%) procedures and had to exclude these cases. We also excluded a large number of paracentesis procedures in our final outcomes analysis. However, this was necessary because we needed to perform a patient‐level analysis to ensure the propensity and outcomes models were accurate. Finally, we did not evaluate inpatient mortality or 30‐day hospital readmission rates. Mortality and readmission from complications of a paracentesis procedure are rare events.[3, 4, 36] However, mortality and hospital readmission among patients with liver disease are relatively common.[37, 38] It was impossible to link these outcomes to a paracentesis procedure without the ability to perform medical records review.

In conclusion, paracentesis procedures are performed frequently by internal medicine‐ and family medicine‐trained clinicians in university hospitals. Because of these findings regarding current practice patterns, we believe the ACGME, ABIM, and ABFM should clarify their policies to require that residents are competent to perform paracentesis procedures before performing them on patients. This may improve supervision and training for paracentesis procedures that are already occurring and possibly encourage performance of additional, less costly bedside procedures.

Acknowledgements

The authors acknowledge Drs. Douglas Vaughan and Mark Williams for their support and encouragement of this work.

Disclosure: Nothing to report.

References
  1. Runyon BA. A primer on detecting cirrhosis and caring for these patients without causing harm. Int J Hepatol. 2011:801983.
  2. Lefton HB, Rosa A, Cohen M. Diagnosis and epidemiology of cirrhosis. Med Clin North Am. 2009;93(4):787-799.
  3. Runyon BA; AASLD Practice Guidelines Committee. Management of adult patients with ascites due to cirrhosis: an update. Hepatology. 2009;49(6):2087-2107.
  4. Runyon BA; AASLD Practice Guidelines Committee. Management of adult patients with ascites due to cirrhosis: update 2012. Hepatology. 2009;49:2087-2107. Available at: http://www.aasld.org/practiceguidelines/Documents/ascitesupdate2013.pdf. Accessed October 21, 2013.
  5. Thakkar R, Wright SM, Alguire P, Wigton RS, Boonyasai RT. Procedures performed by hospitalist and non-hospitalist general internists. J Gen Intern Med. 2010;25(5):448-452.
  6. Wigton RS, Alguire P. The declining number and variety of procedures done by general internists: a resurvey of members of the American College of Physicians. Ann Intern Med. 2007;146(5):355-360.
  7. Duszak R, Chatterjee AR, Schneider DA. National fluid shifts: fifteen-year trends in paracentesis and thoracentesis procedures. J Am Coll Radiol. 2010;7(11):859-864.
  8. Duffy FD, Holmboe ES. What procedures should internists do? Ann Intern Med. 2007;146(5):392-393.
  9. Huang GC, Smith CC, Gordon CE, Feller-Kopman DJ, Davis RB, Phillips RS. Beyond the comfort zone: residents assess their comfort performing inpatient medicine procedures. Am J Med. 2006;119(1):71.e17-e24.
  10. Sharp LK, Wang R, Lipsky MS. Perception of competency to perform procedures and future practice intent: a national survey of family practice residents. Acad Med. 2003;78(9):926-932.
  11. Guardino JM, Proctor DD, Lopez R, Carey W. Utilization of and adherence to the gastroenterology core curriculum on hepatology training during a gastrointestinal fellowship. Clin Gastroenterol Hepatol. 2008;6(6):682-688.
  12. American Board of Internal Medicine. Internal medicine policies. Available at: http://www.abim.org/certification/policies/imss/im.aspx. Accessed December 21, 2013.
  13. ACGME program requirements for graduate medical education in internal medicine. Available at: http://acgme.org/acgmeweb/portals/0/PFassets/2013-PR-FAQ-PIF/140_internal_medicine_07012013.pdf. Accessed December 17, 2013.
  14. American Board of Family Medicine residency requirements. Available at: https://www.theabfm.org/cert/guidelines.aspx. Accessed December 17, 2013.
  15. ACGME program requirements for graduate medical education in family medicine. Available at: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/120pr07012007.pdf. Accessed December 17, 2013.
  16. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Clinical outcomes after bedside and interventional radiology paracentesis procedures. Am J Med. 2013;126(4):349-356.
  17. Grabau CM, Crago SF, Hoff LK, et al. Performance standards for therapeutic abdominal paracentesis. Hepatology. 2004;40(2):484-488.
  18. Deyo RA, Cherkin DC, Ciol MA. Adapting a clinical comorbidity index for use with ICD-9-CM administrative databases. J Clin Epidemiol. 1992;45(6):613-619.
  19. Romano PS, Roos LL, Jollis JG. Adapting a clinical comorbidity index for use with ICD-9-CM administrative data: differing perspectives. J Clin Epidemiol. 1993;46(10):1075-1079; discussion 1081-1090.
  20. Rosenbaum PR. The role of known effects in observational studies. Biometrics. 1989;45(2):557-569.
  21. Farnan JM, Petty LA, Georgitis E, et al. A systematic review: the effect of clinical supervision on patient and residency education outcomes. Acad Med. 2012;87(4):428-442.
  22. Lucas BP, Asbury JK, Wang Y, et al. Impact of a bedside procedure service on general medicine inpatients: a firm-based trial. J Hosp Med. 2007;2(3):143-149.
  23. Haber LA, Lau CY, Sharpe BA, Arora VM, Farnan JM, Ranji SR. Effects of increased overnight supervision on resident education, decision-making, and autonomy. J Hosp Med. 2012;7(8):606-610.
  24. Berns JS, O'Neill WC. Performance of procedures by nephrologists and nephrology fellows at U.S. nephrology training programs. Clin J Am Soc Nephrol. 2008;3(4):941-947.
  25. Wayne DB, Barsuk JH, O'Leary K, Fudala M, McGaghie WC. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3:49-54.
  26. Barsuk JH, Ahya SN, Cohen ER, McGaghie WC, Wayne DB. Mastery learning of temporary dialysis catheter insertion skills by nephrology fellows using simulation technology and deliberate practice. Am J Kidney Dis. 2009;54:70-76.
  27. Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation-based education with mastery learning improves residents' lumbar puncture skills. Neurology. 2012;79:132-137.
  28. Barsuk JH, Cohen ER, Vozenilek JA, O'Connor LM, McGaghie WC, Wayne DB. Simulation-based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012;4(1):23-27.
  29. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Medical education featuring mastery learning with deliberate practice can lead to better health for individuals and populations. Acad Med. 2011;86(11):e8-e9.
  30. Barsuk JH, McGaghie WC, Cohen ER, O'Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37(10):2697-2701.
  31. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med. 2009;169(15):1420-1423.
  32. Didwania A, McGaghie WC, Cohen ER, et al. Progress toward improving the quality of cardiac arrest medical team responses at an academic teaching hospital. J Grad Med Educ. 2011;3(2):211-216.
  33. Saini S, Seltzer SE, Bramson RT, et al. Technical cost of radiologic examinations: analysis across imaging modalities. Radiology. 2000;216(1):269-272.
  34. Kamath PS, Wiesner RH, Malinchoc M, et al. A model to predict survival in patients with end-stage liver disease. Hepatology. 2001;33(2):464-470.
  35. Wiesner R, Edwards E, Freeman R, et al. Model for end-stage liver disease (MELD) and allocation of donor livers. Gastroenterology. 2003;124(1):91-96.
  36. Thomsen TW, Shaffer RW, White B, Setnik GS. Videos in clinical medicine. Paracentesis. N Engl J Med. 2006;355(19):e21.
  37. Centers for Disease Control and Prevention. Chronic liver disease or cirrhosis. National Hospital Discharge Survey: 2010 detailed diagnosis and procedure tables, number of first-listed diagnoses (see ICD9-CM code 571). Available at: http://www.cdc.gov/nchs/fastats/liverdis.htm. Accessed October 19, 2013.
  38. Berman K, Tandra S, Forssell K, et al. Incidence and predictors of 30-day readmission among patients hospitalized for advanced liver disease. Clin Gastroenterol Hepatol. 2011;9(3):254-259.
Issue
Journal of Hospital Medicine - 9(3)
Page Number
162-168

Cirrhosis affects up to 3% of the population and is 1 of the 10 most common causes of death in the United States.[1, 2, 3, 4] Paracentesis procedures are frequently performed in patients with liver disease and ascites for diagnostic and/or therapeutic purposes. These procedures can be performed safely by trained clinicians at the bedside or referred to interventional radiology (IR).[2, 3, 4]

National practice patterns show that paracentesis procedures are increasingly referred to IR rather than performed at the bedside by internal medicine or gastroenterology clinicians.[5, 6, 7] In fact, a recent study of Medicare beneficiaries showed that inpatient and outpatient paracentesis procedures performed by radiologists increased by 964% from 1993 to 2008.[7] Reasons for the decline in bedside procedures include the increased availability of IR, lack of sufficient reimbursement, and the time required to perform paracentesis procedures.[5, 6, 7, 8] Surveys of internal medicine and family medicine residents and gastroenterology fellows show trainees often lack the confidence and experience needed to perform the procedure safely.[9, 10, 11] Additionally, many clinicians do not have expertise with ultrasound use and may not have access to necessary equipment.

Inconsistent certification requirements may also impact the competence and experience of physicians to perform paracentesis procedures. Internal medicine residents are no longer required by the American Board of Internal Medicine (ABIM) to demonstrate competency in procedures such as paracentesis for certification.[12] However, the Accreditation Council for Graduate Medical Education (ACGME) requirements state that internal medicine programs must offer residents the opportunity to demonstrate competence in the performance of procedures such as paracentesis, thoracentesis, and central venous catheter insertion.[13] The American Board of Family Medicine (ABFM) does not outline specific procedural competence for initial certification.[14] The ACGME states that family medicine residents must receive training to perform those clinical procedures required for their future practices but allows each program to determine which procedures to require.[15] Due to this uncertainty, practicing hospitalists are likely to have variable training and competence in bedside procedures such as paracentesis.

We previously showed that internal medicine residents rotating on the hepatology service of an academic medical center performed 59% of paracentesis procedures at the bedside.[16] These findings are in contrast to national data showing that 74% of paracentesis procedures performed on Medicare beneficiaries were performed by radiologists.[7] Practice patterns at university hospitals may not be reflected in this data because the study was limited to Medicare beneficiaries and included ambulatory patients.[7] In addition to uncertainty about who is performing this procedure in inpatient settings, little is known about the effect of specialty on postparacentesis clinical outcomes.[16, 17]

The current study had 3 aims: (1) evaluate which clinical specialties perform paracentesis procedures at university hospitals; (2) model patient characteristics associated with procedures performed at the bedside versus those referred to IR; and (3) among patients with a similar likelihood of IR referral, evaluate length of stay (LOS) and hospital costs of patients undergoing procedures performed by different specialties.

METHODS

We performed an observational administrative database review of patients who underwent paracentesis procedures in hospitals participating in the University HealthSystem Consortium (UHC) Clinical Database from January 2010 through December 2012. UHC is an alliance of 120 nonprofit academic medical centers and their 290 affiliated hospitals. UHC maintains databases containing clinical, operational, financial, and patient safety data from affiliated hospitals. Using the UHC database, we described the characteristics of all patients who underwent paracentesis procedures by clinical specialty performing the procedure. We then modeled the effects of patient characteristics on decision‐making about IR referral. Finally, among patients with a homogeneous predicted probability of IR referral, we compared LOS and direct costs by specialty performing the procedure. The Northwestern University institutional review board approved this study.

Procedure

We queried the UHC database for all patients over the age of 18 years who underwent paracentesis procedures (International Classification of Diseases, Ninth Revision [ICD‐9] procedure code 54.91) and had at least 1 diagnosis code of liver disease (571.x). We excluded patients admitted to obstetrics. The query included patient and clinical characteristics such as admission, discharge, and procedure date; age, gender, procedure provider specialty, and intensive care unit (ICU) stay. We also obtained all ICD‐9 codes associated with the admission including obesity, severe liver disease, coagulation disorders, blood loss anemia, hyponatremia, hypotension, thrombocytopenia, liver transplant before or during the admission, awaiting liver transplant, and complications of liver transplant. We used ICD‐9 codes to calculate patients' Charlson score[18, 19] to assess severity of illness on admission.
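The Charlson score calculation referenced above maps ICD-9 codes to weighted comorbidity categories. Below is a minimal sketch covering only a few of the 17 Deyo categories; the code prefixes and weights follow the common Deyo adaptation cited in references 18 and 19, and the full mapping used in the study is much larger.

```python
# Simplified Deyo-style Charlson index: each comorbidity detected from
# ICD-9 codes contributes a fixed weight. Only a handful of categories shown.
CHARLSON = [
    ("congestive heart failure", ("428",), 1),
    ("chronic pulmonary disease", ("490", "491", "492", "493"), 1),
    ("diabetes", ("250",), 1),
    ("moderate/severe liver disease", ("572.2", "572.3", "572.4"), 3),
    ("metastatic solid tumor", ("196", "197", "198", "199"), 6),
]

def charlson_score(icd9_codes):
    """Sum the weights of comorbidity categories matched by any ICD-9 code."""
    score = 0
    for _name, prefixes, weight in CHARLSON:
        if any(code.startswith(p) for p in prefixes for code in icd9_codes):
            score += weight
    return score
```

For example, a discharge coded with hepatic encephalopathy (572.2) and congestive heart failure (428.0) scores 3 + 1 = 4 under this partial mapping.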

LOS and total direct hospital costs were compared among patients with a paracentesis performed by a single clinical group and among patients with a similar predicted probability of IR referral. UHC generates direct cost estimates by applying Medicare Cost Report ratios of cost to charges with the labor cost further adjusted by the respective area wage index. Hospital costs were not available from 8.3% of UHC hospitals. We therefore based cost estimates on nonmissing data.

Paracentesis provider specialties were divided into 6 general categories: (1) IR (interventional and diagnostic radiology); (2) medicine (family medicine, general medicine, and hospital medicine); (3) subspecialty medicine (infectious disease, cardiology, nephrology, hematology/oncology, endocrinology, pulmonary, and geriatrics); (4) gastroenterology/hepatology (gastroenterology, hepatology, and transplant medicine); (5) general surgery (general surgery and transplant surgery); and (6) all other (included unclassified specialties). We present patient characteristics categorized by these specialty groups and for admissions in which multiple specialties performed procedures.

Study Design

To analyze an individual patient's likelihood of IR referral, we needed to restrict our sample to discharges where only 1 clinical specialty performed a paracentesis. Therefore, we excluded hybrid discharges with procedures performed by more than 1 specialty in a single admission as well as discharges with procedures performed by all other specialties. To compare LOS and direct cost outcomes, and to minimize selection bias among exclusively IR‐treated patients, we excluded hospitals without procedures done by both IR and medicine.

We modeled referral to IR as a function of patients' demographic and clinical variables, which we believed would affect the probability of referral. We then examined the IR referral model predicted probabilities (propensity score).[20] Finally, we examined mean differences in LOS and direct costs among discharges with a single clinical specialty group, while using the predicted probability of referral as a filter to compare these outcomes by specialty. We further tested specialty differences in LOS and direct costs controlling for demographic and clinical variables.
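As a sketch of the propensity‐score step, the toy model below fits an ordinary logistic regression by gradient descent and reads off predicted probabilities of IR referral. The two predictors and all records are hypothetical, and the hospital‐level random effects used in the actual analysis are omitted for brevity.

```python
import math

def fit_logistic(X, y, lr=0.5, iters=2000):
    """Plain gradient-descent logistic regression (no hospital random effects)."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(iters):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            for j in range(d):
                gw[j] += (p - yi) * xi[j]
            gb += p - yi
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def propensity(w, b, xi):
    """Predicted probability of IR referral for one discharge."""
    z = b + sum(wj * xj for wj, xj in zip(w, xi))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical discharges: features are [obese, any ICU stay];
# outcome 1 = paracentesis performed in IR, 0 = performed at the bedside.
X = [[1, 0], [1, 0], [1, 0], [0, 1], [0, 1], [0, 1], [0, 0], [0, 0]]
y = [1, 1, 1, 0, 0, 0, 1, 0]

w, b = fit_logistic(X, y)
p_obese = propensity(w, b, [1, 0])  # high: obesity pushes toward IR
p_icu = propensity(w, b, [0, 1])    # low: ICU stay pushes toward bedside
```

In the study's analysis these predicted probabilities (propensity scores) are then used as a filter, comparing outcomes only among discharges with similar referral probabilities.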

Statistical Analysis

To test the significance of differences in demographic and clinical characteristics of patients across specialties, we used chi‐square tests for categorical variables and analysis of variance or the Kruskal‐Wallis rank test for continuous variables. Random effects logistic regression, which adjusts standard errors for clustering by hospital, was used to model the likelihood of referral to IR. Independent variables included patient age, gender, obesity, coagulation disorders, blood loss anemia, hyponatremia, hypotension, thrombocytopenia, liver transplant before hospitalization, liver transplant during hospitalization, awaiting transplant, complications of liver transplant, ICU stay, Charlson score, and number of paracentesis procedures performed during the admission. Predicted probabilities derived from this IR referral model were used to investigate selection bias in our subsequent analyses of LOS and costs.[20]
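For the categorical comparisons, the chi‐square statistic is the familiar sum of (observed − expected)²/expected over the contingency table. A minimal sketch, using a made‐up 2×2 table rather than study data:

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical 2 x 2 table (e.g., specialty group vs. a binary characteristic)
stat = chi_square([[10, 20], [20, 10]])
print(round(stat, 3))  # 6.667
```

The statistic is then compared against a chi-square distribution with (r − 1)(c − 1) degrees of freedom to obtain the P values reported in the tables.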

We used random effects multiple linear regression to test the association of procedure specialty with hospital LOS and total direct costs, controlling for the same independent variables listed above. Analyses were conducted using both actual LOS in days and Medicare costs. We also performed a log transformation of LOS and costs to account for rightward skew, but we present only the untransformed results because the two sets of results were virtually identical. We used SAS version 9 (SAS Institute Inc., Cary, NC) to extract data from the UHC Clinical Database. We performed all statistical analyses using Stata version 12 (StataCorp LP, College Station, TX).

RESULTS

Procedure and Discharge Level Results

There were 97,577 paracentesis procedures performed during 70,862 hospital admissions in 204 UHC hospitals during the study period. Table 1 shows the specific specialty groups for each procedure. The "all other" category consisted of 17,558 procedures performed by other subspecialty groups, including 9,434 with specialty unknown. Twenty‐nine percent of procedures were performed in IR, versus 27% by medicine, 11% by gastroenterology/hepatology, and 11% by subspecialty medicine.

Table 1. Group Frequencies of 97,577 Paracentesis Procedures by Specialty Within Specialty Groups

Specialty Group | No. | %
Interventional radiology | 28,414 | 29.1
Medicine | 26,031 | 26.7
  Family medicine | 1,026 | 1.1
  General medicine | 21,787 | 22.3
  Hospitalist | 3,218 | 3.3
Subspecialty medicine | 10,558 | 10.8
  Infectious disease | 848 | 0.9
  Nephrology | 615 | 0.6
  Cardiology | 991 | 1.0
  Hematology/oncology | 795 | 0.8
  Endocrinology | 359 | 0.4
  Pulmonology | 6,605 | 6.8
  Geriatrics | 345 | 0.4
Gastroenterology/hepatology | 11,143 | 11.4
  Transplant medicine | 99 | 0.1
  Hepatology | 874 | 0.9
  Gastroenterology | 10,170 | 10.4
General surgery | 3,873 | 4.0
  Transplant surgery | 2,146 | 2.2
  General surgery | 1,727 | 1.8
All other | 17,558 | 18.0
  Specialty unknown | 9,434 | 9.7

Table 2 presents patient characteristics for 70,862 hospital discharges with paracentesis procedures grouped by whether single or multiple specialties performed procedures. Patient characteristics were significantly different across specialty groups. Medicine, subspecialty medicine, and gastroenterology/hepatology patients were younger, more likely to be male, and more likely to have severe liver disease, coagulation disorders, hypotension, and hyponatremia than IR patients.

Table 2. Characteristics of Patients Discharged by Medical Specialty Groups Performing Paracentesis Procedures at University HealthSystem Consortium Hospitals (N=70,862 Discharges From 2010–2012 in 204 Hospitals)

Characteristic | All Discharges, N=70,862 | Interventional Radiology, n=9,348 | Medicine, n=13,789 | Subspecialty Medicine, n=5,085 | Gastroenterology/Hepatology, n=6,664 | General Surgery, n=1,891 | All Other, n=7,912 | Multiple Specialties, n=26,173
Age 18–49 y (%) | 25.4 | 22.5 | 27.6 | 24.9 | 23.5 | 20.8 | 25.5 | 26.1
Age 50–59 y (%) | 39.8 | 39.8 | 40.9 | 39.4 | 41.5 | 40.3 | 40.0 | 38.7
Age 60–69 y (%) | 24.7 | 24.9 | 21.6 | 24.7 | 26.5 | 30.0 | 23.6 | 25.8
Age 70+ y (%) | 10.1 | 12.9 | 9.9 | 11.1 | 8.4 | 8.9 | 11.0 | 9.4
Male (%) | 65.5 | 64.2 | 67.6 | 67.5 | 65.7 | 66.6 | 65.7 | 64.2
Severe liver disease (%)ᵃ | 73.7 | 65.3 | 67.8 | 71.0 | 75.3 | 66.6 | 67.6 | 82.1
Obesity (BMI 40+) (%) | 6.3 | 6.1 | 5.3 | 5.7 | 5.1 | 5.8 | 5.2 | 7.6
Any intensive care unit stay (%) | 31.0 | 10.9 | 16.8 | 50.5 | 16.9 | 36.7 | 22.3 | 47.8
Coagulation disorders (%) | 24.3 | 14.8 | 20.2 | 29.9 | 16.1 | 19.0 | 17.8 | 33.1
Blood loss anemia (%) | 3.4 | 1.3 | 2.8 | 2.7 | 2.7 | 1.9 | 2.1 | 5.2
Hyponatremia (%) | 29.9 | 27.1 | 29.2 | 28.9 | 28.0 | 26.6 | 27.3 | 33.1
Hypotension (%) | 9.8 | 7.0 | 8.0 | 11.0 | 7.7 | 10.5 | 8.1 | 12.4
Thrombocytopenia (%) | 29.6 | 24.6 | 28.3 | 32.5 | 22.1 | 21.5 | 24.0 | 35.8
Complication of transplant (%) | 3.3 | 2.1 | 1.1 | 2.4 | 4.0 | 10.3 | 2.7 | 4.7
Awaiting liver transplant (%) | 7.6 | 6.4 | 4.0 | 5.4 | 12.8 | 16.0 | 7.8 | 8.2
Prior liver transplant (%) | 0.5 | 0.8 | 0.3 | 0.3 | 0.7 | 0.7 | 0.4 | 0.6
Liver transplant procedure (%) | 2.7 | 0.0 | 0.0 | 0.3 | 0.4 | 15.6 | 1.6 | 5.6
Mean Charlson score (SD) | 4.51 (2.17) | 4.28 (2.26) | 4.16 (2.17) | 4.72 (2.30) | 4.30 (1.98) | 4.26 (2.22) | 4.36 (2.30) | 4.84 (2.07)
Mean paracentesis procedures per discharge (SD) | 1.38 (0.88) | 1.21 (0.56) | 1.26 (0.66) | 1.30 (0.76) | 1.31 (0.70) | 1.28 (0.78) | 1.22 (0.61) | 1.58 (1.13)

NOTE: All patient characteristic comparisons across all specialty groups, P<0.0001. Abbreviations: BMI, body mass index; SD, standard deviation.
ᵃ International Classification of Diseases, 9th Revision codes 572.2–582.8, 456.0–456.2x.

IR Referral Model

We first excluded 6,030 of 70,862 discharges (8.5%) from 59 hospitals without procedures performed by both IR and medicine. We then excluded 24,986 of 70,862 discharges (35.3%) with procedures performed by multiple specialties during the same admission. Finally, we excluded 5,555 of 70,862 discharges (7.8%) with procedure specialty coded as all other. Therefore, 34,291 discharges (48.4%; 43,337/97,577 [44.4%] of procedures) from 145 UHC hospitals with paracentesis procedures performed by a single clinical specialty group remained in the IR referral analysis sample. Among admissions in which multiple specialties performed paracentesis procedures, 3,128 of 26,606 admissions with any IR procedure (11.8%) had a first, second, or third paracentesis ascribed to a different specialty before a subsequent IR procedure.
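The sequential exclusions above can be checked arithmetically; the counts are taken directly from the text:

```python
total_discharges = 70_862
excluded_no_ir_or_medicine = 6_030   # 59 hospitals lacking both IR and medicine
excluded_multiple_specialty = 24_986  # >1 specialty performed procedures
excluded_all_other = 5_555            # specialty coded as "all other"

analysis_sample = (total_discharges - excluded_no_ir_or_medicine
                   - excluded_multiple_specialty - excluded_all_other)
print(analysis_sample)  # 34291
```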

Model results (Table 3) indicate that patients who were obese (odds ratio [OR]: 1.25; 95% confidence interval [CI]: 1.10‐1.43) or who had a liver transplant procedure during the admission (OR: 2.03; 95% CI: 1.40‐2.95) were more likely to be referred to IR. However, male patients (OR: 0.89; 95% CI: 0.83‐0.95) and patients who required an ICU stay (OR: 0.39; 95% CI: 0.36‐0.43) were less likely to have IR procedures. Other patient factors reducing the likelihood of IR referral included characteristics associated with higher severity of illness (coagulation disorders, hyponatremia, hypotension, and thrombocytopenia).

Table 3. Random Effects Logistic Regression Model of Likelihood of Interventional Radiology Paracentesis (N=34,291 Discharges From 145 Hospitals)

Variable | Odds Ratio | 95% CI (Lower–Upper)
Age 18–49 y | Reference |
Age 50–59 y | 1.05 | 0.97–1.14
Age 60–69 y | 1.12 | 1.02–1.22
Age 70+ y | 1.11 | 0.99–1.24
Male | 0.89 | 0.83–0.95
Obesity, BMI 40+ | 1.25 | 1.10–1.43
ICU care | 0.39 | 0.36–0.43
Coagulation disorders | 0.68 | 0.63–0.75
Blood loss anemia | 0.52 | 0.41–0.66
Hyponatremia | 0.85 | 0.80–0.92
Hypotension | 0.83 | 0.74–0.93
Thrombocytopenia | 0.94 | 0.87–1.01
Prior liver transplant | 0.08 | 0.03–0.23
Awaiting liver transplant | 0.86 | 0.76–0.98
Complication of liver transplant | 1.07 | 0.88–1.31
Liver transplant procedure | 2.03 | 1.40–2.95
Charlson score | 1.00 | 0.99–1.01
Number of paracentesis procedures | 0.90 | 0.85–0.95

NOTE: Abbreviations: BMI, body mass index; CI, confidence interval; ICU, intensive care unit.

Predicted Probabilities of IR Referral

Figure 1 presents the distribution of predicted probabilities of IR referral. Predicted probabilities were low overall; very few patients had an equal chance of referral, the standard often used in comparative effectiveness analyses of observational data. Figure 1 indicates that IR referral probabilities clustered in an unusual bimodal distribution. The cluster on the left, centered around a 15% predicted probability of IR referral, consists of discharges with patient characteristics associated with a very low chance of an IR paracentesis. We therefore used this distribution to conduct comparative analyses of admission outcomes between clinical specialty groups, choosing to examine patients with a 20% or greater chance of IR referral.

Figure 1
Distribution of predicted probability of interventional radiology paracentesis. Discharges with paracentesis procedures from 145 University HealthSystem Consortium hospitals performed by a single specialty group (n = 34,291).

Post hoc analysis revealed that the biggest factor driving low predicted probability of IR referral was whether patients experienced an ICU stay at any time during hospitalization. Among discharges with a predicted probability ≥0.2 (n=26,615), there were only 87 discharges with ICU stays (0.3%). Among discharges with a predicted probability <0.2 (n=7,676), 91.9% (n=7,055) had an ICU admission. We therefore used a threshold of 0.2 or greater to present the most comparable LOS and direct cost differences.
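The 0.2 threshold acts as a simple filter on the propensity scores. A toy sketch of this filtering step, using invented probability/ICU pairs rather than UHC data:

```python
# Each record: (predicted probability of IR referral, any ICU stay)
discharges = [(0.05, True), (0.12, True), (0.18, True), (0.21, True),
              (0.25, False), (0.33, False), (0.40, False)]

comparable = [d for d in discharges if d[0] >= 0.2]  # kept for LOS/cost analysis
excluded = [d for d in discharges if d[0] < 0.2]     # dominated by ICU stays

# ICU rates in each stratum (booleans sum as 0/1)
icu_rate_comparable = sum(icu for _, icu in comparable) / len(comparable)
icu_rate_excluded = sum(icu for _, icu in excluded) / len(excluded)
print(icu_rate_comparable, icu_rate_excluded)  # 0.25 1.0
```

As in the study, the retained stratum contains almost no ICU stays, so specialty groups are compared among clinically similar discharges.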

LOS and Cost Comparisons by Specialty

Mean LOS and hospital direct costs by specialty for our final analysis sample can be found in Table 4; differences between specialties were significant (P<0.0001). Patients undergoing IR procedures had equivalent LOS and costs to medicine patients, but lower LOS and costs than other clinical specialty groups. Random effects linear regression showed that neither medicine nor gastroenterology/hepatology patients had significantly different LOS from IR patients, but subspecialty medicine was associated with 0.89 additional days and general surgery with 1.47 additional days (both P<0.0001; R2=0.10). In the direct cost regression model, medicine patients were associated with $1308 lower costs and gastroenterology/hepatology patients with $803 lower costs than IR patients (both P=0.0001), whereas subspecialty medicine and general surgery had higher direct costs per discharge of $1886 and $3039, respectively (both P<0.0001, R2=0.19). Older age, obesity, coagulopathy, hyponatremia, hypotension, thrombocytopenia, liver transplant status, ICU care, higher Charlson score, and higher number of paracentesis procedures performed were all significantly associated with higher LOS and hospital costs in these linear models.

Table 4. Length of Stay and Total Hospital Direct Costs for Paracentesis Procedure Discharges Performed by a Single Specialty Group (Interventional Radiology Referral Probability ≥0.2)

 | All Admissions | Interventional Radiology | Medicine | Medicine Subspecialties | Gastroenterology/Hepatology | General Surgery
n (length of stay) | 26,615 | 7,677 | 10,413 | 2,210 | 5,182 | 1,133
n (direct costs)ᵃ | 24,408 | 7,265 | 8,965 | 2,064 | 5,031 | 1,083
Mean length of stay, d (SD) | 5.57 (5.63) | 5.20 (4.72) | 5.59 (5.85) | 6.28 (6.47) | 5.54 (5.31) | 6.67 (8.16)
Mean total direct cost, $ (SD)ᵃ | 11,447 (12,247) | 10,975 (9,723) | 10,517 (10,895) | 13,705 (16,591) | 12,000 (11,712) | 15,448 (23,807)

NOTE: Length of stay and direct cost comparisons across all specialty groups, P<0.0001. Data not adjusted for patient characteristics. Abbreviations: SD, standard deviation.
ᵃ Total cost data were missing for 8.3% of discharges.

DISCUSSION

This study showed that internal medicine‐ and family medicine‐trained clinicians perform approximately half of the inpatient paracentesis procedures at university hospitals and their affiliates. This confirms findings from our earlier single‐institution study[16] but contrasts with previously published reports based on Medicare data. The earlier report, which used Medicare claims and included ambulatory procedures, revealed that primary care physicians and gastroenterologists performed only approximately 10% of US paracentesis procedures in 2008.[7] Our findings suggest that practices are different at university hospitals, where patients with severe liver disease often seek care. Because we used the UHC database, it was not possible to determine whether the clinicians who performed paracentesis procedures in this study were internal medicine or family medicine residents, fellows, or attending physicians. However, findings from our own institution show that the vast majority of bedside paracentesis procedures are performed by internal medicine residents.[16]

Our findings have implications for certification of internal medicine and family medicine trainees. In 2008, the American Board of Internal Medicine (ABIM) removed the requirement that internal medicine residents demonstrate competency in paracentesis.[12] This decision was informed by a lack of standardized methods to determine procedural competency and by published surveys showing that internal medicine and hospitalist physicians rarely performed bedside procedures.[5, 6] Despite this policy change, our findings show that current clinical practice at university hospitals does not reflect national practice patterns or certification requirements, because many internal medicine‐ and family medicine‐trained clinicians still perform paracentesis procedures. This is concerning because internal medicine and family medicine trainees report variable confidence, experience, expertise, and supervision regarding performance of invasive procedures.[9, 10, 21, 22, 23, 24] Furthermore, earlier research demonstrates that graduating residents and fellows are not able to competently perform common bedside procedures such as thoracentesis, temporary hemodialysis catheter insertion, and lumbar puncture.[25, 26, 27]

The American Association for the Study of Liver Diseases (AASLD) recommends that trained clinicians perform paracentesis procedures.[3, 4] However, the AASLD does not define how this training should occur. Because competency in this procedure is not specifically required by the ABIM, the American Board of Family Medicine (ABFM), or the Accreditation Council for Graduate Medical Education (ACGME), a paradoxical situation arises in which internal medicine and family medicine residents, along with internal medicine‐trained fellows and faculty, continue to perform paracentesis procedures on highly complex patients but are no longer required to be competent to do so.

In earlier research, we showed that simulation‐based mastery learning (SBML) was an effective method to boost internal medicine residents' paracentesis skills.[28] In SBML, all trainees must meet or exceed a minimum passing score on a simulated procedure before performing one on an actual patient.[29] This approach improves clinical care and outcomes in procedures such as central venous catheter insertion[30, 31] and advanced cardiac life support.[32] Patients whose paracentesis procedures were performed by SBML‐trained residents also had shorter hospital LOS, fewer ICU transfers, and fewer blood product transfusions than patients whose procedures were referred to IR.[16] Based on the results of this study, AASLD guidelines regarding training, and our experience with SBML, we recommend that all clinicians complete paracentesis SBML training before performing procedures on patients.

Using our propensity model we identified patient characteristics that were associated with IR referral. Patients with a liver transplant were more likely to be cared for in IR. This may be due to a belief that postoperative procedures are anatomically more complex or because surgical trainees do not commonly perform this procedure. The current study confirms findings from earlier work that obese and female patients are more likely to be referred to IR.[16] IR referral of obese patients is likely to occur because paracentesis procedures are technically more difficult. We have no explanation why female patients were more likely to be referred to IR, because most decisions appear to be discretionary. Prospective studies are needed to determine evidence‐based recommendations regarding paracentesis procedure location. Patients with more comorbidities (eg, ICU stay, awaiting liver transplant, coagulation disorders) were more likely to undergo bedside procedures. The complexity of patients undergoing bedside paracentesis procedures reinforces the need for rigorous skill assessment for clinicians who perform them because complications such as intraperitoneal bleeding can be fatal.

Finally, we showed that LOS was similar but hospital direct costs were $800 to $1,300 lower for patients whose paracentesis procedures were performed by medicine or gastroenterology/hepatology rather than IR. Medical subspecialty and surgery procedures were more expensive than IR procedures, consistent with the higher LOS seen in these groups. IR procedures add costs due to facility charges for space, personnel, and equipment.[33] At our institution, the hospital cost of an IR paracentesis in 2012 was $361. If we use this figure and assume costs are similar across university hospitals, the resultant cost savings would be $10,257,454 (for the procedure alone) if all procedures referred to IR in this 3‐year study had instead been performed at the bedside. This estimate is approximate because it does not consider factors such as the cost of clinician staffing models, which may differ across UHC hospitals. As hospitals look to reduce costs, the potential savings from appropriate use of bedside and IR procedures should be considered. This is especially important because there is no evidence that the extra expense of IR procedures is justified by improved patient outcomes.
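The cost‐savings estimate in this paragraph is straightforward to reproduce from the IR procedure count in Table 1 and the single‐institution IR cost figure, under the text's stated assumption that costs are similar across university hospitals:

```python
ir_procedures = 28_414        # IR paracenteses during the study period (Table 1)
ir_cost_per_procedure = 361   # 2012 hospital cost of one IR paracentesis, dollars

potential_savings = ir_procedures * ir_cost_per_procedure
print(f"${potential_savings:,}")  # $10,257,454
```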

This study has several limitations. First, it was an observational study. Although the database was large, we were limited by coding accuracy and could not control for all potential confounding factors, such as Model for End‐Stage Liver Disease score,[34, 35] other specific laboratory values, amount of ascites fluid removed, or bedside procedure failures later referred to IR. However, we do know that only a small number of second, third, or fourth procedures were referred to IR after earlier ones were performed at the bedside. Additionally, the UHC database does not include patient‐specific data, so we could not adjust for multiple visits or procedures by the same patient. Second, we were unable to determine the level of teaching involvement at each UHC‐affiliated hospital. Community hospitals where attendings managed most patients without trainees could not be differentiated from university hospitals where trainees were involved in most patients' care. Third, we did not have specialty information for 9,434 (9.7%) procedures and had to exclude these cases. We also excluded a large number of paracentesis procedures from our final outcomes analysis. However, this was necessary because we needed to perform a patient‐level analysis to ensure the propensity and outcomes models were accurate. Finally, we did not evaluate inpatient mortality or 30‐day hospital readmission rates. Mortality and readmission from complications of a paracentesis procedure are rare events.[3, 4, 36] However, mortality and hospital readmission among patients with liver disease are relatively common,[37, 38] and it was impossible to link these outcomes to a paracentesis procedure without the ability to review medical records.

In conclusion, paracentesis procedures are performed frequently by internal medicine‐ and family medicine‐trained clinicians in university hospitals. Because of these findings regarding current practice patterns, we believe the ACGME, ABIM, and ABFM should clarify their policies to require that residents are competent to perform paracentesis procedures before performing them on patients. This may improve supervision and training for paracentesis procedures that are already occurring and possibly encourage performance of additional, less costly bedside procedures.

Acknowledgements

The authors acknowledge Drs. Douglas Vaughan and Mark Williams for their support and encouragement of this work.

Disclosure: Nothing to report.

INTRODUCTION

Cirrhosis affects up to 3% of the population and is 1 of the 10 most common causes of death in the United States.[1, 2, 3, 4] Paracentesis procedures are frequently performed in patients with liver disease and ascites for diagnostic and/or therapeutic purposes. These procedures can be performed safely by trained clinicians at the bedside or referred to interventional radiology (IR).[2, 3, 4]

National practice patterns show that paracentesis procedures are increasingly referred to IR rather than performed at the bedside by internal medicine or gastroenterology clinicians.[5, 6, 7] In fact, a recent study of Medicare beneficiaries showed that inpatient and outpatient paracentesis procedures performed by radiologists increased by 964% from 1993 to 2008.[7] Reasons for the decline in bedside procedures include the increased availability of IR, lack of sufficient reimbursement, and the time required to perform paracentesis procedures.[5, 6, 7, 8] Surveys of internal medicine and family medicine residents and gastroenterology fellows show trainees often lack the confidence and experience needed to perform the procedure safely.[9, 10, 11] Additionally, many clinicians do not have expertise with ultrasound use and may not have access to necessary equipment.

Inconsistent certification requirements may also impact the competence and experience of physicians to perform paracentesis procedures. Internal medicine residents are no longer required by the American Board of Internal Medicine (ABIM) to demonstrate competency in procedures such as paracentesis for certification.[12] However, the Accreditation Council for Graduate Medical Education (ACGME) requirements state that internal medicine programs must offer residents the opportunity to demonstrate competence in the performance of procedures such as paracentesis, thoracentesis, and central venous catheter insertion.[13] The American Board of Family Medicine (ABFM) does not outline specific procedural competence for initial certification.[14] The ACGME states that family medicine residents must receive training to perform those clinical procedures required for their future practices but allows each program to determine which procedures to require.[15] Due to this uncertainty, practicing hospitalists are likely to have variable training and competence in bedside procedures such as paracentesis.

We previously showed that internal medicine residents rotating on the hepatology service of an academic medical center performed 59% of paracentesis procedures at the bedside.[16] These findings are in contrast to national data showing that 74% of paracentesis procedures performed on Medicare beneficiaries were performed by radiologists.[7] Practice patterns at university hospitals may not be reflected in this data because the study was limited to Medicare beneficiaries and included ambulatory patients.[7] In addition to uncertainty about who is performing this procedure in inpatient settings, little is known about the effect of specialty on postparacentesis clinical outcomes.[16, 17]

The current study had 3 aims: (1) evaluate which clinical specialties perform paracentesis procedures at university hospitals; (2) model patient characteristics associated with procedures performed at the bedside versus those referred to IR; and (3) among patients with a similar likelihood of IR referral, evaluate length of stay (LOS) and hospital costs of patients undergoing procedures performed by different specialties.

METHODS

We performed an observational administrative database review of patients who underwent paracentesis procedures in hospitals participating in the University HealthSystem Consortium (UHC) Clinical Database from January 2010 through December 2012. UHC is an alliance of 120 nonprofit academic medical centers and their 290 affiliated hospitals. UHC maintains databases containing clinical, operational, financial, and patient safety data from affiliated hospitals. Using the UHC database, we described the characteristics of all patients who underwent paracentesis procedures by clinical specialty performing the procedure. We then modeled the effects of patient characteristics on decision‐making about IR referral. Finally, among patients with a homogeneous predicted probability of IR referral, we compared LOS and direct costs by specialty performing the procedure. The Northwestern University institutional review board approved this study.

Procedure

We queried the UHC database for all patients over the age of 18 years who underwent paracentesis procedures (International Classification of Disease Revision 9 [ICD‐9] procedure code 54.91) and had at least 1 diagnosis code of liver disease (571.x). We excluded patients admitted to obstetrics. The query included patient and clinical characteristics such as admission, discharge, and procedure date; age, gender, procedure provider specialty, and intensive care unit (ICU) stay. We also obtained all ICD‐9 codes associated with the admission including obesity, severe liver disease, coagulation disorders, blood loss anemia, hyponatremia, hypotension, thrombocytopenia, liver transplant before or during the admission, awaiting liver transplant, and complications of liver transplant. We used ICD‐9 codes to calculate patients' Charlson score[18, 19] to assess severity of illness on admission.

LOS and total direct hospital costs were compared among patients with a paracentesis performed by a single clinical group and among patients with a similar predicted probability of IR referral. UHC generates direct cost estimates by applying Medicare Cost Report ratios of cost to charges with the labor cost further adjusted by the respective area wage index. Hospital costs were not available from 8.3% of UHC hospitals. We therefore based cost estimates on nonmissing data.

Paracentesis provider specialties were divided into 6 general categories: (1) IR (interventional and diagnostic radiology); (2) medicine (family medicine, general medicine, and hospital medicine); (3) subspecialty medicine (infectious disease, cardiology, nephrology, hematology/oncology, endocrinology, pulmonary, and geriatrics); (4) gastroenterology/hepatology (gastroenterology, hepatology, and transplant medicine); (5) general surgery (general surgery and transplant surgery); and (6) all other (included unclassified specialties). We present patient characteristics categorized by these specialty groups and for admissions in which multiple specialties performed procedures.

Study Design

To analyze an individual patient's likelihood of IR referral, we needed to restrict our sample to discharges where only 1 clinical specialty performed a paracentesis. Therefore, we excluded hybrid discharges with procedures performed by more than 1 specialty in a single admission as well as discharges with procedures performed by all other specialties. To compare LOS and direct cost outcomes, and to minimize selection bias among exclusively IR‐treated patients, we excluded hospitals without procedures done by both IR and medicine.

We modeled referral to IR as a function of patients' demographic and clinical variables, which we believed would affect the probability of referral. We then examined the IR referral model predicted probabilities (propensity score).[20] Finally, we examined mean differences in LOS and direct costs among discharges with a single clinical specialty group, while using the predicted probability of referral as a filter to compare these outcomes by specialty. We further tested specialty differences in LOS and direct costs controlling for demographic and clinical variables.

Statistical Analysis

To test the significance of differences between demographic and clinical characteristics of patients across specialties, we used 2 tests for categorical variables and analysis of variance or the Kruskal‐Wallis rank test for continuous variables. Random effects logistic regression, which adjusts standard errors for clustering by hospital, was used to model the likelihood of referral to IR. Independent variables included patient age, gender, obesity, coagulation disorders, blood loss anemia, hyponatremia, hypotension, thrombocytopenia, liver transplant before hospitalization, liver transplant during hospitalization, awaiting transplant, complications of liver transplant, ICU stay, Charlson score, and number of paracentesis procedures performed during the admission. Predicted probabilities derived from this IR referral model were used to investigate selection bias in our subsequent analyses of LOS and costs.[20]

We used random effects multiple linear regression to test the association of procedure specialty with hospital LOS and total direct costs, controlling for the same independent variables listed above. Analyses were conducted using both actual LOS in days and Medicare costs. We also performed a log transformation of LOS and costs to account for rightward skew. We only present actual LOS and cost results because results were virtually identical. We used SAS version 9 (SAS Institute Inc., Cary, NC) to extract data from the UHC Clinical Database. We performed all statistical analyses using Stata version 12 (StataCorp LP, College Station, TX).

RESULTS

Procedure and Discharge Level Results

There were 97,577 paracentesis procedures performed during 70,862 hospital admissions in 204 UHC hospitals during the study period. Table 1 shows specific specialty groups for each procedure. The all other category consisted of 17,558 subspecialty groups including 9,434 with specialty unknown. Twenty‐nine percent of procedures were performed in IR versus 27% by medicine, 11% by gastroenterology/hepatology, and 11% by subspecialty medicine.

Group Frequencies of 97,577 Paracentesis Procedures by Specialty Within Specialty Groups
Specialty GroupNo.%
Interventional radiology28,41429.1
Medicine26,03126.7
Family medicine1,0261.1
General medicine21,78722.3
Hospitalist3,2183.3
Subspecialty medicine10,55810.8
Infectious disease8480.9
Nephrology6150.6
Cardiology9911.0
Hematology oncology7950.8
Endocrinology3590.4
Pulmonology6,6056.8
Geriatrics3450.4
Gastroenterology/hepatology11,14311.4
Transplant medicine990.1
Hepatology8740.9
Gastroenterology10,17010.4
General surgery3,8734.0
Transplant surgery2,1462.2
General surgery1,7271.8
All other17,55818.0
Specialty unknown9,4349.7

Table 2 presents patient characteristics for 70,862 hospital discharges with paracentesis procedures grouped by whether single or multiple specialties performed procedures. Patient characteristics were significantly different across specialty groups. Medicine, subspecialty medicine, and gastroenterology/hepatology patients were younger, more likely to be male, and more likely to have severe liver disease, coagulation disorders, hypotension, and hyponatremia than IR patients.

Table 2. Characteristics of Patients Discharged by Medical Specialty Groups Performing Paracentesis Procedures at University HealthSystem Consortium Hospitals (N=70,862 Discharges From 2010–2012 in 204 Hospitals)

| Characteristic | All Discharges, N=70,862 | Interventional Radiology, n=9,348 | Medicine, n=13,789 | Subspecialty Medicine, n=5,085 | Gastroenterology/Hepatology, n=6,664 | General Surgery, n=1,891 | All Other, n=7,912 | Multiple Specialties, n=26,173 |
|---|---|---|---|---|---|---|---|---|
| Age 18–49 y (%) | 25.4 | 22.5 | 27.6 | 24.9 | 23.5 | 20.8 | 25.5 | 26.1 |
| Age 50–59 y (%) | 39.8 | 39.8 | 40.9 | 39.4 | 41.5 | 40.3 | 40.0 | 38.7 |
| Age 60–69 y (%) | 24.7 | 24.9 | 21.6 | 24.7 | 26.5 | 30.0 | 23.6 | 25.8 |
| Age 70+ y (%) | 10.1 | 12.9 | 9.9 | 11.1 | 8.4 | 8.9 | 11.0 | 9.4 |
| Male (%) | 65.5 | 64.2 | 67.6 | 67.5 | 65.7 | 66.6 | 65.7 | 64.2 |
| Severe liver disease (%)* | 73.7 | 65.3 | 67.8 | 71.0 | 75.3 | 66.6 | 67.6 | 82.1 |
| Obesity (BMI 40+) (%) | 6.3 | 6.1 | 5.3 | 5.7 | 5.1 | 5.8 | 5.2 | 7.6 |
| Any intensive care unit stay (%) | 31.0 | 10.9 | 16.8 | 50.5 | 16.9 | 36.7 | 22.3 | 47.8 |
| Coagulation disorders (%) | 24.3 | 14.8 | 20.2 | 29.9 | 16.1 | 19.0 | 17.8 | 33.1 |
| Blood loss anemia (%) | 3.4 | 1.3 | 2.8 | 2.7 | 2.7 | 1.9 | 2.1 | 5.2 |
| Hyponatremia (%) | 29.9 | 27.1 | 29.2 | 28.9 | 28.0 | 26.6 | 27.3 | 33.1 |
| Hypotension (%) | 9.8 | 7.0 | 8.0 | 11.0 | 7.7 | 10.5 | 8.1 | 12.4 |
| Thrombocytopenia (%) | 29.6 | 24.6 | 28.3 | 32.5 | 22.1 | 21.5 | 24.0 | 35.8 |
| Complication of transplant (%) | 3.3 | 2.1 | 1.1 | 2.4 | 4.0 | 10.3 | 2.7 | 4.7 |
| Awaiting liver transplant (%) | 7.6 | 6.4 | 4.0 | 5.4 | 12.8 | 16.0 | 7.8 | 8.2 |
| Prior liver transplant (%) | 0.5 | 0.8 | 0.3 | 0.3 | 0.7 | 0.7 | 0.4 | 0.6 |
| Liver transplant procedure (%) | 2.7 | 0.0 | 0.0 | 0.3 | 0.4 | 15.6 | 1.6 | 5.6 |
| Mean Charlson score (SD) | 4.51 (2.17) | 4.28 (2.26) | 4.16 (2.17) | 4.72 (2.30) | 4.30 (1.98) | 4.26 (2.22) | 4.36 (2.30) | 4.84 (2.07) |
| Mean paracentesis procedures per discharge (SD) | 1.38 (0.88) | 1.21 (0.56) | 1.26 (0.66) | 1.30 (0.76) | 1.31 (0.70) | 1.28 (0.78) | 1.22 (0.61) | 1.58 (1.13) |

NOTE: All patient characteristic comparisons across all specialty groups, P<0.0001. Abbreviations: BMI, body mass index; SD, standard deviation.

*Severe liver disease defined by International Classification of Diseases, 9th Revision codes 572.2–572.8 and 456.0–456.2x.

IR Referral Model

We first excluded 6,030 of 70,862 discharges (8.5%) from 59 hospitals without both IR and medicine procedures. We then excluded 24,986 of 70,862 discharges (35.3%) with procedures performed by multiple specialties during the same admission. Finally, we excluded 5,555 of 70,862 discharges (7.8%) with procedure specialty coded as "all other." Therefore, 34,291 discharges (48.4%; 43,337/97,577 [44.4%] of procedures) from 145 UHC hospitals with paracentesis procedures performed by a single clinical specialty group remained for the IR referral analysis sample. Among admissions in which multiple specialties performed paracenteses, 3,128 of 26,606 admissions with any IR procedure (11.8%) had a different specialty ascribed to the first, second, or third paracentesis before a subsequent IR procedure.

Model results (Table 3) indicate that patients who were obese (odds ratio [OR]: 1.25; 95% confidence interval [CI]: 1.10‐1.43) or who underwent a liver transplant procedure during the admission (OR: 2.03; 95% CI: 1.40‐2.95) were more likely to be referred to IR. However, male patients (OR: 0.89; 95% CI: 0.83‐0.95) and patients who required an ICU stay (OR: 0.39; 95% CI: 0.36‐0.43) were less likely to have IR procedures. Other patient factors reducing the likelihood of IR referral included characteristics associated with higher severity of illness (coagulation disorders, hyponatremia, hypotension, and thrombocytopenia).
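To see how the reported odds ratios relate to the underlying logistic model, the sketch below converts a few ORs from Table 3 back to log-odds coefficients and computes a predicted probability of IR referral for a hypothetical patient. The intercept is invented (the article does not report it), so the resulting probability is purely illustrative:

```python
import math

# Odds ratios reported in Table 3 (selected covariates).
odds_ratios = {"male": 0.89, "obesity_bmi_40plus": 1.25, "icu_care": 0.39}

# The model is fit on the log-odds scale, so each OR corresponds to a
# coefficient beta = ln(OR); an OR below 1 gives a negative coefficient.
betas = {k: math.log(v) for k, v in odds_ratios.items()}

# Hypothetical baseline log-odds (intercept), NOT from the article.
intercept = -1.0

def predicted_probability(patient):
    """Logistic model: p = 1 / (1 + exp(-(intercept + sum(beta_k * x_k))))."""
    log_odds = intercept + sum(betas[k] * patient.get(k, 0) for k in betas)
    return 1 / (1 + math.exp(-log_odds))

# An obese male patient with no ICU stay, under the hypothetical intercept:
p = predicted_probability({"male": 1, "obesity_bmi_40plus": 1, "icu_care": 0})
print(round(p, 3))
```

This is the same construction the propensity model uses to assign each discharge a predicted probability of IR referral.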

Table 3. Random Effects Logistic Regression Model of Likelihood of Interventional Radiology Paracentesis (N=34,291 Discharges From 145 Hospitals)

| Variable | Odds Ratio | 95% CI |
|---|---|---|
| Age 18–49 y | Reference | |
| Age 50–59 y | 1.05 | 0.97–1.14 |
| Age 60–69 y | 1.12 | 1.02–1.22 |
| Age 70+ y | 1.11 | 0.99–1.24 |
| Male | 0.89 | 0.83–0.95 |
| Obesity, BMI 40+ | 1.25 | 1.10–1.43 |
| ICU care | 0.39 | 0.36–0.43 |
| Coagulation disorders | 0.68 | 0.63–0.75 |
| Blood loss anemia | 0.52 | 0.41–0.66 |
| Hyponatremia | 0.85 | 0.80–0.92 |
| Hypotension | 0.83 | 0.74–0.93 |
| Thrombocytopenia | 0.94 | 0.87–1.01 |
| Prior liver transplant | 0.08 | 0.03–0.23 |
| Awaiting liver transplant | 0.86 | 0.76–0.98 |
| Complication of liver transplant | 1.07 | 0.88–1.31 |
| Liver transplant procedure | 2.03 | 1.40–2.95 |
| Charlson score | 1.00 | 0.99–1.01 |
| Number of paracentesis procedures | 0.90 | 0.85–0.95 |

NOTE: Abbreviations: BMI, body mass index; CI, confidence interval; ICU, intensive care unit.

Predicted Probabilities of IR Referral

Figure 1 presents the distribution of predicted probabilities of IR referral. Predicted probabilities were low overall, with very few patients having an equal chance of referral (the standard often used in comparative effectiveness analyses of observational data). Figure 1 indicates that IR referral probabilities were clustered in an unusual bimodal distribution. The cluster on the left, which centers around a 15% predicted probability of IR referral, consists of discharges with patient characteristics that were associated with a very low chance of an IR paracentesis. We therefore used this distribution to conduct comparative analyses of admission outcomes between clinical specialty groups, choosing to examine patients with a 20% or greater chance of IR referral.

Figure 1
Distribution of predicted probability of interventional radiology paracentesis. Discharges with paracentesis procedures from 145 University HealthSystem Consortium hospitals performed by a single specialty group (n = 34,291).

Post hoc analysis revealed that the biggest factor driving a low predicted probability of IR referral was whether patients experienced an ICU stay at any time during hospitalization. Among the discharges with a predicted probability ≥0.2 (n=26,615), there were only 87 discharges with ICU stays (0.3%). Among the discharges with a predicted probability <0.2 (n=7,676), 91.9% (n=7,055) had an ICU admission. We therefore used a threshold of 0.2 or greater to present the most comparable LOS and direct cost differences.
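The thresholding step that produces the comparable sample can be sketched as a simple filter. The records and field names below are invented for illustration (they are not UHC data), but the logic mirrors the analysis: keep discharges with a predicted probability of IR referral at or above the cutoff, and note how the excluded group is dominated by ICU stays:

```python
# Hypothetical discharges with model-predicted probabilities of IR referral.
discharges = [
    {"id": 1, "p_ir": 0.45, "icu_stay": False},
    {"id": 2, "p_ir": 0.31, "icu_stay": False},
    {"id": 3, "p_ir": 0.12, "icu_stay": True},
    {"id": 4, "p_ir": 0.08, "icu_stay": True},
    {"id": 5, "p_ir": 0.27, "icu_stay": False},
]

THRESHOLD = 0.2  # cutoff used in the article to form a comparable sample

# Discharges plausibly referable to either setting stay in the analysis...
comparable = [d for d in discharges if d["p_ir"] >= THRESHOLD]
# ...while the low-probability cluster (here, the ICU stays) drops out.
excluded = [d for d in discharges if d["p_ir"] < THRESHOLD]

print(len(comparable))
print(sum(d["icu_stay"] for d in excluded) / len(excluded))
```

In the article's data the same filter retained 26,615 discharges and screened out a group that was 91.9% ICU admissions.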

LOS and Cost Comparisons by Specialty

Mean LOS and hospital direct costs by specialty for our final analysis sample can be found in Table 4; differences between specialties were significant (P<0.0001). Patients undergoing IR procedures had equivalent LOS and costs to medicine patients, but lower LOS and costs than other clinical specialty groups. Random effects linear regression showed that neither medicine nor gastroenterology/hepatology patients had significantly different LOS from IR patients, but subspecialty medicine was associated with 0.89 additional days and general surgery with 1.47 additional days (both P<0.0001; R2=0.10). In the direct cost regression model, medicine patients were associated with $1308 lower costs and gastroenterology/hepatology patients with $803 lower costs than IR patients (both P=0.0001), whereas subspecialty medicine and general surgery had higher direct costs per discharge of $1886 and $3039, respectively (both P<0.0001, R2=0.19). Older age, obesity, coagulopathy, hyponatremia, hypotension, thrombocytopenia, liver transplant status, ICU care, higher Charlson score, and higher number of paracentesis procedures performed were all significantly associated with higher LOS and hospital costs in these linear models.

Table 4. Length of Stay and Total Hospital Direct Costs for Paracentesis Procedure Discharges Performed by a Single Specialty Group (Interventional Radiology Referral Probability ≥0.2)

| | All Admissions | Interventional Radiology | Medicine | Medicine Subspecialties | Gastroenterology/Hepatology | General Surgery |
|---|---|---|---|---|---|---|
| Length of stay sample, n | 26,615 | 7,677 | 10,413 | 2,210 | 5,182 | 1,133 |
| Direct cost sample, n* | 24,408 | 7,265 | 8,965 | 2,064 | 5,031 | 1,083 |
| Mean length of stay, d (SD) | 5.57 (5.63) | 5.20 (4.72) | 5.59 (5.85) | 6.28 (6.47) | 5.54 (5.31) | 6.67 (8.16) |
| Mean total direct cost, $ (SD)* | 11,447 (12,247) | 10,975 (9,723) | 10,517 (10,895) | 13,705 (16,591) | 12,000 (11,712) | 15,448 (23,807) |

NOTE: Length of stay and direct cost comparisons across all specialty groups, P<0.0001. Data not adjusted for patient characteristics. Abbreviations: SD, standard deviation.

*Total direct cost data were missing for 8.3% of discharges; the cost sample sizes in the second row reflect these exclusions.

DISCUSSION

This study showed that internal medicine‐ and family medicine‐trained clinicians perform approximately half of the inpatient paracentesis procedures at university hospitals and their affiliates. This confirms findings from our earlier single‐institution study[16] but contrasts with a previously published report based on Medicare claims. That report, which included ambulatory procedures, revealed that primary care physicians and gastroenterologists performed only approximately 10% of US paracentesis procedures in 2008.[7] Our findings suggest that practices are different at university hospitals, where patients with severe liver disease often seek care. Because we used the UHC database, it was not possible to determine whether the clinicians who performed paracentesis procedures in this study were internal medicine or family medicine residents, fellows, or attending physicians. However, findings from our own institution show that the vast majority of bedside paracentesis procedures are performed by internal medicine residents.[16]

Our findings have implications for certification of internal medicine and family medicine trainees. In 2008, the ABIM removed the requirement that internal medicine residents demonstrate competency in paracentesis.[12] This decision was informed by a lack of standardized methods to determine procedural competency and by published surveys showing that internal medicine and hospitalist physicians rarely performed bedside procedures.[5, 6] Despite this policy change, our findings show that current clinical practice at university hospitals does not reflect national practice patterns or certification requirements, because many internal medicine‐ and family medicine‐trained clinicians still perform paracentesis procedures. This is concerning because internal medicine and family medicine trainees report variable confidence, experience, expertise, and supervision regarding performance of invasive procedures.[9, 10, 21, 22, 23, 24] Furthermore, earlier research demonstrates that graduating residents and fellows are not able to competently perform common bedside procedures such as thoracentesis, temporary hemodialysis catheter insertion, and lumbar puncture.[25, 26, 27]

The American Association for the Study of Liver Diseases (AASLD) recommends that trained clinicians perform paracentesis procedures.[3, 4] However, the AASLD provides no definition for how training should occur. Because competency in this procedure is not specifically required by the ABIM, ABFM, or ACGME, a paradoxical situation occurs in which internal medicine and family medicine residents, and internal medicine‐trained fellows and faculty continue to perform paracentesis procedures on highly complex patients, but are no longer required to be competent to do so.

In earlier research we showed that simulation‐based mastery learning (SBML) was an effective method to boost internal medicine residents' paracentesis skills.[28] In SBML, all trainees must meet or exceed a minimum passing score on a simulated procedure before performing one on an actual patient.[29] This approach improves clinical care and outcomes in procedures such as central venous catheter insertion[30, 31] and advanced cardiac life support.[32] SBML‐trained residents also performed safe paracentesis procedures with shorter hospital LOS, fewer ICU transfers, and fewer blood product transfusions than IR procedures.[16] Based on the results of this study, AASLD guidelines regarding training, and our experience with SBML, we recommend that all clinicians complete paracentesis SBML training before performing procedures on patients.

Using our propensity model, we identified patient characteristics that were associated with IR referral. Patients who underwent a liver transplant procedure were more likely to be cared for in IR. This may be due to a belief that postoperative procedures are anatomically more complex or because surgical trainees do not commonly perform this procedure. The current study confirms findings from earlier work that obese and female patients are more likely to be referred to IR.[16] IR referral of obese patients is likely to occur because paracentesis procedures are technically more difficult in this population. We have no explanation for why female patients were more likely to be referred to IR, because most referral decisions appear to be discretionary. Prospective studies are needed to develop evidence‐based recommendations regarding paracentesis procedure location. Patients with more comorbidities (eg, ICU stay, awaiting liver transplant, coagulation disorders) were more likely to undergo bedside procedures. The complexity of patients undergoing bedside paracentesis procedures reinforces the need for rigorous skill assessment of the clinicians who perform them, because complications such as intraperitoneal bleeding can be fatal.

Finally, we showed that LOS was similar but hospital direct costs were $800 to $1300 lower for patients whose paracentesis procedure was performed by medicine or gastroenterology/hepatology rather than IR. Medical subspecialty and surgery procedures were more expensive than IR procedures, consistent with the higher LOS seen in these groups. IR procedures add costs due to facility charges for space, personnel, and equipment.[33] At our institution, the hospital cost of an IR paracentesis in 2012 was $361. If we use this figure, and assume costs are similar across university hospitals, the resultant cost savings would be $10,257,454 (for the procedure alone) if all procedures referred to IR in this 2‐year study were instead performed at the bedside. This estimate is approximate because it does not consider factors such as the cost of clinician staffing models, which may differ across UHC hospitals. As hospitals look to reduce costs, potential savings due to appropriate use of bedside and IR procedures should be considered. This is especially important because there is no evidence that the extra expense of IR procedures is justified by improved patient outcomes.
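The savings estimate above is straightforward arithmetic over figures reported in this article (28,414 IR procedures from Table 1, at the $361 per-procedure IR facility cost observed at the authors' institution in 2012):

```python
# Reproducing the article's back-of-the-envelope savings estimate.
ir_procedures = 28_414        # IR paracenteses in the 2-year study (Table 1)
cost_per_ir_procedure = 361   # hospital cost of one IR paracentesis, in dollars

# Procedure-only savings if every IR referral had been done at the bedside.
savings = ir_procedures * cost_per_ir_procedure
print(savings)  # 10257454, matching the $10,257,454 quoted in the discussion
```

As the text notes, this multiplication ignores staffing-model costs and assumes the single-institution cost generalizes across UHC hospitals.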

This study has several limitations. First, this was an observational study. Although the database was large, we were limited by coding accuracy and could not control for all potential confounding factors, such as Model for End‐Stage Liver Disease score,[34, 35] other specific laboratory values, amount of ascites fluid removed, or bedside procedure failures later referred to IR. However, we do know that only a small number of second, third, or fourth procedures were subsequently referred to IR after earlier ones were performed at the bedside. Additionally, the UHC database does not include patient‐specific data, and therefore we could not adjust for multiple visits or procedures by the same patient. Second, we were unable to determine the level of teaching involvement at each UHC‐affiliated hospital. Community hospitals where attendings managed most of the patients without trainees could not be differentiated from university hospitals where trainees were involved in most patients' care. Third, we did not have specialty information for 9,434 (9.7%) procedures and had to exclude these cases. We also excluded a large number of paracentesis procedures in our final outcomes analysis. However, this was necessary because we needed to perform a patient‐level analysis to ensure the propensity and outcomes models were accurate. Finally, we did not evaluate inpatient mortality or 30‐day hospital readmission rates. Mortality and readmission from complications of a paracentesis procedure are rare events.[3, 4, 36] However, mortality and hospital readmission among patients with liver disease are relatively common.[37, 38] It was impossible to link these outcomes to a paracentesis procedure without the ability to perform medical records review.

In conclusion, paracentesis procedures are performed frequently by internal medicine‐ and family medicine‐trained clinicians in university hospitals. Because of these findings regarding current practice patterns, we believe the ACGME, ABIM, and ABFM should clarify their policies to require that residents are competent to perform paracentesis procedures before performing them on patients. This may improve supervision and training for paracentesis procedures that are already occurring and possibly encourage performance of additional, less costly bedside procedures.

Acknowledgements

The authors acknowledge Drs. Douglas Vaughan and Mark Williams for their support and encouragement of this work.

Disclosure: Nothing to report.

References
  1. Runyon BA. A primer on detecting cirrhosis and caring for these patients without causing harm. Int J Hepatol. 2011:801983.
  2. Lefton HB, Rosa A, Cohen M. Diagnosis and epidemiology of cirrhosis. Med Clin North Am. 2009;93(4):787–799.
  3. Runyon BA; AASLD Practice Guidelines Committee. Management of adult patients with ascites due to cirrhosis: an update. Hepatology. 2009;49(6):2087–2107.
  4. Runyon BA; AASLD Practice Guidelines Committee. Management of adult patients with ascites due to cirrhosis: update 2012. Hepatology. 2009;49:2087–2107. Available at: http://www.aasld.org/practiceguidelines/Documents/ascitesupdate2013.pdf. Accessed October 21, 2013.
  5. Thakkar R, Wright SM, Alguire P, Wigton RS, Boonyasai RT. Procedures performed by hospitalist and non‐hospitalist general internists. J Gen Intern Med. 2010;25(5):448–452.
  6. Wigton RS, Alguire P. The declining number and variety of procedures done by general internists: a resurvey of members of the American College of Physicians. Ann Intern Med. 2007;146(5):355–360.
  7. Duszak R, Chatterjee AR, Schneider DA. National fluid shifts: fifteen‐year trends in paracentesis and thoracentesis procedures. J Am Coll Radiol. 2010;7(11):859–864.
  8. Duffy FD, Holmboe ES. What procedures should internists do? Ann Intern Med. 2007;146(5):392–393.
  9. Huang GC, Smith CC, Gordon CE, Feller‐Kopman DJ, Davis RB, Phillips RS. Beyond the comfort zone: residents assess their comfort performing inpatient medicine procedures. Am J Med. 2006;119(1):71.e17–e24.
  10. Sharp LK, Wang R, Lipsky MS. Perception of competency to perform procedures and future practice intent: a national survey of family practice residents. Acad Med. 2003;78(9):926–932.
  11. Guardino JM, Proctor DD, Lopez R, Carey W. Utilization of and adherence to the gastroenterology core curriculum on hepatology training during a gastrointestinal fellowship. Clin Gastroenterol Hepatol. 2008;6(6):682–688.
  12. American Board of Internal Medicine. Internal medicine policies. Available at: http://www.abim.org/certification/policies/imss/im.aspx. Accessed December 21, 2013.
  13. ACGME program requirements for graduate medical education in internal medicine. Available at: http://acgme.org/acgmeweb/portals/0/PFassets/2013‐PR‐FAQ‐PIF/140_internal_medicine_07012013.pdf. Accessed December 17, 2013.
  14. American Board of Family Medicine residency requirements. Available at: https://www.theabfm.org/cert/guidelines.aspx. Accessed December 17, 2013.
  15. ACGME program requirements for graduate medical education in family medicine. Available at: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/120pr07012007.pdf. Accessed December 17, 2013.
  16. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Clinical outcomes after bedside and interventional radiology paracentesis procedures. Am J Med. 2013;126(4):349–356.
  17. Grabau CM, Crago SF, Hoff LK, et al. Performance standards for therapeutic abdominal paracentesis. Hepatology. 2004;40(2):484–488.
  18. Deyo RA, Cherkin DC, Ciol MA. Adapting a clinical comorbidity index for use with ICD‐9‐CM administrative databases. J Clin Epidemiol. 1992;45(6):613–619.
  19. Romano PS, Roos LL, Jollis JG. Adapting a clinical comorbidity index for use with ICD‐9‐CM administrative data: differing perspectives. J Clin Epidemiol. 1993;46(10):1075–1079; discussion 1081–1090.
  20. Rosenbaum PR. The role of known effects in observational studies. Biometrics. 1989;45(2):557–569.
  21. Farnan JM, Petty LA, Georgitis E, et al. A systematic review: the effect of clinical supervision on patient and residency education outcomes. Acad Med. 2012;87(4):428–442.
  22. Lucas BP, Asbury JK, Wang Y, et al. Impact of a bedside procedure service on general medicine inpatients: a firm‐based trial. J Hosp Med. 2007;2(3):143–149.
  23. Haber LA, Lau CY, Sharpe BA, Arora VM, Farnan JM, Ranji SR. Effects of increased overnight supervision on resident education, decision‐making, and autonomy. J Hosp Med. 2012;7(8):606–610.
  24. Berns JS, O'Neill WC. Performance of procedures by nephrologists and nephrology fellows at U.S. nephrology training programs. Clin J Am Soc Nephrol. 2008;3(4):941–947.
  25. Wayne DB, Barsuk JH, O'Leary K, Fudala M, McGaghie WC. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3:48–54.
  26. Barsuk JH, Ahya SN, Cohen ER, McGaghie WC, Wayne DB. Mastery learning of temporary dialysis catheter insertion skills by nephrology fellows using simulation technology and deliberate practice. Am J Kidney Dis. 2009;54:70–76.
  27. Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation‐based education with mastery learning improves residents' lumbar puncture skills. Neurology. 2012;79:132–137.
  28. Barsuk JH, Cohen ER, Vozenilek JA, O'Connor LM, McGaghie WC, Wayne DB. Simulation‐based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012;4(1):23–27.
  29. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Medical education featuring mastery learning with deliberate practice can lead to better health for individuals and populations. Acad Med. 2011;86(11):e8–e9.
  30. Barsuk JH, McGaghie WC, Cohen ER, O'Leary KJ, Wayne DB. Simulation‐based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37(10):2697–2701.
  31. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Use of simulation‐based education to reduce catheter‐related bloodstream infections. Arch Intern Med. 2009;169(15):1420–1423.
  32. Didwania A, McGaghie WC, Cohen ER, et al. Progress toward improving the quality of cardiac arrest medical team responses at an academic teaching hospital. J Grad Med Educ. 2011;3(2):211–216.
  33. Saini S, Seltzer SE, Bramson RT, et al. Technical cost of radiologic examinations: analysis across imaging modalities. Radiology. 2000;216(1):269–272.
  34. Kamath PS, Wiesner RH, Malinchoc M, et al. A model to predict survival in patients with end‐stage liver disease. Hepatology. 2001;33(2):464–470.
  35. Wiesner R, Edwards E, Freeman R, et al. Model for end‐stage liver disease (MELD) and allocation of donor livers. Gastroenterology. 2003;124(1):91–96.
  36. Thomsen TW, Shaffer RW, White B, Setnik GS. Videos in clinical medicine. Paracentesis. N Engl J Med. 2006;355(19):e21.
  37. Centers for Disease Control and Prevention. Chronic liver disease or cirrhosis. National Hospital Discharge Survey: 2010 detailed diagnosis and procedure tables, number of first‐listed diagnoses (see ICD‐9‐CM code 571). Available at: http://www.cdc.gov/nchs/fastats/liverdis.htm. Accessed October 19, 2013.
  38. Berman K, Tandra S, Forssell K, et al. Incidence and predictors of 30‐day readmission among patients hospitalized for advanced liver disease. Clin Gastroenterol Hepatol. 2011;9(3):254–259.
Issue
Journal of Hospital Medicine - 9(3)
Page Number
162-168
Display Headline
Specialties performing paracentesis procedures at university hospitals: Implications for training and certification
Article Source
© 2014 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Jeffrey H. Barsuk, MD, Division of Hospital Medicine, 211 E. Ontario St., Suite 717, Chicago, IL 60611; Telephone: 312‐926‐3680; Fax: 312‐926‐4588; E‐mail: jbarsuk@nmh.org
Media Files

Residents' ECG Interpretation Skills

Display Headline
Imperfect beginnings: Incoming residents vary in their ability to interpret basic electrocardiogram findings

Decreased efficiency at the beginning of residency training likely results in preventable harm for patients, a phenomenon known as the July Effect.[1, 2] Postgraduate year (PGY)1 residents enter training with a variety of clinical skills and experiences, and concerns exist regarding their preparation to enter graduate medical education (GME).[3] Electrocardiogram (ECG) interpretation is a core clinical skill that residents must have on the first day of training to manage patients, recognize emergencies, and develop evidence‐based and cost‐effective treatment plans. We assessed incoming PGY‐1 residents' ability to interpret common ECG findings as part of a rigorous boot camp experience.[4]

METHODS

This was an institutional review board‐approved pre‐post study of 81 new PGY‐1 residents' ECG interpretation skills. Subjects represented all trainees from internal medicine (n=47), emergency medicine (n=13), anesthesiology (n=11), and general surgery (n=10), who entered GME at Northwestern University in June 2013. Residents completed a pretest, followed by a 60‐minute interactive small group tutorial and a post‐test. Program faculty and expert cardiologists selected 10 common ECG findings for the study, many representing medical emergencies requiring immediate treatment. The diagnoses were: normal sinus rhythm, hyperkalemia, right bundle branch block (RBBB), left bundle branch block (LBBB), complete heart block, lateral wall myocardial infarction (MI), anterior wall MI, atrial fibrillation, ventricular paced rhythm, and ventricular tachycardia (VT). ECGs were selected from an online reference set (www.ecg.bidmc.harvard.edu; Harvard University, Cambridge, MA) using different pre‐ and post‐test ECGs in varying order to represent the same 10 common findings.

RESULTS

All 81 residents completed the study. The mean age was 27 years, and 56% were male. Eighty (99%) graduated from a US medical school. The mean (standard deviation) United States Medical Licensing Examination scores were 243.8 (14.4) for step 1 and 251.8 (13.6) for step 2. Twenty‐six (32%) completed a cardiology rotation in medical school. Before the pretest, residents self‐assessed their ECG interpretation skills as a mean of 61.8 (standard deviation 17.2) using a scale of 0 (not confident) to 100 (very confident). Pretest results ranged from 60.5% correct (complete heart block) to 96.3% correct (normal sinus rhythm). Eighteen residents (22%) did not recognize hyperkalemia, 20 (25%) were unable to identify RBBB, and 15 (18%) LBBB. Twenty‐two (27%) could not discern a lateral wall MI, and 8 residents (10%) missed an anterior wall MI. Sixteen (20%) could not diagnose atrial fibrillation, 18 (22%) could not identify a ventricular paced rhythm, and 13 (16%) did not recognize VT. Mean post‐test scores improved significantly for 5 cases (P<0.05), but did not rise significantly for normal sinus rhythm, lateral wall MI, anterior wall MI, hyperkalemia, or ventricular paced rhythm (Figure 1).

Figure 1
Interns' overall ECG skills pre‐ and post‐boot camp. Abbreviations: AWMI, anterior wall myocardial infarction; CHB, complete heart block; LBBB, left bundle branch block; MI, myocardial infarction; RBBB, right bundle branch block; VT, ventricular tachycardia.

DISCUSSION

PGY‐1 residents from multiple specialties were not confident regarding their ability to interpret ECGs and could not reliably identify 10 basic findings. This was despite their graduating almost exclusively from US medical schools and performing at high levels on standardized tests. Although boot camp improved recognition of important ECG findings, including VT and bundle branch blocks, identification of emergent diagnoses such as lateral/anterior MI and hyperkalemia requires additional training and close supervision during patient care. This study provides further evidence that the preparation of PGY‐1 residents to enter GME is lacking. Recent calls for inclusion of cost‐consciousness and stewardship of resources as a seventh competency for residents[5] are challenging, because incoming trainees do not uniformly possess the basic clinical skills needed to make these judgments.[3, 4] If residents cannot reliably interpret ECGs, it is not possible to determine cost‐effective testing strategies for patients with cardiac conditions. Based on the results of this study and others,[3, 4] we believe medical schools should agree upon specific graduation requirements to ensure all students have mastered core competencies and are prepared to enter GME.

Acknowledgments

Disclosure: Nothing to report.

References
  1. Barach P, Philibert I. The July effect: fertile ground for systems improvement. Ann Intern Med. 2011;155(5):331–332.
  2. Young JQ, Ranji SR, Wachter RM, Lee CM, Niehaus B, Auerbach AD. July effect: impact of the academic year‐end changeover on patient outcomes: a systematic review. Ann Intern Med. 2011;155(5):309–315.
  3. Lypson ML, Frohna JG, Gruppen LD, Woolliscroft JO. Assessing residents' competencies at baseline: identifying the gaps. Acad Med. 2004;79(6):564–570.
  4. Cohen ER, Barsuk JH, Moazed F, et al. Making July safer: simulation‐based mastery learning during intern boot camp. Acad Med. 2013;88(2):233–239.
  5. Weinberger SE. Providing high‐value, cost‐conscious care: a critical seventh general competency for physicians. Ann Intern Med. 2011;155(6):386–388.
Issue
Journal of Hospital Medicine - 9(3)
Page Number
197-198


Article Source
© 2014 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Jane E. Wilcox, MD, Department of Medicine, Division of Cardiology, 251 East Huron St., Galter Suite 3‐150, Chicago, IL 60611; Telephone: 815‐847‐8455; Fax: 312‐926‐4227; E‐mail: jane-wilcox@fsm.northwestern.edu

Promoting Professionalism

Article Type
Changed
Mon, 01/02/2017 - 19:34
Display Headline
Promoting professionalism via a video‐based educational workshop for academic hospitalists and housestaff

Unprofessional behavior in the inpatient setting has the potential to impact care delivery and the quality of trainees' educational experience. These behaviors, from disparaging colleagues to blocking admissions, can negatively impact the learning environment. The learning environment, or the conditions created by the patient care team's actions, plays a critical role in the development of trainees.[1, 2] The rising presence of hospitalists in the inpatient setting raises the question of how their actions impact the learning environment. Professional behavior has been defined as a core competency for hospitalists by the Society of Hospital Medicine.[3] Professional behavior of all team members, from faculty to trainee, can impact the learning environment and patient safety.[4, 5] However, few educational materials exist to train faculty and housestaff on recognizing and ameliorating unprofessional behaviors.

A prior assessment regarding hospitalists' lapses in professionalism identified scenarios that demonstrated increased participation by hospitalists at 3 institutions.[6] Participants reported observation of or participation in specific unprofessional behaviors and rated their perception of these behaviors. Additional work within those residency environments demonstrated that residents' perceptions of and participation in these behaviors increased throughout training, with environmental characteristics, specifically faculty behavior, influencing trainee professional development and adoption of these behaviors.[7, 8]

Although overall participation in egregious behavior was low, resident participation in 3 categories of unprofessional behavior increased during internship. Those scenarios included disparaging the emergency room or primary care physician for missed findings or management decisions, blocking or not taking admissions appropriate for the service in question, and misrepresenting a test as urgent to expedite obtaining the test. We developed our intervention focused on these areas to address professionalism lapses that occur during internship. Our earlier work showed faculty role models influenced trainee behavior. For this reason, we provided education to both residents and hospitalists to maximize the impact of the intervention.

We present here a novel, interactive, video‐based workshop curriculum for faculty and trainees that aims to illustrate unprofessional behaviors and outlines the role faculty may play in promoting such behaviors. In addition, we review the result of postworkshop evaluation on intent to change behavior and satisfaction.

METHODS

A grant from the American Board of Internal Medicine Foundation supported this project. The working group that resulted, the Chicago Professional Practice Project and Outcomes, included faculty representation from 3 Chicago‐area hospitals: the University of Chicago, Northwestern University, and NorthShore University HealthSystem. Academic hospitalists at these sites were invited to participate. Each site also has an internal medicine residency program in which hospitalists were expected to attend the teaching service. Given this, resident trainees at all participating sites, and 1 community teaching affiliate program (Mercy Hospital and Medical Center) where academic hospitalists at the University of Chicago rotate, were recruited for participation. Faculty champions were identified for each site, and 1 internal and external faculty representative from the working group served to debrief and facilitate. Trainee workshops were administered by 1 internal and external collaborator, and for the community site, 2 external faculty members. Workshops were held during established educational conference times, and lunch was provided.

Scripts highlighting each of the behaviors identified in the prior survey were developed and peer reviewed for clarity and face validity across the 3 sites. Medical student and resident actors were trained utilizing the finalized scripts, and a performance artist affiliated with the Screen Actors Guild assisted in their preparation for filming. All videos were filmed at the University of Chicago Pritzker School of Medicine Clinical Performance Center. The final videos ranged in length from 4 to 7 minutes and included title, cast, and funding source. As an example, 1 video highlighted the unprofessional behavior of misrepresenting a test as urgent to prioritize one's patient in the queue. This video included a resident, intern, and attending on inpatient rounds during which the resident encouraged the intern to misrepresent the patient's status to expedite obtaining the study and facilitate the patient's discharge. The resident stressed that he would be in the clinic and had many patients to see, highlighting the impact of workload on unprofessional behavior, and aggressively persuaded the intern to sell her test to have it performed the same day. When this occurred, the attending applauded the intern for her strong work.

A moderator guide and debriefing tools were developed to facilitate discussion. The duration of each of the workshops was approximately 60 minutes. After welcoming remarks, participants were provided tools to utilize during the viewing of each video. These checklists noted the roles of those depicted in the video, asked to identify positive or negative behaviors displayed, and included questions regarding how behaviors could be detrimental and how the situation could have been prevented. After viewing the videos, participants divided into small groups to discuss the individual exhibiting the unprofessional behavior, their perceived motivation for said behavior, and its impact on the team culture and patient care. Following a small‐group discussion, large‐group debriefing was performed, addressing the barriers and facilitators to professional behavior. Two videos were shown at each workshop, and participants completed a postworkshop evaluation. Videos chosen for viewing were based upon preworkshop survey results that highlighted areas of concern at that specific site.

Postworkshop paper‐based evaluations assessed participants' perception of displayed behaviors on a Likert‐type scale (1=unprofessional to 5=professional) utilizing items validated in prior work,[6, 7, 8] their level of agreement regarding the impact of video‐based exercises, and intent to change behavior using a Likert‐type scale (1=strongly disagree to 5=strongly agree). A constructed‐response section for comments regarding their experience was included. Descriptive statistics and Wilcoxon rank sum analyses were performed.
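The Wilcoxon rank‐sum comparison described above can be sketched in a few lines of pure Python. This is an illustrative sketch only: the rating data below are hypothetical, and the normal approximation shown omits the tie‐variance correction for brevity (real analyses of Likert‐type data would typically use a statistics package).

```python
import math

def rank_sum_p(a, b):
    """Two-sided Wilcoxon rank-sum test via the normal approximation.

    Returns (W, p), where W is the rank sum of sample `a`.
    Ties receive mid-ranks; no tie correction is applied to the variance.
    """
    pooled = sorted(a + b)
    # Assign each distinct value the mid-rank of its tie block.
    rank_of = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank_of[pooled[i]] = (i + 1 + j) / 2.0  # average of ranks i+1..j
        i = j
    n1, n2 = len(a), len(b)
    W = sum(rank_of[v] for v in a)
    mean = n1 * (n1 + n2 + 1) / 2.0
    var = n1 * n2 * (n1 + n2 + 1) / 12.0
    z = (W - mean) / math.sqrt(var)
    # Two-sided p from the standard normal CDF (via erf).
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return W, p

# Hypothetical Likert-type ratings (1 = unprofessional ... 5 = professional)
faculty = [1, 1, 2, 2, 2, 3]
housestaff = [1, 1, 1, 1, 2, 2, 2, 3]
W, p = rank_sum_p(faculty, housestaff)
```

Identical samples yield p = 1, and clearly separated rating distributions yield small p, matching the intuition behind the group comparisons reported below.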

RESULTS

Forty‐four academic hospitalist faculty members (44/83; 53%) and 244 resident trainees (244/356; 68%) participated. When queried regarding their perception of the displayed behaviors in the videos, nearly 100% of faculty and trainees felt disparaging the emergency department or primary care physician for missed findings or clinical decisions was somewhat unprofessional or unprofessional. Ninety percent of hospitalists and 93% of trainees rated celebrating a blocked admission as somewhat unprofessional or unprofessional (Table 1).

Table 1. Hospitalist and Resident Perception of Portrayed Behaviors

Behavior (percentage rating it unprofessional or somewhat unprofessional): Faculty (n=44) / Housestaff (n=244)
Disparaging the ED/PCP to colleagues for findings later discovered on the floor or patient care management decisions: 95.6% / 97.5%
Refusing an admission that could be considered appropriate for your service (eg, blocking): 86.4% / 95.1%
Celebrating a blocked admission: 90.1% / 93.0%
Ordering a routine test as urgent to get it expedited: 77.2% / 80.3%

NOTE: Abbreviations: ED/PCP, emergency department/primary care physician.

The scenarios portrayed were well received, with more than 85% of faculty and trainees agreeing that the behaviors displayed were realistic. Those who perceived videos as very realistic were more likely to report intent to change behavior (93% vs 53%, P=0.01). Nearly two‐thirds of faculty and 67% of housestaff expressed agreement that they intended to change behavior based upon the experience (Table 2).

Table 2. Postworkshop Evaluation

Evaluation item (level of agreement, strongly agree or agree): Faculty (n=44) / Housestaff (n=244)
The scenarios portrayed in the videos were realistic: 86.4% / 86.9%
I will change my behavior as a result of this exercise: 65.9% / 67.2%
I feel that this was a useful and effective exercise: 65.9% / 77.1%

Qualitative comments in the constructed‐response portion of the evaluation noted the effectiveness of the interactive materials. In addition, the need for focused faculty development was identified by 1 respondent, who stated: "If unprofessional behavior is the unwritten curriculum, there needs to be an explicit, written curriculum to address it." Finally, the aim of facilitating self‐reflection is echoed in this faculty respondent's comment, "Always good to be reminded of our behaviors and the influence they have on others," and from this resident physician: "It helps to re‐evaluate how you talk to people."

CONCLUSIONS

Faculty can be a large determinant of the learning environment and impact trainees' professional development.[9] Hospitalists should be encouraged to embrace faculty role‐modeling of effective professional behaviors, especially given their increased presence in the inpatient learning environment. In addition, resident trainees and their behaviors contribute to the learning environment and influence the further professional development of more junior trainees.[10] Targeting professionalism education toward previously identified and prevalent unprofessional behaviors in the inpatient care of patients may serve to effect the most change among providers who practice in this setting. Individualized assessment of the learning environment may aid in identifying common scenarios that may plague a specific learning culture, allowing for relevant and targeted discussion of factors that promote and perpetuate such behaviors.[11]

Interactive, video‐based modules provided an effective way to promote interactive reflection and robust discussion. This model of experiential learning is an effective form of professional development as it engages the learner and stimulates ongoing incorporation of the topics addressed.[12, 13] Creating a shared concrete experience among targeted learners, using the video‐based scenarios, stimulates reflective observation, and ultimately experimentation, or incorporation into practice.[14]

There are several limitations to our evaluation, including that we focused solely on academic hospitalist programs and that our sample size for faculty and residents was small. Also, we addressed only a small, though representative, sample of unprofessional behaviors and have not yet linked the intervention to actual behavior change; further studies will be required to do so. Finally, the script scenarios used in this study were not previously published, as they were created specifically for this intervention. Validity evidence for these scenarios includes that they were based upon the results of earlier work from our institutions and that they underwent thorough peer review for content and clarity. Nevertheless, we believe these are positive findings for utilizing this type of interactive curriculum for professionalism education to promote self‐reflection and behavior change.

Video‐based professionalism education is a feasible, interactive mechanism to encourage self‐reflection and intent to change behavior among faculty and resident physicians. Future study is underway to conduct longitudinal assessments of the learning environments at the participating institutions to assess culture change, perceptions of behaviors, and sustainability of this type of intervention.

Disclosures: The authors acknowledge funding from the American Board of Internal Medicine. The funders had no role in the design of the study; the collection, analysis, and interpretation of the data; or the decision to approve publication of the finished manuscript. Results from this work have been presented at the Midwest Society of General Internal Medicine Regional Meeting, Chicago, Illinois, September 2011; Midwest Society of Hospital Medicine Regional Meeting, Chicago, Illinois, October 2011, and Society of Hospital Medicine Annual Meeting, San Diego, California, April 2012. The authors declare that they do not have any conflicts of interest to disclose.

References
  1. Liaison Committee on Medical Education. Functions and structure of a medical school. Available at: http://www.lcme.org/functions.pdf. Accessed October 10, 2012.
  2. Gillespie C, Paik S, Ark T, Zabar S, Kalet A. Residents' perceptions of their own professionalism and the professionalism of their learning environment. J Grad Med Educ. 2009;1:208–215.
  3. Society of Hospital Medicine. The core competencies in hospital medicine. http://www.hospitalmedicine.org/Content/NavigationMenu/Education/CoreCurriculum/Core_Competencies.htm. Accessed October 10, 2012.
  4. The Joint Commission. Behaviors that undermine a culture of safety. Sentinel Event Alert. 2008;(40):1–3. http://www.jointcommission.org/assets/1/18/SEA_40.pdf. Accessed October 10, 2012.
  5. Rosenstein AH, O'Daniel M. A survey of the impact of disruptive behaviors and communication defects on patient safety. Jt Comm J Qual Patient Saf. 2008;34:464–471.
  6. Reddy ST, Iwaz JA, Didwania AK, et al. Participation in unprofessional behaviors among hospitalists: a multicenter study. J Hosp Med. 2012;7(7):543–550.
  7. Arora VM, Wayne DB, Anderson RA, et al. Participation in and perceptions of unprofessional behaviors among incoming internal medicine interns. JAMA. 2008;300:1132–1134.
  8. Arora VM, Wayne DB, Anderson RA, et al. Changes in perception of and participation in unprofessional behaviors during internship. Acad Med. 2010;85:S76–S80.
  9. Schumacher DJ, Slovin SR, Riebschleger MP, et al. Perspective: beyond counting hours: the importance of supervision, professionalism, transitions of care, and workload in residency training. Acad Med. 2012;87(7):883–888.
  10. Haidet P, Stein H. The role of the student‐teacher relationship in the formation of physicians: the hidden curriculum as process. J Gen Intern Med. 2006;21:S16–S20.
  11. Thrush CR, Spollen JJ, Tariq SG, et al. Evidence for validity of a survey to measure the learning environment for professionalism. Med Teach. 2011;33(12):e683–e688.
  12. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice Hall; 1984.
  13. Armstrong E, Parsa‐Parsi R. How can physicians' learning style drive educational planning? Acad Med. 2005;80:680–684.
  14. Ber R, Alroy G. Twenty years of experience using trigger films as a teaching tool. Acad Med. 2001;76:656–658.
Issue
Journal of Hospital Medicine - 8(7)
Page Number
386-389

Unprofessional behavior in the inpatient setting has the potential to impact care delivery and the quality of trainee's educational experience. These behaviors, from disparaging colleagues to blocking admissions, can negatively impact the learning environment. The learning environment or conditions created by the patient care team's actions play a critical role in the development of trainees.[1, 2] The rising presence of hospitalists in the inpatient setting raises the question of how their actions impact the learning environment. Professional behavior has been defined as a core competency for hospitalists by the Society of Hospital Medicine.[3] Professional behavior of all team members, from faculty to trainee, can impact the learning environment and patient safety.[4, 5] However, few educational materials exist to train faculty and housestaff on recognizing and ameliorating unprofessional behaviors.

A prior assessment regarding hospitalists' lapses in professionalism identified scenarios that demonstrated increased participation by hospitalists at 3 institutions.[6] Participants reported observation or participation in specific unprofessional behaviors and rated their perception of these behaviors. Additional work within those residency environments demonstrated that residents' perceptions of and participation in these behaviors increased throughout training, with environmental characteristics, specifically faculty behavior, influencing trainee professional development and acclimation of these behaviors.[7, 8]

Although overall participation in egregious behavior was low, resident participation in 3 categories of unprofessional behavior increased during internship. Those scenarios included disparaging the emergency room or primary care physician for missed findings or management decisions, blocking or not taking admissions appropriate for the service in question, and misrepresenting a test as urgent to expedite obtaining the test. We developed our intervention focused on these areas to address professionalism lapses that occur during internship. Our earlier work showed faculty role models influenced trainee behavior. For this reason, we provided education to both residents and hospitalists to maximize the impact of the intervention.

We present here a novel, interactive, video‐based workshop curriculum for faculty and trainees that aims to illustrate unprofessional behaviors and outlines the role faculty may play in promoting such behaviors. In addition, we review the results of the postworkshop evaluation of intent to change behavior and satisfaction.

METHODS

A grant from the American Board of Internal Medicine Foundation supported this project. The resulting working group, the Chicago Professional Practice Project and Outcomes, included faculty representation from 3 Chicago‐area hospitals: the University of Chicago, Northwestern University, and NorthShore University HealthSystem. Academic hospitalists at these sites were invited to participate. Each site also has an internal medicine residency program in which hospitalists were expected to attend on the teaching service. Given this, resident trainees at all participating sites, and at 1 community teaching affiliate program (Mercy Hospital and Medical Center) where academic hospitalists at the University of Chicago rotate, were recruited for participation. Faculty champions were identified for each site, and 1 internal and 1 external faculty representative from the working group served to debrief and facilitate. Trainee workshops were administered by 1 internal and 1 external collaborator, and at the community site, by 2 external faculty members. Workshops were held during established educational conference times, and lunch was provided.

Scripts highlighting each of the behaviors identified in the prior survey were developed and peer reviewed for clarity and face validity across the 3 sites. Medical student and resident actors were trained utilizing the finalized scripts, and a performance artist affiliated with the Screen Actors Guild assisted in their preparation for filming. All videos were filmed at the University of Chicago Pritzker School of Medicine Clinical Performance Center. The final videos ranged in length from 4 to 7 minutes and included title, cast, and funding source. As an example, 1 video highlighted the unprofessional behavior of misrepresenting a test as urgent to prioritize one's patient in the queue. This video included a resident, intern, and attending on inpatient rounds, during which the resident encouraged the intern to misrepresent the patient's status to expedite obtaining the study and facilitate the patient's discharge. The resident stressed that he would be in the clinic and had many patients to see, highlighting the impact of workload on unprofessional behavior, and aggressively persuaded the intern to "sell" her test to have it performed the same day. When this occurred, the attending applauded the intern for her strong work.

A moderator guide and debriefing tools were developed to facilitate discussion. Each workshop lasted approximately 60 minutes. After welcoming remarks, participants were provided tools to utilize while viewing each video. These checklists noted the roles of those depicted in the video, asked participants to identify positive or negative behaviors displayed, and included questions regarding how behaviors could be detrimental and how the situation could have been prevented. After viewing the videos, participants divided into small groups to discuss the individual exhibiting the unprofessional behavior, the perceived motivation for that behavior, and its impact on team culture and patient care. Following small‐group discussion, a large‐group debriefing was performed, addressing the barriers and facilitators to professional behavior. Two videos were shown at each workshop, and participants completed a postworkshop evaluation. Videos chosen for viewing were based upon preworkshop survey results that highlighted areas of concern at that specific site.

Postworkshop paper‐based evaluations assessed participants' perception of displayed behaviors on a Likert‐type scale (1=unprofessional to 5=professional) utilizing items validated in prior work,[6, 7, 8] their level of agreement regarding the impact of video‐based exercises, and intent to change behavior using a Likert‐type scale (1=strongly disagree to 5=strongly agree). A constructed‐response section for comments regarding their experience was included. Descriptive statistics and Wilcoxon rank sum analyses were performed.
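The Methods describe comparing ordinal Likert‐type ratings between groups with Wilcoxon rank sum analyses. As a rough illustration of that kind of analysis (using made‐up ratings, not the study's data), a tie‐corrected rank‐sum test can be sketched in plain Python:

```python
import math
from itertools import chain

def rank_sum_test(a, b):
    """Wilcoxon rank-sum (Mann-Whitney U) with a tie-corrected normal
    approximation. Returns (U, two-sided p). Suited to ordinal Likert data,
    where ties are common."""
    combined = sorted(chain(a, b))
    n1, n2 = len(a), len(b)
    n = n1 + n2

    # Assign average ranks to tied values and accumulate the tie correction.
    ranks, tie_term, i = {}, 0, 0
    while i < n:
        j = i
        while j < n and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        t = j - i
        tie_term += t ** 3 - t
        i = j

    ra = sum(ranks[x] for x in a)        # rank sum of sample a
    u = ra - n1 * (n1 + 1) / 2           # Mann-Whitney U for sample a
    mu = n1 * n2 / 2
    var = n1 * n2 / 12 * ((n + 1) - tie_term / (n * (n - 1)))
    z = (u - mu) / math.sqrt(var)
    # Two-sided p from the standard normal CDF (via erf).
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u, p

# Hypothetical ratings on the study's scale (1=unprofessional ... 5=professional)
faculty = [1, 1, 2, 2, 2, 3, 1, 2]
housestaff = [1, 1, 1, 2, 1, 2, 1, 1, 2, 1]
u, p = rank_sum_test(faculty, housestaff)
print(f"U={u:.1f}, p={p:.3f}")
```

In practice an analysis like this would use a statistics package (e.g., scipy.stats.mannwhitneyu) rather than a hand‐rolled implementation; the sketch is only meant to show what the reported test computes.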

RESULTS

Forty‐four academic hospitalist faculty members (44/83; 53%) and 244 resident trainees (244/356; 68%) participated. When queried regarding their perception of the behaviors displayed in the videos, more than 95% of faculty and trainees felt disparaging the emergency department or primary care physician for missed findings or clinical decisions was somewhat unprofessional or unprofessional. Ninety percent of hospitalists and 93% of trainees rated celebrating a blocked admission as somewhat unprofessional or unprofessional (Table 1).

Hospitalist and Resident Perception of Portrayed Behaviors
Behavior Faculty Rated as Unprofessional or Somewhat Unprofessional (n = 44) Housestaff Rated as Unprofessional or Somewhat Unprofessional (n=244)
  • NOTE: Abbreviations: ED/PCP, emergency department/primary care physician.

Disparaging the ED/PCP to colleagues for findings later discovered on the floor or patient care management decisions 95.6% 97.5%
Refusing an admission that could be considered appropriate for your service (eg, blocking) 86.4% 95.1%
Celebrating a blocked admission 90.1% 93.0%
Ordering a routine test as urgent to get it expedited 77.2% 80.3%

The scenarios portrayed were well received, with more than 85% of faculty and trainees agreeing that the behaviors displayed were realistic. Those who perceived the videos as very realistic were more likely to report intent to change behavior (93% vs 53%, P=0.01). Nearly two‐thirds of faculty and 67% of housestaff agreed that they intended to change behavior based upon the experience (Table 2).

Postworkshop Evaluation
Evaluation Item Faculty Level of Agreement (Strongly Agree or Agree) (n=44) Housestaff Level of Agreement (Strongly Agree or Agree) (n=244)
The scenarios portrayed in the videos were realistic 86.4% 86.9%
I will change my behavior as a result of this exercise 65.9% 67.2%
I feel that this was a useful and effective exercise 65.9% 77.1%

Qualitative comments in the constructed‐response portion of the evaluation noted the effectiveness of the interactive materials. In addition, the need for focused faculty development was identified by 1 respondent, who stated: "If unprofessional behavior is the unwritten curriculum, there needs to be an explicit, written curriculum to address it." Finally, the aim of facilitating self‐reflection is echoed in this faculty respondent's comment, "Always good to be reminded of our behaviors and the influence they have on others," and from this resident physician: "It helps to re‐evaluate how you talk to people."

CONCLUSIONS

Faculty can be a large determinant of the learning environment and impact trainees' professional development.[9] Hospitalists should be encouraged to embrace faculty role‐modeling of effective professional behaviors, especially given their increased presence in the inpatient learning environment. In addition, resident trainees and their behaviors contribute to the learning environment and influence the further professional development of more junior trainees.[10] Targeting professionalism education toward previously identified and prevalent unprofessional behaviors in inpatient care may serve to effect the most change among providers who practice in this setting. Individualized assessment of the learning environment may aid in identifying common scenarios that plague a specific learning culture, allowing for relevant and targeted discussion of the factors that promote and perpetuate such behaviors.[11]

Interactive, video‐based modules provided an effective way to promote reflection and robust discussion. This model of experiential learning is an effective form of professional development, as it engages the learner and stimulates ongoing incorporation of the topics addressed.[12, 13] Creating a shared concrete experience among targeted learners, using the video‐based scenarios, stimulates reflective observation and, ultimately, experimentation, or incorporation into practice.[14]

There are several limitations to our evaluation, including that we focused solely on academic hospitalist programs and that our sample size for faculty and residents was small. Also, we addressed only a small, though representative, sample of unprofessional behaviors, and we have not yet linked the intervention to actual behavior change; further studies will be required to do this. Finally, the script scenarios used in this study were not previously published, as they were created specifically for this intervention. Validity evidence for these scenarios includes that they were based upon the results of earlier work from our institutions and underwent thorough peer review for content and clarity. Nonetheless, we believe these are positive findings for utilizing this type of interactive curriculum for professionalism education to promote self‐reflection and behavior change.

Video‐based professionalism education is a feasible, interactive mechanism to encourage self‐reflection and intent to change behavior among faculty and resident physicians. Future study is underway to conduct longitudinal assessments of the learning environments at the participating institutions to assess culture change, perceptions of behaviors, and sustainability of this type of intervention.

Disclosures: The authors acknowledge funding from the American Board of Internal Medicine. The funders had no role in the design of the study; the collection, analysis, and interpretation of the data; or the decision to approve publication of the finished manuscript. Results from this work have been presented at the Midwest Society of General Internal Medicine Regional Meeting, Chicago, Illinois, September 2011; Midwest Society of Hospital Medicine Regional Meeting, Chicago, Illinois, October 2011, and Society of Hospital Medicine Annual Meeting, San Diego, California, April 2012. The authors declare that they do not have any conflicts of interest to disclose.


References
  1. Liaison Committee on Medical Education. Functions and structure of a medical school. Available at: http://www.lcme.org/functions.pdf. Accessed October 10, 2012.
  2. Gillespie C, Paik S, Ark T, Zabar S, Kalet A. Residents' perceptions of their own professionalism and the professionalism of their learning environment. J Grad Med Educ. 2009;1:208–215.
  3. Society of Hospital Medicine. The core competencies in hospital medicine. http://www.hospitalmedicine.org/Content/NavigationMenu/Education/CoreCurriculum/Core_Competencies.htm. Accessed October 10, 2012.
  4. The Joint Commission. Behaviors that undermine a culture of safety. Sentinel Event Alert. 2008;(40):1–3. http://www.jointcommission.org/assets/1/18/SEA_40.pdf. Accessed October 10, 2012.
  5. Rosenstein AH, O'Daniel M. A survey of the impact of disruptive behaviors and communication defects on patient safety. Jt Comm J Qual Patient Saf. 2008;34:464–471.
  6. Reddy ST, Iwaz JA, Didwania AK, et al. Participation in unprofessional behaviors among hospitalists: a multicenter study. J Hosp Med. 2012;7(7):543–550.
  7. Arora VM, Wayne DB, Anderson RA, et al. Participation in and perceptions of unprofessional behaviors among incoming internal medicine interns. JAMA. 2008;300:1132–1134.
  8. Arora VM, Wayne DB, Anderson RA, et al. Changes in perception of and participation in unprofessional behaviors during internship. Acad Med. 2010;85:S76–S80.
  9. Schumacher DJ, Slovin SR, Riebschleger MP, et al. Perspective: beyond counting hours: the importance of supervision, professionalism, transitions of care, and workload in residency training. Acad Med. 2012;87(7):883–888.
  10. Haidet P, Stein H. The role of the student‐teacher relationship in the formation of physicians: the hidden curriculum as process. J Gen Intern Med. 2006;21:S16–S20.
  11. Thrush CR, Spollen JJ, Tariq SG, et al. Evidence for validity of a survey to measure the learning environment for professionalism. Med Teach. 2011;33(12):e683–e688.
  12. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice Hall; 1984.
  13. Armstrong E, Parsa‐Parsi R. How can physicians' learning style drive educational planning? Acad Med. 2005;80:680–684.
  14. Ber R, Alroy G. Twenty years of experience using trigger films as a teaching tool. Acad Med. 2001;76:656–658.
Display Headline
Promoting professionalism via a video‐based educational workshop for academic hospitalists and housestaff
Article Source
© 2013 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Jeanne M. Farnan, MD, 5841 South Maryland Avenue, AMB W216 MC 2007, Chicago, IL 60637; Telephone: 773‐834‐3401; Fax: 773‐834‐2238; E‐mail: jfarnan@medicine.bsd.uchicago.edu

Unprofessional Behavior and Hospitalists

Article Type
Changed
Mon, 05/22/2017 - 18:36
Display Headline
Participation in unprofessional behaviors among hospitalists: A multicenter study

The discrepancy between what is taught about professionalism in formal medical education and what is witnessed in the hospital has received increasing attention.1–7 This latter aspect of medical education contributes to the hidden curriculum and impacts medical trainees' views on professionalism.8 The hidden curriculum refers to the lessons trainees learn through informal interactions within the multilayered educational learning environment.9 A growing body of work examines how the hidden curriculum and disruptive physicians impact the learning environment.9, 10 In response, regulatory agencies, such as the Liaison Committee on Medical Education (LCME) and Accreditation Council for Graduate Medical Education (ACGME), require training programs and medical schools to maintain standards of professionalism and to regularly evaluate the learning environment and its impact on professionalism.11, 12 The ACGME in 2011 expanded its standards regarding professionalism by making certain that the program director and institution ensure "a culture of professionalism that supports patient safety and personal responsibility."11 Given this increasing focus on professionalism in medical school and residency training programs, it is critical to examine faculty perceptions and actions that may perpetuate the discrepancy between the formal and hidden curriculum.13 This early exposure is especially significant because unprofessional behavior in medical school is strongly associated with later disciplinary action by a medical board.14, 15 Certain unprofessional behaviors can also compromise patient care and safety, and can detract from the hospital working environment.16–18

In our previous work, we demonstrated that internal medicine interns reported increased participation in unprofessional behaviors regarding on‐call etiquette during internship.19, 20 Examples of these behaviors include refusing an admission (ie, blocking) and misrepresenting a test as urgent. Interestingly, students and residents have highlighted the powerful role of supervising faculty physicians in condoning or inhibiting such behavior. Given the increasing role of hospitalists as resident supervisors, it is important to consider the perceptions and actions of hospitalists with respect to perpetuating or hindering some unprofessional behaviors. Although hospital medicine is a relatively new specialty, many hospitalists are in frequent contact with medical trainees, perhaps because many residency programs and medical schools have a strong inpatient focus.21–23 It is thus possible that hospitalists have a major influence on residents' behaviors and views of professionalism. In fact, the Society of Hospital Medicine's Core Competencies for Hospital Medicine explicitly state that hospitalists are expected to "serve as a role model for professional and ethical conduct to house staff, medical students and other members of the interdisciplinary team."24

Therefore, the current study had 2 aims: first, to measure internal medicine hospitalists' perceptions of, and participation in, unprofessional behaviors using a previously validated survey; and second, to examine associations between job characteristics and participation in unprofessional behaviors.

METHODS

Study Design

This was a multi‐institutional, observational study that took place at the University of Chicago Pritzker School of Medicine, Northwestern University Feinberg School of Medicine, and NorthShore University HealthSystem. Hospitalist physicians employed at these hospitals were recruited for this study between June 2010 and July 2010. The Institutional Review Boards of the University of Chicago, Northwestern University, and NorthShore University HealthSystem approved this study. All subjects provided informed consent before participating.

Survey Development and Administration

Based on a prior survey of interns and third‐year medical students, a 35‐item survey was used to measure perceptions of, and participation in, unprofessional behaviors.8, 19, 20 The original survey was developed in 2005 by medical students who observed behaviors by trainees and faculty that they considered to be unprofessional. The survey was subsequently modified by interns to ascertain unprofessional behavior among interns. For this iteration, hospitalists and study authors at each site reviewed the survey items and adapted each item to ensure relevance to hospitalist work and generalizability across sites. New items were also created to refer specifically to work routinely performed by hospitalist attendings (eg, attesting to resident notes, transferring patients to other services to reduce workload). Because of this, certain items used jargon to refer to unprofessional behaviors as hospitalists do (ie, blocking admissions and turfing) and thus resonate with the literature describing these phenomena.25 Items were also written in such a fashion as to elicit the unprofessional nature of the behavior (ie, blocking an admission that could be appropriate for your service).

The final survey (see Supporting Information, Appendix, in the online version of this article) included domains such as interactions with others, interactions with trainees, and patient‐care scenarios. Demographic information and job characteristics were collected, including year of residency completion, total amount of clinical work, amount of night work, and amount of administrative work. To maintain anonymity in the context of a small sample, hospitalists were not asked whether they completed residency at the institution where they currently work. Instead, they were asked to rate their familiarity with residents at their institution on a Likert‐type scale ranging from very unfamiliar (1) to very familiar (5), with familiar (4) as an intermediate anchor. To help standardize levels of familiarity across hospitalists, we developed anchors that corresponded to how well a hospitalist would know resident names, with familiar defined as knowing over half of resident names.

Participants reported whether they participated in, or observed, a particular behavior and rated their perception of each behavior from 1 (unprofessional) to 5 (professional); ratings of unprofessional (1) or somewhat unprofessional (2) were classified as unprofessional. A site champion administered paper surveys during a routine faculty meeting at each site. An electronic version was administered using SurveyMonkey (SurveyMonkey, Palo Alto, CA) to hospitalists not present at the faculty meeting. Participants chose a unique, nonidentifiable code to facilitate truthful reporting while allowing data tracking in follow‐up studies.

Data Analysis

Clinical time was dichotomized at 50% full‐time equivalent (FTE), with those below 50% FTE defined as having less clinical time. Because teaching time was relatively low, with the median percent FTE spent on teaching at 10%, we used a cutoff of greater than 10% FTE to define greater teaching. Because many hospitalists engaged in no night work, night work was dichotomized into any night work versus none. Similarly, because many hospitalists had no administrative time, administrative time was split into those with any administrative work and those without. Lastly, those born after 1970 were classified as younger hospitalists.
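The dichotomization scheme above can be sketched in a few lines of code. This is an illustrative sketch only (the study used STATA, not Python), and the field names and example record are hypothetical:

```python
# Dichotomize job/demographic characteristics as described in Data Analysis.
# Field names and the example record are hypothetical, not the study data.
def code_predictors(h):
    """Return the binary predictors used in the regression models."""
    return {
        "less_clinical": h["clinical_fte"] < 0.50,     # below 50% FTE clinical time
        "greater_teaching": h["teaching_fte"] > 0.10,  # above the 10% FTE median
        "any_nights": h["night_weeks"] > 0,            # any night work at all
        "any_admin": h["admin_fte"] > 0,               # any administrative time
        "younger": h["birth_year"] > 1970,             # born after 1970
    }

# A hypothetical hospitalist: 80% clinical, 10% teaching, no nights,
# some administrative time, born in 1975.
example = {"clinical_fte": 0.80, "teaching_fte": 0.10,
           "night_weeks": 0, "admin_fte": 0.20, "birth_year": 1975}
print(code_predictors(example))
```

Coding each characteristic as a yes/no indicator, rather than a continuous value, is what makes the sensitivity analyses with alternative cutoffs described below straightforward to re-run.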

Chi‐square tests were used to compare site response rates, and descriptive statistics were used to examine demographic characteristics of hospitalist respondents, in addition to perception of, and participation in, unprofessional behaviors. Because items on the survey were highly correlated, we used factor analysis to identify the underlying constructs related to unprofessional behavior.26 Factor analysis is a statistical procedure most often used to explore which variables in a data set are most correlated with each other. By examining the patterns of similar responses, the underlying factors can be identified and extracted. These factors, by definition, are not correlated with each other. To select the number of factors to retain, the most common convention is the Kaiser criterion: retain all factors with eigenvalues greater than or equal to one.27 An eigenvalue measures the amount of variation across all survey items that is accounted for by that factor. If a factor has a low eigenvalue (less than 1, by convention), it contributes little and is ignored, as it is likely redundant with the higher‐value factors.

Because use of the Kaiser criterion often overestimates the number of factors to retain, another method is to use a scree plot, which tends to underestimate it. Both were used in this study to ensure a stable solution. To name the factors, we examined which items or groups of items loaded, or were most highly related to, each factor. To ensure an optimal factor solution, items with minimal participation (less than 3%) were excluded from factor analysis.
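The Kaiser retention rule itself is simple enough to sketch. The eigenvalues below are hypothetical, chosen only to mirror the study's outcome of 4 retained factors; in practice they would come from factoring the survey's item correlation matrix:

```python
# Kaiser criterion: retain factors with eigenvalue >= 1.
# The eigenvalues here are made up for illustration.
def retain_factors(eigenvalues, threshold=1.0):
    """Return the eigenvalues of factors kept under the Kaiser criterion."""
    return [ev for ev in eigenvalues if ev >= threshold]

eigs = [6.2, 3.1, 1.9, 1.4, 0.8, 0.5, 0.3]  # hypothetical, sorted descending
kept = retain_factors(eigs)
share = sum(kept) / sum(eigs)  # variance explained by retained factors
print(len(kept), round(share, 2))
```

A scree plot is the graphical counterpart: the same eigenvalues plotted in descending order, with the retention cutoff placed at the "elbow" where the curve flattens.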

Site‐adjusted multivariate regression analysis was then used to examine associations between job and demographic characteristics and the identified factors of unprofessional behavior. Models controlled for gender and familiarity with residents. Because the sample median was used to define greater teaching (>10% FTE), we also performed a sensitivity analysis using different cutoffs for teaching time (>20% FTE and teaching tertiles). Likewise, we used varying definitions of less clinical time to ensure that any statistically significant associations were robust across definitions. All data were analyzed using STATA 11.0 (Stata Corp, College Station, TX), and statistical significance was defined as P < 0.05.

RESULTS

Seventy‐seven of the 101 hospitalists (76.2%) at the 3 sites completed the survey. While response rates varied by site (site 1, 67%; site 2, 74%; site 3, 86%), the differences were not statistically significant (χ2 = 2.9, P = 0.24). Most hospitalists (79.2%) completed residency after 2000. Over half (57.1%) of participants were male, and over half (61%) reported having worked with their current hospitalist group for 1 to 4 years. Almost 60% (59.7%) reported being unfamiliar with residents in the program. Over 40% of hospitalists did not do any night work. Hospitalists were largely clinical: only one‐quarter of hospitalists reported 50% FTE or less of clinical work, and the median was 80% FTE. While 78% of hospitalists reported some teaching time, the median time on teaching service was low at 10% FTE (Table 1).

Demographics of Responders* (n = 77)

  • Abbreviations: IQR, interquartile range.

  • Site differences were observed for clinical practice characteristics, such as number of weeks of teaching service, weeks working nights, clinical time, research time, completed fellowship, and won teaching awards. Due to item nonresponse, the number of respondents reporting is listed for each item.

  • Familiarity with residents asked in lieu of whether hospitalist trained at the institution. Familiarity defined as a rating of 4 or 5 on Likert scale ranging from Very Unfamiliar (1) to Very Familiar (5), with Familiar (4) defined further as knowing >50% of residents' names.

Characteristic | Total n (%)
Male (%) | 44 (57.1)
Completed residency (%)
  Between 1981 and 1990 | 2 (2.6)
  Between 1991 and 2000 | 14 (18.2)
  After 2000 | 61 (79.2)
Medical school matriculation (%) (n = 76)
  US medical school | 59 (77.6)
  International medical school | 17 (22.3)
Years spent with current hospitalist group (%)
  <1 yr | 14 (18.2)
  1–4 yr | 47 (61.0)
  5–9 yr | 15 (19.5)
  >10 yr | 1 (1.3)
Familiarity with residents (%)
  Familiar | 31 (40.2)
  Unfamiliar | 46 (59.7)
No. of weeks per year spent on, median [IQR]
  Hospitalist practice (n = 72) | 26.0 [16.0–26.0]
  Teaching services (n = 68) | 4.0 [1.0–8.0]
Weeks working nights* (n = 71)
  >2 wk | 16 (22.5)
  1–2 wk | 24 (33.8)
  0 wk | 31 (43.7)
% Clinical time, median [IQR]* (n = 73) | 80 [50–99]
% Teaching time, median [IQR]* (n = 74) | 10 [1–20]
Any research time (%)* (n = 71) | 22 (31.0)
Any administrative time (%) (n = 72) | 29 (40.3)
Completed fellowship (%)* | 12 (15.6)
Won teaching awards (%)* (n = 76) | 21 (27.6)
View a career in hospital medicine as (%)
  Temporary | 11 (14.3)
  Long term | 47 (61.0)
  Unsure | 19 (24.7)

Hospitalists perceived almost all behaviors as unprofessional (unprofessional or somewhat unprofessional on a 5‐point Likert scale). The only behavior rated as professional, with a mean of 4.25 (95% CI 4.01–4.49), was staying past shift limit to complete a patient‐care task that could have been signed out. This behavior also had the highest level of participation by hospitalists (81.7%). Hospitalists were most ambivalent when rating the professionalism of attending an industry‐sponsored dinner or social event (mean 3.20, 95% CI 2.98–3.41) (Table 2).

Perception of, and Observation and Participation in, Unprofessional Behaviors Among Hospitalists (n = 77)

  • Abbreviations: ER, emergency room.

  • Perception rated on Likert scale from 1 (unprofessional) to 5 (professional).

Behavior | Perception, Mean (95% CI)* | Participation (%) | Observation (%)
Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) | 2.55 (2.34–2.76) | 67.1 | 80.3
Ordering a routine test as urgent to get it expedited | 2.82 (2.58–3.06) | 62.3 | 80.5
Making fun of other physicians to colleagues | 1.56 (1.39–1.70) | 40.3 | 67.5
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (eg, after the patient is admitted) | 2.01 (1.84–2.19) | 39.5 | 67.1
Signing out patients over the phone at the end of shift when sign‐out could have been done in person | 2.95 (2.74–3.16) | 40.8 | 65.8
Texting or using smartphone during educational conferences (ie, noon lecture) | 2.16 (1.95–2.36) | 39.0 | 72.7
Discussing patient information in public spaces | 1.49 (1.34–1.63) | 37.7 | 66.2
Making fun of other attendings to colleagues | 1.62 (1.46–1.78) | 35.1 | 61.0
Deferring family members' concerns about a change in the patient's clinical course to the primary team in order to avoid engaging in such a discussion | 2.16 (1.91–2.40) | 30.3 | 55.3
Making disparaging comments about a patient on rounds | 1.42 (1.27–1.56) | 29.8 | 67.5
Attending an industry (eg, pharmaceutical or equipment/device manufacturer)‐sponsored dinner or social event | 3.20 (2.98–3.41) | 28.6 | 60.5
Ignoring family member's nonurgent questions about a cross‐cover patient when you had time to answer | 2.05 (1.85–2.25) | 26.3 | 48.7
Attesting to a resident's note when not fully confident of the content of their documentation | 1.65 (1.45–1.85) | 23.4 | 32.5
Making fun of support staff to colleagues | 1.45 (1.31–1.59) | 22.1 | 57.9
Not correcting someone who mistakes a student for a physician | 2.19 (2.01–2.38) | 20.8 | 35.1
Celebrating a blocked admission | 1.80 (1.61–2.00) | 21.1 | 60.5
Making fun of residents to colleagues | 1.53 (1.37–1.70) | 18.2 | 44.2
Coming to work when you have a significant illness (eg, influenza) | 1.99 (1.79–2.19) | 14.3 | 35.1
Celebrating a successful turf | 1.71 (1.51–1.92) | 11.7 | 39.0
Failing to notify the patient that a member of the team made, or is concerned that they made, an error | 1.53 (1.34–1.71) | 10.4 | 20.8
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing) | 1.72 (1.52–1.91) | 9.3 | 58.7
Refusing an admission which could be considered appropriate for your service (eg, blocking) | 1.63 (1.44–1.82) | 7.9 | 68.4
Falsifying patient records (ie, back‐dating a note, copying forward unverified information, or documenting physical findings not personally obtained) | 1.22 (1.10–1.34) | 6.5 | 27.3
Making fun of students to colleagues | 1.35 (1.19–1.51) | 6.5 | 24.7
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error | 1.64 (1.46–1.82) | 5.2 | 13.2
Introducing a student as a doctor to patients | 1.96 (1.76–2.16) | 3.9 | 20.8
Signing‐out a procedure or task, that could have been completed during a required shift or by the primary team, in order to go home as early in the day as possible | 1.48 (1.32–1.64) | 3.9 | 48.1
Performing medical or surgical procedures on a patient beyond self‐perceived level of skill | 1.27 (1.14–1.41) | 2.6 | 7.8
Asking a student to obtain written consent from a patient or their proxy without supervision (eg, for blood transfusion or minor procedures) | 1.60 (1.42–1.78) | 2.6 | 36.5
Encouraging a student to state that they are a doctor in order to expedite patient care | 1.31 (1.15–1.47) | 2.6 | 6.5
Discharging a patient before they are ready to go home in order to reduce one's census | 1.18 (1.07–1.29) | 2.6 | 19.5
Reporting patient information (eg, labs, test results, exam results) as normal when uncertain of the true results | 1.29 (1.16–1.41) | 2.6 | 15.6
Asking a student to perform medical or surgical procedures which are perceived to be beyond their level of skill | 1.26 (1.12–1.40) | 1.3 | 3.9
Asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge | 1.41 (1.26–1.56) | 0.0 | 15.8

Participation in egregious behaviors, such as falsifying patient records (6.49%) and performing medical or surgical procedures on a patient beyond self‐perceived level of skill (2.60%), was very low. The most common behaviors rated as unprofessional that hospitalists reported participating in were having nonmedical/personal conversations in patient corridors (67.1%), ordering a routine test as urgent to expedite care (62.3%), and making fun of other physicians to colleagues (40.3%). Roughly 40% of participants reported disparaging the emergency room (ER) team or primary care physician for findings later discovered, signing out over the phone when it could have been done in person, and texting or using smartphones during educational conferences. Participation in unprofessional behaviors related to trainees was close to zero (eg, asking a student to discuss, with patients, medical or surgical information perceived to be beyond their level of knowledge). The least common behaviors that hospitalists reported participating in were discharging a patient before they are ready to go home in order to reduce one's census (2.56%) and reporting patient information as normal when uncertain of the true results (2.60%). As in previous studies of unprofessional behaviors, those who reported participation were less likely to rate the behavior as unprofessional.8, 19

Observation of behaviors ranged from 4% to 80%. In all cases, observation of a behavior was reported at a higher level than participation. Correlation between observation and participation was also high, with the exception of a few behaviors that had zero or near‐zero participation rates (ie, reporting patient information as normal when unsure of the true results).

After performing factor analysis, 4 factors had eigenvalues greater than 1 and were therefore retained and extracted for further analysis. These 4 factors accounted for 76% of the variance in responses reported on the survey. By examining which items or groups of items most strongly loaded on each factor, the factors were named accordingly: factor 1 referred to behaviors related to making fun of others, factor 2 referred to workload management, factor 3 referred to behaviors related to the learning environment, and factor 4 referred to behaviors related to time pressure (Table 3).

Results of Factor Analysis Displaying Items by Primary Loading
  • NOTE: Items were categorized using factor analysis to the factor that they loaded most highly on. All items shown loaded at 0.4 or above onto each factor. Four items were omitted due to loadings less than 0.4. One item cross‐loaded on multiple factors (deferring family questions). Abbreviations: ER, emergency room.

Factor 1: Making fun of others
Making fun of other physicians (0.78)
Making fun of attendings (0.77)
Making fun of residents (0.70)
Making disparaging comments about a patient on rounds (0.51)
Factor 2: Workload management
Celebrating a successful turf (0.81)
Celebrating a blocked‐admission (0.65)
Coming to work sick (0.56)
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing.) (0.51)
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (0.48)
Discharging a patient before they are ready to go home in order to reduce one's census (0.43)
Factor 3: Learning environment
Not correcting someone who mistakes a student for a physician (0.72)
Texting or using smartphone during educational conferences (ie, noon lecture) (0.51)
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error (0.45)
Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) (0.43)
Factor 4: Time pressure
Ignoring family member's nonurgent questions about a cross‐cover patient when you had the time to answer (0.50)
Signing out patients over the phone at the end of shift when sign‐out could have been done in person (0.46)
Attesting to a resident's note when not fully confident of the content of their documentation (0.44)

Using site‐adjusted multivariate regression, certain hospitalist job characteristics were associated with distinct patterns of participation in unprofessional behavior (Table 4). Those with less clinical time (<50% FTE) were more likely to participate in unprofessional behaviors related to making fun of others (factor 1; β = 0.94, 95% CI 0.32 to 1.56, P < 0.05). Hospitalists who had any administrative time (β = 0.61, 95% CI 0.11 to 1.10, P < 0.05) were more likely to report participation in behaviors related to workload management. Hospitalists engaged in any night work were more likely to report participation in unprofessional behaviors related to time pressure (β = 0.67, 95% CI 0.17 to 1.17, P < 0.05). Time devoted to teaching or research was not associated with greater participation in any of the domains of unprofessional behavior surveyed.

Association Between Hospitalist Job and Demographic Characteristics and Factors of Unprofessional Behavior

Predictor | Making Fun of Others, Beta [95% CI] | Learning Environment, Beta [95% CI] | Workload Management, Beta [95% CI] | Time Pressure, Beta [95% CI]

  • NOTE: Table shows the results of 4 different multivariable linear regression models, which examine the association between various covariates (job characteristics, demographic characteristics, and site) and factors of participation in unprofessional behaviors (making fun of others, learning environment, workload management, time pressure). Due to item nonresponse, n = 63 for all regression models. Abbreviations: CI, confidence interval.

  • *P < 0.05.

  • Less clinical was defined as less than 50% full‐time equivalent (FTE) in a given year spent on clinical work.

  • Teaching was defined as greater than the median (10% FTE) spent on teaching. Results did not change when using tertiles of teaching effort, or a cutoff at teaching greater than 20% FTE.

  • Administrative time, research time, and nights were defined as reporting any administrative time, research time, or night work, respectively (greater than 0% per year).

  • Younger was defined as those born after 1970.

Job characteristics
Less clinical | 0.94 [0.32, 1.56]* | −0.01 [−0.66, 0.64] | −0.17 [−0.84, 0.49] | 0.39 [−0.24, 1.01]
Administrative | 0.30 [−0.16, 0.76] | 0.06 [−0.43, 0.54] | 0.61 [0.11, 1.10]* | 0.26 [−0.20, 0.72]
Teaching | −0.01 [−0.49, 0.48] | −0.09 [−0.60, 0.42] | −0.12 [−0.64, 0.40] | 0.16 [−0.33, 0.65]
Research | −0.30 [−0.87, 0.27] | −0.38 [−0.98, 0.22] | −0.37 [−0.98, 0.24] | 0.13 [−0.45, 0.71]
Any nights | −0.08 [−0.58, 0.42] | 0.24 [−0.28, 0.77] | 0.24 [−0.29, 0.76] | 0.67 [0.17, 1.17]*
Demographic characteristics
Male | 0.06 [−0.42, 0.53] | 0.03 [−0.47, 0.53] | −0.05 [−0.56, 0.47] | −0.40 [−0.89, 0.08]
Younger | −0.05 [−0.79, 0.69] | −0.64 [−1.42, 0.14] | 0.87 [0.07, 1.67]* | 0.62 [−0.13, 1.37]
Unfamiliar with residents | −0.32 [−0.85, 0.22] | −0.32 [−0.89, 0.24] | 0.13 [−0.45, 0.70] | 0.47 [−0.08, 1.01]
Institution
Site 1 | 0.58 [−0.22, 1.38] | −0.05 [−0.89, 0.79] | 1.01 [0.15, 1.86]* | −0.77 [−1.57, 0.04]
Site 3 | −0.11 [−0.68, 0.47] | −0.70 [−1.31, −0.09]* | 0.43 [−0.20, 1.05] | 0.45 [−0.13, 1.04]
Constant | 0.03 [−0.99, 1.06] | 0.94 [−0.14, 2.02] | −1.23 [−2.34, −0.13]* | −1.34 [−2.39, −0.31]*

The only demographic characteristic significantly associated with unprofessional behavior was age: those born after 1970 were more likely to participate in unprofessional behaviors related to workload management (β = 0.87, 95% CI 0.07 to 1.67, P < 0.05). Site differences were also present. Specifically, one site was more likely to report participation in unprofessional behaviors related to workload management (site 1: β = 1.01, 95% CI 0.15 to 1.86, P < 0.05), while another site was less likely to report participation in behaviors related to the learning environment (site 3: β = −0.70, 95% CI −1.31 to −0.09, P < 0.05). Gender and familiarity with residents were not significant predictors of participation in unprofessional behaviors. Results remained robust in sensitivity analyses using different cutoffs of clinical time and teaching time.

DISCUSSION

This multisite study adds to what is known about the perceptions of, and participation in, unprofessional behaviors among internal medicine hospitalists. Hospitalists perceived almost all surveyed behaviors as unprofessional. Participation in egregious and trainee‐related unprofessional behaviors was very low. Four categories appeared to explain the variability in how hospitalists reported participation in unprofessional behaviors: making fun of others, workload management, learning environment, and time pressure. Participation in behaviors within these factors was associated with certain job characteristics, such as clinical time, administrative time, and night work, as well as age and site.

It is reassuring that participation in egregious and trainee‐related unprofessional behaviors was very low. It is noteworthy, however, that attending an industry‐sponsored dinner was not considered unprofessional. This was surprising in the setting of increased external pressures to report and ban such interactions.28 The perception that attending such dinners is acceptable may reflect a lag between current practice and national recommendations.

It is important to explore why certain job characteristics are associated with participation in unprofessional behaviors. For example, those with less clinical time were more likely to participate in making fun of others. It may be the case that hospitalists with more clinical time may make a larger effort to develop and maintain positive relationships. Another possible explanation is that hospitalists with less clinical time are more easily influenced by those in the learning environment who make fun of others, such as residents who they are supervising for only a brief period.

For unprofessional behaviors related to workload management, those who were younger and those with any administrative time were more likely to participate in behaviors such as celebrating a blocked admission. Our prior work shows that behaviors related to workload management are more widespread in residency; therefore, younger hospitalists, who are often recent residency graduates, may be more prone to these behaviors. While unproven, it is possible that those with more administrative time face competing priorities that motivate them to manage their workload more actively, leading to participation in workload‐management behaviors.

Hospitalists who did any night work were more likely to participate in unprofessional behaviors related to time pressure. This could reflect the high workloads that night hospitalists may face and the pressure they feel to wrap up work, resulting in a hasty handoff (ie, over the phone) or to defer work (ie, family questions). Site differences were also observed for participation in behaviors related to the learning environment, speaking to the importance of institutional culture.

It is worth mentioning that hospitalists who were teachers were not any less likely to report participating in certain behaviors. While 78% of hospitalists reported some level of teaching, the median reported teaching time was 10% FTE. This level of teaching likely reflects the diverse nature of hospitalist work. While hospitalists spend some time working with trainees, services that are not staffed with residents (eg, uncovered services) are becoming increasingly common due to stricter resident duty hour restrictions. This may explain why 60% of hospitalists reported being unfamiliar with residents. We also used a high bar for familiarity, defined as knowing half of residents by name, which served as a proxy for having trained at the institution where they currently work. Although hospitalists reported that a low fraction of their total clinical time was devoted to resident services, a significant fraction of resident services were staffed by hospitalists at all sites, making hospitalists a natural target for interventions.

These results have implications for future work to assess and improve professionalism in the hospital learning environment. First, interventions to address unprofessional behaviors should focus on behaviors with the highest participation rates. As in our earlier studies of residents, participation was high for certain behaviors, such as misrepresenting a test as urgent or disparaging the ER team or primary care physician (PCP) for a missed finding.19, 20 While blocking an admission was common in our studies of residents, reported participation among hospitalists was low. Similar to a prior study of clinical‐year medical students at one of our sites, 1 in 5 hospitalists reported not correcting someone who mistakes a student for a physician, highlighting the role that hospitalists may have in perpetuating this behavior.8 Additionally, addressing the behaviors identified in this study through novel curricular tools may help to teach residents many of the interpersonal and communication skills called for in the 2011 ACGME Common Program Requirements.11 The ACGME requirements also include the expectation that faculty model how to manage their time before, during, and after clinical assignments, and recognize that transferring a patient to a rested provider is best. Given that most hospitalists believe staying past shift limit is professional, these requirements will be difficult to adopt without widespread culture change.

Moreover, interventions could be tailored to hospitalists with certain job characteristics. Interventions may be educational or systems based. An example of the former is stressing the impact of the learning and working environment on trainees, and an example of the latter is streamlining the process in which ordered tests are executed to result in a more timely completion of tests. This may result in fewer physicians misrepresenting a test as urgent in order to have the test done in a timely manner. Additionally, hospitalists with less clinical time could receive education on their impact as a role model for trainees. Hospitalists who are younger or with administrative commitments could be trained on the importance of avoiding behaviors related to workload management, such as blocking or turfing patients. Lastly, given the site differences, critical examination of institutional culture and policies is also important. With funding from the American Board of Internal Medicine (ABIM) Foundation, we are currently creating an educational intervention, targeting those behaviors that were most frequent among hospitalists and residents at our institutions to promote dialogue and critical reflection, with the hope of reducing the most prevalent behaviors encountered.

There are several limitations to this study. Despite the anonymity of the survey, participants may have inaccurately reported their participation in unprofessional behaviors due to socially desirable responding. In addition, because we used factor analysis and multivariate regression models with a small sample size, item nonresponse limited the sample for regression analyses and raises the concern for response bias. However, all significant associations remained so after performing backwards stepwise elimination of covariates with P > 0.10 in larger models (n ranging from 65 to 69). Because we used self‐report and not direct observation of participation in unprofessional behaviors, it is not possible to validate the responses given. Future work could rely on 360‐degree evaluations or other methods to validate self‐reported responses. It is also important to consider assessing whether these behaviors are associated with actual patient outcomes, such as length of stay or readmission. Some items may not always be unprofessional; for example, texting during an educational conference might be done to advance patient care, which would not necessarily be unprofessional. The order in which the questions were asked could have led to bias. We asked about participation before perception to try to limit bias in reported participation; changing the order of these questions would potentially have resulted in under‐reporting of participation in behaviors that one perceived to be unprofessional. This study was conducted at 3 institutions located in Chicago, limiting generalizability to institutions outside of this area. Only internal medicine hospitalists were surveyed, which also limits generalizability to other disciplines and specialties within internal medicine. Lastly, it is important to highlight that hospitalists are not the sole teachers on inpatient services, since residents encounter a variety of faculty who serve as teaching attendings. Future work should expand to other centers and other specialties.

In conclusion, in this multi‐institutional study of hospitalists, participation in egregious behaviors was low. Four factors or patterns underlie hospitalists' reports of participation in unprofessional behavior: making fun of others, learning environment, workload management, and time pressure. Job characteristics (clinical time, administrative time, night work), age, and site were all associated with different patterns of unprofessional behavior. Specifically, hospitalists with less clinical time were more likely to make fun of others. Hospitalists who were younger in age, as well as those who had any administrative work, were more likely to participate in behaviors related to workload management. Hospitalists who work nights were more likely to report behaviors related to time pressure. Interventions to promote professionalism should take institutional culture into account and should focus on behaviors with the highest participation rates. Efforts should also be made to address underlying reasons for participation in these behaviors.

Acknowledgements

The authors thank Meryl Prochaska for her research assistance and manuscript preparation.

Disclosures: The authors acknowledge funding from the ABIM Foundation and the Pritzker Summer Research Program. The funders had no role in the design of the study; the collection, analysis, and interpretation of the data; or the decision to approve publication of the finished manuscript. Prior presentations of the data include the 2010 University of Chicago Pritzker School of Medicine Summer Research Forum, the 2010 University of Chicago Pritzker School of Medicine Medical Education Day, the 2010 Midwest Society of Hospital Medicine Meeting in Chicago, IL, and the 2011 Society of Hospital Medicine National Meeting in Dallas, TX. All authors disclose no relevant or financial conflicts of interest.

References
  1. Stern DT. Practicing what we preach? An analysis of the curriculum of values in medical education. Am J Med. 1998;104:569-575.
  2. Borgstrom E, Cohn S, Barclay S. Medical professionalism: conflicting values for tomorrow's doctors. J Gen Intern Med. 2010;25(12):1330-1336.
  3. Karnieli-Miller O, Vu TR, Holtman MC, Clyman SG, Inui TS. Medical students' professionalism narratives: a window on the informal and hidden curriculum. Acad Med. 2010;85(1):124-133.
  4. Cohn FG, Shapiro J, Lie DA, Boker J, Stephens F, Leung LA. Interpreting values conflicts experienced by obstetrics-gynecology clerkship students using reflective writing. Acad Med. 2009;84(5):587-596.
  5. Gaiser RR. The teaching of professionalism during residency: why it is failing and a suggestion to improve its success. Anesth Analg. 2009;108(3):948-954.
  6. Gofton W, Regehr G. What we don't know we are teaching: unveiling the hidden curriculum. Clin Orthop Relat Res. 2006;449:20-27.
  7. Hafferty FW. Definitions of professionalism: a search for meaning and identity. Clin Orthop Relat Res. 2006;449:193-204.
  8. Reddy ST, Farnan JM, Yoon JD, et al. Third-year medical students' participation in and perceptions of unprofessional behaviors. Acad Med. 2007;82:S35-S39.
  9. Hafferty FW. Beyond curriculum reform: confronting medicine's hidden curriculum. Acad Med. 1998;73:403-407.
  10. Pfifferling JH. Physicians' "disruptive" behavior: consequences for medical quality and safety. Am J Med Qual. 2008;23:165-167.
  11. Accreditation Council for Graduate Medical Education. Common Program Requirements: General Competencies. Available at: http://www.acgme.org/acwebsite/home/common_program_requirements_07012011.pdf. Accessed December 19, 2011.
  12. Liaison Committee on Medical Education. Functions and Structure of a Medical School. Available at: http://www.lcme.org/functions2010jun.pdf. Accessed June 30, 2010.
  13. Gillespie C, Paik S, Ark T, Zabar S, Kalet A. Residents' perceptions of their own professionalism and the professionalism of their learning environment. J Grad Med Educ. 2009;1:208-215.
  14. Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med. 2004;79:244-249.
  15. Papadakis MA, Teherani A, Banach MA, et al. Disciplinary action by medical boards and prior behavior in medical school. N Engl J Med. 2005;353:2673-2682.
  16. Rosenstein AH, O'Daniel M. A survey of the impact of disruptive behaviors and communication defects on patient safety. Jt Comm J Qual Patient Saf. 2008;34:464-471.
  17. Rosenstein AH, O'Daniel M. Managing disruptive physician behavior: impact on staff relationships and patient care. Neurology. 2008;70:1564-1570.
  18. The Joint Commission. Behaviors that undermine a culture of safety. Sentinel Event Alert. 2008. Available at: http://www.jointcommission.org/assets/1/18/SEA_40.PDF. Accessed April 28, 2012.
  19. Arora VM, Wayne DB, Anderson RA, Didwania A, Humphrey HJ. Participation in and perceptions of unprofessional behaviors among incoming internal medicine interns. JAMA. 2008;300:1132-1134.
  20. Arora VM, Wayne DB, Anderson RA, et al. Changes in perception of and participation in unprofessional behaviors during internship. Acad Med. 2010;85:S76-S80.
  21. Wachter RM. Reflections: the hospitalist movement a decade later. J Hosp Med. 2006;1:248-252.
  22. Society of Hospital Medicine. 2007-2008 Bi-Annual Survey. 2008. Available at: http://www.medscape.org/viewarticle/578134. Accessed April 28, 2012.
  23. Holmboe ES, Bowen JL, Green M, et al. Reforming internal medicine residency training. A report from the Society of General Internal Medicine's Task Force for Residency Reform. J Gen Intern Med. 2005;20:1165-1172.
  24. Society of Hospital Medicine. The Core Competencies in Hospital Medicine: a framework for curriculum development by the Society of Hospital Medicine. J Hosp Med. 2006;1(suppl 1):2-5.
  25. Caldicott CV, Dunn KA, Frankel RM. Can patients tell when they are unwanted? "Turfing" in residency training. Patient Educ Couns. 2005;56:104-111.
  26. Costello AB, Osborn JW. Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Pract Assess Res Eval. 2005;10:1-9.
  27. Principal Components and Factor Analysis. StatSoft Electronic Statistics Textbook. Available at: http://www.statsoft.com/textbook/principal-components-factor-analysis/. Accessed December 30, 2011.
  28. Brennan TA, Rothman DJ, Blank L, et al. Health industry practices that create conflicts of interest: a policy proposal for academic medical centers. JAMA. 2006;295(4):429-433.
Journal of Hospital Medicine - 7(7):543-550

The discrepancy between what is taught about professionalism in formal medical education and what is witnessed in the hospital has received increasing attention.1-7 This latter aspect of medical education contributes to the hidden curriculum and impacts medical trainees' views on professionalism.8 The hidden curriculum refers to the lessons trainees learn through informal interactions within the multilayered educational learning environment.9 A growing body of work examines how the hidden curriculum and disruptive physicians impact the learning environment.9, 10 In response, regulatory agencies, such as the Liaison Committee on Medical Education (LCME) and Accreditation Council for Graduate Medical Education (ACGME), require training programs and medical schools to maintain standards of professionalism, and to regularly evaluate the learning environment and its impact on professionalism.11, 12 In 2011, the ACGME expanded its standards regarding professionalism by requiring that the program director and institution ensure a culture of professionalism that supports patient safety and personal responsibility.11 Given this increasing focus on professionalism in medical school and residency training programs, it is critical to examine faculty perceptions and actions that may perpetuate the discrepancy between the formal and hidden curriculum.13 This early exposure is especially significant because unprofessional behavior in medical school is strongly associated with later disciplinary action by a medical board.14, 15 Certain unprofessional behaviors can also compromise patient care and safety, and can detract from the hospital working environment.16-18

In our previous work, we demonstrated that internal medicine interns reported increased participation in unprofessional behaviors regarding on‐call etiquette during internship.19, 20 Examples of these behaviors include refusing an admission (ie, blocking) and misrepresenting a test as urgent. Interestingly, students and residents have highlighted the powerful role of supervising faculty physicians in condoning or inhibiting such behavior. Given the increasing role of hospitalists as resident supervisors, it is important to consider the perceptions and actions of hospitalists with respect to perpetuating or hindering such unprofessional behaviors. Although hospital medicine is a relatively new specialty, many hospitalists are in frequent contact with medical trainees, perhaps because many residency programs and medical schools have a strong inpatient focus.21-23 It is thus possible that hospitalists have a major influence on residents' behaviors and views of professionalism. In fact, the Society of Hospital Medicine's Core Competencies for Hospital Medicine explicitly state that hospitalists are expected to serve as a role model for professional and ethical conduct to house staff, medical students, and other members of the interdisciplinary team.24

Therefore, the current study had 2 aims: first, to measure internal medicine hospitalists' perceptions of, and participation in, unprofessional behaviors using a previously validated survey; and second, to examine associations between job characteristics and participation in unprofessional behaviors.

METHODS

Study Design

This was a multi‐institutional, observational study that took place at the University of Chicago Pritzker School of Medicine, Northwestern University Feinberg School of Medicine, and NorthShore University HealthSystem. Hospitalist physicians employed at these hospitals were recruited for this study between June 2010 and July 2010. The Institutional Review Boards of the University of Chicago, Northwestern University, and NorthShore University HealthSystem approved this study. All subjects provided informed consent before participating.

Survey Development and Administration

Based on a prior survey of interns and third‐year medical students, a 35‐item survey was used to measure perceptions of, and participation in, unprofessional behaviors.8, 19, 20 The original survey was developed in 2005 by medical students who observed behaviors by trainees and faculty that they considered to be unprofessional. The survey was subsequently modified by interns to ascertain unprofessional behavior among interns. For this iteration, hospitalists and study authors at each site reviewed the survey items and adapted each item to ensure relevance to hospitalist work and generalizability across sites. New items were also created to refer specifically to work routinely performed by hospitalist attendings (eg, attesting to resident notes, transferring patients to other services to reduce workload). Because of this, certain items used jargon to refer to the unprofessional behavior as hospitalists do (ie, "blocking" admissions and "turfing"), and resonate with literature describing these phenomena.25 Items were also written to elicit the unprofessional nature of the behavior (ie, blocking an admission that could be appropriate for your service).

The final survey (see Supporting Information, Appendix, in the online version of this article) included domains such as interactions with others, interactions with trainees, and patient‐care scenarios. Demographic information and job characteristics were collected including year of residency completion, total amount of clinical work, amount of night work, and amount of administrative work. Hospitalists were not asked whether they completed residency at the institution where they currently work in order to maintain anonymity in the context of a small sample. Instead, they were asked to rate their familiarity with residents at their institution on a Likert‐type scale ranging from very unfamiliar (1) to familiar (3) to very familiar (5). To help standardize levels of familiarity across hospitalists, we developed anchors that corresponded to how well a hospitalist would know resident names with familiar defined as knowing over half of resident names.

Participants reported whether they participated in, or observed, a particular behavior and rated their perception of each behavior from 1 (unprofessional) to 5 (professional), with unprofessional and somewhat unprofessional defined as unprofessional. A site champion administered paper surveys during a routine faculty meeting at each site. An electronic version was administered using SurveyMonkey (SurveyMonkey, Palo Alto, CA) to hospitalists not present at the faculty meeting. Participants chose a unique, nonidentifiable code to facilitate truthful reporting while allowing data tracking in follow‐up studies.

Data Analysis

Clinical time was dichotomized at 50% full‐time equivalent (FTE) to define those with less clinical time. Because teaching time was relatively low, with a median of 10% FTE, we used a cutoff of greater than 10% FTE to define greater teaching time. Because many hospitalists engaged in no night work, night work was dichotomized into any night work versus none. Similarly, because many hospitalists had no administrative time, administrative time was dichotomized into any administrative work versus none. Lastly, those born after 1970 were classified as younger hospitalists.
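These dichotomizations can be sketched as follows. This is a minimal illustration, not the study's actual analysis code; the dictionary field names are hypothetical, while the cutoffs are those reported above.

```python
# Dichotomize raw job/demographic fields into the binary predictors used in
# the regression models. Field names are hypothetical; cutoffs follow the text.
def code_predictors(h):
    return {
        "less_clinical": h["clinical_fte_pct"] < 50,     # below 50% FTE clinical
        "greater_teaching": h["teaching_fte_pct"] > 10,  # above the 10% FTE median
        "any_nights": h["night_weeks"] > 0,              # any night work
        "any_admin": h["admin_fte_pct"] > 0,             # any administrative time
        "younger": h["birth_year"] > 1970,               # born after 1970
    }

# Example respondent resembling the sample medians reported in the results:
example = code_predictors({"clinical_fte_pct": 80, "teaching_fte_pct": 10,
                           "night_weeks": 0, "admin_fte_pct": 20,
                           "birth_year": 1975})
# example["less_clinical"] is False; example["any_admin"] is True
```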

Chi‐square tests were used to compare site response rates, and descriptive statistics were used to examine demographic characteristics of hospitalist respondents, in addition to perception of, and participation in, unprofessional behaviors. Because items on the survey were highly correlated, we used factor analysis to identify the underlying constructs that related to unprofessional behavior.26 Factor analysis is a statistical procedure that is most often used to explore which variables in a data set are most related or correlated to each other. By examining the patterns of similar responses, the underlying factors can be identified and extracted. These factors, by definition, are not correlated with each other. To select the number of factors to retain, the most common convention is to use Kaiser criterion, or retain all factors with eigenvalues greater than, or equal to, one.27 An eigenvalue measures the amount of variation in all of the items on the survey which is accounted for by that factor. If a factor has a low eigenvalue (less than 1 is the convention), then it is contributing little and is ignored, as it is likely redundant with the higher value factors.

Because use of Kaiser criterion often overestimates the number of factors to retain, another method is to use a scree plot which tends to underestimate the factors. Both were used in this study to ensure a stable solution. To name the factors, we examined which items or group of items loaded or were most highly related to which factor. To ensure an optimal factor solution, items with minimal participation (less than 3%) were excluded from factor analysis.
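The retention logic described above (eigenvalues of the item correlation matrix, the Kaiser criterion, and the descending ordering a scree plot displays) can be illustrated on synthetic data. This is a sketch under stated assumptions, not the study's analysis; the sample size and item count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for the survey: 77 respondents x 10 items, constructed so
# a few latent factors drive blocks of correlated items (illustration only).
latent = rng.normal(size=(77, 3))
loadings = rng.normal(size=(3, 10))
items = latent @ loadings + rng.normal(scale=0.5, size=(77, 10))

corr = np.corrcoef(items, rowvar=False)                # item correlation matrix
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending order

# Kaiser criterion: retain factors whose eigenvalue is >= 1. A scree plot is
# simply these eigenvalues plotted in descending order; its "elbow" gives the
# (usually more conservative) alternative retention rule mentioned in the text.
n_retain = int(np.sum(eigenvalues >= 1))
```

The eigenvalues of a correlation matrix sum to the number of items, so an eigenvalue below 1 marks a factor accounting for less variance than a single item, which is why such factors are discarded.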

Then, site‐adjusted multivariate regression analysis was used to examine associations between job and demographic characteristics, and the factors of unprofessional behavior identified. Models controlled for gender and familiarity with residents. Because sample medians were used to define greater teaching (>10% FTE), we also performed a sensitivity analysis using different cutoffs for teaching time (>20% FTE and teaching tertiles). Likewise, we also used varying definitions of less clinical time to ensure that any statistically significant associations were robust across varying definitions. All data were analyzed using STATA 11.0 (Stata Corp, College Station, TX) and statistical significance was defined as P < 0.05.
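A site-adjusted linear model of this form can be sketched with ordinary least squares on synthetic data. All variables below are hypothetical stand-ins; the study's actual models also controlled for gender and familiarity with residents.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 63  # analytic sample after item nonresponse, per the text

# Hypothetical binary job characteristics and a 3-level site variable
# (site 2 treated as the reference category, as in Table 4).
less_clinical = rng.integers(0, 2, n).astype(float)
any_nights = rng.integers(0, 2, n).astype(float)
site = rng.integers(1, 4, n)

X = np.column_stack([
    np.ones(n),                 # intercept
    less_clinical,
    any_nights,
    (site == 1).astype(float),  # site 1 dummy
    (site == 3).astype(float),  # site 3 dummy
])
# Synthetic factor score with a built-in positive effect of less clinical time.
y = 0.9 * less_clinical + rng.normal(scale=0.5, size=n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # fitted coefficients
```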

RESULTS

Seventy‐seven of the 101 hospitalists (76.2%) at 3 sites completed the survey. While response rates varied by site (site 1, 67%; site 2, 74%; site 3, 86%), the differences were not statistically significant (χ2 = 2.9, P = 0.24). Most hospitalists (79.2%) completed residency after 2000. Over half (57.1%) of participants were male, and over half (61%) reported having worked with their current hospitalist group for 1 to 4 years. Almost 60% (59.7%) reported being unfamiliar with residents in the program. Over 40% of hospitalists did not do any night work. Hospitalists were largely clinical: only one‐quarter reported clinical time of 50% FTE or less, and the median was 80% FTE. While 78% of hospitalists reported some teaching time, the median time on teaching service was low at 10% FTE (Table 1).
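The site comparison above is a Pearson chi-square test on a responders-by-site contingency table. A minimal sketch follows; the per-site counts are hypothetical, since the paper reports only the rates and the 77/101 total, so the resulting statistic differs from the reported 2.9.

```python
# Pearson chi-square statistic for an r x c contingency table
# (rows = sites, columns = responders / nonresponders).
def chi_square(table):
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_tot[i] * col_tot[j] / total
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts consistent with the reported rates (67%, 74%, 86%) and
# totals (77 responders of 101); df = (3 - 1) * (2 - 1) = 2.
stat = chi_square([[20, 10], [26, 9], [31, 5]])
```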

Demographics of Responders* (n = 77)
 Total n (%)
  • Abbreviations: IQR, interquartile range.

  • Site differences were observed for clinical practice characteristics, such as number of weeks of teaching service, weeks working nights, clinical time, research time, completed fellowship, and won teaching awards. Due to item nonresponse, number of respondents reporting is listed for each item.

  • Familiarity with residents asked in lieu of whether hospitalist trained at the institution. Familiarity defined as a rating of 4 or 5 on Likert scale ranging from Very Unfamiliar (1) to Very Familiar (5), with Familiar (4) defined further as knowing >50% of residents' names.

Male (%): 44 (57.1)
Completed residency (%)
  Between 1981 and 1990: 2 (2.6)
  Between 1991 and 2000: 14 (18.2)
  After 2000: 61 (79.2)
Medical school matriculation (%) (n = 76)
  US medical school: 59 (77.6)
  International medical school: 17 (22.3)
Years spent with current hospitalist group (%)
  <1 yr: 14 (18.2)
  1-4 yr: 47 (61.0)
  5-9 yr: 15 (19.5)
  >10 yr: 1 (1.3)
Familiarity with residents (%)
  Familiar: 31 (40.2)
  Unfamiliar: 46 (59.7)
No. of weeks per year spent on (median [IQR])
  Hospitalist practice (n = 72): 26.0 [16.0-26.0]
  Teaching services (n = 68): 4.0 [1.0-8.0]
Weeks working nights* (n = 71)
  >2 wk: 16 (22.5)
  1-2 wk: 24 (33.8)
  0 wk: 31 (43.7)
% Clinical time (median [IQR])* (n = 73): 80 [50-99]
% Teaching time (median [IQR])* (n = 74): 10 [1-20]
Any research time (%)* (n = 71): 22 (31.0)
Any administrative time (%) (n = 72): 29 (40.3)
Completed fellowship (%)*: 12 (15.6)
Won teaching awards (%)* (n = 76): 21 (27.6)
View a career in hospital medicine as (%)
  Temporary: 11 (14.3)
  Long term: 47 (61.0)
  Unsure: 19 (24.7)

Hospitalists perceived almost all behaviors as unprofessional (unprofessional or somewhat unprofessional on a 5‐point Likert scale). The only behavior rated as professional, with a mean of 4.25 (95% CI 4.01-4.49), was staying past shift limit to complete a patient‐care task that could have been signed out. This behavior also had the highest level of participation by hospitalists (81.7%). Hospitalists were most ambivalent when rating professionalism of attending an industry‐sponsored dinner or social event (mean 3.20, 95% CI 2.98-3.41) (Table 2).
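The summary statistics quoted above are a mean Likert rating with a normal-approximation 95% confidence interval. A minimal sketch on hypothetical ratings (not the study data):

```python
import math

def mean_ci(ratings, z=1.96):
    """Mean and normal-approximation 95% CI for a list of Likert ratings."""
    n = len(ratings)
    mean = sum(ratings) / n
    var = sum((r - mean) ** 2 for r in ratings) / (n - 1)  # sample variance
    half = z * math.sqrt(var / n)                          # CI half-width
    return mean, (mean - half, mean + half)

# Eight hypothetical ratings on the 1 (unprofessional) to 5 (professional) scale:
mean, (lo, hi) = mean_ci([5, 4, 4, 3, 5, 4, 5, 4])
# mean = 4.25, CI = (3.76, 4.74)
```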

Perception of, and Observation and Participation in, Unprofessional Behaviors Among Hospitalists (n = 77)
Behavior | Reported Perception, Mean Likert Score* (95% CI) | Reported Participation (%) | Reported Observation (%)
  • Abbreviations: ER, emergency room.

  • Perception rated on Likert scale from 1 (unprofessional) to 5 (professional).

Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) | 2.55 (2.34-2.76) | 67.1 | 80.3
Ordering a routine test as urgent to get it expedited | 2.82 (2.58-3.06) | 62.3 | 80.5
Making fun of other physicians to colleagues | 1.56 (1.39-1.70) | 40.3 | 67.5
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (eg, after the patient is admitted) | 2.01 (1.84-2.19) | 39.5 | 67.1
Signing out patients over the phone at the end of shift when sign‐out could have been done in person | 2.95 (2.74-3.16) | 40.8 | 65.8
Texting or using smartphone during educational conferences (ie, noon lecture) | 2.16 (1.95-2.36) | 39.0 | 72.7
Discussing patient information in public spaces | 1.49 (1.34-1.63) | 37.7 | 66.2
Making fun of other attendings to colleagues | 1.62 (1.46-1.78) | 35.1 | 61.0
Deferring family members' concerns about a change in the patient's clinical course to the primary team in order to avoid engaging in such a discussion | 2.16 (1.91-2.40) | 30.3 | 55.3
Making disparaging comments about a patient on rounds | 1.42 (1.27-1.56) | 29.8 | 67.5
Attending an industry (eg, pharmaceutical or equipment/device manufacturer)‐sponsored dinner or social event | 3.20 (2.98-3.41) | 28.6 | 60.5
Ignoring family member's nonurgent questions about a cross‐cover patient when you had time to answer | 2.05 (1.85-2.25) | 26.3 | 48.7
Attesting to a resident's note when not fully confident of the content of their documentation | 1.65 (1.45-1.85) | 23.4 | 32.5
Making fun of support staff to colleagues | 1.45 (1.31-1.59) | 22.1 | 57.9
Not correcting someone who mistakes a student for a physician | 2.19 (2.01-2.38) | 20.8 | 35.1
Celebrating a blocked admission | 1.80 (1.61-2.00) | 21.1 | 60.5
Making fun of residents to colleagues | 1.53 (1.37-1.70) | 18.2 | 44.2
Coming to work when you have a significant illness (eg, influenza) | 1.99 (1.79-2.19) | 14.3 | 35.1
Celebrating a successful turf | 1.71 (1.51-1.92) | 11.7 | 39.0
Failing to notify the patient that a member of the team made, or is concerned that they made, an error | 1.53 (1.34-1.71) | 10.4 | 20.8
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing) | 1.72 (1.52-1.91) | 9.3 | 58.7
Refusing an admission which could be considered appropriate for your service (eg, blocking) | 1.63 (1.44-1.82) | 7.9 | 68.4
Falsifying patient records (ie, back‐dating a note, copying forward unverified information, or documenting physical findings not personally obtained) | 1.22 (1.10-1.34) | 6.5 | 27.3
Making fun of students to colleagues | 1.35 (1.19-1.51) | 6.5 | 24.7
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error | 1.64 (1.46-1.82) | 5.2 | 13.2
Introducing a student as a doctor to patients | 1.96 (1.76-2.16) | 3.9 | 20.8
Signing‐out a procedure or task, that could have been completed during a required shift or by the primary team, in order to go home as early in the day as possible | 1.48 (1.32-1.64) | 3.9 | 48.1
Performing medical or surgical procedures on a patient beyond self‐perceived level of skill | 1.27 (1.14-1.41) | 2.6 | 7.8
Asking a student to obtain written consent from a patient or their proxy without supervision (eg, for blood transfusion or minor procedures) | 1.60 (1.42-1.78) | 2.6 | 36.5
Encouraging a student to state that they are a doctor in order to expedite patient care | 1.31 (1.15-1.47) | 2.6 | 6.5
Discharging a patient before they are ready to go home in order to reduce one's census | 1.18 (1.07-1.29) | 2.6 | 19.5
Reporting patient information (eg, labs, test results, exam results) as normal when uncertain of the true results | 1.29 (1.16-1.41) | 2.6 | 15.6
Asking a student to perform medical or surgical procedures which are perceived to be beyond their level of skill | 1.26 (1.12-1.40) | 1.3 | 3.9
Asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge | 1.41 (1.26-1.56) | 0.0 | 15.8

Participation in egregious behaviors, such as falsifying patient records (6.49%) and performing medical or surgical procedures on a patient beyond self‐perceived level of skill (2.60%), was very low. The most common behaviors rated as unprofessional that hospitalists reported participating in were having nonmedical/personal conversations in patient corridors (67.1%), ordering a routine test as urgent to expedite care (62.3%), and making fun of other physicians to colleagues (40.3%). Approximately 40% of participants reported disparaging the emergency room (ER) team or primary care physician for findings later discovered, signing out over the phone when it could have been done in person, and texting or using smartphones during educational conferences. In particular, participation in unprofessional behaviors related to trainees was close to zero (eg, asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge). The least common behaviors that hospitalists reported participating in were discharging a patient before they are ready to go home in order to reduce one's census (2.56%) and reporting patient information as normal when uncertain of the true results (2.60%). As in previous studies of unprofessional behaviors, those who reported participation were less likely to rate the behavior as unprofessional.8, 19

Observation of behaviors ranged from 4% to 80%. In all cases, observation of the behavior was reported at a higher level than participation. Correlation between observation and participation was also high, with the exception of a few behaviors that had zero or near zero participation rates (ie, reporting patient information as normal when unsure of true results).

After performing factor analysis, 4 factors had eigenvalues greater than 1 and were therefore retained and extracted for further analysis. These 4 factors accounted for 76% of the variance in responses reported on the survey. By examining which items or groups of items most strongly loaded on each factor, the factors were named accordingly: factor 1 referred to behaviors related to making fun of others, factor 2 referred to workload management, factor 3 referred to behaviors related to the learning environment, and factor 4 referred to behaviors related to time pressure (Table 3).

Results of Factor Analysis Displaying Items by Primary Loading
  • NOTE: Items were categorized using factor analysis to the factor that they loaded most highly on. All items shown loaded at 0.4 or above onto each factor. Four items were omitted due to loadings less than 0.4. One item cross‐loaded on multiple factors (deferring family questions). Abbreviations: ER, emergency room.

Factor 1: Making fun of others
Making fun of other physicians (0.78)
Making fun of attendings (0.77)
Making fun of residents (0.70)
Making disparaging comments about a patient on rounds (0.51)
Factor 2: Workload management
Celebrating a successful turf (0.81)
Celebrating a blocked‐admission (0.65)
Coming to work sick (0.56)
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing.) (0.51)
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (0.48)
Discharging a patient before they are ready to go home in order to reduce one's census (0.43)
Factor 3: Learning environment
Not correcting someone who mistakes a student for a physician (0.72)
Texting or using smartphone during educational conferences (ie, noon lecture) (0.51)
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error (0.45)
Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) (0.43)
Factor 4: Time pressure
Ignoring family member's nonurgent questions about a cross‐cover patient when you had the time to answer (0.50)
Signing out patients over the phone at the end of shift when sign‐out could have been done in person (0.46)
Attesting to a resident's note when not fully confident of the content of their documentation (0.44)

Using site‐adjusted multivariate regression, certain hospitalist job characteristics were associated with certain patterns of participating in unprofessional behavior (Table 4). Those with less clinical time (<50% FTE) were more likely to participate in unprofessional behaviors related to making fun of others (factor 1, β = 0.94, 95% CI 0.32 to 1.56, P < 0.05). Hospitalists who had any administrative time (β = 0.61, 95% CI 0.11 to 1.10, P < 0.05) were more likely to report participation in behaviors related to workload management. Hospitalists who engaged in any night work were more likely to report participation in unprofessional behaviors related to time pressure (β = 0.67, 95% CI 0.17 to 1.17, P < 0.05). Time devoted to teaching or research was not associated with greater participation in any of the domains of unprofessional behavior surveyed.

Association Between Hospitalist Job and Demographic Characteristics and Factors of Unprofessional Behavior
Model | Making Fun of Others | Learning Environment | Workload Management | Time Pressure
Predictor | Beta [95% CI] | Beta [95% CI] | Beta [95% CI] | Beta [95% CI]
  • NOTE: Table shows the results of 4 different multivariable linear regression models, which examine the association between various covariates (job characteristics, demographic characteristics, and site) and factors of participation in unprofessional behaviors (making fun of others, learning environment, workload management, time pressure). Due to item nonresponse, n = 63 for all regression models. Abbreviations: CI, confidence interval.

  • * P < 0.05.

  • Less clinical was defined as less than 50% full‐time equivalent (FTE) in a given year spent on clinical work.

  • Teaching was defined as greater than the median (10% FTE) spent on teaching. Results did not change when using tertiles of teaching effort, or a cutoff at teaching greater than 20% FTE.

  • Administrative time, research time, and nights were defined as reporting any administrative time, research time, or night work, respectively (greater than 0% per year).

  • Younger was defined as those born after 1970.

Job characteristics
Less clinical | 0.94 [0.32, 1.56]* | 0.01 [-0.66, 0.64] | 0.17 [-0.84, 0.49] | 0.39 [-0.24, 1.01]
Administrative | 0.30 [-0.16, 0.76] | 0.06 [-0.43, 0.54] | 0.61 [0.11, 1.10]* | 0.26 [-0.20, 0.72]
Teaching | 0.01 [-0.49, 0.48] | 0.09 [-0.60, 0.42] | 0.12 [-0.64, 0.40] | 0.16 [-0.33, 0.65]
Research | -0.30 [-0.87, 0.27] | -0.38 [-0.98, 0.22] | -0.37 [-0.98, 0.24] | 0.13 [-0.45, 0.71]
Any nights | 0.08 [-0.58, 0.42] | 0.24 [-0.28, 0.77] | 0.24 [-0.29, 0.76] | 0.67 [0.17, 1.17]*
Demographic characteristics
Male | 0.06 [-0.42, 0.53] | 0.03 [-0.47, 0.53] | 0.05 [-0.56, 0.47] | -0.40 [-0.89, 0.08]
Younger | 0.05 [-0.79, 0.69] | -0.64 [-1.42, 0.14] | 0.87 [0.07, 1.67]* | 0.62 [-0.13, 1.37]
Unfamiliar with residents | -0.32 [-0.85, 0.22] | -0.32 [-0.89, 0.24] | 0.13 [-0.45, 0.70] | 0.47 [-0.08, 1.01]
Institution
Site 1 | 0.58 [-0.22, 1.38] | 0.05 [-0.89, 0.79] | 1.01 [0.15, 1.86]* | -0.77 [-1.57, 0.04]
Site 3 | 0.11 [-0.68, 0.47] | -0.70 [-1.31, -0.09]* | 0.43 [-0.20, 1.05] | 0.45 [-0.13, 1.04]
Constant | 0.03 [-0.99, 1.06] | 0.94 [-0.14, 2.02] | -1.23 [-2.34, -0.13]* | -1.34 [-2.39, -0.31]*

The only demographic characteristic that was significantly associated with unprofessional behavior was age. Specifically, those who were born after 1970 were more likely to participate in unprofessional behaviors related to workload management (β = 0.87, 95% CI 0.07 to 1.67, P < 0.05). Site differences were also present. Specifically, one site was more likely to report participation in unprofessional behaviors related to workload management (β for site 1 = 1.01, 95% CI 0.15 to 1.86, P < 0.05), while another site was less likely to report participation in behaviors related to the learning environment (β for site 3 = -0.70, 95% CI -1.31 to -0.09, P < 0.05). Gender and familiarity with residents were not significant predictors of participation in unprofessional behaviors. Results remained robust in sensitivity analyses using different cutoffs of clinical time and teaching time.

DISCUSSION

This multisite study adds to what is known about the perceptions of, and participation in, unprofessional behaviors among internal medicine hospitalists. Hospitalists perceived almost all surveyed behaviors as unprofessional. Participation in egregious and trainee‐related unprofessional behaviors was very low. Four categories appeared to explain the variability in how hospitalists reported participation in unprofessional behaviors: making fun of others, workload management, learning environment, and time pressure. Participation in behaviors within these factors was associated with certain job characteristics, such as clinical time, administrative time, and night work, as well as age and site.

It is reassuring that participation in egregious and trainee‐related unprofessional behaviors is very low, and it is noteworthy that attending an industry‐sponsored dinner is not considered unprofessional. The latter was surprising in the setting of increased external pressures to report and ban such interactions.28 Perception that attending such dinners is acceptable may reflect a lag between current practice and national recommendations.

It is important to explore why certain job characteristics are associated with participation in unprofessional behaviors. For example, those with less clinical time were more likely to participate in making fun of others. It may be that hospitalists with more clinical time make a larger effort to develop and maintain positive relationships. Another possible explanation is that hospitalists with less clinical time are more easily influenced by those in the learning environment who make fun of others, such as residents who they are supervising for only a brief period.

For unprofessional behaviors related to workload management, those who were younger, and those with any administrative time, were more likely to participate in behaviors such as celebrating a blocked‐admission. Our prior work shows that behaviors related to workload management are more widespread in residency, and therefore younger hospitalists, who are often recent residency graduates, may be more prone to participating in these behaviors. While unproven, it is possible that those with more administrative time may have competing priorities with their administrative roles, which motivate them to more actively manage their workload, leading them to participate in workload management behaviors.

Hospitalists who did any night work were more likely to participate in unprofessional behaviors related to time pressure. This could reflect the high workloads that night hospitalists may face and the pressure they feel to wrap up work, resulting in a hasty handoff (ie, over the phone) or to defer work (ie, family questions). Site differences were also observed for participation in behaviors related to the learning environment, speaking to the importance of institutional culture.

It is worth noting that hospitalists who were teachers were no less likely to report participating in certain behaviors. While 78% of hospitalists reported some level of teaching, the median reported teaching time was 10% FTE. This low level of teaching likely reflects the diverse nature of hospitalist work. While hospitalists spend some time working with trainees, services that are not staffed with residents (eg, uncovered services) are becoming increasingly common due to stricter resident duty hour restrictions. This may explain why 60% of hospitalists reported being unfamiliar with residents. We also set a high bar for familiarity, defined as knowing half of residents by name, which served as a proxy for identifying those who may have trained at the institution where they currently work. Although hospitalists reported devoting a small fraction of their total clinical time to resident services, a significant fraction of resident services were staffed by hospitalists at all sites, making them a natural target for interventions.

These results have implications for future work to assess and improve professionalism in the hospital learning environment. First, interventions to address unprofessional behaviors should focus on behaviors with the highest participation rates. As in our earlier studies of residents, participation was high in certain behaviors, such as misrepresenting a test as urgent or disparaging the ER or primary care physician (PCP) for a missed finding.19, 20 While blocking an admission was common in our studies of residents, reported participation among hospitalists was low. Similar to a prior study of clinical year medical students at one of our sites, 1 in 5 hospitalists reported not correcting someone who mistakes a student for a physician, highlighting the role that hospitalists may have in perpetuating this behavior.8 Additionally, addressing the behaviors identified in this study through novel curricular tools may help to teach residents many of the interpersonal and communication skills called for in the 2011 ACGME Common Program Requirements.11 The ACGME requirements also include the expectation that faculty model how to manage their time before, during, and after clinical assignments, and recognize that transferring a patient to a rested provider is best. Given that most hospitalists believe staying past shift limit is professional, these requirements will be difficult to adopt without widespread culture change.

Moreover, interventions could be tailored to hospitalists with certain job characteristics. Interventions may be educational or systems based. An example of the former is stressing the impact of the learning and working environment on trainees; an example of the latter is streamlining the process by which ordered tests are executed so that tests are completed more promptly. This may result in fewer physicians misrepresenting a test as urgent in order to have it done in a timely manner. Additionally, hospitalists with less clinical time could receive education on their impact as role models for trainees. Hospitalists who are younger or have administrative commitments could be trained on the importance of avoiding behaviors related to workload management, such as blocking or turfing patients. Lastly, given the site differences, critical examination of institutional culture and policies is also important. With funding from the American Board of Internal Medicine (ABIM) Foundation, we are currently creating an educational intervention targeting the behaviors that were most frequent among hospitalists and residents at our institutions, to promote dialogue and critical reflection in the hope of reducing the most prevalent behaviors encountered.

There are several limitations to this study. Despite the anonymity of the survey, participants may have inaccurately reported their participation in unprofessional behaviors due to socially desirable responding. In addition, because we used factor analysis and multivariate regression models with a small sample size, item nonresponse limited the sample for regression analyses and raises concern about response bias. However, all significant associations remained so after backwards stepwise elimination of covariates with P > 0.10 in larger models (n ranging from 65 to 69). Because we used self‐report rather than direct observation of participation in unprofessional behaviors, it is not possible to validate the responses given. Future work could use 360‐degree evaluations or other methods to validate self‐reported responses. It is also important to consider assessing whether these behaviors are associated with actual patient outcomes, such as length of stay or readmission. Some items may not always be unprofessional. For example, texting during an educational conference might be done to advance patient care, which would not necessarily be unprofessional. The order in which the questions were asked could have led to bias. We asked about participation before perception to try to limit biased reporting of participation; changing the order of these questions could have resulted in under‐reporting of participation in behaviors that respondents perceived to be unprofessional. This study was conducted at 3 institutions located in Chicago, limiting generalizability to institutions outside this area. Only internal medicine hospitalists were surveyed, which also limits generalizability to other disciplines and specialties within internal medicine. Lastly, it is important to highlight that hospitalists are not the sole teachers on inpatient services, since residents encounter a variety of faculty who serve as teaching attendings.
Future work should expand to other centers and other specialties.

In conclusion, in this multi‐institutional study of hospitalists, participation in egregious behaviors was low. Four factors or patterns underlie hospitalists' reports of participation in unprofessional behavior: making fun of others, learning environment, workload management, and time pressure. Job characteristics (clinical time, administrative time, night work), age, and site were all associated with different patterns of unprofessional behavior. Specifically, hospitalists with less clinical time were more likely to make fun of others. Hospitalists who were younger, as well as those who had any administrative work, were more likely to participate in behaviors related to workload management. Hospitalists who worked nights were more likely to report behaviors related to time pressure. Interventions to promote professionalism should take institutional culture into account and should focus on behaviors with the highest participation rates. Efforts should also be made to address the underlying reasons for participation in these behaviors.

Acknowledgements

The authors thank Meryl Prochaska for her research assistance and manuscript preparation.

Disclosures: The authors acknowledge funding from the ABIM Foundation and the Pritzker Summer Research Program. The funders had no role in the design of the study; the collection, analysis, and interpretation of the data; or the decision to approve publication of the finished manuscript. Prior presentations of the data include the 2010 University of Chicago Pritzker School of Medicine Summer Research Forum, the 2010 University of Chicago Pritzker School of Medicine Medical Education Day, the 2010 Midwest Society of Hospital Medicine Meeting in Chicago, IL, and the 2011 Society of Hospital Medicine National Meeting in Dallas, TX. All authors disclose no relevant or financial conflicts of interest.

The discrepancy between what is taught about professionalism in formal medical education and what is witnessed in the hospital has received increasing attention.1–7 This latter aspect of medical education contributes to the hidden curriculum and impacts medical trainees' views on professionalism.8 The hidden curriculum refers to the lessons trainees learn through informal interactions within the multilayered educational learning environment.9 A growing body of work examines how the hidden curriculum and disruptive physicians impact the learning environment.9, 10 In response, regulatory agencies, such as the Liaison Committee on Medical Education (LCME) and Accreditation Council for Graduate Medical Education (ACGME), require training programs and medical schools to maintain standards of professionalism and to regularly evaluate the learning environment and its impact on professionalism.11, 12 In 2011, the ACGME expanded its standards regarding professionalism by requiring that the program director and institution ensure a culture of professionalism that supports patient safety and personal responsibility.11 Given this increasing focus on professionalism in medical school and residency training programs, it is critical to examine faculty perceptions and actions that may perpetuate the discrepancy between the formal and hidden curriculum.13 This early exposure is especially significant because unprofessional behavior in medical school is strongly associated with later disciplinary action by a medical board.14, 15 Certain unprofessional behaviors can also compromise patient care and safety, and can detract from the hospital working environment.16–18

In our previous work, we demonstrated that internal medicine interns reported increased participation in unprofessional behaviors regarding on‐call etiquette during internship.19, 20 Examples of these behaviors include refusing an admission (ie, blocking) and misrepresenting a test as urgent. Interestingly, students and residents have highlighted the powerful role of supervising faculty physicians in condoning or inhibiting such behavior. Given the increasing role of hospitalists as resident supervisors, it is important to consider the perceptions and actions of hospitalists with respect to perpetuating or hindering such unprofessional behaviors. Although hospital medicine is a relatively new specialty, many hospitalists are in frequent contact with medical trainees, perhaps because many residency programs and medical schools have a strong inpatient focus.21–23 It is thus possible that hospitalists have a major influence on residents' behaviors and views of professionalism. In fact, the Society of Hospital Medicine's Core Competencies for Hospital Medicine explicitly state that hospitalists are expected to serve as a role model for professional and ethical conduct to house staff, medical students, and other members of the interdisciplinary team.24

Therefore, the current study had 2 aims: first, to measure internal medicine hospitalists' perceptions of, and participation in, unprofessional behaviors using a previously validated survey; and second, to examine associations between job characteristics and participation in unprofessional behaviors.

METHODS

Study Design

This was a multi‐institutional, observational study that took place at the University of Chicago Pritzker School of Medicine, Northwestern University Feinberg School of Medicine, and NorthShore University HealthSystem. Hospitalist physicians employed at these hospitals were recruited for this study between June 2010 and July 2010. The Institutional Review Boards of the University of Chicago, Northwestern University, and NorthShore University HealthSystem approved this study. All subjects provided informed consent before participating.

Survey Development and Administration

Based on a prior survey of interns and third‐year medical students, a 35‐item survey was used to measure perceptions of, and participation in, unprofessional behaviors.8, 19, 20 The original survey was developed in 2005 by medical students who observed behaviors by trainees and faculty that they considered to be unprofessional. The survey was subsequently modified by interns to ascertain unprofessional behavior among interns. For this iteration, hospitalists and study authors at each site reviewed the survey items and adapted each item to ensure relevance to hospitalist work and generalizability across sites. New items were also created to refer specifically to work routinely performed by hospitalist attendings (eg, attesting to resident notes, transferring patients to other services to reduce workload). Because of this, certain items used the jargon hospitalists themselves use for unprofessional behaviors (ie, blocking admissions and turfing), terms that resonate with the literature describing these phenomena.25 Items were also worded to make the unprofessional nature explicit (ie, blocking an admission that could be appropriate for your service).

The final survey (see Supporting Information, Appendix, in the online version of this article) included domains such as interactions with others, interactions with trainees, and patient‐care scenarios. Demographic information and job characteristics were collected including year of residency completion, total amount of clinical work, amount of night work, and amount of administrative work. Hospitalists were not asked whether they completed residency at the institution where they currently work in order to maintain anonymity in the context of a small sample. Instead, they were asked to rate their familiarity with residents at their institution on a Likert‐type scale ranging from very unfamiliar (1) to familiar (3) to very familiar (5). To help standardize levels of familiarity across hospitalists, we developed anchors that corresponded to how well a hospitalist would know resident names with familiar defined as knowing over half of resident names.

Participants reported whether they participated in, or observed, a particular behavior and rated their perception of each behavior from 1 (unprofessional) to 5 (professional), with ratings of unprofessional and somewhat unprofessional both categorized as unprofessional. A site champion administered paper surveys during a routine faculty meeting at each site. An electronic version was administered using SurveyMonkey (SurveyMonkey, Palo Alto, CA) to hospitalists not present at the faculty meeting. Participants chose a unique, nonidentifiable code to facilitate truthful reporting while allowing data tracking in follow‐up studies.

Data Analysis

Clinical time was dichotomized at 50% full‐time equivalent (FTE) to define those with less clinical time. Because teaching time was relatively low (median 10% FTE), greater teaching was defined as more than 10% FTE. Because many hospitalists engaged in no night work, night work was dichotomized into those who engaged in any night work and those who did not. Similarly, because many hospitalists had no administrative time, administrative time was split into those with any administrative work and those without. Lastly, those born after 1970 were classified as younger hospitalists.
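The coding rules above can be sketched as follows. This is an illustrative reconstruction, not the study's actual analysis code; the field names (`clinical_fte`, `birth_year`, etc.) are hypothetical, while the thresholds follow the text.

```python
# Illustrative sketch of the covariate coding rules described above.
# Field names are hypothetical; thresholds follow the text:
# <50% FTE clinical, >10% FTE teaching, any nights, any admin, born after 1970.
def code_covariates(hospitalist):
    return {
        "less_clinical": hospitalist["clinical_fte"] < 50,
        "greater_teaching": hospitalist["teaching_fte"] > 10,
        "any_nights": hospitalist["night_fte"] > 0,
        "any_admin": hospitalist["admin_fte"] > 0,
        "younger": hospitalist["birth_year"] > 1970,
    }

# A made-up respondent: 40% clinical, 10% teaching, no nights, some admin.
example = {"clinical_fte": 40, "teaching_fte": 10, "night_fte": 0,
           "admin_fte": 5, "birth_year": 1975}
coded = code_covariates(example)
```

Note that the teaching cutoff is strictly greater than the median, so a respondent at exactly 10% FTE is not coded as greater teaching.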

Chi‐square tests were used to compare site response rates, and descriptive statistics were used to examine demographic characteristics of hospitalist respondents, in addition to perception of, and participation in, unprofessional behaviors. Because items on the survey were highly correlated, we used factor analysis to identify the underlying constructs that related to unprofessional behavior.26 Factor analysis is a statistical procedure that is most often used to explore which variables in a data set are most related or correlated to each other. By examining the patterns of similar responses, the underlying factors can be identified and extracted. These factors, by definition, are not correlated with each other. To select the number of factors to retain, the most common convention is to use Kaiser criterion, or retain all factors with eigenvalues greater than, or equal to, one.27 An eigenvalue measures the amount of variation in all of the items on the survey which is accounted for by that factor. If a factor has a low eigenvalue (less than 1 is the convention), then it is contributing little and is ignored, as it is likely redundant with the higher value factors.

Because use of the Kaiser criterion often overestimates the number of factors to retain, another method is to use a scree plot, which tends to underestimate the number of factors. Both were used in this study to ensure a stable solution. To name the factors, we examined which items or groups of items loaded, or were most highly related to, each factor. To ensure an optimal factor solution, items with minimal participation (less than 3%) were excluded from factor analysis.
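The retention logic described above (Kaiser criterion plus a scree check) can be sketched on a toy correlation matrix. This is an illustrative example with made-up correlations among 4 hypothetical survey items, not the study's data.

```python
import numpy as np

# Toy correlation matrix for 4 survey items: items 1-2 correlate with each
# other, items 3-4 correlate with each other, so two factors are expected.
R = np.array([
    [1.0, 0.6, 0.0, 0.0],
    [0.6, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.5],
    [0.0, 0.0, 0.5, 1.0],
])

# Eigenvalues in descending order; each measures the share of total item
# variance accounted for by one factor.
eigenvalues = np.linalg.eigvalsh(R)[::-1]

# Kaiser criterion: retain factors with eigenvalues >= 1.
retained = int(np.sum(eigenvalues >= 1.0))

# Scree check: successive drops; the "elbow" is where drops flatten out.
drops = -np.diff(eigenvalues)
```

Here the eigenvalues are 1.6, 1.5, 0.5, and 0.4, so the Kaiser criterion retains 2 factors, matching the two correlated blocks built into the matrix.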

Then, site‐adjusted multivariate regression analysis was used to examine associations between job and demographic characteristics, and the factors of unprofessional behavior identified. Models controlled for gender and familiarity with residents. Because sample medians were used to define greater teaching (>10% FTE), we also performed a sensitivity analysis using different cutoffs for teaching time (>20% FTE and teaching tertiles). Likewise, we also used varying definitions of less clinical time to ensure that any statistically significant associations were robust across varying definitions. All data were analyzed using STATA 11.0 (Stata Corp, College Station, TX) and statistical significance was defined as P < 0.05.
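A site-adjusted linear model of this kind can be sketched with ordinary least squares on synthetic data. The coefficients and variable names below are invented for illustration and do not come from the study.

```python
import numpy as np

# Synthetic, noiseless data: an outcome depends on a binary job
# characteristic (less clinical time) plus site effects, mimicking a
# site-adjusted model across 3 sites.
less_clinical = np.array([1, 0, 1, 0, 1, 0, 1, 0], dtype=float)
site2 = np.array([0, 0, 1, 1, 0, 0, 0, 1], dtype=float)  # dummy for site 2
site3 = np.array([0, 0, 0, 0, 1, 1, 0, 0], dtype=float)  # dummy for site 3

# True model (invented): y = 0.5 + 0.9*less_clinical + 0.3*site2 - 0.2*site3
y = 0.5 + 0.9 * less_clinical + 0.3 * site2 - 0.2 * site3

# Design matrix with intercept; site 1 is the reference category.
X = np.column_stack([np.ones_like(y), less_clinical, site2, site3])

# OLS fit; beta[1] is the site-adjusted association for less clinical time.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Because the site dummies enter the model alongside the predictor of interest, the fitted coefficient for `less_clinical` is adjusted for site, which is the structure the study's models used (with additional covariates such as gender and familiarity with residents).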

RESULTS

Seventy‐seven of the 101 hospitalists (76.2%) at the 3 sites completed the survey. While response rates varied by site (site 1, 67%; site 2, 74%; site 3, 86%), the differences were not statistically significant (χ2 = 2.9, P = 0.24). Most hospitalists (79.2%) completed residency after 2000. Over half (57.1%) of participants were male, and 61% reported having worked with their current hospitalist group for 1 to 4 years. Almost 60% (59.7%) reported being unfamiliar with residents in the program. Over 40% of hospitalists did not do any night work. Hospitalists were largely clinical: only one‐quarter reported 50% FTE or less of clinical time, and the median was 80% FTE. While 78% of hospitalists reported some teaching time, median time on teaching service was low at 10% FTE (Table 1).

Demographics of Responders* (n = 77)
 Total n (%)
  • Abbreviations: IQR, interquartile range.

  • Site differences were observed for clinical practice characteristics, such as number of weeks of teaching service, weeks working nights, clinical time, research time, completed fellowship, and won teaching awards. Due to item nonresponse, number of respondents reporting is listed for each item.

  • Familiarity with residents asked in lieu of whether hospitalist trained at the institution. Familiarity defined as a rating of 4 or 5 on Likert scale ranging from Very Unfamiliar (1) to Very Familiar (5), with Familiar (4) defined further as knowing >50% of residents' names.

Male (%) | 44 (57.1)
Completed residency (%)
  Between 1981 and 1990 | 2 (2.6)
  Between 1991 and 2000 | 14 (18.2)
  After 2000 | 61 (79.2)
Medical school matriculation (%) (n = 76)
  US medical school | 59 (77.6)
  International medical school | 17 (22.3)
Years spent with current hospitalist group (%)
  <1 yr | 14 (18.2)
  1–4 yr | 47 (61.0)
  5–9 yr | 15 (19.5)
  >10 yr | 1 (1.3)
Familiarity with residents (%)
  Familiar | 31 (40.2)
  Unfamiliar | 46 (59.7)
No. of weeks per year spent on (median [IQR])
  Hospitalist practice (n = 72) | 26.0 [16.0–26.0]
  Teaching services (n = 68) | 4.0 [1.0–8.0]
Weeks working nights* (n = 71)
  >2 wk | 16 (22.5)
  1–2 wk | 24 (33.8)
  0 wk | 31 (43.7)
% Clinical time (median [IQR])* (n = 73) | 80 [50–99]
% Teaching time (median [IQR])* (n = 74) | 10 [1–20]
Any research time (%)* (n = 71) | 22 (31.0)
Any administrative time (%) (n = 72) | 29 (40.3)
Completed fellowship (%)* | 12 (15.6)
Won teaching awards (%)* (n = 76) | 21 (27.6)
View a career in hospital medicine as (%)
  Temporary | 11 (14.3)
  Long term | 47 (61.0)
  Unsure | 19 (24.7)

Hospitalists perceived almost all behaviors as unprofessional (unprofessional or somewhat unprofessional on a 5‐point Likert scale). The only behavior rated as professional, with a mean of 4.25 (95% CI 4.01–4.49), was staying past shift limit to complete a patient‐care task that could have been signed out. This behavior also had the highest level of participation by hospitalists (81.7%). Hospitalists were most ambivalent when rating the professionalism of attending an industry‐sponsored dinner or social event (mean 3.20, 95% CI 2.98–3.41) (Table 2).

Perception of, and Observation and Participation in, Unprofessional Behaviors Among Hospitalists (n = 77)
Behavior | Reported Perception (Mean Likert Score)* | Reported Participation (%) | Reported Observation (%)
  • Abbreviations: ER, emergency room.

  • Perception rated on Likert scale from 1 (unprofessional) to 5 (professional).

Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) | 2.55 (2.34–2.76) | 67.1 | 80.3
Ordering a routine test as urgent to get it expedited | 2.82 (2.58–3.06) | 62.3 | 80.5
Making fun of other physicians to colleagues | 1.56 (1.39–1.70) | 40.3 | 67.5
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (eg, after the patient is admitted) | 2.01 (1.84–2.19) | 39.5 | 67.1
Signing out patients over the phone at the end of shift when sign‐out could have been done in person | 2.95 (2.74–3.16) | 40.8 | 65.8
Texting or using smartphone during educational conferences (ie, noon lecture) | 2.16 (1.95–2.36) | 39.0 | 72.7
Discussing patient information in public spaces | 1.49 (1.34–1.63) | 37.7 | 66.2
Making fun of other attendings to colleagues | 1.62 (1.46–1.78) | 35.1 | 61.0
Deferring family members' concerns about a change in the patient's clinical course to the primary team in order to avoid engaging in such a discussion | 2.16 (1.91–2.40) | 30.3 | 55.3
Making disparaging comments about a patient on rounds | 1.42 (1.27–1.56) | 29.8 | 67.5
Attending an industry (eg, pharmaceutical or equipment/device manufacturer)‐sponsored dinner or social event | 3.20 (2.98–3.41) | 28.6 | 60.5
Ignoring family member's nonurgent questions about a cross‐cover patient when you had time to answer | 2.05 (1.85–2.25) | 26.3 | 48.7
Attesting to a resident's note when not fully confident of the content of their documentation | 1.65 (1.45–1.85) | 23.4 | 32.5
Making fun of support staff to colleagues | 1.45 (1.31–1.59) | 22.1 | 57.9
Not correcting someone who mistakes a student for a physician | 2.19 (2.01–2.38) | 20.8 | 35.1
Celebrating a blocked‐admission | 1.80 (1.61–2.00) | 21.1 | 60.5
Making fun of residents to colleagues | 1.53 (1.37–1.70) | 18.2 | 44.2
Coming to work when you have a significant illness (eg, influenza) | 1.99 (1.79–2.19) | 14.3 | 35.1
Celebrating a successful turf | 1.71 (1.51–1.92) | 11.7 | 39.0
Failing to notify the patient that a member of the team made, or is concerned that they made, an error | 1.53 (1.34–1.71) | 10.4 | 20.8
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing) | 1.72 (1.52–1.91) | 9.3 | 58.7
Refusing an admission which could be considered appropriate for your service (eg, blocking) | 1.63 (1.44–1.82) | 7.9 | 68.4
Falsifying patient records (ie, back‐dating a note, copying forward unverified information, or documenting physical findings not personally obtained) | 1.22 (1.10–1.34) | 6.5 | 27.3
Making fun of students to colleagues | 1.35 (1.19–1.51) | 6.5 | 24.7
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error | 1.64 (1.46–1.82) | 5.2 | 13.2
Introducing a student as a doctor to patients | 1.96 (1.76–2.16) | 3.9 | 20.8
Signing‐out a procedure or task, that could have been completed during a required shift or by the primary team, in order to go home as early in the day as possible | 1.48 (1.32–1.64) | 3.9 | 48.1
Performing medical or surgical procedures on a patient beyond self‐perceived level of skill | 1.27 (1.14–1.41) | 2.6 | 7.8
Asking a student to obtain written consent from a patient or their proxy without supervision (eg, for blood transfusion or minor procedures) | 1.60 (1.42–1.78) | 2.6 | 36.5
Encouraging a student to state that they are a doctor in order to expedite patient care | 1.31 (1.15–1.47) | 2.6 | 6.5
Discharging a patient before they are ready to go home in order to reduce one's census | 1.18 (1.07–1.29) | 2.6 | 19.5
Reporting patient information (eg, labs, test results, exam results) as normal when uncertain of the true results | 1.29 (1.16–1.41) | 2.6 | 15.6
Asking a student to perform medical or surgical procedures which are perceived to be beyond their level of skill | 1.26 (1.12–1.40) | 1.3 | 3.9
Asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge | 1.41 (1.26–1.56) | 0.0 | 15.8

Participation in egregious behaviors, such as falsifying patient records (6.49%) and performing medical or surgical procedures on a patient beyond self‐perceived level of skill (2.60%), was very low. The most common behaviors rated as unprofessional that hospitalists reported participating in were having nonmedical/personal conversations in patient corridors (67.1%), ordering a routine test as urgent to expedite care (62.3%), and making fun of other physicians to colleagues (40.3%). Approximately 40% of participants reported disparaging the emergency room (ER) team or primary care physician for findings later discovered, signing out over the phone when it could have been done in person, and texting or using smartphones during educational conferences. In particular, participation in unprofessional behaviors related to trainees was close to zero (eg, asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge). The least common behaviors that hospitalists reported participating in were discharging a patient before they are ready to go home in order to reduce one's census (2.56%) and reporting patient information as normal when uncertain of the true results (2.60%). As in previous studies of unprofessional behaviors, those who reported participation were less likely to rate the behavior as unprofessional.8, 19

Observation of behaviors ranged from 4% to 80%. In all cases, observation of a behavior was reported at a higher level than participation. Correlation between observation and participation was also high, with the exception of a few behaviors that had zero or near‐zero participation rates (ie, reporting patient information as normal when unsure of true results).

After performing factor analysis, 4 factors had eigenvalues greater than 1 and were therefore retained and extracted for further analysis. These 4 factors accounted for 76% of the variance in responses reported on the survey. By examining which items or groups of items most strongly loaded on each factor, the factors were named accordingly: factor 1 comprised behaviors related to making fun of others, factor 2 behaviors related to workload management, factor 3 behaviors related to the learning environment, and factor 4 behaviors related to time pressure (Table 3).

Results of Factor Analysis Displaying Items by Primary Loading
  • NOTE: Items were categorized using factor analysis to the factor that they loaded most highly on. All items shown loaded at 0.4 or above onto each factor. Four items were omitted due to loadings less than 0.4. One item cross‐loaded on multiple factors (deferring family questions). Abbreviations: ER, emergency room.

Factor 1: Making fun of others
Making fun of other physicians (0.78)
Making fun of attendings (0.77)
Making fun of residents (0.70)
Making disparaging comments about a patient on rounds (0.51)
Factor 2: Workload management
Celebrating a successful turf (0.81)
Celebrating a blocked‐admission (0.65)
Coming to work sick (0.56)
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing.) (0.51)
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (0.48)
Discharging a patient before they are ready to go home in order to reduce one's census (0.43)
Factor 3: Learning environment
Not correcting someone who mistakes a student for a physician (0.72)
Texting or using smartphone during educational conferences (ie, noon lecture) (0.51)
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error (0.45)
Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) (0.43)
Factor 4: Time pressure
Ignoring family member's nonurgent questions about a cross‐cover patient when you had the time to answer (0.50)
Signing out patients over the phone at the end of shift when sign‐out could have been done in person (0.46)
Attesting to a resident's note when not fully confident of the content of their documentation (0.44)

Using site‐adjusted multivariate regression, certain hospitalist job characteristics were associated with certain patterns of participating in unprofessional behavior (Table 4). Those with less clinical time (<50% FTE) were more likely to participate in unprofessional behaviors related to making fun of others (factor 1, β = 0.94, 95% CI 0.32 to 1.56, P < 0.05). Hospitalists who had any administrative time (β = 0.61, 95% CI 0.11 to 1.10, P < 0.05) were more likely to report participation in behaviors related to workload management. Hospitalists engaged in any night work were more likely to report participation in unprofessional behaviors related to time pressure (β = 0.67, 95% CI 0.17 to 1.17, P < 0.05). Time devoted to teaching or research was not associated with greater participation in any of the domains of unprofessional behavior surveyed.

Association Between Hospitalist Job and Demographic Characteristics and Factors of Unprofessional Behavior
Model | Making Fun of Others | Learning Environment | Workload Management | Time Pressure
Predictor | β [95% CI] | β [95% CI] | β [95% CI] | β [95% CI]
  • NOTE: Table shows the results of 4 different multivariable linear regression models, which examine the association between various covariates (job characteristics, demographic characteristics, and site) and the factors of participation in unprofessional behaviors (making fun of others, learning environment, workload management, and time pressure). Due to item nonresponse, n = 63 for all regression models. Abbreviations: CI, confidence interval.

  • P < 0.05.

  • Less clinical was defined as less than 50% full‐time equivalent (FTE) in a given year spent on clinical work.

  • Teaching was defined as greater than the median (10% FTE) spent on teaching. Results did not change when using tertiles of teaching effort, or a cutoff at teaching greater than 20% FTE.

  • Administrative time, research time, and nights were defined as reporting any administrative time, research time, or night work, respectively (greater than 0% per year).

  • Younger was defined as those born after 1970.

Job characteristics
Less clinical | 0.94 [0.32, 1.56]* | -0.01 [-0.66, 0.64] | -0.17 [-0.84, 0.49] | 0.39 [-0.24, 1.01]
Administrative | 0.30 [-0.16, 0.76] | 0.06 [-0.43, 0.54] | 0.61 [0.11, 1.10]* | 0.26 [-0.20, 0.72]
Teaching | -0.01 [-0.49, 0.48] | -0.09 [-0.60, 0.42] | -0.12 [-0.64, 0.40] | 0.16 [-0.33, 0.65]
Research | -0.30 [-0.87, 0.27] | -0.38 [-0.98, 0.22] | -0.37 [-0.98, 0.24] | 0.13 [-0.45, 0.71]
Any nights | -0.08 [-0.58, 0.42] | 0.24 [-0.28, 0.77] | 0.24 [-0.29, 0.76] | 0.67 [0.17, 1.17]*
Demographic characteristics
Male | 0.06 [-0.42, 0.53] | 0.03 [-0.47, 0.53] | -0.05 [-0.56, 0.47] | -0.40 [-0.89, 0.08]
Younger | -0.05 [-0.79, 0.69] | -0.64 [-1.42, 0.14] | 0.87 [0.07, 1.67]* | 0.62 [-0.13, 1.37]
Unfamiliar with residents | -0.32 [-0.85, 0.22] | -0.32 [-0.89, 0.24] | 0.13 [-0.45, 0.70] | 0.47 [-0.08, 1.01]
Institution
Site 1 | 0.58 [-0.22, 1.38] | -0.05 [-0.89, 0.79] | 1.01 [0.15, 1.86]* | -0.77 [-1.57, 0.04]
Site 3 | -0.11 [-0.68, 0.47] | -0.70 [-1.31, -0.09]* | 0.43 [-0.20, 1.05] | 0.45 [-0.13, 1.04]
Constant | 0.03 [-0.99, 1.06] | 0.94 [-0.14, 2.02] | -1.23 [-2.34, -0.13]* | -1.34 [-2.39, -0.31]*

The only demographic characteristic that was significantly associated with unprofessional behavior was age. Specifically, those who were born after 1970 were more likely to participate in unprofessional behaviors related to workload management (β = 0.87, 95% CI 0.07 to 1.67, P < 0.05). Site differences were also present. Specifically, one site was more likely to report participation in unprofessional behaviors related to workload management (site 1: β = 1.01, 95% CI 0.15 to 1.86, P < 0.05), while another site was less likely to report participation in behaviors related to the learning environment (site 3: β = -0.70, 95% CI -1.31 to -0.09, P < 0.05). Gender and familiarity with residents were not significant predictors of participation in unprofessional behaviors. Results remained robust in sensitivity analyses using different cutoffs of clinical time and teaching time.

DISCUSSION

This multisite study adds to what is known about the perceptions of, and participation in, unprofessional behaviors among internal medicine hospitalists. Hospitalists perceived almost all surveyed behaviors as unprofessional. Participation in egregious and trainee‐related unprofessional behaviors was very low. Four categories appeared to explain the variability in how hospitalists reported participation in unprofessional behaviors: making fun of others, workload management, learning environment, and time pressure. Participation in behaviors within these factors was associated with certain job characteristics, such as clinical time, administrative time, and night work, as well as age and site.

It is reassuring that participation in egregious and trainee‐related unprofessional behaviors is very low, and it is noteworthy that attending an industry‐sponsored dinner is not considered unprofessional. This was surprising given increasing external pressure to report and ban such interactions.28 The perception that attending such dinners is acceptable may reflect a lag between current practice and national recommendations.

It is important to explore why certain job characteristics are associated with participation in unprofessional behaviors. For example, those with less clinical time were more likely to participate in making fun of others. It may be the case that hospitalists with more clinical time may make a larger effort to develop and maintain positive relationships. Another possible explanation is that hospitalists with less clinical time are more easily influenced by those in the learning environment who make fun of others, such as residents who they are supervising for only a brief period.

For unprofessional behaviors related to workload management, those who were younger and those with any administrative time were more likely to participate in behaviors such as celebrating a blocked admission. Our prior work shows that behaviors related to workload management are more widespread in residency; younger hospitalists, who are often recent residency graduates, may therefore be more prone to participating in these behaviors. While unproven, it is possible that those with more administrative time have competing priorities from their administrative roles that motivate them to manage their workload more actively, leading them to participate in workload management behaviors.

Hospitalists who did any night work were more likely to participate in unprofessional behaviors related to time pressure. This could reflect the high workloads that night hospitalists may face and the pressure they feel to wrap up work, resulting in a hasty handoff (ie, over the phone) or to defer work (ie, family questions). Site differences were also observed for participation in behaviors related to the learning environment, speaking to the importance of institutional culture.

It is worth mentioning that hospitalists who were teachers were not any less likely to report participating in certain behaviors. While 78% of hospitalists reported some level of teaching, the median reported percentage of teaching was 10% FTE. This level of teaching likely reflects the diverse nature of work in which hospitalists engage. While hospitalists spend some time working with trainees, services that are not staffed with residents (eg, uncovered services) are becoming increasingly common due to stricter resident duty hour restrictions. This may explain why 60% of hospitalists reported being unfamiliar with residents. We also set a high bar for familiarity, defined as knowing half of the residents by name, which served as a proxy for those who may have trained at the institution where they currently work. Although hospitalists reported that a low fraction of their total clinical time was devoted to resident services, a significant fraction of resident services were staffed by hospitalists at all sites, making them a natural target for interventions.

These results have implications for future work to assess and improve professionalism in the hospital learning environment. First, interventions to address unprofessional behaviors should focus on behaviors with the highest participation rates. As in our earlier studies of residents, participation was high in certain behaviors, such as misrepresenting a test as urgent or disparaging the ER or primary care physician (PCP) for a missed finding.19, 20 While blocking an admission was common in our studies of residents, reported participation among hospitalists was low. Similar to a prior study of clinical‐year medical students at one of our sites, 1 in 5 hospitalists reported not correcting someone who mistakes a student for a physician, highlighting the role that hospitalists may have in perpetuating this behavior.8 Additionally, addressing the behaviors identified in this study through novel curricular tools may help to teach residents many of the interpersonal and communication skills called for in the 2011 ACGME Common Program Requirements.11 The ACGME requirements also include the expectation that faculty model how to manage their time before, during, and after clinical assignments, and recognize that transferring a patient to a rested provider is best. Given that most hospitalists believe staying past their shift limit is professional, these requirements will be difficult to adopt without widespread culture change.

Moreover, interventions could be tailored to hospitalists with certain job characteristics. Interventions may be educational or systems based. An example of the former is stressing the impact of the learning and working environment on trainees, and an example of the latter is streamlining the process in which ordered tests are executed to result in a more timely completion of tests. This may result in fewer physicians misrepresenting a test as urgent in order to have the test done in a timely manner. Additionally, hospitalists with less clinical time could receive education on their impact as a role model for trainees. Hospitalists who are younger or with administrative commitments could be trained on the importance of avoiding behaviors related to workload management, such as blocking or turfing patients. Lastly, given the site differences, critical examination of institutional culture and policies is also important. With funding from the American Board of Internal Medicine (ABIM) Foundation, we are currently creating an educational intervention, targeting those behaviors that were most frequent among hospitalists and residents at our institutions to promote dialogue and critical reflection, with the hope of reducing the most prevalent behaviors encountered.

There are several limitations to this study. Despite the anonymity of the survey, participants may have inaccurately reported their participation in unprofessional behaviors due to socially desirable responding. In addition, because we used factor analysis and multivariate regression models with a small sample size, item nonresponse limited the sample for regression analyses and raises the concern for response bias. However, all significant associations remained so after backwards stepwise elimination of covariates with P > 0.10 in larger models (n ranging from 65 to 69). Because we relied on self‐report rather than direct observation of participation in unprofessional behaviors, it is not possible to validate the responses given. Future work could use 360‐degree evaluations or other methods to validate self‐reported responses. It is also important to consider assessing whether these behaviors are associated with actual patient outcomes, such as length of stay or readmission. Some items may not always be unprofessional; for example, texting during an educational conference might be done to advance patient care, which would not necessarily be unprofessional. The order in which the questions were asked could have introduced bias. We asked about participation before perception to try to limit biased reporting of participation; reversing the order could have resulted in under‐reporting of participation in behaviors that respondents perceived to be unprofessional. This study was conducted at 3 institutions located in Chicago, limiting generalizability to institutions outside of this area. Only internal medicine hospitalists were surveyed, which also limits generalizability to other disciplines and specialties within internal medicine. Lastly, it is important to highlight that hospitalists are not the sole teachers on inpatient services, since residents encounter a variety of faculty who serve as teaching attendings.
Future work should expand to other centers and other specialties.

In conclusion, in this multi‐institutional study of hospitalists, participation in egregious behaviors was low. Four factors or patterns underlie hospitalists' reports of participation in unprofessional behavior: making fun of others, learning environment, workload management, and time pressure. Job characteristics (clinical time, administrative time, night work), age, and site were all associated with different patterns of unprofessional behavior. Specifically, hospitalists with less clinical time were more likely to make fun of others. Hospitalists who were younger in age, as well as those who had any administrative work, were more likely to participate in behaviors related to workload management. Hospitalists who work nights were more likely to report behaviors related to time pressure. Interventions to promote professionalism should take institutional culture into account and should focus on behaviors with the highest participation rates. Efforts should also be made to address underlying reasons for participation in these behaviors.

Acknowledgements

The authors thank Meryl Prochaska for her research assistance and manuscript preparation.

Disclosures: The authors acknowledge funding from the ABIM Foundation and the Pritzker Summer Research Program. The funders had no role in the design of the study; the collection, analysis, and interpretation of the data; or the decision to approve publication of the finished manuscript. Prior presentations of the data include the 2010 University of Chicago Pritzker School of Medicine Summer Research Forum, the 2010 University of Chicago Pritzker School of Medicine Medical Education Day, the 2010 Midwest Society of Hospital Medicine Meeting in Chicago, IL, and the 2011 Society of Hospital Medicine National Meeting in Dallas, TX. All authors disclose no relevant or financial conflicts of interest.

References
  1. Stern DT. Practicing what we preach? An analysis of the curriculum of values in medical education. Am J Med. 1998;104:569–575.
  2. Borgstrom E, Cohn S, Barclay S. Medical professionalism: conflicting values for tomorrow's doctors. J Gen Intern Med. 2010;25(12):1330–1336.
  3. Karnieli‐Miller O, Vu TR, Holtman MC, Clyman SG, Inui TS. Medical students' professionalism narratives: a window on the informal and hidden curriculum. Acad Med. 2010;85(1):124–133.
  4. Cohn FG, Shapiro J, Lie DA, Boker J, Stephens F, Leung LA. Interpreting values conflicts experienced by obstetrics‐gynecology clerkship students using reflective writing. Acad Med. 2009;84(5):587–596.
  5. Gaiser RR. The teaching of professionalism during residency: why it is failing and a suggestion to improve its success. Anesth Analg. 2009;108(3):948–954.
  6. Gofton W, Regehr G. What we don't know we are teaching: unveiling the hidden curriculum. Clin Orthop Relat Res. 2006;449:20–27.
  7. Hafferty FW. Definitions of professionalism: a search for meaning and identity. Clin Orthop Relat Res. 2006;449:193–204.
  8. Reddy ST, Farnan JM, Yoon JD, et al. Third‐year medical students' participation in and perceptions of unprofessional behaviors. Acad Med. 2007;82:S35–S39.
  9. Hafferty FW. Beyond curriculum reform: confronting medicine's hidden curriculum. Acad Med. 1998;73:403–407.
  10. Pfifferling JH. Physicians' "disruptive" behavior: consequences for medical quality and safety. Am J Med Qual. 2008;23:165–167.
  11. Accreditation Council for Graduate Medical Education. Common Program Requirements: General Competencies. Available at: http://www.acgme.org/acwebsite/home/common_program_requirements_07012011.pdf. Accessed December 19, 2011.
  12. Liaison Committee on Medical Education. Functions and Structure of a Medical School. Available at: http://www.lcme.org/functions2010jun.pdf. Accessed June 30, 2010.
  13. Gillespie C, Paik S, Ark T, Zabar S, Kalet A. Residents' perceptions of their own professionalism and the professionalism of their learning environment. J Grad Med Educ. 2009;1:208–215.
  14. Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med. 2004;79:244–249.
  15. Papadakis MA, Teherani A, Banach MA, et al. Disciplinary action by medical boards and prior behavior in medical school. N Engl J Med. 2005;353:2673–2682.
  16. Rosenstein AH, O'Daniel M. A survey of the impact of disruptive behaviors and communication defects on patient safety. Jt Comm J Qual Patient Saf. 2008;34:464–471.
  17. Rosenstein AH, O'Daniel M. Managing disruptive physician behavior—impact on staff relationships and patient care. Neurology. 2008;70:1564–1570.
  18. The Joint Commission. Behaviors that undermine a culture of safety. Sentinel Event Alert. 2008. Available at: http://www.jointcommission.org/assets/1/18/SEA_40.PDF. Accessed April 28, 2012.
  19. Arora VM, Wayne DB, Anderson RA, Didwania A, Humphrey HJ. Participation in and perceptions of unprofessional behaviors among incoming internal medicine interns. JAMA. 2008;300:1132–1134.
  20. Arora VM, Wayne DB, Anderson RA, et al. Changes in perception of and participation in unprofessional behaviors during internship. Acad Med. 2010;85:S76–S80.
  21. Wachter RM. Reflections: the hospitalist movement a decade later. J Hosp Med. 2006;1:248–252.
  22. Society of Hospital Medicine. 2007–2008 Bi‐Annual Survey. 2008. Available at: http://www.medscape.org/viewarticle/578134. Accessed April 28, 2012.
  23. Holmboe ES, Bowen JL, Green M, et al. Reforming internal medicine residency training. A report from the Society of General Internal Medicine's Task Force for Residency Reform. J Gen Intern Med. 2005;20:1165–1172.
  24. Society of Hospital Medicine. The Core Competencies in Hospital Medicine: a framework for curriculum development by the Society of Hospital Medicine. J Hosp Med. 2006;1(suppl 1):2–5.
  25. Caldicott CV, Dunn KA, Frankel RM. Can patients tell when they are unwanted? "Turfing" in residency training. Patient Educ Couns. 2005;56:104–111.
  26. Costello AB, Osborne JW. Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Pract Assess Res Eval. 2005;10:1–9.
  27. Principal Components and Factor Analysis. StatSoft Electronic Statistics Textbook. Available at: http://www.statsoft.com/textbook/principal‐components‐factor‐analysis/. Accessed December 30, 2011.
  28. Brennan TA, Rothman DJ, Blank L, et al. Health industry practices that create conflicts of interest: a policy proposal for academic medical centers. JAMA. 2006;295(4):429–433.
Issue
Journal of Hospital Medicine - 7(7)
Page Number
543-550
Display Headline
Participation in unprofessional behaviors among hospitalists: A multicenter study
Article Source

Copyright © 2012 Society of Hospital Medicine

Correspondence Location
Department of Medicine, The University of Chicago, 5841 S Maryland Ave, MC 2007, AMB B200, Chicago, IL 60637

Night or Weekend Admission and Outcomes

Article Type
Changed
Thu, 05/25/2017 - 21:52
Display Headline
The association between night or weekend admission and hospitalization‐relevant patient outcomes

The hospitalist movement and increasingly stringent resident work hour restrictions have led to the utilization of shift work in many hospitals.1 Use of nocturnist and night float systems, while often necessary, results in increased patient hand‐offs. Research suggests that hand‐offs in the inpatient setting can adversely affect patient outcomes as lack of continuity may increase the possibility of medical error.2, 3 In 2001, Bell et al.4 found that mortality was higher among patients admitted on weekends as compared to weekdays. Uneven staffing, lack of supervision, and fragmented care were cited as potential contributing factors.4 Similarly, Peberdy et al.5 in 2008 revealed that patients were less likely to survive a cardiac arrest if it occurred at night or on weekends, again attributed in part to fragmented patient care and understaffing.

The results of these studies raise concerns as to whether increased reliance on shift work and resulting handoffs compromises patient care.6, 7 The aim of this study was to evaluate the potential association between night admission and hospitalization‐relevant outcomes (length of stay [LOS], hospital charges, intensive care unit [ICU] transfer during hospitalization, repeat emergency department [ED] visit within 30 days of discharge, readmission within 30 days of discharge, and poor outcome [transfer to the ICU, cardiac arrest, or death] within the first 24 hours of admission) at an institution that exclusively uses nocturnists (night‐shift based hospitalists) and a resident night float system for patients admitted at night to the general medicine service. A secondary aim was to determine the potential association between weekend admission and hospitalization‐relevant outcomes.

Methods

Study Sample and Selection

We conducted a retrospective medical record review at a large urban academic hospital. Using an administrative hospital data set, we assembled a list of approximately 9000 admissions to the general medicine service from the ED between January 2008 and October 2008. We sampled consecutive admissions from 3 distinct periods beginning in January, April, and July to capture outcomes at various points in the academic year. We attempted to review approximately 10% of all charts equally distributed among the 3 sampling periods (ie, 900 charts total with one‐third from each period) based on time available to the reviewers. We excluded patients not admitted to the general medicine service and patients without complete demographic or outcome information. We also excluded patients not admitted from the ED given that the vast majority of admissions to our hospital during the night (96%) or weekend (93%) are from the ED. Patients admitted to the general medicine service are cared for either by a hospitalist or by a teaching team comprised of 1 attending (about 40% of whom are hospitalists), 1 resident, 1 to 2 interns, and 1 to 3 medical students. From 7 am to 6:59 pm patients are admitted to the care of 1 of the primary daytime admitting teams. From 7 pm to 6:59 am patients are admitted by nocturnists (hospitalist service) or night float residents (teaching service). These patients are handed off to day teams at 7 am. Hospitalist teams change service on a weekly to biweekly basis and resident teams switch on a monthly basis; there is no difference in physician staffing between the weekend and weekdays. The Northwestern University Institutional Review Board approved this study.

Data Acquisition and Medical Records Reviews

We obtained demographic data including gender, age, race and ethnicity, patient insurance, admission day (weekday vs. weekend), admission time (defined as the time that a patient receives a hospital bed, which at our institution is also the time that admitting teams receive report and assume care for the patient), and the International Classification of Disease codes required to determine the Major Diagnostic Category (MDC) and calculate the Charlson Comorbidity Index8, 9 as part of an administrative data set. We divided the admission time into night admission (defined as 7 pm to 6:59 am) and day admission (defined as 7:00 am to 6:59 pm). We created a chart abstraction tool to allow manual recording of the additional fields of admitting team (hospitalist vs. resident), 30 day repeat ED visit, 30 day readmission, and poor outcomes within the first 24 hours of admission, directly from the electronic record.
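The day/night and weekday/weekend groupings described above are straightforward to derive from the bed-assignment timestamp. A minimal sketch (function names are illustrative, not from the study's actual code):

```python
from datetime import datetime

def admission_period(bed_time: datetime) -> str:
    """Classify an admission as 'night' (7 pm to 6:59 am) or 'day'
    (7 am to 6:59 pm), keyed to the time the patient received a bed."""
    return "night" if bed_time.hour >= 19 or bed_time.hour < 7 else "day"

def is_weekend(bed_time: datetime) -> bool:
    """Saturday or Sunday admission (Monday is weekday() == 0)."""
    return bed_time.weekday() >= 5

print(admission_period(datetime(2008, 1, 14, 23, 30)))  # night
print(is_weekend(datetime(2008, 1, 12, 10, 0)))         # True (a Saturday)
```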

Study Outcomes

We evaluated each admission for the following 6 primary outcomes, which were specified a priori: LOS (defined as discharge date and time minus admission date and time), hospital charges (defined as charges billed as recorded in the administrative data set), ICU transfer during hospitalization (defined as ≥1 ICU day in the administrative data set), 30 day repeat ED visit (defined as a visit to our ED within 30 days of discharge as assessed by chart abstraction), 30 day readmission (defined as any planned or unplanned admission to any inpatient service at our institution within 30 days of discharge as assessed by chart abstraction), and poor outcome within 24 hours of admission (defined as transfer to the ICU, cardiac arrest, or death as assessed by chart abstraction). Each of these outcomes has been used in prior work to assess the quality of inpatient care.10, 11
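The LOS definition above (discharge date and time minus admission date and time) amounts to a timestamp subtraction; a sketch with hypothetical dates:

```python
from datetime import datetime

def length_of_stay_days(admit: datetime, discharge: datetime) -> float:
    """LOS as defined in the text: discharge timestamp minus admission
    timestamp, expressed in (fractional) days."""
    return (discharge - admit).total_seconds() / 86_400  # seconds per day

# Hypothetical stay: admitted the evening of April 2, discharged
# the morning of April 7 -> 4 days and 12 hours.
los = length_of_stay_days(datetime(2008, 4, 2, 21, 15),
                          datetime(2008, 4, 7, 9, 15))
print(round(los, 1))  # 4.5
```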

Statistical Analysis

Interrater reliability between the 3 physician reviewers was assessed for 20 randomly selected admissions across the 4 separate review measures using intraclass correlation coefficients. Comparisons between night and day admissions, and between weekend and weekday admissions, for the continuous primary outcomes (LOS, hospital charges) were assessed using 2‐tailed t‐tests as well as the Wilcoxon rank sum test. In the multivariable modeling, these outcomes were assessed by linear regression controlling for age, gender, race and ethnicity, Medicaid or self‐pay insurance, admission to the hospitalist or teaching service, most common MDC categories, and Charlson Comorbidity Index. Because both outcomes were right‐skewed, we separately assessed each after log‐transformation, controlling for the same variables.
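The log-transformation step can be illustrated with a toy example. The charges below are invented for illustration, not study data: a right-skewed distribution has a mean pulled well above its median by the long right tail, and modeling log charges tempers the influence of that tail.

```python
import math
import statistics

# Invented, right-skewed "hospital charges" for illustration only.
charges = [8_000, 12_000, 15_000, 22_000, 27_000, 31_000, 95_000]
logged = [math.log(c) for c in charges]

# The single large charge drags the raw mean above the median...
print(round(statistics.mean(charges)))    # 30000
print(round(statistics.median(charges)))  # 22000
# ...which is why the analysis was repeated on log-transformed values,
# where the tail observation carries far less leverage.
print(statistics.mean(charges) > statistics.median(charges))  # True
```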

All comparisons of the dichotomous primary outcomes (ICU transfer during hospitalization, 30 day repeat ED visit, 30 day readmission, and poor outcome within the first 24 hours after admission) were assessed at the univariate level by chi‐squared test, and in the multivariable models using logistic regression, controlling for the same variables as the linear models above. All adjustments were specified a priori. All data analyses were conducted using Stata (College Station, TX; Version 11).
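For intuition about the odds ratios these logistic models produce, an unadjusted odds ratio for a 2×2 table can be sketched as follows. The counts are hypothetical, and the paper's reported ORs come from adjusted logistic regression, not this shortcut:

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Unadjusted odds ratio for a 2x2 table
         exposed:   a events, b non-events
         unexposed: c events, d non-events
    with a Wald 95% CI computed on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 10/100 events in the exposed group vs 20/100 unexposed.
or_, lo, hi = odds_ratio_ci(10, 90, 20, 80)
print(round(or_, 2))  # 0.44
```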

Results

We reviewed 857 records. After excluding 33 records lacking administrative data regarding gender, race and ethnicity, and other demographic variables, there were 824 medical records available for analysis. We reviewed a similar number of records from each time period: 274 from January 2008, 265 from April 2008, and 285 from July 2008. A total of 345 (42%) patients were admitted during the day, and 479 (58%) at night; 641 (78%) were admitted on weekdays, and 183 (22%) on weekends. The 33 excluded charts were similar to the included charts for both time of admission and outcomes. Results for parametric and nonparametric testing, as well as for log‐transformed and non‐log‐transformed versions of the continuous outcomes, were similar in both magnitude and statistical significance, so we present the parametric, non‐log‐transformed results below for ease of interpretation.

Interrater reliability among the 3 reviewers was very high. There were no disagreements among the 20 multiple reviews for either poor outcomes within 24 hours of admission or admitting service; the intraclass correlation coefficients for 30 day repeat ED visit and 30 day readmission were 0.97 and 0.87, respectively.
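As a sketch of how such an agreement statistic can be computed, here is a one-way random-effects ICC(1,1); the paper does not state which ICC variant was used, so this particular formula is an assumption:

```python
def icc_oneway(ratings: list[list[float]]) -> float:
    """One-way random-effects ICC(1,1): rows = charts, columns = reviewers.
    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB is the
    between-chart and MSW the within-chart mean square."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    means = [sum(row) / k for row in ratings]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - means[i]) ** 2
              for i, row in enumerate(ratings) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Two reviewers in perfect agreement on four charts -> ICC of 1.0.
print(icc_oneway([[1, 1], [0, 0], [1, 1], [1, 1]]))  # 1.0
```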

Patients admitted at night or on the weekend were similar to patients admitted during the day and week across age, gender, insurance class, MDC, and Charlson Comorbidity Index (Table 1). For unadjusted outcomes, patients admitted at night had similar LOS, hospital charges, 30 day repeat ED visits, 30 day readmissions, and poor outcomes within 24 hours of admission as those patients admitted during the day. They had a potentially lower chance of any ICU transfer during hospitalization, though this did not reach statistical significance at P < 0.05 (day admission 6%, night admission 3%, P = 0.06) (Table 2).
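The unadjusted comparisons of dichotomous outcomes rest on the Pearson chi-squared test; a self-contained sketch follows. The cell counts are purely illustrative, not the study's actual counts:

```python
import math

def chi2_2x2(a: int, b: int, c: int, d: int):
    """Pearson chi-squared statistic (no continuity correction) and its
    df = 1 p-value for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2, col1, col2 = a + b, c + d, a + c, b + d
    chi2 = 0.0
    for obs, r, col in ((a, row1, col1), (b, row1, col2),
                        (c, row2, col1), (d, row2, col2)):
        exp = r * col / n
        chi2 += (obs - exp) ** 2 / exp
    # Survival function of a chi-squared variable with 1 df.
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Illustrative counts: 30/300 events in one group vs 20/400 in the other.
chi2, p = chi2_2x2(30, 270, 20, 380)
print(round(chi2, 2))  # 6.46
```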

Baseline Characteristics of Patients
Characteristic | Day Admission (n = 345) | Night Admission (n = 479) | Weekday Admission (n = 641) | Weekend Admission (n = 183)
Age (years) | 60.8 | 59.7 | 60.6 | 58.7
Gender (% male) | 47 | 43 | 45 | 46
Race/Ethnicity (%)
  White, Asian, other | 61 | 54 | 57 | 55
  Black | 34 | 38 | 37 | 34
  Hispanic | 5 | 8 | 6 | 10
Medicaid or self pay (%) | 9 | 10 | 10 | 11
Major diagnostic category (%)
  Respiratory disease | 14 | 13 | 14 | 13
  Circulatory disease | 28 | 23 | 26 | 24
  Digestive disease | 12 | 12 | 12 | 12
  Other | 45 | 52 | 48 | 51
Charlson Comorbidity Index | 3.71 | 3.60 | 3.66 | 3.60
  • NOTE: All P values > 0.05.
  • Abbreviation: ED, emergency department.
Outcomes, Unadjusted
Outcome | Day Admission (n = 345) | Night Admission (n = 479) | Weekday Admission (n = 641) | Weekend Admission (n = 183)
Length of stay (days) | 4.3 | 4.1 | 4.3 | 3.8
Hospital charges | $27,500 | $25,200 | $27,200* | $22,700*
ICU transfer during hospitalization (%) | 6† | 3† | 5* | 1*
Repeat ED visit at 30 days (%) | 20 | 22 | 22 | 21
Readmission at 30 days (%) | 17 | 20 | 20 | 17
Poor outcome at 24 hours (ICU transfer, cardiac arrest, or death) (%) | 2 | 1 | 2 | 1
  • Abbreviations: ED, emergency department; ICU, intensive care unit.
  • * P < 0.05. † P = 0.06.

Patients admitted to the hospital during the weekend were similar to patients admitted during the week for unadjusted LOS, 30 day repeat ED visit and readmission rates, and poor outcomes within 24 hours of admission; however, they had lower hospital charges (weekend admission $22,700 vs weekday admission $27,200; P = 0.02) and a lower chance of ICU transfer during hospitalization (weekend admission 1% vs weekday admission 5%; P = 0.02) (Table 2).

In the multivariable linear and logistic regression models (Tables 3 and 4), we assessed the independent association between night or weekend admission and each hospitalization‐relevant outcome except poor outcome within 24 hours of admission, which was not modeled to avoid the risk of overfitting because there were only 13 total events. After adjustment for age, gender, race and ethnicity, admitting service (hospitalist or teaching), Medicaid or self‐pay insurance, MDC, and Charlson Comorbidity Index, there was no statistically significant association between night admission and worse outcomes for LOS, hospital charges, 30 day repeat ED visit, or 30 day readmission. Night admission was associated with a decreased chance of ICU transfer during hospitalization, but the difference was not statistically significant (odds ratio, 0.54; 95% confidence interval [CI], 0.26‐1.11; P = 0.09). Weekend admission was not associated with worse outcomes for LOS, 30 day repeat ED visit, or readmission; however, weekend admission was associated with a decrease in overall charges (−$4400; 95% CI, −$8300 to −$600) and a decreased chance of ICU transfer during hospitalization (odds ratio, 0.20; 95% CI, 0.05 to 0.88).

Linear Regressions for Continuous Outcomes (With Coefficients)

Predictor | Length of Stay (days), Coefficient (95% CI) | Hospital Charges (dollars), Coefficient (95% CI)
Night admission | -0.23 (-0.77 to 0.32) | -2100 (-5400 to 1100)
Weekend admission | -0.42 (-1.07 to 0.23) | -4400 (-8300 to -600)*
Age | 0.01 (-0.01 to 0.03) | 0 (-100 to 100)
Male gender | -0.15 (-0.70 to 0.39) | -400 (-3700 to 2800)
Race, Black | 0.18 (-0.41 to 0.78) | -200 (-3700 to 3400)
Ethnicity, Hispanic | -0.62 (-1.73 to 0.49) | -2300 (-8900 to 4300)
Medicaid or self‐pay insurance | 1.87 (0.93 to 2.82)* | 8900 (3300 to 14,600)*
Hospitalist service | 0.26 (-0.29 to 0.81) | -600 (-3900 to 2700)
MDC: respiratory | -0.36 (-1.18 to 0.46) | 700 (-4200 to 5600)
MDC: circulatory | -1.36 (-2.04 to -0.68)* | -600 (-4600 to 3400)
MDC: digestive | -1.22 (-2.08 to -0.35)* | -6800 (-12,000 to -1700)*
Charlson Comorbidity Index | 0.35 (0.22 to 0.49)* | 2200 (1400 to 3000)*

Abbreviations: CI, confidence interval; MDC, major diagnostic category (reference category: other).
* P < 0.05.
Logistic Regressions for Dichotomous Outcomes (With Odds Ratios)

Predictor | ICU Transfer During Hospitalization, Odds Ratio (95% CI) | Repeat ED Visit at 30 Days, Odds Ratio (95% CI) | Readmission at 30 Days, Odds Ratio (95% CI)
Night admission | 0.53 (0.26 to 1.11) | 1.13 (0.80 to 1.60) | 1.23 (0.86 to 1.78)
Weekend admission | 0.20 (0.05 to 0.88)* | 0.95 (0.63 to 1.44) | 0.80 (0.51 to 1.25)
Age | 1.00 (0.98 to 1.02) | 0.99 (0.98 to 1.002) | 1.00 (0.99 to 1.01)
Male gender | 0.98 (0.47 to 2.02) | 1.09 (0.78 to 1.54) | 0.91 (0.64 to 1.31)
Race, Black | 0.75 (0.33 to 1.70) | 1.48 (1.02 to 2.14)* | 1.12 (0.76 to 1.65)
Ethnicity, Hispanic | 0.76 (0.16 to 3.73) | 1.09 (0.55 to 2.17) | 1.11 (0.55 to 2.22)
Medicaid or self‐pay insurance | 0.75 (0.16 to 3.49) | 1.61 (0.95 to 2.72) | 2.14 (1.24 to 3.67)*
Hospitalist service | 0.68 (0.33 to 1.44) | 1.15 (0.81 to 1.63) | 0.99 (0.69 to 1.43)
MDC: respiratory | 1.18 (0.41 to 3.38) | 1.02 (0.61 to 1.69) | 1.16 (0.69 to 1.95)
MDC: circulatory | 1.22 (0.52 to 2.87) | 0.79 (0.51 to 1.22) | 0.80 (0.51 to 1.27)
MDC: digestive | 0.51 (0.11 to 2.32) | 0.83 (0.47 to 1.46) | 1.08 (0.62 to 1.91)
Charlson Comorbidity Index | 1.25 (1.09 to 1.45)* | 1.09 (1.01 to 1.19)* | 1.11 (1.02 to 1.21)*

Abbreviations: CI, confidence interval; ED, emergency department; ICU, intensive care unit; MDC, major diagnostic category (reference category: other).
* P < 0.05.
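The odds ratios in the logistic models above come from fitted regressions; for intuition, the unadjusted version of the same quantity, an odds ratio with its Wald 95% confidence interval computed from a 2x2 table, can be sketched as follows. The counts here are hypothetical illustrations, not data from this study.

```python
from math import exp, log, sqrt

def odds_ratio_with_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table and its Wald 95% CI.
    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event.
    The CI is computed on the log scale, where the standard error is
    sqrt(1/a + 1/b + 1/c + 1/d), then exponentiated back."""
    odds_ratio = (a * d) / (b * c)
    se_log = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = exp(log(odds_ratio) - z * se_log)
    upper = exp(log(odds_ratio) + z * se_log)
    return odds_ratio, lower, upper

# Hypothetical counts: 5 of 100 weekend admissions vs. 25 of 500
# weekday admissions transferred to the ICU.
or_, lo, hi = odds_ratio_with_ci(5, 95, 25, 475)
```

Because the interval is built on the log scale, a confidence interval that includes 1.0 corresponds to a non-significant association at the chosen level, which is how the starred and unstarred entries in the table differ.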

Our multivariable models explained very little of the variance in patient outcomes. For LOS and hospital charges, adjusted R² values were 0.06 and 0.05, respectively. For ICU transfer during hospitalization, 30 day repeat ED visit, and 30 day readmission, the areas under the receiver operating characteristic curves were 0.75, 0.51, and 0.61, respectively.
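As a point of reference for the discrimination statistics above, the area under the receiver operating characteristic curve equals the probability that a randomly chosen patient who had the event receives a higher predicted risk than a randomly chosen patient who did not, with ties counted as one half. A minimal sketch of that rank formulation, using made-up risk scores rather than this study's data:

```python
def auc(event_scores, nonevent_scores):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the fraction of (event, non-event) pairs in which the event case has
    the higher predicted risk, counting ties as half a win."""
    pairs = [(e, n) for e in event_scores for n in nonevent_scores]
    wins = sum(1.0 if e > n else 0.5 if e == n else 0.0 for e, n in pairs)
    return wins / len(pairs)

# A model that ranks every event above every non-event discriminates
# perfectly; one that assigns identical risks to everyone is no better
# than chance.
perfect = auc([0.9, 0.8], [0.2, 0.1])
chance = auc([0.5, 0.5], [0.5, 0.5])
```

On this scale, the 0.75 observed for ICU transfer indicates moderate discrimination, while the 0.51 for repeat ED visits is essentially chance.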

To assess the robustness of our conclusions regarding night admission, we redefined night to include only patients admitted between the hours of 8 pm and 5:59 am. This did not change our conclusions. We also tested for interaction between night admission and weekend admission for all outcomes to assess whether night admissions on the weekend were in fact at increased risk of worse outcomes; we found no evidence of interaction (P > 0.3 for the interaction terms in each model).
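The interaction check above asks whether the coefficient on the night-by-weekend product term differs from zero; once a model is fitted, that amounts to a Wald z-test on the coefficient and its standard error. A minimal sketch with hypothetical values (not estimates from this study):

```python
from math import erf, sqrt

def wald_p_value(beta, se):
    """Two-sided p-value for H0: coefficient = 0, using the normal
    approximation: z = beta / se, p = 2 * (1 - Phi(|z|))."""
    z = abs(beta / se)
    phi = 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF at |z|
    return 2.0 * (1.0 - phi)

# Hypothetical interaction coefficient and standard error: far from
# significant, in the spirit of the P > 0.3 reported for each model.
p = wald_p_value(0.25, 0.40)
```

A large p-value for the product term means the effect of night admission does not measurably differ between weekends and weekdays, which is what the models above found.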

Discussion

Among patients admitted to the medicine services at our academic medical center, night or weekend admission was not associated with worse hospitalization‐relevant outcomes. In some cases, night or weekend admission was associated with better outcomes, particularly in terms of ICU transfer during hospitalization and hospital charges. Prior research indicates worse outcomes during off‐hours,5 but we did not replicate this finding in our study.

The finding that admission at night was not associated with worse outcomes, particularly proximal outcomes such as LOS or ICU transfer during hospitalization, was surprising, though reassuring given that more than half of our patients are admitted at night. We believe several factors may be responsible. First, our general medicine service is staffed during the night (7 pm to 7 am) by in‐house nocturnists and night float residents. Second, our staffing ratio, while lower at night than during the day, remains the same on weekends and may be higher than in other settings. In continuously well‐staffed settings such as the ED12 and ICU,13 night and weekend admissions are only inconsistently associated with worse outcomes, which may be the same phenomenon we observed in the current study. Third, the hospital that served as the site of this study has received Nursing Magnet recognition and numerous quality awards, such as the National Research Corporation's Consumer Choice Award and recognition as a Distinguished Hospital for Clinical Excellence by HealthGrades. Fourth, our integrated electronic medical record, computerized physician order entry system, and automatically generated sign‐out complement the morning hand‐off. Fifth, hospitalists and teaching teams rotate on a weekly, biweekly, or every‐4‐week basis, which may protect against the discontinuity associated with the weekend. We believe that all of these factors may facilitate alert, comprehensive care during the night and weekend, as well as safe and efficient transfer of patients from night to day providers.

We were also surprised by the association between weekend admission and lower charges and a lower chance of ICU transfer during hospitalization. We believe many of the same factors noted above may have played a role in these findings. In terms of hospital charges, it is possible that some workups were completed outside of the hospital rather than during the hospitalization, and that some tests were not ordered at all due to unavailability on weekends. The decreased chance of ICU transfer is unexplained. We hypothesize that there may have been a more conservative admission strategy within the ED, such that patients with high baseline severity were admitted directly to the ICU on the weekend rather than being admitted first to the general medicine floor. This hypothesis requires further study.

Our study had important limitations. It was a retrospective study from a single academic hospital. The sample size lacked sufficient power to detect differences in low‐frequency outcomes such as poor outcome within 24 hours of admission (2% vs. 1%), and even in more frequent outcomes such as 30 day readmission; it is possible that a larger sample would have yielded statistically significant differences. Further, we recognize that the Charlson Comorbidity Index, which was developed to predict 1‐year mortality for medicine service patients, does not adjust for severity of illness at presentation, which is particularly relevant for outcomes such as readmission. If patients admitted at night and during the weekend were less acutely ill despite having similar comorbidities and MDCs at admission, true associations between time of admission and worse outcomes could have been masked. Furthermore, the multivariable models explained very little of the variance in patient outcomes, so significant unmeasured confounding may still be present; consequently, our results cannot be interpreted causally. Data were collected from electronic records, so it is possible that some adverse events were not recorded. However, it seems unlikely that major events such as death and transfer to an ICU would have been missed.

Several aspects of the study strengthen our confidence in the findings, including the large sample size, the relevance of the outcomes, the adjustment for confounders, and the assessment of robustness based on restricting the definition of night and testing for interaction between night and weekend admission. Our patient demographics and insurance mix resemble those of other academic hospitals,10 so our results may be generalizable to such settings, if not to non‐urban or community hospitals. Furthermore, the Charlson Comorbidity Index was associated with all 5 of the modeled outcomes we chose for our study, reaffirming the utility of these outcomes in assessing the quality of hospital care. Future directions for investigation include examining the association of night admission with hospitalization‐relevant outcomes in nonacademic, nonurban settings, and examining whether the lack of association between night and weekend admission and worse outcomes persists after adjustment for initial severity of illness.

In summary, at a large, well‐staffed urban academic hospital, neither the day nor the time of admission was associated with worse hospitalization‐relevant outcomes. The use of nocturnists and night float teams for night admissions, with continuity across weekends, appears to be a safe approach to handling the increased volume of patients admitted at night and a viable alternative to overnight call in the era of work hour restrictions.

References
  1. Vaughn DM, Stout CL, McCampbell BL, et al. Three‐year results of mandated work hour restrictions: attending and resident perspectives and effects in a community hospital. Am Surg. 2008;74(6):542-546; discussion 546-547.
  2. Kitch BT, Cooper JB, Zapol WM, et al. Handoffs causing patient harm: a survey of medical and surgical house staff. Jt Comm J Qual Patient Saf. 2008;34(10):563-570.
  3. Petersen LA, Brennan TA, O'Neil AC, Cook EF, Lee TH. Does housestaff discontinuity of care increase the risk for preventable adverse events? Ann Intern Med. 1994;121(11):866-872.
  4. Bell CM, Redelmeier DA. Mortality among patients admitted to hospitals on weekends as compared with weekdays. N Engl J Med. 2001;345(9):663-668.
  5. Peberdy MA, Ornato JP, Larkin GL, et al. Survival from in‐hospital cardiac arrest during nights and weekends. JAMA. 2008;299(7):785-792.
  6. Sharma G, Freeman J, Zhang D, Goodwin JS. Continuity of care and intensive care unit use at the end of life. Arch Intern Med. 2009;169(1):81-86.
  7. Sharma G, Fletcher KE, Zhang D, Kuo YF, Freeman JL, Goodwin JS. Continuity of outpatient and inpatient care by primary care physicians for hospitalized older adults. JAMA. 2009;301(16):1671-1680.
  8. Charlson ME, Ales KL, Simon R, MacKenzie CR. Why predictive indexes perform less well in validation studies: is it magic or methods? Arch Intern Med. 1987;147:2155-2161.
  9. Deyo RA, Cherkin DC, Ciol MA. Adapting a clinical comorbidity index for use with ICD‐9‐CM administrative databases. J Clin Epidemiol. 1992;45(6):613-619.
  10. Roy CL, Liang CL, Lund M, et al. Implementation of a physician assistant/hospitalist service in an academic medical center: impact on efficiency and patient outcomes. J Hosp Med. 2008;3(5):361-368.
  11. Groarke JD, Gallagher J, Stack J, et al. Use of an admission early warning score to predict patient morbidity and mortality and treatment success. Emerg Med J. 2008;25(12):803-806.
  12. Schmulewitz L, Proudfoot A, Bell D. The impact of weekends on outcome for emergency patients. Clin Med. 2005;5(6):621-625.
  13. Meynaar IA, van der Spoel JI, Rommes JH, van Spreuwel‐Verheijen M, Bosman RJ, Spronk PE. Off hour admission to an intensivist‐led ICU is not associated with increased mortality. Crit Care. 2009;13(3):R84.
Issue
Journal of Hospital Medicine - 6(1)
Page Number
10-14
Legacy Keywords
communication, continuity of care transition and discharge planning, education, outcomes measurement, patient safety, resident

The hospitalist movement and increasingly stringent resident work hour restrictions have led to the utilization of shift work in many hospitals.1 Use of nocturnist and night float systems, while often necessary, results in increased patient hand‐offs. Research suggests that hand‐offs in the inpatient setting can adversely affect patient outcomes as lack of continuity may increase the possibility of medical error.2, 3 In 2001, Bell et al.4 found that mortality was higher among patients admitted on weekends as compared to weekdays. Uneven staffing, lack of supervision, and fragmented care were cited as potential contributing factors.4 Similarly, Peberdy et al.5 in 2008 revealed that patients were less likely to survive a cardiac arrest if it occurred at night or on weekends, again attributed in part to fragmented patient care and understaffing.

The results of these studies raise concerns as to whether increased reliance on shift work and resulting handoffs compromises patient care.6, 7 The aim of this study was to evaluate the potential association between night admission and hospitalization‐relevant outcomes (length of stay [LOS], hospital charges, intensive care unit [ICU] transfer during hospitalization, repeat emergency department [ED] visit within 30 days of discharge, readmission within 30 days of discharge, and poor outcome [transfer to the ICU, cardiac arrest, or death] within the first 24 hours of admission) at an institution that exclusively uses nocturnists (night‐shift based hospitalists) and a resident night float system for patients admitted at night to the general medicine service. A secondary aim was to determine the potential association between weekend admission and hospitalization‐relevant outcomes.

Methods

Study Sample and Selection

We conducted a retrospective medical record review at a large urban academic hospital. Using an administrative hospital data set, we assembled a list of approximately 9000 admissions to the general medicine service from the ED between January 2008 and October 2008. We sampled consecutive admissions from 3 distinct periods beginning in January, April, and July to capture outcomes at various points in the academic year. We attempted to review approximately 10% of all charts equally distributed among the 3 sampling periods (ie, 900 charts total with one‐third from each period) based on time available to the reviewers. We excluded patients not admitted to the general medicine service and patients without complete demographic or outcome information. We also excluded patients not admitted from the ED given that the vast majority of admissions to our hospital during the night (96%) or weekend (93%) are from the ED. Patients admitted to the general medicine service are cared for either by a hospitalist or by a teaching team comprised of 1 attending (about 40% of whom are hospitalists), 1 resident, 1 to 2 interns, and 1 to 3 medical students. From 7 am to 6:59 pm patients are admitted to the care of 1 of the primary daytime admitting teams. From 7 pm to 6:59 am patients are admitted by nocturnists (hospitalist service) or night float residents (teaching service). These patients are handed off to day teams at 7 am. Hospitalist teams change service on a weekly to biweekly basis and resident teams switch on a monthly basis; there is no difference in physician staffing between the weekend and weekdays. The Northwestern University Institutional Review Board approved this study.

Data Acquisition and Medical Records Reviews

We obtained demographic data including gender, age, race and ethnicity, patient insurance, admission day (weekday vs. weekend), admission time (defined as the time that a patient receives a hospital bed, which at our institution is also the time that admitting teams receive report and assume care for the patient), and the International Classification of Disease codes required to determine the Major Diagnostic Category (MDC) and calculate the Charlson Comorbidity Index8, 9 as part of an administrative data set. We divided the admission time into night admission (defined as 7 pm to 6:59 am) and day admission (defined as 7:00 am to 6:59 pm). We created a chart abstraction tool to allow manual recording of the additional fields of admitting team (hospitalist vs. resident), 30 day repeat ED visit, 30 day readmission, and poor outcomes within the first 24 hours of admission, directly from the electronic record.

Study Outcomes

We evaluated each admission for the following 6 primary outcomes which were specified a priori: LOS (defined as discharge date and time minus admission date and time), hospital charges (defined as charges billed as recorded in the administrative data set), ICU transfer during hospitalization (defined as 1 ICU day in the administrative data set), 30 day repeat ED visit (defined as a visit to our ED within 30 days of discharge as assessed by chart abstraction), 30 day readmission (defined as any planned or unplanned admission to any inpatient service at our institution within 30 days of discharge as assessed by chart abstraction), and poor outcome within 24 hours of admission (defined as transfer to the ICU, cardiac arrest, or death as assessed by chart abstraction). Each of these outcomes has been used in prior work to assess the quality of inpatient care.10, 11

Statistical Analysis

Interrater reliability between the 3 physician reviewers was assessed for 20 randomly selected admissions across the 4 separate review measures using interclass correlation coefficients. Comparisons between night admissions and day admissions, and between weekend and weekday admissions, for the continuous primary outcomes (LOS, hospital charges) were assessed using 2‐tailed t‐tests as well as Wilcoxon rank sum test. In the multivariable modeling, these outcomes were assessed by linear regression controlling for age, gender, race and ethnicity, Medicaid or self‐pay insurance, admission to the hospitalist or teaching service, most common MDC categories, and Charlson Comorbidity Index. Because both outcomes were right‐skewed, we separately assessed each after log‐transformation controlling for the same variables.

All comparisons of the dichotomous primary outcomes (ICU transfer during hospitalization, 30 day repeat ED visit, 30 day readmission, and poor outcome within the first 24 hours after admission) were assessed at the univariate level by chi‐squared test, and in the multivariable models using logistic regression, controlling for the same variables as the linear models above. All adjustments were specified a priori. All data analyses were conducted using Stata (College Station, TX; Version 11).

Results

We reviewed 857 records. After excluding 33 records lacking administrative data regarding gender, race and ethnicity, and other demographic variables, there were 824 medical records available for analysis. We reviewed a similar number of records from each time period: 274 from January 2008, 265 from April 2008, and 285 from July 2008. A total of 345 (42%) patients were admitted during the day, and 479 (58%) at night; 641 (78%) were admitted on weekdays, and 183 (22%) on weekends. The 33 excluded charts were similar to the included charts for both time of admission and outcomes. Results for parametric testing and nonparametric testing, as well as for log‐transformation and non‐log‐transformation of the continuous outcomes were similar in both magnitude and statistical significance, so we present the parametric and nonlog‐transformed results below for ease of interpretation.

Interrater reliability among the 3 reviewers was very high. There were no disagreements among the 20 multiple reviews for either poor outcomes within 24 hours of admission or admitting service; the interclass correlation coefficients for 30 day repeat ED visit and 30 day readmission were 0.97 and 0.87, respectively.

Patients admitted at night or on the weekend were similar to patients admitted during the day and week across age, gender, insurance class, MDC, and Charlson Comorbidity Index (Table 1). For unadjusted outcomes, patients admitted at night has a similar LOS, hospital charges, 30 day repeat ED visits, 30 day readmissions, and poor outcome within 24 hours of admission as those patients admitted during the day. They had a potentially lower chance of any ICU transfer during hospitalization though this did not reach statistical significance at P < 0.05 (night admission 6%, day admission 3%, P = 0.06) (Table 2).

Baseline Characteristics of Patients
CharacteristicsTime of DayDay of the Week
Day Admission (n = 345)Night Admission (n = 479)Weekday Admission (n = 641)Weekend Admission (n = 183)
  • NOTE: All P values > 0.05.

  • Abbreviation: ED, emergency department.

Age (years)60.859.760.658.7
Gender (% male)47434546
Race/Ethnicity (%)
White, Asian, other61545755
Black34383734
Hispanic58610
Medicaid or self pay (%)9101011
Major diagnostic category (%)
Respiratory disease14131413
Circulatory disease28232624
Digestive disease12121212
Other45524851
Charlson Comorbidity Index3.713.603.663.60
Outcomes, Unadjusted
OutcomesTime of DayDay of the Week
Day Admission (n = 345)Night Admission (n = 479)Weekday Admission (n = 641)Weekend Admission (n = 183)
  • Abbreviations: ED, emergency department; ICU, intensive care unit.

  • P < 0.05.

  • P= 0.06.

Length of stay4.34.14.33.8
Hospital charges$27,500$25,200$27,200*$22,700*
ICU transfer during hospitalization (%)635*1*
Repeat ED visit at 30 days (%)20222221
Readmission at 30 days (%)17202017
Poor outcome at 24 hours (ICU transfer, cardiac arrest, or death)(%)2121

Patients admitted to the hospital during the weekend were similar to patients admitted during the week for unadjusted LOS, 30 day repeat ED visit or readmission rate, and poor outcomes within 24 hours of admission as those admitted during the week; however, they had lower hospital charges (weekend admission $22,700, weekday admission $27,200; P = 0.02), and a lower chance of ICU transfer during hospitalization (weekend admission 1%, weekday admission 5%; P = 0.02) (Table 2).

In the multivariable linear and logistic regression models (Tables 3 and 4), we assessed the independent association between night admission or weekend admission and each hospitalization‐relevant outcome except for poor outcome within 24 hours of admission (poor outcome within 24 hours of admission was not modeled to avoid the risk of overfitting because there were only 13 total events). After adjustment for age, gender, race and ethnicity, admitting service (hospitalist or teaching), Medicaid or self‐pay insurance, MDC, and Charlson Comorbidity Index, there was no statistically significant association between night admission and worse outcomes for LOS, hospital charges, 30 day repeat ED visit, or 30 day readmission. Night admission was associated with a decreased chance of ICU transfer during hospitalization, but the difference was not statistically significant (odds ratio, 0.54; 95% confidence interval [CI], 0.26‐1.11, P = 0.09). Weekend admission was not associated with worse outcomes for LOS or 30 day repeat ED visit or readmission; however, weekend admission was associated with a decrease in overall charges ($4400; 95% CI, $8300 to $600) and a decreased chance of ICU transfer during hospitalization (odds ratio, 0.20; 95% CI, 0.050.88).

Linear Regressions for Continuous Outcomes (With Coefficients)
PredictorsLength of Stay (days), Coefficient (95% CI)Hospital Charges (dollars), Coefficient (95% CI)
  • Abbreviations: CI, confidence intervals; ICU, intensive care unit; MDC, major diagnostic category: comparison to other.

  • P < 0.05.

Night admission0.23 (0.77 to 0.32)2100 (5400 to 1100)
Weekend admission0.42 (1.07 to 0.23)4400 (8300 to 600)*
Age0.01 (0.01 to 0.03)0 (100 to 100)
Male gender0.15 (0.70 to 0.39)400 (3700 to 2800)
Race, Black0.18 (0.41 to 0.78)200 (3700 to 3400)
Ethnicity, Hispanic0.62 (1.73 to 0.49)2300 (8900 to 4300)
Medicaid or self‐pay insurance1.87 (0.93 to 2.82)*8900 (3300 to 14600)*
Hospitalist service0.26 (0.29 to 0.81)600 (3900 to 2700)
MDC: respiratory0.36 (1.18 to 0.46)700 (4200 to 5600)
MDC: circulatory1.36 (2.04 to 0.68)*600 (4600 to 3400)
MDC: digestive1.22 (2.08 to 0.35)*6800 (12000 to 1700)*
Charlson Comorbidity Index0.35 (0.22 to 0.49)*2200 (1400 to 3000)*
Logistic Regressions for Dichotomous Outcomes (With Odds Ratios)
PredictorsICU Transfer during Hospitalization, Odds Ratio (95% CI)Repeat ED Visit at 30 days, Odds Ratio (95% CI)Readmission at 30 days, Odds Ratio (95% CI)
  • Abbreviations: CI, confidence intervals; ICU, intensive care unit; MDC, major diagnostic category: comparison to other.

  • P < 0.05.

Night admission0.53 (0.26 to 1.11)1.13 (0.80 to 1.60)1.23 (0.86 to 1.78)
Weekend admission0.20 (0.05 to 0.88)*0.95 (0.63 to 1.44)0.80 (0.51 to 1.25)
Age1.00 (0.98 to 1.02)0.99 (0.98 to 1.002)1.00 (0.99 to 1.01)
Male gender0.98 (0.47 to 2.02)1.09 (0.78 to 1.54)0.91 (0.64 to 1.31)
Race, Black0.75 (0.33 to 1.70)1.48 (1.02 to 2.14)*1.12 (0.76 to 1.65)
Ethnicity, Hispanic0.76 (0.16 to 3.73)1.09 (0.55 to 2.17)1.11 (0.55 to 2.22)
Medicaid or self‐pay insurance0.75 (0.16 to 3.49)1.61 (0.95 to 2.72)2.14 (1.24 to 3.67)*
Hospitalist service0.68 (0.33 to 1.44)1.15 (0.81 to 1.63)0.99 (0.69 to 1.43)
MDC: respiratory1.18 (0.41 to 3.38)1.02 (0.61 to 1.69)1.16 (0.69 to 1.95)
MDC: circulatory1.22 (0.52 to 2.87)0.79 (0.51 to 1.22)0.80 (0.51 to 1.27)
MDC: digestive0.51 (0.11 to 2.32)0.83 (0.47 to 1.46)1.08 (0.62 to 1.91)
Charlson Comobrbidity Index1.25 (1.09 to 1.45)*1.09 (1.01 to 1.19)*1.11 (1.02 to 1.21)*

Our multivariate models explained very little of the variance in patient outcomes. For LOS and hospital charges, adjusted R2 values were 0.06 and 0.05, respectively. For ICU transfer during hospitalization, 30 day repeat ED visit, and 30 day readmission, the areas under the receiver operator curves were 0.75, 0.51, and 0.61 respectively.

To assess the robustness of our conclusions regarding night admission, we redefined night to include only patients admitted between the hours of 8 pm and 5:59 am. This did not change our conclusions. We also tested for interaction between night admission and weekend admission for all outcomes to assess whether night admissions on the weekend were in fact at increased risk of worse outcomes; we found no evidence of interaction (P > 0.3 for the interaction terms in each model).

Discussion

Among patients admitted to the medicine services at our academic medical center, night or weekend admission was not associated with worse hospitalization‐relevant outcomes. In some cases, night or weekend admission was associated with better outcomes, particularly in terms of ICU transfer during hospitalization and hospital charges. Prior research indicates worse outcomes during off‐hours,5 but we did not replicate this finding in our study.

The finding that admission at night was not associated with worse outcomes, particularly proximal outcomes such as LOS or ICU transfer during hospitalization, was surprising, though reassuring in view of the fact that more than half of our patients are admitted at night. We believe a few factors may be responsible. First, our general medicine service is staffed during the night (7 pm to 7 am) by in‐house nocturnists and night float residents. Second, our staffing ratio, while lower at night than during the day, remains the same on weekends and may be higher than in other settings. In continuously well‐staffed settings such as the ED12 and ICU,13 night and weekend admissions are only inconsistently associated with worse outcomes, which may be the same phenomena we observed in the current study. Third, the hospital used as the site of this study has received Nursing Magnet recognition and numerous quality awards such as the National Research Corporation's Consumer Choice Award and recognition as a Distinguished Hospital for Clinical Excellence by HealthGrades. Fourth, our integrated electronic medical record, computerized physician order entry system, and automatically generated sign out serve as complements to the morning hand off. Fifth, hospitalists and teaching teams rotate on a weekly, biweekly, or every 4 week basis, which may protect against discontinuity associated with the weekend. We believe that all of these factors may facilitate alert, comprehensive care during the night and weekend as well as safe and efficient transfer of patients from the night to the day providers.

We were also surprised by the association between weekend admission and lower charges and a lower chance of ICU transfer during hospitalization. We believe many of the same factors noted above may have played a role in these findings. In terms of hospital charges, it is possible that some workups were completed outside of the hospital rather than during the hospitalization, and that some tests were not ordered at all due to unavailability on weekends. The decreased chance of ICU transfer is unexplained. We hypothesize that there may have been a more conservative admission strategy within the ED, such that patients with high baseline severity were admitted directly to the ICU on the weekend rather than being admitted first to the general medicine floor. This hypothesis requires further study.

Our study had important limitations. It was a retrospective study from a single academic hospital. The sample size lacked sufficient power to detect differences in the low frequency of certain outcomes such as poor outcomes within 24 hours of admission (2% vs. 1%), and also for more frequent outcomes such as 30 day readmission; it is possible that with a larger sample there would have been statistically significant differences. Further, we recognize that the Charlson Comorbidity Index, which was developed to predict 1‐year mortality for medicine service patients, does not adjust for severity of illness at presentation, particularly for outcomes such as readmission. If patients admitted at night and during the weekend were less acutely ill despite having similar comorbidities and MDCs at admission, true associations between time of admission and worse outcomes could have been masked. Furthermore, the multivariable modeling explained very little of the variance in patient outcomes such that significant unmeasured confounding may still be present, and consequently our results cannot be interpreted in a causal way. Data was collected from electronic records, so it is possible that some adverse events were not recorded. However, it seems unlikely that major events such as death and transfer to an ICU would have been missed.

Several aspects of the study strengthen our confidence in the findings, including a large sample size, relevance of the outcomes, the adjustment for confounders, and an assessment for robustness of the conclusions based on restricting the definition of night and also testing for interaction between night and weekend admission. Our patient demographics and insurance mix resemble that of other academic hospitals,10 and perhaps our results may be generalizable to these settings, if not to non‐urban or community hospitals. Furthermore, the Charlson Comorbidity Index was associated with all 5 of the modeled outcomes we chose for our study, reaffirming their utility in assessing the quality of hospital care. Future directions for investigation may include examining the association of night admission with hospitalization‐relevant outcomes in nonacademic, nonurban settings, and examining whether the lack of association between night and weekend admission and worse outcomes persists with adjustment for initial severity of illness.

In summary, at a large, well‐staffed urban academic hospital, day or time of admission were not associated with worse hospitalization‐relevant outcomes. The use of nocturnists and night float teams for night admissions and continuity across weekends appears to be a safe approach to handling the increased volume of patients admitted at night, and a viable alternative to overnight call in the era of work hour restrictions.

The hospitalist movement and increasingly stringent resident work hour restrictions have led to the utilization of shift work in many hospitals.1 Use of nocturnist and night float systems, while often necessary, results in increased patient hand‐offs. Research suggests that hand‐offs in the inpatient setting can adversely affect patient outcomes as lack of continuity may increase the possibility of medical error.2, 3 In 2001, Bell et al.4 found that mortality was higher among patients admitted on weekends as compared to weekdays. Uneven staffing, lack of supervision, and fragmented care were cited as potential contributing factors.4 Similarly, Peberdy et al.5 in 2008 revealed that patients were less likely to survive a cardiac arrest if it occurred at night or on weekends, again attributed in part to fragmented patient care and understaffing.

The results of these studies raise concerns as to whether increased reliance on shift work and the resulting hand‐offs compromise patient care.6, 7 The aim of this study was to evaluate the potential association between night admission and hospitalization‐relevant outcomes (length of stay [LOS], hospital charges, intensive care unit [ICU] transfer during hospitalization, repeat emergency department [ED] visit within 30 days of discharge, readmission within 30 days of discharge, and poor outcome [transfer to the ICU, cardiac arrest, or death] within the first 24 hours of admission) at an institution that exclusively uses nocturnists (night‐shift based hospitalists) and a resident night float system for patients admitted at night to the general medicine service. A secondary aim was to determine the potential association between weekend admission and hospitalization‐relevant outcomes.

Methods

Study Sample and Selection

We conducted a retrospective medical record review at a large urban academic hospital. Using an administrative hospital data set, we assembled a list of approximately 9000 admissions to the general medicine service from the ED between January 2008 and October 2008. We sampled consecutive admissions from 3 distinct periods beginning in January, April, and July to capture outcomes at various points in the academic year. We attempted to review approximately 10% of all charts equally distributed among the 3 sampling periods (ie, 900 charts total with one‐third from each period) based on time available to the reviewers. We excluded patients not admitted to the general medicine service and patients without complete demographic or outcome information. We also excluded patients not admitted from the ED given that the vast majority of admissions to our hospital during the night (96%) or weekend (93%) are from the ED. Patients admitted to the general medicine service are cared for either by a hospitalist or by a teaching team comprised of 1 attending (about 40% of whom are hospitalists), 1 resident, 1 to 2 interns, and 1 to 3 medical students. From 7 am to 6:59 pm patients are admitted to the care of 1 of the primary daytime admitting teams. From 7 pm to 6:59 am patients are admitted by nocturnists (hospitalist service) or night float residents (teaching service). These patients are handed off to day teams at 7 am. Hospitalist teams change service on a weekly to biweekly basis and resident teams switch on a monthly basis; there is no difference in physician staffing between the weekend and weekdays. The Northwestern University Institutional Review Board approved this study.

Data Acquisition and Medical Records Reviews

We obtained demographic data including gender, age, race and ethnicity, patient insurance, admission day (weekday vs. weekend), admission time (defined as the time that a patient receives a hospital bed, which at our institution is also the time that admitting teams receive report and assume care for the patient), and the International Classification of Disease codes required to determine the Major Diagnostic Category (MDC) and calculate the Charlson Comorbidity Index8, 9 as part of an administrative data set. We divided the admission time into night admission (defined as 7 pm to 6:59 am) and day admission (defined as 7:00 am to 6:59 pm). We created a chart abstraction tool to allow manual recording of the additional fields of admitting team (hospitalist vs. resident), 30 day repeat ED visit, 30 day readmission, and poor outcomes within the first 24 hours of admission, directly from the electronic record.
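The night/day split defined above can be sketched as a simple timestamp check. This is an illustrative Python fragment, not code used in the study; the function and constant names are ours:

```python
from datetime import datetime

# Night window used in the study: 7:00 pm through 6:59 am (names are ours).
NIGHT_START_HOUR = 19  # 7 pm
NIGHT_END_HOUR = 7     # admissions before 7 am count as night

def is_night_admission(admit_time: datetime) -> bool:
    """Classify an admission timestamp as night (True) or day (False)."""
    return admit_time.hour >= NIGHT_START_HOUR or admit_time.hour < NIGHT_END_HOUR
```

By this rule an admission at 6:59 am counts as a night admission and one at 7:00 am as a day admission, matching the definitions above.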

Study Outcomes

We evaluated each admission for the following 6 primary outcomes, which were specified a priori: LOS (defined as discharge date and time minus admission date and time), hospital charges (defined as charges billed as recorded in the administrative data set), ICU transfer during hospitalization (defined as at least 1 ICU day in the administrative data set), 30 day repeat ED visit (defined as a visit to our ED within 30 days of discharge as assessed by chart abstraction), 30 day readmission (defined as any planned or unplanned admission to any inpatient service at our institution within 30 days of discharge as assessed by chart abstraction), and poor outcome within 24 hours of admission (defined as transfer to the ICU, cardiac arrest, or death as assessed by chart abstraction). Each of these outcomes has been used in prior work to assess the quality of inpatient care.10, 11

Statistical Analysis

Interrater reliability between the 3 physician reviewers was assessed for 20 randomly selected admissions across the 4 separate review measures using intraclass correlation coefficients. Comparisons between night and day admissions, and between weekend and weekday admissions, for the continuous primary outcomes (LOS, hospital charges) were assessed using 2‐tailed t‐tests as well as the Wilcoxon rank sum test. In the multivariable modeling, these outcomes were assessed by linear regression controlling for age, gender, race and ethnicity, Medicaid or self‐pay insurance, admission to the hospitalist or teaching service, the most common MDC categories, and Charlson Comorbidity Index. Because both outcomes were right‐skewed, we separately assessed each after log‐transformation, controlling for the same variables.
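As a rough illustration of the unadjusted comparison for a continuous outcome such as LOS, the two-sample t statistic can be computed as below. The paper does not specify which t-test variant was used; this sketch assumes Welch's unequal-variance form, and the data are hypothetical:

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch two-sample t statistic and Welch-Satterthwaite degrees of freedom."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances (n - 1 denominator)
    se2 = va / na + vb / nb                          # squared standard error of the difference
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical LOS samples (days) for night vs. day admissions
t_stat, deg_freedom = welch_t([4.1, 3.9, 4.3, 4.0], [4.4, 4.2, 4.3, 4.5])
```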

All comparisons of the dichotomous primary outcomes (ICU transfer during hospitalization, 30 day repeat ED visit, 30 day readmission, and poor outcome within the first 24 hours after admission) were assessed at the univariate level by chi‐squared test, and in the multivariable models using logistic regression, controlling for the same variables as the linear models above. All adjustments were specified a priori. All data analyses were conducted using Stata (College Station, TX; Version 11).
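The univariate comparison of a dichotomous outcome (say, ICU transfer by admission period) reduces to a Pearson chi-squared test on a 2x2 table, which can be sketched as follows; the counts in the test case are hypothetical, not the study's:

```python
def chi_square_2x2(table):
    """Pearson chi-squared statistic for a 2x2 contingency table [[a, b], [c, d]],
    e.g. rows = night/day admission, columns = outcome present/absent."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n  # expected count under independence
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2
```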

Results

We reviewed 857 records. After excluding 33 records lacking administrative data regarding gender, race and ethnicity, and other demographic variables, there were 824 medical records available for analysis. We reviewed a similar number of records from each time period: 274 from January 2008, 265 from April 2008, and 285 from July 2008. A total of 345 (42%) patients were admitted during the day, and 479 (58%) at night; 641 (78%) were admitted on weekdays, and 183 (22%) on weekends. The 33 excluded charts were similar to the included charts for both time of admission and outcomes. Results for parametric testing and nonparametric testing, as well as for log‐transformation and non‐log‐transformation of the continuous outcomes were similar in both magnitude and statistical significance, so we present the parametric and nonlog‐transformed results below for ease of interpretation.

Interrater reliability among the 3 reviewers was very high. There were no disagreements among the 20 multiple reviews for either poor outcomes within 24 hours of admission or admitting service; the intraclass correlation coefficients for 30 day repeat ED visit and 30 day readmission were 0.97 and 0.87, respectively.

Patients admitted at night or on the weekend were similar to patients admitted during the day and week across age, gender, insurance class, MDC, and Charlson Comorbidity Index (Table 1). For unadjusted outcomes, patients admitted at night had a similar LOS, hospital charges, 30 day repeat ED visits, 30 day readmissions, and poor outcome within 24 hours of admission as those patients admitted during the day. They had a potentially lower chance of any ICU transfer during hospitalization, though this did not reach statistical significance at P < 0.05 (night admission 3%, day admission 6%, P = 0.06) (Table 2).

Table 1. Baseline Characteristics of Patients

Characteristics | Day Admission (n = 345) | Night Admission (n = 479) | Weekday Admission (n = 641) | Weekend Admission (n = 183)
Age (years) | 60.8 | 59.7 | 60.6 | 58.7
Gender (% male) | 47 | 43 | 45 | 46
Race/ethnicity (%)
  White, Asian, other | 61 | 54 | 57 | 55
  Black | 34 | 38 | 37 | 34
  Hispanic | 5 | 8 | 6 | 10
Medicaid or self pay (%) | 9 | 10 | 10 | 11
Major diagnostic category (%)
  Respiratory disease | 14 | 13 | 14 | 13
  Circulatory disease | 28 | 23 | 26 | 24
  Digestive disease | 12 | 12 | 12 | 12
  Other | 45 | 52 | 48 | 51
Charlson Comorbidity Index | 3.71 | 3.60 | 3.66 | 3.60

NOTE: All P values > 0.05. Abbreviation: ED, emergency department.
Table 2. Outcomes, Unadjusted

Outcomes | Day Admission (n = 345) | Night Admission (n = 479) | Weekday Admission (n = 641) | Weekend Admission (n = 183)
Length of stay (days) | 4.3 | 4.1 | 4.3 | 3.8
Hospital charges | $27,500 | $25,200 | $27,200* | $22,700*
ICU transfer during hospitalization (%) | 6† | 3† | 5* | 1*
Repeat ED visit at 30 days (%) | 20 | 22 | 22 | 21
Readmission at 30 days (%) | 17 | 20 | 20 | 17
Poor outcome at 24 hours (ICU transfer, cardiac arrest, or death) (%) | 2 | 1 | 2 | 1

NOTE: Abbreviations: ED, emergency department; ICU, intensive care unit. *P < 0.05. †P = 0.06.

Patients admitted to the hospital during the weekend were similar to those admitted during the week for unadjusted LOS, 30 day repeat ED visit and readmission rates, and poor outcomes within 24 hours of admission; however, they had lower hospital charges (weekend admission $22,700, weekday admission $27,200; P = 0.02) and a lower chance of ICU transfer during hospitalization (weekend admission 1%, weekday admission 5%; P = 0.02) (Table 2).

In the multivariable linear and logistic regression models (Tables 3 and 4), we assessed the independent association between night admission or weekend admission and each hospitalization‐relevant outcome except for poor outcome within 24 hours of admission, which was not modeled to avoid the risk of overfitting because there were only 13 total events. After adjustment for age, gender, race and ethnicity, admitting service (hospitalist or teaching), Medicaid or self‐pay insurance, MDC, and Charlson Comorbidity Index, there was no statistically significant association between night admission and worse outcomes for LOS, hospital charges, 30 day repeat ED visit, or 30 day readmission. Night admission was associated with a decreased chance of ICU transfer during hospitalization, but the difference was not statistically significant (odds ratio, 0.53; 95% CI, 0.26‐1.11; P = 0.09). Weekend admission was not associated with worse outcomes for LOS or 30 day repeat ED visit or readmission; however, weekend admission was associated with a decrease in overall charges (−$4400; 95% CI, −$8300 to −$600) and a decreased chance of ICU transfer during hospitalization (odds ratio, 0.20; 95% CI, 0.05‐0.88).

Table 3. Linear Regressions for Continuous Outcomes (With Coefficients)

Predictors | Length of Stay (days), Coefficient (95% CI) | Hospital Charges (dollars), Coefficient (95% CI)
Night admission | −0.23 (−0.77 to 0.32) | −2100 (−5400 to 1100)
Weekend admission | −0.42 (−1.07 to 0.23) | −4400 (−8300 to −600)*
Age | 0.01 (−0.01 to 0.03) | 0 (−100 to 100)
Male gender | −0.15 (−0.70 to 0.39) | −400 (−3700 to 2800)
Race, Black | 0.18 (−0.41 to 0.78) | −200 (−3700 to 3400)
Ethnicity, Hispanic | −0.62 (−1.73 to 0.49) | −2300 (−8900 to 4300)
Medicaid or self‐pay insurance | 1.87 (0.93 to 2.82)* | 8900 (3300 to 14600)*
Hospitalist service | 0.26 (−0.29 to 0.81) | −600 (−3900 to 2700)
MDC: respiratory | −0.36 (−1.18 to 0.46) | 700 (−4200 to 5600)
MDC: circulatory | −1.36 (−2.04 to −0.68)* | −600 (−4600 to 3400)
MDC: digestive | −1.22 (−2.08 to −0.35)* | −6800 (−12000 to −1700)*
Charlson Comorbidity Index | 0.35 (0.22 to 0.49)* | 2200 (1400 to 3000)*

NOTE: Abbreviations: CI, confidence interval; MDC, major diagnostic category (comparison is to the other category). *P < 0.05.
Table 4. Logistic Regressions for Dichotomous Outcomes (With Odds Ratios)

Predictors | ICU Transfer During Hospitalization, Odds Ratio (95% CI) | Repeat ED Visit at 30 Days, Odds Ratio (95% CI) | Readmission at 30 Days, Odds Ratio (95% CI)
Night admission | 0.53 (0.26 to 1.11) | 1.13 (0.80 to 1.60) | 1.23 (0.86 to 1.78)
Weekend admission | 0.20 (0.05 to 0.88)* | 0.95 (0.63 to 1.44) | 0.80 (0.51 to 1.25)
Age | 1.00 (0.98 to 1.02) | 0.99 (0.98 to 1.002) | 1.00 (0.99 to 1.01)
Male gender | 0.98 (0.47 to 2.02) | 1.09 (0.78 to 1.54) | 0.91 (0.64 to 1.31)
Race, Black | 0.75 (0.33 to 1.70) | 1.48 (1.02 to 2.14)* | 1.12 (0.76 to 1.65)
Ethnicity, Hispanic | 0.76 (0.16 to 3.73) | 1.09 (0.55 to 2.17) | 1.11 (0.55 to 2.22)
Medicaid or self‐pay insurance | 0.75 (0.16 to 3.49) | 1.61 (0.95 to 2.72) | 2.14 (1.24 to 3.67)*
Hospitalist service | 0.68 (0.33 to 1.44) | 1.15 (0.81 to 1.63) | 0.99 (0.69 to 1.43)
MDC: respiratory | 1.18 (0.41 to 3.38) | 1.02 (0.61 to 1.69) | 1.16 (0.69 to 1.95)
MDC: circulatory | 1.22 (0.52 to 2.87) | 0.79 (0.51 to 1.22) | 0.80 (0.51 to 1.27)
MDC: digestive | 0.51 (0.11 to 2.32) | 0.83 (0.47 to 1.46) | 1.08 (0.62 to 1.91)
Charlson Comorbidity Index | 1.25 (1.09 to 1.45)* | 1.09 (1.01 to 1.19)* | 1.11 (1.02 to 1.21)*

NOTE: Abbreviations: CI, confidence interval; ICU, intensive care unit; MDC, major diagnostic category (comparison is to the other category). *P < 0.05.

Our multivariable models explained very little of the variance in patient outcomes. For LOS and hospital charges, adjusted R2 values were 0.06 and 0.05, respectively. For ICU transfer during hospitalization, 30 day repeat ED visit, and 30 day readmission, the areas under the receiver operating characteristic curves were 0.75, 0.51, and 0.61, respectively.
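The reported areas under the receiver operating characteristic curves measure how well each logistic model discriminates events from non-events. A minimal sketch of that calculation, using the rank (Mann-Whitney) formulation; the labels and scores are hypothetical:

```python
def auroc(labels, scores):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the probability that a randomly chosen event outranks a randomly chosen
    non-event, counting score ties as one half."""
    pos = [s for label, s in zip(labels, scores) if label == 1]
    neg = [s for label, s in zip(labels, scores) if label == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0 for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.51, as seen for repeat ED visits, means the model ranks a random event above a random non-event barely more often than chance.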

To assess the robustness of our conclusions regarding night admission, we redefined night to include only patients admitted between the hours of 8 pm and 5:59 am. This did not change our conclusions. We also tested for interaction between night admission and weekend admission for all outcomes to assess whether night admissions on the weekend were in fact at increased risk of worse outcomes; we found no evidence of interaction (P > 0.3 for the interaction terms in each model).
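The interaction check amounts to adding the product of the two admission indicators to each model and testing its coefficient; a minimal, purely illustrative sketch of the extra design-matrix column (names are ours):

```python
def admission_predictors(night: int, weekend: int) -> list:
    """Main effects plus their product term; the product is 1 only for
    night admissions on weekends, the subgroup the interaction test probes."""
    return [night, weekend, night * weekend]
```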

Discussion

Among patients admitted to the medicine services at our academic medical center, night or weekend admission was not associated with worse hospitalization‐relevant outcomes. In some cases, night or weekend admission was associated with better outcomes, particularly in terms of ICU transfer during hospitalization and hospital charges. Prior research indicates worse outcomes during off‐hours,5 but we did not replicate this finding in our study.

The finding that admission at night was not associated with worse outcomes, particularly proximal outcomes such as LOS or ICU transfer during hospitalization, was surprising, though reassuring in view of the fact that more than half of our patients are admitted at night. We believe a few factors may be responsible. First, our general medicine service is staffed during the night (7 pm to 7 am) by in‐house nocturnists and night float residents. Second, our staffing ratio, while lower at night than during the day, remains the same on weekends and may be higher than in other settings. In continuously well‐staffed settings such as the ED12 and ICU,13 night and weekend admissions are only inconsistently associated with worse outcomes, which may be the same phenomenon we observed in the current study. Third, the hospital used as the site of this study has received Nursing Magnet recognition and numerous quality awards such as the National Research Corporation's Consumer Choice Award and recognition as a Distinguished Hospital for Clinical Excellence by HealthGrades. Fourth, our integrated electronic medical record, computerized physician order entry system, and automatically generated sign‐out serve as complements to the morning hand‐off. Fifth, hospitalists and teaching teams rotate on a weekly, biweekly, or every‐4‐week basis, which may protect against discontinuity associated with the weekend. We believe that all of these factors may facilitate alert, comprehensive care during the night and weekend as well as safe and efficient transfer of patients from the night to the day providers.

We were also surprised by the association between weekend admission and lower charges and a lower chance of ICU transfer during hospitalization. We believe many of the same factors noted above may have played a role in these findings. In terms of hospital charges, it is possible that some workups were completed outside of the hospital rather than during the hospitalization, and that some tests were not ordered at all due to unavailability on weekends. The decreased chance of ICU transfer is unexplained. We hypothesize that there may have been a more conservative admission strategy within the ED, such that patients with high baseline severity were admitted directly to the ICU on the weekend rather than being admitted first to the general medicine floor. This hypothesis requires further study.

Our study had important limitations. It was a retrospective study from a single academic hospital. The sample size lacked sufficient power to detect differences in low‐frequency outcomes such as poor outcomes within 24 hours of admission (2% vs. 1%), and also for more frequent outcomes such as 30 day readmission; it is possible that with a larger sample there would have been statistically significant differences. Further, we recognize that the Charlson Comorbidity Index, which was developed to predict 1‐year mortality for medicine service patients, does not adjust for severity of illness at presentation, particularly for outcomes such as readmission. If patients admitted at night and during the weekend were less acutely ill despite having similar comorbidities and MDCs at admission, true associations between time of admission and worse outcomes could have been masked. Furthermore, the multivariable modeling explained very little of the variance in patient outcomes, such that significant unmeasured confounding may still be present; consequently, our results cannot be interpreted causally. Data were collected from electronic records, so it is possible that some adverse events were not recorded. However, it seems unlikely that major events such as death and transfer to an ICU would have been missed.

Several aspects of the study strengthen our confidence in the findings, including a large sample size, relevance of the outcomes, the adjustment for confounders, and an assessment of the robustness of the conclusions based on restricting the definition of night and testing for interaction between night and weekend admission. Our patient demographics and insurance mix resemble those of other academic hospitals,10 and our results may be generalizable to these settings, though perhaps not to non‐urban or community hospitals. Furthermore, the Charlson Comorbidity Index was associated with all 5 of the modeled outcomes in our study, reaffirming its utility in assessing the quality of hospital care. Future directions for investigation may include examining the association of night admission with hospitalization‐relevant outcomes in nonacademic, nonurban settings, and examining whether the lack of association between night and weekend admission and worse outcomes persists with adjustment for initial severity of illness.

In summary, at a large, well‐staffed urban academic hospital, neither day nor time of admission was associated with worse hospitalization‐relevant outcomes. The use of nocturnists and night float teams for night admissions, and continuity across weekends, appears to be a safe approach to handling the increased volume of patients admitted at night and a viable alternative to overnight call in the era of work hour restrictions.

References
  1. Vaughn DM,Stout CL,McCampbell BL, et al.Three‐year results of mandated work hour restrictions: attending and resident perspectives and effects in a community hospital.Am Surg.2008;74(6):542546; discussion 546–547.
  2. Kitch BT,Cooper JB,Zapol WM, et al.Handoffs causing patient harm: a survey of medical and surgical house staff.Jt Comm J Qual Patient Saf.2008;34(10):563570.
  3. Petersen LA,Brennan TA,O'Neil AC,Cook EF,Lee TH.Does housestaff discontinuity of care increase the risk for preventable adverse events?Ann Intern Med.1994;121(11):866872.
  4. Bell CM,Redelmeier DA.Mortality among patients admitted to hospitals on weekends as compared with weekdays.N Engl J Med.2001;345(9):663668.
  5. Peberdy MA,Ornato JP,Larkin GL, et al.Survival from in‐hospital cardiac arrest during nights and weekends.JAMA.2008;299(7):785792.
  6. Sharma G,Freeman J,Zhang D,Goodwin JS.Continuity of care and intensive care unit use at the end of life.Arch Intern Med.2009;169(1):8186.
  7. Sharma G,Fletcher KE,Zhang D,Kuo YF,Freeman JL,Goodwin JS.Continuity of outpatient and inpatient care by primary care physicians for hospitalized older adults.JAMA.2009;301(16):16711680.
  8. Charlson ME,Ales KL,Simon R,MacKenzie CR.Why predictive indexes perform less well in validation studies: is it magic or methods?Arch Intern Med.1987;147:21552161.
  9. Deyo RA,Cherkin DC,Ciol MA.Adapting a clinical comorbidity index for use with ICD‐9‐CM administrative databases.J Clin Epidemiol.1992;45(6):613619.
  10. Roy CL,Liang CL,Lund M, et al.Implementation of a physician assistant/hospitalist service in an academic medical center: impact on efficiency and patient outcomes.J Hosp Med.2008;3(5):361368.
  11. Groarke JD,Gallagher J,Stack J, et al.Use of an admission early warning score to predict patient morbidity and mortality and treatment success.Emerg Med J.2008;25(12):803806.
  12. Schmulewitz L,Proudfoot A,Bell D.The impact of weekends on outcome for emergency patients.Clin Med.2005;5(6):621625.
  13. Meynaar IA,van der Spoel JI,Rommes JH,van Spreuwel‐Verheijen M,Bosman RJ,Spronk PE.Off hour admission to an intensivist‐led ICU is not associated with increased mortality.Crit Care.2009;13(3):R84.
Issue
Journal of Hospital Medicine - 6(1)
Page Number
10-14
Display Headline
The association between night or weekend admission and hospitalization‐relevant patient outcomes
Legacy Keywords
communication, continuity of care transition and discharge planning, education, outcomes measurement, patient safety, resident

Copyright © 2010 Society of Hospital Medicine

Correspondence Location
Division of General Internal Medicine, University of California, San Francisco, Box 1211, San Francisco, CA, 94143‐1211

Simulation Improves CVC Placement

Display Headline
Use of simulation‐based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit

Central venous catheter (CVC) insertions are commonly performed at the bedside in medical intensive care unit (MICU) settings. Internal medicine residents are required to demonstrate knowledge regarding CVC indications, complications, and sterile technique,1 and often perform the procedure during training. Education in CVC insertion is needed because many internal medicine residents are uncomfortable performing this procedure.2 CVC insertion also carries the risk of potentially life‐threatening complications including infection, pneumothorax, arterial puncture, deep vein thrombosis, and bleeding. Education and training may also contribute to improved patient care because increased physician experience with CVC insertion reduces complication risk.3, 4 Similarly, a higher number of needle passes or attempts during CVC insertion correlates with mechanical complications such as pneumothorax or arterial punctures.4‐8 Pneumothorax rates for internal jugular (IJ) CVCs have been reported to range from 0% to 0.2% and for subclavian (SC) CVCs from 1.5% to 3.1%.4, 5 The arterial puncture rate for IJ CVCs ranges from 5.0% to 9.4% and for SC CVCs from 3.1% to 4.9%.4, 5 Proper use of ultrasound to assist with IJ CVC insertion has been shown to decrease these mechanical complications.4, 5 However, studies of ultrasound use with SC CVC insertion have shown mixed results.4

Simulation‐based training has been used in medical education to increase knowledge, provide opportunities for deliberate and safe practice, and shape the development of clinical skills.9, 10 We previously used simulation‐based mastery learning to improve the thoracentesis and advanced cardiac life support (ACLS) skills of internal medicine residents.11, 12 Although a few small studies have linked simulation‐based interventions to improved quality of care,13‐19 more work is needed to show that results from a simulated environment transfer to actual patient care.

This study had 2 aims. The first was to expand our simulation‐based mastery learning to CVC insertion using a CVC simulator and ultrasound device. The second was to assess quality indicators (number of needle passes, pneumothorax, arterial punctures, and need for catheter adjustment) and resident confidence related to actual CVC insertions in the MICU before and after an educational intervention.

Materials and Methods

Design

This was a cohort study20 of IJ and SC CVC insertions by 41 second‐ and third‐year internal medicine residents rotating through the MICU in a university‐affiliated program from October 2006 to February 2007. The Northwestern University Institutional Review Board approved the study. All study participants were required to give informed consent prior to participation.

Thirteen residents rotated through the MICU during a 6‐week preintervention phase. These residents served as a traditionally trained group that did not receive CVC insertion simulator training. Simultaneously, 28 residents who rotated through the MICU later in the study period received simulation‐based training in CVC insertion and served as the simulator‐trained group (Figure 1). Demographic data were obtained from the participants including age, gender, ethnicity, year of training, and scores on the United States Medical Licensing Examination (USMLE) Steps 1 and 2.

Figure 1
Timeline of CVC training and clinical rotations.

Simulator‐trained residents underwent baseline skill assessment (pretest) using a 27‐item checklist in IJ and SC CVC insertions (see Appendix). Checklists were developed by one author (J.H.B.) using appropriate references4, 5 and a step‐by‐step process,21 and were reviewed for completeness by another author with expertise in checklist development (D.B.W.). Each skill or other action was listed in order and given equal weight. A dichotomous scoring scale of 1 = done correctly and 0 = done incorrectly/not done was imposed for each item. Assessments were performed using Simulab's CentralLineMan. This model features realistic tissue with ultrasound compatibility, an arterial pulse, and self‐sealing veins and skin. Needles, dilators, and guidewires can be inserted, and realistic venous and arterial pressures are demonstrated (Figure 2).

Figure 2
Resident training on the CVC simulator.

Residents in the simulator‐trained group received two 2‐hour education sessions featuring a lecture, ultrasound training, deliberate practice with the CVC simulator, and feedback.22 Education sessions contained standardized didactic material on CVC indications and complications, as well as a stepwise demonstration of IJ and SC CVC insertions using ultrasound and landmark techniques. These sessions were supervised by a senior hospitalist faculty member with expertise in CVC insertions (J.H.B.). Residents were expected to use the ultrasound device for all IJ CVC insertions. However, its use was optional for SC CVC insertion. After training, residents were retested (posttest) and required to meet or exceed a minimum passing score (MPS) set by an expert panel for both IJ and SC procedures.23 This 11‐member expert panel provided item‐based (Angoff) and group‐based (Hofstee) judgments on the 27‐item checklists as described previously.23
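The item-based (Angoff) portion of standard setting averages the judges' estimates of how likely a borderline examinee is to perform each checklist item correctly. A minimal sketch of that calculation; the judge ratings in the test case are hypothetical, not the panel's actual data:

```python
def angoff_mps(judge_item_probs):
    """Item-based (Angoff) passing standard: each judge estimates, per checklist
    item, the probability that a borderline examinee performs it correctly; the
    MPS is the mean of the judges' means, expressed as a percent."""
    per_judge = [sum(probs) / len(probs) for probs in judge_item_probs]
    return 100.0 * sum(per_judge) / len(per_judge)
```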

Residents who did not achieve the MPS had more deliberate practice and were retested until the MPS was reached, the key feature of mastery learning.24 After completing simulation‐based mastery learning in CVC insertion, the 28 simulator‐trained residents rotated through the MICU.

Data Collection

All pretests and posttests (using the 27‐item checklist) were graded by a single unblinded instructor (J.H.B.) and were videotaped. Another faculty instructor with expertise in scoring clinical skills examinations and blind to pre‐post status (D.B.W.) rescored a random 50% sample of the tests to assess interrater reliability.

Data regarding actual CVC insertions in the MICU were collected by contacting all MICU residents daily during the study period. This allowed for CVC insertions to be identified within 24 hours. All survey data were collected anonymously. The primary inserter of each CVC was questioned about quality indicators and procedural self‐confidence concerning CVC placement. CVCs primarily inserted by nonstudy subjects (first‐year residents, emergency medicine residents, pulmonary‐critical care medicine faculty members, and subspecialty fellows) or CVC placements that were supervised, but not directly placed by study participants, were excluded.

Outcome Measures

Pretest and posttest checklist scores from simulator‐trained residents were compared to measure the impact of the training sessions. Residents rotating through the MICU were asked about several quality indicators related to actual CVC insertions. Quality indicators included: (1) the number of needle passes required during the procedure (skin punctures); (2) the presence of complications including pneumothorax and arterial puncture; and (3) the need for CVC adjustment after chest x‐ray. Participants were also questioned regarding their confidence in CVC insertion using a 100‐point scale (0 = not confident and 100 = very confident). Survey results from the 28 simulator‐trained residents were compared to results from the 13 traditionally‐trained residents.

Data Analysis

Checklist score reliability was estimated by calculating interrater reliability, the preferred method for assessments that depend on human judges, using the kappa (κ) coefficient25, 26 adjusted using the formula of Brennan and Prediger.27 Within-group differences from pretest (baseline) to posttest (outcome) were analyzed using paired t-tests.
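As a concrete illustration of this adjustment, the Brennan-Prediger formula fixes chance agreement at 1/k (k = number of rating categories) rather than estimating it from the raters' marginal distributions. A minimal sketch follows; the item-level scores are invented for illustration (1 = done correctly, 0 = done incorrectly/not done):

```python
def brennan_prediger_kappa(rater_a, rater_b, n_categories=2):
    """Chance-adjusted interrater agreement (Brennan-Prediger formula).

    Chance agreement is fixed at 1/k rather than estimated from the raters'
    marginal distributions, which keeps the coefficient stable when one
    category dominates, as when most checklist items are scored correct.
    """
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same set of items")
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
    p_chance = 1.0 / n_categories
    return (p_observed - p_chance) / (1.0 - p_chance)

# Invented item-level scores for one rescored examination:
primary_rater = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
second_rater  = [1, 1, 0, 1, 1, 1, 1, 1, 1, 1]
print(round(brennan_prediger_kappa(primary_rater, second_rater), 2))  # 0.8
```

With dichotomous items (k = 2) the formula reduces to 2 × observed agreement − 1, so the reported κ = 0.94 corresponds to the raters agreeing on 97% of checklist items.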

MICU survey results were compared using t-tests. The traditionally trained and simulator-trained groups were assessed for demographic differences using t-tests and the chi-square statistic. Spearman's rank correlation coefficient was used to assess relationships between resident self-confidence and quality indicators. All analyses were performed using SPSS statistical software, version 16.0 (SPSS, Inc., Chicago, IL).

Results

All eligible residents participated in the study and completed the entire protocol. There was no significant difference in age, gender, ethnicity, year of training, or USMLE Step 1 and 2 scores between the traditionally‐trained and simulator‐trained groups.

Interrater reliability measured by the mean kappa coefficient was very high (κ = 0.94) across the 27 IJ and SC checklist items. No resident met the MPS (79.1%) for CVC insertion at baseline testing. In the simulator-trained group, 25 of 28 (89%) residents achieved SC skill mastery and 27 of 28 (96%) achieved IJ skill mastery within the standard four-hour curriculum. All remaining residents subsequently reached the MPS with less than one hour of additional practice time. A graphic portrait of the residents' pretest and posttest performance on the simulated CVC clinical skills examination, with descriptive statistics, is shown in Figure 3. After the educational intervention, posttest scores improved significantly (p < 0.001) to meet or exceed the MPS.

Figure 3
Mean scores and standard deviations on the simulator‐based skills exam before and after the educational intervention. MPS = 79.1%.

Traditionally trained and simulator-trained residents independently inserted 46 CVCs during the study period. Simulator-trained residents required significantly fewer needle passes to insert actual CVCs in the MICU than traditionally trained residents: mean (M) = 1.79, standard deviation (SD) = 1.03 versus M = 2.78, SD = 1.77 (p = 0.04). As shown in Table 1, the groups did not differ in pneumothorax, arterial puncture, or mean number of CVC adjustments. In addition, the groups did not differ in use of ultrasound for IJ or SC CVC insertions. One IJ CVC was inserted without ultrasound in the traditionally trained group; 2 were inserted without ultrasound in the simulator-trained group. Ultrasound was not used during any SC CVC insertions in the traditionally trained group and was used for 1 SC CVC insertion in the simulator-trained group.

Table 1. Comparison of Traditionally Trained vs. Simulator-Trained Residents in Self-Confidence and CVC Quality Indicators During Actual CVC Insertions in the MICU (Internal Jugular and Subclavian CVCs)

                                                  Traditionally Trained  Simulator-Trained  P Value
Number of attempts during insertion [mean (SD)]   2.78 (1.77)            1.79 (1.03)        0.04*
Pneumothorax (number)                             0                      0                  n/a
Arterial puncture (%)                             11                     7                  0.65
CVC adjustment (%)                                15                     8                  0.52
Confidence (%) [mean (SD)]                        68 (20)                81 (11)            0.02*

  • * p < 0.05.

  • Abbreviations: CVC, central venous catheter; MICU, medical intensive care unit; n/a, not applicable.

Simulator‐trained residents displayed more self‐confidence about their procedural skills than traditionally‐trained residents (M = 81, SD = 11 versus M = 68, SD = 20, p = 0.02). Spearman correlations showed no practical association between resident self‐confidence and performance on CVC insertion quality indicators.

Discussion

This study demonstrates the use of a mastery learning model to develop CVC insertion skills to a high achievement level among internal medicine residents. Our data support prior work showing that procedural skills that are poor at baseline can be increased significantly using simulation-based training and deliberate practice.11-18, 28 This report on CVC insertion adds to the growing body of literature showing that simulation training complements standard medical education,11-19, 28 and expands the clinical application of the mastery model beyond thoracentesis and ACLS.11, 12 Use of the mastery model described in this study also has important implications for patients. In our training program, residents are required to demonstrate procedural mastery in a simulated environment before independently performing a CVC insertion on an actual patient. This is in sharp contrast to the traditional clinical model of procedural training at the bedside, and may be used in other training programs and with other invasive procedures.

The second aim of our study was to determine the impact of simulation-based training on actual clinical practice by residents in the MICU. To our knowledge, no prior study has demonstrated that simulation-based training in CVC insertion improves patient outcomes. We believe our results advance what is known about the impact of simulation-based training because simulator-trained residents in this study performed actual CVC insertions in the MICU using significantly fewer needle passes. Needle passes have been used by other investigators as a surrogate measure for reduced CVC-associated complications because mechanical complications rise exponentially with more than two insertion attempts.4-7, 29 We believe this finding demonstrates transfer of skill acquired from simulation-based training to the actual clinical environment. It is possible that ultrasound training accounts for the improvement in the simulator-trained group. However, we do not believe that ultrasound training is entirely responsible as prior work has shown that deliberate practice using mastery learning without ultrasound significantly improved resident performance of thoracentesis11 and ACLS12, 19 procedures. We did not show a significant reduction in complications such as pneumothorax or arterial puncture. This is likely due to the small sample size and the low number of procedures and complications during the study period.

Our results also show that resident self-confidence regarding actual CVC insertions improved after simulation training. These findings are similar to prior reports of improved trainee confidence after simulation-based training in CVC insertion.29, 30 However, our results did not reveal a correlation between improved self-confidence and clinical skill acquisition. Linking improved self-confidence to improved clinical skill is important because self-assessment does not always correlate with performance ability.31, 32

More study is needed to evaluate the impact of simulation-based training on the quality of CVC insertions by trainees. Mechanisms shown to decrease complications of CVC placement include use of ultrasound,4, 7, 33-36 full sterile barrier technique,37-39 chlorhexidine skin preparations,40-42 and nurse-physician education.43 Our simulation-training program incorporates each of these elements. We plan to expand our simulation-based training intervention to a larger sample size to determine its impact on mechanical and infectious complication rates linked to CVC insertion.

This study has several limitations. It was performed at a single institution over a short time period. However, demonstration of significantly fewer needle passes and improved resident self-confidence after simulator training are important findings that warrant further study. It was impossible to blind raters during the skills assessment examination to whether the resident was performing a pretest or posttest. This was addressed by using a second rater who was blind to the pretest or posttest status of the examinee. The arterial puncture rate of 7% among simulator-trained residents was higher than expected, although it remains within published ranges.4, 5 Also, a low total number of CVCs was evaluated during the study. This is likely due to the strict exclusion criteria employed in order to study the impact of simulation training. For example, CVC insertions were only evaluated if they were actually performed by study residents (supervised insertions were excluded), and femoral catheters were not evaluated. We did not track residents' clinical experience with CVC insertion before the study. Residents who were simulator-trained may have had more clinical experience with CVC insertion, and this may have affected their performance. However, the groups did not differ in year of training or clinical rotations, and there is clear evidence that clinical experience is not a proxy for skill acquisition.44 Finally, outcome data were measured via resident questionnaires that relied on resident recall about CVC insertion rather than observer ratings. This method was selected because observer ratings could not be standardized given the large number of clinical supervisors in the MICU over the study period. Information about needle passes and arterial puncture also may not be documented in procedural notes and could not be obtained by medical record review. We attempted to minimize recall bias by surveying residents within 24 hours of CVC placement.

In conclusion, this study demonstrates that simulation‐based training and deliberate practice in a mastery learning setting improves performance of both simulated and actual CVC insertions by internal medicine residents. Procedural training remains an important component of internal medicine training, although internists are performing fewer invasive procedures now than in years past.45, 46 Use of a mastery model of CVC insertion requires that trainees demonstrate skill in a simulated environment before independently performing this invasive procedure on patients. Further study is needed to assess clinical outcomes such as reduced CVC‐related infections and mechanical complications after simulation‐based training.

Acknowledgements

The authors thank the Northwestern University internal medicine residents for their dedication to education and patient care. They acknowledge Drs. J. Larry Jameson and Charles Watts for their support and encouragement of this work.

Appendix

Central Venous Catheter Insertion Checklists for Simulation-Based Education

Central Venous Catheter Placement (IJ)
  • Skill Key (each step is rated A, B, or C): A = Done Correctly; B = Done Incorrectly; C = Not Done.

Informed consent obtained (must do all):
  Benefits
  Risks
  Consent given
Place the patient in slight Trendelenburg position
Flush the ports on the catheter with sterile saline
Clamp each port (ok to keep brown port open)
Remove brown port from end of catheter to accommodate wire
Area is cleaned with chlorhexidine
Resident gets in sterile gown, gloves, hat, and mask
Area is draped in usual sterile fashion (must be full-body drape)
The ultrasound (US) probe is properly set up with sterile sheath and sonographic gel
The vein is localized using anatomical landmarks with the US machine (if no US is used, this is scored as done incorrectly)
The skin is anesthetized with 1% lidocaine in a small wheal
The deeper structures are anesthetized
Localize the vein with this needle (optional)
Using the large needle or catheter-syringe complex, cannulate the vein while aspirating (must be done with US)
Remove the syringe from the needle, or advance the catheter into the vein, removing both the syringe and needle
Advance the guidewire into the vein no more than approximately 12-15 cm
Nick the skin with the scalpel to advance the dilator
Advance the dilator over the guidewire and dilate the vein
Advance the triple lumen over the guidewire
Never let go of the guidewire
Once the catheter is inserted, remove the guidewire in its entirety
Advance the catheter to approximately 14-16 cm on the right side, 16-18 cm on the left side
Ensure there is blood flow/flush each port
Secure the catheter in place (suture or staple)
Place dressing over catheter
Get a chest x-ray
Notify that the catheter is ok to use
Maintain sterile technique
Central Venous Catheter Placement (Subclavian)
  • Skill Key (each step is rated A, B, or C): A = Done Correctly; B = Done Incorrectly; C = Not Done.

Informed consent obtained (must do all):
  Benefits
  Risks
  Consent given
Place the patient in slight Trendelenburg position
Flush the ports on the catheter with sterile saline
Clamp each port (ok to leave brown port open)
Remove brown port from end of catheter to accommodate wire
Area is cleaned with chlorhexidine
Resident gets in sterile gown, gloves, hat, and mask
Area is draped in usual sterile fashion (must be full-body drape)
The US probe is properly set up with sterile sheath and sonographic gel (must do if US is used)
The vein is localized using the US machine, or anatomical landmarks are verbalized
The skin is anesthetized with 1% lidocaine in a small wheal
The deeper structures are anesthetized using a larger needle (must verbalize anesthetizing the clavicle)
Localize the vein with this needle (optional)
Using the large needle or catheter-syringe complex, cannulate the vein while aspirating (optionally confirmed by US)
If US was not used, the resident is expected to state that the needle is directed toward the sternal notch
Remove the syringe from the needle, or advance the catheter into the vein, removing both the syringe and needle
Advance the guidewire into the vein no more than approximately 12-15 cm
Nick the skin with the scalpel to advance the dilator
Advance the dilator over the guidewire and dilate the vein
Advance the triple lumen over the guidewire
Never let go of the guidewire
Once the catheter is inserted, remove the guidewire in its entirety
Advance the catheter to approximately 14-16 cm on the right side, 16-18 cm on the left side
Ensure there is blood flow/flush each port
Secure the catheter in place (suture or staple)
Place dressing over catheter
Get a chest x-ray
Notify that the catheter is ok to use
Maintain sterile technique
References
  1. American Board of Internal Medicine. Procedures Required for Internal Medicine. Available at: http://www.abim.org/certification/policies/imss/im.aspx. Accessed January 28, 2009.
  2. Huang GC, Smith CC, Gordon CE, et al. Beyond the comfort zone: residents assess their comfort performing inpatient medical procedures. Am J Med. 2006;119:71.e17-e24.
  3. Sznajder JI, Zveibil FR, Bitterman H, et al. Central vein catheterization. Failure and complication rates by three percutaneous approaches. Arch Intern Med. 1986;146:259-261.
  4. McGee DC, Gould MK. Preventing complications of central venous catheterization. N Engl J Med. 2003;348:1123-1133.
  5. Eisen LA, Narasimhan M, Berger JS, et al. Mechanical complications of central venous catheters. J Intensive Care Med. 2006;21:40-46.
  6. Lefrant JY, Muller L, De La Coussaye JE, et al. Risk factors of failure and immediate complication of subclavian vein catheterization in critically ill patients. Intensive Care Med. 2002;28:1036-1041.
  7. Mansfield PF, Hohn DC, Fornage BD, et al. Complications and failures of subclavian-vein catheterization. N Engl J Med. 1994;331:1735-1738.
  8. McGee WT. Central venous catheterization: better and worse. J Intensive Care Med. 2006;21:51-53.
  9. Boulet JR, Murray D, Kras J, et al. Reliability and validity of a simulation-based acute care skills assessment for medical students and residents. Anesthesiology. 2003;99:1270-1280.
  10. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861-866.
  11. Wayne DB, Barsuk JH, O'Leary KJ, et al. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3:48-54.
  12. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251-256.
  13. Andreatta PB, Woodrum DT, Birkmeyer JD, et al. Laparoscopic skills are improved with LapMentor training: results of a randomized, double-blinded study. Ann Surg. 2006;243:854-860.
  14. Blum MG, Powers TW, Sundaresan S. Bronchoscopy simulator effectively prepares junior residents to competently perform basic clinical bronchoscopy. Ann Thorac Surg. 2004;78:287-291.
  15. Cohen J, Cohen SA, Vora KC, et al. Multicenter, randomized, controlled trial of virtual-reality simulator training in acquisition of competency in colonoscopy. Gastrointest Endosc. 2006;64:361-368.
  16. Mayo PH, Hackney JE, Mueck JT, et al. Achieving house staff competence in emergency airway management: results of a teaching program using a computerized patient simulator. Crit Care Med. 2004;32:2422-2427.
  17. Sedlack RE, Kolars JC. Computer simulator training enhances the competency of gastroenterology fellows at colonoscopy: results of a pilot study. Am J Gastroenterol. 2004;99:33-37.
  18. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002;236:458-463.
  19. Wayne DB, Didwania A, Feinglass J, et al. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest. 2008;133:56-61.
  20. Fletcher R, Fletcher S. Clinical Epidemiology: The Essentials. 4th ed. Philadelphia: Lippincott Williams & Wilkins; 2005.
  21. Stufflebeam DL. The Checklists Development Checklist. Western Michigan University Evaluation Center, July 2000. Available at: http://www.wmich.edu/evalctr/checklists/cdc.htm. Accessed May 15, 2006.
  22. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79:S70-S81.
  23. Wayne DB, Barsuk JH, Cohen E, et al. Do baseline data influence standard setting for a clinical skills examination? Acad Med. 2007;82:S105-S108.
  24. McGaghie W, Siddall V, Mazmanian P, et al. Lessons for continuing medical education from simulation research in undergraduate and graduate medical education. Chest. 2009;135.
  25. Downing SM. Reliability: on the reproducibility of assessment data. Med Educ. 2004;38:1006-1012.
  26. Fleiss JL, Levin B, Paik MC. Statistical Methods for Rates and Proportions. 3rd ed. New York: John Wiley & Sons; 2003.
  27. Brennan RL, Prediger DJ. Coefficient kappa: some uses, misuses, and alternatives. Educ Psychol Meas. 1981;41:687-699.
  28. Wayne DB, Butter J, Siddall VJ, et al. Simulation-based training of internal medicine residents in advanced cardiac life support protocols: a randomized trial. Teach Learn Med. 2005;17:202-208.
  29. Britt RC, Reed SF, Britt LD. Central catheter simulation: a new training algorithm. Am Surg. 2007;73:680-682.
  30. Ault MJ, Rosen BT, Ault B. The use of tissue models for vascular access training. Phase I of the procedural patient safety initiative. J Gen Intern Med. 2006;21:514-517.
  31. Bond WF, Lammers RL, Spillane LL, et al. The use of simulation in emergency medicine: a research agenda. Acad Emerg Med. 2007;14:353-363.
  32. Wayne DB, Butter J, Siddall VJ, et al. Graduating internal medicine residents' self-assessment and performance of advanced cardiac life support skills. Med Teach. 2006;28:365-369.
  33. Beaulieu Y, Marik PE. Bedside ultrasonography in the ICU: part 2. Chest. 2005;128:1766-1781.
  34. Lefrant JY, Cuvillon P, Benezet JF, et al. Pulsed Doppler ultrasonography guidance for catheterization of the subclavian vein: a randomized study. Anesthesiology. 1998;88:1195-1201.
  35. Miller AH, Roth BA, Mills TJ, et al. Ultrasound guidance versus the landmark technique for the placement of central venous catheters in the emergency department. Acad Emerg Med. 2002;9:800-805.
  36. Randolph AG, Cook DJ, Gonzales CA, et al. Ultrasound guidance for placement of central venous catheters: a meta-analysis of the literature. Crit Care Med. 1996;24:2053-2058.
  37. Berenholtz SM, Pronovost PJ, Lipsett PA, et al. Eliminating catheter-related bloodstream infections in the intensive care unit. Crit Care Med. 2004;32:2014-2020.
  38. Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med. 2006;355:2725-2732.
  39. Sherertz RJ, Ely EW, Westbrook DM, et al. Education of physicians-in-training can decrease the risk for vascular catheter infection. Ann Intern Med. 2000;132:641-648.
  40. Chaiyakunapruk N, Veenstra DL, Lipsky BA, et al. Chlorhexidine compared with povidone-iodine solution for vascular catheter-site care: a meta-analysis. Ann Intern Med. 2002;136:792-801.
  41. Maki DG, Ringer M, Alvarado CJ. Prospective randomised trial of povidone-iodine, alcohol, and chlorhexidine for prevention of infection associated with central venous and arterial catheters. Lancet. 1991;338:339-343.
  42. Mimoz O, Pieroni L, Lawrence C, et al. Prospective, randomized trial of two antiseptic solutions for prevention of central venous or arterial catheter colonization and infection in intensive care unit patients. Crit Care Med. 1996;24:1818-1823.
  43. Warren DK, Zack JE, Mayfield JL, et al. The effect of an education program on the incidence of central venous catheter-associated bloodstream infection in a medical ICU. Chest. 2004;126:1612-1618.
  44. Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: the relationship between clinical experience and quality of health care. Ann Intern Med. 2005;142:260-273.
  45. Duffy FD, Holmboe ES. What procedures should internists do? Ann Intern Med. 2007;146:392-393.
  46. Wigton RS, Alguire P. The declining number and variety of procedures done by general internists: a resurvey of members of the American College of Physicians. Ann Intern Med. 2007;146:355-360.
Journal of Hospital Medicine. 4(7):397-403.
Keywords: anatomic model, central venous catheterization, clinical competence, complications, medical education, quality of healthcare, simulation

Central venous catheter (CVC) insertions are commonly performed at the bedside in medical intensive care unit (MICU) settings. Internal medicine residents are required to demonstrate knowledge regarding CVC indications, complications, and sterile technique,1 and often perform the procedure during training. Education in CVC insertion is needed because many internal medicine residents are uncomfortable performing this procedure.2 CVC insertion also carries the risk of potentially life-threatening complications including infection, pneumothorax, arterial puncture, deep vein thrombosis, and bleeding. Education and training may also contribute to improved patient care because increased physician experience with CVC insertion reduces complication risk.3, 4 Similarly, a higher number of needle passes or attempts during CVC insertion correlates with mechanical complications such as pneumothorax or arterial puncture.4-8 Pneumothorax rates for internal jugular (IJ) CVCs have been reported to range from 0% to 0.2% and for subclavian (SC) CVCs from 1.5% to 3.1%.4, 5 The arterial puncture rate for IJ CVCs ranges from 5.0% to 9.4% and for SC CVCs from 3.1% to 4.9%.4, 5 Proper use of ultrasound to assist with IJ CVC insertion has been shown to decrease these mechanical complications.4, 5 However, studies of ultrasound use with SC CVC insertion have shown mixed results.4

Simulation-based training has been used in medical education to increase knowledge, provide opportunities for deliberate and safe practice, and shape the development of clinical skills.9, 10 We previously used simulation-based mastery learning to improve the thoracentesis and advanced cardiac life support (ACLS) skills of internal medicine residents.11, 12 Although a few small studies have linked simulation-based interventions to improved quality of care,13-19 more work is needed to show that results from a simulated environment transfer to actual patient care.

This study had 2 aims. The first was to expand our simulation‐based mastery learning to CVC insertion using a CVC simulator and ultrasound device. The second was to assess quality indicators (number of needle passes, pneumothorax, arterial punctures, and need for catheter adjustment) and resident confidence related to actual CVC insertions in the MICU before and after an educational intervention.

Materials and Methods

Design

This was a cohort study20 of IJ and SC CVC insertions by 41 second‐ and third‐year internal medicine residents rotating through the MICU in a university‐affiliated program from October 2006 to February 2007. The Northwestern University Institutional Review Board approved the study. All study participants were required to give informed consent prior to participation.

Thirteen residents rotated through the MICU during a 6‐week preintervention phase. These residents served as a traditionally trained group that did not receive CVC insertion simulator training. Simultaneously, 28 residents who rotated through the MICU later in the study period received simulation‐based training in CVC insertion and served as the simulator‐trained group (Figure 1). Demographic data were obtained from the participants including age, gender, ethnicity, year of training, and scores on the United States Medical Licensing Examination (USMLE) Steps 1 and 2.

Figure 1
Timeline of CVC training and clinical rotations.

Simulator-trained residents underwent baseline skill assessment (pretest) using a 27-item checklist in IJ and SC CVC insertions (see Appendix). Checklists were developed by one author (J.H.B.) using appropriate references4, 5 and a step-by-step process,21 and were reviewed for completeness by another author with expertise in checklist development (D.B.W.). Each skill or other action was listed in order and given equal weight. A dichotomous scoring scale (1 = done correctly; 0 = done incorrectly/not done) was used for each item. Assessments were performed using Simulab's CentralLineMan. This model features realistic tissue with ultrasound compatibility, an arterial pulse, and self-sealing veins and skin. Needles, dilators, and guidewires can be inserted, and realistic venous and arterial pressures are demonstrated (Figure 2).

Figure 2
Resident training on the CVC simulator.

Residents in the simulator-trained group received two 2-hour education sessions featuring a lecture, ultrasound training, deliberate practice with the CVC simulator, and feedback.22 Education sessions contained standardized didactic material on CVC indications and complications, as well as a stepwise demonstration of IJ and SC CVC insertions using ultrasound and landmark techniques. These sessions were supervised by a senior hospitalist faculty member with expertise in CVC insertions (J.H.B.). Residents were expected to use the ultrasound device for all IJ CVC insertions. However, its use was optional for SC CVC insertion. After training, residents were retested (posttest) and required to meet or exceed a minimum passing score (MPS) set by an expert panel for both IJ and SC procedures.23 This 11-member expert panel provided item-based (Angoff) and group-based (Hofstee) judgments on the 27-item checklists, as described previously.23


Our results also show that resident self‐confidence regarding actual CVC insertions improved after simulation training. These findings are similar to prior reports linking improved confidence among trainees after simulation‐based training in CVC insertion.29, 30 Our results did not reveal a correlation between improved self‐confidence and clinical skill acquisition. Linking improved self‐confidence to improved clinical skill is important because self‐assessment does not always correlate with performance ability.31, 32

More study is needed to evaluate the impact of simulation‐based training on the quality of CVC insertions by trainees. Mechanisms shown to decrease complications of CVC placement include use of ultrasound,4, 7, 3336 full sterile barrier technique,3739 chlorhexidine skin preparations,4042 and nurse‐physician education.43 Our simulation‐training program incorporates each of these elements. We plan to expand our simulation‐based training intervention to a larger sample size to determine its impact on mechanical and infectious complication rates linked to CVC insertion.

This study has several limitations. It was performed at a single institution over a short time period. However, demonstration of significantly fewer needle passes and improved resident self‐confidence after simulator training are important findings that warrant further study. It was impossible to blind raters during the skills assessment examination about whether the resident was performing a pretest or posttest. This was accounted for by using a second rater, who was blind to the pretest and posttest status of the examinee. The arterial puncture rate of 7% among simulator‐trained residents was higher than expected, although it remains within published ranges.4, 5 Also, a low total number of CVCs were evaluated during the study. This is likely due to strict exclusion criteria employed in order to study the impact of simulation training. For example, CVC insertions were only evaluated if they were actually performed by study residents (supervised insertions were excluded) and femoral catheters were not evaluated. We did not track clinical experience with CVC insertion by residents before the study. Residents who were simulator‐trained may have had more clinical experience with CVC insertion and this may have impacted their performance. However, residents did not differ in year of training or clinical rotations, and there is clear evidence that clinical training is not a proxy for skill acquisition.44 Finally, outcome data were measured via resident questionnaires that relied on resident recall about CVC insertion rather than observer ratings. This method was selected because observer ratings could not be standardized given the large number of clinical supervisors in the MICU over the study period. Information about needle passes and arterial puncture also may not be documented in procedural notes and could not be obtained by medical record review. We attempted to minimize recall bias by surveying residents within 24 hours of CVC placement.

In conclusion, this study demonstrates that simulation‐based training and deliberate practice in a mastery learning setting improves performance of both simulated and actual CVC insertions by internal medicine residents. Procedural training remains an important component of internal medicine training, although internists are performing fewer invasive procedures now than in years past.45, 46 Use of a mastery model of CVC insertion requires that trainees demonstrate skill in a simulated environment before independently performing this invasive procedure on patients. Further study is needed to assess clinical outcomes such as reduced CVC‐related infections and mechanical complications after simulation‐based training.

Acknowledgements

The authors thank the Northwestern University internal medicine residents for their dedication to education and patient care. They acknowledge Drs. J. Larry Jameson and Charles Watts for their support and encouragement of this work.

Appendix

Central Venous Catheter Insertion Checklists for Simulation‐based Education 0, 0

Central Venous Catheter Placement (IJ)
  • Skill Key: A = Done Correctly B = Done Incorrectly C = Not Done.

Informed consent obtained: must do allABC
Benefits
Risks
Consent given
Place the patient in slight Trendelenburg positionABC
Flush the ports on the catheter with sterile salineABC
Clamp each port (ok to keep brown port open)ABC
Remove brown port from end of catheter to accommodate wireABC
Area is cleaned with chlorhexadineABC
Resident gets in sterile gown, gloves, hat and maskABC
Area is draped in usual sterile fashion (must be full body drape)ABC
The ultrasound (US) probe is properly set up with sterile sheath and sonographic gelABC
The vein is localized using anatomical landmarks with the US machineABC
If no US is used this is wrong
The skin is anesthetized with 1% lidocaine in a small whealABC
The deeper structures are anesthetizedABC
Localize the vein with this needle (optional)ABC
Using the large needle or catheter‐ syringe complex, cannulate the vein while aspirating (must be done with US)ABC
Remove the syringe from the needle or advance the catheter into the vein removing both the syringe and needleABC
Advance the guidewire into the vein no more than approximately 1215 cmABC
Knick the skin with the scalpel to advance the dilatorABC
Advance the dilator over the guidewire and dilate the veinABC
Advance the triple lumen over the guidewireABC
Never let go of the guidewireABC
Once the catheter is inserted remove the guidewire in its entiretyABC
Advance the catheter to approx to 1416cm on the right side, 1618 cm on the left sideABC
Ensure there is blood flow/flush each portABC
Secure the catheter in place (suture or staple)ABC
Place dressing over catheterABC
Get a chest x‐rayABC
Notify that the catheter is ok to useABC
Maintain sterile techniqueABC
Central Venous Catheter Placement (Subclavian)
  • Skill Key: A = Done Correctly B = Done Incorrectly C = Not Done

Informed consent obtained: must do allABC
Benefits
Risks
Consent given
Place the patient in slight Trendelenburg positionABC
Flush the ports on the catheter with sterile salineABC
Clamp each port (ok to leave brown port open)ABC
Remove brown port from end of catheter to accommodate wireABC
Area is cleaned with chlorhexadineABC
Resident gets in sterile gown, gloves, hat and maskABC
Area is draped in usual sterile fashion (must be full body drape)ABC
**The US probe is properly set up with sterile sheath and sonographic gel . (MUST DO if use US)ABC
The vein is localized using US machine or anatomical landmarks are verbalizedABC
The skin is anesthetized with 1% lidocaine in a small whealABC
The deeper structures are anesthetized using a larger needle (must verbalize they anesthetize the clavicle)ABC
Localize the vein with this needle (optional)ABC
Using the large needle or catheter syringe complex cannulate the vein while aspirating (optional confirmed by US)ABC
If US was not used then expected to state they are directing the needle to the sternal notchABC
Remove the syringe from the needle or advance the catheter into the vein removing both the syringe and needleABC
Advance the guidewire into the vein no more than approximately 1215 cmABC
Knick the skin with the scalpel to advance the dilatorABC
Advance the dilator over the guidewire and dilate the veinABC
Advance the triple lumen over the guidewireABC
Never let go of the guidewireABC
Once the catheter is inserted remove the guidewire in its entiretyABC
Advance the catheter to approx to 1416cm on the right side, 1618 cm on the left sideABC
Ensure there is blood flow/flush each portABC
Secure the catheter in place (suture or staple)ABC
Place dressing over catheterABC
Get a chest x‐rayABC
Notify that the catheter is ok to useABC
Maintain sterile techniqueABC

Central venous catheter (CVC) insertions are commonly performed at the bedside in medical intensive care unit (MICU) settings. Internal medicine residents are required to demonstrate knowledge regarding CVC indications, complications, and sterile technique,1 and often perform the procedure during training. Education in CVC insertion is needed because many internal medicine residents are uncomfortable performing this procedure.2 CVC insertion also carries the risk of potentially life‐threatening complications including infection, pneumothorax, arterial puncture, deep vein thrombosis, and bleeding. Education and training may also contribute to improved patient care because increased physician experience with CVC insertion reduces complication risk.3, 4 Similarly, a higher number of needle passes or attempts during CVC insertion correlates with mechanical complications such as pneumothorax or arterial punctures.4-8 Pneumothorax rates for internal jugular (IJ) CVCs have been reported to range from 0% to 0.2% and for subclavian (SC) CVCs from 1.5% to 3.1%.4, 5 The arterial puncture rate for IJ CVCs ranges from 5.0% to 9.4% and for SC CVCs from 3.1% to 4.9%.4, 5 Proper use of ultrasound to assist with IJ CVC insertion has been shown to decrease these mechanical complications.4, 5 However, studies of ultrasound use with SC CVC insertion have mixed results.4

Simulation‐based training has been used in medical education to increase knowledge, provide opportunities for deliberate and safe practice, and shape the development of clinical skills.9, 10 We previously used simulation‐based mastery learning to improve the thoracentesis and advanced cardiac life support (ACLS) skills of internal medicine residents.11, 12 Although a few small studies have linked simulation‐based interventions to improved quality of care,13-19 more work is needed to show that results from a simulated environment transfer to actual patient care.

This study had 2 aims. The first was to expand our simulation‐based mastery learning to CVC insertion using a CVC simulator and ultrasound device. The second was to assess quality indicators (number of needle passes, pneumothorax, arterial punctures, and need for catheter adjustment) and resident confidence related to actual CVC insertions in the MICU before and after an educational intervention.

Materials and Methods

Design

This was a cohort study20 of IJ and SC CVC insertions by 41 second‐ and third‐year internal medicine residents rotating through the MICU in a university‐affiliated program from October 2006 to February 2007. The Northwestern University Institutional Review Board approved the study. All study participants were required to give informed consent prior to participation.

Thirteen residents rotated through the MICU during a 6‐week preintervention phase. These residents served as a traditionally trained group that did not receive CVC insertion simulator training. The 28 residents who rotated through the MICU later in the study period received simulation‐based training in CVC insertion and served as the simulator‐trained group (Figure 1). Demographic data were obtained from the participants including age, gender, ethnicity, year of training, and scores on the United States Medical Licensing Examination (USMLE) Steps 1 and 2.

Figure 1
Timeline of CVC training and clinical rotations.

Simulator‐trained residents underwent baseline skill assessment (pretest) using a 27‐item checklist in IJ and SC CVC insertions (see Appendix). Checklists were developed by one author (J.H.B.) using appropriate references4, 5 and a step‐by‐step process,21 and reviewed for completeness by another author with expertise in checklist development (D.B.W.). Each skill or other action was listed in order and given equal weight. A dichotomous scoring scale of 1 = done correctly and 0 = done incorrectly/not done was imposed for each item. Assessments were performed using Simulab's CentralLineMan. This model features realistic tissue with ultrasound compatibility, an arterial pulse, and self‐sealing veins and skin. Needles, dilators, and guidewires can be inserted and realistic venous and arterial pressures demonstrated (Figure 2).

Figure 2
Resident training on the CVC simulator.

Residents in the simulator‐trained group received two 2‐hour education sessions featuring a lecture, ultrasound training, deliberate practice with the CVC simulator, and feedback.22 Education sessions contained standardized didactic material on CVC indications and complications, as well as a stepwise demonstration of IJ and SC CVC insertions using ultrasound and landmark techniques. These sessions were supervised by a senior hospitalist faculty member with expertise in CVC insertions (J.H.B.). Residents were expected to use the ultrasound device for all IJ CVC insertions. However, its use was optional for SC CVC insertion. After training, residents were retested (posttest) and required to meet or exceed a minimum passing score (MPS) set by an expert panel for both IJ and SC procedures.23 This 11‐member expert panel provided item‐based (Angoff) and group‐based (Hofstee) judgments on the 27‐item checklists as described previously.23

Residents who did not achieve the MPS had more deliberate practice and were retested until the MPS was reached, the defining feature of mastery learning.24 After completing simulation‐based mastery learning in CVC insertion, the 28 simulator‐trained residents rotated through the MICU.
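The scoring and retesting logic described above can be sketched in a few lines. This is an illustrative reconstruction, not the study's actual software; the helper names and example item scores are hypothetical, while the 27‐item checklist length and the 79.1% MPS are taken from the article.

```python
# Illustrative sketch of dichotomous checklist scoring against the expert-set
# MPS. Helper names and example data are hypothetical; the 27-item checklist
# and the 79.1% minimum passing score are reported in the article.

N_ITEMS = 27   # items on each IJ and SC checklist
MPS = 79.1     # minimum passing score (%) set by the 11-member expert panel

def checklist_percent(item_scores):
    """Items are scored 1 = done correctly, 0 = done incorrectly/not done."""
    assert len(item_scores) == N_ITEMS
    return 100.0 * sum(item_scores) / N_ITEMS

def meets_mastery(item_scores):
    """Residents retest (after more deliberate practice) until this is True."""
    return checklist_percent(item_scores) >= MPS

print(round(checklist_percent([1] * 22 + [0] * 5), 1))  # 22/27 correct -> 81.5
print(meets_mastery([1] * 22 + [0] * 5))                # True: meets the MPS
print(meets_mastery([1] * 20 + [0] * 7))                # False: more practice
```

The key design point of mastery learning is that a failing score triggers additional deliberate practice and a retest rather than a lower passing grade.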

Data Collection

All pretests and posttests (using the 27‐item checklist) were graded by a single unblinded instructor (J.H.B.) and were videotaped. Another faculty instructor with expertise in scoring clinical skills examinations and blind to pre‐post status (D.B.W.) rescored a random 50% sample of the tests to assess interrater reliability.

Data regarding actual CVC insertions in the MICU were collected by contacting all MICU residents daily during the study period. This allowed for CVC insertions to be identified within 24 hours. All survey data were collected anonymously. The primary inserter of each CVC was questioned about quality indicators and procedural self‐confidence concerning CVC placement. CVCs primarily inserted by nonstudy subjects (first‐year residents, emergency medicine residents, pulmonary‐critical care medicine faculty members, and subspecialty fellows) or CVC placements that were supervised, but not directly placed by study participants, were excluded.

Outcome Measures

Pretest and posttest checklist scores from simulator‐trained residents were compared to measure the impact of training sessions. Residents rotating through the MICU were asked about several quality indicators related to actual CVC insertions. Quality indicators included: (1) number of needle passes required during the procedure (skin punctures); (2) presence of complications, including pneumothorax and arterial puncture; and (3) need for CVC adjustment after chest x‐ray. Participants were also questioned regarding their confidence in CVC insertion using a 100‐point scale (0 = not confident and 100 = very confident). Survey results from the 28 simulator‐trained residents were compared to results from the 13 traditionally‐trained residents.

Data Analysis

Checklist score reliability was estimated by calculating interrater reliability, the preferred method for assessments that depend on human judges,25, 26 using the kappa (κ) coefficient adjusted with the formula of Brennan and Prediger.27 Within‐group differences from pretest (baseline) to posttest (outcome) were analyzed using paired t‐tests.

MICU survey results were compared using t‐tests. Traditionally‐trained and simulator‐trained groups were assessed for demographic differences using t‐tests and the chi‐square statistic. Spearman's rank correlation coefficient was used to assess relationships between resident self‐confidence and quality indicators. All analyses were performed using SPSS statistical software, version 16.0 (SPSS, Inc., Chicago, IL).
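As a worked illustration of the agreement statistic used above: the Brennan‐Prediger adjustment replaces Cohen's rater‐dependent chance term with a fixed 1/C for C rating categories, so for the dichotomous checklist (C = 2) the coefficient reduces to twice the observed agreement minus one. The sketch below is a minimal reconstruction under that assumption; the example ratings are hypothetical, not study data.

```python
# Minimal sketch of a Brennan-Prediger chance-adjusted kappa for two raters.
# With C categories, expected chance agreement is fixed at 1/C, so for
# dichotomous items the coefficient equals 2 * p_observed - 1. Example
# ratings are hypothetical, not data from the study.

def brennan_prediger_kappa(rater_a, rater_b, n_categories=2):
    assert len(rater_a) == len(rater_b) and rater_a
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
    p_chance = 1.0 / n_categories
    return (p_observed - p_chance) / (1.0 - p_chance)

# Two raters agreeing on 26 of 27 dichotomous checklist items:
a = [1] * 20 + [0] * 7
b = [1] * 20 + [0] * 6 + [1]
print(round(brennan_prediger_kappa(a, b), 2))  # 0.93
```

Unlike Cohen's kappa, this form is not depressed when nearly all items fall in one category, which suits checklists where most items are done correctly at posttest.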

Results

All eligible residents participated in the study and completed the entire protocol. There was no significant difference in age, gender, ethnicity, year of training, or USMLE Step 1 and 2 scores between the traditionally‐trained and simulator‐trained groups.

Interrater reliability measured by the mean kappa coefficient was very high (κ = 0.94) across the 27 IJ and SC checklist items. No resident met the MPS (79.1%) for CVC insertion at baseline testing. In the simulator‐trained group, 25 of 28 (89%) residents achieved SC skill mastery and 27 of 28 (96%) achieved IJ skill mastery within the standard 4‐hour curriculum. All remaining residents subsequently reached the MPS with less than 1 hour of additional practice time. The residents' pretest and posttest performance on the simulated CVC clinical skills examination is shown with descriptive statistics in Figure 3. After the educational intervention, posttest scores improved significantly (p < 0.001) to meet or exceed the MPS.

Figure 3
Mean scores and standard deviations on the simulator‐based skills exam before and after the educational intervention. MPS = 79.1%.

Traditionally trained and simulator‐trained residents independently inserted 46 CVCs during the study period. Simulator‐trained residents required significantly fewer needle passes to insert all actual CVCs in the MICU compared to traditionally trained residents: mean (M) = 1.79, standard deviation (SD) = 1.03 versus M = 2.78, SD = 1.77 (p = 0.04). As shown in Table 1, the groups did not differ in pneumothorax, arterial puncture, or mean number of CVC adjustments. In addition, the groups did not differ in use of ultrasound for IJ or SC CVC insertions. One IJ CVC was inserted without ultrasound in the traditionally‐trained group; 2 were inserted without ultrasound in the simulator‐trained group. Ultrasound was not used during any SC CVC insertions in the traditionally‐trained group and was used for 1 SC CVC insertion in the simulator‐trained group.

Table 1. Comparison of Traditionally Trained vs. Simulator‐Trained Residents in Self‐confidence and CVC Quality Indicators During Actual CVC Insertions in the MICU (Internal Jugular and Subclavian CVCs)

Measure | Traditionally Trained Residents | Simulator‐Trained Residents | P Value
Number of attempts during insertion, mean (SD) | 2.78 (1.77) | 1.79 (1.03) | 0.04*
Pneumothorax, number | 0 | 0 | n/a
Arterial puncture, % | 11 | 7 | 0.65
CVC adjustment, % | 15 | 8 | 0.52
Confidence, %, mean (SD) | 68 (20) | 81 (11) | 0.02*

  • p < 0.05.

  • Abbreviations: CVC, central venous catheter; MICU, medical intensive care unit; n/a, not applicable.

Simulator‐trained residents displayed more self‐confidence about their procedural skills than traditionally‐trained residents (M = 81, SD = 11 versus M = 68, SD = 20, p = 0.02). Spearman correlations showed no practical association between resident self‐confidence and performance on CVC insertion quality indicators.

Discussion

This study demonstrates the use of a mastery learning model to develop CVC insertion skills to a high achievement level among internal medicine residents. Our data support prior work showing that procedural skills that are poor at baseline can be increased significantly using simulation‐based training and deliberate practice.11-18, 28 This report on CVC insertion adds to the growing body of literature showing that simulation training complements standard medical education,11-19, 28 and expands the clinical application of the mastery model beyond thoracentesis and ACLS.11, 12 Use of the mastery model described in this study also has important implications for patients. In our training program, residents are required to demonstrate procedural mastery in a simulated environment before independently performing a CVC insertion on an actual patient. This is in sharp contrast to the traditional clinical model of procedural training at the bedside, and may be used in other training programs and with other invasive procedures.

The second aim of our study was to determine the impact of simulation‐based training on actual clinical practice by residents in the MICU. To our knowledge, no prior study has demonstrated that simulation‐based training in CVC insertion improves patient outcomes. We believe our results advance what is known about the impact of simulation‐based training because simulator‐trained residents in this study performed actual CVC insertions in the MICU using significantly fewer needle passes. Needle passes have been used by other investigators as a surrogate measure for reduced CVC‐associated complications because mechanical complications rise exponentially with more than two insertion attempts.4-7, 29 We believe this finding demonstrates transfer of skill acquired from simulation‐based training to the actual clinical environment. It is possible that ultrasound training accounts for the improvement in the simulator‐trained group. However, we do not believe that ultrasound training is entirely responsible as prior work has shown that deliberate practice using mastery learning without ultrasound significantly improved resident performance of thoracentesis11 and ACLS12, 19 procedures. We did not show a significant reduction in complications such as pneumothorax or arterial puncture. This is likely due to the small sample size and the low number of procedures and complications during the study period.

Our results also show that resident self‐confidence regarding actual CVC insertions improved after simulation training. These findings are consistent with prior reports of improved trainee confidence after simulation‐based training in CVC insertion.29, 30 Our results did not reveal a correlation between improved self‐confidence and clinical skill acquisition. Linking improved self‐confidence to improved clinical skill is important because self‐assessment does not always correlate with performance ability.31, 32

More study is needed to evaluate the impact of simulation‐based training on the quality of CVC insertions by trainees. Mechanisms shown to decrease complications of CVC placement include use of ultrasound,4, 7, 33-36 full sterile barrier technique,37-39 chlorhexidine skin preparations,40-42 and nurse‐physician education.43 Our simulation‐training program incorporates each of these elements. We plan to expand our simulation‐based training intervention to a larger sample size to determine its impact on mechanical and infectious complication rates linked to CVC insertion.

This study has several limitations. It was performed at a single institution over a short time period. However, demonstration of significantly fewer needle passes and improved resident self‐confidence after simulator training are important findings that warrant further study. It was impossible to blind raters during the skills assessment examination to whether the resident was performing a pretest or posttest. This was addressed by using a second rater, who was blind to the pretest and posttest status of the examinee. The arterial puncture rate of 7% among simulator‐trained residents was higher than expected, although it remains within published ranges.4, 5 Also, a low total number of CVCs was evaluated during the study. This is likely due to the strict exclusion criteria employed in order to study the impact of simulation training. For example, CVC insertions were only evaluated if they were actually performed by study residents (supervised insertions were excluded), and femoral catheters were not evaluated. We did not track residents' clinical experience with CVC insertion before the study. Residents who were simulator‐trained may have had more clinical experience with CVC insertion, and this may have affected their performance. However, residents did not differ in year of training or clinical rotations, and there is clear evidence that clinical training is not a proxy for skill acquisition.44 Finally, outcome data were measured via resident questionnaires that relied on resident recall about CVC insertion rather than observer ratings. This method was selected because observer ratings could not be standardized given the large number of clinical supervisors in the MICU over the study period. Information about needle passes and arterial puncture also may not be documented in procedural notes and could not be obtained by medical record review. We attempted to minimize recall bias by surveying residents within 24 hours of CVC placement.

In conclusion, this study demonstrates that simulation‐based training and deliberate practice in a mastery learning setting improves performance of both simulated and actual CVC insertions by internal medicine residents. Procedural training remains an important component of internal medicine training, although internists are performing fewer invasive procedures now than in years past.45, 46 Use of a mastery model of CVC insertion requires that trainees demonstrate skill in a simulated environment before independently performing this invasive procedure on patients. Further study is needed to assess clinical outcomes such as reduced CVC‐related infections and mechanical complications after simulation‐based training.

Acknowledgements

The authors thank the Northwestern University internal medicine residents for their dedication to education and patient care. They acknowledge Drs. J. Larry Jameson and Charles Watts for their support and encouragement of this work.

Appendix

Central Venous Catheter Insertion Checklists for Simulation‐based Education

Central Venous Catheter Placement (IJ)
  • Skill Key: A = Done Correctly B = Done Incorrectly C = Not Done.

Informed consent obtained: must do allABC
Benefits
Risks
Consent given
Place the patient in slight Trendelenburg positionABC
Flush the ports on the catheter with sterile salineABC
Clamp each port (ok to keep brown port open)ABC
Remove brown port from end of catheter to accommodate wireABC
Area is cleaned with chlorhexidineABC
Resident gets in sterile gown, gloves, hat and maskABC
Area is draped in usual sterile fashion (must be full body drape)ABC
The ultrasound (US) probe is properly set up with sterile sheath and sonographic gelABC
The vein is localized using anatomical landmarks with the US machineABC
If no US is used this is wrong
The skin is anesthetized with 1% lidocaine in a small whealABC
The deeper structures are anesthetizedABC
Localize the vein with this needle (optional)ABC
Using the large needle or catheter‐syringe complex, cannulate the vein while aspirating (must be done with US)ABC
Remove the syringe from the needle or advance the catheter into the vein removing both the syringe and needleABC
Advance the guidewire into the vein no more than approximately 12-15 cmABC
Nick the skin with the scalpel to advance the dilatorABC
Advance the dilator over the guidewire and dilate the veinABC
Advance the triple lumen over the guidewireABC
Never let go of the guidewireABC
Once the catheter is inserted remove the guidewire in its entiretyABC
Advance the catheter to approximately 14-16 cm on the right side, 16-18 cm on the left sideABC
Ensure there is blood flow/flush each portABC
Secure the catheter in place (suture or staple)ABC
Place dressing over catheterABC
Get a chest x‐rayABC
Notify that the catheter is ok to useABC
Maintain sterile techniqueABC
Central Venous Catheter Placement (Subclavian)
  • Skill Key: A = Done Correctly B = Done Incorrectly C = Not Done

Informed consent obtained: must do allABC
Benefits
Risks
Consent given
Place the patient in slight Trendelenburg positionABC
Flush the ports on the catheter with sterile salineABC
Clamp each port (ok to leave brown port open)ABC
Remove brown port from end of catheter to accommodate wireABC
Area is cleaned with chlorhexidineABC
Resident gets in sterile gown, gloves, hat and maskABC
Area is draped in usual sterile fashion (must be full body drape)ABC
The US probe is properly set up with sterile sheath and sonographic gel (must do if US is used)ABC
The vein is localized using US machine or anatomical landmarks are verbalizedABC
The skin is anesthetized with 1% lidocaine in a small whealABC
The deeper structures are anesthetized using a larger needle (must verbalize they anesthetize the clavicle)ABC
Localize the vein with this needle (optional)ABC
Using the large needle or catheter‐syringe complex, cannulate the vein while aspirating (optionally confirmed by US)ABC
If US was not used then expected to state they are directing the needle to the sternal notchABC
Remove the syringe from the needle or advance the catheter into the vein removing both the syringe and needleABC
Advance the guidewire into the vein no more than approximately 12-15 cmABC
Nick the skin with the scalpel to advance the dilatorABC
Advance the dilator over the guidewire and dilate the veinABC
Advance the triple lumen over the guidewireABC
Never let go of the guidewireABC
Once the catheter is inserted remove the guidewire in its entiretyABC
Advance the catheter to approx to 1416cm on the right side, 1618 cm on the left sideABC
Ensure there is blood flow/flush each portABC
Secure the catheter in place (suture or staple)ABC
Place dressing over catheterABC
Get a chest x‐rayABC
Notify that the catheter is ok to useABC
Maintain sterile techniqueABC
References
  1. American Board of Internal Medicine. Procedures Required for Internal Medicine. Available at: http://www.abim.org/certification/policies/imss/im.aspx. Accessed January 28, 2009.
  2. Huang GC, Smith CC, Gordon CE, et al. Beyond the comfort zone: residents assess their comfort performing inpatient medical procedures. Am J Med. 2006;119:71.e17-e24.
  3. Sznajder JI, Zveibil FR, Bitterman H, et al. Central vein catheterization. Failure and complication rates by three percutaneous approaches. Arch Intern Med. 1986;146:259-261.
  4. McGee DC, Gould MK. Preventing complications of central venous catheterization. N Engl J Med. 2003;348:1123-1133.
  5. Eisen LA, Narasimhan M, Berger JS, et al. Mechanical complications of central venous catheters. J Intensive Care Med. 2006;21:40-46.
  6. Lefrant JY, Muller L, De La Coussaye JE, et al. Risk factors of failure and immediate complication of subclavian vein catheterization in critically ill patients. Intensive Care Med. 2002;28:1036-1041.
  7. Mansfield PF, Hohn DC, Fornage BD, et al. Complications and failures of subclavian-vein catheterization. N Engl J Med. 1994;331:1735-1738.
  8. McGee WT. Central venous catheterization: better and worse. J Intensive Care Med. 2006;21:51-53.
  9. Boulet JR, Murray D, Kras J, et al. Reliability and validity of a simulation-based acute care skills assessment for medical students and residents. Anesthesiology. 2003;99:1270-1280.
  10. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861-866.
  11. Wayne DB, Barsuk JH, O'Leary KJ, et al. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3:48-54.
  12. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251-256.
  13. Andreatta PB, Woodrum DT, Birkmeyer JD, et al. Laparoscopic skills are improved with LapMentor training: results of a randomized, double-blinded study. Ann Surg. 2006;243:854-860.
  14. Blum MG, Powers TW, Sundaresan S. Bronchoscopy simulator effectively prepares junior residents to competently perform basic clinical bronchoscopy. Ann Thorac Surg. 2004;78:287-291.
  15. Cohen J, Cohen SA, Vora KC, et al. Multicenter, randomized, controlled trial of virtual-reality simulator training in acquisition of competency in colonoscopy. Gastrointest Endosc. 2006;64:361-368.
  16. Mayo PH, Hackney JE, Mueck JT, et al. Achieving house staff competence in emergency airway management: results of a teaching program using a computerized patient simulator. Crit Care Med. 2004;32:2422-2427.
  17. Sedlack RE, Kolars JC. Computer simulator training enhances the competency of gastroenterology fellows at colonoscopy: results of a pilot study. Am J Gastroenterol. 2004;99:33-37.
  18. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002;236:458-463.
  19. Wayne DB, Didwania A, Feinglass J, et al. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. CHEST. 2008;133:56-61.
  20. Fletcher R, Fletcher S. Clinical Epidemiology: The Essentials. 4th ed. Philadelphia: Lippincott Williams & Wilkins; 2005.
  21. Stufflebeam DL. The Checklists Development Checklist. Western Michigan University Evaluation Center, July 2000. Available at: http://www.wmich.edu/evalctr/checklists/cdc.htm. Accessed May 15, 2006.
  22. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79:S70-S81.
  23. Wayne DB, Barsuk JH, Cohen E, et al. Do baseline data influence standard setting for a clinical skills examination? Acad Med. 2007;82:S105-S108.
  24. McGaghie W, Siddall V, Mazmanian P, et al. Lessons for continuing medical education from simulation research in undergraduate and graduate medical education. CHEST. 2009;135.
  25. Downing SM. Reliability: on the reproducibility of assessment data. Med Educ. 2004;38:1006-1012.
  26. Fleiss JL, Levin B, Paik MC. Statistical Methods for Rates and Proportions. 3rd ed. New York: John Wiley & Sons.
  27. Wayne DB, Butter J, Siddall VJ, et al. Simulation-based training of internal medicine residents in advanced cardiac life support protocols: a randomized trial. Teach Learn Med. 2005;17:202-208.
  28. Britt RC, Reed SF, Britt LD. Central catheter simulation: a new training algorithm. Am Surg. 2007;73:680-682.
  29. Ault MJ, Rosen BT, Ault B. The use of tissue models for vascular access training. Phase I of the procedural patient safety initiative. J Gen Intern Med. 2006;21:514-517.
  30. Bond WF, Lammers RL, Spillane LL, et al. The use of simulation in emergency medicine: a research agenda. Acad Emerg Med. 2007;14:353-363.
  31. Wayne DB, Butter J, Siddall VJ, et al. Graduating internal medicine residents' self-assessment and performance of advanced cardiac life support skills. Med Teach. 2006;28:365-369.
  32. Beaulieu Y, Marik PE. Bedside ultrasonography in the ICU: part 2. CHEST. 2005;128:1766-1781.
  33. Lefrant JY, Cuvillon P, Benezet JF, et al. Pulsed Doppler ultrasonography guidance for catheterization of the subclavian vein: a randomized study. Anesthesiology. 1998;88:1195-1201.
  34. Miller AH, Roth BA, Mills TJ, et al. Ultrasound guidance versus the landmark technique for the placement of central venous catheters in the emergency department. Acad Emerg Med. 2002;9:800-805.
  35. Randolph AG, Cook DJ, Gonzales CA, et al. Ultrasound guidance for placement of central venous catheters: a meta-analysis of the literature. Crit Care Med. 1996;24:2053-2058.
  36. Berenholtz SM, Pronovost PJ, Lipsett PA, et al. Eliminating catheter-related bloodstream infections in the intensive care unit. Crit Care Med. 2004;32:2014-2020.
  37. Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med. 2006;355:2725-2732.
  38. Sherertz RJ, Ely EW, Westbrook DM, et al. Education of physicians-in-training can decrease the risk for vascular catheter infection. Ann Intern Med. 2000;132:641-648.
  39. Chaiyakunapruk N, Veenstra DL, Lipsky BA, et al. Chlorhexidine compared with povidone-iodine solution for vascular catheter-site care: a meta-analysis. Ann Intern Med. 2002;136:792-801.
  40. Maki DG, Ringer M, Alvarado CJ. Prospective randomised trial of povidone-iodine, alcohol, and chlorhexidine for prevention of infection associated with central venous and arterial catheters. Lancet. 1991;338:339-343.
  41. Mimoz O, Pieroni L, Lawrence C, et al. Prospective, randomized trial of two antiseptic solutions for prevention of central venous or arterial catheter colonization and infection in intensive care unit patients. Crit Care Med. 1996;24:1818-1823.
  42. Warren DK, Zack JE, Mayfield JL, et al. The effect of an education program on the incidence of central venous catheter-associated bloodstream infection in a medical ICU. CHEST. 2004;126:1612-1618.
  43. Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: the relationship between clinical experience and quality of health care. Ann Intern Med. 2005;142:260-273.
  44. Duffy FD, Holmboe ES. What procedures should internists do? Ann Intern Med. 2007;146:392-393.
  45. Wigton RS, Alguire P. The declining number and variety of procedures done by general internists: a resurvey of members of the American College of Physicians. Ann Intern Med. 2007;146:355-360.
Issue
Journal of Hospital Medicine - 4(7)
Page Number
397-403
Display Headline
Use of simulation‐based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit
Legacy Keywords
anatomic model, central venous catheterization, clinical competence, complications, medical education, quality of healthcare, simulation
Copyright © 2009 Society of Hospital Medicine

Correspondence Location
Division of Hospital Medicine, 251 East Huron St, Feinberg 16‐738, Chicago, IL 60611; Telephone: 312‐926‐5924; Fax: 312‐926‐6134

Mastery Learning of Procedural Skills

Display Headline
Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice

In a supplement to its inaugural issue, the Journal of Hospital Medicine published core competencies for hospitalists covering 3 areas: clinical conditions, systems in health care, and procedures.1 Completion of a traditional internal medicine residency may not provide hospitalists with the skills needed to safely perform procedures such as thoracentesis. A recent article reported that most internal medicine residents surveyed were uncomfortable performing common procedures, and their discomfort was higher for thoracentesis than for central line insertion, lumbar puncture, or paracentesis.2 This confirmed a previous report that family practice residents had low confidence in performing thoracenteses.3 Thoracentesis also carries the risk of the potentially life-threatening complication of pneumothorax, which may be increased when the procedure is performed by physicians-in-training.4

One method for improving training and assessment is the use of simulation technology. Simulation has been used to increase knowledge, provide opportunities for deliberate and safe practice, and shape the development of clinical skills.5, 6 Simulation has also been advocated for assessing competence in procedures including carotid angiography,7 emergency airway management,8 basic bronchoscopy,9 and advanced cardiac life support (ACLS).10, 11

Recently, we used simulation technology to help residents reach mastery learning standards for ACLS.11 Mastery learning,12 an extreme form of competency‐based education,13 implies that learners have acquired the clinical knowledge and skill measured against rigorous achievement standards. In mastery learning, educational results are equivalent, whereas educational practice time differs. To demonstrate mastery learning, we first documented a 38% improvement in skill after a simulation‐based educational intervention10 and used a multidisciplinary panel to determine mastery achievement standards for ACLS skills in 6 clinical scenarios.14 These standards were used in a study in which the amount of time needed to achieve skill mastery was allowed to vary while the skill outcomes of the residents were identical clinically.11

The present study had 4 aims. The first was to assess the baseline skill and knowledge of third‐year residents in thoracentesis. The second was to compare the thoracentesis‐related knowledge and skills of residents before and after an educational intervention. The third was to assess the correlation of medical knowledge and clinical experience with performance on a clinical skills examination after simulation training. The last was to document the feasibility of incorporating simulation‐based education into a training program.

METHODS

Objectives and Design

The study, which had a pretest-posttest design without a control group,15 evaluated a simulation‐based, mastery learning educational intervention in thoracentesis. Primary measurements were obtained at baseline (pretest) and after the educational intervention (posttest).

Participants

Study participants were all 40 third‐year residents in the internal medicine residency program at Northwestern University's Chicago campus from January to May 2006. The Northwestern University Institutional Review Board approved the study. Participants provided informed consent before baseline assessment.

This residency program is based at Northwestern Memorial Hospital (NMH) and the Jesse Brown Veterans Affairs Medical Center. Residents perform thoracenteses under the supervision of second‐ or third‐year residents or faculty members who are credentialed to perform the procedure. A didactic lecture on thoracentesis is part of the annual lecture series.

Procedure

The residents were kept as an intact group during the study period. The research procedure had 2 phases. First, the knowledge and clinical skills of participants at baseline were measured. Second, residents received two 2‐hour education sessions featuring didactic content and deliberate practice using a thoracentesis model. Between 4 and 6 weeks after the pretest, all residents were retested and were expected to meet or exceed a minimum passing score (MPS) on the clinical skills exam. Those who scored below the MPS engaged in more clinical skills practice until the mastery standard was reached. The amount of extra time needed to achieve the MPS was documented.
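The procedure above is the defining loop of mastery learning: the outcome (meeting the MPS) is fixed, while practice time varies per resident. A minimal sketch of that loop follows; `train_once`, `test_score`, and the 30-minute extra-practice block are hypothetical stand-ins for illustration, not the study's actual protocol.

```python
# Illustrative sketch (not the study's software) of the mastery-learning loop:
# the achievement standard is fixed and training time varies per learner.
# train_once/test_score are hypothetical callbacks; extra_block is an
# assumed remediation increment.

def train_to_mastery(test_score, train_once, mps,
                     base_minutes=240, extra_block=30, max_blocks=10):
    """Return total practice minutes a learner needs to reach the MPS."""
    minutes = base_minutes
    train_once(base_minutes)          # the standard 4-hour curriculum
    blocks = 0
    while test_score() < mps and blocks < max_blocks:
        train_once(extra_block)       # additional deliberate practice
        minutes += extra_block
        blocks += 1
    return minutes
```

A resident who meets the MPS on the first posttest accrues exactly the base 240 minutes; one who needs two extra practice blocks accrues 300.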

Educational Intervention

The intervention was designed to help residents acquire the knowledge and skills needed to perform a competent thoracentesis. The necessary components for mastery skill development were contained in the intervention. These included deliberate practice, rigorous skills assessment, and the provision of feedback in a supportive environment.16

The study was conducted in the Northwestern University Center for Advanced Surgical Education (N‐CASE) using the thoracentesis simulator developed by MediSim Inc. (Alton, Ontario) (http://www.medisim.ca/product.php?id=13). The model features realistic skin texture, ribs, and a fluid filled reservoir. Needles of various sizes can be inserted and fluid withdrawn. The model also accommodates the catheter/needle apparatus found in the thoracentesis kits (Tyco Healthcare, Pembroke, Bermuda) used at NMH.

Teaching and testing sessions were standardized. In teaching sessions, groups of 2‐4 residents had 4 hours to practice and ask questions, and to receive structured education and feedback from 1 of 2 hospitalist faculty instructors (J.H.B., K.J.O.). One of the 4 hours was devoted to the presentation of didactic material on indications, complications, and interpretation of results and a step‐by‐step demonstration of a thoracentesis. This presentation was videotaped to ensure standardization of content. The remaining 3 hours were devoted to clinical skills exam education, deliberate practice, and feedback.

One resident was present at each pretest and posttest session with 1 of the 2 faculty instructors who gave standardized instructions. The resident was expected to obtain a relevant history; perform a limited physical examination; review PA, lateral, and decubitus chest radiographs; perform a simulated thoracentesis; and order appropriate diagnostic tests. Written examinations were completed at the pretest and posttest sessions.

Measurements

A 25‐item checklist was developed for the thoracentesis procedure using relevant sources17, 18 and rigorous step‐by‐step procedures.19 Each skill or other action was listed in order and given equal weight, and was scored dichotomously: 1 = done correctly or 0 = done incorrectly. Checklists were reviewed for completeness and accuracy by 2 authors who frequently perform and supervise thoracenteses (J.H.B., K.J.O.), 2 authors with expertise in checklist design (D.B.W., W.C.M.), and the physician director of the medical intensive care unit at NMH. The checklist was used in a pilot clinical skills examination of 4 chief medical residents to estimate checklist reliability and face validity.
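As an illustration only, tallying such a dichotomously scored checklist against the 80% mastery standard might look like the sketch below; the item coding (1 = done correctly) and the function names are assumptions of the sketch, not the study's instrument.

```python
# Hedged sketch of dichotomous checklist scoring against a minimum passing
# score (MPS). Item coding and names are illustrative assumptions.

CHECKLIST_ITEMS = 25
MPS_FRACTION = 0.80  # mastery standard: 80%, ie, 20 of 25 items

def checklist_score(item_scores):
    """Percent of the 25 equally weighted items done correctly
    (1 = done correctly, 0 = done incorrectly)."""
    if len(item_scores) != CHECKLIST_ITEMS:
        raise ValueError("expected 25 checklist items")
    return 100.0 * sum(item_scores) / CHECKLIST_ITEMS

def passes_mps(item_scores):
    """True if the examinee meets or exceeds the mastery standard."""
    return checklist_score(item_scores) >= 100.0 * MPS_FRACTION
```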

The MPS for the thoracentesis clinical skills examination was determined by 10 clinical experts using the Angoff and Hofstee standard setting methods. The panel was composed of clinical pulmonary critical care medicine faculty (n = 7) and senior fellows (n = 3). Each panel member was given instruction on standard setting and asked to use the Angoff and Hofstee methods to assign pass/fail standards. The Angoff method asks expert judges to estimate the percentage of borderline examinees who would answer each test item correctly. The Hofstee method requires judges to estimate 4 properties of an evaluation's passing scores and failure rates. The panel was asked to repeat their judgments 6 weeks later to assure stability of the MPS. Details about the use of a standard setting exercise to set an MPS for clinical skills examinations have been published previously.14, 20
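To make the Angoff aggregation concrete, a minimal sketch follows (the study combined Angoff and Hofstee judgments; only the Angoff half is shown, and the judge ratings in the test are invented example numbers).

```python
# Minimal sketch of the Angoff aggregation step only: judges' item-level
# probability estimates for a borderline examinee are averaged per item and
# summed to yield a raw cut score. Example ratings are fabricated.

def angoff_cut_score(judge_ratings):
    """judge_ratings[j][i]: judge j's estimated probability that a borderline
    examinee performs item i correctly. Returns the expected number of items
    correct, usable as a raw cut score."""
    n_judges = len(judge_ratings)
    n_items = len(judge_ratings[0])
    return sum(
        sum(judge[i] for judge in judge_ratings) / n_judges
        for i in range(n_items)
    )
```

Dividing the result by the number of items gives the percent cut score that can be averaged against the Hofstee-derived standard.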

Evaluation of each resident's skill was recorded on the checklist by 1 of the 2 faculty raters at the pretest and posttest sessions. A random sample of 50% of the pretest sessions was rescored by a third rater with expertise in scoring clinical skills examinations (D.B.W.) to assess interrater reliability. The rescorer was blinded to the results of the first evaluation.

A multiple choice written examination was prepared according to examination development guidelines21 using appropriate reference articles and texts.17, 18, 22 The examination was prepared by 1 author (J.H.B.) and reviewed for accuracy and clarity by 2 others (K.J.O., D.B.W.) and by the director of the medical intensive care unit at NMH. The examination had questions on knowledge and comprehension of the procedure as well as data interpretation and application. It was administered to 9 fourth‐year medical students and 5 pulmonary/critical care fellows to obtain pilot data. Results of the pilot allowed creation of a pretest and a posttest that were equivalent in content and difficulty.23 The Kuder Richardson Formula 20 (KR‐20) reliability coefficients for the 20‐item pretest and the 20‐item posttest were .72 and .74, respectively.
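For readers unfamiliar with KR-20, the coefficient can be computed directly from a matrix of dichotomous item responses, as in this sketch (the response matrix in the test is fabricated example data, not the study's responses).

```python
# Sketch of the Kuder-Richardson Formula 20 (KR-20) reliability coefficient
# for a dichotomously scored exam. KR-20 = (k/(k-1)) * (1 - sum(p*q)/var),
# where p is the proportion answering an item correctly, q = 1 - p, and var
# is the (population) variance of examinees' total scores.

def kr20(responses):
    """responses[e][i] = 1 if examinee e answered item i correctly, else 0."""
    n_examinees = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    mean_total = sum(totals) / n_examinees
    var_total = sum((t - mean_total) ** 2 for t in totals) / n_examinees
    pq_sum = 0.0
    for i in range(n_items):
        p = sum(row[i] for row in responses) / n_examinees
        pq_sum += p * (1 - p)
    return (n_items / (n_items - 1)) * (1 - pq_sum / var_total)
```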

Demographic data were obtained from the participants including age, gender, ethnicity, medical school, and scores on the United States Medical Licensing Examination (USMLE) Steps 1 and 2. Each resident's experience performing the procedure was also collected at pretest.

Primary outcome measures were performance on the posttest written and clinical examinations. Secondary outcome measures were the total training time needed to reach the MPS (minimum = 240 minutes) and a course evaluation questionnaire.

Data Analysis

Checklist score reliability was estimated by calculating interrater reliability, the preferred method for assessments that depend on human judges,24 using the kappa (κ) coefficient25 adjusted using the formula of Brennan and Prediger.26 Within‐group differences from pretest (baseline) to posttest (outcome) were analyzed using paired t tests. Multiple regression analysis was used to assess the correlation of posttest performance on thoracentesis skills with (1) performance on pretest thoracentesis skills, (2) medical knowledge measured by the thoracentesis pretest and posttest and USMLE Steps 1 and 2, (3) clinical experience in performing thoracentesis, (4) clinical self‐confidence about performing thoracentesis, and (5) whether additional training was needed to master the procedure.
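The Brennan-Prediger adjustment fixes chance agreement at 1/q for q rating categories, which avoids the instability of Cohen's kappa when raters' marginal distributions are skewed. A sketch of the two-rater computation, with invented ratings in the test:

```python
# Sketch of percent agreement adjusted per Brennan and Prediger:
# kappa_bp = (p_obs - 1/q) / (1 - 1/q), where q is the number of rating
# categories and p_obs is the observed proportion of agreement.

def brennan_prediger(ratings_a, ratings_b, n_categories=2):
    """Chance-adjusted agreement between two raters over the same items."""
    if len(ratings_a) != len(ratings_b):
        raise ValueError("raters must score the same items")
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / len(ratings_a)
    p_chance = 1.0 / n_categories
    return (p_obs - p_chance) / (1.0 - p_chance)
```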

RESULTS

All residents consented to participate and completed the entire training protocol. Table 1 presents demographic data about the residents. Most had limited experience performing and supervising thoracenteses.

Table 1. Baseline Demographic Data from 40 Internal Medicine PGY-3 Residents Participating in a Simulation-Based Training Program on Thoracentesis

Age (years), mean (SD): 28.88 (1.57)
Male: 23 (57.5%)
Female: 17 (42.5%)
African American: 1 (2.5%)
White: 21 (52.5%)
Asian: 14 (35.0%)
Other: 4 (10.0%)
U.S. medical school graduate: 39 (97.5%)
Foreign medical school graduate: 1 (2.5%)
Number of thoracentesis procedures
  Performed as an intern: 0-1, 27.5%; 2-4, 60.0%; ≥5, 12.5%
  Performed as a PGY-2 and PGY-3 resident: 0-1, 25.0%; 2-4, 55.0%; ≥5, 20.0%
  Supervised others as a PGY-2 and PGY-3 resident: 0-1, 27.5%; 2-4, 57.5%; ≥5, 15.0%

Interrater reliability for the thoracentesis checklist data was calculated at pretest. Across the 25 checklist items, the mean kappa coefficient was very high (κ = .94). The MPS used as the mastery achievement standard was 80% (ie, 20 of 25 checklist items). This was the mean of the Angoff and Hofstee ratings obtained from the first judgment of the expert panel and is displayed in Figure 1.

Figure 1. Performance on the thoracentesis written and clinical skills examinations (MPS, minimum passing score).

No resident achieved mastery at pretest. However, 37 of the 40 medicine residents (93%) achieved mastery within the standard 4‐hour thoracentesis curriculum. The remaining 3 residents (7%) needed extra time ranging from 20 to 90 minutes to reach mastery.

Figure 1 is a graphic portrait with descriptive statistics of the residents' pretest and posttest performance on the thoracentesis written and clinical skills exams. For the written exam, the mean score rose from 57.63% to 89.75%, a statistically significant improvement of 56% from pretest to posttest (t[39] = 17.0, P < .0001). The clinical skills exam also showed a highly significant 71% pretest‐to‐posttest gain, as the mean score rose from 51.70% to 88.3% (t[39] = 15.6, P < .0001).
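The paired t statistics reported above can be reproduced from pretest and posttest score vectors with a short computation; the sketch below uses fabricated scores, not the study's data.

```python
import math

# Sketch of the paired (dependent-samples) t test used for the
# pretest-to-posttest comparisons: t = mean(d) / sqrt(var(d)/n), where d is
# the per-subject posttest-minus-pretest difference.

def paired_t(pre, post):
    """Return (t statistic, degrees of freedom) for paired samples."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n), n - 1
```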

Results from the regression analysis indicate that neither pretest performance, medical knowledge measured by local or USMLE examinations, nor thoracentesis clinical experience was correlated with the posttest measure of thoracentesis clinical skills. However, the need for additional practice to reach the mastery standard on the posttest was a powerful negative predictor of posttest performance: b = -.27 (95% CI = -.46 to -.09; P < .006; r2 = .28). For those residents who required extra practice time, the initial clinical skills posttest score was 20% lower than that of their peers. Although the need for extra deliberate practice was associated with relatively lower initial posttest scores, all residents ultimately met or exceeded the rigorous thoracentesis MPS.

The responses of the 40 residents on a course evaluation questionnaire were uniformly positive. Responses were recorded on a Likert scale where 1 = strongly disagree, 2 = disagree, 3 = uncertain, 4 = agree, and 5 = strongly agree (Table 2). The data show that residents strongly agreed that practice with the medical simulator boosts clinical skills and self‐confidence, that they received useful feedback from the training sessions, and that deliberate practice using the simulator is a valuable educational experience. Residents were uncertain whether practice with the medical simulator has more educational value than patient care.

Table 2. Course Evaluations Provided by All Residents (n = 40) after the Simulation-Based Educational Program; values are Mean (SD)

Practice with the thoracentesis model boosts my skills to perform this procedure: 4.3 (0.8)
I receive useful educational feedback from the training sessions: 4.0 (0.6)
Practice with the thoracentesis model boosts my clinical self-confidence: 4.1 (0.9)
Practice with the thoracentesis model has more educational value than patient care experience: 2.3 (1.0)
The Skills Center staff are competent: 4.3 (0.6)
Practice sessions in the Skills Center are a good use of my time: 3.7 (1.0)
Practice sessions using procedural models should be a required component of residency education: 3.8 (0.8)
Deliberate practice using models is a valuable educational experience: 4.0 (0.9)
Practice sessions using models are hard work: 2.1 (0.7)
Increasing the difficulty of simulated clinical problems helps me become a better doctor: 3.9 (0.7)
The controlled environment in the Skills Center helps me focus on clinical education problems: 3.9 (0.8)
Practice with the thoracentesis model has helped to prepare me to perform the procedure better than clinical experience alone: 4.0 (1.0)

DISCUSSION

This study demonstrates the use of a mastery learning model to develop the thoracentesis skills of internal medicine residents to a high level. Use of a thoracentesis model in a structured educational program offering an opportunity for deliberate practice with feedback produced large and consistent improvements in residents' skills. An important finding of our study is that despite having completed most of their internal medicine training, residents displayed poor knowledge and clinical skill in thoracentesis procedures at baseline. This is similar to previous studies showing that the procedural skills and knowledge of physicians at all stages of training are often poor. Examples of areas in which significant gaps were found include basic skills such as chest radiography,27 emergency airway management,8 and pulmonary auscultation.28 In contrast, after the mastery learning program, all the residents met or exceeded the MPS for the thoracentesis clinical procedure and scored much higher on the posttest written examination.

Our data also demonstrate that medical knowledge measured by procedure‐specific pretests and posttests and USMLE Steps 1 and 2 scores were not correlated with thoracentesis skill acquisition. This reinforces findings from our previous studies of ACLS skill acquisition10, 11 and supports the difference between professional and academic achievement. Pretest skill performance and clinical experience also were not correlated with posttest outcomes. However, the amount of deliberate practice needed to reach the mastery standard was a powerful negative predictor of posttest thoracentesis skill scores, replicating our research on ACLS.11 We believe that clinical experience was not correlated with posttest outcomes because residents infrequently performed thoracenteses procedures during their training.

This project demonstrates a practical model for outcomes‐based education, certification, and program accreditation. Given the need to move procedural training in internal medicine beyond such historical methods as see one, do one, teach one,29 extension of the mastery model to other invasive procedures deserves further study. At our institution we have been encouraged by the ability of simulation‐based education in ACLS to promote long‐term skill retention30 and improvement in the quality of actual patient care.31 In addition to studying these outcomes for thoracentesis, we plan to incorporate the use of ultrasound when training residents to perform procedures such as thoracentesis and central venous catheter insertion.

Given concerns about the quality of resident preparation to perform invasive procedures, programs such as this should be considered as part of the procedural certification process. As shown by our experience with several classes of residents (n = 158), use of simulation technology to reach high procedural skill levels is effective and feasible in internal medicine residency training. In addition, our residents have consistently enjoyed participating in the simulated training programs. Postcourse questionnaires show that residents agree that deliberate practice with simulation technology complements but does not replace patient care in graduate medical education.5, 10

An important question needing more research is whether performance in a simulated environment transfers to actual clinical settings. Several small studies have demonstrated such a relationship,8, 9, 31, 32 yet the transfer of simulated training to clinical practice requires further study. More work should also be done to assess long‐term retention of skills30 and to determine the utility and benefit of simulation‐based training in procedural certification and credentialing.

This study had several limitations. It was conducted in 1 training program at a single medical center. The sample size (n = 40) was relatively small. The thoracentesis model was used for both education and testing, potentially confounding the events. However, these limitations do not diminish the pronounced impact that the simulation‐based training had on the skills and knowledge of our residents.

In conclusion, this study has demonstrated the ability of deliberate practice using a thoracentesis model to produce high‐level performance of simulated thoracenteses. The project received high ratings from learners and provides reliable assessments of procedural competence. Although internists are performing fewer invasive procedures now than in years past, procedural training is still an important component of internal medicine training.29, 33 Attainment of high procedural skill levels may be especially important for residents who plan to practice hospital medicine. We believe that simulation‐based training using deliberate practice should be a key contributor to future internal medicine residency education, certification, and accreditation.

Acknowledgements

The authors thank Charles Watts, MD, and J. Larry Jameson, MD, PhD, for their support of this work. We recognize and appreciate the Northwestern University internal medicine residents for their dedication to patient care and education.

References
  1. Dressler DD,Pistoria MJ,Budnitz TL,McKean SC,Amin AN.Core competencies in hospital medicine: development and methodology.J Hosp Med.2006;1:4856.
  2. Huang GC,Smith CC,Gordon CE, et al.Beyond the comfort zone: residents assess their comfort performing inpatient medical procedures.Am J Med.2006;119:71.e17–71.e24.
  3. Sharp LK,Wang R,Lipsky MS.Perception of competency to perform procedures and future practice intent: a national survey of family practice residents.Acad Med.2003;78:926932.
  4. Bartter T,Mayo PD,Pratter MR,Santarelli RJ,Leeds WM,Akers SM.Lower risk and higher yield for thoracentesis when performed by experienced operators.Chest.1993;103:18731876.
  5. Issenberg SB,McGaghie WC,Hart IR, et al.Simulation technology for health care professional skills training and assessment.JAMA.1999;282:861866.
  6. Boulet JR,Murray D,Kras J, et al.Reliability and validity of a simulation‐based acute care skills assessment for medical students and residents.Anesthesiology.2003;99:12701280.
  7. Patel AD,Gallagher AG,Nicholson WJ,Cates CU.Learning curves and reliability measures for virtual reality simulation in the performance assessment of carotid angiography.J Am Coll Cardiol.2006;47:17961802.
  8. Mayo PH,Hackney JE,Mueck T,Ribaudo V,Schneider RF.Achieving house staff competence in emergency airway management: results of a teaching program using a computerized patient simulator.Crit Care Med.2004;32:24222427.
  9. Blum MG,Powers TW,Sundaresan S.Bronchoscopy simulator effectively prepares junior residents to competently perform basic clinical bronchoscopy.Ann Thorac Surg.2004;78:287291.
  10. Wayne DB,Butter J,Siddall VJ, et al.Simulation‐based training of internal medicine residents in advanced cardiac life support protocols: a randomized trial.Teach Learn Med.2005;17:210216.
  11. Wayne DB,Butter J,Siddall VJ, et al.Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice.J Gen Intern Med.2006;21:251256.
  12. Block JH, ed.Mastery Learning: Theory and Practice.New York:Holt, Rinehart and Winston;1971.
  13. McGaghie WC,Miller GE,Sajid A,Telder TV.Competency‐Based Curriculum Development in Medical Education. Public Health Paper No. 68.Geneva, Switzerland:World Health Organization;1978.
  14. Wayne DB,Fudala MJ,Butter J, et al.Comparison of two standard‐setting methods for advanced cardiac life support training.Acad Med.2005;80(10 Suppl):S63S66.
  15. Shadish WR,Cook TD,Campbell DT.Experimental and Quasi‐Experimental Designs for Generalized Causal Inference.Boston:Houghton Mifflin;2002.
  16. Ericsson KA.Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains.Acad Med.2004;79(10 Suppl):S70S81.
  17. Sokolowski JW,Burgher LW,Jones FL,Patterson JR,Selecky PA.Guidelines for thoracentesis and needle biopsy of the pleura. This position paper of the American Thoracic Society was adopted by the ATS Board of Directors June 1988.Am Rev Resp Dis.1989;140:257258.
  18. Light RW.Clinical practice. Pleural effusion.N Engl J Med2002;346:19711977.
  19. Stufflebeam DL. The Checklists Development Checklist. Western Michigan University Evaluation Center, July 2000. Available at: http://www.wmich.edu/evalctr/checklists/cdc.htm. Accessed December 15,2005.
  20. Downing SM,Tekian A,Yudkowsky R.Procedures for establishing defensible absolute passing scores on performance examinations in health professions education.Teach Learn Med2006;18:5057.
  21. Linn RL,Gronlund NE.Measurement and Assessment in Teaching.8th ed.Upper Saddle River, NJ:Prentice Hall;2000.
  22. Light RW.Pleural Diseases.4th ed.Philadelphia, PA:Lippincott Williams 2000:821829.
  23. Downing SM.Reliability: on the reproducibility of assessment data.Med Educ.2004;38:10061012.
  24. Fleiss JL,Levin B,Paik MC.Statistical Methods for Rates and Proportions.3rd ed.New York:John Wiley 2003.
  25. Brennan RL,Prediger DJ.Coefficient kappa: some uses, misuses, and alternatives.Educ Psychol Meas.1981;41:687699.
  26. Eisen LA,Berger JS,Hegde A,Schneider RF.Competency in chest radiography: a comparison of medical students, residents and fellows.J Gen Intern Med.2006;21:460465.
  27. Mangione S,Nieman LZ.Pulmonary auscultatory skills during training in internal medicine and family practice.Am J Resp Crit Care Med.1999;159:11191124.
  28. Duffy FD,Holmboe ES.What procedures should internists do?Ann Intern Med.2007;146:3923.
  29. Wayne DB,Siddall VJ,Butter J, et al.A longitudinal study of internal medicine residents' retention of advanced cardiac life support (ACLS) skills.Acad Med.2006;81(10 Suppl):S9S12.
  30. Wayne DB,Didwania A,Feinglass J,Barsuk J,Fudala M,McGaghie WC.Simulation‐based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case‐control study.Chest.2008;[Epub ahead of print].
  31. Seymour NE,Gallagher AG,Roman SA, et al.Virtual reality training improves operating room performance: results of a randomized, double‐blinded study.Ann Surg.2002;236:458464.
  32. Wigton RS,Alguire P.The declining number and variety of procedures done by general internists: a resurvey of members of the American College of Physicians.Ann Intern Med.2007;146:355360.
Journal of Hospital Medicine - 3(1), pages 48-54

Keywords: thoracentesis, residency education, mastery learning, simulation-based education

In a supplement to its inaugural issue, the Journal of Hospital Medicine published core competencies for hospitalists covering 3 areas: clinical conditions, systems in health care, and procedures.1 Completion of a traditional internal medicine residency may not provide hospitalists with the skills needed to safely perform procedures such as thoracentesis. A recent article reported that most internal medicine residents surveyed were uncomfortable performing common procedures, and their discomfort was higher for thoracentesis than for central line insertion, lumbar puncture, or paracentesis.2 This confirmed a previous report that family practice residents had low confidence in performing thoracenteses.3 Thoracentesis also carries the risk of the potentially life-threatening complication of pneumothorax, a risk that may be increased when the procedure is performed by physicians-in-training.4

One method for improving training and assessment is the use of simulation technology. Simulation has been used to increase knowledge, provide opportunities for deliberate and safe practice, and shape the development of clinical skills.5, 6 Simulation has also been advocated for assessing competence in procedures including carotid angiography,7 emergency airway management,8 basic bronchoscopy,9 and advanced cardiac life support (ACLS).10, 11

Recently, we used simulation technology to help residents reach mastery learning standards for ACLS.11 Mastery learning,12 an extreme form of competency-based education,13 implies that learners have acquired the clinical knowledge and skill measured against rigorous achievement standards. In mastery learning, educational outcomes are uniform, whereas practice time is allowed to vary. To demonstrate mastery learning, we first documented a 38% improvement in skill after a simulation-based educational intervention10 and used a multidisciplinary panel to determine mastery achievement standards for ACLS skills in 6 clinical scenarios.14 These standards were used in a study in which the amount of time needed to achieve skill mastery was allowed to vary while residents' skill outcomes were clinically identical.11

The present study had 4 aims. The first was to assess the baseline skill and knowledge of third‐year residents in thoracentesis. The second was to compare the thoracentesis‐related knowledge and skills of residents before and after an educational intervention. The third was to assess the correlation of medical knowledge and clinical experience with performance on a clinical skills examination after simulation training. The last was to document the feasibility of incorporating simulation‐based education into a training program.

METHODS

Objectives and Design

The study evaluated a simulation-based, mastery learning educational intervention in thoracentesis using a pretest-posttest design without a control group.15 Primary measurements were obtained at baseline (pretest) and after the educational intervention (posttest).

Participants

Study participants were all 40 third‐year residents in the internal medicine residency program at Northwestern University's Chicago campus from January to May 2006. The Northwestern University Institutional Review Board approved the study. Participants provided informed consent before baseline assessment.

This residency program is based at Northwestern Memorial Hospital (NMH) and the Jesse Brown Veterans Affairs Medical Center. Residents perform thoracenteses under the supervision of second- or third-year residents or faculty members who are credentialed to perform the procedure. A didactic lecture on thoracentesis is part of the annual lecture series.

Procedure

The residents were kept as an intact group during the study period. The research procedure had 2 phases. First, the knowledge and clinical skills of participants at baseline were measured. Second, residents received two 2‐hour education sessions featuring didactic content and deliberate practice using a thoracentesis model. Between 4 and 6 weeks after the pretest, all residents were retested and were expected to meet or exceed a minimum passing score (MPS) on the clinical skills exam. Those who scored below the MPS engaged in more clinical skills practice until the mastery standard was reached. The amount of extra time needed to achieve the MPS was documented.

Educational Intervention

The intervention was designed to help residents acquire the knowledge and skills needed to perform a competent thoracentesis. The necessary components for mastery skill development were contained in the intervention. These included deliberate practice, rigorous skills assessment, and the provision of feedback in a supportive environment.16

The study was conducted in the Northwestern University Center for Advanced Surgical Education (N-CASE) using the thoracentesis simulator developed by MediSim Inc. (Alton, Ontario; http://www.medisim.ca/product.php?id=13). The model features realistic skin texture, ribs, and a fluid-filled reservoir. Needles of various sizes can be inserted and fluid withdrawn. The model also accommodates the catheter/needle apparatus found in the thoracentesis kits (Tyco Healthcare, Pembroke, Bermuda) used at NMH.

Teaching and testing sessions were standardized. In teaching sessions, groups of 2 to 4 residents had 4 hours to practice, ask questions, and receive structured education and feedback from 1 of 2 hospitalist faculty instructors (J.H.B., K.J.O.). One of the 4 hours was devoted to the presentation of didactic material on indications, complications, and interpretation of results, together with a step-by-step demonstration of a thoracentesis. This presentation was videotaped to ensure standardization of content. The remaining 3 hours were devoted to clinical skills exam education, deliberate practice, and feedback.

One resident was present at each pretest and posttest session with 1 of the 2 faculty instructors who gave standardized instructions. The resident was expected to obtain a relevant history; perform a limited physical examination; review PA, lateral, and decubitus chest radiographs; perform a simulated thoracentesis; and order appropriate diagnostic tests. Written examinations were completed at the pretest and posttest sessions.

Measurements

A 25-item checklist was developed for the thoracentesis procedure using relevant sources17, 18 and rigorous step-by-step procedures.19 Each skill or other action was listed in order and given equal weight. Each skill or action was scored dichotomously: 1 = done correctly, 0 = done incorrectly or not done. Checklists were reviewed for completeness and accuracy by 2 authors who frequently perform and supervise thoracenteses (J.H.B., K.J.O.), 2 authors with expertise in checklist design (D.B.W., W.C.M.), and the physician director of the medical intensive care unit at NMH. The checklist was used in a pilot clinical skills examination of 4 chief medical residents to estimate checklist reliability and face validity.

The MPS for the thoracentesis clinical skills examination was determined by 10 clinical experts using the Angoff and Hofstee standard-setting methods. The panel was composed of pulmonary and critical care medicine faculty (n = 7) and senior fellows (n = 3). Each panel member was given instruction on standard setting and asked to use the Angoff and Hofstee methods to assign pass/fail standards. The Angoff method asks expert judges to estimate the percentage of borderline examinees who would answer each test item correctly. The Hofstee method requires judges to estimate 4 properties of an evaluation's passing scores and failure rates. The panel was asked to repeat their judgments 6 weeks later to ensure the stability of the MPS. Details about the use of a standard-setting exercise to set an MPS for clinical skills examinations have been published previously.14, 20
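The cited sources describe both methods in full; as an illustrative sketch only (the judge estimates below are hypothetical, not the study panel's ratings), an Angoff cut score is simply the mean of the judges' per-item estimates of how a borderline examinee would perform:

```python
from statistics import mean

def angoff_mps(judge_estimates):
    """Angoff cut score for a checklist examination.

    `judge_estimates` is a list with one row per judge; each row holds
    that judge's estimated probability, per checklist item, that a
    borderline examinee performs the item correctly. The minimum
    passing score is the mean of the per-item means, on a 0-1 scale.
    """
    item_means = [mean(col) for col in zip(*judge_estimates)]  # mean per item across judges
    return mean(item_means)  # expected borderline-examinee score

# Hypothetical panel of 3 judges rating a 4-item checklist
judges = [
    [0.9, 0.8, 0.7, 0.8],
    [0.8, 0.8, 0.6, 0.9],
    [0.7, 0.9, 0.8, 0.7],
]
cut = angoff_mps(judges)  # multiply by the item count for a raw-score cutoff
```

A real panel would also complete the Hofstee judgments (acceptable pass-score and failure-rate bounds) and reconcile the two standards, as the study did by averaging them.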

Evaluation of each resident's skill was recorded on the checklist by 1 of the 2 faculty raters at the pretest and posttest sessions. A random sample of 50% of the pretest sessions was rescored by a third rater with expertise in scoring clinical skills examinations (D.B.W.) to assess interrater reliability. The rescorer was blinded to the results of the first evaluation.

A multiple-choice written examination was prepared according to examination development guidelines21 using appropriate reference articles and texts.17, 18, 22 The examination was prepared by 1 author (J.H.B.) and reviewed for accuracy and clarity by 2 others (K.J.O., D.B.W.) and by the director of the medical intensive care unit at NMH. The examination had questions on knowledge and comprehension of the procedure as well as data interpretation and application. It was administered to 9 fourth-year medical students and 5 pulmonary/critical care fellows to obtain pilot data. Results of the pilot allowed creation of a pretest and a posttest that were equivalent in content and difficulty.23 The Kuder-Richardson Formula 20 (KR-20) reliability coefficients for the 20-item pretest and the 20-item posttest were .72 and .74, respectively.
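The KR-20 coefficient reported above can be computed directly from dichotomous item responses; a minimal sketch with hypothetical response data (not the study's pilot data):

```python
def kr20(responses):
    """Kuder-Richardson Formula 20 for dichotomous (0/1) items.

    `responses`: one row per examinee, each a list of 0/1 item scores.
    KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / var(total scores)).
    """
    k = len(responses[0])                      # number of items
    n = len(responses)                         # number of examinees
    totals = [sum(row) for row in responses]   # total score per examinee
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n  # population variance
    pq = 0.0
    for i in range(k):
        p = sum(row[i] for row in responses) / n        # proportion correct, item i
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_t)

# Hypothetical pilot data: 4 examinees, 3 items
data = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
reliability = kr20(data)
```

Values near .7 or above, like the .72 and .74 reported, are conventionally taken as adequate internal consistency for a short knowledge test.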

Demographic data were obtained from the participants including age, gender, ethnicity, medical school, and scores on the United States Medical Licensing Examination (USMLE) Steps 1 and 2. Each resident's experience performing the procedure was also collected at pretest.

Primary outcome measures were performance on the posttest written and clinical examinations. Secondary outcome measures were the total training time needed to reach the MPS (minimum = 240 minutes) and a course evaluation questionnaire.

Data Analysis

Checklist score reliability was estimated by calculating interrater reliability, the preferred method for assessments that depend on human judges,24 using the kappa (κ) coefficient25 adjusted using the formula of Brennan and Prediger.26 Within-group differences from pretest (baseline) to posttest (outcome) were analyzed using paired t tests. Multiple regression analysis was used to assess the correlation of posttest performance on thoracentesis skills with (1) performance on pretest thoracentesis skills, (2) medical knowledge measured by the thoracentesis pretest and posttest and USMLE Steps 1 and 2, (3) clinical experience in performing thoracentesis, (4) clinical self-confidence about performing thoracentesis, and (5) whether additional training was needed to master the procedure.

RESULTS

All residents consented to participate and completed the entire training protocol. Table 1 presents demographic data about the residents. Most had limited experience performing and supervising thoracenteses.

Table 1. Baseline Demographic Data from 40 Internal Medicine PGY-3 Residents Participating in a Simulation-Based Training Program on Thoracentesis

Age (years), mean (SD): 28.88 (1.57)
Male: 23 (57.5%)
Female: 17 (42.5%)
African American: 1 (2.5%)
White: 21 (52.5%)
Asian: 14 (35.0%)
Other: 4 (10.0%)
U.S. medical school graduate: 39 (97.5%)
Foreign medical school graduate: 1 (2.5%)
Number of thoracentesis procedures
  Performed as an intern: 0-1, 27.5%; 2-4, 60.0%; ≥5, 12.5%
  Performed as a PGY-2 and PGY-3 resident: 0-1, 25.0%; 2-4, 55.0%; ≥5, 20.0%
  Supervised others as a PGY-2 and PGY-3 resident: 0-1, 27.5%; 2-4, 57.5%; ≥5, 15.0%

Interrater reliability for the thoracentesis checklist data was calculated at pretest. Across the 25 checklist items, the mean kappa coefficient was very high (κ = .94). The MPS used as the mastery achievement standard was 80% (i.e., 20 of 25 checklist items). This was the mean of the Angoff and Hofstee ratings obtained from the first judgment of the expert panel and is displayed in Figure 1.

Figure 1. Performance on the thoracentesis written and clinical skills examinations (MPS, minimum passing score).

No resident achieved mastery at pretest. However, 37 of the 40 medicine residents (93%) achieved mastery within the standard 4‐hour thoracentesis curriculum. The remaining 3 residents (7%) needed extra time ranging from 20 to 90 minutes to reach mastery.

Figure 1 presents descriptive statistics of the residents' pretest and posttest performance on the thoracentesis written and clinical skills exams. For the written exam, the mean score rose from 57.63% to 89.75%, a statistically significant relative improvement of 56% from pretest to posttest (t[39] = 17.0, P < .0001). The clinical skills exam also showed a highly significant 71% relative pretest-to-posttest gain, as the mean score rose from 51.70% to 88.3% (t[39] = 15.6, P < .0001).
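A paired t test of this kind is straightforward to reproduce; the scores below are hypothetical illustrations, not the study data:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic for pre/post scores from the same examinees:
    t = mean(d) / (sd(d) / sqrt(n)), where d are the paired differences
    and df = n - 1. `stdev` uses the sample (n - 1) denominator."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    return t, n - 1

# Hypothetical exam scores (%) for 5 examinees
pre = [50, 55, 60, 45, 52]
post = [85, 90, 88, 80, 86]
t, df = paired_t(pre, post)
print(f"t({df}) = {t:.2f}")
```

The P value would then come from the t distribution with df degrees of freedom (e.g., via a statistics table or scipy.stats); large uniform gains produce the very large t statistics seen here.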

Results from the regression analysis indicate that neither pretest performance, medical knowledge measured by local or USMLE examinations, nor thoracentesis clinical experience was correlated with the posttest measure of thoracentesis clinical skills. However, the need for additional practice to reach the mastery standard on the posttest was a powerful negative predictor of posttest performance: b = -.27 (95% CI, -.46 to -.09; P < .006; r2 = .28). For those residents who required extra practice time, the initial clinical skills posttest score was 20% lower than that of their peers. Although the need for extra deliberate practice was associated with relatively lower initial posttest scores, all residents ultimately met or exceeded the rigorous thoracentesis MPS.

The responses of the 40 residents on a course evaluation questionnaire were uniformly positive. Responses were recorded on a Likert scale where 1 = strongly disagree, 2 = disagree, 3 = uncertain, 4 = agree, and 5 = strongly agree (Table 2). The data show that residents strongly agreed that practice with the medical simulator boosts clinical skills and self‐confidence, that they received useful feedback from the training sessions, and that deliberate practice using the simulator is a valuable educational experience. Residents were uncertain whether practice with the medical simulator has more educational value than patient care.

Table 2. Course Evaluations Provided by All Residents (n = 40) after Simulation-Based Educational Program

Item: Mean (SD)
Practice with the thoracentesis model boosts my skills to perform this procedure: 4.3 (0.8)
I receive useful educational feedback from the training sessions: 4.0 (0.6)
Practice with the thoracentesis model boosts my clinical self-confidence: 4.1 (0.9)
Practice with the thoracentesis model has more educational value than patient care experience: 2.3 (1.0)
The Skills Center staff are competent: 4.3 (0.6)
Practice sessions in the Skills Center are a good use of my time: 3.7 (1.0)
Practice sessions using procedural models should be a required component of residency education: 3.8 (0.8)
Deliberate practice using models is a valuable educational experience: 4.0 (0.9)
Practice sessions using models are hard work: 2.1 (0.7)
Increasing the difficulty of simulated clinical problems helps me become a better doctor: 3.9 (0.7)
The controlled environment in the Skills Center helps me focus on clinical education problems: 3.9 (0.8)
Practice with the thoracentesis model has helped to prepare me to perform the procedure better than clinical experience alone: 4.0 (1.0)

DISCUSSION

This study demonstrates the use of a mastery learning model to develop the thoracentesis skills of internal medicine residents to a high level. Use of a thoracentesis model in a structured educational program offering an opportunity for deliberate practice with feedback produced large and consistent improvements in residents' skills. An important finding of our study is that despite having completed most of their internal medicine training, residents displayed poor knowledge and clinical skill in thoracentesis procedures at baseline. This is similar to previous studies showing that the procedural skills and knowledge of physicians at all stages of training are often poor. Examples of areas in which significant gaps were found include basic skills such as chest radiography,27 emergency airway management,8 and pulmonary auscultation.28 In contrast, after the mastery learning program, all the residents met or exceeded the MPS for the thoracentesis clinical procedure and scored much higher on the posttest written examination.

Our data also demonstrate that medical knowledge measured by procedure-specific pretests and posttests and USMLE Steps 1 and 2 scores was not correlated with thoracentesis skill acquisition. This reinforces findings from our previous studies of ACLS skill acquisition10, 11 and supports the distinction between professional and academic achievement. Pretest skill performance and clinical experience also were not correlated with posttest outcomes. However, the amount of deliberate practice needed to reach the mastery standard was a powerful negative predictor of posttest thoracentesis skill scores, replicating our research on ACLS.11 We believe that clinical experience was not correlated with posttest outcomes because residents infrequently performed thoracentesis procedures during their training.

This project demonstrates a practical model for outcomes-based education, certification, and program accreditation. Given the need to move procedural training in internal medicine beyond such historical methods as "see one, do one, teach one,"29 extension of the mastery model to other invasive procedures deserves further study. At our institution we have been encouraged by the ability of simulation-based education in ACLS to promote long-term skill retention30 and improvement in the quality of actual patient care.31 In addition to studying these outcomes for thoracentesis, we plan to incorporate the use of ultrasound when training residents to perform procedures such as thoracentesis and central venous catheter insertion.

Given concerns about the quality of resident preparation to perform invasive procedures, programs such as this should be considered as part of the procedural certification process. As shown by our experience with several classes of residents (n = 158), use of simulation technology to reach high procedural skill levels is effective and feasible in internal medicine residency training. In addition, our residents have consistently enjoyed participating in the simulated training programs. Postcourse questionnaires show that residents agree that deliberate practice with simulation technology complements but does not replace patient care in graduate medical education.5, 10

An important question needing more research is whether performance in a simulated environment transfers to actual clinical settings. Several small studies have demonstrated such a relationship,8, 9, 31, 32 yet the transfer of simulated training to clinical practice requires further study. More work should also be done to assess long‐term retention of skills30 and to determine the utility and benefit of simulation‐based training in procedural certification and credentialing.

This study had several limitations. It was conducted in 1 training program at a single medical center. The sample size (n = 40) was relatively small. The same thoracentesis model was used for both education and testing, which may have inflated posttest performance. However, these limitations do not diminish the pronounced impact that the simulation-based training had on the skills and knowledge of our residents.

In conclusion, this study has demonstrated the ability of deliberate practice using a thoracentesis model to produce high‐level performance of simulated thoracenteses. The project received high ratings from learners and provides reliable assessments of procedural competence. Although internists are performing fewer invasive procedures now than in years past, procedural training is still an important component of internal medicine training.29, 33 Attainment of high procedural skill levels may be especially important for residents who plan to practice hospital medicine. We believe that simulation‐based training using deliberate practice should be a key contributor to future internal medicine residency education, certification, and accreditation.

Acknowledgements

The authors thank Charles Watts, MD, and J. Larry Jameson, MD, PhD, for their support of this work. We recognize and appreciate the Northwestern University internal medicine residents for their dedication to patient care and education.

In a supplement to its inaugural issue, the Journal of Hospital Medicine published core competencies for hospitalists covering 3 areas: clinical conditions, systems in health care, and procedures.1 Completion of a traditional internal medicine residency may not provide hospitalists with the skills necessary to safely perform necessary procedures such as thoracentesis. A recent article reported that most internal medicine residents surveyed were uncomfortable performing common procedures, and their discomfort was higher for thoracentesis than for central line insertion, lumbar puncture, or paracentesis.2 This confirmed a previous report that family practice residents had low confidence in performing thoracenteses.3 Thoracentesis also carries the risk of the potentially life‐threatening complication of pneumothorax, which may be increased when performed by physicians‐in‐training.4

One method for improving training and assessment is the use of simulation technology. Simulation has been used to increase knowledge, provide opportunities for deliberate and safe practice, and shape the development of clinical skills.5, 6 Simulation has also been advocated for assessing competence in procedures including carotid angiography,7 emergency airway management,8 basic bronchoscopy,9 and advanced cardiac life support (ACLS).10, 11

Recently, we used simulation technology to help residents reach mastery learning standards for ACLS.11 Mastery learning,12 an extreme form of competency‐based education,13 implies that learners have acquired the clinical knowledge and skill measured against rigorous achievement standards. In mastery learning, educational results are equivalent, whereas educational practice time differs. To demonstrate mastery learning, we first documented a 38% improvement in skill after a simulation‐based educational intervention10 and used a multidisciplinary panel to determine mastery achievement standards for ACLS skills in 6 clinical scenarios.14 These standards were used in a study in which the amount of time needed to achieve skill mastery was allowed to vary while the skill outcomes of the residents were identical clinically.11

The present study had 4 aims. The first was to assess the baseline skill and knowledge of third‐year residents in thoracentesis. The second was to compare the thoracentesis‐related knowledge and skills of residents before and after an educational intervention. The third was to assess the correlation of medical knowledge and clinical experience with performance on a clinical skills examination after simulation training. The last was to document the feasibility of incorporating simulation‐based education into a training program.

METHODS

Objectives and Design

The study, which had a pretest-posttest design without a control group,15 evaluated a simulation-based, mastery learning educational intervention in thoracentesis. Primary measurements were obtained at baseline (pretest) and after the educational intervention (posttest).

Participants

Study participants were all 40 third‐year residents in the internal medicine residency program at Northwestern University's Chicago campus from January to May 2006. The Northwestern University Institutional Review Board approved the study. Participants provided informed consent before baseline assessment.

This residency program is based at Northwestern Memorial Hospital (NMH) and the Jesse Brown Veteran's Affairs Medical Center. Residents perform thoracenteses under the supervision of second‐ or third‐year residents or faculty members who are credentialed to perform the procedure. A didactic lecture on thoracentesis is part of the annual lecture series.

Procedure

The residents were kept as an intact group during the study period. The research procedure had 2 phases. First, the knowledge and clinical skills of participants at baseline were measured. Second, residents received two 2‐hour education sessions featuring didactic content and deliberate practice using a thoracentesis model. Between 4 and 6 weeks after the pretest, all residents were retested and were expected to meet or exceed a minimum passing score (MPS) on the clinical skills exam. Those who scored below the MPS engaged in more clinical skills practice until the mastery standard was reached. The amount of extra time needed to achieve the MPS was documented.

Educational Intervention

The intervention was designed to help residents acquire the knowledge and skills needed to perform a competent thoracentesis. It contained the components necessary for mastery skill development: deliberate practice, rigorous skills assessment, and the provision of feedback in a supportive environment.16

The study was conducted in the Northwestern University Center for Advanced Surgical Education (N-CASE) using the thoracentesis simulator developed by MediSim Inc. (Alton, Ontario) (http://www.medisim.ca/product.php?id=13). The model features realistic skin texture, ribs, and a fluid-filled reservoir. Needles of various sizes can be inserted and fluid withdrawn. The model also accommodates the catheter/needle apparatus found in the thoracentesis kits (Tyco Healthcare, Pembroke, Bermuda) used at NMH.

Teaching and testing sessions were standardized. In teaching sessions, groups of 2 to 4 residents had 4 hours to practice, ask questions, and receive structured education and feedback from 1 of 2 hospitalist faculty instructors (J.H.B., K.J.O.). One of the 4 hours was devoted to the presentation of didactic material on indications, complications, and interpretation of results and a step-by-step demonstration of a thoracentesis. This presentation was videotaped to ensure standardization of content. The remaining 3 hours were devoted to clinical skills exam education, deliberate practice, and feedback.

One resident was present at each pretest and posttest session with 1 of the 2 faculty instructors who gave standardized instructions. The resident was expected to obtain a relevant history; perform a limited physical examination; review PA, lateral, and decubitus chest radiographs; perform a simulated thoracentesis; and order appropriate diagnostic tests. Written examinations were completed at the pretest and posttest sessions.

Measurements

A 25-item checklist was developed for the thoracentesis procedure using relevant sources17, 18 and rigorous step-by-step procedures.19 Each skill or other action was listed in order, given equal weight, and scored dichotomously: 1 = done correctly, 0 = done incorrectly or not done. Checklists were reviewed for completeness and accuracy by 2 authors who frequently perform and supervise thoracenteses (J.H.B., K.J.O.), 2 authors with expertise in checklist design (D.B.W., W.C.M.), and the physician director of the medical intensive care unit at NMH. The checklist was used in a pilot clinical skills examination of 4 chief medical residents to estimate checklist reliability and face validity.

The MPS for the thoracentesis clinical skills examination was determined by 10 clinical experts using the Angoff and Hofstee standard setting methods. The panel was composed of clinical pulmonary critical care medicine faculty (n = 7) and senior fellows (n = 3). Each panel member was given instruction on standard setting and asked to use the Angoff and Hofstee methods to assign pass/fail standards. The Angoff method asks expert judges to estimate the percentage of borderline examinees who would answer each test item correctly. The Hofstee method requires judges to estimate 4 properties of an evaluation's passing scores and failure rates. The panel was asked to repeat their judgments 6 weeks later to assure stability of the MPS. Details about the use of a standard setting exercise to set an MPS for clinical skills examinations have been published previously.14, 20
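
As an illustration of how such panel judgments become a cut score, the sketch below computes an Angoff cut and averages it with a Hofstee cut, since the study's MPS was the mean of the two methods. The judge ratings and the Hofstee value here are invented for illustration, not taken from the study.

```python
def angoff_cut(judge_item_probs):
    """Angoff method: each judge estimates, per checklist item, the
    probability that a borderline examinee performs it correctly.
    The cut score is the mean over judges of each judge's summed
    probabilities, expressed as a fraction of the items."""
    per_judge = [sum(probs) / len(probs) for probs in judge_item_probs]
    return sum(per_judge) / len(per_judge)

# Hypothetical ratings from 3 judges on a 25-item checklist
judges = [[0.80] * 25, [0.75] * 25, [0.85] * 25]
angoff = angoff_cut(judges)    # 0.80

hofstee = 0.80                 # assumed result of the Hofstee exercise
mps = (angoff + hofstee) / 2   # study used the mean of the two methods
print(round(mps * 100))        # 80, i.e., 20 of 25 checklist items
```

In practice the Hofstee value is itself derived from judges' acceptable cut-score and failure-rate ranges intersected with the observed score distribution; a fixed value is used here only to keep the sketch short.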

Evaluation of each resident's skill was recorded on the checklist by 1 of the 2 faculty raters at the pretest and posttest sessions. A random sample of 50% of the pretest sessions was rescored by a third rater with expertise in scoring clinical skills examinations (D.B.W.) to assess interrater reliability. The rescorer was blinded to the results of the first evaluation.

A multiple-choice written examination was prepared according to examination development guidelines21 using appropriate reference articles and texts.17, 18, 22 The examination was prepared by 1 author (J.H.B.) and reviewed for accuracy and clarity by 2 others (K.J.O., D.B.W.) and by the director of the medical intensive care unit at NMH. The examination had questions on knowledge and comprehension of the procedure as well as data interpretation and application. It was administered to 9 fourth-year medical students and 5 pulmonary/critical care fellows to obtain pilot data. Results of the pilot allowed creation of a pretest and a posttest that were equivalent in content and difficulty.23 The Kuder-Richardson Formula 20 (KR-20) reliability coefficients for the 20-item pretest and the 20-item posttest were .72 and .74, respectively.
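
The KR-20 coefficient used here can be computed directly from dichotomous item responses. A minimal sketch (the response matrix below is invented for illustration):

```python
def kr20(item_matrix):
    """Kuder-Richardson Formula 20 for dichotomously scored (0/1) items.
    item_matrix: one row of item scores per examinee."""
    n = len(item_matrix)       # number of examinees
    k = len(item_matrix[0])    # number of items
    # per-item proportion correct (p); q = 1 - p
    p = [sum(row[i] for row in item_matrix) / n for i in range(k)]
    sum_pq = sum(pi * (1 - pi) for pi in p)
    # population variance of total scores
    totals = [sum(row) for row in item_matrix]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - sum_pq / var)

# Invented responses from 4 examinees on a 3-item test
responses = [[1, 1, 1], [1, 1, 0], [0, 0, 0], [1, 0, 0]]
print(kr20(responses))   # 0.75
```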

Demographic data were obtained from the participants including age, gender, ethnicity, medical school, and scores on the United States Medical Licensing Examination (USMLE) Steps 1 and 2. Each resident's experience performing the procedure was also collected at pretest.

Primary outcome measures were performance on the posttest written and clinical examinations. Secondary outcome measures were the total training time needed to reach the MPS (minimum = 240 minutes) and a course evaluation questionnaire.

Data Analysis

Checklist score reliability was estimated by calculating interrater reliability, the preferred method for assessments that depend on human judges,24 using the kappa (κ) coefficient25 adjusted using the formula of Brennan and Prediger.26 Within-group differences from pretest (baseline) to posttest (outcome) were analyzed using paired t tests. Multiple regression analysis was used to assess the correlation of posttest performance on thoracentesis skills with (1) performance on pretest thoracentesis skills, (2) medical knowledge measured by the thoracentesis pretest and posttest and USMLE Steps 1 and 2, (3) clinical experience in performing thoracentesis, (4) clinical self-confidence about performing thoracentesis, and (5) whether additional training was needed to master the procedure.
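
For reference, the Brennan and Prediger adjustment replaces Cohen's chance-agreement term with a fixed 1/q, where q is the number of rating categories (q = 2 for a dichotomous checklist). A sketch with invented rater data:

```python
def brennan_prediger_kappa(rater_a, rater_b, q=2):
    """Brennan-Prediger coefficient: observed agreement corrected by a
    fixed chance-agreement term of 1/q (q = number of categories)."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_chance = 1 / q
    return (p_obs - p_chance) / (1 - p_chance)

# Invented dichotomous checklist ratings (1 = done correctly) from 2 raters
a = [1, 1, 0, 1, 1, 0, 1, 0]
b = [1, 1, 0, 1, 1, 0, 1, 1]   # raters disagree on 1 of 8 items
print(brennan_prediger_kappa(a, b))   # (0.875 - 0.5) / 0.5 = 0.75
```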

RESULTS

All residents consented to participate and completed the entire training protocol. Table 1 presents demographic data about the residents. Most had limited experience performing and supervising thoracenteses.

Table 1. Baseline Demographic Data From 40 Internal Medicine PGY-3 Residents Participating in a Simulation-Based Training Program on Thoracentesis

Characteristic: PGY-3 Residents (n = 40)
Age (years), mean (SD): 28.88 (1.57)
Male: 23 (57.5%)
Female: 17 (42.5%)
African American: 1 (2.5%)
White: 21 (52.5%)
Asian: 14 (35.0%)
Other: 4 (10.0%)
U.S. medical school graduate: 39 (97.5%)
Foreign medical school graduate: 1 (2.5%)
Number of thoracentesis procedures
  Performed as an intern
    0-1: 27.5%
    2-4: 60.0%
    ≥5: 12.5%
  Performed as a PGY-2 and PGY-3 resident
    0-1: 25.0%
    2-4: 55.0%
    ≥5: 20.0%
  Supervised others as a PGY-2 and PGY-3 resident
    0-1: 27.5%
    2-4: 57.5%
    ≥5: 15.0%

Interrater reliability for the thoracentesis checklist data was calculated at pretest. Across the 25 checklist items, the mean kappa coefficient was very high (κ = .94). The MPS used as the mastery achievement standard was 80% (i.e., 20 of 25 checklist items). This was the mean of the Angoff and Hofstee ratings obtained from the first judgment of the expert panel and is displayed in Figure 1.

Figure 1. Performance on the thoracentesis written exam and clinical skills exam (MPS, minimum passing score).

No resident achieved mastery at pretest. However, 37 of the 40 medicine residents (93%) achieved mastery within the standard 4‐hour thoracentesis curriculum. The remaining 3 residents (7%) needed extra time ranging from 20 to 90 minutes to reach mastery.

Figure 1 presents descriptive statistics of the residents' pretest and posttest performance on the thoracentesis written and clinical skills exams. For the written exam, the mean score rose from 57.63% to 89.75%, a statistically significant improvement of 56% from pretest to posttest (t[39] = 17.0, P < .0001). The clinical skills exam also showed a highly significant 71% pretest-to-posttest gain, as the mean score rose from 51.70% to 88.30% (t[39] = 15.6, P < .0001).
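
The reported relative gains follow directly from the pretest and posttest means, with each gain expressed as a percentage of the pretest mean:

```python
def relative_gain(pre_mean, post_mean):
    """Pretest-to-posttest improvement as a percentage of the pretest mean."""
    return 100 * (post_mean - pre_mean) / pre_mean

print(round(relative_gain(57.63, 89.75)))   # 56  (written exam)
print(round(relative_gain(51.70, 88.30)))   # 71  (clinical skills exam)
```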

Results from the regression analysis indicate that neither pretest performance, medical knowledge measured by local or USMLE examinations, nor thoracentesis clinical experience was correlated with the posttest measure of thoracentesis clinical skills. However, the need for additional practice to reach the mastery standard was a powerful negative predictor of posttest performance: b = −.27 (95% CI, −.46 to −.09; P < .006; r² = .28). For residents who required extra practice time, the initial clinical skills posttest score was 20% lower than that of their peers. Although the need for extra deliberate practice was associated with relatively lower initial posttest scores, all residents ultimately met or exceeded the rigorous thoracentesis MPS.

The responses of the 40 residents on a course evaluation questionnaire were uniformly positive. Responses were recorded on a Likert scale where 1 = strongly disagree, 2 = disagree, 3 = uncertain, 4 = agree, and 5 = strongly agree (Table 2). The data show that residents strongly agreed that practice with the medical simulator boosts clinical skills and self‐confidence, that they received useful feedback from the training sessions, and that deliberate practice using the simulator is a valuable educational experience. Residents were uncertain whether practice with the medical simulator has more educational value than patient care.

Table 2. Course Evaluations Provided by All Residents (n = 40) After the Simulation-Based Educational Program

Item: Mean (SD)
Practice with the thoracentesis model boosts my skills to perform this procedure: 4.3 (0.8)
I receive useful educational feedback from the training sessions: 4.0 (0.6)
Practice with the thoracentesis model boosts my clinical self-confidence: 4.1 (0.9)
Practice with the thoracentesis model has more educational value than patient care experience: 2.3 (1.0)
The Skills Center staff are competent: 4.3 (0.6)
Practice sessions in the Skills Center are a good use of my time: 3.7 (1.0)
Practice sessions using procedural models should be a required component of residency education: 3.8 (0.8)
Deliberate practice using models is a valuable educational experience: 4.0 (0.9)
Practice sessions using models are hard work: 2.1 (0.7)
Increasing the difficulty of simulated clinical problems helps me become a better doctor: 3.9 (0.7)
The controlled environment in the Skills Center helps me focus on clinical education problems: 3.9 (0.8)
Practice with the thoracentesis model has helped to prepare me to perform the procedure better than clinical experience alone: 4.0 (1.0)

DISCUSSION

This study demonstrates the use of a mastery learning model to develop the thoracentesis skills of internal medicine residents to a high level. Use of a thoracentesis model in a structured educational program offering an opportunity for deliberate practice with feedback produced large and consistent improvements in residents' skills. An important finding of our study is that despite having completed most of their internal medicine training, residents displayed poor knowledge and clinical skill in thoracentesis procedures at baseline. This is similar to previous studies showing that the procedural skills and knowledge of physicians at all stages of training are often poor. Examples of areas in which significant gaps were found include basic skills such as chest radiography,27 emergency airway management,8 and pulmonary auscultation.28 In contrast, after the mastery learning program, all the residents met or exceeded the MPS for the thoracentesis clinical procedure and scored much higher on the posttest written examination.

Our data also demonstrate that medical knowledge measured by procedure-specific pretests and posttests and USMLE Steps 1 and 2 scores was not correlated with thoracentesis skill acquisition. This reinforces findings from our previous studies of ACLS skill acquisition10, 11 and supports the distinction between professional and academic achievement. Pretest skill performance and clinical experience also were not correlated with posttest outcomes. However, the amount of deliberate practice needed to reach the mastery standard was a powerful negative predictor of posttest thoracentesis skill scores, replicating our research on ACLS.11 We believe that clinical experience was not correlated with posttest outcomes because residents infrequently performed thoracentesis procedures during their training.

This project demonstrates a practical model for outcomes-based education, certification, and program accreditation. Given the need to move procedural training in internal medicine beyond such historical methods as "see one, do one, teach one,"29 extension of the mastery model to other invasive procedures deserves further study. At our institution we have been encouraged by the ability of simulation-based education in ACLS to promote long-term skill retention30 and improvement in the quality of actual patient care.31 In addition to studying these outcomes for thoracentesis, we plan to incorporate the use of ultrasound when training residents to perform procedures such as thoracentesis and central venous catheter insertion.

Given concerns about the quality of resident preparation to perform invasive procedures, programs such as this should be considered as part of the procedural certification process. As shown by our experience with several classes of residents (n = 158), use of simulation technology to reach high procedural skill levels is effective and feasible in internal medicine residency training. In addition, our residents have consistently enjoyed participating in the simulated training programs. Postcourse questionnaires show that residents agree that deliberate practice with simulation technology complements but does not replace patient care in graduate medical education.5, 10

An important question needing more research is whether performance in a simulated environment transfers to actual clinical settings. Several small studies have demonstrated such a relationship,8, 9, 31, 32 yet the transfer of simulated training to clinical practice requires further study. More work should also be done to assess long‐term retention of skills30 and to determine the utility and benefit of simulation‐based training in procedural certification and credentialing.

This study had several limitations. It was conducted in 1 training program at a single medical center. The sample size (n = 40) was relatively small. The thoracentesis model was used for both education and testing, potentially confounding training with assessment. However, these limitations do not diminish the pronounced impact that the simulation-based training had on the skills and knowledge of our residents.

In conclusion, this study has demonstrated the ability of deliberate practice using a thoracentesis model to produce high‐level performance of simulated thoracenteses. The project received high ratings from learners and provides reliable assessments of procedural competence. Although internists are performing fewer invasive procedures now than in years past, procedural training is still an important component of internal medicine training.29, 33 Attainment of high procedural skill levels may be especially important for residents who plan to practice hospital medicine. We believe that simulation‐based training using deliberate practice should be a key contributor to future internal medicine residency education, certification, and accreditation.

Acknowledgements

The authors thank Charles Watts, MD, and J. Larry Jameson, MD, PhD, for their support of this work. We recognize and appreciate the Northwestern University internal medicine residents for their dedication to patient care and education.

References
  1. Dressler DD, Pistoria MJ, Budnitz TL, McKean SC, Amin AN. Core competencies in hospital medicine: development and methodology. J Hosp Med. 2006;1:48-56.
  2. Huang GC, Smith CC, Gordon CE, et al. Beyond the comfort zone: residents assess their comfort performing inpatient medical procedures. Am J Med. 2006;119:71.e17-71.e24.
  3. Sharp LK, Wang R, Lipsky MS. Perception of competency to perform procedures and future practice intent: a national survey of family practice residents. Acad Med. 2003;78:926-932.
  4. Bartter T, Mayo PD, Pratter MR, Santarelli RJ, Leeds WM, Akers SM. Lower risk and higher yield for thoracentesis when performed by experienced operators. Chest. 1993;103:1873-1876.
  5. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861-866.
  6. Boulet JR, Murray D, Kras J, et al. Reliability and validity of a simulation-based acute care skills assessment for medical students and residents. Anesthesiology. 2003;99:1270-1280.
  7. Patel AD, Gallagher AG, Nicholson WJ, Cates CU. Learning curves and reliability measures for virtual reality simulation in the performance assessment of carotid angiography. J Am Coll Cardiol. 2006;47:1796-1802.
  8. Mayo PH, Hackney JE, Mueck T, Ribaudo V, Schneider RF. Achieving house staff competence in emergency airway management: results of a teaching program using a computerized patient simulator. Crit Care Med. 2004;32:2422-2427.
  9. Blum MG, Powers TW, Sundaresan S. Bronchoscopy simulator effectively prepares junior residents to competently perform basic clinical bronchoscopy. Ann Thorac Surg. 2004;78:287-291.
  10. Wayne DB, Butter J, Siddall VJ, et al. Simulation-based training of internal medicine residents in advanced cardiac life support protocols: a randomized trial. Teach Learn Med. 2005;17:210-216.
  11. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251-256.
  12. Block JH, ed. Mastery Learning: Theory and Practice. New York: Holt, Rinehart and Winston; 1971.
  13. McGaghie WC, Miller GE, Sajid A, Telder TV. Competency-Based Curriculum Development in Medical Education. Public Health Paper No. 68. Geneva, Switzerland: World Health Organization; 1978.
  14. Wayne DB, Fudala MJ, Butter J, et al. Comparison of two standard-setting methods for advanced cardiac life support training. Acad Med. 2005;80(10 Suppl):S63-S66.
  15. Shadish WR, Cook TD, Campbell DT. Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston: Houghton Mifflin; 2002.
  16. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 Suppl):S70-S81.
  17. Sokolowski JW, Burgher LW, Jones FL, Patterson JR, Selecky PA. Guidelines for thoracentesis and needle biopsy of the pleura. This position paper of the American Thoracic Society was adopted by the ATS Board of Directors, June 1988. Am Rev Respir Dis. 1989;140:257-258.
  18. Light RW. Clinical practice. Pleural effusion. N Engl J Med. 2002;346:1971-1977.
  19. Stufflebeam DL. The Checklists Development Checklist. Western Michigan University Evaluation Center, July 2000. Available at: http://www.wmich.edu/evalctr/checklists/cdc.htm. Accessed December 15, 2005.
  20. Downing SM, Tekian A, Yudkowsky R. Procedures for establishing defensible absolute passing scores on performance examinations in health professions education. Teach Learn Med. 2006;18:50-57.
  21. Linn RL, Gronlund NE. Measurement and Assessment in Teaching. 8th ed. Upper Saddle River, NJ: Prentice Hall; 2000.
  22. Light RW. Pleural Diseases. 4th ed. Philadelphia, PA: Lippincott Williams & Wilkins; 2000:821-829.
  23. Downing SM. Reliability: on the reproducibility of assessment data. Med Educ. 2004;38:1006-1012.
  24. Fleiss JL, Levin B, Paik MC. Statistical Methods for Rates and Proportions. 3rd ed. New York: John Wiley & Sons; 2003.
  25. Brennan RL, Prediger DJ. Coefficient kappa: some uses, misuses, and alternatives. Educ Psychol Meas. 1981;41:687-699.
  26. Eisen LA, Berger JS, Hegde A, Schneider RF. Competency in chest radiography: a comparison of medical students, residents, and fellows. J Gen Intern Med. 2006;21:460-465.
  27. Mangione S, Nieman LZ. Pulmonary auscultatory skills during training in internal medicine and family practice. Am J Respir Crit Care Med. 1999;159:1119-1124.
  28. Duffy FD, Holmboe ES. What procedures should internists do? Ann Intern Med. 2007;146:392-393.
  29. Wayne DB, Siddall VJ, Butter J, et al. A longitudinal study of internal medicine residents' retention of advanced cardiac life support (ACLS) skills. Acad Med. 2006;81(10 Suppl):S9-S12.
  30. Wayne DB, Didwania A, Feinglass J, Barsuk J, Fudala M, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest. 2008; [Epub ahead of print].
  31. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002;236:458-464.
  32. Wigton RS, Alguire P. The declining number and variety of procedures done by general internists: a resurvey of members of the American College of Physicians. Ann Intern Med. 2007;146:355-360.
Issue
Journal of Hospital Medicine - 3(1)
Page Number
48-54
Display Headline
Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice
Legacy Keywords
thoracentesis, residency education, mastery learning, simulation-based education

Copyright © 2008 Society of Hospital Medicine

Correspondence Location
Department of Medicine, Northwestern University Feinberg School of Medicine, 251 E. Huron St., Galter 3-150, Chicago, IL 60611; Fax: (312) 926-6905

Editorial

Display Headline
Procedural training at a crossroads: Striking a balance between education, patient safety, and quality

"See one, do one, teach one" is a refrain familiar to all physicians. Historically, most procedural training has occurred at the bedside. In this model, senior residents, subspecialty fellows, or faculty members would demonstrate procedural skills to junior trainees, who would subsequently practice the procedures on patients, often with uneven, risky results. Acquisition of procedural skills by residents and fellows on inpatient wards is suboptimal for at least 2 reasons beyond the risks to patient safety: (1) clinical priorities outweigh educational priorities in this setting, and (2) the patient, not the medical learner, is the most important person in the room.


See one, do one, teach one is a refrain familiar to all physicians. Historically, most procedural training has occurred at the bedside. In this model, senior residents, subspecialty fellows, or faculty members would demonstrate procedural skills to junior trainees, who would subsequently practice the procedures on patients, often with uneven, risky results. Acquisition of procedural skills by residents and fellows on inpatient wards is suboptimal for at least 2 reasons beyond the risks to patient safety: (1) clinical priorities are more important than educational priorities in this setting, and (2) the patient, not the medical learner, is the most important person in the room.

Recently, several new factors have challenged the traditional medical education model. For a variety of reasons, general internists currently perform far fewer invasive procedures than they used to.1 A heightened focus on patient safety and quality raises questions about the qualifications needed to perform invasive procedures. Assessment requirements have also become more stringent. The Accreditation Council for Graduate Medical Education (ACGME) now requires the use of measures that yield reliable and valid data to document the competence of trainees performing invasive procedures.2 In 2006 these factors, and the challenge to educate, assess, and certify residents, prompted the American Board of Internal Medicine to revise its certification requirements and remove the need for technical proficiency in several procedures including paracentesis, central venous catheter placement, and thoracentesis.3, 4

Two studies reported in this issue of the Journal of Hospital Medicine highlight important issues about preparing residents to perform invasive procedures. These include the educational limits of routine clinical care and the challenge to design rigorous educational interventions that improve residents' skills. Miranda and colleagues5 designed a clinical trial to evaluate an educational intervention in which residents practiced insertion of subclavian and internal jugular venous catheters under the supervision of a hospitalist faculty member. The goal was to reduce the frequency of femoral venous catheters placed at their institution. Although residents demonstrated increased knowledge and confidence after the educational intervention, the actual number of subclavian and internal jugular venous catheter insertions was lower in the intervention group, and was rare overall. The intervention did not achieve the stated goal of reducing the number of femoral venous catheters placed by residents. This research highlights that residents cannot be trained to perform invasive procedures through clinical experience alone. In addition, it demonstrates that brief educational interventions are also insufficient. Whether a longer and more robust educational intervention might have shown different results is uncertain, but many experts believe that opportunities for deliberate practice6 using standardized and sustained treatments7 can be a powerful tool to boost the procedural skills of physicians.

At the same institution, Lucas and colleagues studied the impact of a procedural service on the number of invasive procedures performed on a general medicine inpatient service.8 They found a 48% increase in procedure attempts when the procedure service staffed by an experienced faculty member was available. However, no improvement in success rate or reduction in complications was demonstrated. Thus, opportunities for trainees to perform procedures increased, but the presence of a faculty member to provide direct supervision did not improve the quality of the procedures accomplished.

Together these reports highlight challenges and opportunities in training residents to perform invasive procedures. Both studies involved the procedural skills of residents; one used an educational intervention, and the other featured faculty supervision. Both studies produced outcomes that suggest improved procedural training, but neither improved the actual quality of delivered care. A brief educational intervention increased resident confidence and knowledge but did not increase the quality or number of procedures performed by residents. Opportunities to perform invasive procedures increased dramatically when an experienced attending physician was available to supervise residents. However, more education was not provided, and the quality of procedures performed did not improve.

Given these limitations, how should physicians learn to perform invasive procedures? We endorse a systematic approach to achieve high levels of procedural skills in resident physicians. First, procedures should be carefully selected. Only those essential to future practice should be required. If possible, opportunities should be available for selected trainees to develop skills in performing additional procedures relevant to their future careers. An example would be the opportunity for residents in a hospitalist track to develop proficiency in central venous catheter insertion through clinical experience, didactic education, and rigorous skill assessment. Second, dedicated programs are needed to train and assess residents in procedural skills. Reliance on clinical experience alone is inadequate because of the low frequency at which most procedures are performed and the inability to standardize assessments in routine clinical practice.

Simulation technology is a powerful adjunct to traditional clinical training and has been demonstrated to be highly effective in developing procedural skills in disciplines such as endoscopy9 and laparoscopic surgery.10 At our institution, a simulation‐based training program has been used to help residents achieve11 and maintain12 a high level of skill in performing advanced cardiac life support procedures. We use simulation to provide opportunities for deliberate practice in a controlled environment in which immediate feedback is emphasized and mastery levels are reached. The rigorous curriculum is standardized, but learner progress is individualized depending on the practice time needed to achieve competency standards.

Most important, when training physicians to perform invasive procedures, it is critical to use interventions and training programs that can be linked to improvements in actual clinical care. The studies by Miranda et al. and Lucas et al. highlight the utility of focused educational programs to complement clinical training as well as the positive impact of direct faculty supervision. These results are important starting points for programs to consider as they train and certify residents in required procedural skills. However, much work remains to be done. These studies have revealed that improvements in patient care outcomes are not likely to occur unless robust, learner‐centered educational programs are combined with adequate opportunities for residents to perform procedures under appropriate supervision.

References
  1. Wigton RS, Alguire P. The declining number and variety of procedures done by general internists: a resurvey of members of the American College of Physicians. Ann Intern Med. 2007;146:355–360.
  2. Accreditation Council for Graduate Medical Education. Outcome project: general competencies. Available at: http://www.acgme.org/outcome/comp/compFull.asp#1. Accessed January 28, 2007.
  3. American Board of Internal Medicine. Requirements for certification in internal medicine. Available at: http://www.abim.org/cert/policiesim.shtm. Accessed January 28, 2007.
  4. Duffy FD, Holmboe ES. What procedures should internists do? Ann Intern Med. 2007;146:392–393.
  5. Miranda JA, Trick WE, Evans AT, Charles‐Damte M, Reilly BM, Clarke P. Firm‐based trial to improve central venous catheter insertion practices. J Hosp Med. 2007;2:135–142.
  6. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 Suppl):S70–S81.
  7. Cordray DS, Pion GM. Treatment strength and integrity: models and methods. In: Bootzin RR, McKnight PE, eds. Strengthening Research Methodology: Psychological Measurement and Evaluation. Washington, DC: American Psychological Association; 2006:103–124.
  8. Lucas BP, Asbury JK, Wang Y, et al. Impact of a bedside procedure service on general medicine inpatients: a firm‐based trial. J Hosp Med. 2007;2:143–149.
  9. Cohen J, Cohen SA, Vora KC, et al. Multicenter, randomized, controlled trial of virtual‐reality simulator training in acquisition of competency in colonoscopy. Gastrointest Endosc. 2006;64:361–368.
  10. Andreatta PB, Woodrum DT, Birkmeyer JD, et al. Laparoscopic skills are improved with LapMentor training: results of a randomized, double‐blinded study. Ann Surg. 2006;243:854–860.
  11. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251–256.
  12. Wayne DB, Siddall VJ, Butter J, et al. A longitudinal study of internal medicine residents' retention of advanced cardiac life support skills. Acad Med. 2006;81(10 Suppl):S9–S12.
Display Headline
Procedural training at a crossroads: Striking a balance between education, patient safety, and quality
Issue
Journal of Hospital Medicine - 2(3)
Page Number
123-125
Article Source
Copyright © 2007 Society of Hospital Medicine
Correspondence Location
Department of Medicine, 251 E. Huron St., Galter 3‐150, Chicago, IL 60611; Fax: (312) 926‐6905