Vineet M. Arora, MD, MAPP
Pritzker School of Medicine, University of Chicago
Department of Medicine, University of Chicago
Email: varora@medicine.bsd.uchicago.edu

Survey of Hospitalist Supervision

Survey of overnight academic hospitalist supervision of trainees

In 2003, the Accreditation Council for Graduate Medical Education (ACGME) announced the first in a series of guidelines related to the regulation and oversight of residency training.1 The initial iteration focused specifically on the total and consecutive number of duty hours worked by trainees. These limitations began a new era of shift work in internal medicine residency training. With decreases in housestaff admitting capacity, clinical work has frequently been offloaded to non-teaching or attending-only services, increasing the demand for hospitalists to fill the void in physician-staffed care in the hospital.2, 3 Since the implementation of the 2003 ACGME guidelines and a growing focus on patient safety, there has been increased study of, and calls for, oversight of trainees in medicine; among these was the 2008 Institute of Medicine report,4 calling for 24/7 attending-level supervision. The updated ACGME requirements,5 effective July 1, 2011, mandate enhanced on-site supervision of trainee physicians. These new regulations not only define varying levels of supervision for trainees, including direct supervision with the physical presence of a supervisor and the degree of availability of that supervisor, but also describe ensuring the quality of the supervision provided.5 While continuous attending-level supervision is not yet mandated, many residency programs look to their academic hospitalists to fill the supervisory void, particularly at night. However, the specific roles hospitalists play in the nighttime supervision of trainees, and the impact of this supervision, remain unclear. To date, no study has examined a broad sample of hospitalist programs in teaching hospitals and the types of resident oversight they provide. We aimed to describe the current role of academic hospitalists in the clinical supervision of housestaff, specifically during the overnight period, and hospitalist perceptions of how the new ACGME requirements would impact trainee-hospitalist interactions.

METHODS

The Housestaff Oversight Subcommittee, a working group of the Society of General Internal Medicine (SGIM) Academic Hospitalist Task Force, surveyed a sample of academic hospitalist program leaders to assess the current status of trainee supervision performed by hospitalists. Programs were considered academic if they were located in the primary hospital of a residency that participates in the National Resident Matching Program for Internal Medicine. To obtain a broad geographic spectrum of academic hospitalist programs, all programs, both university and community-based, in 4 states and 2 metropolitan regions were sampled: Washington, Oregon, Texas, Maryland, and the Philadelphia and Chicago metropolitan areas. Hospitalist program leaders were identified by members of the Task Force using individual program websites and by querying departmental leadership at eligible teaching hospitals. Respondents were contacted by e-mail for participation. None of the authors of the manuscript participated in the survey.

The survey was developed by consensus of the working group after reviewing the salient literature, and it included additional questions drawn from a survey of internal medicine program directors.6 The 19-item SurveyMonkey instrument included questions about hospitalists' roles in trainees' education and evaluation. A Likert-type scale was used to assess perceptions regarding the impact of on-site hospitalist supervision on trainee autonomy and hospitalist workload (1 = strongly disagree to 5 = strongly agree). Descriptive statistics were calculated and, where appropriate, t tests and Fisher's exact tests were used to identify associations between program characteristics and perceptions. Stata SE (StataCorp, College Station, TX) was used for all statistical analyses.
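To illustrate the comparisons described above, the following is a minimal re-analysis sketch in Python (the published analysis was performed in Stata SE); the file name and column names (formal_night_role, workload_increase, agrees_safety_improves) are hypothetical placeholders, not variables from the actual data set.

```python
# Illustrative sketch only; the study's analysis was done in Stata SE.
# File and column names below are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("hospitalist_survey.csv")  # hypothetical export of responses

# t test: mean Likert agreement that workload will increase, compared between
# programs with and without a formal nighttime supervisory role
formal = df.loc[df["formal_night_role"] == 1, "workload_increase"]
informal = df.loc[df["formal_night_role"] == 0, "workload_increase"]
t_stat, p_val = stats.ttest_ind(formal, informal, nan_policy="omit")
print(f"t = {t_stat:.2f}, P = {p_val:.3f}")

# Fisher's exact test: 2x2 table of a binary program characteristic
# against agreement that patient safety would improve
table = pd.crosstab(df["formal_night_role"], df["agrees_safety_improves"])
odds_ratio, p_fisher = stats.fisher_exact(table.values)
print(f"Fisher's exact P = {p_fisher:.3f}")
```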

RESULTS

The survey was sent to 47 individuals identified as likely hospitalist program leaders and completed by 41 individuals (87%). However, 7 respondents turned out not to be program leaders and were therefore excluded, resulting in a 72% (34/47) survey response rate.

The programs for which we did not obtain responses were similar to respondent programs, and did not include a larger proportion of community-based programs or overrepresent a specific geographic region. Twenty-five (73%) of the 34 hospitalist program leaders were male, with an average age of 44.3 years and an average of 12 years post-residency (range, 5-30 years). They reported leading groups with an average of 18 full-time equivalent (FTE) faculty (range, 3-50 persons).

Relationship of Hospitalist Programs With the Residency Program

The majority (32/34, 94%) of respondents described their program as having traditional housestaff-hospitalist interactions on an attending-covered housestaff teaching service. Other hospitalist clinical roles included: attending on uncovered (non-housestaff) services (29/34, 85%); nighttime coverage (24/34, 70%); and attending on consult services with housestaff (24/34, 70%). All respondents reported that hospitalist faculty are expected to participate in housestaff teaching or to fulfill other educational roles within the residency training program. These educational roles include participating in didactics or educational conferences and serving as advisors. Additionally, the faculty of 30 (88%) programs have a formal evaluative role over the housestaff they supervise on teaching services (eg, serving as members of a formal housestaff evaluation committee). Finally, 28 (82%) programs have faculty who play administrative roles in the residency program, such as involvement in program leadership or recruitment. Although 63% of the corresponding internal medicine residency programs have a formal housestaff supervision policy, only 43% of program leaders stated that their hospitalists receive formal faculty development on how to provide this supervision to resident trainees. Instead, the majority of hospitalist programs were described as having teaching expectations in the absence of a formal policy.

Twenty-one programs (21/34, 61%) described having an attending hospitalist physician on-site overnight to provide ongoing patient care or admit new patients. Of those with on-site attending coverage, a minority of programs (8/21, 38%) reported a formally defined supervisory role for hospitalists over housestaff trainees during the overnight period. In these 8 programs, this defined role included a requirement for housestaff to present newly admitted patients or to contact hospitalists with questions regarding patient management. Twenty-four percent (5/21) of the programs with nighttime coverage stated that the role of the nocturnal attending was only to cover the non-teaching services, without housestaff interaction or supervision. The remaining programs (8/21, 38%) described only informal interactions between housestaff and hospitalist faculty, without clearly defined expectations for supervision.

Perceptions of New Regulations and Night Work

Hospitalist leaders viewed increased supervision of housestaff both positively and negatively. Leaders were asked their level of agreement with the potential impacts of increased hospitalist nighttime supervision. Of respondents, 85% (27/32) agreed that formal overnight supervision by an attending hospitalist would improve patient safety, and 60% (20/33) agreed that formal overnight supervision would improve trainee-hospitalist relationships. In addition, 60% (20/33) of respondents felt that nighttime supervision of housestaff by faculty hospitalists would improve resident education. However, approximately 40% (13/33) expressed concern that increased on-site hospitalist supervision would hamper resident decision-making autonomy, and 75% (25/33) agreed that a formal housestaff supervisory role would increase hospitalist workload. The perception of increased workload was influenced by a hospitalist program's current supervisory role. Hospitalist programs providing formal nighttime supervision for housestaff, compared with those with informal or poorly defined faculty roles, were less likely to perceive the new regulations as resulting in an increase in hospitalist workload (3.72 vs 4.42; P = 0.02). In addition, hospitalist programs with a formal nighttime role were more likely to identify the lack of specific parameters for attending-level contact as a barrier to residents contacting their supervisors during the overnight period (2.54 vs 3.54; P = 0.03). No differences in perception of the regulations were noted for hospitalist programs that had existing faculty development on clinical supervision.

DISCUSSION

This study provides important information about how academic hospitalists currently contribute to the supervision of internal medicine residents. While academic hospitalist groups frequently have faculty providing clinical care on-site at night, and hospitalists often provide overnight supervision of internal medicine trainees, formal supervision of trainees is not uniform, and few hospitalist groups have a mechanism to provide training or faculty development on how to effectively supervise resident trainees. Hospitalist leaders expressed concern that creating additional formal overnight supervisory responsibilities may add to the workload of an already burdened overnight hospitalist. Formalizing this supervisory role, including explicit role definitions and faculty training for trainee supervision, is necessary.

Though our sample size is small, we captured a diverse geographic range of both university and community‐based academic hospitalist programs by surveying group leaders in several distinct regions. We are unable to comment on differences between responding and non‐responding hospitalist programs, but there does not appear to be a systematic difference between these groups.

Our findings are consistent with work describing a lack of structured conceptual frameworks for effectively supervising trainees,7, 8 as well as, at times, nebulous expectations for hospitalist faculty. We found that the existence of a formal supervisory policy within the associated residency program, as well as defined roles for hospitalists, increases the likelihood of positive perceptions of the new ACGME supervisory recommendations. However, the existence of these requirements does not mean that all programs are capable of following them. While additional discussion is required to best delineate a formal overnight hospitalist role in trainee supervision, clearly defining expectations for both faculty and trainees, and for their interactions, may alleviate the struggles that exist in programs with ill-defined roles for hospitalist faculty supervision. Although duty hour standards do not exist for faculty, the additional duties of nighttime coverage for hospitalists suggest that close attention should be paid to burn-out.9 Faculty development on nighttime supervision and teaching may help maximize both learning and patient care efficiency, and may provide a framework for this often unstructured educational time.

Acknowledgements

The research reported here was supported by the Department of Veterans Affairs, Veterans Health Administration, Health Services Research and Development Service (REA 05‐129, CDA 07‐022). The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs.

References
1. Philibert I, Friedman P, Williams WT. New requirements for resident duty hours. JAMA. 2002;288:1112-1114.
2. Nuckols T, Bhattacharya J, Wolman DM, Ulmer C, Escarce J. Cost implications of reduced work hours and workloads for resident physicians. N Engl J Med. 2009;360:2202-2215.
3. Horwitz L. Why have working hour restrictions apparently not improved patient safety? BMJ. 2011;342:d1200.
4. Ulmer C, Wolman DM, Johns MME, eds. Resident Duty Hours: Enhancing Sleep, Supervision, and Safety. Washington, DC: National Academies Press; 2008.
5. Nasca TJ, Day SH, Amis ES; for the ACGME Duty Hour Task Force. The new recommendations on duty hours from the ACGME Task Force. N Engl J Med. 2010;363.
6. Association of Program Directors in Internal Medicine (APDIM) Survey 2009. Available at: http://www.im.org/toolbox/surveys/SurveyDataandReports/APDIMSurveyData/Documents/2009_APDIM_summary_web.pdf. Accessed July 30, 2012.
7. Kennedy TJ, Lingard L, Baker GR, Kitchen L, Regehr G. Clinical oversight: conceptualizing the relationship between supervision and safety. J Gen Intern Med. 2007;22(8):1080-1085.
8. Farnan JM, Johnson JK, Meltzer DO, et al. Strategies for effective on-call supervision for internal medicine residents: the SUPERB/SAFETY model. J Grad Med Educ. 2010;2(1):46-52.
9. Glasheen J, Misky G, Reid M, Harrison R, Sharpe B, Auerbach A. Career satisfaction and burn-out in academic hospital medicine. Arch Intern Med. 2011;171(8):782-785.
Issue
Journal of Hospital Medicine - 7(7)
Page Number
521-523

Article Source
Copyright © 2012 Society of Hospital Medicine
Correspondence Location
Department of Medicine and Pritzker School of Medicine, The University of Chicago, 5841 S Maryland Ave, MC 2007, AMB W216, Chicago, IL 60637

Unprofessional Behavior and Hospitalists

Participation in unprofessional behaviors among hospitalists: A multicenter study

The discrepancy between what is taught about professionalism in formal medical education and what is witnessed in the hospital has received increasing attention.1-7 This latter aspect of medical education contributes to the hidden curriculum and impacts medical trainees' views on professionalism.8 The hidden curriculum refers to the lessons trainees learn through informal interactions within the multilayered educational learning environment.9 A growing body of work examines how the hidden curriculum and disruptive physicians impact the learning environment.9, 10 In response, regulatory agencies, such as the Liaison Committee on Medical Education (LCME) and the Accreditation Council for Graduate Medical Education (ACGME), require training programs and medical schools to maintain standards of professionalism and to regularly evaluate the learning environment and its impact on professionalism.11, 12 In 2011, the ACGME expanded its standards regarding professionalism by requiring that the program director and institution ensure a culture of professionalism that supports patient safety and personal responsibility.11 Given this increasing focus on professionalism in medical school and residency training programs, it is critical to examine faculty perceptions and actions that may perpetuate the discrepancy between the formal and hidden curriculum.13 This early exposure is especially significant because unprofessional behavior in medical school is strongly associated with later disciplinary action by a medical board.14, 15 Certain unprofessional behaviors can also compromise patient care and safety, and can detract from the hospital working environment.16-18

In our previous work, we demonstrated that internal medicine interns reported increased participation in unprofessional behaviors related to on-call etiquette during internship.19, 20 Examples of these behaviors include refusing an admission (ie, blocking) and misrepresenting a test as urgent. Interestingly, students and residents have highlighted the powerful role of supervising faculty physicians in condoning or inhibiting such behavior. Given the increasing role of hospitalists as resident supervisors, it is important to consider the perceptions and actions of hospitalists with respect to perpetuating or hindering some unprofessional behaviors. Although hospital medicine is a relatively new specialty, many hospitalists are in frequent contact with medical trainees, perhaps because many residency programs and medical schools have a strong inpatient focus.21-23 It is thus possible that hospitalists have a major influence on residents' behaviors and views of professionalism. In fact, the Society of Hospital Medicine's Core Competencies for Hospital Medicine explicitly state that hospitalists are expected to serve as a role model for professional and ethical conduct to house staff, medical students, and other members of the interdisciplinary team.24

Therefore, the current study had 2 aims: first, to measure internal medicine hospitalists' perceptions of, and participation in, unprofessional behaviors using a previously validated survey; and second, to examine associations between job characteristics and participation in unprofessional behaviors.

METHODS

Study Design

This was a multi‐institutional, observational study that took place at the University of Chicago Pritzker School of Medicine, Northwestern University Feinberg School of Medicine, and NorthShore University HealthSystem. Hospitalist physicians employed at these hospitals were recruited for this study between June 2010 and July 2010. The Institutional Review Boards of the University of Chicago, Northwestern University, and NorthShore University HealthSystem approved this study. All subjects provided informed consent before participating.

Survey Development and Administration

Based on a prior survey of interns and third-year medical students, a 35-item survey was used to measure perceptions of, and participation in, unprofessional behaviors.8, 19, 20 The original survey was developed in 2005 by medical students who observed behaviors by trainees and faculty that they considered to be unprofessional. The survey was subsequently modified by interns to ascertain unprofessional behavior among interns. For this iteration, hospitalists and study authors at each site reviewed the survey items and adapted each item to ensure relevance to hospitalist work and generalizability across sites. New items were also created to refer specifically to work routinely performed by hospitalist attendings (attesting to resident notes, transferring patients to other services to reduce workload, etc). Because of this, certain items used the jargon hospitalists themselves use to refer to the unprofessional behavior (ie, blocking admissions and turfing), consistent with the literature describing these phenomena.25 Items were also written to elicit the unprofessional nature of the behavior (ie, blocking an admission that could be appropriate for your service).

The final survey (see Supporting Information, Appendix, in the online version of this article) included domains such as interactions with others, interactions with trainees, and patient‐care scenarios. Demographic information and job characteristics were collected including year of residency completion, total amount of clinical work, amount of night work, and amount of administrative work. Hospitalists were not asked whether they completed residency at the institution where they currently work in order to maintain anonymity in the context of a small sample. Instead, they were asked to rate their familiarity with residents at their institution on a Likert‐type scale ranging from very unfamiliar (1) to familiar (3) to very familiar (5). To help standardize levels of familiarity across hospitalists, we developed anchors that corresponded to how well a hospitalist would know resident names with familiar defined as knowing over half of resident names.

Participants reported whether they participated in, or observed, a particular behavior and rated their perception of each behavior from 1 (unprofessional) to 5 (professional), with unprofessional and somewhat unprofessional defined as unprofessional. A site champion administered paper surveys during a routine faculty meeting at each site. An electronic version was administered using SurveyMonkey (SurveyMonkey, Palo Alto, CA) to hospitalists not present at the faculty meeting. Participants chose a unique, nonidentifiable code to facilitate truthful reporting while allowing data tracking in follow‐up studies.

Data Analysis

Clinical time was dichotomized at 50% full-time equivalent (FTE) to define those with less clinical time. Because teaching time was relatively low, with a median of 10% FTE spent on teaching, a cutoff of greater than 10% was used to define greater teaching time. Because many hospitalists engaged in no night work, night work was dichotomized into any night work versus none. Similarly, because many hospitalists had no administrative time, administrative time was split into any administrative work versus none. Lastly, those born after 1970 were classified as younger hospitalists.
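As a concrete illustration of these definitions, here is a minimal sketch assuming the respondent characteristics sit in a pandas DataFrame; the file and column names are hypothetical.

```python
# Sketch of the dichotomizations described above; names are hypothetical.
import pandas as pd

df = pd.read_csv("hospitalist_characteristics.csv")  # hypothetical file

df["less_clinical"] = df["pct_clinical_fte"] < 50     # below 50% FTE clinical time
df["greater_teaching"] = df["pct_teaching_fte"] > 10  # above the 10% FTE median
df["any_night_work"] = df["weeks_nights"] > 0         # any vs no night work
df["any_admin"] = df["pct_admin_fte"] > 0             # any vs no administrative time
df["younger"] = df["birth_year"] > 1970               # born after 1970
```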

Chi-square tests were used to compare site response rates, and descriptive statistics were used to examine demographic characteristics of hospitalist respondents, in addition to perception of, and participation in, unprofessional behaviors. Because items on the survey were highly correlated, we used factor analysis to identify the underlying constructs related to unprofessional behavior.26 Factor analysis is a statistical procedure most often used to explore which variables in a data set are most related, or correlated, to each other. By examining the patterns of similar responses, the underlying factors can be identified and extracted. These factors, by definition, are not correlated with each other. To select the number of factors to retain, the most common convention is the Kaiser criterion: retain all factors with eigenvalues greater than or equal to one.27 An eigenvalue measures the amount of variation across all of the items on the survey that is accounted for by that factor. If a factor has a low eigenvalue (less than 1, by convention), then it contributes little and is ignored, as it is likely redundant with the higher-value factors.

Because the Kaiser criterion often overestimates the number of factors to retain, another method is to use a scree plot, which tends to underestimate the number of factors. Both were used in this study to ensure a stable solution. To name the factors, we examined which items, or groups of items, loaded on (ie, were most highly related to) each factor. To ensure an optimal factor solution, items with minimal participation (less than 3%) were excluded from factor analysis.
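The retention rule can be made concrete with a short sketch, assuming the item responses form a numeric matrix. This illustrates the Kaiser criterion and scree plot described above; it is not the authors' code, and the file name is hypothetical.

```python
# Sketch: eigenvalues of the item correlation matrix, Kaiser criterion, scree plot.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

items = pd.read_csv("participation_items.csv")        # hypothetical item matrix
corr = items.corr().values                             # item correlation matrix
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]  # eigenvalues, descending

n_retain = int((eigenvalues >= 1).sum())               # Kaiser criterion
print(f"Factors retained by Kaiser criterion: {n_retain}")

# Scree plot: look for the "elbow" where the eigenvalues level off
plt.plot(range(1, len(eigenvalues) + 1), eigenvalues, marker="o")
plt.axhline(1, linestyle="--")  # Kaiser cutoff
plt.xlabel("Factor number")
plt.ylabel("Eigenvalue")
plt.title("Scree plot")
plt.show()
```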

Site-adjusted multivariate regression analysis was then used to examine associations between job and demographic characteristics and the identified factors of unprofessional behavior. Models controlled for gender and familiarity with residents. Because sample medians were used to define greater teaching time (>10% FTE), we also performed a sensitivity analysis using different cutoffs for teaching time (>20% FTE and teaching tertiles). Likewise, we used varying definitions of less clinical time to ensure that any statistically significant associations were robust across definitions. All data were analyzed using Stata 11.0 (StataCorp, College Station, TX), and statistical significance was defined as P < 0.05.
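A site-adjusted model of this kind could look like the following sketch (the study used Stata 11.0; the variable names are hypothetical, and the factor 1 score is used as the outcome purely for illustration).

```python
# Sketch of a site-adjusted linear regression for one factor score.
# Column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hospitalist_factors.csv")  # hypothetical merged data set

# Factor 1 score (making fun of others) regressed on job characteristics,
# adjusting for site, gender, and familiarity with residents
model = smf.ols(
    "factor1_score ~ less_clinical + any_night_work + any_admin + "
    "greater_teaching + C(site) + male + familiar_with_residents",
    data=df,
).fit()
print(model.summary())
```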

RESULTS

Seventy-seven of the 101 hospitalists (76.2%) at the 3 sites completed the survey. While response rates varied by site (site 1, 67%; site 2, 74%; site 3, 86%), the differences were not statistically significant (χ2 = 2.9, P = 0.24). Most hospitalists (79.2%) completed residency after 2000. Over half (57.1%) of participants were male, and over half (61%) reported having worked with their current hospitalist group for 1 to 4 years. Almost 60% (59.7%) reported being unfamiliar with residents in the program. Over 40% of hospitalists did not do any night work. Hospitalists were largely clinical: the median clinical time was 80% FTE, and only one-quarter of hospitalists reported working 50% FTE or less clinically. While 78% of hospitalists reported some teaching time, median time on teaching service was low at 10% (Table 1).

Table 1. Demographics of Responders* (n = 77)
Values are reported as total n (%) unless otherwise noted.
  • Abbreviations: IQR, interquartile range.

  • Site differences were observed for clinical practice characteristics, such as number of weeks of teaching service, weeks working nights, clinical time, research time, completed fellowship, and won teaching awards. Due to item nonresponse, number of respondents reporting is listed for each item.

  • Familiarity with residents asked in lieu of whether hospitalist trained at the institution. Familiarity defined as a rating of 4 or 5 on Likert scale ranging from Very Unfamiliar (1) to Very Familiar (5), with Familiar (4) defined further as knowing >50% of residents' names.

Male (%) | 44 (57.1)
Completed residency (%)
  Between 1981 and 1990 | 2 (2.6)
  Between 1991 and 2000 | 14 (18.2)
  After 2000 | 61 (79.2)
Medical school matriculation (%) (n = 76)
  US medical school | 59 (77.6)
  International medical school | 17 (22.3)
Years spent with current hospitalist group (%)
  <1 yr | 14 (18.2)
  1-4 yr | 47 (61.0)
  5-9 yr | 15 (19.5)
  >10 yr | 1 (1.3)
Familiarity with residents (%)
  Familiar | 31 (40.2)
  Unfamiliar | 46 (59.7)
No. of weeks per year spent on (median [IQR])
  Hospitalist practice (n = 72) | 26.0 [16.0-26.0]
  Teaching services (n = 68) | 4.0 [1.0-8.0]
Weeks working nights* (n = 71)
  >2 wk | 16 (22.5)
  1-2 wk | 24 (33.8)
  0 wk | 31 (43.7)
% Clinical time (median [IQR])* (n = 73) | 80 (50-99)
% Teaching time (median [IQR])* (n = 74) | 10 (1-20)
Any research time (%)* (n = 71) | 22 (31.0)
Any administrative time (%) (n = 72) | 29 (40.3)
Completed fellowship (%)* | 12 (15.6)
Won teaching awards (%)* (n = 76) | 21 (27.6)
View a career in hospital medicine as (%)
  Temporary | 11 (14.3)
  Long term | 47 (61.0)
  Unsure | 19 (24.7)

Hospitalists perceived almost all behaviors as unprofessional (unprofessional or somewhat unprofessional on a 5-point Likert scale). The only behavior rated as professional, with a mean of 4.25 (95% CI 4.01-4.49), was staying past shift limit to complete a patient-care task that could have been signed out. This behavior also had the highest level of participation by hospitalists (81.7%). Hospitalists were most ambivalent when rating the professionalism of attending an industry-sponsored dinner or social event (mean 3.20, 95% CI 2.98-3.41) (Table 2).

Table 2. Perception of, and Observation and Participation in, Unprofessional Behaviors Among Hospitalists (n = 77)
Behavior | Reported Perception, Mean (95% CI)* | Reported Participation (%) | Reported Observation (%)
  • Abbreviations: ER, emergency room.

  • Perception rated on Likert scale from 1 (unprofessional) to 5 (professional).

Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) | 2.55 (2.34-2.76) | 67.1 | 80.3
Ordering a routine test as urgent to get it expedited | 2.82 (2.58-3.06) | 62.3 | 80.5
Making fun of other physicians to colleagues | 1.56 (1.39-1.70) | 40.3 | 67.5
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (eg, after the patient is admitted) | 2.01 (1.84-2.19) | 39.5 | 67.1
Signing out patients over the phone at the end of shift when sign-out could have been done in person | 2.95 (2.74-3.16) | 40.8 | 65.8
Texting or using smartphone during educational conferences (ie, noon lecture) | 2.16 (1.95-2.36) | 39.0 | 72.7
Discussing patient information in public spaces | 1.49 (1.34-1.63) | 37.7 | 66.2
Making fun of other attendings to colleagues | 1.62 (1.46-1.78) | 35.1 | 61.0
Deferring family members' concerns about a change in the patient's clinical course to the primary team in order to avoid engaging in such a discussion | 2.16 (1.91-2.40) | 30.3 | 55.3
Making disparaging comments about a patient on rounds | 1.42 (1.27-1.56) | 29.8 | 67.5
Attending an industry (eg, pharmaceutical or equipment/device manufacturer)-sponsored dinner or social event | 3.20 (2.98-3.41) | 28.6 | 60.5
Ignoring family member's nonurgent questions about a cross-cover patient when you had time to answer | 2.05 (1.85-2.25) | 26.3 | 48.7
Attesting to a resident's note when not fully confident of the content of their documentation | 1.65 (1.45-1.85) | 23.4 | 32.5
Making fun of support staff to colleagues | 1.45 (1.31-1.59) | 22.1 | 57.9
Not correcting someone who mistakes a student for a physician | 2.19 (2.01-2.38) | 20.8 | 35.1
Celebrating a blocked admission | 1.80 (1.61-2.00) | 21.1 | 60.5
Making fun of residents to colleagues | 1.53 (1.37-1.70) | 18.2 | 44.2
Coming to work when you have a significant illness (eg, influenza) | 1.99 (1.79-2.19) | 14.3 | 35.1
Celebrating a successful turf | 1.71 (1.51-1.92) | 11.7 | 39.0
Failing to notify the patient that a member of the team made, or is concerned that they made, an error | 1.53 (1.34-1.71) | 10.4 | 20.8
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing) | 1.72 (1.52-1.91) | 9.3 | 58.7
Refusing an admission which could be considered appropriate for your service (eg, blocking) | 1.63 (1.44-1.82) | 7.9 | 68.4
Falsifying patient records (ie, back-dating a note, copying forward unverified information, or documenting physical findings not personally obtained) | 1.22 (1.10-1.34) | 6.5 | 27.3
Making fun of students to colleagues | 1.35 (1.19-1.51) | 6.5 | 24.7
Failing to notify patient-safety or risk management that a member of the team made, or is concerned that they made, an error | 1.64 (1.46-1.82) | 5.2 | 13.2
Introducing a student as a doctor to patients | 1.96 (1.76-2.16) | 3.9 | 20.8
Signing out a procedure or task, that could have been completed during a required shift or by the primary team, in order to go home as early in the day as possible | 1.48 (1.32-1.64) | 3.9 | 48.1
Performing medical or surgical procedures on a patient beyond self-perceived level of skill | 1.27 (1.14-1.41) | 2.6 | 7.8
Asking a student to obtain written consent from a patient or their proxy without supervision (eg, for blood transfusion or minor procedures) | 1.60 (1.42-1.78) | 2.6 | 36.5
Encouraging a student to state that they are a doctor in order to expedite patient care | 1.31 (1.15-1.47) | 2.6 | 6.5
Discharging a patient before they are ready to go home in order to reduce one's census | 1.18 (1.07-1.29) | 2.6 | 19.5
Reporting patient information (eg, labs, test results, exam results) as normal when uncertain of the true results | 1.29 (1.16-1.41) | 2.6 | 15.6
Asking a student to perform medical or surgical procedures which are perceived to be beyond their level of skill | 1.26 (1.12-1.40) | 1.3 | 3.9
Asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge | 1.41 (1.26-1.56) | 0.0 | 15.8

Participation in egregious behaviors, such as falsifying patient records (6.49%) and performing medical or surgical procedures on a patient beyond self-perceived level of skill (2.60%), was very low. The most common behaviors rated as unprofessional that hospitalists reported participating in were having nonmedical/personal conversations in patient corridors (67.1%), ordering a routine test as urgent to expedite care (62.3%), and making fun of other physicians to colleagues (40.3%). Forty percent of participants reported disparaging the emergency room (ER) team or primary care physician for findings later discovered on the floor, signing out over the phone when it could have been done in person, and texting or using smartphones during educational conferences. Participation in unprofessional behaviors involving trainees was close to zero (eg, asking a student to discuss, with patients, medical or surgical information perceived to be beyond their level of knowledge). The least common behaviors that hospitalists reported participating in were discharging a patient before they were ready to go home in order to reduce one's census (2.56%) and reporting patient information as normal when uncertain of the true results (2.60%). As in previous studies of unprofessional behaviors, those who reported participation were less likely to rate the behavior as unprofessional.8, 19

Observation of behaviors ranged from 4% to 80%. In all cases, observation of a behavior was reported at a higher rate than participation in it. Correlation between observation and participation was also high, with the exception of a few behaviors that had zero or near-zero participation rates (eg, reporting patient information as normal when unsure of the true results).
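To make this comparison concrete, the following minimal Python sketch computes an item-level correlation between participation and observation rates using a handful of the rates reported in Table 2. The use of Pearson correlation here is purely illustrative; the manuscript does not specify which correlation method was used.

```python
import numpy as np

# A few per-item rates (%) taken from Table 2, in matching order:
# corridor conversations, routine test as urgent, making fun of physicians,
# celebrating a blocked admission, falsifying records, procedures beyond skill.
participation = np.array([67.1, 62.3, 40.3, 21.1, 6.5, 2.6])
observation = np.array([80.3, 80.5, 67.5, 60.5, 27.3, 7.8])

# Observation was reported more often than participation for every behavior.
assert (observation >= participation).all()

# Item-level Pearson correlation between the two rates.
r = np.corrcoef(participation, observation)[0, 1]
print(f"correlation between participation and observation: r = {r:.2f}")
```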

Factor analysis yielded 4 factors with eigenvalues greater than 1, which were therefore retained and extracted for further analysis. These 4 factors accounted for 76% of the variance in responses reported on the survey. Based on the items or groups of items that loaded most strongly on each factor, the factors were named as follows: factor 1, behaviors related to making fun of others; factor 2, workload management; factor 3, behaviors related to the learning environment; and factor 4, behaviors related to time pressure (Table 3).
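For readers unfamiliar with this retention rule, here is a minimal, illustrative Python sketch of the Kaiser criterion (retain factors whose eigenvalues of the item correlation matrix are at least 1). It is not the authors' analysis code, and the simulated response matrix is a hypothetical stand-in for the actual survey data.

```python
import numpy as np
import pandas as pd

def kaiser_retained_factors(items: pd.DataFrame) -> int:
    """Count the factors retained under the Kaiser criterion.

    Each column of `items` is one survey item (participation coded 0/1);
    a factor is retained when the corresponding eigenvalue of the
    item-by-item correlation matrix is >= 1.
    """
    corr = items.corr()                     # item-by-item correlation matrix
    eigenvalues = np.linalg.eigvalsh(corr)  # eigenvalues of the symmetric matrix
    return int((eigenvalues >= 1).sum())

# Simulated stand-in for the survey responses (77 respondents, 30 items).
rng = np.random.default_rng(0)
responses = pd.DataFrame(
    rng.integers(0, 2, size=(77, 30)),
    columns=[f"item_{i}" for i in range(30)],
)
print(kaiser_retained_factors(responses))
```

Because the Kaiser criterion tends to over-retain factors, it is commonly checked against a scree plot of the same eigenvalues.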

Results of Factor Analysis Displaying Items by Primary Loading
  • NOTE: Items were assigned to the factor on which they loaded most highly. All items shown loaded at 0.4 or above on their assigned factor. Four items were omitted because their loadings were less than 0.4. One item (deferring family questions) cross-loaded on multiple factors. Abbreviations: ER, emergency room.

Factor 1: Making fun of others
Making fun of other physicians (0.78)
Making fun of attendings (0.77)
Making fun of residents (0.70)
Making disparaging comments about a patient on rounds (0.51)
Factor 2: Workload management
Celebrating a successful turf (0.81)
Celebrating a blocked‐admission (0.65)
Coming to work sick (0.56)
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing) (0.51)
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (0.48)
Discharging a patient before they are ready to go home in order to reduce one's census (0.43)
Factor 3: Learning environment
Not correcting someone who mistakes a student for a physician (0.72)
Texting or using smartphone during educational conferences (ie, noon lecture) (0.51)
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error (0.45)
Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) (0.43)
Factor 4: Time pressure
Ignoring family member's nonurgent questions about a cross‐cover patient when you had the time to answer (0.50)
Signing out patients over the phone at the end of shift when sign‐out could have been done in person (0.46)
Attesting to a resident's note when not fully confident of the content of their documentation (0.44)

In site-adjusted multivariable regression, several hospitalist job characteristics were associated with distinct patterns of participation in unprofessional behavior (Table 4). Hospitalists with less clinical time (<50% FTE) were more likely to participate in unprofessional behaviors related to making fun of others (factor 1, β = 0.94, 95% CI 0.32 to 1.56, P < 0.05). Hospitalists with any administrative time were more likely to report participation in behaviors related to workload management (β = 0.61, 95% CI 0.11 to 1.10, P < 0.05). Hospitalists who did any night work were more likely to report participation in unprofessional behaviors related to time pressure (β = 0.67, 95% CI 0.17 to 1.17, P < 0.05). Time devoted to teaching or research was not associated with greater participation in any of the domains of unprofessional behavior surveyed.
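As a rough sketch of how one of these site-adjusted models could be fit, the example below uses Python with statsmodels purely for illustration; the variable names and data file are hypothetical stand-ins, and this is not the authors' analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis frame: one row per hospitalist, with a factor score
# for "making fun of others" plus indicator covariates for job characteristics,
# demographics, and study site (all column names are illustrative).
df = pd.read_csv("hospitalist_factor_scores.csv")  # hypothetical file

model = smf.ols(
    "making_fun_score ~ less_clinical + any_admin + teaching_gt_median"
    " + any_research + any_nights + male + younger + unfamiliar + C(site)",
    data=df,
).fit()

print(model.params)                 # beta coefficients, as in Table 4
print(model.conf_int(alpha=0.05))   # 95% confidence intervals
```

Each of the four factors would be modeled with its own regression of this form, one outcome per model.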

Association Between Hospitalist Job and Demographic Characteristics and Factors of Unprofessional Behavior
Model | Making Fun of Others | Learning Environment | Workload Management | Time Pressure
Predictor | Beta [95% CI] | Beta [95% CI] | Beta [95% CI] | Beta [95% CI]
  • NOTE: Table shows the results of 4 different multivariable linear regression models, which examine the association between various covariates (job characteristics, demographic characteristics, and site) and the factors of participation in unprofessional behaviors (making fun of others, learning environment, workload management, and time pressure). Due to item nonresponse, n = 63 for all regression models. Abbreviations: CI, confidence interval.

  • *P < 0.05.

  • Less clinical was defined as less than 50% full‐time equivalent (FTE) in a given year spent on clinical work.

  • Teaching was defined as greater than the median (10% FTE) spent on teaching. Results did not change when using tertiles of teaching effort, or a cutoff at teaching greater than 20% FTE.

  • Administrative time, research time, and nights were defined as reporting any administrative time, research time, or night work, respectively (greater than 0% per year).

  • Younger was defined as those born after 1970.

Job characteristics
Less clinical | 0.94 [0.32, 1.56]* | 0.01 [0.66, 0.64] | 0.17 [0.84, 0.49] | 0.39 [0.24, 1.01]
Administrative | 0.30 [0.16, 0.76] | 0.06 [0.43, 0.54] | 0.61 [0.11, 1.10]* | 0.26 [0.20, 0.72]
Teaching | 0.01 [0.49, 0.48] | 0.09 [0.60, 0.42] | 0.12 [0.64, 0.40] | 0.16 [0.33, 0.65]
Research | 0.30 [0.87, 0.27] | 0.38 [0.98, 0.22] | 0.37 [0.98, 0.24] | 0.13 [0.45, 0.71]
Any nights | 0.08 [0.58, 0.42] | 0.24 [0.28, 0.77] | 0.24 [0.29, 0.76] | 0.67 [0.17, 1.17]*
Demographic characteristics
Male | 0.06 [0.42, 0.53] | 0.03 [0.47, 0.53] | 0.05 [0.56, 0.47] | 0.40 [0.89, 0.08]
Younger | 0.05 [0.79, 0.69] | 0.64 [1.42, 0.14] | 0.87 [0.07, 1.67]* | 0.62 [0.13, 1.37]
Unfamiliar with residents | 0.32 [0.85, 0.22] | 0.32 [0.89, 0.24] | 0.13 [0.45, 0.70] | 0.47 [0.08, 1.01]
Institution
Site 1 | 0.58 [0.22, 1.38] | 0.05 [0.89, 0.79] | 1.01 [0.15, 1.86]* | 0.77 [1.57, 0.04]
Site 3 | 0.11 [0.68, 0.47] | 0.70 [1.31, 0.09]* | 0.43 [0.20, 1.05] | 0.45 [0.13, 1.04]
Constant | 0.03 [0.99, 1.06] | 0.94 [0.14, 2.02] | 1.23 [2.34, 0.13]* | 1.34 [2.39, 0.31]*

The only demographic characteristic significantly associated with unprofessional behavior was age: hospitalists born after 1970 were more likely to participate in unprofessional behaviors related to workload management (β = 0.87, 95% CI 0.07 to 1.67, P < 0.05). Site differences were also present. One site was more likely to report participation in unprofessional behaviors related to workload management (site 1, β = 1.01, 95% CI 0.15 to 1.86, P < 0.05), while another site was less likely to report participation in behaviors related to the learning environment (site 3, β = -0.70, 95% CI -1.31 to -0.09, P < 0.05). Gender and familiarity with residents were not significant predictors of participation in unprofessional behaviors. Results remained robust in sensitivity analyses using different cutoffs of clinical time and teaching time.

DISCUSSION

This multisite study adds to what is known about the perceptions of, and participation in, unprofessional behaviors among internal medicine hospitalists. Hospitalists perceived almost all surveyed behaviors as unprofessional. Participation in egregious and trainee‐related unprofessional behaviors was very low. Four categories appeared to explain the variability in how hospitalists reported participation in unprofessional behaviors: making fun of others, workload management, learning environment, and time pressure. Participation in behaviors within these factors was associated with certain job characteristics, such as clinical time, administrative time, and night work, as well as age and site.

It is reassuring that participation in egregious and trainee-related unprofessional behaviors was very low. It is noteworthy, however, that attending an industry-sponsored dinner was not considered unprofessional, which was surprising given increasing external pressure to report and ban such interactions.28 The perception that attending such dinners is acceptable may reflect a lag between current practice and national recommendations.

It is important to explore why certain job characteristics are associated with participation in unprofessional behaviors. For example, hospitalists with less clinical time were more likely to participate in making fun of others. One explanation is that hospitalists with more clinical time make a greater effort to develop and maintain positive working relationships. Another is that hospitalists with less clinical time are more easily influenced by others in the learning environment who make fun of people, such as the residents whom they supervise for only brief periods.

For unprofessional behaviors related to workload management, younger hospitalists and those with any administrative time were more likely to participate in behaviors such as celebrating a blocked admission. Our prior work shows that behaviors related to workload management are more widespread in residency; younger hospitalists, who are often recent residency graduates, may therefore be more prone to these behaviors. While unproven, it is possible that hospitalists with administrative time face competing priorities from their administrative roles that motivate them to manage their workload more actively, leading them to participate in workload management behaviors.

Hospitalists who did any night work were more likely to participate in unprofessional behaviors related to time pressure. This could reflect the high workloads that night hospitalists may face and the pressure they feel to wrap up work, resulting in a hasty handoff (eg, signing out over the phone) or deferred work (eg, ignoring family questions). Site differences were also observed for participation in behaviors related to the learning environment, underscoring the importance of institutional culture.

It is worth noting that hospitalists who taught were no less likely to report participating in these behaviors. While 78% of hospitalists reported some teaching, the median reported teaching effort was 10% FTE. This level of teaching likely reflects the diverse nature of hospitalist work. While hospitalists spend some time working with trainees, services that are not staffed with residents (eg, uncovered services) are becoming increasingly common as resident duty hours are further restricted. This may explain why 60% of hospitalists reported being unfamiliar with residents. We also set a high bar for familiarity, defined as knowing over half of residents by name, which served as a proxy for having trained at the institution where one currently works. Although hospitalists reported devoting a low fraction of their total clinical time to resident services, a significant fraction of resident services were staffed by hospitalists at all sites, making hospitalists a natural target for interventions.

These results have implications for future work to assess and improve professionalism in the hospital learning environment. First, interventions to address unprofessional behaviors should focus on behaviors with the highest participation rates. As in our earlier studies of residents, participation was high for certain behaviors, such as misrepresenting a test as urgent or disparaging the ER team or primary care physician (PCP) for a missed finding.19, 20 While blocking an admission was common in our studies of residents, reported participation among hospitalists was low. Similar to a prior study of clinical-year medical students at one of our sites, 1 in 5 hospitalists reported not correcting someone who mistakes a student for a physician, highlighting the role hospitalists may play in perpetuating this behavior.8 Additionally, addressing the behaviors identified in this study through novel curricular tools may help teach residents many of the interpersonal and communication skills called for in the 2011 ACGME Common Program Requirements.11 The ACGME requirements also expect faculty to model how to manage their time before, during, and after clinical assignments, and to recognize that transferring a patient to a rested provider is best. Given that most hospitalists view staying past a shift limit as professional, these requirements will be difficult to adopt without widespread culture change.

Moreover, interventions could be tailored to hospitalists with certain job characteristics. Such interventions may be educational or systems based. An example of the former is stressing the impact of the learning and working environment on trainees; an example of the latter is streamlining the process by which ordered tests are executed so that tests are completed more promptly, which may leave fewer physicians feeling compelled to misrepresent a routine test as urgent. Additionally, hospitalists with less clinical time could receive education on their impact as role models for trainees. Hospitalists who are younger or who have administrative commitments could be trained on the importance of avoiding behaviors related to workload management, such as blocking or turfing patients. Lastly, given the site differences, critical examination of institutional culture and policies is also important. With funding from the American Board of Internal Medicine (ABIM) Foundation, we are currently creating an educational intervention targeting the behaviors that were most frequent among hospitalists and residents at our institutions, to promote dialogue and critical reflection and, we hope, to reduce the most prevalent behaviors encountered.

There are several limitations to this study. Despite the anonymity of the survey, participants may have inaccurately reported their participation in unprofessional behaviors because of socially desirable responding. In addition, because we used factor analysis and multivariable regression models with a small sample, item nonresponse limited the sample available for the regression analyses and raises concern for response bias. However, all significant associations remained significant after backward stepwise elimination of covariates with P > 0.10, which permitted larger models (n = 65 to 69). Because we relied on self-report rather than direct observation of participation in unprofessional behaviors, it is not possible to validate the responses given; future work could use 360-degree evaluations or other methods to do so. It will also be important to assess whether these behaviors are associated with patient outcomes, such as length of stay or readmission. Some items may not always be unprofessional; for example, texting during an educational conference might be done to advance patient care. The order in which the questions were asked could also have introduced bias. We asked about participation before perception to try to limit biased reporting of participation; reversing the order could have resulted in under-reporting of participation in behaviors that respondents perceived as unprofessional. This study was conducted at 3 institutions located in Chicago, limiting generalizability to institutions outside this area. Only internal medicine hospitalists were surveyed, which also limits generalizability to other disciplines and specialties within internal medicine. Lastly, hospitalists are not the sole teachers on inpatient services; residents encounter a variety of faculty who serve as teaching attendings. Future work should expand to other centers and other specialties.

In conclusion, in this multi-institutional study of hospitalists, participation in egregious behaviors was low. Four factors, or patterns, underlie hospitalists' reported participation in unprofessional behavior: making fun of others, learning environment, workload management, and time pressure. Job characteristics (clinical time, administrative time, night work), age, and site were all associated with different patterns of unprofessional behavior. Specifically, hospitalists with less clinical time were more likely to make fun of others; younger hospitalists, as well as those with any administrative work, were more likely to participate in behaviors related to workload management; and hospitalists who worked nights were more likely to report behaviors related to time pressure. Interventions to promote professionalism should take institutional culture into account and should focus on behaviors with the highest participation rates. Efforts should also be made to address the underlying reasons for participation in these behaviors.

Acknowledgements

The authors thank Meryl Prochaska for her research assistance and manuscript preparation.

Disclosures: The authors acknowledge funding from the ABIM Foundation and the Pritzker Summer Research Program. The funders had no role in the design of the study; the collection, analysis, and interpretation of the data; or the decision to approve publication of the finished manuscript. Prior presentations of the data include the 2010 University of Chicago Pritzker School of Medicine Summer Research Forum, the 2010 University of Chicago Pritzker School of Medicine Medical Education Day, the 2010 Midwest Society of Hospital Medicine Meeting in Chicago, IL, and the 2011 Society of Hospital Medicine National Meeting in Dallas, TX. All authors disclose no relevant or financial conflicts of interest.

References
  1. Stern DT. Practicing what we preach? An analysis of the curriculum of values in medical education. Am J Med. 1998;104:569-575.
  2. Borgstrom E, Cohn S, Barclay S. Medical professionalism: conflicting values for tomorrow's doctors. J Gen Intern Med. 2010;25(12):1330-1336.
  3. Karnieli-Miller O, Vu TR, Holtman MC, Clyman SG, Inui TS. Medical students' professionalism narratives: a window on the informal and hidden curriculum. Acad Med. 2010;85(1):124-133.
  4. Cohn FG, Shapiro J, Lie DA, Boker J, Stephens F, Leung LA. Interpreting values conflicts experienced by obstetrics-gynecology clerkship students using reflective writing. Acad Med. 2009;84(5):587-596.
  5. Gaiser RR. The teaching of professionalism during residency: why it is failing and a suggestion to improve its success. Anesth Analg. 2009;108(3):948-954.
  6. Gofton W, Regehr G. What we don't know we are teaching: unveiling the hidden curriculum. Clin Orthop Relat Res. 2006;449:20-27.
  7. Hafferty FW. Definitions of professionalism: a search for meaning and identity. Clin Orthop Relat Res. 2006;449:193-204.
  8. Reddy ST, Farnan JM, Yoon JD, et al. Third-year medical students' participation in and perceptions of unprofessional behaviors. Acad Med. 2007;82:S35-S39.
  9. Hafferty FW. Beyond curriculum reform: confronting medicine's hidden curriculum. Acad Med. 1998;73:403-407.
  10. Pfifferling JH. Physicians' "disruptive" behavior: consequences for medical quality and safety. Am J Med Qual. 2008;23:165-167.
  11. Accreditation Council for Graduate Medical Education. Common Program Requirements: General Competencies. Available at: http://www.acgme.org/acwebsite/home/common_program_requirements_07012011.pdf. Accessed December 19, 2011.
  12. Liaison Committee on Medical Education. Functions and Structure of a Medical School. Available at: http://www.lcme.org/functions2010jun.pdf. Accessed June 30, 2010.
  13. Gillespie C, Paik S, Ark T, Zabar S, Kalet A. Residents' perceptions of their own professionalism and the professionalism of their learning environment. J Grad Med Educ. 2009;1:208-215.
  14. Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med. 2004;79:244-249.
  15. Papadakis MA, Teherani A, Banach MA, et al. Disciplinary action by medical boards and prior behavior in medical school. N Engl J Med. 2005;353:2673-2682.
  16. Rosenstein AH, O'Daniel M. A survey of the impact of disruptive behaviors and communication defects on patient safety. Jt Comm J Qual Patient Saf. 2008;34:464-471.
  17. Rosenstein AH, O'Daniel M. Managing disruptive physician behavior: impact on staff relationships and patient care. Neurology. 2008;70:1564-1570.
  18. The Joint Commission. Behaviors that undermine a culture of safety. Sentinel Event Alert. 2008. Available at: http://www.jointcommission.org/assets/1/18/SEA_40.PDF. Accessed April 28, 2012.
  19. Arora VM, Wayne DB, Anderson RA, Didwania A, Humphrey HJ. Participation in and perceptions of unprofessional behaviors among incoming internal medicine interns. JAMA. 2008;300:1132-1134.
  20. Arora VM, Wayne DB, Anderson RA, et al. Changes in perception of and participation in unprofessional behaviors during internship. Acad Med. 2010;85:S76-S80.
  21. Wachter RM. Reflections: the hospitalist movement a decade later. J Hosp Med. 2006;1:248-252.
  22. Society of Hospital Medicine. 2007-2008 Bi-Annual Survey. 2008. Available at: http://www.medscape.org/viewarticle/578134. Accessed April 28, 2012.
  23. Holmboe ES, Bowen JL, Green M, et al. Reforming internal medicine residency training. A report from the Society of General Internal Medicine's Task Force for Residency Reform. J Gen Intern Med. 2005;20:1165-1172.
  24. Society of Hospital Medicine. The Core Competencies in Hospital Medicine: a framework for curriculum development by the Society of Hospital Medicine. J Hosp Med. 2006;1(suppl 1):25.
  25. Caldicott CV, Dunn KA, Frankel RM. Can patients tell when they are unwanted? "Turfing" in residency training. Patient Educ Couns. 2005;56:104-111.
  26. Costello AB, Osborn JW. Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Pract Assess Res Eval. 2005;10:1-9.
  27. Principal Components and Factor Analysis. StatSoft Electronic Statistics Textbook. Available at: http://www.statsoft.com/textbook/principal-components-factor-analysis/. Accessed December 30, 2011.
  28. Brennan TA, Rothman DJ, Blank L, et al. Health industry practices that create conflicts of interest: a policy proposal for academic medical centers. JAMA. 2006;295(4):429-433.

The discrepancy between what is taught about professionalism in formal medical education and what is witnessed in the hospital has received increasing attention.17 This latter aspect of medical education contributes to the hidden curriculum and impacts medical trainees' views on professionalism.8 The hidden curriculum refers to the lessons trainees learn through informal interactions within the multilayered educational learning environment.9 A growing body of work examines how the hidden curriculum and disruptive physicians impact the learning environment.9, 10 In response, regulatory agencies, such as the Liaison Committee on Medical Education (LCME) and Accreditation Council for Graduate Medical Education (ACGME), require training programs and medical schools to maintain standards of professionalism, and to regularly evaluate the learning environment and its impact on professionalism.11, 12 The ACGME in 2011 expanded its standards regarding professionalism by making certain that the program director and institution ensure a culture of professionalism that supports patient safety and personal responsibility.11 Given this increasing focus on professionalism in medical school and residency training programs, it is critical to examine faculty perceptions and actions that may perpetuate the discrepancy between the formal and hidden curriculum.13 This early exposure is especially significant because unprofessional behavior in medical school is strongly associated with later disciplinary action by a medical board.14, 15 Certain unprofessional behaviors can also compromise patient care and safety, and can detract from the hospital working environment.1618

In our previous work, we demonstrated that internal medicine interns reported increased participation in unprofessional behaviors regarding on‐call etiquette during internship.19, 20 Examples of these behaviors include refusing an admission (ie, blocking) and misrepresenting a test as urgent. Interestingly, students and residents have highlighted the powerful role of supervising faculty physicians in condoning or inhibiting such behavior. Given the increasing role of hospitalists as resident supervisors, it is important to consider the perceptions and actions of hospitalists with respect to perpetuating or hindering some unprofessional behaviors. Although hospital medicine is a relatively new specialty, many hospitalists are in frequent contact with medical trainees, perhaps because many residency programs and medical schools have a strong inpatient focus.2123 It is thus possible that hospitalists have a major influence on residents' behaviors and views of professionalism. In fact, the Society of Hospital Medicine's Core Competencies for Hospital Medicine explicitly state that hospitalists are expected to serve as a role model for professional and ethical conduct to house staff, medical students and other members of the interdisciplinary team.24

Therefore, the current study had 2 aims: first, to measure internal medicine hospitalists' perceptions of, and participation in, unprofessional behaviors using a previously validated survey; and second, to examine associations between job characteristics and participation in unprofessional behaviors.

METHODS

Study Design

This was a multi‐institutional, observational study that took place at the University of Chicago Pritzker School of Medicine, Northwestern University Feinberg School of Medicine, and NorthShore University HealthSystem. Hospitalist physicians employed at these hospitals were recruited for this study between June 2010 and July 2010. The Institutional Review Boards of the University of Chicago, Northwestern University, and NorthShore University HealthSystem approved this study. All subjects provided informed consent before participating.

Survey Development and Administration

Based on a prior survey of interns and third‐year medical students, a 35‐item survey was used to measure perceptions of, and participation in, unprofessional behaviors.8, 19, 20 The original survey was developed in 2005 by medical students who observed behaviors by trainees and faculty that they considered to be unprofessional. The survey was subsequently modified by interns to ascertain unprofessional behavior among interns. For this iteration, hospitalists and study authors at each site reviewed the survey items and adapted each item to ensure relevance to hospitalist work and also generalizability to site. New items were also created to refer specifically to work routinely performed by hospitalist attendings (attesting to resident notes, transferring patients to other services to reduce workload, etc). Because of this, certain items utilized jargon to refer to the unprofessional behavior as hospitalists do (ie, blocking admissions and turfing), and resonate with literature describing these phenomena.25 Items were also written in such a fashion to elicit the unprofessional nature (ie, blocking an admission that could be appropriate for your service).

The final survey (see Supporting Information, Appendix, in the online version of this article) included domains such as interactions with others, interactions with trainees, and patient‐care scenarios. Demographic information and job characteristics were collected including year of residency completion, total amount of clinical work, amount of night work, and amount of administrative work. Hospitalists were not asked whether they completed residency at the institution where they currently work in order to maintain anonymity in the context of a small sample. Instead, they were asked to rate their familiarity with residents at their institution on a Likert‐type scale ranging from very unfamiliar (1) to familiar (3) to very familiar (5). To help standardize levels of familiarity across hospitalists, we developed anchors that corresponded to how well a hospitalist would know resident names with familiar defined as knowing over half of resident names.

Participants reported whether they participated in, or observed, a particular behavior and rated their perception of each behavior from 1 (unprofessional) to 5 (professional), with unprofessional and somewhat unprofessional defined as unprofessional. A site champion administered paper surveys during a routine faculty meeting at each site. An electronic version was administered using SurveyMonkey (SurveyMonkey, Palo Alto, CA) to hospitalists not present at the faculty meeting. Participants chose a unique, nonidentifiable code to facilitate truthful reporting while allowing data tracking in follow‐up studies.

Data Analysis

Clinical time was dichotomized using above and below 50% full‐time equivalents (FTE) to define those that did less clinical. Because teaching time was relatively low with the median percent FTE spent on teaching at 10%, we used a cutoff of greater than 10% as greater teaching. Because many hospitalists engaged in no night work, night work was reported as those who engaged in any night work and those who did not. Similarly, because many hospitalists had no administrative time, administrative time was split into those with any administrative work and those without any administrative work. Lastly, those born after 1970 were classified as younger hospitalists.

Chi‐square tests were used to compare site response rates, and descriptive statistics were used to examine demographic characteristics of hospitalist respondents, in addition to perception of, and participation in, unprofessional behaviors. Because items on the survey were highly correlated, we used factor analysis to identify the underlying constructs that related to unprofessional behavior.26 Factor analysis is a statistical procedure that is most often used to explore which variables in a data set are most related or correlated to each other. By examining the patterns of similar responses, the underlying factors can be identified and extracted. These factors, by definition, are not correlated with each other. To select the number of factors to retain, the most common convention is to use Kaiser criterion, or retain all factors with eigenvalues greater than, or equal to, one.27 An eigenvalue measures the amount of variation in all of the items on the survey which is accounted for by that factor. If a factor has a low eigenvalue (less than 1 is the convention), then it is contributing little and is ignored, as it is likely redundant with the higher value factors.

Because use of Kaiser criterion often overestimates the number of factors to retain, another method is to use a scree plot which tends to underestimate the factors. Both were used in this study to ensure a stable solution. To name the factors, we examined which items or group of items loaded or were most highly related to which factor. To ensure an optimal factor solution, items with minimal participation (less than 3%) were excluded from factor analysis.

Then, site‐adjusted multivariate regression analysis was used to examine associations between job and demographic characteristics, and the factors of unprofessional behavior identified. Models controlled for gender and familiarity with residents. Because sample medians were used to define greater teaching (>10% FTE), we also performed a sensitivity analysis using different cutoffs for teaching time (>20% FTE and teaching tertiles). Likewise, we also used varying definitions of less clinical time to ensure that any statistically significant associations were robust across varying definitions. All data were analyzed using STATA 11.0 (Stata Corp, College Station, TX) and statistical significance was defined as P < 0.05.

RESULTS

Seventy‐seven of the 101 hospitalists (76.2%) at 3 sites completed the survey. While response rates varied by site (site 1, 67%; site 2, 74%; site 3, 86%), the differences were not statistically significant (2 = 2.9, P = 0.24). Most hospitalists (79.2%) completed residency after 2000. Over half (57.1%) of participants were male, and over half (61%) reported having worked with their current hospitalist group from 1 to 4 years. Almost 60% (59.7%) reported being unfamiliar with residents in the program. Over 40% of hospitalists did not do any night work. Hospitalists were largely clinical, one‐quarter of hospitalists reported working over 50% FTE, and the median was 80% FTE. While 78% of hospitalists reported some teaching time, median time on teaching service was low at 10% (Table 1).

Demographics of Responders* (n = 77)
 Total n (%)
  • Abbreviations: IQR, interquartile range.

  • Site differences were observed for clinical practice characteristics, such as number of weeks of teaching service, weeks working nights, clinical time, research time, completed fellowship, and won teaching awards. Due to item nonresponse, number of respondents reporting is listed for each item.

  • Familiarity with residents asked in lieu of whether hospitalist trained at the institution. Familiarity defined as a rating of 4 or 5 on Likert scale ranging from Very Unfamiliar (1) to Very Familiar (5), with Familiar (4) defined further as knowing >50% of residents' names.

Male (%)44 (57.1)
Completed residency (%)
Between 1981 and 19902 (2.6)
Between 1991 and 200014 (18.2)
After 200061 (79.2)
Medical school matriculation (%) (n = 76) 
US medical school59 (77.6)
International medical school17 (22.3)
Years spent with current hospitalist group (%)
<1 yr14 (18.2)
14 yr47 (61.0)
59 yr15 (19.5)
>10 yr1 (1.3)
Familiarity with residents (%)
Familiar31 (40.2)
Unfamiliar46 (59.7)
No. of weeks per year spent on (median IQR)
Hospitalist practice (n = 72)26.0 [16.026.0]
Teaching services (n = 68)4.0 [1.08.0]
Weeks working nights* (n = 71)
>2 wk16 (22.5)
12 wk24 (33.8)
0 wk31 (43.7)
% Clinical time (median IQR)* (n = 73)80 (5099)
% Teaching time (median IQR)* (n = 74)10 (120)
Any research time (%)* (n = 71)22 (31.0)
Any administrative time (%) (n = 72)29 (40.3)
Completed fellowship (%)*12 (15.6)
Won teaching awards (%)* (n = 76)21 (27.6)
View a career in hospital medicine as (%)
Temporary11 (14.3)
Long term47 (61.0)
Unsure19 (24.7)

Hospitalists perceived almost all behaviors as unprofessional (unprofessional or somewhat unprofessional on a 5‐point Likert Scale). The only behavior rated as professional with a mean of 4.25 (95% CI 4.014.49) was staying past shift limit to complete a patient‐care task that could have been signed out. This behavior also had the highest level of participation by hospitalists (81.7%). Hospitalists were most ambivalent when rating professionalism of attending an industry‐sponsored dinner or social event (mean 3.20, 95% CI 2.983.41) (Table 2).

Perception of, and Observation and Participation in, Unprofessional Behaviors Among Hospitalists (n = 77)
BehaviorReported Perception (Mean Likert score)*Reported Participation (%)Reported Observation (%)
  • Abbreviations: ER, emergency room.

  • Perception rated on Likert scale from 1 (unprofessional) to 5 (professional).

Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans)2.55 (2.342.76)67.180.3
Ordering a routine test as urgent to get it expedited2.82 (2.583.06)62.380.5
Making fun of other physicians to colleagues1.56 (1.391.70)40.367.5
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (eg, after the patient is admitted)2.01 (1.842.19)39.567.1
Signing out patients over the phone at the end of shift when sign‐out could have been done in person2.95 (2.743.16)40.865.8
Texting or using smartphone during educational conferences (ie, noon lecture)2.16 (1.952.36)39.072.7
Discussing patient information in public spaces1.49 (1.341.63)37.766.2
Making fun of other attendings to colleagues1.62 (1.461.78)35.161.0
Deferring family members' concerns about a change in the patient's clinical course to the primary team in order to avoid engaging in such a discussion2.16 (1.912.40)30.355.3
Making disparaging comments about a patient on rounds1.42 (1.271.56)29.867.5
Attending an industry (eg, pharmaceutical or equipment/device manufacturer)‐sponsored dinner or social event3.20 (2.983.41)28.660.5
Ignoring family member's nonurgent questions about a cross‐cover patient when you had time to answer2.05 (1.852.25)26.348.7
Attesting to a resident's note when not fully confident of the content of their documentation1.65 (1.451.85)23.432.5
Making fun of support staff to colleagues1.45 (1.311.59)22.157.9
Not correcting someone who mistakes a student for a physician2.19 (2.012.38)20.835.1
Celebrating a blocked‐admission1.80 (1.612.00)21.160.5
Making fun of residents to colleagues1.53 (1.371.70)18.244.2
Coming to work when you have a significant illness (eg, influenza)1.99 (1.792.19)14.335.1
Celebrating a successful turf1.71 (1.511.92)11.739.0
Failing to notify the patient that a member of the team made, or is concerned that they made, an error1.53 (1.341.71)10.420.8
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing)1.72 (1.521.91)9.358.7
Refusing an admission which could be considered appropriate for your service (eg, blocking)1.63 (1.441.82)7.968.4
Falsifying patient records (ie, back‐dating a note, copying forward unverified information, or documenting physical findings not personally obtained)1.22 (1.101.34)6.527.3
Making fun of students to colleagues1.35 (1.191.51)6.524.7
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error1.64 (1.461.82)5.213.2
Introducing a student as a doctor to patients1.96 (1.762.16)3.920.8
Signing‐out a procedure or task, that could have been completed during a required shift or by the primary team, in order to go home as early in the day as possible1.48 (1.321.64)3.948.1
Performing medical or surgical procedures on a patient beyond self‐perceived level of skill1.27 (1.141.41)2.67.8
Asking a student to obtain written consent from a patient or their proxy without supervision (eg, for blood transfusion or minor procedures)1.60 (1.421.78)2.636.5
Encouraging a student to state that they are a doctor in order to expedite patient care1.31 (1.151.47)2.66.5
Discharging a patient before they are ready to go home in order to reduce one's census1.18 (1.071.29)2.619.5
Reporting patient information (eg, labs, test results, exam results) as normal when uncertain of the true results1.29 (1.161.41)2.615.6
Asking a student to perform medical or surgical procedures which are perceived to be beyond their level of skill1.26 (1.121.40)1.33.9
Asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge1.41 (1.261.56)0.015.8

Participation in egregious behaviors, such as falsifying patient records (6.49%) and performing medical or surgical procedures on a patient beyond self‐perceived level of skill (2.60%), was very low. The most common behaviors rated as unprofessional that hospitalists reported participating in were having nonmedical/personal conversations in patient corridors (67.1%), ordering a routine test as urgent to expedite care (62.3%), and making fun of other physicians to colleagues (40.3%). Forty percent of participants reported disparaging the emergency room (ER) team or primary care physician for findings later discovered, signing out over the phone when it could have been done in person, and texting or using smartphones during educational conferences. In particular, participation in unprofessional behaviors related to trainees was close to zero (eg, asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge). The least common behaviors that hospitalists reported participating in were discharging a patient before they are ready to go home in order to reduce one's census (2.56%) and reporting patient information as normal when uncertain of the true results (2.60%). Like previous studies of unprofessional behaviors, those that reported participation were less likely to report the behavior as unprofessional.8, 19

Observation of behaviors ranged from 4% to 80%. In all cases, observation of the behavior was reported at a higher level than participation. Correlation between observation and participation was also high, with the exception of a few behaviors that had zero or near zero participation rates (ie, reporting patient information as normal when unsure of true results.)

After performing factor analysis, 4 factors had eigenvalues greater than 1 and were therefore retained and extracted for further analysis. These 4 factors accounted for 76% of the variance in responses reported on the survey. By examining which items or groups of items most strongly loaded on each factor, the factors were named accordingly: factor 1 referred to behaviors related to making fun of others, factor 2 referred to workload management, factor 3 referred to behaviors related to the learning environment, and factor 4 referred to behaviors related to time pressure (Table 3).

Results of Factor Analysis Displaying Items by Primary Loading
  • NOTE: Items were categorized using factor analysis to the factor that they loaded most highly on. All items shown loaded at 0.4 or above onto each factor. Four items were omitted due to loadings less than 0.4. One item cross‐loaded on multiple factors (deferring family questions). Abbreviations: ER, emergency room.

Factor 1: Making fun of others
Making fun of other physicians (0.78)
Making fun of attendings (0.77)
Making fun of residents (0.70)
Making disparaging comments about a patient on rounds (0.51)
Factor 2: Workload management
Celebrating a successful turf (0.81)
Celebrating a blocked‐admission (0.65)
Coming to work sick (0.56)
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing.) (0.51)
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (0.48)
Discharging a patient before they are ready to go home in order to reduce one's census (0.43)
Factor 3: Learning environment
Not correcting someone who mistakes a student for a physician (0.72)
Texting or using smartphone during educational conferences (ie, noon lecture) (0.51)
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error (0.45)
Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) (0.43)
Factor 4: Time pressure
Ignoring family member's nonurgent questions about a cross‐cover patient when you had the time to answer (0.50)
Signing out patients over the phone at the end of shift when sign‐out could have been done in person (0.46)
Attesting to a resident's note when not fully confident of the content of their documentation (0.44)

Using site‐adjusted multivariate regression, certain hospitalist job characteristics were associated with certain patterns of participating in unprofessional behavior (Table 4). Those with less clinical time (<50% FTE) were more likely to participate in unprofessional behaviors related to making fun of others (factor 1, value = 0.94, 95% CI 0.32 to 1.56, P value <0.05). Hospitalists who had any administrative time ( value = 0.61, 95% CI 0.111.10, P value <0.05) were more likely to report participation in behaviors related to workload management. Hospitalists engaged in any night work were more likely to report participation in unprofessional behaviors related to time pressure ( value = 0.67, 95% CI 0.171.17, P value <0.05). Time devoted to teaching or research was not associated with greater participation in any of the domains of unprofessional behavior surveyed.

Association Between Hospitalist Job and Demographic Characteristics and Factors of Unprofessional Behavior
ModelMaking Fun of OthersLearning EnvironmentWorkload ManagementTime Pressure
PredictorBeta [95% CI]Beta [95% CI]Beta [95% CI]Beta [95% CI]
  • NOTE: Table shows the results of 4 different multivariable linear regression models, which examine the association between various covariates (job characteristics, demographic characteristics, and site) and factors of participation in unprofessional behaviors (communication, patient safety, workload). Due to item nonresponse, n = 63 for all regression models. Abbreviations: CI, confidence interval.

  • P < 0.05.

  • Less clinical was defined as less than 50% full‐time equivalent (FTE) in a given year spent on clinical work.

  • Teaching was defined as greater than the median (10% FTE) spent on teaching. Results did not change when using tertiles of teaching effort, or a cutoff at teaching greater than 20% FTE.

  • Administrative time, research time, and nights were defined as reporting any administrative time, research time, or night work, respectively (greater than 0% per year).

  • Younger was defined as those born after 1970.

Job characteristics
Less clinical0.94 [0.32, 1.56]*0.01 [0.66, 0.64]0.17 [0.84, 0.49]0.39 [0.24, 1.01]
Administrative0.30 [0.16, 0.76]0.06 [0.43, 0.54]0.61 [0.11, 1.10]*0.26 [0.20, 0.72]
Teaching0.01 [0.49, 0.48]0.09 [0.60, 0.42]0.12 [0.64, 0.40]0.16 [0.33, 0.65]
Research0.30 [0.87, 0.27]0.38 [0.98, 0.22]0.37 [0.98, 0.24]0.13 [0.45, 0.71]
Any nights0.08 [0.58, 0.42]0.24 [0.28, 0.77]0.24 [0.29, 0.76]0.67 [0.17,1.17]*
Demographic characteristics
Male0.06 [0.42, 0.53]0.03 [0.47, 0.53]0.05 [0.56, 0.47]0.40 [0.89, 0.08]
Younger0.05 [0.79, 0.69]0.64 [1.42, 0.14]0.87 [0.07, 1.67]*0.62 [0.13, 1.37]
Unfamiliar with residents0.32 [0.85, 0.22]0.32 [0.89, 0.24]0.13 [0.45, 0.70]0.47 [0.08, 1.01]
Institution
Site 10.58 [0.22, 1.38]0.05 [0.89, 0.79]1.01 [0.15, 1.86]*0.77 [1.57, 0.04]
Site 30.11 [0.68, 0.47]0.70 [1.31, 0.09]*0.43 [0.20, 1.05]0.45 [0.13, 1.04]
Constant0.03 [0.99, 1.06]0.94 [0.14, 2.02]1.23[2.34, 0.13]*1.34[2.39, 0.31]*

The only demographic characteristic that was significantly associated with unprofessional behavior was age. Specifically, those who were born after 1970 were more likely to participate in unprofessional behaviors related to workload management ( value = 0.87, 95% CI 0.071.67, P value <0.05). Site differences were also present. Specifically, one site was more likely to report participation in unprofessional behaviors related to workload management ( value site 1 = 1.01, 95% CI 0.15 to 1.86, P value <0.05), while another site was less likely to report participation in behaviors related to the learning environment ( value site 3 = 0.70, 95% CI 1.31 to 0.09, P value <0.05). Gender and familiarity with residents were not significant predictors of participation in unprofessional behaviors. Results remained robust in sensitivity analyses using different cutoffs of clinical time and teaching time.

DISCUSSION

This multisite study adds to what is known about the perceptions of, and participation in, unprofessional behaviors among internal medicine hospitalists. Hospitalists perceived almost all surveyed behaviors as unprofessional. Participation in egregious and trainee‐related unprofessional behaviors was very low. Four categories appeared to explain the variability in how hospitalists reported participation in unprofessional behaviors: making fun of others, workload management, learning environment, and time pressure. Participation in behaviors within these factors was associated with certain job characteristics, such as clinical time, administrative time, and night work, as well as age and site.

It is reassuring that participation in, and trainee‐related, unprofessional behaviors is very low, and it is noteworthy that attending an industry‐sponsored dinner is not considered unprofessional. This was surprising in the setting of increased external pressures to report and ban such interactions.28 Perception that attending such dinners is acceptable may reflect a lag between current practice and national recommendations.

It is important to explore why certain job characteristics are associated with participation in unprofessional behaviors. For example, those with less clinical time were more likely to participate in making fun of others. It may be the case that hospitalists with more clinical time may make a larger effort to develop and maintain positive relationships. Another possible explanation is that hospitalists with less clinical time are more easily influenced by those in the learning environment who make fun of others, such as residents who they are supervising for only a brief period.

For unprofessional behaviors related to workload management, those who were younger, and those with any administrative time, were more likely to participate in behaviors such as celebrating a blocked‐admission. Our prior work shows that behaviors related to workload management are more widespread in residency, and therefore younger hospitalists, who are often recent residency graduates, may be more prone to participating in these behaviors. While unproven, it is possible that those with more administrative time may have competing priorities with their administrative roles, which motivate them to more actively manage their workload, leading them to participate in workload management behaviors.

Hospitalists who did any night work were more likely to participate in unprofessional behaviors related to time pressure. This could reflect the high workloads that night hospitalists may face and the pressure they feel to wrap up work, resulting in a hasty handoff (ie, over the phone) or to defer work (ie, family questions). Site differences were also observed for participation in behaviors related to the learning environment, speaking to the importance of institutional culture.

It is worth mentioning that hospitalists who were teachers were not any less likely to report participating in certain behaviors. While 78% of hospitalists reported some level of teaching, the median reported percentage of teaching was 10% FTE. This level of teaching likely reflects the diverse nature of work in which hospitalists engage. While hospitalists spend some time working with trainees, services that are not staffed with residents (eg, uncovered services) are becoming increasingly common due to stricter resident duty hour restrictions. This may explain why 60% of hospitalists reported being unfamiliar with residents. We also used a high bar for familiarity, which we defined as knowing half of residents by name, and served as a proxy for those who may have trained at the institution where they currently work. In spite of hospitalists reporting a low fraction of their total clinical time devoted to resident services, a significant fraction of resident services were staffed by hospitalists at all sites, making them a natural target for interventions.

These results have implications for future work to assess and improve professionalism in the hospital learning environment. First, interventions to address unprofessional behaviors should focus on behaviors with the highest participation rates. Like our earlier studies of residents, participation is high in certain behaviors, such as misrepresenting a test as urgent, or disparaging the ER or primary care physician (PCP) for a missed finding.19, 20 While blocking an admission was common in our studies of residents, reported participation among hospitalists was low. Similar to a prior study of clinical year medical students at one of our sites, 1 in 5 hospitalists reported not correcting someone who mistakes a student for a physician, highlighting the role that hospitalists may have in perpetuating this behavior.8 Additionally, addressing the behaviors identified in this study, through novel curricular tools, may help to teach residents many of the interpersonal and communication skills called for in the 2011 ACGME Common Program Requirements.11 The ACGME requirements also include the expectation that faculty model how to manage their time before, during, and after clinical assignments, and recognize that transferring a patient to a rested provider is best. Given that most hospitalists believe staying past shift limit is professional, these requirements will be difficult to adopt without widespread culture change.

Moreover, interventions could be tailored to hospitalists with certain job characteristics. Interventions may be educational or systems based. An example of the former is stressing the impact of the learning and working environment on trainees, and an example of the latter is streamlining the process in which ordered tests are executed to result in a more timely completion of tests. This may result in fewer physicians misrepresenting a test as urgent in order to have the test done in a timely manner. Additionally, hospitalists with less clinical time could receive education on their impact as a role model for trainees. Hospitalists who are younger or with administrative commitments could be trained on the importance of avoiding behaviors related to workload management, such as blocking or turfing patients. Lastly, given the site differences, critical examination of institutional culture and policies is also important. With funding from the American Board of Internal Medicine (ABIM) Foundation, we are currently creating an educational intervention, targeting those behaviors that were most frequent among hospitalists and residents at our institutions to promote dialogue and critical reflection, with the hope of reducing the most prevalent behaviors encountered.

The discrepancy between what is taught about professionalism in formal medical education and what is witnessed in the hospital has received increasing attention.1–7 This latter aspect of medical education contributes to the hidden curriculum and impacts medical trainees' views on professionalism.8 The hidden curriculum refers to the lessons trainees learn through informal interactions within the multilayered educational learning environment.9 A growing body of work examines how the hidden curriculum and disruptive physicians impact the learning environment.9, 10 In response, regulatory agencies, such as the Liaison Committee on Medical Education (LCME) and Accreditation Council for Graduate Medical Education (ACGME), require training programs and medical schools to maintain standards of professionalism and to regularly evaluate the learning environment and its impact on professionalism.11, 12 In 2011, the ACGME expanded its professionalism standards by requiring that the program director and institution ensure a culture of professionalism that supports patient safety and personal responsibility.11 Given this increasing focus on professionalism in medical school and residency training programs, it is critical to examine faculty perceptions and actions that may perpetuate the discrepancy between the formal and hidden curriculum.13 This exposure during training is especially significant because unprofessional behavior in medical school is strongly associated with later disciplinary action by a medical board.14, 15 Certain unprofessional behaviors can also compromise patient care and safety, and can detract from the hospital working environment.16–18

In our previous work, we demonstrated that internal medicine interns reported increased participation in unprofessional behaviors regarding on‐call etiquette during internship.19, 20 Examples of these behaviors include refusing an admission (ie, blocking) and misrepresenting a test as urgent. Interestingly, students and residents have highlighted the powerful role of supervising faculty physicians in condoning or inhibiting such behavior. Given the increasing role of hospitalists as resident supervisors, it is important to consider the perceptions and actions of hospitalists with respect to perpetuating or hindering some unprofessional behaviors. Although hospital medicine is a relatively new specialty, many hospitalists are in frequent contact with medical trainees, perhaps because many residency programs and medical schools have a strong inpatient focus.21–23 It is thus possible that hospitalists have a major influence on residents' behaviors and views of professionalism. In fact, the Society of Hospital Medicine's Core Competencies for Hospital Medicine explicitly state that hospitalists are expected to serve as a role model for professional and ethical conduct to house staff, medical students and other members of the interdisciplinary team.24

Therefore, the current study had 2 aims: first, to measure internal medicine hospitalists' perceptions of, and participation in, unprofessional behaviors using a previously validated survey; and second, to examine associations between job characteristics and participation in unprofessional behaviors.

METHODS

Study Design

This was a multi‐institutional, observational study that took place at the University of Chicago Pritzker School of Medicine, Northwestern University Feinberg School of Medicine, and NorthShore University HealthSystem. Hospitalist physicians employed at these hospitals were recruited for this study between June 2010 and July 2010. The Institutional Review Boards of the University of Chicago, Northwestern University, and NorthShore University HealthSystem approved this study. All subjects provided informed consent before participating.

Survey Development and Administration

Based on a prior survey of interns and third‐year medical students, a 35‐item survey was used to measure perceptions of, and participation in, unprofessional behaviors.8, 19, 20 The original survey was developed in 2005 by medical students who observed behaviors by trainees and faculty that they considered to be unprofessional. The survey was subsequently modified by interns to ascertain unprofessional behavior among interns. For this iteration, hospitalists and study authors at each site reviewed the survey items and adapted each item to ensure relevance to hospitalist work and generalizability across sites. New items were also created to refer specifically to work routinely performed by hospitalist attendings (attesting to resident notes, transferring patients to other services to reduce workload, etc). Because of this, certain items used the jargon hospitalists themselves use for these behaviors (ie, blocking admissions and turfing), consistent with the literature describing these phenomena.25 Items were also worded to make the unprofessional nature of the behavior explicit (ie, blocking an admission that could be appropriate for your service).

The final survey (see Supporting Information, Appendix, in the online version of this article) included domains such as interactions with others, interactions with trainees, and patient‐care scenarios. Demographic information and job characteristics were collected including year of residency completion, total amount of clinical work, amount of night work, and amount of administrative work. Hospitalists were not asked whether they completed residency at the institution where they currently work in order to maintain anonymity in the context of a small sample. Instead, they were asked to rate their familiarity with residents at their institution on a Likert‐type scale ranging from very unfamiliar (1) to familiar (3) to very familiar (5). To help standardize levels of familiarity across hospitalists, we developed anchors that corresponded to how well a hospitalist would know resident names with familiar defined as knowing over half of resident names.

Participants reported whether they participated in, or observed, a particular behavior and rated their perception of each behavior from 1 (unprofessional) to 5 (professional), with unprofessional and somewhat unprofessional defined as unprofessional. A site champion administered paper surveys during a routine faculty meeting at each site. An electronic version was administered using SurveyMonkey (SurveyMonkey, Palo Alto, CA) to hospitalists not present at the faculty meeting. Participants chose a unique, nonidentifiable code to facilitate truthful reporting while allowing data tracking in follow‐up studies.

Data Analysis

Clinical time was dichotomized at 50% full‐time equivalent (FTE), with those below 50% defined as having less clinical time. Because teaching time was relatively low, with a median of 10% FTE, we used a cutoff of greater than 10% to define greater teaching. Because many hospitalists engaged in no night work, night work was dichotomized into those who did any night work and those who did none. Similarly, because many hospitalists had no administrative time, administrative time was split into those with any administrative work and those without. Lastly, those born after 1970 were classified as younger hospitalists.
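
As an illustration only (the study's analysis was performed in Stata, and the survey data set is not public), these dichotomizations could be expressed as in the following sketch; the variable names are hypothetical.

import pandas as pd

# Hypothetical analytic variables for a few respondents.
df = pd.DataFrame({
    "clinical_fte_pct": [80, 40, 95, 30],
    "teaching_fte_pct": [10, 25, 0, 15],
    "nights_weeks":     [0, 2, 4, 0],
    "admin_fte_pct":    [0, 20, 0, 10],
    "birth_year":       [1975, 1968, 1980, 1972],
})

# Dichotomizations as described in the text.
df["less_clinical"]    = df["clinical_fte_pct"] < 50   # <50% FTE clinical
df["greater_teaching"] = df["teaching_fte_pct"] > 10   # >10% FTE teaching (sample median)
df["any_nights"]       = df["nights_weeks"] > 0        # any night work
df["any_admin"]        = df["admin_fte_pct"] > 0       # any administrative time
df["younger"]          = df["birth_year"] > 1970       # born after 1970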

Chi‐square tests were used to compare site response rates, and descriptive statistics were used to examine demographic characteristics of hospitalist respondents, in addition to perception of, and participation in, unprofessional behaviors. Because items on the survey were highly correlated, we used factor analysis to identify the underlying constructs related to unprofessional behavior.26 Factor analysis is a statistical procedure most often used to explore which variables in a data set are most related, or correlated, to each other. By examining the patterns of similar responses, the underlying factors can be identified and extracted. These factors, by definition, are not correlated with each other. To select the number of factors to retain, the most common convention is the Kaiser criterion: retain all factors with eigenvalues greater than or equal to one.27 An eigenvalue measures the amount of variation across all of the items on the survey that is accounted for by that factor. If a factor has a low eigenvalue (less than 1 is the convention), it contributes little and is ignored, as it is likely redundant with the higher‐value factors.

Because the Kaiser criterion often overestimates the number of factors to retain, another method is to use a scree plot, which tends to underestimate the number of factors. Both were used in this study to ensure a stable solution. To name the factors, we examined which items or groups of items loaded on, or were most highly related to, each factor. To ensure an optimal factor solution, items with minimal participation (less than 3%) were excluded from factor analysis.
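
The retention logic can be sketched in code. The following is an illustration only, not the study's analysis (which was performed in Stata): it uses simulated Likert-type responses, the third-party factor_analyzer package, and a varimax rotation that the paper does not specify.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from factor_analyzer import FactorAnalyzer  # third-party package, not part of the study

# Simulated stand-in for the survey: 77 respondents x 30 items rated 1-5.
rng = np.random.default_rng(0)
items = pd.DataFrame(rng.integers(1, 6, size=(77, 30)),
                     columns=[f"item_{i+1}" for i in range(30)])

# Step 1: eigenvalues of the item correlation matrix drive retention.
fa = FactorAnalyzer(rotation=None)
fa.fit(items)
eigenvalues, _ = fa.get_eigenvalues()

# Kaiser criterion: retain factors with eigenvalue >= 1 (4 factors in the study).
n_retain = int((eigenvalues >= 1).sum())

# Scree plot: inspect the "elbow" as a check on the Kaiser criterion.
plt.plot(range(1, len(eigenvalues) + 1), eigenvalues, marker="o")
plt.axhline(1, linestyle="--")
plt.xlabel("Factor number")
plt.ylabel("Eigenvalue")
plt.title("Scree plot")
plt.show()

# Step 2: refit with the retained factors and name each factor from the items
# that load most highly on it (loadings >= 0.4, as in Table 3).
fa = FactorAnalyzer(n_factors=n_retain, rotation="varimax")
fa.fit(items)
loadings = pd.DataFrame(fa.loadings_, index=items.columns)
print(loadings.round(2))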

Then, site‐adjusted multivariate regression analysis was used to examine associations between job and demographic characteristics and the factors of unprofessional behavior identified. Models controlled for gender and familiarity with residents. Because sample medians were used to define greater teaching (>10% FTE), we also performed a sensitivity analysis using different cutoffs for teaching time (>20% FTE and teaching tertiles). Likewise, we used varying definitions of less clinical time to ensure that any statistically significant associations were robust. All data were analyzed using Stata 11.0 (StataCorp, College Station, TX), and statistical significance was defined as P < 0.05.
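
As a rough sketch of this step (again with simulated data and hypothetical variable names, since the study itself used Stata 11.0), each factor score can be regressed on the dichotomized job characteristics, demographics, and site indicators:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data; the analytic sample in the study was n = 63
# after item nonresponse.
rng = np.random.default_rng(1)
n = 63
df = pd.DataFrame({
    "factor_time_pressure": rng.normal(size=n),    # factor score from the factor analysis
    "less_clinical": rng.integers(0, 2, n),        # <50% FTE clinical
    "any_admin": rng.integers(0, 2, n),            # any administrative time
    "greater_teaching": rng.integers(0, 2, n),     # >10% FTE teaching
    "any_research": rng.integers(0, 2, n),         # any research time
    "any_nights": rng.integers(0, 2, n),           # any night work
    "male": rng.integers(0, 2, n),
    "younger": rng.integers(0, 2, n),              # born after 1970
    "unfamiliar_residents": rng.integers(0, 2, n),
    "site": rng.integers(1, 4, n),                 # 3 study sites
})

# One linear model per factor score; site enters as a categorical term,
# which is what makes the model "site adjusted" (time-pressure factor shown).
model = smf.ols(
    "factor_time_pressure ~ less_clinical + any_admin + greater_teaching"
    " + any_research + any_nights + male + younger + unfamiliar_residents"
    " + C(site)",
    data=df,
).fit()
print(model.params)      # betas, analogous to Table 4
print(model.conf_int())  # 95% confidence intervals

# Sensitivity analysis: rerun after redefining greater_teaching (>20% FTE or
# tertiles) and less_clinical with alternative cutoffs, and confirm that the
# significant associations persist.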

RESULTS

Seventy‐seven of the 101 hospitalists (76.2%) at the 3 sites completed the survey. While response rates varied by site (site 1, 67%; site 2, 74%; site 3, 86%), the differences were not statistically significant (χ2 = 2.9, P = 0.24). Most hospitalists (79.2%) completed residency after 2000. Over half (57.1%) of participants were male, and over half (61%) reported having worked with their current hospitalist group for 1 to 4 years. Almost 60% (59.7%) reported being unfamiliar with residents in the program. Over 40% of hospitalists did not do any night work. Hospitalists were largely clinical: only one‐quarter reported clinical time of 50% FTE or less, and the median was 80% FTE. While 78% of hospitalists reported some teaching time, the median time on teaching service was low at 10% (Table 1).

Demographics of Responders* (n = 77)
Characteristic: Total, n (%)
  • Abbreviations: IQR, interquartile range.

  • Site differences were observed for clinical practice characteristics, such as number of weeks of teaching service, weeks working nights, clinical time, research time, completed fellowship, and won teaching awards. Due to item nonresponse, the number of respondents reporting is listed for each item.

  • Familiarity with residents asked in lieu of whether the hospitalist trained at the institution. Familiarity defined as a rating of 4 or 5 on a Likert scale ranging from Very Unfamiliar (1) to Very Familiar (5), with Familiar (4) defined further as knowing >50% of residents' names.

Male (%): 44 (57.1)
Completed residency (%)
  Between 1981 and 1990: 2 (2.6)
  Between 1991 and 2000: 14 (18.2)
  After 2000: 61 (79.2)
Medical school matriculation (%) (n = 76)
  US medical school: 59 (77.6)
  International medical school: 17 (22.3)
Years spent with current hospitalist group (%)
  <1 yr: 14 (18.2)
  1–4 yr: 47 (61.0)
  5–9 yr: 15 (19.5)
  >10 yr: 1 (1.3)
Familiarity with residents (%)
  Familiar: 31 (40.2)
  Unfamiliar: 46 (59.7)
No. of weeks per year spent on, median [IQR]
  Hospitalist practice (n = 72): 26.0 [16.0–26.0]
  Teaching services (n = 68): 4.0 [1.0–8.0]
Weeks working nights* (n = 71)
  >2 wk: 16 (22.5)
  1–2 wk: 24 (33.8)
  0 wk: 31 (43.7)
% Clinical time, median [IQR]* (n = 73): 80 [50–99]
% Teaching time, median [IQR]* (n = 74): 10 [1–20]
Any research time (%)* (n = 71): 22 (31.0)
Any administrative time (%) (n = 72): 29 (40.3)
Completed fellowship (%)*: 12 (15.6)
Won teaching awards (%)* (n = 76): 21 (27.6)
View a career in hospital medicine as (%)
  Temporary: 11 (14.3)
  Long term: 47 (61.0)
  Unsure: 19 (24.7)

Hospitalists perceived almost all behaviors as unprofessional (unprofessional or somewhat unprofessional on a 5‐point Likert scale). The only behavior rated as professional, with a mean of 4.25 (95% CI 4.01–4.49), was staying past shift limit to complete a patient‐care task that could have been signed out. This behavior also had the highest level of participation by hospitalists (81.7%). Hospitalists were most ambivalent when rating the professionalism of attending an industry‐sponsored dinner or social event (mean 3.20, 95% CI 2.98–3.41) (Table 2).

Perception of, and Observation and Participation in, Unprofessional Behaviors Among Hospitalists (n = 77)
Behavior | Reported Perception, mean (95% CI)* | Reported Participation (%) | Reported Observation (%)
  • Abbreviations: ER, emergency room.

  • Perception rated on a Likert scale from 1 (unprofessional) to 5 (professional).

Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) | 2.55 (2.34–2.76) | 67.1 | 80.3
Ordering a routine test as urgent to get it expedited | 2.82 (2.58–3.06) | 62.3 | 80.5
Making fun of other physicians to colleagues | 1.56 (1.39–1.70) | 40.3 | 67.5
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (eg, after the patient is admitted) | 2.01 (1.84–2.19) | 39.5 | 67.1
Signing out patients over the phone at the end of shift when sign‐out could have been done in person | 2.95 (2.74–3.16) | 40.8 | 65.8
Texting or using smartphone during educational conferences (ie, noon lecture) | 2.16 (1.95–2.36) | 39.0 | 72.7
Discussing patient information in public spaces | 1.49 (1.34–1.63) | 37.7 | 66.2
Making fun of other attendings to colleagues | 1.62 (1.46–1.78) | 35.1 | 61.0
Deferring family members' concerns about a change in the patient's clinical course to the primary team in order to avoid engaging in such a discussion | 2.16 (1.91–2.40) | 30.3 | 55.3
Making disparaging comments about a patient on rounds | 1.42 (1.27–1.56) | 29.8 | 67.5
Attending an industry (eg, pharmaceutical or equipment/device manufacturer)‐sponsored dinner or social event | 3.20 (2.98–3.41) | 28.6 | 60.5
Ignoring family member's nonurgent questions about a cross‐cover patient when you had time to answer | 2.05 (1.85–2.25) | 26.3 | 48.7
Attesting to a resident's note when not fully confident of the content of their documentation | 1.65 (1.45–1.85) | 23.4 | 32.5
Making fun of support staff to colleagues | 1.45 (1.31–1.59) | 22.1 | 57.9
Not correcting someone who mistakes a student for a physician | 2.19 (2.01–2.38) | 20.8 | 35.1
Celebrating a blocked admission | 1.80 (1.61–2.00) | 21.1 | 60.5
Making fun of residents to colleagues | 1.53 (1.37–1.70) | 18.2 | 44.2
Coming to work when you have a significant illness (eg, influenza) | 1.99 (1.79–2.19) | 14.3 | 35.1
Celebrating a successful turf | 1.71 (1.51–1.92) | 11.7 | 39.0
Failing to notify the patient that a member of the team made, or is concerned that they made, an error | 1.53 (1.34–1.71) | 10.4 | 20.8
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing) | 1.72 (1.52–1.91) | 9.3 | 58.7
Refusing an admission which could be considered appropriate for your service (eg, blocking) | 1.63 (1.44–1.82) | 7.9 | 68.4
Falsifying patient records (ie, back‐dating a note, copying forward unverified information, or documenting physical findings not personally obtained) | 1.22 (1.10–1.34) | 6.5 | 27.3
Making fun of students to colleagues | 1.35 (1.19–1.51) | 6.5 | 24.7
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error | 1.64 (1.46–1.82) | 5.2 | 13.2
Introducing a student as a doctor to patients | 1.96 (1.76–2.16) | 3.9 | 20.8
Signing out a procedure or task, that could have been completed during a required shift or by the primary team, in order to go home as early in the day as possible | 1.48 (1.32–1.64) | 3.9 | 48.1
Performing medical or surgical procedures on a patient beyond self‐perceived level of skill | 1.27 (1.14–1.41) | 2.6 | 7.8
Asking a student to obtain written consent from a patient or their proxy without supervision (eg, for blood transfusion or minor procedures) | 1.60 (1.42–1.78) | 2.6 | 36.5
Encouraging a student to state that they are a doctor in order to expedite patient care | 1.31 (1.15–1.47) | 2.6 | 6.5
Discharging a patient before they are ready to go home in order to reduce one's census | 1.18 (1.07–1.29) | 2.6 | 19.5
Reporting patient information (eg, labs, test results, exam results) as normal when uncertain of the true results | 1.29 (1.16–1.41) | 2.6 | 15.6
Asking a student to perform medical or surgical procedures which are perceived to be beyond their level of skill | 1.26 (1.12–1.40) | 1.3 | 3.9
Asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge | 1.41 (1.26–1.56) | 0.0 | 15.8

Participation in egregious behaviors, such as falsifying patient records (6.49%) and performing medical or surgical procedures on a patient beyond self‐perceived level of skill (2.60%), was very low. The most common behaviors rated as unprofessional that hospitalists reported participating in were having nonmedical/personal conversations in patient corridors (67.1%), ordering a routine test as urgent to expedite care (62.3%), and making fun of other physicians to colleagues (40.3%). Approximately 40% of participants reported disparaging the emergency room (ER) team or primary care physician for findings discovered later, signing out over the phone when it could have been done in person, and texting or using smartphones during educational conferences. In particular, participation in unprofessional behaviors related to trainees was close to zero (eg, asking a student to discuss, with patients, medical or surgical information perceived to be beyond their level of knowledge). The least common behaviors that hospitalists reported participating in were discharging a patient before they are ready to go home in order to reduce one's census (2.56%) and reporting patient information as normal when uncertain of the true results (2.60%). Consistent with previous studies of unprofessional behaviors, those who reported participating in a behavior were less likely to rate it as unprofessional.8, 19

Observation of behaviors ranged from 4% to 80%. In all cases, observation of a behavior was reported at a higher level than participation. Correlation between observation and participation was also high, with the exception of a few behaviors that had zero or near‐zero participation rates (ie, reporting patient information as normal when unsure of the true results).

After performing factor analysis, 4 factors had eigenvalues greater than 1 and were therefore retained and extracted for further analysis. These 4 factors accounted for 76% of the variance in responses reported on the survey. By examining which items or groups of items most strongly loaded on each factor, the factors were named accordingly: factor 1 referred to behaviors related to making fun of others, factor 2 referred to workload management, factor 3 referred to behaviors related to the learning environment, and factor 4 referred to behaviors related to time pressure (Table 3).

Results of Factor Analysis Displaying Items by Primary Loading
  • NOTE: Items were categorized using factor analysis to the factor that they loaded most highly on. All items shown loaded at 0.4 or above onto each factor. Four items were omitted due to loadings less than 0.4. One item cross‐loaded on multiple factors (deferring family questions). Abbreviations: ER, emergency room.

Factor 1: Making fun of others
Making fun of other physicians (0.78)
Making fun of attendings (0.77)
Making fun of residents (0.70)
Making disparaging comments about a patient on rounds (0.51)
Factor 2: Workload management
Celebrating a successful turf (0.81)
Celebrating a blocked‐admission (0.65)
Coming to work sick (0.56)
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing.) (0.51)
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (0.48)
Discharging a patient before they are ready to go home in order to reduce one's census (0.43)
Factor 3: Learning environment
Not correcting someone who mistakes a student for a physician (0.72)
Texting or using smartphone during educational conferences (ie, noon lecture) (0.51)
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error (0.45)
Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) (0.43)
Factor 4: Time pressure
Ignoring family member's nonurgent questions about a cross‐cover patient when you had the time to answer (0.50)
Signing out patients over the phone at the end of shift when sign‐out could have been done in person (0.46)
Attesting to a resident's note when not fully confident of the content of their documentation (0.44)

Using site‐adjusted multivariate regression, certain hospitalist job characteristics were associated with certain patterns of participating in unprofessional behavior (Table 4). Those with less clinical time (<50% FTE) were more likely to participate in unprofessional behaviors related to making fun of others (factor 1, β = 0.94, 95% CI 0.32 to 1.56, P < 0.05). Hospitalists who had any administrative time (β = 0.61, 95% CI 0.11 to 1.10, P < 0.05) were more likely to report participation in behaviors related to workload management. Hospitalists engaged in any night work were more likely to report participation in unprofessional behaviors related to time pressure (β = 0.67, 95% CI 0.17 to 1.17, P < 0.05). Time devoted to teaching or research was not associated with greater participation in any of the domains of unprofessional behavior surveyed.

Association Between Hospitalist Job and Demographic Characteristics and Factors of Unprofessional Behavior
Predictor | Making Fun of Others, β [95% CI] | Learning Environment, β [95% CI] | Workload Management, β [95% CI] | Time Pressure, β [95% CI]
  • NOTE: The table shows the results of 4 multivariable linear regression models examining associations between covariates (job characteristics, demographic characteristics, and site) and the 4 factors of participation in unprofessional behaviors (making fun of others, learning environment, workload management, time pressure). Due to item nonresponse, n = 63 for all regression models. Abbreviations: CI, confidence interval.

  • * P < 0.05.

  • Less clinical was defined as less than 50% full‐time equivalent (FTE) in a given year spent on clinical work.

  • Teaching was defined as greater than the median (10% FTE) spent on teaching. Results did not change when using tertiles of teaching effort, or a cutoff at teaching greater than 20% FTE.

  • Administrative time, research time, and nights were defined as reporting any administrative time, research time, or night work, respectively (greater than 0% per year).

  • Younger was defined as those born after 1970.

Job characteristics
Less clinical | 0.94 [0.32, 1.56]* | −0.01 [−0.66, 0.64] | −0.17 [−0.84, 0.49] | 0.39 [−0.24, 1.01]
Administrative | 0.30 [−0.16, 0.76] | 0.06 [−0.43, 0.54] | 0.61 [0.11, 1.10]* | 0.26 [−0.20, 0.72]
Teaching | −0.01 [−0.49, 0.48] | −0.09 [−0.60, 0.42] | −0.12 [−0.64, 0.40] | 0.16 [−0.33, 0.65]
Research | −0.30 [−0.87, 0.27] | −0.38 [−0.98, 0.22] | −0.37 [−0.98, 0.24] | 0.13 [−0.45, 0.71]
Any nights | −0.08 [−0.58, 0.42] | 0.24 [−0.28, 0.77] | 0.24 [−0.29, 0.76] | 0.67 [0.17, 1.17]*
Demographic characteristics
Male | 0.06 [−0.42, 0.53] | 0.03 [−0.47, 0.53] | −0.05 [−0.56, 0.47] | −0.40 [−0.89, 0.08]
Younger | −0.05 [−0.79, 0.69] | −0.64 [−1.42, 0.14] | 0.87 [0.07, 1.67]* | 0.62 [−0.13, 1.37]
Unfamiliar with residents | −0.32 [−0.85, 0.22] | −0.32 [−0.89, 0.24] | 0.13 [−0.45, 0.70] | 0.47 [−0.08, 1.01]
Institution
Site 1 | 0.58 [−0.22, 1.38] | −0.05 [−0.89, 0.79] | 1.01 [0.15, 1.86]* | −0.77 [−1.57, 0.04]
Site 3 | −0.11 [−0.68, 0.47] | −0.70 [−1.31, −0.09]* | 0.43 [−0.20, 1.05] | 0.45 [−0.13, 1.04]
Constant | 0.03 [−0.99, 1.06] | 0.94 [−0.14, 2.02] | −1.23 [−2.34, −0.13]* | −1.34 [−2.39, −0.31]*

The only demographic characteristic that was significantly associated with unprofessional behavior was age. Specifically, those who were born after 1970 were more likely to participate in unprofessional behaviors related to workload management (β = 0.87, 95% CI 0.07 to 1.67, P < 0.05). Site differences were also present. Specifically, one site was more likely to report participation in unprofessional behaviors related to workload management (β for site 1 = 1.01, 95% CI 0.15 to 1.86, P < 0.05), while another site was less likely to report participation in behaviors related to the learning environment (β for site 3 = −0.70, 95% CI −1.31 to −0.09, P < 0.05). Gender and familiarity with residents were not significant predictors of participation in unprofessional behaviors. Results remained robust in sensitivity analyses using different cutoffs of clinical time and teaching time.

DISCUSSION

This multisite study adds to what is known about the perceptions of, and participation in, unprofessional behaviors among internal medicine hospitalists. Hospitalists perceived almost all surveyed behaviors as unprofessional. Participation in egregious and trainee‐related unprofessional behaviors was very low. Four categories appeared to explain the variability in how hospitalists reported participation in unprofessional behaviors: making fun of others, workload management, learning environment, and time pressure. Participation in behaviors within these factors was associated with certain job characteristics, such as clinical time, administrative time, and night work, as well as age and site.

It is reassuring that participation in egregious and trainee‐related unprofessional behaviors is very low. It is also noteworthy that attending an industry‐sponsored dinner was not clearly considered unprofessional. This was surprising in the setting of increased external pressures to report and ban such interactions.28 The perception that attending such dinners is acceptable may reflect a lag between current practice and national recommendations.

It is important to explore why certain job characteristics are associated with participation in unprofessional behaviors. For example, those with less clinical time were more likely to participate in making fun of others. It may be that hospitalists with more clinical time make a larger effort to develop and maintain positive relationships. Another possible explanation is that hospitalists with less clinical time are more easily influenced by those in the learning environment who make fun of others, such as residents whom they supervise for only a brief period.

For unprofessional behaviors related to workload management, those who were younger, and those with any administrative time, were more likely to participate in behaviors such as celebrating a blocked‐admission. Our prior work shows that behaviors related to workload management are more widespread in residency, and therefore younger hospitalists, who are often recent residency graduates, may be more prone to participating in these behaviors. While unproven, it is possible that those with more administrative time may have competing priorities with their administrative roles, which motivate them to more actively manage their workload, leading them to participate in workload management behaviors.

Hospitalists who did any night work were more likely to participate in unprofessional behaviors related to time pressure. This could reflect the high workloads that night hospitalists may face and the pressure they feel to wrap up work, resulting in a hasty handoff (ie, over the phone) or to defer work (ie, family questions). Site differences were also observed for participation in behaviors related to the learning environment, speaking to the importance of institutional culture.

It is worth mentioning that hospitalists who were teachers were not any less likely to report participating in certain behaviors. While 78% of hospitalists reported some level of teaching, the median reported percentage of teaching was 10% FTE. This level of teaching likely reflects the diverse nature of the work in which hospitalists engage. While hospitalists spend some time working with trainees, services that are not staffed with residents (eg, uncovered services) are becoming increasingly common due to stricter resident duty hour restrictions. This may explain why 60% of hospitalists reported being unfamiliar with residents. We also set a high bar for familiarity, defined as knowing over half of residents by name, which served as a proxy for those who may have trained at the institution where they currently work. Although hospitalists reported that a low fraction of their total clinical time was devoted to resident services, a significant fraction of resident services was staffed by hospitalists at all sites, making them a natural target for interventions.

These results have implications for future work to assess and improve professionalism in the hospital learning environment. First, interventions to address unprofessional behaviors should focus on behaviors with the highest participation rates. As in our earlier studies of residents, participation was high for certain behaviors, such as misrepresenting a test as urgent or disparaging the ER or primary care physician (PCP) for a missed finding.19, 20 While blocking an admission was common in our studies of residents, reported participation among hospitalists was low. Similar to a prior study of clinical‐year medical students at one of our sites, 1 in 5 hospitalists reported not correcting someone who mistakes a student for a physician, highlighting the role that hospitalists may have in perpetuating this behavior.8 Additionally, addressing the behaviors identified in this study through novel curricular tools may help to teach residents many of the interpersonal and communication skills called for in the 2011 ACGME Common Program Requirements.11 The ACGME requirements also include the expectation that faculty model how to manage their time before, during, and after clinical assignments, and recognize that transferring a patient to a rested provider is best. Given that most hospitalists believe staying past a shift limit is professional, these requirements will be difficult to adopt without widespread culture change.

Moreover, interventions could be tailored to hospitalists with certain job characteristics. Interventions may be educational or systems based. An example of the former is stressing the impact of the learning and working environment on trainees; an example of the latter is streamlining the process by which ordered tests are executed so that tests are completed in a more timely manner, which may result in fewer physicians misrepresenting a test as urgent. Additionally, hospitalists with less clinical time could receive education on their impact as role models for trainees. Hospitalists who are younger or who have administrative commitments could be trained on the importance of avoiding behaviors related to workload management, such as blocking or turfing patients. Lastly, given the site differences, critical examination of institutional culture and policies is also important. With funding from the American Board of Internal Medicine (ABIM) Foundation, we are currently creating an educational intervention targeting the behaviors that were most frequent among hospitalists and residents at our institutions, to promote dialogue and critical reflection, with the hope of reducing the most prevalent behaviors encountered.

There are several limitations to this study. Despite the anonymity of the survey, participants may have inaccurately reported their participation in unprofessional behaviors due to socially desirable responding. In addition, because we used factor analysis and multivariate regression models with a small sample size, item nonresponse limited the sample for regression analyses and raises the concern for response bias. However, all significant associations remained significant after backwards stepwise elimination of covariates with P > 0.10 in larger models (n ranging from 65 to 69). Because we used self‐report and not direct observation of participation in unprofessional behaviors, it is not possible to validate the responses given. Future work could rely on 360‐degree evaluations or other methods to validate self‐reported responses. It is also important to consider assessing whether these behaviors are associated with actual patient outcomes, such as length of stay or readmission. Some items may not always be unprofessional; for example, texting during an educational conference might be done to advance patient care, which would not necessarily be unprofessional. The order in which the questions were asked could also have led to bias. We asked about participation before perception to try to limit biased reporting of participation; changing the order of these questions could have resulted in under‐reporting of participation in behaviors that one perceived to be unprofessional. This study was conducted at 3 institutions located in Chicago, limiting generalizability to institutions outside of this area. Only internal medicine hospitalists were surveyed, which also limits generalizability to other disciplines and specialties within internal medicine. Lastly, it is important to highlight that hospitalists are not the sole teachers on inpatient services, since residents encounter a variety of faculty who serve as teaching attendings. Future work should expand to other centers and other specialties.

In conclusion, in this multi‐institutional study of hospitalists, participation in egregious behaviors was low. Four factors or patterns underlie hospitalists' reports of participation in unprofessional behavior: making fun of others, learning environment, workload management, and time pressure. Job characteristics (clinical time, administrative time, night work), age, and site were all associated with different patterns of unprofessional behavior. Specifically, hospitalists with less clinical time were more likely to make fun of others. Hospitalists who were younger in age, as well as those who had any administrative work, were more likely to participate in behaviors related to workload management. Hospitalists who work nights were more likely to report behaviors related to time pressure. Interventions to promote professionalism should take institutional culture into account and should focus on behaviors with the highest participation rates. Efforts should also be made to address underlying reasons for participation in these behaviors.

Acknowledgements

The authors thank Meryl Prochaska for her research assistance and manuscript preparation.

Disclosures: The authors acknowledge funding from the ABIM Foundation and the Pritzker Summer Research Program. The funders had no role in the design of the study; the collection, analysis, and interpretation of the data; or the decision to approve publication of the finished manuscript. Prior presentations of the data include the 2010 University of Chicago Pritzker School of Medicine Summer Research Forum, the 2010 University of Chicago Pritzker School of Medicine Medical Education Day, the 2010 Midwest Society of Hospital Medicine Meeting in Chicago, IL, and the 2011 Society of Hospital Medicine National Meeting in Dallas, TX. All authors disclose no relevant or financial conflicts of interest.

References
  1. Stern DT. Practicing what we preach? An analysis of the curriculum of values in medical education. Am J Med. 1998;104:569–575.
  2. Borgstrom E, Cohn S, Barclay S. Medical professionalism: conflicting values for tomorrow's doctors. J Gen Intern Med. 2010;25(12):1330–1336.
  3. Karnieli‐Miller O, Vu TR, Holtman MC, Clyman SG, Inui TS. Medical students' professionalism narratives: a window on the informal and hidden curriculum. Acad Med. 2010;85(1):124–133.
  4. Cohn FG, Shapiro J, Lie DA, Boker J, Stephens F, Leung LA. Interpreting values conflicts experienced by obstetrics‐gynecology clerkship students using reflective writing. Acad Med. 2009;84(5):587–596.
  5. Gaiser RR. The teaching of professionalism during residency: why it is failing and a suggestion to improve its success. Anesth Analg. 2009;108(3):948–954.
  6. Gofton W, Regehr G. What we don't know we are teaching: unveiling the hidden curriculum. Clin Orthop Relat Res. 2006;449:20–27.
  7. Hafferty FW. Definitions of professionalism: a search for meaning and identity. Clin Orthop Relat Res. 2006;449:193–204.
  8. Reddy ST, Farnan JM, Yoon JD, et al. Third‐year medical students' participation in and perceptions of unprofessional behaviors. Acad Med. 2007;82:S35–S39.
  9. Hafferty FW. Beyond curriculum reform: confronting medicine's hidden curriculum. Acad Med. 1998;73:403–407.
  10. Pfifferling JH. Physicians' "disruptive" behavior: consequences for medical quality and safety. Am J Med Qual. 2008;23:165–167.
  11. Accreditation Council for Graduate Medical Education. Common Program Requirements: General Competencies. Available at: http://www.acgme.org/acwebsite/home/common_program_requirements_07012011.pdf. Accessed December 19, 2011.
  12. Liaison Committee on Medical Education. Functions and Structure of a Medical School. Available at: http://www.lcme.org/functions2010jun.pdf. Accessed June 30, 2010.
  13. Gillespie C, Paik S, Ark T, Zabar S, Kalet A. Residents' perceptions of their own professionalism and the professionalism of their learning environment. J Grad Med Educ. 2009;1:208–215.
  14. Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med. 2004;79:244–249.
  15. Papadakis MA, Teherani A, Banach MA, et al. Disciplinary action by medical boards and prior behavior in medical school. N Engl J Med. 2005;353:2673–2682.
  16. Rosenstein AH, O'Daniel M. A survey of the impact of disruptive behaviors and communication defects on patient safety. Jt Comm J Qual Patient Saf. 2008;34:464–471.
  17. Rosenstein AH, O'Daniel M. Managing disruptive physician behavior: impact on staff relationships and patient care. Neurology. 2008;70:1564–1570.
  18. The Joint Commission. Behaviors that undermine a culture of safety. Sentinel Event Alert. 2008. Available at: http://www.jointcommission.org/assets/1/18/SEA_40.PDF. Accessed April 28, 2012.
  19. Arora VM, Wayne DB, Anderson RA, Didwania A, Humphrey HJ. Participation in and perceptions of unprofessional behaviors among incoming internal medicine interns. JAMA. 2008;300:1132–1134.
  20. Arora VM, Wayne DB, Anderson RA, et al. Changes in perception of and participation in unprofessional behaviors during internship. Acad Med. 2010;85:S76–S80.
  21. Wachter RM. Reflections: the hospitalist movement a decade later. J Hosp Med. 2006;1:248–252.
  22. Society of Hospital Medicine. 2007–2008 Bi‐Annual Survey. 2008. Available at: http://www.medscape.org/viewarticle/578134. Accessed April 28, 2012.
  23. Holmboe ES, Bowen JL, Green M, et al. Reforming internal medicine residency training. A report from the Society of General Internal Medicine's Task Force for Residency Reform. J Gen Intern Med. 2005;20:1165–1172.
  24. Society of Hospital Medicine. The Core Competencies in Hospital Medicine: a framework for curriculum development by the Society of Hospital Medicine. J Hosp Med. 2006;1(suppl 1):25.
  25. Caldicott CV, Dunn KA, Frankel RM. Can patients tell when they are unwanted? "Turfing" in residency training. Patient Educ Couns. 2005;56:104–111.
  26. Costello AB, Osborn JW. Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Pract Assess Res Eval. 2005;10:1–9.
  27. Principal Components and Factor Analysis. StatSoft Electronic Statistics Textbook. Available at: http://www.statsoft.com/textbook/principal‐components‐factor‐analysis/. Accessed December 30, 2011.
  28. Brennan TA, Rothman DJ, Blank L, et al. Health industry practices that create conflicts of interest: a policy proposal for academic medical centers. JAMA. 2006;295(4):429–433.
Issue
Journal of Hospital Medicine - 7(7)
Page Number
543-550
Display Headline
Participation in unprofessional behaviors among hospitalists: A multicenter study
Article Source
Copyright © 2012 Society of Hospital Medicine
Correspondence Location
Department of Medicine, The University of Chicago, 5841 S Maryland Ave, MC 2007, AMB B200, Chicago, IL 60637