Affiliations
Department of Medicine, Northwestern University, Chicago, Illinois
Given name(s)
Aashish K.
Family name
Didwania
Degrees
MD

RRTs in Teaching Hospitals

Display Headline
Rapid response teams in teaching hospitals: Aligning efforts to improve medical education and quality

In this issue of the Journal of Hospital Medicine, Butcher and colleagues report on residents' perceptions of a rapid response team's (RRT) impact on their training.[1] RRTs mobilize key clinicians in an attempt to rescue acutely decompensating hospitalized patients. Early recognition is essential, and most systems allow any concerned health professional to activate the RRT. Although the evidence for benefit is somewhat controversial,[2, 3] an overwhelming majority of hospitals have implemented RRTs.[4, 5]

The use of RRTs in teaching hospitals raises important concerns. The ability of nurses and other professionals to activate the RRT without need for prior approval from a physician could potentially undermine resident physician autonomy. Residents may feel that their clinical judgment has been usurped or second guessed. Whether nurse led or physician led, RRTs always introduce new members to the care team.[6] These new team members share in decision making, which may theoretically reduce residents' opportunities to hone their decision‐making skills when caring for potentially critically ill patients.

Despite these potential disadvantages, Butcher and colleagues report that the vast majority of residents found working with the RRT to be a valuable educational experience and disagreed that the RRT decreased their clinical autonomy. Interestingly, surgical residents were less likely to agree that working with the RRT was a valuable educational experience and much more likely to feel that nurses should contact them before activating the RRT.

The results of the study by Butcher et al. highlight several evolving paradigms in medical education and quality improvement. Over the past 10 to 15 years, and fostered in large part by Accreditation Council for Graduate Medical Education (ACGME) duty‐hour revisions,[7] teaching hospitals have moved away from the traditional practice of using residents primarily to fill their clinical service needs to an approach that treats residents more as learners. Resident training requires clinical care, but the provision of clinical care in teaching hospitals does not necessarily require residents. At the same time, healthcare organizations have moved away from the traditional culture characterized by reliance on individual skill, physician autonomy, and steep hierarchies, to an enlightened culture emphasizing teamwork with flattened hierarchies and systems redesigned to provide safe and effective care.[8]

For the most part, the paradigm shifts in medical education and quality improvement have been aligned. In fact, the primary goal of duty‐hour policy revisions was to improve patient safety.[9] Yet, Butcher and colleagues' study highlights the need to continuously and deliberately integrate our efforts to enhance medical education and quality of care, and more rigorously study the effects. Rather than be pleasantly surprised that residents understand the intrinsic value of an RRT to patient care and their education, we should ensure that residents understand the rationale for an RRT and consider using the RRT to complement other efforts to educate resident physicians in managing unstable patients. RRTs introduce a wonderful opportunity to develop novel interprofessional curricula. Learning objectives should include the management of common clinical syndromes represented in RRT calls, but should also focus on communication, leadership, and other essential teamwork skills. Simulation‐based training is an ideal teaching strategy for these objectives, and prior studies support the effectiveness of this approach.[10, 11]

The ACGME has now implemented the Next Accreditation System (NAS) across all specialties. Of the 22 reporting milestones within internal medicine, 12 relate directly to quality improvement and patient safety objectives, whereas 6 relate directly to pathophysiology and disease management.[12] Educating residents on systems of care is further highlighted by the Clinical Learning Environment Review (CLER), a key component of the NAS. The CLER program uses site visits to identify teaching hospitals' efforts to engage residents in 6 focus areas: patient safety; healthcare quality; transitions of care; supervision; duty hours, fatigue management, and mitigation; and professionalism.[13] CLER site visits include discussions and observations with hospital executive leadership, residents, graduate medical education leadership, nursing, and other hospital staff. The CLER program raises the bar for integrating medical education and quality improvement efforts even further. Quality improvement activities that previously supported an informal curriculum must now be made explicit to, and deliberately engage, our residents. Teaching hospitals are being tasked with including residents in safety initiatives and on all quality committees, especially those with cross‐departmental boundaries such as the Emergency Response Team/RRT Committee. Residents should meaningfully participate, and whenever possible, lead quality improvement projects, the focus of which may ideally be identified by residents themselves. An important resource for medical educators is the Quality and Safety Educators Academy, a program developed by the Society of Hospital Medicine and the Alliance for Academic Internal Medicine, which provides educators with the knowledge and tools to integrate quality improvement and patient safety objectives into their training programs.[14]

In conclusion, we are reassured that residents understand the intrinsic value of an RRT to patient care and their education. We encourage medical educators to use RRTs as an opportunity to develop interprofessional curricula, including those that aim to enhance teamwork skills. Beyond curricular innovation, quality‐improvement activities in teaching hospitals must deliberately engage our residents at every level of the organization.

Disclosure

Disclosure: Nothing to report.

References
  1. Butcher BW, Quist CE, Harrison JD, Ranji SR. The effect of a rapid response team on resident perceptions of education and autonomy. J Hosp Med. 2015;10(1):8-12.
  2. Chan PS, Jain R, Nallmothu BK, Berg RA, Sasson C. Rapid response teams: a systematic review and meta-analysis. Arch Intern Med. 2010;170(1):18-26.
  3. Winters BD, Weaver SJ, Pfoh ER, Yang T, Pham JC, Dy SM. Rapid-response systems as a patient safety strategy: a systematic review. Ann Intern Med. 2013;158(5 pt 2):417-425.
  4. Edelson DP, Yuen TC, Mancini ME, et al. Hospital cardiac arrest resuscitation practice in the United States: a nationally representative survey. J Hosp Med. 2014;9(6):353-357.
  5. Reason J. Achieving a safe culture: theory and practice. Work Stress. 1998;12(3):293-306.
  6. Wood KA, Ranji SR, Ide B, Dracup K. Rapid response systems in adult academic medical centers. Jt Comm J Qual Patient Saf. 2009;35(9):475-482, 437.
  7. Nasca TJ, Day SH, Amis ES. The new recommendations on duty hours from the ACGME Task Force. N Engl J Med. 2010;363(2):e3.
  8. Jones KJ, Skinner A, Xu L, et al. The AHRQ hospital survey on patient safety culture: a tool to plan and evaluate patient safety programs. In: Henriksen K, Battles JB, Keyes MA, et al., eds. Advances in Patient Safety: New Directions and Alternative Approaches (Vol. 2: Culture and Redesign). Rockville, MD: Agency for Healthcare Research and Quality; 2008. Available at: http://www.ncbi.nlm.nih.gov/books/NBK43699. Accessed November 4, 2014.
  9. The ACGME 2011 Duty Hour Standards: Enhancing Quality of Care, Supervision, and Resident Professional Development. Chicago, IL: Accreditation Council for Graduate Medical Education; 2011.
  10. DeVita MA, Schaefer J, Lutz J, Wang H, Dongilli T. Improving medical emergency team (MET) performance using a novel curriculum and a computerized human patient simulator. Qual Saf Health Care. 2005;14(5):326-331.
  11. Wehbe-Janek H, Pliego J, Sheather S, Villamaria F. System-based interprofessional simulation-based training program increases awareness and use of rapid response teams. Jt Comm J Qual Patient Saf. 2014;40(6):279-287.
  12. Internal Medicine Milestone Group. The Internal Medicine Milestone Project. A Joint Initiative of the Accreditation Council for Graduate Medical Education and The American Board of Internal Medicine. Available at: https://www.acgme.org/acgmeweb/Portals/0/PDFs/Milestones/InternalMedicineMilestones.pdf. Accessed November 4, 2014.
  13. Weiss KB, Bagian JP, Nasca TJ. The clinical learning environment: the foundation of graduate medical education. JAMA. 2013;309(16):1687-1688.
  14. Myers JS, Tess A, Glasheen JJ, et al. The Quality and Safety Educators Academy: fulfilling an unmet need for faculty development. Am J Med Qual. 2014;29(1):5-12.
Issue
Journal of Hospital Medicine - 10(1)
Page Number
62-63

Article Source
© 2015 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Kevin J. O'Leary, MD, Division of Hospital Medicine, Northwestern University Feinberg School of Medicine, 211 E. Ontario St., Suite 700, Chicago, IL 60611; E‐mail: keoleary@nmh.org

Unprofessional Behavior and Hospitalists

Display Headline
Participation in unprofessional behaviors among hospitalists: A multicenter study

The discrepancy between what is taught about professionalism in formal medical education and what is witnessed in the hospital has received increasing attention.1-7 This latter aspect of medical education contributes to the hidden curriculum and impacts medical trainees' views on professionalism.8 The hidden curriculum refers to the lessons trainees learn through informal interactions within the multilayered educational learning environment.9 A growing body of work examines how the hidden curriculum and disruptive physicians impact the learning environment.9, 10 In response, regulatory agencies, such as the Liaison Committee on Medical Education (LCME) and Accreditation Council for Graduate Medical Education (ACGME), require training programs and medical schools to maintain standards of professionalism and to regularly evaluate the learning environment and its impact on professionalism.11, 12 In 2011, the ACGME expanded its professionalism standards by requiring that the program director and institution ensure a culture of professionalism that supports patient safety and personal responsibility.11 Given this increasing focus on professionalism in medical school and residency training programs, it is critical to examine faculty perceptions and actions that may perpetuate the discrepancy between the formal and hidden curriculum.13 This early exposure is especially significant because unprofessional behavior in medical school is strongly associated with later disciplinary action by a medical board.14, 15 Certain unprofessional behaviors can also compromise patient care and safety, and can detract from the hospital working environment.16-18

In our previous work, we demonstrated that internal medicine interns reported increased participation in unprofessional behaviors regarding on-call etiquette during internship.19, 20 Examples of these behaviors include refusing an admission (ie, blocking) and misrepresenting a test as urgent. Interestingly, students and residents have highlighted the powerful role of supervising faculty physicians in condoning or inhibiting such behavior. Given the increasing role of hospitalists as resident supervisors, it is important to consider the perceptions and actions of hospitalists with respect to perpetuating or hindering some unprofessional behaviors. Although hospital medicine is a relatively new specialty, many hospitalists are in frequent contact with medical trainees, perhaps because many residency programs and medical schools have a strong inpatient focus.21-23 It is thus possible that hospitalists have a major influence on residents' behaviors and views of professionalism. In fact, the Society of Hospital Medicine's Core Competencies for Hospital Medicine explicitly state that hospitalists are expected to serve as a role model for professional and ethical conduct to house staff, medical students and other members of the interdisciplinary team.24

Therefore, the current study had 2 aims: first, to measure internal medicine hospitalists' perceptions of, and participation in, unprofessional behaviors using a previously validated survey; and second, to examine associations between job characteristics and participation in unprofessional behaviors.

METHODS

Study Design

This was a multi‐institutional, observational study that took place at the University of Chicago Pritzker School of Medicine, Northwestern University Feinberg School of Medicine, and NorthShore University HealthSystem. Hospitalist physicians employed at these hospitals were recruited for this study between June 2010 and July 2010. The Institutional Review Boards of the University of Chicago, Northwestern University, and NorthShore University HealthSystem approved this study. All subjects provided informed consent before participating.

Survey Development and Administration

Based on a prior survey of interns and third-year medical students, a 35-item survey was used to measure perceptions of, and participation in, unprofessional behaviors.8, 19, 20 The original survey was developed in 2005 by medical students who observed behaviors by trainees and faculty that they considered to be unprofessional. The survey was subsequently modified by interns to ascertain unprofessional behavior among interns. For this iteration, hospitalists and study authors at each site reviewed the survey items and adapted each item to ensure relevance to hospitalist work and generalizability across sites. New items were also created to refer specifically to work routinely performed by hospitalist attendings (attesting to resident notes, transferring patients to other services to reduce workload, etc). Because of this, certain items used the jargon hospitalists themselves use for these behaviors (ie, blocking admissions and turfing), consistent with the literature describing these phenomena.25 Items were also worded so as to make the unprofessional nature of the behavior explicit (ie, blocking an admission that could be appropriate for your service).

The final survey (see Supporting Information, Appendix, in the online version of this article) included domains such as interactions with others, interactions with trainees, and patient‐care scenarios. Demographic information and job characteristics were collected including year of residency completion, total amount of clinical work, amount of night work, and amount of administrative work. Hospitalists were not asked whether they completed residency at the institution where they currently work in order to maintain anonymity in the context of a small sample. Instead, they were asked to rate their familiarity with residents at their institution on a Likert‐type scale ranging from very unfamiliar (1) to familiar (3) to very familiar (5). To help standardize levels of familiarity across hospitalists, we developed anchors that corresponded to how well a hospitalist would know resident names with familiar defined as knowing over half of resident names.

Participants reported whether they participated in, or observed, a particular behavior and rated their perception of each behavior from 1 (unprofessional) to 5 (professional), with unprofessional and somewhat unprofessional defined as unprofessional. A site champion administered paper surveys during a routine faculty meeting at each site. An electronic version was administered using SurveyMonkey (SurveyMonkey, Palo Alto, CA) to hospitalists not present at the faculty meeting. Participants chose a unique, nonidentifiable code to facilitate truthful reporting while allowing data tracking in follow‐up studies.

Data Analysis

Clinical time was dichotomized at 50% full-time equivalent (FTE) to define those who did less clinical work. Because teaching time was relatively low (median, 10% FTE), we used a cutoff of greater than 10% FTE to define greater teaching. Because many hospitalists engaged in no night work, night work was dichotomized as any night work versus none. Similarly, because many hospitalists had no administrative time, administrative time was dichotomized as any administrative work versus none. Lastly, those born after 1970 were classified as younger hospitalists.
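
As an illustration only (the study's analyses were run in Stata, not shown here), the dichotomizations above could be sketched as follows; the DataFrame and its column names (clinical_fte, teaching_fte, night_weeks, admin_fte, birth_year) are hypothetical and not taken from the study:

```python
import pandas as pd

# Hypothetical survey extract; values and column names are illustrative only.
df = pd.DataFrame({
    "clinical_fte": [80, 40, 95, 30],    # % FTE spent on clinical work
    "teaching_fte": [10, 25, 5, 0],      # % FTE spent on teaching
    "night_weeks":  [0, 2, 4, 0],        # weeks of night work per year
    "admin_fte":    [0, 20, 0, 10],      # % FTE spent on administration
    "birth_year":   [1965, 1975, 1980, 1968],
})

# Dichotomize covariates as described above.
df["less_clinical"] = (df["clinical_fte"] < 50).astype(int)     # <50% FTE clinical
df["greater_teaching"] = (df["teaching_fte"] > 10).astype(int)  # >10% FTE teaching
df["any_nights"] = (df["night_weeks"] > 0).astype(int)          # any night work
df["any_admin"] = (df["admin_fte"] > 0).astype(int)             # any administrative time
df["younger"] = (df["birth_year"] > 1970).astype(int)           # born after 1970
print(df)
```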

Chi-square tests were used to compare site response rates, and descriptive statistics were used to examine demographic characteristics of hospitalist respondents, in addition to perception of, and participation in, unprofessional behaviors. Because items on the survey were highly correlated, we used factor analysis to identify the underlying constructs related to unprofessional behavior.26 Factor analysis is a statistical procedure that is most often used to explore which variables in a data set are most related or correlated to each other. By examining the patterns of similar responses, the underlying factors can be identified and extracted. These factors, by definition, are not correlated with each other. To select the number of factors to retain, the most common convention is the Kaiser criterion: retain all factors with eigenvalues greater than or equal to one.27 An eigenvalue measures the amount of variation across all survey items that is accounted for by that factor. If a factor has a low eigenvalue (less than 1, by convention), it contributes little and is ignored, as it is likely redundant with the higher-value factors.

Because the Kaiser criterion often overestimates the number of factors to retain, another method is to use a scree plot, which tends to underestimate it. Both were used in this study to ensure a stable solution. To name the factors, we examined which items or groups of items loaded (ie, were most highly related) onto each factor. To ensure an optimal factor solution, items with minimal participation (less than 3%) were excluded from the factor analysis.
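
As a minimal sketch of these two retention checks (illustrative Python rather than the Stata used in the study), the eigenvalues of the item correlation matrix can be compared against the Kaiser criterion and displayed as a scree plot; item_responses is a hypothetical respondents-by-items array:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical Likert responses: 77 respondents x 30 survey items.
rng = np.random.default_rng(0)
item_responses = rng.integers(1, 6, size=(77, 30))

# Eigenvalues of the item correlation matrix, sorted in descending order.
corr = np.corrcoef(item_responses, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]

# Kaiser criterion: retain factors with eigenvalues >= 1.
n_retained = int(np.sum(eigenvalues >= 1))
print(f"Factors retained by the Kaiser criterion: {n_retained}")

# Scree plot: look for the "elbow" to cross-check the Kaiser result.
plt.plot(range(1, len(eigenvalues) + 1), eigenvalues, marker="o")
plt.axhline(1.0, linestyle="--")
plt.xlabel("Factor number")
plt.ylabel("Eigenvalue")
plt.title("Scree plot")
plt.show()
```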

Then, site‐adjusted multivariate regression analysis was used to examine associations between job and demographic characteristics, and the factors of unprofessional behavior identified. Models controlled for gender and familiarity with residents. Because sample medians were used to define greater teaching (>10% FTE), we also performed a sensitivity analysis using different cutoffs for teaching time (>20% FTE and teaching tertiles). Likewise, we also used varying definitions of less clinical time to ensure that any statistically significant associations were robust across varying definitions. All data were analyzed using STATA 11.0 (Stata Corp, College Station, TX) and statistical significance was defined as P < 0.05.
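
For concreteness, a site-adjusted linear model for one factor score could be specified as in the sketch below; it uses synthetic data and illustrative variable names (factor1, less_clinical, any_admin, unfamiliar_residents, site, etc) rather than the study's Stata code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the analytic data set; all names are illustrative.
rng = np.random.default_rng(1)
n = 63  # analytic sample size reported for the regression models
df = pd.DataFrame({
    "factor1": rng.normal(size=n),          # factor score (eg, "making fun of others")
    "less_clinical": rng.integers(0, 2, n),
    "any_admin": rng.integers(0, 2, n),
    "greater_teaching": rng.integers(0, 2, n),
    "any_research": rng.integers(0, 2, n),
    "any_nights": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "younger": rng.integers(0, 2, n),
    "unfamiliar_residents": rng.integers(0, 2, n),
    "site": rng.choice(["site1", "site2", "site3"], size=n),
})

# Site-adjusted multivariable linear regression for one factor of unprofessional behavior.
model = smf.ols(
    "factor1 ~ less_clinical + any_admin + greater_teaching + any_research"
    " + any_nights + male + younger + unfamiliar_residents + C(site)",
    data=df,
).fit()
print(model.params)                # beta coefficients
print(model.conf_int(alpha=0.05))  # 95% confidence intervals
```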

RESULTS

Seventy-seven of the 101 hospitalists (76.2%) at the 3 sites completed the survey. While response rates varied by site (site 1, 67%; site 2, 74%; site 3, 86%), the differences were not statistically significant (χ2 = 2.9, P = 0.24). Most hospitalists (79.2%) completed residency after 2000. Over half (57.1%) of participants were male, and over half (61%) reported having worked with their current hospitalist group for 1 to 4 years. Almost 60% (59.7%) reported being unfamiliar with residents in the program. Over 40% of hospitalists did not do any night work. Hospitalists were largely clinical: only about one-quarter reported less than 50% FTE of clinical work, and the median clinical time was 80% FTE. While 78% of hospitalists reported some teaching time, median time on teaching service was low at 10% FTE (Table 1).

Table 1. Demographics of Responders* (n = 77)
Characteristic | Total n (%)
  • Abbreviations: IQR, interquartile range.

  • Site differences were observed for clinical practice characteristics, such as number of weeks of teaching service, weeks working nights, clinical time, research time, completed fellowship, and won teaching awards. Due to item nonresponse, number of respondents reporting is listed for each item.

  • Familiarity with residents asked in lieu of whether hospitalist trained at the institution. Familiarity defined as a rating of 4 or 5 on Likert scale ranging from Very Unfamiliar (1) to Very Familiar (5), with Familiar (4) defined further as knowing >50% of residents' names.

Male (%) | 44 (57.1)
Completed residency (%)
  Between 1981 and 1990 | 2 (2.6)
  Between 1991 and 2000 | 14 (18.2)
  After 2000 | 61 (79.2)
Medical school matriculation (%) (n = 76)
  US medical school | 59 (77.6)
  International medical school | 17 (22.3)
Years spent with current hospitalist group (%)
  <1 yr | 14 (18.2)
  1-4 yr | 47 (61.0)
  5-9 yr | 15 (19.5)
  >10 yr | 1 (1.3)
Familiarity with residents (%)
  Familiar | 31 (40.2)
  Unfamiliar | 46 (59.7)
No. of weeks per year spent on (median [IQR])
  Hospitalist practice (n = 72) | 26.0 [16.0-26.0]
  Teaching services (n = 68) | 4.0 [1.0-8.0]
Weeks working nights* (n = 71)
  >2 wk | 16 (22.5)
  1-2 wk | 24 (33.8)
  0 wk | 31 (43.7)
% Clinical time (median [IQR])* (n = 73) | 80 (50-99)
% Teaching time (median [IQR])* (n = 74) | 10 (1-20)
Any research time (%)* (n = 71) | 22 (31.0)
Any administrative time (%) (n = 72) | 29 (40.3)
Completed fellowship (%)* | 12 (15.6)
Won teaching awards (%)* (n = 76) | 21 (27.6)
View a career in hospital medicine as (%)
  Temporary | 11 (14.3)
  Long term | 47 (61.0)
  Unsure | 19 (24.7)

Hospitalists perceived almost all behaviors as unprofessional (unprofessional or somewhat unprofessional on a 5-point Likert scale). The only behavior rated as professional, with a mean of 4.25 (95% CI 4.01-4.49), was staying past shift limit to complete a patient-care task that could have been signed out. This behavior also had the highest level of participation by hospitalists (81.7%). Hospitalists were most ambivalent when rating professionalism of attending an industry-sponsored dinner or social event (mean 3.20, 95% CI 2.98-3.41) (Table 2).

Table 2. Perception of, and Observation and Participation in, Unprofessional Behaviors Among Hospitalists (n = 77)
Behavior | Reported Perception, Mean Likert Score (95% CI)* | Reported Participation (%) | Reported Observation (%)
  • Abbreviations: ER, emergency room.

  • Perception rated on Likert scale from 1 (unprofessional) to 5 (professional).

Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) | 2.55 (2.34-2.76) | 67.1 | 80.3
Ordering a routine test as urgent to get it expedited | 2.82 (2.58-3.06) | 62.3 | 80.5
Making fun of other physicians to colleagues | 1.56 (1.39-1.70) | 40.3 | 67.5
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (eg, after the patient is admitted) | 2.01 (1.84-2.19) | 39.5 | 67.1
Signing out patients over the phone at the end of shift when sign-out could have been done in person | 2.95 (2.74-3.16) | 40.8 | 65.8
Texting or using smartphone during educational conferences (ie, noon lecture) | 2.16 (1.95-2.36) | 39.0 | 72.7
Discussing patient information in public spaces | 1.49 (1.34-1.63) | 37.7 | 66.2
Making fun of other attendings to colleagues | 1.62 (1.46-1.78) | 35.1 | 61.0
Deferring family members' concerns about a change in the patient's clinical course to the primary team in order to avoid engaging in such a discussion | 2.16 (1.91-2.40) | 30.3 | 55.3
Making disparaging comments about a patient on rounds | 1.42 (1.27-1.56) | 29.8 | 67.5
Attending an industry (eg, pharmaceutical or equipment/device manufacturer)-sponsored dinner or social event | 3.20 (2.98-3.41) | 28.6 | 60.5
Ignoring family member's nonurgent questions about a cross-cover patient when you had time to answer | 2.05 (1.85-2.25) | 26.3 | 48.7
Attesting to a resident's note when not fully confident of the content of their documentation | 1.65 (1.45-1.85) | 23.4 | 32.5
Making fun of support staff to colleagues | 1.45 (1.31-1.59) | 22.1 | 57.9
Not correcting someone who mistakes a student for a physician | 2.19 (2.01-2.38) | 20.8 | 35.1
Celebrating a blocked admission | 1.80 (1.61-2.00) | 21.1 | 60.5
Making fun of residents to colleagues | 1.53 (1.37-1.70) | 18.2 | 44.2
Coming to work when you have a significant illness (eg, influenza) | 1.99 (1.79-2.19) | 14.3 | 35.1
Celebrating a successful turf | 1.71 (1.51-1.92) | 11.7 | 39.0
Failing to notify the patient that a member of the team made, or is concerned that they made, an error | 1.53 (1.34-1.71) | 10.4 | 20.8
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing) | 1.72 (1.52-1.91) | 9.3 | 58.7
Refusing an admission which could be considered appropriate for your service (eg, blocking) | 1.63 (1.44-1.82) | 7.9 | 68.4
Falsifying patient records (ie, back-dating a note, copying forward unverified information, or documenting physical findings not personally obtained) | 1.22 (1.10-1.34) | 6.5 | 27.3
Making fun of students to colleagues | 1.35 (1.19-1.51) | 6.5 | 24.7
Failing to notify patient-safety or risk management that a member of the team made, or is concerned that they made, an error | 1.64 (1.46-1.82) | 5.2 | 13.2
Introducing a student as a doctor to patients | 1.96 (1.76-2.16) | 3.9 | 20.8
Signing-out a procedure or task, that could have been completed during a required shift or by the primary team, in order to go home as early in the day as possible | 1.48 (1.32-1.64) | 3.9 | 48.1
Performing medical or surgical procedures on a patient beyond self-perceived level of skill | 1.27 (1.14-1.41) | 2.6 | 7.8
Asking a student to obtain written consent from a patient or their proxy without supervision (eg, for blood transfusion or minor procedures) | 1.60 (1.42-1.78) | 2.6 | 36.5
Encouraging a student to state that they are a doctor in order to expedite patient care | 1.31 (1.15-1.47) | 2.6 | 6.5
Discharging a patient before they are ready to go home in order to reduce one's census | 1.18 (1.07-1.29) | 2.6 | 19.5
Reporting patient information (eg, labs, test results, exam results) as normal when uncertain of the true results | 1.29 (1.16-1.41) | 2.6 | 15.6
Asking a student to perform medical or surgical procedures which are perceived to be beyond their level of skill | 1.26 (1.12-1.40) | 1.3 | 3.9
Asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge | 1.41 (1.26-1.56) | 0.0 | 15.8

Participation in egregious behaviors, such as falsifying patient records (6.49%) and performing medical or surgical procedures on a patient beyond self-perceived level of skill (2.60%), was very low. The most common behaviors rated as unprofessional that hospitalists reported participating in were having nonmedical/personal conversations in patient corridors (67.1%), ordering a routine test as urgent to expedite care (62.3%), and making fun of other physicians to colleagues (40.3%). Approximately 40% of participants reported disparaging the emergency room (ER) team or primary care physician for findings later discovered, signing out over the phone when it could have been done in person, and texting or using smartphones during educational conferences. In particular, participation in unprofessional behaviors related to trainees was close to zero (eg, asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge). The least common behaviors that hospitalists reported participating in were discharging a patient before they are ready to go home in order to reduce one's census (2.56%) and reporting patient information as normal when uncertain of the true results (2.60%). As in previous studies of unprofessional behaviors, those who reported participation were less likely to rate the behavior as unprofessional.8, 19

Observation of behaviors ranged from 4% to 80%. In all cases, observation of the behavior was reported at a higher level than participation. Correlation between observation and participation was also high, with the exception of a few behaviors that had zero or near-zero participation rates (ie, reporting patient information as normal when unsure of the true results).

After performing factor analysis, 4 factors had eigenvalues greater than 1 and were therefore retained and extracted for further analysis. These 4 factors accounted for 76% of the variance in responses reported on the survey. By examining which items or groups of items most strongly loaded on each factor, the factors were named accordingly: factor 1 referred to behaviors related to making fun of others, factor 2 referred to workload management, factor 3 referred to behaviors related to the learning environment, and factor 4 referred to behaviors related to time pressure (Table 3).

Table 3. Results of Factor Analysis Displaying Items by Primary Loading
  • NOTE: Items were categorized using factor analysis to the factor that they loaded most highly on. All items shown loaded at 0.4 or above onto each factor. Four items were omitted due to loadings less than 0.4. One item cross‐loaded on multiple factors (deferring family questions). Abbreviations: ER, emergency room.

Factor 1: Making fun of others
Making fun of other physicians (0.78)
Making fun of attendings (0.77)
Making fun of residents (0.70)
Making disparaging comments about a patient on rounds (0.51)
Factor 2: Workload management
Celebrating a successful turf (0.81)
Celebrating a blocked‐admission (0.65)
Coming to work sick (0.56)
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing) (0.51)
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (0.48)
Discharging a patient before they are ready to go home in order to reduce one's census (0.43)
Factor 3: Learning environment
Not correcting someone who mistakes a student for a physician (0.72)
Texting or using smartphone during educational conferences (ie, noon lecture) (0.51)
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error (0.45)
Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) (0.43)
Factor 4: Time pressure
Ignoring family member's nonurgent questions about a cross‐cover patient when you had the time to answer (0.50)
Signing out patients over the phone at the end of shift when sign‐out could have been done in person (0.46)
Attesting to a resident's note when not fully confident of the content of their documentation (0.44)

Using site-adjusted multivariate regression, certain hospitalist job characteristics were associated with certain patterns of participating in unprofessional behavior (Table 4). Those with less clinical time (<50% FTE) were more likely to participate in unprofessional behaviors related to making fun of others (factor 1, β = 0.94, 95% CI 0.32 to 1.56, P < 0.05). Hospitalists who had any administrative time (β = 0.61, 95% CI 0.11 to 1.10, P < 0.05) were more likely to report participation in behaviors related to workload management. Hospitalists engaged in any night work were more likely to report participation in unprofessional behaviors related to time pressure (β = 0.67, 95% CI 0.17 to 1.17, P < 0.05). Time devoted to teaching or research was not associated with greater participation in any of the domains of unprofessional behavior surveyed.

Table 4. Association Between Hospitalist Job and Demographic Characteristics and Factors of Unprofessional Behavior
Model | Making Fun of Others | Learning Environment | Workload Management | Time Pressure
Predictor | Beta [95% CI] | Beta [95% CI] | Beta [95% CI] | Beta [95% CI]
  • NOTE: Table shows the results of 4 different multivariable linear regression models, which examine the association between various covariates (job characteristics, demographic characteristics, and site) and factors of participation in unprofessional behaviors (making fun of others, learning environment, workload management, and time pressure). Due to item nonresponse, n = 63 for all regression models. Abbreviations: CI, confidence interval.

  • P < 0.05.

  • Less clinical was defined as less than 50% full‐time equivalent (FTE) in a given year spent on clinical work.

  • Teaching was defined as greater than the median (10% FTE) spent on teaching. Results did not change when using tertiles of teaching effort, or a cutoff at teaching greater than 20% FTE.

  • Administrative time, research time, and nights were defined as reporting any administrative time, research time, or night work, respectively (greater than 0% per year).

  • Younger was defined as those born after 1970.

Job characteristics
Less clinical | 0.94 [0.32, 1.56]* | -0.01 [-0.66, 0.64] | -0.17 [-0.84, 0.49] | 0.39 [-0.24, 1.01]
Administrative | 0.30 [-0.16, 0.76] | 0.06 [-0.43, 0.54] | 0.61 [0.11, 1.10]* | 0.26 [-0.20, 0.72]
Teaching | -0.01 [-0.49, 0.48] | -0.09 [-0.60, 0.42] | -0.12 [-0.64, 0.40] | 0.16 [-0.33, 0.65]
Research | -0.30 [-0.87, 0.27] | -0.38 [-0.98, 0.22] | -0.37 [-0.98, 0.24] | 0.13 [-0.45, 0.71]
Any nights | -0.08 [-0.58, 0.42] | 0.24 [-0.28, 0.77] | 0.24 [-0.29, 0.76] | 0.67 [0.17, 1.17]*
Demographic characteristics
Male | 0.06 [-0.42, 0.53] | 0.03 [-0.47, 0.53] | -0.05 [-0.56, 0.47] | -0.40 [-0.89, 0.08]
Younger | -0.05 [-0.79, 0.69] | -0.64 [-1.42, 0.14] | 0.87 [0.07, 1.67]* | 0.62 [-0.13, 1.37]
Unfamiliar with residents | -0.32 [-0.85, 0.22] | -0.32 [-0.89, 0.24] | 0.13 [-0.45, 0.70] | 0.47 [-0.08, 1.01]
Institution
Site 1 | 0.58 [-0.22, 1.38] | -0.05 [-0.89, 0.79] | 1.01 [0.15, 1.86]* | -0.77 [-1.57, 0.04]
Site 3 | -0.11 [-0.68, 0.47] | -0.70 [-1.31, -0.09]* | 0.43 [-0.20, 1.05] | 0.45 [-0.13, 1.04]
Constant | 0.03 [-0.99, 1.06] | 0.94 [-0.14, 2.02] | -1.23 [-2.34, -0.13]* | -1.34 [-2.39, -0.31]*

The only demographic characteristic that was significantly associated with unprofessional behavior was age. Specifically, those who were born after 1970 were more likely to participate in unprofessional behaviors related to workload management (β = 0.87, 95% CI 0.07 to 1.67, P < 0.05). Site differences were also present. Specifically, one site was more likely to report participation in unprofessional behaviors related to workload management (site 1: β = 1.01, 95% CI 0.15 to 1.86, P < 0.05), while another site was less likely to report participation in behaviors related to the learning environment (site 3: β = -0.70, 95% CI -1.31 to -0.09, P < 0.05). Gender and familiarity with residents were not significant predictors of participation in unprofessional behaviors. Results remained robust in sensitivity analyses using different cutoffs of clinical time and teaching time.

DISCUSSION

This multisite study adds to what is known about the perceptions of, and participation in, unprofessional behaviors among internal medicine hospitalists. Hospitalists perceived almost all surveyed behaviors as unprofessional. Participation in egregious and trainee‐related unprofessional behaviors was very low. Four categories appeared to explain the variability in how hospitalists reported participation in unprofessional behaviors: making fun of others, workload management, learning environment, and time pressure. Participation in behaviors within these factors was associated with certain job characteristics, such as clinical time, administrative time, and night work, as well as age and site.

It is reassuring that participation in egregious and trainee-related unprofessional behaviors is very low, and it is noteworthy that attending an industry-sponsored dinner was not considered unprofessional. The latter was surprising in the setting of increasing external pressure to report and ban such interactions.28 The perception that attending such dinners is acceptable may reflect a lag between current practice and national recommendations.

It is important to explore why certain job characteristics are associated with participation in unprofessional behaviors. For example, those with less clinical time were more likely to participate in making fun of others. It may be the case that hospitalists with more clinical time may make a larger effort to develop and maintain positive relationships. Another possible explanation is that hospitalists with less clinical time are more easily influenced by those in the learning environment who make fun of others, such as residents who they are supervising for only a brief period.

For unprofessional behaviors related to workload management, those who were younger, and those with any administrative time, were more likely to participate in behaviors such as celebrating a blocked‐admission. Our prior work shows that behaviors related to workload management are more widespread in residency, and therefore younger hospitalists, who are often recent residency graduates, may be more prone to participating in these behaviors. While unproven, it is possible that those with more administrative time may have competing priorities with their administrative roles, which motivate them to more actively manage their workload, leading them to participate in workload management behaviors.

Hospitalists who did any night work were more likely to participate in unprofessional behaviors related to time pressure. This could reflect the high workloads that night hospitalists may face and the pressure they feel to wrap up work, resulting in a hasty handoff (ie, over the phone) or deferred work (ie, family questions). Site differences were also observed for participation in behaviors related to the learning environment, speaking to the importance of institutional culture.

It is worth mentioning that hospitalists who were teachers were not any less likely to report participating in certain behaviors. While 78% of hospitalists reported some level of teaching, the median reported percentage of teaching was 10% FTE. This level of teaching likely reflects the diverse nature of work in which hospitalists engage. While hospitalists spend some time working with trainees, services that are not staffed with residents (eg, uncovered services) are becoming increasingly common due to stricter resident duty hour restrictions. This may explain why 60% of hospitalists reported being unfamiliar with residents. We also used a high bar for familiarity, which we defined as knowing half of residents by name and which served as a proxy for whether a hospitalist may have trained at the institution where they currently work. In spite of hospitalists reporting a low fraction of their total clinical time devoted to resident services, a significant fraction of resident services were staffed by hospitalists at all sites, making them a natural target for interventions.

These results have implications for future work to assess and improve professionalism in the hospital learning environment. First, interventions to address unprofessional behaviors should focus on behaviors with the highest participation rates. As in our earlier studies of residents, participation was high in certain behaviors, such as misrepresenting a test as urgent, or disparaging the ER or primary care physician (PCP) for a missed finding.19, 20 While blocking an admission was common in our studies of residents, reported participation among hospitalists was low. As in a prior study of clinical-year medical students at one of our sites, 1 in 5 hospitalists reported not correcting someone who mistakes a student for a physician, highlighting the role that hospitalists may have in perpetuating this behavior.8 Additionally, addressing the behaviors identified in this study through novel curricular tools may help to teach residents many of the interpersonal and communication skills called for in the 2011 ACGME Common Program Requirements.11 The ACGME requirements also include the expectation that faculty model how to manage their time before, during, and after clinical assignments, and recognize that transferring a patient to a rested provider is best. Given that most hospitalists believe staying past shift limit is professional, these requirements will be difficult to adopt without widespread culture change.

Moreover, interventions could be tailored to hospitalists with certain job characteristics. Interventions may be educational or systems based. An example of the former is stressing the impact of the learning and working environment on trainees; an example of the latter is streamlining the process by which ordered tests are executed so that tests are completed more quickly, which may leave fewer physicians feeling the need to misrepresent a routine test as urgent. Additionally, hospitalists with less clinical time could receive education on their impact as role models for trainees. Hospitalists who are younger or who have administrative commitments could be trained on the importance of avoiding behaviors related to workload management, such as blocking or turfing patients. Lastly, given the site differences, critical examination of institutional culture and policies is also important. With funding from the American Board of Internal Medicine (ABIM) Foundation, we are currently creating an educational intervention targeting the behaviors that were most frequent among hospitalists and residents at our institutions, with the goal of promoting dialogue and critical reflection and ultimately reducing the most prevalent behaviors.

There are several limitations to this study. Despite the anonymity of the survey, participants may have inaccurately reported their participation in unprofessional behaviors because of socially desirable responding. In addition, because we used factor analysis and multivariate regression models with a small sample size, item nonresponse limited the sample available for regression analyses and raises concern for response bias. However, all significant associations remained significant after backward stepwise elimination of covariates with P > 0.10, which allowed larger models (n ranging from 65 to 69). Because we used self-report rather than direct observation of participation in unprofessional behaviors, it is not possible to validate the responses given. Future work could use 360-degree evaluations or other methods to validate self-reported responses. It is also important to consider assessing whether these behaviors are associated with actual patient outcomes, such as length of stay or readmission. Some items may not always be unprofessional; for example, texting during an educational conference might be done to advance patient care, which would not necessarily be unprofessional. The order in which the questions were asked could also have led to bias. We asked about participation before perception to try to limit biased reporting of participation; reversing the order could have resulted in under-reporting of participation in behaviors that respondents perceived to be unprofessional. This study was conducted at 3 institutions located in Chicago, limiting generalizability to institutions outside this area. Only internal medicine hospitalists were surveyed, which also limits generalizability to other disciplines and specialties within internal medicine. Lastly, it is important to highlight that hospitalists are not the sole teachers on inpatient services, because residents encounter a variety of faculty who serve as teaching attendings. Future work should expand to other centers and other specialties.
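To illustrate the sensitivity analysis described above, the following is a minimal sketch of backward stepwise elimination of covariates with P > 0.10 from a linear regression, written in Python with statsmodels. The data frame, outcome, and covariate names are hypothetical placeholders rather than the study's actual variables, and the sketch is not the exact code used in the analysis.

# Minimal sketch (hypothetical variable names): backward stepwise elimination of
# covariates with P > 0.10, refitting the linear model after each drop.
import statsmodels.api as sm

def backward_eliminate(df, outcome, covariates, p_threshold=0.10):
    """Iteratively drop the covariate with the largest p-value above p_threshold.

    df is expected to be a pandas DataFrame containing the outcome and covariates.
    """
    kept = list(covariates)
    while kept:
        X = sm.add_constant(df[kept])                # design matrix with intercept
        model = sm.OLS(df[outcome], X, missing="drop").fit()
        pvals = model.pvalues.drop("const")          # ignore the intercept term
        worst = pvals.idxmax()
        if pvals[worst] <= p_threshold:              # every remaining covariate meets the cutoff
            return model, kept
        kept.remove(worst)                           # drop the weakest covariate and refit
    return None, []

# Hypothetical usage: "factor_score" stands in for one unprofessional-behavior factor,
# and the covariate names mirror the job and demographic characteristics in the models.
# model, retained = backward_eliminate(survey_df, "factor_score",
#     ["less_clinical", "administrative", "teaching", "research", "any_nights",
#      "male", "younger", "unfamiliar_with_residents", "site1", "site3"])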

In conclusion, in this multi-institutional study of hospitalists, participation in egregious behaviors was low. Four factors, or patterns, underlie hospitalists' reports of participation in unprofessional behavior: making fun of others, the learning environment, workload management, and time pressure. Job characteristics (clinical time, administrative time, night work), age, and site were all associated with different patterns of unprofessional behavior. Specifically, hospitalists with less clinical time were more likely to make fun of others; younger hospitalists, as well as those with any administrative work, were more likely to participate in behaviors related to workload management; and hospitalists who worked nights were more likely to report behaviors related to time pressure. Interventions to promote professionalism should take institutional culture into account and should focus on behaviors with the highest participation rates. Efforts should also be made to address the underlying reasons for participation in these behaviors.

Acknowledgements

The authors thank Meryl Prochaska for her research assistance and manuscript preparation.

Disclosures: The authors acknowledge funding from the ABIM Foundation and the Pritzker Summer Research Program. The funders had no role in the design of the study; the collection, analysis, and interpretation of the data; or the decision to approve publication of the finished manuscript. Prior presentations of the data include the 2010 University of Chicago Pritzker School of Medicine Summer Research Forum, the 2010 University of Chicago Pritzker School of Medicine Medical Education Day, the 2010 Midwest Society of Hospital Medicine Meeting in Chicago, IL, and the 2011 Society of Hospital Medicine National Meeting in Dallas, TX. All authors report no relevant financial or other conflicts of interest.

References
  1. Stern DT. Practicing what we preach? An analysis of the curriculum of values in medical education. Am J Med. 1998;104:569-575.
  2. Borgstrom E, Cohn S, Barclay S. Medical professionalism: conflicting values for tomorrow's doctors. J Gen Intern Med. 2010;25(12):1330-1336.
  3. Karnieli‐Miller O, Vu TR, Holtman MC, Clyman SG, Inui TS. Medical students' professionalism narratives: a window on the informal and hidden curriculum. Acad Med. 2010;85(1):124-133.
  4. Cohn FG, Shapiro J, Lie DA, Boker J, Stephens F, Leung LA. Interpreting values conflicts experienced by obstetrics‐gynecology clerkship students using reflective writing. Acad Med. 2009;84(5):587-596.
  5. Gaiser RR. The teaching of professionalism during residency: why it is failing and a suggestion to improve its success. Anesth Analg. 2009;108(3):948-954.
  6. Gofton W, Regehr G. What we don't know we are teaching: unveiling the hidden curriculum. Clin Orthop Relat Res. 2006;449:20-27.
  7. Hafferty FW. Definitions of professionalism: a search for meaning and identity. Clin Orthop Relat Res. 2006;449:193-204.
  8. Reddy ST, Farnan JM, Yoon JD, et al. Third‐year medical students' participation in and perceptions of unprofessional behaviors. Acad Med. 2007;82:S35-S39.
  9. Hafferty FW. Beyond curriculum reform: confronting medicine's hidden curriculum. Acad Med. 1998;73:403-407.
  10. Pfifferling JH. Physicians' "disruptive" behavior: consequences for medical quality and safety. Am J Med Qual. 2008;23:165-167.
  11. Accreditation Council for Graduate Medical Education. Common Program Requirements: General Competencies. Available at: http://www.acgme.org/acwebsite/home/common_program_requirements_07012011.pdf. Accessed December 19, 2011.
  12. Liaison Committee on Medical Education. Functions and Structure of a Medical School. Available at: http://www.lcme.org/functions2010jun.pdf. Accessed June 30, 2010.
  13. Gillespie C, Paik S, Ark T, Zabar S, Kalet A. Residents' perceptions of their own professionalism and the professionalism of their learning environment. J Grad Med Educ. 2009;1:208-215.
  14. Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med. 2004;79:244-249.
  15. Papadakis MA, Teherani A, Banach MA, et al. Disciplinary action by medical boards and prior behavior in medical school. N Engl J Med. 2005;353:2673-2682.
  16. Rosenstein AH, O'Daniel M. A survey of the impact of disruptive behaviors and communication defects on patient safety. Jt Comm J Qual Patient Saf. 2008;34:464-471.
  17. Rosenstein AH, O'Daniel M. Managing disruptive physician behavior: impact on staff relationships and patient care. Neurology. 2008;70:1564-1570.
  18. The Joint Commission. Behaviors that undermine a culture of safety. Sentinel Event Alert. 2008. Available at: http://www.jointcommission.org/assets/1/18/SEA_40.PDF. Accessed April 28, 2012.
  19. Arora VM, Wayne DB, Anderson RA, Didwania A, Humphrey HJ. Participation in and perceptions of unprofessional behaviors among incoming internal medicine interns. JAMA. 2008;300:1132-1134.
  20. Arora VM, Wayne DB, Anderson RA, et al. Changes in perception of and participation in unprofessional behaviors during internship. Acad Med. 2010;85:S76-S80.
  21. Wachter RM. Reflections: the hospitalist movement a decade later. J Hosp Med. 2006;1:248-252.
  22. Society of Hospital Medicine. 2007–2008 Bi‐Annual Survey. 2008. Available at: http://www.medscape.org/viewarticle/578134. Accessed April 28, 2012.
  23. Holmboe ES, Bowen JL, Green M, et al. Reforming internal medicine residency training. A report from the Society of General Internal Medicine's Task Force for Residency Reform. J Gen Intern Med. 2005;20:1165-1172.
  24. Society of Hospital Medicine. The Core Competencies in Hospital Medicine: a framework for curriculum development by the Society of Hospital Medicine. J Hosp Med. 2006;1(suppl 1):25.
  25. Caldicott CV, Dunn KA, Frankel RM. Can patients tell when they are unwanted? "Turfing" in residency training. Patient Educ Couns. 2005;56:104-111.
  26. Costello AB, Osborn JW. Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Pract Assess Res Eval. 2005;10:1-9.
  27. Principal Components and Factor Analysis. StatSoft Electronic Statistics Textbook. Available at: http://www.statsoft.com/textbook/principal‐components‐factor‐analysis/. Accessed December 30, 2011.
  28. Brennan TA, Rothman DJ, Blank L, et al. Health industry practices that create conflicts of interest: a policy proposal for academic medical centers. JAMA. 2006;295(4):429-433.
The discrepancy between what is taught about professionalism in formal medical education and what is witnessed in the hospital has received increasing attention.17 This latter aspect of medical education contributes to the hidden curriculum and impacts medical trainees' views on professionalism.8 The hidden curriculum refers to the lessons trainees learn through informal interactions within the multilayered educational learning environment.9 A growing body of work examines how the hidden curriculum and disruptive physicians impact the learning environment.9, 10 In response, regulatory agencies, such as the Liaison Committee on Medical Education (LCME) and Accreditation Council for Graduate Medical Education (ACGME), require training programs and medical schools to maintain standards of professionalism, and to regularly evaluate the learning environment and its impact on professionalism.11, 12 The ACGME in 2011 expanded its standards regarding professionalism by making certain that the program director and institution ensure a culture of professionalism that supports patient safety and personal responsibility.11 Given this increasing focus on professionalism in medical school and residency training programs, it is critical to examine faculty perceptions and actions that may perpetuate the discrepancy between the formal and hidden curriculum.13 This early exposure is especially significant because unprofessional behavior in medical school is strongly associated with later disciplinary action by a medical board.14, 15 Certain unprofessional behaviors can also compromise patient care and safety, and can detract from the hospital working environment.1618

In our previous work, we demonstrated that internal medicine interns reported increased participation in unprofessional behaviors regarding on‐call etiquette during internship.19, 20 Examples of these behaviors include refusing an admission (ie, blocking) and misrepresenting a test as urgent. Interestingly, students and residents have highlighted the powerful role of supervising faculty physicians in condoning or inhibiting such behavior. Given the increasing role of hospitalists as resident supervisors, it is important to consider the perceptions and actions of hospitalists with respect to perpetuating or hindering some unprofessional behaviors. Although hospital medicine is a relatively new specialty, many hospitalists are in frequent contact with medical trainees, perhaps because many residency programs and medical schools have a strong inpatient focus.2123 It is thus possible that hospitalists have a major influence on residents' behaviors and views of professionalism. In fact, the Society of Hospital Medicine's Core Competencies for Hospital Medicine explicitly state that hospitalists are expected to serve as a role model for professional and ethical conduct to house staff, medical students and other members of the interdisciplinary team.24

Therefore, the current study had 2 aims: first, to measure internal medicine hospitalists' perceptions of, and participation in, unprofessional behaviors using a previously validated survey; and second, to examine associations between job characteristics and participation in unprofessional behaviors.

METHODS

Study Design

This was a multi‐institutional, observational study that took place at the University of Chicago Pritzker School of Medicine, Northwestern University Feinberg School of Medicine, and NorthShore University HealthSystem. Hospitalist physicians employed at these hospitals were recruited for this study between June 2010 and July 2010. The Institutional Review Boards of the University of Chicago, Northwestern University, and NorthShore University HealthSystem approved this study. All subjects provided informed consent before participating.

Survey Development and Administration

Based on a prior survey of interns and third‐year medical students, a 35‐item survey was used to measure perceptions of, and participation in, unprofessional behaviors.8, 19, 20 The original survey was developed in 2005 by medical students who observed behaviors by trainees and faculty that they considered to be unprofessional. The survey was subsequently modified by interns to ascertain unprofessional behavior among interns. For this iteration, hospitalists and study authors at each site reviewed the survey items and adapted each item to ensure relevance to hospitalist work and also generalizability to site. New items were also created to refer specifically to work routinely performed by hospitalist attendings (attesting to resident notes, transferring patients to other services to reduce workload, etc). Because of this, certain items utilized jargon to refer to the unprofessional behavior as hospitalists do (ie, blocking admissions and turfing), and resonate with literature describing these phenomena.25 Items were also written in such a fashion to elicit the unprofessional nature (ie, blocking an admission that could be appropriate for your service).

The final survey (see Supporting Information, Appendix, in the online version of this article) included domains such as interactions with others, interactions with trainees, and patient‐care scenarios. Demographic information and job characteristics were collected including year of residency completion, total amount of clinical work, amount of night work, and amount of administrative work. Hospitalists were not asked whether they completed residency at the institution where they currently work in order to maintain anonymity in the context of a small sample. Instead, they were asked to rate their familiarity with residents at their institution on a Likert‐type scale ranging from very unfamiliar (1) to familiar (3) to very familiar (5). To help standardize levels of familiarity across hospitalists, we developed anchors that corresponded to how well a hospitalist would know resident names with familiar defined as knowing over half of resident names.

Participants reported whether they participated in, or observed, a particular behavior and rated their perception of each behavior from 1 (unprofessional) to 5 (professional), with unprofessional and somewhat unprofessional defined as unprofessional. A site champion administered paper surveys during a routine faculty meeting at each site. An electronic version was administered using SurveyMonkey (SurveyMonkey, Palo Alto, CA) to hospitalists not present at the faculty meeting. Participants chose a unique, nonidentifiable code to facilitate truthful reporting while allowing data tracking in follow‐up studies.

Data Analysis

Clinical time was dichotomized using above and below 50% full‐time equivalents (FTE) to define those that did less clinical. Because teaching time was relatively low with the median percent FTE spent on teaching at 10%, we used a cutoff of greater than 10% as greater teaching. Because many hospitalists engaged in no night work, night work was reported as those who engaged in any night work and those who did not. Similarly, because many hospitalists had no administrative time, administrative time was split into those with any administrative work and those without any administrative work. Lastly, those born after 1970 were classified as younger hospitalists.

Chi‐square tests were used to compare site response rates, and descriptive statistics were used to examine demographic characteristics of hospitalist respondents, in addition to perception of, and participation in, unprofessional behaviors. Because items on the survey were highly correlated, we used factor analysis to identify the underlying constructs that related to unprofessional behavior.26 Factor analysis is a statistical procedure that is most often used to explore which variables in a data set are most related or correlated to each other. By examining the patterns of similar responses, the underlying factors can be identified and extracted. These factors, by definition, are not correlated with each other. To select the number of factors to retain, the most common convention is to use Kaiser criterion, or retain all factors with eigenvalues greater than, or equal to, one.27 An eigenvalue measures the amount of variation in all of the items on the survey which is accounted for by that factor. If a factor has a low eigenvalue (less than 1 is the convention), then it is contributing little and is ignored, as it is likely redundant with the higher value factors.

Because use of Kaiser criterion often overestimates the number of factors to retain, another method is to use a scree plot which tends to underestimate the factors. Both were used in this study to ensure a stable solution. To name the factors, we examined which items or group of items loaded or were most highly related to which factor. To ensure an optimal factor solution, items with minimal participation (less than 3%) were excluded from factor analysis.

Then, site‐adjusted multivariate regression analysis was used to examine associations between job and demographic characteristics, and the factors of unprofessional behavior identified. Models controlled for gender and familiarity with residents. Because sample medians were used to define greater teaching (>10% FTE), we also performed a sensitivity analysis using different cutoffs for teaching time (>20% FTE and teaching tertiles). Likewise, we also used varying definitions of less clinical time to ensure that any statistically significant associations were robust across varying definitions. All data were analyzed using STATA 11.0 (Stata Corp, College Station, TX) and statistical significance was defined as P < 0.05.

RESULTS

Seventy‐seven of the 101 hospitalists (76.2%) at 3 sites completed the survey. While response rates varied by site (site 1, 67%; site 2, 74%; site 3, 86%), the differences were not statistically significant (2 = 2.9, P = 0.24). Most hospitalists (79.2%) completed residency after 2000. Over half (57.1%) of participants were male, and over half (61%) reported having worked with their current hospitalist group from 1 to 4 years. Almost 60% (59.7%) reported being unfamiliar with residents in the program. Over 40% of hospitalists did not do any night work. Hospitalists were largely clinical, one‐quarter of hospitalists reported working over 50% FTE, and the median was 80% FTE. While 78% of hospitalists reported some teaching time, median time on teaching service was low at 10% (Table 1).

Table 1. Demographics of Responders* (n = 77)

Characteristic | Total, n (%)
Male | 44 (57.1)
Completed residency
  Between 1981 and 1990 | 2 (2.6)
  Between 1991 and 2000 | 14 (18.2)
  After 2000 | 61 (79.2)
Medical school matriculation (n = 76)
  US medical school | 59 (77.6)
  International medical school | 17 (22.3)
Years spent with current hospitalist group
  <1 yr | 14 (18.2)
  1-4 yr | 47 (61.0)
  5-9 yr | 15 (19.5)
  >10 yr | 1 (1.3)
Familiarity with residents
  Familiar | 31 (40.2)
  Unfamiliar | 46 (59.7)
No. of weeks per year spent on, median [IQR]
  Hospitalist practice (n = 72) | 26.0 [16.0-26.0]
  Teaching services (n = 68) | 4.0 [1.0-8.0]
Weeks working nights* (n = 71)
  >2 wk | 16 (22.5)
  1-2 wk | 24 (33.8)
  0 wk | 31 (43.7)
% Clinical time, median (IQR)* (n = 73) | 80 (50-99)
% Teaching time, median (IQR)* (n = 74) | 10 (1-20)
Any research time* (n = 71) | 22 (31.0)
Any administrative time (n = 72) | 29 (40.3)
Completed fellowship* | 12 (15.6)
Won teaching awards* (n = 76) | 21 (27.6)
View a career in hospital medicine as
  Temporary | 11 (14.3)
  Long term | 47 (61.0)
  Unsure | 19 (24.7)

  • Abbreviations: IQR, interquartile range.
  • *Site differences were observed for clinical practice characteristics, such as number of weeks of teaching service, weeks working nights, clinical time, research time, completed fellowship, and won teaching awards. Due to item nonresponse, the number of respondents reporting is listed for each item.
  • Familiarity with residents was asked in lieu of whether the hospitalist trained at the institution. Familiarity was defined as a rating of 4 or 5 on a Likert scale ranging from Very Unfamiliar (1) to Very Familiar (5), with Familiar (4) defined further as knowing >50% of residents' names.

Hospitalists perceived almost all behaviors as unprofessional (rated unprofessional or somewhat unprofessional on a 5‐point Likert scale). The only behavior rated as professional, with a mean of 4.25 (95% CI 4.01-4.49), was staying past shift limit to complete a patient‐care task that could have been signed out. This behavior also had the highest level of participation by hospitalists (81.7%). Hospitalists were most ambivalent when rating the professionalism of attending an industry‐sponsored dinner or social event (mean 3.20, 95% CI 2.98-3.41) (Table 2).

Table 2. Perception of, and Observation and Participation in, Unprofessional Behaviors Among Hospitalists (n = 77)

Behavior | Reported Perception, Mean Likert Score (95% CI)* | Reported Participation (%) | Reported Observation (%)
Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) | 2.55 (2.34-2.76) | 67.1 | 80.3
Ordering a routine test as urgent to get it expedited | 2.82 (2.58-3.06) | 62.3 | 80.5
Making fun of other physicians to colleagues | 1.56 (1.39-1.70) | 40.3 | 67.5
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (eg, after the patient is admitted) | 2.01 (1.84-2.19) | 39.5 | 67.1
Signing out patients over the phone at the end of shift when sign-out could have been done in person | 2.95 (2.74-3.16) | 40.8 | 65.8
Texting or using smartphone during educational conferences (ie, noon lecture) | 2.16 (1.95-2.36) | 39.0 | 72.7
Discussing patient information in public spaces | 1.49 (1.34-1.63) | 37.7 | 66.2
Making fun of other attendings to colleagues | 1.62 (1.46-1.78) | 35.1 | 61.0
Deferring family members' concerns about a change in the patient's clinical course to the primary team in order to avoid engaging in such a discussion | 2.16 (1.91-2.40) | 30.3 | 55.3
Making disparaging comments about a patient on rounds | 1.42 (1.27-1.56) | 29.8 | 67.5
Attending an industry (eg, pharmaceutical or equipment/device manufacturer)-sponsored dinner or social event | 3.20 (2.98-3.41) | 28.6 | 60.5
Ignoring family member's nonurgent questions about a cross-cover patient when you had time to answer | 2.05 (1.85-2.25) | 26.3 | 48.7
Attesting to a resident's note when not fully confident of the content of their documentation | 1.65 (1.45-1.85) | 23.4 | 32.5
Making fun of support staff to colleagues | 1.45 (1.31-1.59) | 22.1 | 57.9
Not correcting someone who mistakes a student for a physician | 2.19 (2.01-2.38) | 20.8 | 35.1
Celebrating a blocked-admission | 1.80 (1.61-2.00) | 21.1 | 60.5
Making fun of residents to colleagues | 1.53 (1.37-1.70) | 18.2 | 44.2
Coming to work when you have a significant illness (eg, influenza) | 1.99 (1.79-2.19) | 14.3 | 35.1
Celebrating a successful turf | 1.71 (1.51-1.92) | 11.7 | 39.0
Failing to notify the patient that a member of the team made, or is concerned that they made, an error | 1.53 (1.34-1.71) | 10.4 | 20.8
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing) | 1.72 (1.52-1.91) | 9.3 | 58.7
Refusing an admission which could be considered appropriate for your service (eg, blocking) | 1.63 (1.44-1.82) | 7.9 | 68.4
Falsifying patient records (ie, back-dating a note, copying forward unverified information, or documenting physical findings not personally obtained) | 1.22 (1.10-1.34) | 6.5 | 27.3
Making fun of students to colleagues | 1.35 (1.19-1.51) | 6.5 | 24.7
Failing to notify patient-safety or risk management that a member of the team made, or is concerned that they made, an error | 1.64 (1.46-1.82) | 5.2 | 13.2
Introducing a student as a doctor to patients | 1.96 (1.76-2.16) | 3.9 | 20.8
Signing-out a procedure or task, that could have been completed during a required shift or by the primary team, in order to go home as early in the day as possible | 1.48 (1.32-1.64) | 3.9 | 48.1
Performing medical or surgical procedures on a patient beyond self-perceived level of skill | 1.27 (1.14-1.41) | 2.6 | 7.8
Asking a student to obtain written consent from a patient or their proxy without supervision (eg, for blood transfusion or minor procedures) | 1.60 (1.42-1.78) | 2.6 | 36.5
Encouraging a student to state that they are a doctor in order to expedite patient care | 1.31 (1.15-1.47) | 2.6 | 6.5
Discharging a patient before they are ready to go home in order to reduce one's census | 1.18 (1.07-1.29) | 2.6 | 19.5
Reporting patient information (eg, labs, test results, exam results) as normal when uncertain of the true results | 1.29 (1.16-1.41) | 2.6 | 15.6
Asking a student to perform medical or surgical procedures which are perceived to be beyond their level of skill | 1.26 (1.12-1.40) | 1.3 | 3.9
Asking a student to discuss, with patients, medical or surgical information which is perceived to be beyond their level of knowledge | 1.41 (1.26-1.56) | 0.0 | 15.8

  • Abbreviations: ER, emergency room.
  • *Perception rated on a Likert scale from 1 (unprofessional) to 5 (professional).

Participation in egregious behaviors, such as falsifying patient records (6.49%) and performing medical or surgical procedures on a patient beyond one's self‐perceived level of skill (2.60%), was very low. The behaviors rated as unprofessional that hospitalists most commonly reported participating in were having nonmedical/personal conversations in patient corridors (67.1%), ordering a routine test as urgent to expedite care (62.3%), and making fun of other physicians to colleagues (40.3%). Approximately 40% of participants reported disparaging the emergency room (ER) team or primary care physician for findings later discovered, signing out over the phone when it could have been done in person, and texting or using smartphones during educational conferences. Participation in unprofessional behaviors related to trainees was close to zero (eg, asking a student to discuss with patients medical or surgical information perceived to be beyond their level of knowledge). Aside from trainee‐related items, the least common behaviors that hospitalists reported participating in were discharging a patient before they are ready to go home in order to reduce one's census (2.56%) and reporting patient information as normal when uncertain of the true results (2.60%). As in previous studies of unprofessional behaviors, hospitalists who reported participation in a behavior were less likely to rate that behavior as unprofessional.8, 19

Observation of behaviors ranged from 4% to 80%. In all cases, observation of a behavior was reported at a higher rate than participation. Correlation between observation and participation was also high, with the exception of a few behaviors that had zero or near‐zero participation rates (eg, reporting patient information as normal when unsure of the true results).

After performing factor analysis, 4 factors had eigenvalues greater than 1 and were therefore retained and extracted for further analysis. These 4 factors accounted for 76% of the variance in responses reported on the survey. By examining which items or groups of items most strongly loaded on each factor, the factors were named accordingly: factor 1 referred to behaviors related to making fun of others, factor 2 referred to workload management, factor 3 referred to behaviors related to the learning environment, and factor 4 referred to behaviors related to time pressure (Table 3).

Table 3. Results of Factor Analysis Displaying Items by Primary Loading
  • NOTE: Items were categorized using factor analysis to the factor that they loaded most highly on. All items shown loaded at 0.4 or above onto each factor. Four items were omitted due to loadings less than 0.4. One item cross‐loaded on multiple factors (deferring family questions). Abbreviations: ER, emergency room.

Factor 1: Making fun of others
Making fun of other physicians (0.78)
Making fun of attendings (0.77)
Making fun of residents (0.70)
Making disparaging comments about a patient on rounds (0.51)
Factor 2: Workload management
Celebrating a successful turf (0.81)
Celebrating a blocked‐admission (0.65)
Coming to work sick (0.56)
Transferring a patient, who could be cared for on one's own service, to another service in order to reduce one's census (eg, turfing.) (0.51)
Disparaging the ER team/outpatient doctor to others for findings later discovered on the floor (0.48)
Discharging a patient before they are ready to go home in order to reduce one's census (0.43)
Factor 3: Learning environment
Not correcting someone who mistakes a student for a physician (0.72)
Texting or using smartphone during educational conferences (ie, noon lecture) (0.51)
Failing to notify patient‐safety or risk management that a member of the team made, or is concerned that they made, an error (0.45)
Having nonmedical/personal conversations in patient corridors (eg, discussing evening plans) (0.43)
Factor 4: Time pressure
Ignoring family member's nonurgent questions about a cross‐cover patient when you had the time to answer (0.50)
Signing out patients over the phone at the end of shift when sign‐out could have been done in person (0.46)
Attesting to a resident's note when not fully confident of the content of their documentation (0.44)

Using site‐adjusted multivariate regression, certain hospitalist job characteristics were associated with certain patterns of participation in unprofessional behavior (Table 4). Those with less clinical time (<50% FTE) were more likely to participate in unprofessional behaviors related to making fun of others (factor 1; β = 0.94, 95% CI 0.32 to 1.56, P < 0.05). Hospitalists who had any administrative time (β = 0.61, 95% CI 0.11 to 1.10, P < 0.05) were more likely to report participation in behaviors related to workload management. Hospitalists engaged in any night work were more likely to report participation in unprofessional behaviors related to time pressure (β = 0.67, 95% CI 0.17 to 1.17, P < 0.05). Time devoted to teaching or research was not associated with greater participation in any of the domains of unprofessional behavior surveyed.

Table 4. Association Between Hospitalist Job and Demographic Characteristics and Factors of Unprofessional Behavior

Predictor | Making Fun of Others, Beta [95% CI] | Learning Environment, Beta [95% CI] | Workload Management, Beta [95% CI] | Time Pressure, Beta [95% CI]
Job characteristics
  Less clinical | 0.94 [0.32, 1.56]* | -0.01 [-0.66, 0.64] | -0.17 [-0.84, 0.49] | 0.39 [-0.24, 1.01]
  Administrative | 0.30 [-0.16, 0.76] | 0.06 [-0.43, 0.54] | 0.61 [0.11, 1.10]* | 0.26 [-0.20, 0.72]
  Teaching | -0.01 [-0.49, 0.48] | -0.09 [-0.60, 0.42] | -0.12 [-0.64, 0.40] | 0.16 [-0.33, 0.65]
  Research | -0.30 [-0.87, 0.27] | -0.38 [-0.98, 0.22] | -0.37 [-0.98, 0.24] | 0.13 [-0.45, 0.71]
  Any nights | -0.08 [-0.58, 0.42] | 0.24 [-0.28, 0.77] | 0.24 [-0.29, 0.76] | 0.67 [0.17, 1.17]*
Demographic characteristics
  Male | 0.06 [-0.42, 0.53] | 0.03 [-0.47, 0.53] | -0.05 [-0.56, 0.47] | -0.40 [-0.89, 0.08]
  Younger | -0.05 [-0.79, 0.69] | -0.64 [-1.42, 0.14] | 0.87 [0.07, 1.67]* | 0.62 [-0.13, 1.37]
  Unfamiliar with residents | -0.32 [-0.85, 0.22] | -0.32 [-0.89, 0.24] | 0.13 [-0.45, 0.70] | 0.47 [-0.08, 1.01]
Institution
  Site 1 | 0.58 [-0.22, 1.38] | -0.05 [-0.89, 0.79] | 1.01 [0.15, 1.86]* | -0.77 [-1.57, 0.04]
  Site 3 | -0.11 [-0.68, 0.47] | -0.70 [-1.31, -0.09]* | 0.43 [-0.20, 1.05] | 0.45 [-0.13, 1.04]
Constant | 0.03 [-0.99, 1.06] | 0.94 [-0.14, 2.02] | -1.23 [-2.34, -0.13]* | -1.34 [-2.39, -0.31]*

  • NOTE: Table shows the results of 4 different multivariable linear regression models, which examine the association between various covariates (job characteristics, demographic characteristics, and site) and factors of participation in unprofessional behaviors (making fun of others, learning environment, workload management, and time pressure). Due to item nonresponse, n = 63 for all regression models. Abbreviations: CI, confidence interval.
  • *P < 0.05.
  • Less clinical was defined as less than 50% full-time equivalent (FTE) in a given year spent on clinical work. Teaching was defined as greater than the median (10% FTE) spent on teaching; results did not change when using tertiles of teaching effort or a cutoff at greater than 20% FTE. Administrative time, research time, and nights were defined as reporting any administrative time, research time, or night work, respectively (greater than 0% per year). Younger was defined as those born after 1970.

The only demographic characteristic significantly associated with unprofessional behavior was age. Specifically, those born after 1970 were more likely to participate in unprofessional behaviors related to workload management (β = 0.87, 95% CI 0.07 to 1.67, P < 0.05). Site differences were also present. Specifically, one site was more likely to report participation in unprofessional behaviors related to workload management (site 1: β = 1.01, 95% CI 0.15 to 1.86, P < 0.05), while another site was less likely to report participation in behaviors related to the learning environment (site 3: β = -0.70, 95% CI -1.31 to -0.09, P < 0.05). Gender and familiarity with residents were not significant predictors of participation in unprofessional behaviors. Results remained robust in sensitivity analyses using different cutoffs for clinical and teaching time.

DISCUSSION

This multisite study adds to what is known about the perceptions of, and participation in, unprofessional behaviors among internal medicine hospitalists. Hospitalists perceived almost all surveyed behaviors as unprofessional. Participation in egregious and trainee‐related unprofessional behaviors was very low. Four categories appeared to explain the variability in how hospitalists reported participation in unprofessional behaviors: making fun of others, workload management, learning environment, and time pressure. Participation in behaviors within these factors was associated with certain job characteristics, such as clinical time, administrative time, and night work, as well as age and site.

It is reassuring that participation in egregious and trainee‐related unprofessional behaviors was very low. It is also noteworthy that attending an industry‐sponsored dinner was not considered unprofessional. This was surprising given increasing external pressure to report and ban such interactions.28 The perception that attending such dinners is acceptable may reflect a lag between current practice and national recommendations.

It is important to explore why certain job characteristics are associated with participation in unprofessional behaviors. For example, those with less clinical time were more likely to participate in making fun of others. Hospitalists with more clinical time may make a greater effort to develop and maintain positive working relationships. Another possible explanation is that hospitalists with less clinical time are more easily influenced by those in the learning environment who make fun of others, such as residents whom they supervise for only brief periods.

For unprofessional behaviors related to workload management, those who were younger, and those with any administrative time, were more likely to participate in behaviors such as celebrating a blocked‐admission. Our prior work shows that behaviors related to workload management are more widespread in residency, and therefore younger hospitalists, who are often recent residency graduates, may be more prone to participating in these behaviors. While unproven, it is possible that those with more administrative time may have competing priorities with their administrative roles, which motivate them to more actively manage their workload, leading them to participate in workload management behaviors.

Hospitalists who did any night work were more likely to participate in unprofessional behaviors related to time pressure. This could reflect the high workloads that night hospitalists may face and the pressure they feel to wrap up work, resulting in a hasty handoff (eg, sign‐out over the phone) or deferred work (eg, family questions). Site differences were also observed for participation in behaviors related to the learning environment, speaking to the importance of institutional culture.

It is worth mentioning that hospitalists who were teachers were not any less likely to report participating in these behaviors. While 78% of hospitalists reported some teaching time, the median reported teaching time was 10% FTE. This level of teaching likely reflects the diverse nature of hospitalists' work. While hospitalists spend some time working with trainees, services that are not staffed with residents (eg, uncovered services) are becoming increasingly common due to stricter resident duty‐hour restrictions. This may explain why 60% of hospitalists reported being unfamiliar with residents. We also used a high bar for familiarity, defined as knowing more than half of residents by name, which served as a proxy for having trained at the institution where they currently work. Although hospitalists reported that a low fraction of their total clinical time was devoted to resident services, a significant fraction of resident services were staffed by hospitalists at all sites, making hospitalists a natural target for interventions.

These results have implications for future work to assess and improve professionalism in the hospital learning environment. First, interventions to address unprofessional behaviors should focus on the behaviors with the highest participation rates. As in our earlier studies of residents, participation was high for certain behaviors, such as misrepresenting a test as urgent or disparaging the ER or primary care physician (PCP) for a missed finding.19, 20 While blocking an admission was common in our studies of residents, reported participation among hospitalists was low. Similar to a prior study of clinical‐year medical students at one of our sites, 1 in 5 hospitalists reported not correcting someone who mistakes a student for a physician, highlighting the role that hospitalists may have in perpetuating this behavior.8 Additionally, addressing the behaviors identified in this study through novel curricular tools may help to teach residents many of the interpersonal and communication skills called for in the 2011 ACGME Common Program Requirements.11 The ACGME requirements also expect faculty to model how to manage their time before, during, and after clinical assignments, and to recognize that transferring a patient to a rested provider is best. Given that most hospitalists believe staying past a shift limit is professional, these requirements will be difficult to adopt without widespread culture change.

Moreover, interventions could be tailored to hospitalists with certain job characteristics. Interventions may be educational or systems based. An example of the former is stressing the impact of the learning and working environment on trainees; an example of the latter is streamlining the process by which ordered tests are carried out so that they are completed in a timely manner, which may reduce the number of physicians misrepresenting a routine test as urgent. Additionally, hospitalists with less clinical time could receive education on their impact as role models for trainees. Hospitalists who are younger or who have administrative commitments could be trained on the importance of avoiding behaviors related to workload management, such as blocking or turfing patients. Lastly, given the site differences, critical examination of institutional culture and policies is also important. With funding from the American Board of Internal Medicine (ABIM) Foundation, we are currently creating an educational intervention targeting the behaviors that were most frequent among hospitalists and residents at our institutions, with the goal of promoting dialogue and critical reflection and reducing the most prevalent behaviors encountered.

This study has several limitations. Despite the anonymity of the survey, participants may have inaccurately reported their participation in unprofessional behaviors because of social desirability bias. In addition, because we used factor analysis and multivariate regression models with a small sample size, item nonresponse limited the sample available for regression analyses and raises concern for response bias. However, all significant associations remained significant after backwards stepwise elimination of covariates with P > 0.10 in larger models (sample sizes ranging from 65 to 69). Because we relied on self‐report rather than direct observation of participation in unprofessional behaviors, it is not possible to validate the responses given; future work could use 360‐degree evaluations or other methods to do so. It is also important to consider whether these behaviors are associated with actual patient outcomes, such as length of stay or readmission. Some items may not always be unprofessional; for example, texting during an educational conference might be done to advance patient care. The order in which the questions were asked could also have introduced bias. We asked about participation before perception to limit biased reporting of participation; reversing the order could have resulted in under‐reporting of participation in behaviors that respondents perceived to be unprofessional. This study was conducted at 3 institutions located in Chicago, limiting generalizability to institutions outside this area, and only internal medicine hospitalists were surveyed, limiting generalizability to other disciplines and specialties. Lastly, hospitalists are not the sole teachers on inpatient services, since residents encounter a variety of faculty who serve as teaching attendings. Future work should expand to other centers and other specialties.

In conclusion, in this multi‐institutional study of hospitalists, participation in egregious behaviors was low. Four factors, or patterns, underlie hospitalists' reports of participation in unprofessional behavior: making fun of others, learning environment, workload management, and time pressure. Job characteristics (clinical time, administrative time, night work), age, and site were all associated with different patterns of unprofessional behavior. Specifically, hospitalists with less clinical time were more likely to make fun of others; younger hospitalists and those with any administrative work were more likely to participate in behaviors related to workload management; and hospitalists who worked nights were more likely to report behaviors related to time pressure. Interventions to promote professionalism should take institutional culture into account and should focus on behaviors with the highest participation rates. Efforts should also be made to address the underlying reasons for participation in these behaviors.

Acknowledgements

The authors thank Meryl Prochaska for her research assistance and manuscript preparation.

Disclosures: The authors acknowledge funding from the ABIM Foundation and the Pritzker Summer Research Program. The funders had no role in the design of the study; the collection, analysis, and interpretation of the data; or the decision to approve publication of the finished manuscript. Prior presentations of the data include the 2010 University of Chicago Pritzker School of Medicine Summer Research Forum, the 2010 University of Chicago Pritzker School of Medicine Medical Education Day, the 2010 Midwest Society of Hospital Medicine Meeting in Chicago, IL, and the 2011 Society of Hospital Medicine National Meeting in Dallas, TX. All authors disclose no relevant or financial conflicts of interest.

References
  1. Stern DT. Practicing what we preach? An analysis of the curriculum of values in medical education. Am J Med. 1998;104:569-575.
  2. Borgstrom E, Cohn S, Barclay S. Medical professionalism: conflicting values for tomorrow's doctors. J Gen Intern Med. 2010;25(12):1330-1336.
  3. Karnieli-Miller O, Vu TR, Holtman MC, Clyman SG, Inui TS. Medical students' professionalism narratives: a window on the informal and hidden curriculum. Acad Med. 2010;85(1):124-133.
  4. Cohn FG, Shapiro J, Lie DA, Boker J, Stephens F, Leung LA. Interpreting values conflicts experienced by obstetrics-gynecology clerkship students using reflective writing. Acad Med. 2009;84(5):587-596.
  5. Gaiser RR. The teaching of professionalism during residency: why it is failing and a suggestion to improve its success. Anesth Analg. 2009;108(3):948-954.
  6. Gofton W, Regehr G. What we don't know we are teaching: unveiling the hidden curriculum. Clin Orthop Relat Res. 2006;449:20-27.
  7. Hafferty FW. Definitions of professionalism: a search for meaning and identity. Clin Orthop Relat Res. 2006;449:193-204.
  8. Reddy ST, Farnan JM, Yoon JD, et al. Third-year medical students' participation in and perceptions of unprofessional behaviors. Acad Med. 2007;82:S35-S39.
  9. Hafferty FW. Beyond curriculum reform: confronting medicine's hidden curriculum. Acad Med. 1998;73:403-407.
  10. Pfifferling JH. Physicians' "disruptive" behavior: consequences for medical quality and safety. Am J Med Qual. 2008;23:165-167.
  11. Accreditation Council for Graduate Medical Education. Common Program Requirements: General Competencies. Available at: http://www.acgme.org/acwebsite/home/common_program_requirements_07012011.pdf. Accessed December 19, 2011.
  12. Liaison Committee on Medical Education. Functions and Structure of a Medical School. Available at: http://www.lcme.org/functions2010jun.pdf. Accessed June 30, 2010.
  13. Gillespie C, Paik S, Ark T, Zabar S, Kalet A. Residents' perceptions of their own professionalism and the professionalism of their learning environment. J Grad Med Educ. 2009;1:208-215.
  14. Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med. 2004;79:244-249.
  15. Papadakis MA, Teherani A, Banach MA, et al. Disciplinary action by medical boards and prior behavior in medical school. N Engl J Med. 2005;353:2673-2682.
  16. Rosenstein AH, O'Daniel M. A survey of the impact of disruptive behaviors and communication defects on patient safety. Jt Comm J Qual Patient Saf. 2008;34:464-471.
  17. Rosenstein AH, O'Daniel M. Managing disruptive physician behavior: impact on staff relationships and patient care. Neurology. 2008;70:1564-1570.
  18. The Joint Commission. Behaviors that undermine a culture of safety. Sentinel Event Alert. 2008. Available at: http://www.jointcommission.org/assets/1/18/SEA_40.PDF. Accessed April 28, 2012.
  19. Arora VM, Wayne DB, Anderson RA, Didwania A, Humphrey HJ. Participation in and perceptions of unprofessional behaviors among incoming internal medicine interns. JAMA. 2008;300:1132-1134.
  20. Arora VM, Wayne DB, Anderson RA, et al. Changes in perception of and participation in unprofessional behaviors during internship. Acad Med. 2010;85:S76-S80.
  21. Wachter RM. Reflections: the hospitalist movement a decade later. J Hosp Med. 2006;1:248-252.
  22. Society of Hospital Medicine. 2007–2008 Bi-Annual Survey. 2008. Available at: http://www.medscape.org/viewarticle/578134. Accessed April 28, 2012.
  23. Holmboe ES, Bowen JL, Green M, et al. Reforming internal medicine residency training. A report from the Society of General Internal Medicine's Task Force for Residency Reform. J Gen Intern Med. 2005;20:1165-1172.
  24. Society of Hospital Medicine. The Core Competencies in Hospital Medicine: a framework for curriculum development by the Society of Hospital Medicine. J Hosp Med. 2006;1(suppl 1):2-5.
  25. Caldicott CV, Dunn KA, Frankel RM. Can patients tell when they are unwanted? "Turfing" in residency training. Patient Educ Couns. 2005;56:104-111.
  26. Costello AB, Osborn JW. Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Pract Assess Res Eval. 2005;10:1-9.
  27. Principal Components and Factor Analysis. StatSoft Electronic Statistics Textbook. Available at: http://www.statsoft.com/textbook/principal-components-factor-analysis/. Accessed December 30, 2011.
  28. Brennan TA, Rothman DJ, Blank L, et al. Health industry practices that create conflicts of interest: a policy proposal for academic medical centers. JAMA. 2006;295(4):429-433.
Issue
Journal of Hospital Medicine - 7(7)
Page Number
543-550
Display Headline
Participation in unprofessional behaviors among hospitalists: A multicenter study
Copyright © 2012 Society of Hospital Medicine
Correspondence Location
Department of Medicine, The University of Chicago, 5841 S Maryland Ave, MC 2007, AMB B200, Chicago, IL 60637