Development of a test to evaluate residents' knowledge of medical procedures

Graham T. McMahon, MD, MMSc
Department of Medicine, Brigham and Women's Hospital, Boston, Massachusetts
Harvard Medical School, Boston, Massachusetts
Email: gmcmahon@partners.org

Medical procedures, an essential and highly valued part of medical education, are often undertaught and inconsistently evaluated. Hospitalists play an increasingly important role in developing the skills of resident‐learners. Alumni rate procedure skills as some of the most important skills learned during residency training,1, 2 but frequently identify training in procedural skills as having been insufficient.3, 4 For certification in internal medicine, the American Board of Internal Medicine (ABIM) has identified a limited set of procedures for which it expects all candidates to demonstrate cognitive competence. Although active participation in procedures is recommended for certification in internal medicine, the demonstration of procedural proficiency is not required.5

Resident competence in performing procedures remains highly variable, and procedural complications can be a source of morbidity and mortality.2, 6, 7 A validated tool for the assessment of procedure‐related knowledge is currently lacking. In existing standardized tests, including the in‐training examination (ITE) and the ABIM certification examination, only a fraction of questions pertain to medical procedures. The need for a specifically designed, standardized instrument that objectively measures procedure‐related knowledge is underscored by studies demonstrating little correlation between the rate of procedure‐related complications and ABIM/ITE scores.8 A validated tool to assess residents' knowledge of selected medical procedures could serve to assess their readiness to begin supervised practice and form part of a proficiency assessment.

In this study we aimed to develop a valid and reliable test of procedural knowledge in 3 procedures associated with potentially serious complications.

Methods

Arterial line placement, central venous catheter placement, and thoracentesis were selected as the focus for test development. Using the National Board of Medical Examiners question development guidelines, multiple‐choice questions were developed to test residents on specific points of a prepared curriculum. Questions were designed to test the essential cognitive aspects of medical procedures, including indications, contraindications, and the management of complications, with an emphasis on elements that a panel of experts considered frequently misunderstood. Questions were written by faculty trained in question writing (G.M.) and assessed for clarity by other faculty members. Content evidence for the 36‐item examination (12 questions per procedure) was established by a panel of 4 critical care specialists with expertise in medical education. The study was approved by the Institutional Review Board at all sites.

Item performance characteristics were evaluated by administering the test online to a series of 30 trainees and specialty clinicians. Postadministration interviews with the critical care experts were performed to determine whether test questions were clear and appropriate for residents. Following initial testing, 4 test items with the lowest discrimination according to a point‐biserial correlation (Integrity; Castle Rock Research, Canada) were deleted from the test. The resulting 32‐item test contained items of varying difficulty to allow for effective discrimination between examinees (Appendix 1).
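
The item analysis described above was performed with commercial software (Integrity). Purely as an illustrative sketch, and assuming the pilot responses were available as a hypothetical 0/1 item-by-examinee matrix (not the study's actual data), the corrected point-biserial (item-rest) correlation used to flag poorly discriminating items could be computed along these lines:

    # Illustrative sketch only; the study used the Integrity package (Castle Rock Research).
    # Assumes a hypothetical response matrix: rows = examinees, columns = items, values 0/1.
    import numpy as np

    def corrected_point_biserial(responses: np.ndarray) -> np.ndarray:
        """Item-rest (corrected point-biserial) correlation for each item."""
        n_items = responses.shape[1]
        discrimination = np.empty(n_items)
        total = responses.sum(axis=1)
        for j in range(n_items):
            rest = total - responses[:, j]  # total score excluding item j
            discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
        return discrimination

    # Example: drop the 4 least discriminating items from a 36-item pilot test.
    rng = np.random.default_rng(0)
    pilot = (rng.random((30, 36)) > 0.5).astype(int)  # placeholder for the 30 pilot examinees
    discrimination = corrected_point_biserial(pilot)
    items_to_drop = np.argsort(discrimination)[:4]
    print("Lowest-discrimination items:", sorted(items_to_drop.tolist()))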

The test was then administered to residents beginning rotations in either the medical intensive care unit or the coronary care unit at 4 medical centers in Massachusetts (Brigham and Women's Hospital; Massachusetts General Hospital; Faulkner Hospital; and North Shore Medical Center). In addition to completing the online, self‐administered examination, participants provided baseline data including year of residency training, anticipated career path, and the number of prior procedures performed. Using a 5‐point Likert scale, participants rated their self‐perceived confidence in performing each procedure (with and without supervision) and in supervising each procedure. Residents were invited to complete a second test before the end of their rotation (2‐4 weeks after the initial test) to assess test‐retest reliability. Answers were made available only after the conclusion of the study.

Reliability of the 32‐item instrument was measured by Cronbach's α; a value of 0.6 is considered adequate and values of 0.7 or higher indicate good reliability. Pearson's correlation (Pearson's r) was used to compute test‐retest reliability. Univariate analyses were used to assess the association of the demographic variables with the test scores. Test scores were compared between groups using a t test/Wilcoxon rank sum (2 groups) and analysis of variance (ANOVA)/Kruskal‐Wallis (3 or more groups). The associations of the number of prior procedures attempted and self‐reported confidence with test scores were explored using Spearman's correlation. Inferences were made at the 0.05 level of significance, using 2‐tailed tests. Statistical analyses were performed using SPSS 15.0 (SPSS, Inc., Chicago, IL).
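
The analyses were run in SPSS; as a non-authoritative illustration of the same statistics, a minimal sketch using hypothetical arrays (the data, sample sizes, and variable names below are assumptions, not the study data) might look like this:

    # Illustrative sketch only; the study analyses were performed in SPSS 15.0.
    import numpy as np
    from scipy import stats

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_variances / total_variance)

    # Hypothetical arrays standing in for the study data.
    rng = np.random.default_rng(1)
    responses = (rng.random((188, 32)) > 0.5).astype(int)  # 188 residents x 32 items
    score1 = responses.sum(axis=1)
    score2 = score1 + rng.integers(-3, 4, size=188)        # stand-in for the repeat test
    n_procedures = rng.integers(0, 30, size=188)
    group = rng.integers(0, 2, size=188).astype(bool)      # stand-in for a two-group split

    print("Cronbach's alpha:", round(cronbach_alpha(responses), 2))
    print("Test-retest r:", round(stats.pearsonr(score1, score2)[0], 2))
    print("Spearman rho (score vs. procedures):", round(stats.spearmanr(score1, n_procedures)[0], 2))
    print("Wilcoxon rank-sum P:", stats.ranksums(score1[group], score1[~group]).pvalue)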

Results

Of the 192 internal medicine residents who consented to participate in the study between February and June 2006, 188 completed the initial and repeat test. Subject characteristics are detailed in Table 1.

Table 1. Subject Characteristics, Number (%)
Total residents: 192
Males: 113 (59)
Year of residency training
  First: 101 (52)
  Second: 64 (33)
  Third/fourth: 27 (14)
Anticipated career path
  General medicine/primary care: 26 (14)
  Critical care: 47 (24)
  Medical subspecialties: 54 (28)
  Undecided/other: 65 (34)

Reliability of the 32‐item instrument, measured by Cronbach's α, was 0.79, and its test‐retest reliability was 0.82. The mean item difficulty was 0.52 and the mean corrected point‐biserial correlation was 0.26. The test was of high difficulty, with a mean overall score of 50% (median 53%, interquartile range 44%‐59%). Baseline scores differed significantly by residency program (P = 0.03). Residents with anticipated careers in critical care had significantly higher scores than those with anticipated careers in primary care (median score 56% for critical care vs. 50% for primary care and other nonprocedural medical subspecialties, P = 0.01).

Residents in their final year reported performing a median of 13 arterial lines, 14 central venous lines, and 3 thoracenteses over the course of their residency training (Table 2). An increasing number of performed procedures (central lines, arterial lines, and thoracenteses) was associated with higher test scores (Spearman's correlation coefficient 0.35, P < 0.001). Residents in the highest and lowest deciles of procedures performed had median scores of 56% and 43%, respectively (P < 0.001). Increasing seniority in residency was associated with higher overall test scores (median score by program year 49%, 54%, 50%, and 64%, P = 0.02).

Table 2. Number of Procedures Performed by Year of Internal Medicine Residency Training
Values are median number of procedures (interquartile range).
First year: arterial line insertion 1 (0-3); central venous line insertion 1 (0-4); thoracentesis 0 (0-1)
Second year: arterial line insertion 8.5 (6-18); central venous line insertion 10 (5-18); thoracentesis 2 (0-4)
Third/fourth year: arterial line insertion 13 (8-20); central venous line insertion 14 (10-27); thoracentesis 3 (2-6)

Increase in self‐reported confidence was significantly associated with an increase in the number of performed procedures (Spearman's correlation coefficients for central line 0.83, arterial lines 0.76, and thoracentesis 0.78, all P < 0.001) and increasing seniority (0.66, 0.59, and 0.52, respectively, all P < 0.001).

Discussion

The determination of procedural competence has long been a challenge for trainers and internal medicine programs; methods for measuring procedural skills have not been rigorously studied. Procedural competence requires a combination of theoretical knowledge and practical skill. However, given the declining number of procedures performed by internists,4 the new ABIM guidelines mandate cognitive competence rather than demonstration of hands‐on procedural proficiency.

We therefore sought to develop and validate the results of an examination of the theoretical knowledge necessary to perform 3 procedures associated with potentially serious complications. Following establishment of content evidence, item performance characteristics and postadministration interviews were used to develop a 32‐item test. We confirmed the test's internal structure by assessment of reliability and assessed the association of test scores with other variables for which correlation would be expected.

We found that residents performed poorly on test content considered important by procedure specialists. These findings highlight the limitations of current procedure training, which is frequently sporadic and variable. The numbers of procedures reported over the duration of residency by residents at these centers were low. It is unclear whether the low number of procedures performed was due to limitations in resident content knowledge or whether it reflects the increasing use of interventional services, with fewer opportunities for experiential learning. Nevertheless, an increasing number of prior procedures was associated with higher self‐reported confidence for all procedures and translated into higher test scores.

This study was limited to 4 teaching hospitals, and further studies may be needed to investigate the wider generalizability of the study instrument. However, participants were from 3 distinct internal medicine residency programs that included both community and university hospitals. We relied on resident self‐reports and did not independently verify the number of prior procedures performed. However, prior studies have made the similar assumption that physicians who rarely perform procedures are able to provide reasonable estimates of the total number performed.3

The reliability of the 32‐item test (Cronbach's α = 0.79) is in the expected range for a test of this length and indicates good reliability.9, 10 Given the potential complications associated with advanced medical procedures, there is an increasing need to establish criteria for competence. Although we have not established a score threshold, the development of this validated tool to assess procedural knowledge is an important step toward that goal.
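
For readers unfamiliar with the benchmark applied here, the dependence of reliability on test length is conventionally described by the Spearman-Brown prophecy formula (a standard psychometric relationship, not a formula stated in this article): lengthening a test of reliability ρ by a factor k gives an expected reliability of

    \[ \rho_k = \frac{k\rho}{1 + (k - 1)\rho} \]

so that, for example, doubling a test whose reliability is 0.66 would be expected to raise its reliability to roughly 2(0.66)/(1 + 0.66) ≈ 0.80.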

This test may facilitate efforts by hospitalists and others to evaluate the efficacy of, and refine, existing methods of procedure training. Feedback to educators using this assessment tool may assist in the improvement of teaching strategies. In addition, the assessment of cognitive competence in procedure‐related knowledge using a rigorous and reliable instrument such as that outlined in this study may help identify residents who need further training. Recognition of the need for additional training and oversight is likely to be especially important if residents are expected to perform procedures safely yet have fewer opportunities for practice.

Acknowledgements

The authors thank Dr. Stephen Wright, Haley Hamlin, and Matt Johnston for their contributions to the data collection and analysis.

References
  1. Nelson RL, McCaffrey LA, Nobrega FT, et al. Altering residency curriculum in response to a changing practice environment: use of the Mayo internal medicine residency alumni survey. Mayo Clin Proc. 1990;65(6):809-817.
  2. Mandel JH, Rich EC, Luxenberg MG, Spilane MT, Kern DC, Parrino TA. Preparation for practice in internal medicine. A study of ten years of residency graduates. Arch Intern Med. 1988;148(4):853-856.
  3. Hicks CM, Gonzalez R, Morton MT, Gibbon RV, Wigton RS, Anderson RJ. Procedural experience and comfort level in internal medicine trainees. J Gen Intern Med. 2000;15(10):716-722.
  4. Wigton RS. Training internists in procedural skills. Ann Intern Med. 1992;116(12 Pt 2):1091-1093.
  5. ABIM. Policies and Procedures for Certification in Internal Medicine. 2008. Available at: http://www.abim.org/certification/policies/imss/im.aspx. Accessed August 2009.
  6. Wickstrom GC, Kolar MM, Keyserling TC, et al. Confidence of graduating internal medicine residents to perform ambulatory procedures. J Gen Intern Med. 2000;15(6):361-365.
  7. Kern DC, Parrino TA, Korst DR. The lasting value of clinical skills. JAMA. 1985;254(1):70-76.
  8. Durning SJ, Cation LJ, Jackson JL. Are commonly used resident measurements associated with procedural skills in internal medicine residency training? J Gen Intern Med. 2007;22(3):357-361.
  9. Nunnally JC. Psychometric Theory. New York: McGraw-Hill; 1978.
  10. Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika. 1951;16:297-334.
Journal of Hospital Medicine. 2009;4(7):430-432.
Keywords: medical procedures, residency training, test development

Copyright © 2009 Society of Hospital Medicine.
Correspondence: Division of Endocrinology, Diabetes and Hypertension, Brigham and Women's Hospital, 221 Longwood Avenue, Boston, MA 02114.

Quantification of bedside teaching by an academic hospitalist group

Bedside teaching, defined as teaching in the presence of a patient, has been an integral, respected part of medical education throughout the history of modern medicine. There is widespread concern among medical educators that bedside teaching, and physical examination teaching in particular, is declining.1‐5 Learning at the bedside accounted for 75% of clinical teaching in the 1960s but only 16% by 1978.2 Current estimates range from 8% to 19%.1

The bedside is the ideal venue for demonstrating, observing, and evaluating medical interviewing skills, physical examination techniques, and interpersonal and communication skills. Role modeling is the primary method by which clinical teachers demonstrate and teach professionalism and humanistic behavior.6 The bedside is also a place to develop clinical reasoning skills, stimulate problem‐based learning,7 and demonstrate teamwork.4 Thus, the decline in bedside teaching is of major concern not merely because a time‐honored tradition is fading, but because it threatens the development of skills and attitudes essential to the practice of medicine.

With the rapid growth in the number of hospitalists and their presence at most major U.S. teaching hospitals, internal medicine residents and medical students in their medicine clerkships receive much of their inpatient training from attending physicians who are hospitalists.8 Little is known about the teaching practices of hospitalist attending physicians. We investigated the fraction of time hospitalist attending physicians spend at the bedside during attending teaching rounds and the frequency of the demonstration of physical examination skills at 1 academic teaching hospital.

Patients and Methods

The Brigham & Women's Hospitalist Service, a 28‐member academic hospitalist group whose members serve as both the teaching attendings and the patient care attendings on 4 general medicine teams, was studied in a prospective, observational fashion. Internal medicine residents at Brigham & Women's Hospital rotating on the hospitalist service were identified from the schedule of inpatient rotations during the 2007‐2008 academic year and were asked to participate in the study via an e‐mail invitation. The Institutional Review Board of Brigham & Women's Hospital approved the study.

Teams were made up of 1 senior resident and 2 interns. Call frequency was every fourth day. Over a period of 23 sequential weekdays, medical residents and interns from each of the 4 hospitalist teams observed and reported the behavior of their attendings on rounds. Their reports captured the fraction of time spent at the bedside during rounds and the frequency of physical examination teaching. Residents and interns were asked to respond to 3 questions in a daily e‐mail. Respondents reported (1) total time spent with their hospitalist attending during attending rounds, (2) time spent inside patient rooms during attending rounds, and (3) whether or not a physical examination finding or skill was demonstrated by their hospitalist attending. When more than 1 team member responded, the reported times were averaged; when team members disagreed about whether a physical examination finding or skill had been demonstrated, the positive response was used. Hospitalist attendings remained unaware of the daily observations.
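
As a minimal sketch of this aggregation rule (the field names and values below are hypothetical, not the study's data pipeline), each day's reports for a team might be combined as follows:

    # Illustrative sketch of the per-team, per-day aggregation rule described in the text.
    from statistics import mean

    def combine_reports(reports):
        """reports: list of dicts with 'rounds_min', 'bedside_min', and 'exam_taught' (bool)."""
        return {
            "rounds_min": mean(r["rounds_min"] for r in reports),   # average across respondents
            "bedside_min": mean(r["bedside_min"] for r in reports),
            # When respondents disagree, default to the positive (exam was taught) response.
            "exam_taught": any(r["exam_taught"] for r in reports),
        }

    day = combine_reports([
        {"rounds_min": 110, "bedside_min": 20, "exam_taught": True},
        {"rounds_min": 100, "bedside_min": 16, "exam_taught": False},
    ])
    print(day)  # {'rounds_min': 105, 'bedside_min': 18, 'exam_taught': True}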

Hospitalist attendings were independently invited to complete a baseline needs assessment survey on bedside teaching. Surveys addressed attitudes toward bedside teaching, confidence in ability to lead bedside teaching rounds and teach the physical examination, and adequacy of their own training in these skills. Respondents were asked to comment on obstacles to bedside teaching. Residents were surveyed at the completion of a rotation with a hospitalist attending regarding the value of the time spent at the bedside and their self‐perceived improvement in physical examination skills and bedside teaching skills. The survey solicited the residents' opinion of the most valuable aspect of bedside teaching. The survey questions used a 4‐point Likert scale with response options ranging from 1 = strongly disagree to 4 = strongly agree.

The fraction of time spent at the bedside during attending hospitalist rounds was calculated from the average time spent in patient rooms and the average duration of attending rounds. The frequency of physical examination teaching was expressed as a percentage of all teaching encounters. Interrater reliability was calculated using the intraclass correlation coefficient with the Spearman‐Brown adjustment. Differences between groups were assessed using Fisher's exact test for counts and the Wilcoxon rank‐sum test for continuous data. Significance was accepted at P < 0.05.
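
The article does not state which statistical software was used. As a hedged illustration only (the data below are simulated, and the two-rater Pearson-correlation-plus-Spearman-Brown calculation is a simplification rather than a full intraclass correlation), the key calculations might be sketched as follows, including the 2 × 2 comparison reported in the Results (23 of 37 bedside rounds vs. 0 of 24 conference-room rounds with physical examination teaching):

    # Illustrative sketch only; not the study's actual analysis code or data.
    import numpy as np
    from scipy import stats

    def spearman_brown(r: float, k: int = 2) -> float:
        """Spearman-Brown adjusted reliability for k raters/observations."""
        return k * r / (1 + (k - 1) * r)

    rng = np.random.default_rng(2)
    rater1 = rng.normal(100, 20, size=40)           # hypothetical times reported by rater 1
    rater2 = rater1 + rng.normal(0, 10, size=40)    # hypothetical times reported by rater 2
    r = stats.pearsonr(rater1, rater2)[0]
    print("Spearman-Brown adjusted reliability:", round(spearman_brown(r), 2))

    # Physical exam teaching during bedside vs. conference-room rounds (counts from the Results).
    table = [[23, 37 - 23], [0, 24]]
    _, p = stats.fisher_exact(table)
    print("Fisher exact P:", p)

    # Continuous comparison, e.g., duration of rounds with vs. without bedside time.
    with_bedside = rng.normal(122, 30, size=37)     # hypothetical minutes
    without_bedside = rng.normal(69, 25, size=24)
    print("Wilcoxon rank-sum P:", stats.ranksums(with_bedside, without_bedside).pvalue)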

Results

Thirty‐five residents provided observations on 61 of 92 potentially observed attending rounds (66% response rate) over 23 weekdays, including observations of the rounding behavior of 12 different hospitalists. The interrater reliability was 0.91. The average patient census on each team during this time period was 12 (range 6‐19).

Residents reported that their attendings went to the bedside at least once during 37 of these 61 rounds (61%), and provided physical examination teaching during 23 of these 61 (38%) encounters. Hospitalists spent an average of 101 minutes on rounds and an average of 17 minutes (17%) of their time inside patient rooms.

Rounds that included time spent at the bedside were significantly longer on average than rounds that did not include time spent at the bedside (122 vs. 69 minutes, P < 0.001). During rounds that included bedside teaching, teams spent an average of 29 minutes (24% of the total time) in patient rooms, and rounds were significantly more likely to include teaching on physical diagnosis (23/37 rounds vs. 0/24 rounds, P < 0.001). Physical examination teaching did not significantly prolong those rounds that included bedside teaching (124 vs. 119 minutes, P = 0.56), but did significantly increase the amount of time spent at the bedside (32 vs. 22 minutes, P = 0.046).

Eighteen hospitalists (64% response) with a mean of 5.9 years of experience as attending physicians completed a needs‐assessment survey (Table 1). Fourteen of the 18 hospitalists (78%) reported that they prioritize bedside teaching and 16 (89%) requested more emphasis on bedside teaching in the residency curriculum. Twelve hospitalists (67%) indicated that they were confident in their ability to lead bedside teaching rounds; 9 (50%) were confident in their ability to teach physical examination. Eleven (61%) of the respondents felt poorly prepared to do bedside teaching after completing their residency, and 12 (67%) felt that they had received inadequate training in how to teach the physical examination. Of the obstacles to bedside teaching, time and inadequate training and skills were the most frequently noted, present in 11 and 6 of the reports, respectively. Lack of confidence and lack of role models were also cited in 4 and 2 of the reports, respectively.

Table 1. Hospitalist Survey (n = 18)
Responses are percentages: Strongly Disagree / Disagree / Agree / Strongly Agree.
I make bedside teaching a priority: 0 / 22 / 56 / 22
More emphasis on bedside teaching in the residency curriculum is needed: 0 / 11 / 39 / 50
I feel confident in my ability to lead bedside teaching rounds: 11 / 22 / 50 / 17
I was well-prepared to do bedside teaching after residency training: 22 / 39 / 28 / 11
I feel confident in my ability to teach the physical exam: 11 / 39 / 33 / 17
I have received adequate training in how to teach the physical exam: 17 / 50 / 22 / 11
Seventeen medical residents (49% response rate) completed a survey at the end of their general medical service rotation with a hospitalist (Table 2). Sixteen of the respondents (94%) agreed that time spent at the bedside during hospitalist attending teaching rounds on that rotation was valuable, and 15 (82%) of the residents sought more emphasis on bedside teaching in the residency curriculum. Four of the respondents (24%) reported that their physical examination skills improved over the rotation, 5 (29%) felt better prepared to teach the physical examination, and 9 (53%) felt better prepared to lead bedside teaching rounds. Only 3 (18%) of the respondents reported that they had received helpful feedback on their physical examination skills from their attending. Responding residents identified physical examination teaching, communication and interpersonal skills, a focus on patient‐centered care, and integration of the clinical examination with diagnostic and management decisions as the most valuable aspects of bedside teaching.

Table 2. Resident End of Hospitalist Rotation Survey (n = 17)
Responses are percentages: Strongly Disagree / Disagree / Agree / Strongly Agree.
Time spent at the bedside during teaching rounds was valuable: 0 / 6 / 65 / 29
More emphasis on bedside teaching in the residency curriculum is needed: 0 / 18 / 53 / 29
I feel better prepared to lead bedside teaching rounds: 6 / 41 / 53 / 0
My physical exam skills improved over the rotation: 6 / 71 / 24 / 0
I feel better prepared to teach the physical exam: 6 / 65 / 29 / 0
I received helpful feedback on my physical exam skills: 18 / 65 / 18 / 0

Discussion

Bedside teaching is highly valued by clinicians and trainees, though there is little evidence supporting its efficacy. Patients also enjoy and accept bedside presentations7, 9, 10 when certain rules are followed (eg, avoiding medical jargon), and they benefit from a better understanding of their illness.9 This study supports previous views of medical residents, students,1, 5, 7 and faculty11 on the value of, and need for greater emphasis on, bedside teaching in medical education.

This study of rounding behavior found that hospitalists in this academic center go to the bedside most days, but 39% of attending teaching rounds did not include a bedside encounter. Physical examination teaching is infrequent. Though time spent at the bedside was only a small fraction of total teaching time (17%) in this practice, this fraction is at the high end of previous reports. Teaching rounds that did not include bedside teaching most likely occurred in the confines of a conference room.

Many factors appear to contribute to the paucity of time spent at the bedside: time constraints, shorter hospital stays, greater work demands,11 residency duty‐hour regulations,12 declining bedside teaching skills, unrealistic expectations of the encounter, and erosion of the teaching ethic.3 A decline in clinical examination skills among trainees and attending physicians leads to a growing reliance on data and technology, thereby perpetuating the cycle of declining bedside skills.4

The hospitalists in this study identified time as the dominant obstacle to bedside teaching. On days when hospitalist attending physicians went to the bedside, rounds were on average 53 minutes longer than on days when they did not. This increase varied little whether or not physical examination teaching occurred. The difference in rounding time may be partially explained by the admitting cycle and patient census: teaching attendings are likely to go to the bedside to see new patients on postcall days, when the patient census is also highest.

Many members of this hospitalist group indicated that they felt inadequately prepared to lead bedside teaching rounds. Of those who responded to the survey, 67% did not feel that they received adequate training in how to teach the physical examination. Consequently, only one‐half of responding hospitalists expressed confidence in their ability to teach the physical examination. Not surprisingly, physical examination skills were a component of a minority of teaching sessions, and only one‐quarter of the medical residents perceived that their physical examination skills improved during the rotation with a hospitalist attending. The paucity of feedback to the housestaff likely contributed to this stagnation. Residents who become hospitalists ill‐prepared to lead bedside teaching and teach the physical examination will perpetuate the decline in bedside teaching.

Though a substantial portion of the hospitalists in this study lacked confidence, an overwhelming majority of medical residents found their time spent at the bedside with a hospitalist to be valuable. More than one‐half of residents reported that they were better prepared to lead bedside teaching after the rotation. Residents recognize that bedside teaching can include communication and clinical reasoning skills. Hospitalists should be made aware that a broad range of skills and content can be taught at the bedside.

Hospitalists have an increasing influence on the education of medical residents and students and are appropriate targets for faculty development programs aimed at improving bedside teaching. As a newer, growing specialty, hospitalists tend to be younger physicians and are therefore more reliant on the education attained during residency to support their bedside activities. Many residencies have developed resident-as-educator programs in an attempt to create a future generation of attendings better able to teach.13

Several limitations should be acknowledged when interpreting the results of this study. The study was limited to a hospitalist group at a single academic medical center and relied on resident recall. Though the response rate to the daily e‐mails was relatively low, the interrater reliability was high, and a broad range of residents and attendings were represented. Residents with greater patient censuses may have been too busy to respond, but it is unclear in which direction this would bias the results.

Conclusions

This study provides additional evidence that bedside and physical examination teaching are in decline. Time is an increasingly precious commodity for hospitalists; though many commentators echo the sentiments of the respondents in this study that more time at the bedside is needed, the amount of time that should be optimally spent at the bedside remains unclear. Research to improve the quality of bedside learning and its influence on patient care outcomes is needed.

References
  1. Williams KN, Ramani S, Fraser B, Orlander JD. Improving bedside teaching: findings from a focus group study of learners. Acad Med. 2008;83(3):257-264.
  2. LaCombe MA. On bedside teaching. Ann Intern Med. 1997;126(3):217-220.
  3. Ramani S, Orlander JD, Strunin L, Barber TW. Whither bedside teaching? A focus-group study of clinical teachers. Acad Med. 2003;78(4):384-390.
  4. Thibault GE. Bedside rounds revisited. N Engl J Med. 1997;336(16):1174-1175.
  5. McMahon GT, Marina O, Kritek PA, Katz JT. Effect of a physical examination teaching program on the behavior of medical residents. J Gen Intern Med. 2005;20(8):710-714.
  6. Weissmann PF, Branch WT, Gracey CF, Haidet P, Frankel RM. Role modeling humanistic behavior: learning bedside manner from the experts. Acad Med. 2006;81(7):661-667.
  7. Nair BR, Coughlan JL, Hensley MJ. Student and patient perspectives on bedside teaching. Med Educ. 1997;31(5):341-346.
  8. Wachter RM. Hospitalists in the United States—mission accomplished or work in progress? N Engl J Med. 2004;350(19):1935-1936.
  9. Lehmann LS, Brancati FL, Chen MC, Roter D, Dobs AS. The effect of bedside case presentations on patients' perceptions of their medical care. N Engl J Med. 1997;336(16):1150-1155.
  10. Landry MA, Lafrenaye S, Roy MC, Cyr C. A randomized, controlled trial of bedside versus conference-room case presentation in a pediatric intensive care unit. Pediatrics. 2007;120(2):275-280.
  11. Nair BR, Coughlan JL, Hensley MJ. Impediments to bedside teaching. Med Educ. 1998;32(2):159-162.
  12. Myers JS, Bellini LM, Morris JB, et al. Internal medicine and general surgery residents' attitudes about the ACGME duty hours regulations: a multicenter study. Acad Med. 2006;81(12):1052-1058.
  13. Weissman MA, Bensinger L, Koestler JL. Resident as teacher: educating the educators. Mt Sinai J Med. 2006;73(8):1165-1169.
Journal of Hospital Medicine. 2009;4(5):304-307.
Keywords: bedside teaching, graduate medical education, physical examination

Bedside teaching, defined as teaching in the presence of a patient, has been an integral, respected part of medical education throughout the history of modern medicine. There is widespread concern among medical educators that bedside teaching is declining, and in particular, physical examination teaching.1‐5 Learning at the bedside accounted for 75% of clinical teaching in the 1960s and only 16% by 1978.2 Current estimates range from 8% to 19%.1

The bedside is the ideal venue for demonstrating, observing, and evaluating medical interviewing skills, physical examination techniques, and interpersonal and communication skills. Role modeling is the primary method by which clinical teachers demonstrate and teach professionalism and humanistic behavior.6 The bedside is also a place to develop clinical reasoning skills, stimulate problem‐based learning,7 and demonstrate teamwork.4 Thus, the decline in bedside teaching is of major concern for more than just the dying of a time‐honored tradition, but for the threat to the development of skills and attitudes essential for the practice of medicine.

With the rapid growth in the number of hospitalists and their presence at most major U.S. teaching hospitals, internal medicine residents and medical students in their medicine clerkships receive much of their inpatient training from attending physicians who are hospitalists.8 Little is known about the teaching practices of hospitalist attending physicians. We investigated the fraction of time hospitalist attending physicians spend at the bedside during attending teaching rounds and the frequency of the demonstration of physical examination skills at 1 academic teaching hospital.

Patients and Methods

The Brigham & Women's Hospitalist Service, a 28‐member academic hospitalist group whose members serve as both the teaching attendings and patient care attendings on 4 general medicine teams, was studied in a prospective, observational fashion. Internal medicine residents at Brigham & Women's Hospital rotating on the hospitalist service were identified from the schedule of inpatient rotations during the 2007‐2008 academic year and were invited to participate in the study by e‐mail. The Institutional Review Board of Brigham & Women's Hospital approved the study.

Teams were made up of 1 senior resident and 2 interns. Call frequency was every fourth day. Over a period of 23 sequential weekdays, medical residents and interns from each of the 4 hospitalist teams observed and reported the behavior of their attendings on rounds. Their reports captured the fraction of time spent at the bedside during rounds and the frequency of physical examination teaching. Residents and interns were asked to respond to 3 questions in a daily e‐mail. Respondents reported (1) total time spent with their hospitalist attending during attending rounds, (2) time spent inside patient rooms during attending rounds, and (3) whether or not a physical examination finding or skill was demonstrated by their hospitalist attending. When more than 1 team member responded, reported times were averaged; if team members disagreed about whether a physical examination finding or skill had been demonstrated, the positive response was recorded. Hospitalist attendings remained unaware of the daily observations.
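
To illustrate this aggregation rule, the minimal sketch below (in Python, with hypothetical field names; the study does not describe its data handling in code) averages the reported times across a team's respondents and records physical examination teaching as present if any team member reported it.

# Illustrative sketch of the report-aggregation rule described above.
# The field names (total_minutes, bedside_minutes, exam_demonstrated) are
# hypothetical; this is not the authors' data-handling code.
from statistics import mean

def aggregate_team_reports(reports):
    """Combine same-day reports from the members of one team."""
    return {
        # Reported times are averaged across respondents.
        "total_minutes": mean(r["total_minutes"] for r in reports),
        "bedside_minutes": mean(r["bedside_minutes"] for r in reports),
        # Any discrepancy defaults to the positive response.
        "exam_demonstrated": any(r["exam_demonstrated"] for r in reports),
    }

# Example: two interns disagree about whether an exam skill was demonstrated.
day = aggregate_team_reports([
    {"total_minutes": 110, "bedside_minutes": 20, "exam_demonstrated": True},
    {"total_minutes": 100, "bedside_minutes": 16, "exam_demonstrated": False},
])
# day["exam_demonstrated"] is True; the averaged times are 105 and 18 minutes.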

Hospitalist attendings were independently invited to complete a baseline needs assessment survey on bedside teaching. Surveys addressed attitudes toward bedside teaching, confidence in ability to lead bedside teaching rounds and teach the physical examination, and adequacy of their own training in these skills. Respondents were asked to comment on obstacles to bedside teaching. Residents were surveyed at the completion of a rotation with a hospitalist attending regarding the value of the time spent at the bedside and their self‐perceived improvement in physical examination skills and bedside teaching skills. The survey solicited the residents' opinion of the most valuable aspect of bedside teaching. The survey questions used a 4‐point Likert scale with response options ranging from 1 = strongly disagree to 4 = strongly agree.

The fraction of time spent at the bedside during attending hospitalist rounds was calculated from the average time spent in patient rooms and the average duration of attending rounds. The frequency of physical examination teaching was expressed as a percentage of all teaching encounters. Interrater reliability was calculated using the intraclass correlation coefficient with the Spearman‐Brown adjustment. Differences between groups were assessed using Fisher's exact test for counts and the Wilcoxon rank‐sum test for continuous data. Significance was accepted at P < 0.05.
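
As an illustration only, the statistical steps named above can be carried out with standard routines; the sketch below (Python with SciPy, using invented numbers and a Pearson correlation as a stand-in for the single-rater intraclass correlation) is not the authors' analysis code.

# Illustrative sketch of the analyses described above; assumes SciPy is installed.
from scipy.stats import pearsonr, fisher_exact, ranksums

def spearman_brown(r, k=2):
    # Stepped-up reliability of k averaged raters from a single-rater reliability r.
    return k * r / (1 + (k - 1) * r)

# Interrater reliability from paired daily time reports (hypothetical values).
rater_a = [120, 90, 100, 75, 130]
rater_b = [115, 95, 105, 70, 125]
single_rater_r, _ = pearsonr(rater_a, rater_b)
reliability = spearman_brown(single_rater_r)

# Fisher's exact test for counts, eg, physical diagnosis teaching during rounds
# with versus without a bedside encounter (23/37 vs. 0/24 in this study).
_, p_counts = fisher_exact([[23, 37 - 23], [0, 24]])

# Wilcoxon rank-sum test for continuous data, eg, round durations in minutes.
with_bedside = [122, 130, 110, 140, 100]        # hypothetical durations
without_bedside = [69, 60, 75, 80, 65]
_, p_duration = ranksums(with_bedside, without_bedside)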

Results

Thirty‐five residents provided observations on 61 of 92 potentially observed attending rounds (66% response rate) over 23 weekdays, including observations of the rounding behavior of 12 different hospitalists. The interrater reliability was 0.91. The average patient census on each team during this time period was 12 (range 6‐19).

Residents reported that their attendings went to the bedside at least once during 37 of these 61 rounds (61%), and provided physical examination teaching during 23 of these 61 (38%) encounters. Hospitalists spent an average of 101 minutes on rounds and an average of 17 minutes (17%) of their time inside patient rooms.

Rounds that included time spent at the bedside were significantly longer on average than rounds that did not include time spent at the bedside (122 vs. 69 minutes, P < 0.001). During rounds that included bedside teaching, teams spent an average of 29 minutes (24% of the total time) in patient rooms, and rounds were significantly more likely to include teaching on physical diagnosis (23/37 rounds vs. 0/24 rounds, P < 0.001). Physical examination teaching did not significantly prolong those rounds that included bedside teaching (124 vs. 119 minutes, P = 0.56), but did significantly increase the amount of time spent at the bedside (32 vs. 22 minutes, P = 0.046).
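
For clarity, the fractions reported above follow directly from the averaged times; a brief worked check (illustrative arithmetic only, using the figures in this section):

# Worked check of the reported fractions, using values from the results above.
overall_fraction = 17 / 101      # about 0.17: 17% of rounding time at the bedside
bedside_fraction = 29 / 122      # about 0.24: 24% when rounds included a bedside visit
extra_minutes = 122 - 69         # rounds with bedside time ran about 53 minutes longer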

Eighteen hospitalists (64% response) with a mean of 5.9 years of experience as attending physicians completed a needs‐assessment survey (Table 1). Fourteen of the 18 hospitalists (78%) reported that they prioritize bedside teaching and 16 (89%) requested more emphasis on bedside teaching in the residency curriculum. Twelve hospitalists (67%) indicated that they were confident in their ability to lead bedside teaching rounds; 9 (50%) were confident in their ability to teach physical examination. Eleven (61%) of the respondents felt poorly prepared to do bedside teaching after completing their residency, and 12 (67%) felt that they had received inadequate training in how to teach the physical examination. Of the obstacles to bedside teaching, time and inadequate training and skills were the most frequently noted, present in 11 and 6 of the reports, respectively. Lack of confidence and lack of role models were also cited in 4 and 2 of the reports, respectively.

Table 1. Hospitalist Survey
NOTE: n = 18. Values are percentages of respondents.
Statement | Strongly Disagree (%) | Disagree (%) | Agree (%) | Strongly Agree (%)
I make bedside teaching a priority | 0 | 22 | 56 | 22
More emphasis on bedside teaching in the residency curriculum is needed | 0 | 11 | 39 | 50
I feel confident in my ability to lead bedside teaching rounds | 11 | 22 | 50 | 17
I was well-prepared to do bedside teaching after residency training | 22 | 39 | 28 | 11
I feel confident in my ability to teach the physical exam | 11 | 39 | 33 | 17
I have received adequate training in how to teach the physical exam | 17 | 50 | 22 | 11

Seventeen medical residents (49% response) completed a survey regarding their general medical service rotation with a hospitalist upon its completion (Table 2). Sixteen of the respondents (94%) agreed that time spent at the bedside during hospitalist attending teaching rounds on that rotation was valuable, and 15 (82%) sought more emphasis on bedside teaching in the residency curriculum. Four of the respondents (24%) reported that their physical examination skills improved over the rotation, 5 (29%) felt better prepared to teach the physical examination, and 9 (53%) felt better prepared to lead bedside teaching rounds. Only 3 (18%) of the respondents reported that they had received helpful feedback on their physical examination skills from their attending. Responding residents cited physical examination teaching, communication and interpersonal skills, a focus on patient‐centered care, and integration of the clinical examination with diagnostic and management decisions as the most valuable aspects of bedside teaching.

Table 2. Resident End of Hospitalist Rotation Survey
NOTE: n = 17. Values are percentages of respondents; rows may not sum to 100 because of rounding.
Statement | Strongly Disagree (%) | Disagree (%) | Agree (%) | Strongly Agree (%)
Time spent at the bedside during teaching rounds was valuable | 0 | 6 | 65 | 29
More emphasis on bedside teaching in the residency curriculum is needed | 0 | 18 | 53 | 29
I feel better prepared to lead bedside teaching rounds | 6 | 41 | 53 | 0
My physical exam skills improved over the rotation | 6 | 71 | 24 | 0
I feel better prepared to teach the physical exam | 6 | 65 | 29 | 0
I received helpful feedback on my physical exam skills | 18 | 65 | 18 | 0

Discussion

Bedside teaching is highly valued by clinicians and trainees, though there is little evidence supporting its efficacy. Patients also enjoy and are accepting of bedside presentations7, 9, 10 if certain rules are adhered to (eg, avoiding medical jargon), and they benefit from a better understanding of their illness.9 This study supports previously reported views of medical residents, students,1, 5, 7 and faculty11 on the value of, and need for, greater emphasis on bedside teaching in medical education.

This study of rounding behavior found that hospitalists in this academic center go to the bedside most days, but 39% of attending teaching rounds did not include a bedside encounter. Physical examination teaching is infrequent. Though time spent at the bedside was only a small fraction of total teaching time (17%) in this practice, this fraction is at the high end of previous reports. Teaching rounds that did not include bedside teaching most likely occurred in the confines of a conference room.

Many factors appear to contribute to the paucity of time spent at the bedside: time constraints, shorter hospital stays, greater work demands,11 residency duty‐hour regulations,12 declining bedside teaching skills, unrealistic expectations of the encounter, and erosion of the teaching ethic.3 A decline in clinical examination skills among trainees and attending physicians leads to a growing reliance on data and technology, thereby perpetuating the cycle of declining bedside skills.4

The hospitalists in this study identified time as the dominant obstacle to bedside teaching. On days when hospitalist attending physicians went to the bedside, rounds were on average 53 minutes longer than on days when they did not. This increase varied little whether or not physical examination teaching occurred. The difference in rounding time may be partially explained by the admitting cycle and patient census: teaching attendings are most likely to go to the bedside to see new patients on postcall days, when the patient census is also highest.

Many members of this hospitalist group indicated that they felt inadequately prepared to lead bedside teaching rounds. Of those who responded to the survey, 67% did not feel that they received adequate training in how to teach the physical examination. Consequently, only one‐half of responding hospitalists expressed confidence in their ability to teach the physical examination. Not surprisingly, physical examination skills were a component of a minority of teaching sessions and only one‐quarter of the medical residents perceived that their physical examination skills improved during the rotation with a hospitalist attending. The paucity of feedback to the house‐staff likely contributed to this stagnancy. Residents who become hospitalists ill‐prepared to lead bedside teaching and teach the physical examination will perpetuate the decline in bedside teaching.

Though a substantial portion of the hospitalists in this study lacked confidence, an overwhelming majority of medical residents found their time spent at the bedside with a hospitalist to be valuable. More than one‐half of residents reported that they were better prepared to lead bedside teaching after the rotation. Residents recognize that bedside teaching can include communication and clinical reasoning skills. Hospitalists should be made aware that a broad range of skills and content can be taught at the bedside.

Hospitalists have an increasing influence on the education of medical residents and students and are appropriate targets for faculty development programs aimed at improving bedside teaching. As a newer, growing specialty, hospitalists tend to be younger physicians, and are therefore more reliant on the education attained during residency to support their bedside activities. Many residencies have developed resident as educator programs in an attempt to create a future generation of attendings better able to teach.13

Several limitations should be acknowledged when interpreting the results of this study. The study was limited to a hospitalist group at a single academic medical center and relied on resident recall. Though the response rate to the daily e‐mails was relatively low, the interrater reliability was high, and a broad range of residents and attendings were represented. Residents with greater patient censuses may have been too busy to respond, but it is unclear in which direction this would bias the results.

Conclusions

This study provides additional evidence that bedside and physical examination teaching are in decline. Time is an increasingly precious commodity for hospitalists; although many commentators echo the sentiment of the respondents in this study that more time at the bedside is needed, the optimal amount of time to spend at the bedside remains unclear. Research to improve the quality of bedside learning and its influence on patient care outcomes is needed.

References
  1. Williams KN, Ramani S, Fraser B, Orlander JD. Improving bedside teaching: findings from a focus group study of learners. Acad Med. 2008;83(3):257-264.
  2. LaCombe MA. On bedside teaching. Ann Intern Med. 1997;126(3):217-220.
  3. Ramani S, Orlander JD, Strunin L, Barber TW. Whither bedside teaching? A focus-group study of clinical teachers. Acad Med. 2003;78(4):384-390.
  4. Thibault GE. Bedside rounds revisited. N Engl J Med. 1997;336(16):1174-1175.
  5. McMahon GT, Marina O, Kritek PA, Katz JT. Effect of a physical examination teaching program on the behavior of medical residents. J Gen Intern Med. 2005;20(8):710-714.
  6. Weissmann PF, Branch WT, Gracey CF, Haidet P, Frankel RM. Role modeling humanistic behavior: learning bedside manner from the experts. Acad Med. 2006;81(7):661-667.
  7. Nair BR, Coughlan JL, Hensley MJ. Student and patient perspectives on bedside teaching. Med Educ. 1997;31(5):341-346.
  8. Wachter RM. Hospitalists in the United States—mission accomplished or work in progress? N Engl J Med. 2004;350(19):1935-1936.
  9. Lehmann LS, Brancati FL, Chen MC, Roter D, Dobs AS. The effect of bedside case presentations on patients' perceptions of their medical care. N Engl J Med. 1997;336(16):1150-1155.
  10. Landry MA, Lafrenaye S, Roy MC, Cyr C. A randomized, controlled trial of bedside versus conference-room case presentation in a pediatric intensive care unit. Pediatrics. 2007;120(2):275-280.
  11. Nair BR, Coughlan JL, Hensley MJ. Impediments to bedside teaching. Med Educ. 1998;32(2):159-162.
  12. Myers JS, Bellini LM, Morris JB, et al. Internal medicine and general surgery residents' attitudes about the ACGME duty hours regulations: a multicenter study. Acad Med. 2006;81(12):1052-1058.
  13. Weissman MA, Bensinger L, Koestler JL. Resident as teacher: educating the educators. Mt Sinai J Med. 2006;73(8):1165-1169.
Article Source

Copyright © 2009 Society of Hospital Medicine

Correspondence Location
3400 Spruce Street, Penn Tower Suite 2009, Philadelphia, PA 19104