Credentialing of Hospitalists in Ultrasound-Guided Bedside Procedures: A Position Statement of the Society of Hospital Medicine

The American Board of Internal Medicine (ABIM) changed its certification policy for bedside procedures over a decade ago.1 Acquiring manual competence in abdominal paracentesis, arterial catheter placement, arthrocentesis, central venous catheter placement, lumbar puncture, and thoracentesis is no longer an expectation of residency training. ABIM diplomates should “know” these procedures but not necessarily “do” them. Hospitalists, most of whom are themselves ABIM diplomates, are still, however, expected to do them as core competencies,2 perhaps because hospitalists are often available off-hours, when roughly half of bedside procedures are performed.3

Hospitalists increasingly perform bedside procedures with ultrasound guidance.4 Yet training in ultrasound guidance varies considerably as well,5 because point-of-care ultrasound (POCUS) has only recently become widespread.6 And though some skills are transferable from landmark-guided to ultrasound-guided procedures, many are not.7-10 Furthermore, ultrasound guidance is often not explicitly delineated on the privileging forms used by hospitals,11 even where ultrasound guidance has become standard.12

Given the variability in training for both ultrasound- and landmark-guided procedures, and given the lack of a universal standard for certification, local hospitals often ask their respective hospitalist group leaders to certify hospitalists’ basic competence as part of credentialing (see the Table for definitions). How hospitalist group leaders should certify competence, however, is not clear. The importance of this gap has recently increased, as hospitalists continue to perform procedures despite not having clear answers to questions about basic competence.13-15

Therefore, the Society of Hospital Medicine (SHM) Education Committee convened a group of experts and conducted a systematic literature review in order to provide recommendations for credentialing hospitalist physicians in ultrasound-guided bedside procedures. These recommendations do not include training recommendations, aside from recommendations about remedial training for hospitalists who do not pass certification. Training is a means to competence but does not guarantee it. We believe that training recommendations ought to be considered separately.

METHODS

Working Group Formation

In January 2015, the SHM Board of Directors asked the SHM Education Committee to convene the POCUS Task Force. The purpose of the task force was to develop recommendations on ultrasound guidance for bedside procedures. The SHM Education Committee appointed 3 chairs of the task force: 1 senior member of the SHM Education Committee and 2 POCUS experts. The chairs assembled a task force of 31 members that included 5 working groups, a multispecialty peer review group, and a guideline methodologist (supplemental Appendix 1). Invitation was based on members’ past contributions to SHM POCUS-related activities, up-front commitment, and declared conflicts of interest. Working group members self-identified as “hospitalists,” whereas peer reviewers were nonhospitalists but nationally recognized POCUS physician-leaders specializing in emergency medicine, cardiology, critical care medicine, and anesthesiology. Task force membership was vetted by a chair of the SHM POCUS Task Force and the Director of Education before work began. This position statement was authored by the Credentialing Working Group together with the chairs of the other 4 working groups and a guideline methodologist.

 

 

Disclosures

Signed disclosure statements of all task force members were reviewed prior to inclusion on the task force (supplemental Appendix 2); no members received honoraria for participation. Industry representatives did not contribute to the development of the guidelines nor to any conference calls or meetings.

Literature Search Strategy

A literature search was conducted by a biomedical librarian. Records from 1979 to January of 2017 were searched in Medline, Embase, CINAHL, Cochrane, and Google Scholar (supplemental Appendix 3). Search limiters were English language and adults. Articles were manually screened to exclude nonhuman or endoscopic ultrasound applications. Final article selection was based on working group consensus.

Draft Pathways

The Credentialing Working Group drafted initial and ongoing certification pathways (Figure 1 and Figure 2). The other 4 working groups from the task force were surveyed about the elements and overall appropriateness of these draft pathways. This survey and its results have already been published.12 The Credentialing Working Group then revised the certification pathways by using these survey results and codified individual aspects of these pathways into recommendations.

Development of Position Statement

Based on the Grading of Recommendations Assessment, Development and Evaluation (GRADE) methodology, all final article selections were initially rated as either low-quality (observational studies) or unclassifiable (expert opinion).16 These initial ratings were downgraded further because of indirectness: none of the articles evaluated the intervention of interest (a credentialing pathway) in the population of interest (hospitalists) while measuring the outcomes of interest (patient-level outcomes).17 Given these universally low-quality evidence ratings, we departed from the task force’s strategy of developing guidelines (which the other 4 working groups are writing) and instead developed a position statement through consensus gathering in 3 steps.

First, the Credentialing Working Group drafted an initial position statement composed of recommendations for credentialing pathways and other general aspects of credentialing. All final article selections were incorporated as references in a draft of the position statement and compiled in a full-text compendium. Second, feedback was provided by the other 4 task force working groups, the task force peer reviewers, and the SHM Education Committee. Feedback was incorporated by the authors of this statement who were the Credentialing Working Group, the chairs of the other 4 working groups, and a guideline methodologist. Third, final suggestions from all members of the SHM POCUS Task Force and SHM Education Committee were incorporated before final approval by the SHM Board of Directors in September 2017.

RESULTS

A total of 1438 references were identified in the original search. Manual selection led to 101 articles, which were incorporated into the following 4 domains with 16 recommendations.

General Credentialing Process

Basic Cognitive Competence Can Be Certified with Written or Oral Examinations

The ABIM defines cognitive competence as having 3 abilities: “(1) to explain indications, contraindications, patient preparation methods, sterile techniques, pain management, proper techniques for handling specimens and fluids obtained, and test results; (2) to recognize and manage complications; and, (3) to clearly explain to a patient all facets of the procedure necessary to obtain informed consent.”1 These abilities can be assessed with written or oral examinations that may be integrated into simulation- or patient-based assessments.18-21

Minimum Thresholds of Experience to Trigger the Timing of a Patient-Based Assessment Should Be Determined by Empirical Methods

Learning curves are highly variable22-25 and even plateaus may not herald basic competence.26 Expert opinions27 can be used to establish minimum thresholds of experience, but such opinions may paradoxically exceed the current thresholds of experts’ own hospitals.12 Thus, empirical methods, such as those based on cumulative sum analysis28-30 or local learning curves,31,32 are preferred. If such methods are not available, a recent survey of hospitalist experts may provide guidance.12 Regardless, once established, minimum thresholds are necessary but not sufficient to determine competency (see “Basic manual competence must be certified through patient-based assessments” section).
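To illustrate how such an empirical method might work in practice, the sketch below implements one common CUSUM formulation from the procedural learning-curve literature. It is a minimal illustration only: the acceptable and unacceptable failure rates (p0, p1) and the error rates (alpha, beta) are hypothetical placeholders, not thresholds endorsed by this statement, and local programs would need to choose parameters and decision boundaries appropriate to each procedure.

```python
# Illustrative sketch of one common CUSUM formulation for procedural learning
# curves. All parameter values are hypothetical placeholders.
import math

def cusum_scores(outcomes, p0=0.05, p1=0.15, alpha=0.05, beta=0.20):
    """Return running CUSUM scores plus lower/upper decision boundaries.

    outcomes: iterable of booleans, True = failed attempt, False = success
    p0: acceptable failure rate; p1: unacceptable failure rate
    alpha, beta: allowed type I and type II error rates
    """
    P = math.log(p1 / p0)
    Q = math.log((1 - p0) / (1 - p1))
    s = Q / (P + Q)                                # decrement after a success
    h0 = math.log((1 - alpha) / beta) / (P + Q)    # lower (acceptable) boundary
    h1 = math.log((1 - beta) / alpha) / (P + Q)    # upper (unacceptable) boundary

    score, scores = 0.0, []
    for failed in outcomes:
        score += (1 - s) if failed else -s
        scores.append(score)
    # Crossing -h0 suggests performance consistent with the acceptable failure
    # rate; crossing +h1 suggests performance is not yet acceptable.
    return scores, -h0, h1

# Example: 30 logged attempts with 2 failures
outcomes = [False] * 10 + [True] + [False] * 10 + [True] + [False] * 8
scores, lower, upper = cusum_scores(outcomes)
print(f"final score {scores[-1]:.2f}, boundaries {lower:.2f} / {upper:.2f}")
```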

Hospitalists Should Formally Log All of Their Attempted Procedures, Ideally in an Electronic Medical Record

Simple self-reported numbers of procedures performed often misrepresent actual experience33,34 and do not include periprocedural complications.35,36 Thus, hospitalists should report their experience with logs of all attempted procedures, both successful and unsuccessful. Such logs must include information about supervising providers (if applicable) and patient outcomes, including periprocedural adverse events,37 but they must also remain compliant with the Health Insurance Portability and Accountability Act.
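As a purely hypothetical illustration of the data elements such a log might capture, a minimal entry could look like the sketch below; the field names are illustrative, not a required schema, and any implementation would need to map them to local electronic medical record fields within a HIPAA-compliant system of record.

```python
# Hypothetical sketch of a single procedure-log entry. Field names are
# illustrative only; real entries would live in a HIPAA-compliant system of
# record and be mapped to local electronic medical record fields.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class ProcedureLogEntry:
    operator_id: str                       # hospitalist performing the procedure
    procedure: str                         # e.g., "thoracentesis"
    ultrasound_guided: bool
    performed_at: datetime
    successful: bool                       # unsuccessful attempts are logged too
    supervising_provider: Optional[str] = None               # if applicable
    safety_events: List[str] = field(default_factory=list)   # near misses and adverse events
    notes: str = ""

entry = ProcedureLogEntry(
    operator_id="hosp-0421",
    procedure="paracentesis",
    ultrasound_guided=True,
    performed_at=datetime(2017, 9, 30, 22, 15),
    successful=True,
)
print(entry)
```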

Health Information Technology Service Should Routinely Pull Collations of All Attempted Procedures from Comprehensive Electronic Medical Records

Active surveillance may reduce complications by identifying hospitalists who may benefit from further training.38 In order to facilitate active surveillance systems, documentation (such as a procedure note) should be both integrated into an electronic medical record and protocol driven,39 including procedure technique, ultrasound findings, and any safety events (both near misses and adverse events).
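A minimal sketch of such a collation is shown below, assuming procedure-log records have already been pulled from the electronic medical record as simple key-value entries; the flagging threshold is a hypothetical placeholder rather than a recommended cutoff.

```python
# Minimal sketch of an active-surveillance collation, assuming procedure logs
# have already been pulled from the electronic medical record as simple
# dictionaries. The flag_rate threshold is a hypothetical placeholder.
from collections import defaultdict

def collate_attempts(log_entries, flag_rate=0.10):
    """Summarize attempts and safety events per operator and flag high rates."""
    totals = defaultdict(lambda: {"attempts": 0, "safety_events": 0})
    for entry in log_entries:
        stats = totals[entry["operator_id"]]
        stats["attempts"] += 1
        stats["safety_events"] += len(entry.get("safety_events", []))
    flagged = {
        operator: stats for operator, stats in totals.items()
        if stats["safety_events"] / stats["attempts"] > flag_rate
    }
    return dict(totals), flagged

logs = [
    {"operator_id": "hosp-0421", "procedure": "thoracentesis", "safety_events": []},
    {"operator_id": "hosp-0421", "procedure": "thoracentesis", "safety_events": ["pneumothorax"]},
    {"operator_id": "hosp-0517", "procedure": "paracentesis", "safety_events": []},
]
totals, flagged = collate_attempts(logs)
print(flagged)   # operators who may warrant review or further training
```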

 

 

Basic Manual Competence Must Be Certified Through Patient-Based Assessments

Multiple interacting factors, including environment, patients, baseline skills, training, experience, and skills decay, affect manual competence. Certifications that are based solely on reaching minimum thresholds of experience, even when accurate, are not valid reflections of manual competence,15,40-43 and neither are those based on self-perception.44 Patient-based assessments are, thus, necessary to ensure manual competence.45-48

Certification Assessments of Manual Competence Should Combine 2 Types of Structured Instruments: Checklists and Overall Scores

Assessments based on direct observation are more reliable when formally structured.49,50 Though checklists used in objective structured clinical examinations capture many important manual skills,51-56 they do not completely reflect a hospitalist’s manual competence;57 a hospitalist may meet every individual item on a checklist yet still be unable to perform the entire procedure with basic competence. Therefore, checklists should be paired with overall scores.58-61 Both checklists and overall scores ought to be obtained from reliable and valid instruments.
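The sketch below illustrates, with hypothetical checklist items and a hypothetical passing rule, how a checklist and an overall (global) rating might be combined so that neither instrument alone determines the result; it is not a validated assessment instrument.

```python
# Hypothetical sketch of combining a checklist with an overall (global) rating
# so that neither instrument alone decides the outcome. Items, scale, and the
# passing rule are illustrations, not a validated instrument.
def assessment_result(checklist, global_rating, min_global=4, scale_max=5):
    """checklist: dict mapping item -> bool (performed correctly);
    global_rating: assessor's overall score on a 1..scale_max scale."""
    items_missed = [item for item, done in checklist.items() if not done]
    passed = not items_missed and global_rating >= min_global
    return {
        "items_missed": items_missed,
        "global_rating": f"{global_rating}/{scale_max}",
        "passed": passed,   # checklist and overall score must both support competence
    }

checklist = {
    "confirms indication and obtains consent": True,
    "identifies site with ultrasound": True,
    "maintains sterile technique": True,
    "uses real-time needle guidance": True,
    "disposes of sharps safely": True,
}
print(assessment_result(checklist, global_rating=3))  # all items met, but overall score too low
```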

Certification Assessments Should Include Feedback

Assessments without feedback are missed learning opportunities.62 Both simulation-63 and patient-based assessments should provide feedback in real time to reinforce effective behaviors and remedy faulty ones.

If Remedial Training is Needed, Simulator-Based Training Can Supplement but Not Replace Patient-Based Training

Supervised simulator-based training allows hospitalists to master basic components of a procedure64 (including orientation to equipment, sequence of operations, dexterity, ultrasound anatomy, and real-time guidance technique) while improving both cognitive and manual skills.42,43,65-71 In addition to their role in basic training (which is outside the scope of this position statement), simulators can be useful for remedial training. To be sufficient for hospitalists who do not pass their patient-based assessments, however, remedial training that begins with simulation must also include patient-based training and assessment.72-75

Initial Credentialing Process

A Minimum Threshold of Experience Should Be Reached before Patient-Based Assessments are Conducted (Figure 1)

Recent experience, such as the number of successful procedures performed on a representative sample of patients61,76,77 in the last 2 years, should meet a minimum threshold (see “Minimum thresholds of experience to trigger the timing of a patient-based assessment should be determined by empirical methods” section) before a patient-based assessment for intramural certification occurs.31,78 Such procedures should be supervised unless performed with privileges, for example, at another hospital. After reaching both a minimum threshold of experience and passing an observed patient-based assessment, which includes assessments of both cognitive and manual skills, hospitalists can be considered intramurally certified for initial credentialing. The hospitalist may begin to independently perform ultrasound-guided procedures if all credentialing requirements are met and privileges are granted.
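The logic of this initial pathway, as described in the text (Figure 1 itself is not reproduced here), can be summarized schematically as follows; the threshold value is a hypothetical placeholder, and the actual minimum threshold should be set by the empirical methods discussed above.

```python
# Schematic sketch of the initial-certification logic described in the text
# (Figure 1 itself is not reproduced here). The threshold value is a
# hypothetical placeholder; actual thresholds should come from the empirical
# methods discussed earlier, and privileging decisions remain with the hospital.
def initial_certification_step(recent_successes, threshold, passed_observed_assessment):
    """recent_successes: supervised (or already-privileged) successful procedures
    in the last 2 years; passed_observed_assessment: result of an observed
    patient-based assessment covering both cognitive and manual skills."""
    if recent_successes < threshold:
        return "accrue more supervised experience before assessment"
    if not passed_observed_assessment:
        return "remedial training (simulation plus patient-based), then reassess"
    return "intramurally certified; eligible for initial credentialing and privileging"

print(initial_certification_step(recent_successes=12, threshold=10,
                                 passed_observed_assessment=True))
```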

Initial Certification Assessments Should Ideally Begin on Simulators

Simulators allow the assurance of safe manual skills, including proper needle insertion techniques and disposal of sharp objects.3,79 If simulators are not available, however, then patient-based training and assessments can still be performed under direct observation. Safe performance of ultrasound-guided procedures during patient-based assessments (without preceding simulator-based assessments) is sufficient to certify manual competence.

Ongoing Credentialing

Certification to Perform Ultrasound-Guided Procedures Should Be Routinely Re-Evaluated During Ongoing Credentialing (Figure 2)

Ongoing certifications are needed because skills decay.80,81 They should be routine, perhaps coinciding with the usual reprivileging cycle (often biennial). When feasible,82 maintenance of manual competence is best ensured by directly observed patient-based assessments; when not feasible, performance reviews are acceptable.

Observed Patient-Based Assessments Should Occur When a Periprocedural Safety Event Occurs that is Potentially Caused by “Provider Error”

Safety events include both near misses and adverse events. Information about both is ideally “flagged” and “pushed” to hospitalist group leaders by active surveillance and reporting systems. Once reviewed, if a safety event is considered to potentially have been caused by provider error (including knowledge- and skill-based errors),83 then the provider who performed the procedure should undergo an observed patient-based assessment.

Simulation-Based Practice Can Supplement Patient-Based Experience for Ongoing Credentialing

When hospitalists do not achieve a minimum threshold of patient-based experience since the antecedent certification, simulation-based training can supplement their patient-based experience.84 In these cases, however, an observed patient-based assessment must occur. Another consideration is whether or not the privilege should be relinquished because of an infrequent need.

Credentialing Infrastructure

Hospitalists Themselves Should Not Bear the Financial Costs of Developing and Maintaining Training and Certification Programs for Ultrasound-Guided Procedures

Equipment and personnel costs85,86 commonly impede ultrasound-guided procedure programs.4,87,88 Hospitalists whose job descriptions include the performance of ultrasound-guided procedures should not be expected to bear the costs of ultrasound machines, image archival software, equipment maintenance, and initial and ongoing training and certification.

Assessors Should Be Unbiased Expert Providers Who Have Demonstrated Mastery in Performance of the Procedure Being Assessed and Regularly Perform It in a Similar Practice Environment

 

 

Assessors should be expert providers who regularly perform the ultrasound-guided procedure in a similar practice environment.9,89-94 For example, providers who are not hospitalists but who are experts in an ultrasound-guided procedure and commonly perform it on the hospital wards would be acceptable assessors. However, a radiologist who only performs that procedure in a fully staffed interventional radiology suite with fluoroscopy or computed tomography guidance would not be an acceptable assessor. More than 1 assessor may balance idiosyncratic assessments,95 but when assessments are well structured, additional assessors are generally not needed.18 Candidate assessors should be vetted by the hospitalist group leader and the hospital privileging committee.

If Intramural Assessors Are Not Available, Extramural Assessors May Be Considered

Intramural assessors are generally preferred because of familiarity with the local practice environment, including the available procedure kits and typical patient characteristics. Nevertheless, extramural assessors27,77,85,96 may theoretically provide even more valid assessments than intramural ones because extramural assessors are neither influenced by relationships with local hospitalists nor biased by local hospitalists’ skills.97,98 Remote performance assessment through video recordings99 or live-video streaming is another option100 but is not sufficient unless a room camera is available to simultaneously view probe movement and the ultrasound screen.101 In addition, remote assessment does not allow the assessor to physically assume control of the procedure to either salvage it or perhaps, in some cases, prevent a complication.

DISCUSSION

There are no high-quality randomized trials in support of a single credentialing pathway over any other.94,102 The credentialing pathways at the center of this position statement are based on expert opinion. Our methods can be criticized straightaway, therefore, for reliance on the experience and expertise of our working group and task force. Any position statement written without high-quality supportive evidence would be appropriately subject to the same criticism. Without evidence in support of an overall pathway, we codified specific aspects of the pathways into 16 individual recommendations.

Patient-level outcomes do not back these recommendations. Consider, for example, our recommendation that certification assessments be made from structured instruments and not simply from an assessor’s gestalt. Here, the basis is not improved patient-level outcomes from a trial (such as reduced complications or increased procedural success) but improved psychometric performance from reliability studies. The body of evidence for our recommendations is similarly indirect, mostly because the outcomes studied are more proximate and, thus, less meaningful than patient-level outcomes, which are the outcomes of greatest interest but are woefully understudied for clinical competence.17,97,103

The need for high-quality evidence is most pronounced in distinguishing how recommendations should be modified for various settings. Wide variations in resources and patient mix will make some recommendations impracticable, meaning that they cannot be carried out with the resources available. For example, our recommendation that credentialing decisions should ultimately rely on certifications made by assessors during patient-based assessments may not be practicable at small, rural hospitals. Such hospitals may not have access to local assessors, and they may not admit enough patients who need the types of ultrasound-guided procedures for which hospitalists seek certification (especially given the need to coordinate the schedules of patients, procedure-performing hospitalists, and assessors). Collaborative efforts between hospitals for regional certification may be a potential solution to consider. But if recommendations are truly impracticable, the task force recognizes they may need to be modified. Given the low quality of evidence supporting our recommendations, such modifications would be readily defensible, especially if they emerged from collaborative discussions between privileging committees, hospitalist directors, and local experts.

One way for hospitals to implement our recommendations may be to follow a recommendation proposed by the authors of the original hospitalist core competencies over a decade ago: “The presence of a procedural skill in the Core Competencies does not necessarily indicate that every hospitalist will perform or be proficient in that procedure.”104 In other words, bedside procedures may be delegated to some but not all hospitalists. Such “proceduralists” would have some proportion of their clinical responsibility dedicated to performing procedures. Delineation of this job description must be made locally because it balances 2 hospital-specific factors: patients’ needs for procedures and the availability of providers with basic competence to perform them, including not only hospitalists but also emergency medicine physicians, specialists, and interventional radiologists. An added benefit for hospitals is that hospitalists who are not proceduralists would not need to undergo certification in basic competence for the bedside procedures they will not be performing.

Regardless of whether some or all hospitalists at a particular hospital are expected to perform bedside procedures, technology may help to improve the practicability of our recommendations. For example, simulators may evolve to replace actual patient-level experience in achieving minimum thresholds. Certification assessments of manual skills may even someday occur entirely on simulators. Real-time high-definition video streaming enhanced with multiple cameras may allow for remote assessments. Until such advances mature, high-quality patient-level data should be sought through additional research to refine our current recommendations.

We hope that these recommendations will improve how basic competence in ultrasound-guided bedside procedures is assessed. Our ultimate goal is to improve how hospitalists perform these procedures. Patient safety is, therefore, considered paramount and takes precedence over cost. Nevertheless, the hospital administrative leaders and privileging committee members on our Task Force concluded that many hospitals have been seeking guidance on credentialing for bedside procedures, and the likely difficulties of implementing our recommendations (including cost) would not be prohibitive at most hospitals, especially given recognition that these recommendations can be tailored to each setting.

 

 

Acknowledgments

Collaborators from SHM POCUS Task Force are Saaid Abdel-Ghani, Michael Blaivas, Dan Brotman, Carolina Candotti, Jagriti Chadha, Joel Cho, Ria Dancel, Ricardo Franco, Richard Hoppmann, Susan Hunt, Venkat Kalidindi, Ketino Kobaidze, Josh Lenchus, Benji Mathews, Satyen Nichani, Vicki Noble, Martin Perez, Nitin Puri, Aliaksei Pustavoitau, Sophia Rodgers, Gerard Salame, Daniel Schnobrich, Kirk Spencer, Vivek Tayal, Jeff Bates, Anjali Bhagra, Kreegan Reierson, Robert Arntfield, Paul Mayo, Loretta Grikis.

Disclosure

Brian P. Lucas received funding from the Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development and Dartmouth SYNERGY, National Institutes of Health, and National Center for Translational Science (UL1TR001086). Nilam Soni received funding from the Department of Veterans Affairs, Quality Enhancement Research Initiative (QUERI) Partnered Evaluation Initiative (HX002263-01A1). The contents of this publication do not represent the views of the United States Department of Veterans Affairs or the United States Government.

References

1. American Board of Internal Medicine. Policies and procedures for certification. Philadelphia: American Board of Internal Medicine; 2006.
2. Nichani S, Fitterman N, Lukela M, Crocker J; Society of Hospital Medicine. The Core Competencies in Hospital Medicine 2017 Revision. Section 2: Procedures. J Hosp Med. 2017;12(4 Suppl 1):S44-S54 PubMed
3. Lucas BP, Asbury JK, Franco-Sadud R. Training future hospitalists with simulators: a needed step toward accessible, expertly performed bedside procedures. J Hosp Med. 2009;4(7):395-396. PubMed
4. Schnobrich DJ, Gladding S, Olson APJ, Duran-Nelson A. Point-of-care ultrasound in internal medicine: a national survey of educational leadership. J Grad Med Educ. 2013;5(3):498-502. PubMed
5. Brown GM, Otremba M, Devine LA, Gray C, Millington SJ, Ma IW. Defining competencies for ultrasound-guided bedside procedures: consensus opinions from Canadian physicians. J Ultrasound Med. 2016;35(1):129-141. PubMed
6. Vaisman A, Cram P. Procedural competence among faculty in academic health centers: challenges and future directions. Acad Med. 2017;92(1):31-34. PubMed
7. Kreisman RD. With ED ultrasound, credentialing is at issue. ED Legal Letter. 2010;21:102-103. 
8. Goudie AM. Credentialing a new skill: what should the standard be for emergency department ultrasound in Australasia? Emerg Med Australas. 2010;22:263-264. PubMed
9. Maizel J, Guyomarc HL, Henon P, et al. Residents learning ultrasound-guided catheterization are not sufficiently skilled to use landmarks. Crit Care. 2014;18(1):R36. doi:10.1186/cc13741. PubMed
10. American College of Emergency Physicians. Ultrasound guidelines: emergency, point-of-care, and clinical ultrasound guidelines in medicine. Ann Emerg Med. 2017;69(5):e27-e54. PubMed
11. Amini R, Adhikari S, Fiorello A. Ultrasound competency assessment in emergency medicine residency programs. Acad Emerg Med. 2014;21(7):799-801. PubMed
12. Jensen T, Soni NJ, Tierney DM, Lucas BP. Hospital privileging practices for bedside procedures: a survey of hospitalist experts. J Hosp Med. 2017;12(10):836-839. PubMed
13. Chang W. Is hospitalist proficiency in bedside procedures in decline? The Hospitalist. 2012. http://www.the-hospitalist.org/hospitalist/article/125236/patient-safety/hospitalist-proficiency-bedside-procedures-decline. Accessed September 30, 2017.
14. Barsuk JH, Feinglass J, Kozmic SE, Hohmann SF, Ganger D, Wayne DB. Specialties Performing Paracentesis Procedures at University Hospitals: Implications for Training and Certification. J Hosp Med. 2014;9(3):162-168. PubMed
15. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Residents’ Procedural Experience Does Not Ensure Competence: A Research Synthesis. J Grad Med Educ. 2017;9(2):201-208. PubMed
16. Balshem H, Helfand M, Schunemann HJ, et al. GRADE guidelines: 3. Rating the quality of evidence. J Clin Epidemiol. 2011;64(4):401-406. PubMed
17. Guyatt GH, Oxman AD, Kunz R, et al. GRADE guidelines: 8. Rating the quality of evidence—indirectness. J Clin Epidemiol. 2011;64(12):1303-1310. PubMed
18. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63-S67. PubMed
19. Grover S, Currier PF, Elinoff JM, Mouchantaf KJ, Katz JT, McMahon GT. Development of a test to evaluate residents knowledge of medical procedures. J Hosp Med. 2009;4(7):430-432. PubMed
20. Millington SJ, Wong RY, Kassen BO, Roberts JM, Ma IWY. Improving internal medicine residents’ performance, knowledge, and confidence in central venous catheterization using simulators. J Hosp Med. 2009;4(7):410-416. PubMed
21. Lenchus JD, Carvalho CM, Ferreri K, et al. Filling the void: defining invasive bedside procedural competency for internal medicine residents. J Grad Med Educ. 2013;5(4):605-612. PubMed
22. Heegeman DJ, Kieke B Jr. Learning curves, credentialing, and the need for ultrasound fellowships. Acad Emerg Med. 2003;10:404-405. PubMed
23. Jang TB, Ruggeri W, Dyne P, Kaji AH. The learning curve of resident physicians using emergency ultrasonography for cholelithiasis and cholecystitis. Acad Emerg Med. 2010;17(11):1247-1252. PubMed
24. Akhtar MI, Hamid M. Ultrasound guided central venous access; a review of literature. Anaesth Pain Intensive Care. 2015;19:317-322. 
25. Bahl A, Yunker A. Assessment of the numbers–based model for evaluation of resident competency in emergency ultrasound core applications. J Emerg Med Trauma Acute Care. 2015;2015(5). doi:10.5339/jemtac.2015.5 
26. Cazes N, Desmots F, Geffroy Y, Renard A, Leyral J, Chaumoitre K. Emergency ultrasound: a prospective study on sufficient adequate training for military doctors. Diagn Interv Imaging. 2013;94(11):1109-1115. PubMed
27. Arntfield RT, Millington SJ, Ainsworth CD, et al. Canadian recommendations for critical care ultrasound training and competency for the Canadian critical care society. Can Respir J. 2014;21(16):341-345. 
28. Bolsin S, Colson M. The use of the Cusum technique in the assessment of trainee competence in new procedures. Int J Qual Health Care. 2000;12(5):433-438. PubMed
29. de Oliveira Filho GR, Helayel PE, da Conceição DB, Garzel IS, Pavei P, Ceccon MS. Learning curves and mathematical models for interventional ultrasound basic skills. Anesth Analg. 2008;106(2):568-573. PubMed
30. Starkie T, Drake EJ. Assessment of procedural skills training and performance in anesthesia using cumulative sum analysis (cusum). Can J Anaesth. 2013;60(12):1228-1239. PubMed
31. Tierney D. Competency cut-point identification derived from a mastery learning cohort approach: A hybrid model. Ultrasound Med Biol. 2015;41:S19. 
32. Rankin JH, Elkhunovich MA, Rangarajan V, Chilstrom M, Mailhot T. Learning Curves for Ultrasound Assessment of Lumbar Puncture Insertion Sites: When is Competency Established? J Emerg Med. 2016;51(1):55-62. PubMed
33. Klasko SK, Cummings RV, Glazerman LR. Resident data collection: Do the numbers add up? Am J Obstet Gynecol. 1995;172(4 Pt 1):1312-1316. PubMed
34. Tierney D. Development & analysis of a mobile POCUS tracking tool. Ultrasound Med Biol. 2015;41(suppl 4):S31. 
35. Sethi MV, Zimmer J, Ure B, Lacher M. Prospective assessment of complications on a daily basis is essential to determine morbidity and mortality in routine pediatric surgery. J Pediatr Surg. 2016;51(4):630-633. PubMed
36. Fisher JC, Kuenzler KA, Tomita SS, Sinha P, Shah P, Ginsburg HB. Increased capture of pediatric surgical complications utilizing a novel case-log web application to enhance quality improvement. J Pediatr Surg. 2017;52(1):166-171. PubMed
37. Rethans JJ, Norcini JJ, Barón-Maldonado M, et al. The relationship between competence and performance: implications for assessing practice performance. Med Educ. 2002;36(10):901-909. PubMed
38. Duncan DR, Morgenthaler TI, Ryu JH, Daniels CE. Reducing iatrogenic risk in thoracentesis: establishing best practice via experiential training in a zero-risk environment. Chest. 2009;135(5):1315-1320. PubMed
39. Society of Critical Care Medicine Ultrasound Certification Task Force. Recommendations for achieving and maintaining competence and credentialing in critical care ultrasound with focused cardiac ultrasound and advanced critical care echocardiography. http://journals.lww.com/ccmjournal/Documents/Critical%20Care%20Ultrasound.pdf Published 2013. Accessed February 2, 2017.
40. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002;77(5):361-367. PubMed
41. Clark EG, Paparello JJ, Wayne DB, et al. Use of a national continuing medical education meeting to provide simulation-based training in temporary hemodialysis catheter insertion skills: a pre-test post-test study. Can J Kidney Health Dis. 2014;1:25-31. PubMed
42. Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation-based education with mastery learning improves residents’ lumbar puncture skills. Neurology. 2012;79(2):132-137. PubMed
43. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37(10):2697-2701. PubMed
44. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094-1102. PubMed
45. Shah J, Darzi A. Surgical skills assessment: an ongoing debate. BJU Int. 2001;88(7):655-660. PubMed
46. Lamperti M, Bodenham AR, Pittiruti M, et al. International evidence-based recommendations on ultrasound-guided vascular access. Intensive Care Med. 2012;38(7):1105-1117. PubMed
47. Tolsgaard MG, Todsen T, Sorensen JL, et al. International multispecialty consensus on how to evaluate ultrasound competence: a Delphi consensus survey. PLOS One. 2013;8(2):e57687. doi:10.1371/journal.pone.0057687 PubMed
48. Moureau N, Laperti M, Kelly LJ, et al. Evidence-based consensus on the insertion of central venous access devices: definition of minimal requirements for training. Br J Anaesth. 2013;110(3):347-356. PubMed

49. Feldman LS, Hagarty S, Ghitulescu G, Stanbridge D, Fried GM. Relationship between objective assessment of technical skills and subjective in-training evaluations in surgical residents. J Am Coll Surg. 2004;198(1):105-110. PubMed
50. Baker S, Willey B, Mitchell C. The attempt to standardize technical and analytic competence in sonography education. J Diagn Med Sonogr. 2011;27(5):203-211. 
51. Tolsgaard MG, Ringsted C, Dreisler E, et al. Reliable and valid assessment of ultrasound operator competence in obstetrics and gynecology. Ultrasound Obstet Gynecol. 2014;43(4):437-443. PubMed
52. Rice J, Crichlow A, Baker M, et al. An assessment tool for the placement of ultrasound-guided peripheral intravenous access. J Grad Med Educ. 2016;8(2):202-207. PubMed
53. Hartman N, Wittler M, Askew K, Hiestand B, Manthey D. Validation of a performance checklist for ultrasound-guided internal jugular central lines for use in procedural instruction and assessment. Postgrad Med J. 2017;93(1096):67-70. PubMed
54. Primdahl SC, Todsen T, Clemmesen L, et al. Rating scale for the assessment of competence in ultrasound-guided peripheral vascular access—a Delphi Consensus Study. J Vasc Access. 2016;17(5):440-445. 
55. Berg D, Berg K, Riesenberg LA, et al. The development of a validated checklist for thoracentesis: preliminary results. Am J Med Qual. 2013;28(3):220-226. PubMed
56. Berg K, Riesenberg LA, Berg D, et al. The development of a validated checklist for radial arterial line placement: preliminary results. Am J Med Qual. 2014;29(3):242-246. PubMed
57. Walzak A, Bacchus M, Schaefer MP, et al. Diagnosing technical competence in six bedside procedures: comparing checklists and a global rating scale in the assessment of resident performance. Acad Med. 2015;90(8):1100-1108. PubMed
58. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for femoral venous catheterization: preliminary results. Am J Med Qual. 2014;29(5):445-450. PubMed
59. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for paracentesis: preliminary results. Am J Med Qual. 2013;28(3):227-231. PubMed
60. Huang GC, Newman LR, Schwartzstein RM, et al. Procedural competence in internal medicine residents: validity of a central venous catheter insertion assessment instrument. Acad Med. 2009;84(8):1127-1134. PubMed
61. Salamonsen M, McGrath D, Steiler G, et al. A new instrument to assess physician skill at thoracic ultrasound, including pleural effusion markup. Chest. 2013;144(3):930-934. PubMed
62. Boniface K, Yarris LM. Emergency ultrasound: Leveling the training and assessment landscape. Acad Emerg Med. 2014;21(7):803-805. PubMed
63. Boyle E, O’Keeffe D, Naughton P, Hill A, McDonnell C, Moneley D. The importance of expert feedback during endovascular simulator training. J Vasc Surg. 2011;54(1):240-248.e1. PubMed
64. Langhan TS, Rigby IJ, Walker IW, Howes D, Donnon T, Lord JA. Simulation-based training in critical resuscitation procedures improves residents’ competence. CJEM. 2009;11(6):535-539. PubMed
65. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4(7):397-403. PubMed
66. Lenchus JD. End of the “see one, do one, teach one” era: the next generation of invasive bedside procedural instruction. J Am Osteopath Assoc. 2010;110(6):340-346. PubMed
67. Barsuk JH, Cohen ER, Vozenilek JA, O’Connor LM, McGaghie WC, Wayne DB. Simulation-based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012;4(1):23-27. PubMed
68. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86(6):706-711. PubMed
69. Ross JG. Simulation and psychomotor skill acquisition: A review of the literature. Clin Simul Nurs. 2012;8(9):e429-e435. 
70. Barsuk JH, Cohen ER, Potts S, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014;23(9):749-756. PubMed
71. McSparron JI, Michaud GC, Gordan PL, et al. Simulation for skills-based education in pulmonary and critical care medicine. Ann Am Thorac Soc. 2015;12(4):579-586. PubMed
72. Kneebone RL, Scott W, Darzi A, Horrocks M. Simulation and clinical practice: strengthening the relationship. Med Educ. 2004;38(10):1095-1102. PubMed
73. Mema B, Harris I. The barriers and facilitators to transfer of ultrasound-guided central venous line skills from simulation to practice: exploring perceptions of learners and supervisors. Teach Learn Med. 2016;28(2):115-124. PubMed
74. Castanelli DJ. The rise of simulation in technical skills teaching and the implications for training novices in anaesthesia. Anaesth Intensive Care. 2009;37(6):903-910. PubMed
75. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014;48(4):375-385. PubMed
76. Langlois SLP. Focused ultrasound training for clinicians. Crit Care Med. 2007;35(5 suppl):S138-S143.
77. Price S, Via G, Sloth E, et al. Echocardiography practice, training and accreditation in the intensive care: document for the World Interactive Network Focused on Critical Ultrasound (WINFOCUS). Cardiovasc Ultrasound. 2008;6:49-83. PubMed
78. Blehar DJ, Barton B, Gaspari RJ. Learning curves in emergency ultrasound education. Acad Emerg Med. 2015;22(5):574-582. PubMed
79. Ault MJ, Rosen BT, Ault B. The use of tissue models for vascular access training. Phase I of the procedural patient safety initiative. J Gen Intern Med. 2006;21(5):514-517. PubMed
80. Barsuk JH, Cohen ER, McGaghie WC, Wayne DB. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med. 2010;85(10 Suppl):S9-S12. PubMed
81. Sliman S, Amundson S, Shaw D, Phan JN, Waalen J, Kimura B. Recently-acquired cardiac ultrasound skills are rapidly lost when not used: implications for competency in physician imaging. J Am Coll Cardiol. 2016;67(13S):1569.
82. Kessler CS, Leone KA. The current state of core competency assessment in emergency medicine and a future research agenda: recommendations of the working group on assessment of observable learner performance. Acad Emerg Med. 2012;19(12):1354-1359. PubMed
83. Chang A, Schyve PM, Croteau RJ, O’Leary DS, Loeb JM. The JCAHO patient safety event taxonomy: a standardized terminology and classification schema for near misses and adverse events. Int J Qual Health Care. 2005;17(2):95-105. PubMed
84. Sawyer T, White M, Zaveri P, et al. Learn, see, practice, prove, do, maintain: an evidence-based pedagogical framework for procedural skill training in medicine. Acad Med. 2015;90(8):1025-1033. PubMed
85. Das D, Kapoor M, Brown C, Ndubuisi A, Gupta S. Current status of emergency department attending physician ultrasound credentialing and quality assurance in the United States. Crit Ultrasound J. 2016;8(1):6-12. PubMed
86. Ndubuisi AK, Gupta S, Brown C, Das D. Current status and future issues in emergency department attending physician ultrasound credentialing. Ann Emerg Med. 2014;64(45):S27-S28. 
87. Tandy Tk, Hoffenberg S. Emergency department ultrasound services by emergency physicians: model for gaining hospital approval. Ann Emerg Med. 1997;29(3):367-374. PubMed
88. Lewiss RE, Saul T, Del Rios M. Acquiring credentials in bedside ultrasound: a cross-sectional survey. BMJ Open. 2013;3:e003502. doi:10.1136/bmjopen-2013-003502 PubMed
89. Lanoix R. Credentialing issues in emergency ultrasonography. Emerg Med Clin North Am. 1997;15(4):913-920. PubMed
90. Scalea T, Rodriquez A, Chiu WC, et al. Focused assessment with sonography for trauma (FAST): results from an international consensus conference. J Trauma. 1999;46(3):466-472. PubMed
91. Hertzberg BS, Kliewer MA, Bowie JD, et al. Physician training requirements in sonography: how many cases are needed for competence? AJR. 2000;174(5):1221-1227. PubMed
92. Blaivas M, Theodoro DL, Sierzenski P. Proliferation of ultrasound fellowships in emergency medicine: how do we ensure future experts are expertly trained? Acad Emerg Med. 2002;9(8):863-864. PubMed
93. Bodenham AR. Editorial II: Ultrasound imaging by anaesthetists: training and accreditation issues. Br J Anaesth. 2006;96(4):414-417. PubMed
94. Williamson JP, Twaddell SH, Lee YCG, et al. Thoracic ultrasound recognition of competence: A position paper of the Thoracic Society of Australia and New Zealand. Respirology. 2017;22(2):405-408. PubMed
95. Harrison G. Summative clinical competency assessment: a survey of ultrasound practitioners’ views. Ultrasound. 2015;23(1):11-17. PubMed
96. Evans LV, Morse JL, Hamann CJ, Osborne M, Lin Z, D'Onofrio G. The development of an independent rater system to assess residents' competence in invasive procedures. Acad Med. 2009;84(8):1135-1143. PubMed
97. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357(9260):945-949. PubMed
98. Arntfield RT. The utility of remote supervision with feedback as a method to deliver high-volume critical care ultrasound training. J Crit Care. 2015;30(2):441.e1-e6. PubMed
99. Akhtar S, Theodoro D, Gaspari R, et al. Resident training in emergency ultrasound: consensus recommendations from the 2008 Council of Emergency Residency Directors Conference. Acad Emerg Med. 2009;16:S32-S36. PubMed
100. Yu E. The assessment of technical skills in a cardiology training program: is the ITER sufficient? Can J Cardiol. 2000;16(4):457-462. PubMed
101. Todsen T, Tolsgaard MG, Olsen BH, et al. Reliable and valid assessment of point-of-care ultrasonography. Ann Surg. 2015;261(2):309-315. PubMed
102. Stein JC, Nobay F. Emergency department ultrasound credentialing: a sample policy and procedure. J Emerg Med. 2009;37(2):153-159. PubMed
103. Chen FM, Burstin H, Huntington J. The importance of clinical outcomes in medical education research. Med Educ. 2005;39(4):350-351. PubMed
104. Dressler DD, Pistoria MJ, Budnitz TL, McKean SCW, Amin AN. Core competencies in hospital medicine: development and methodology. J Hosp Med. 2006;1:48-56. PubMed
105. ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5(1):157-158. PubMed
106. Castillo J, Caruana CJ, Wainwright D. The changing concept of competence and categorisation of learning outcomes in Europe: Implications for the design of higher education radiography curricula at the European level. Radiography. 2011;17(3):230-234. 
107. Goldstein SR. Accreditation, certification: why all the confusion? Obstet Gynecol. 2007;110(6):1396-1398. PubMed
108. Moore CL. Credentialing and reimbursement in point-of-care ultrasound. Clin Pediatr Emerg Med. 2011;12(1):73-77. PubMed
109. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542-547. PubMed
110. Abuhamad AZ, Benacerraf BR, Woletz P, Burke BL. The accreditation of ultrasound practices: impact on compliance with minimum performance guidelines. J Ultrasound Med. 2004;23(8):1023-1029. PubMed

 

 

Article PDF
Issue
Journal of Hospital Medicine 13(2)
Publications
Topics
Page Number
126-135. Published online first January 17, 2018
Sections
Files
Files
Article PDF
Article PDF

The American Board of Internal Medicine (ABIM) changed its certification policy for bedside procedures over a decade ago.1 Acquiring manual competence in abdominal paracentesis, arterial catheter placement, arthrocentesis, central venous catheter placement, lumbar puncture, and thoracentesis is no longer an expectation of residency training. ABIM diplomates should “know” these procedures but not necessarily “do” them. Hospitalists, most of whom are themselves ABIM diplomates, are still, however, expected to do them as core competencies,2perhaps because hospitalists are often available off-hours, when roughly half of bedside procedures are performed.3

Hospitalists increasingly perform bedside procedures with ultrasound guidance.4 Yet training in ultrasound guidance is significantly varied as well,5 simply because point-of-care ultrasound (POCUS) has only recently become widespread.6 And though some skills are transferrable from landmark-guided to ultrasound -guided procedures, many are not.7-10 Furthermore, ultrasound guidance is often not explicitly delineated on the privileging forms used by hospitals,11 even where ultrasound guidance has become standard.12

Given the variability in training for both ultrasound- and landmark-guided procedures, and given the lack of a universal standard for certification, local hospitals often ask their respective hospitalist group leaders to certify hospitalists’ basic competence as part of credentialing (see the Table for definitions). How hospitalist group leaders should certify competence, however, is not clear. The importance of this gap has recently increased, as hospitalists continue to perform procedures despite not having clear answers to questions about basic competence.13-15

Therefore, the Society of Hospital Medicine (SHM) Education Committee convened a group of experts and conducted a systematic literature review in order to provide recommendations for credentialing hospitalist physicians in ultrasound-guided bedside procedures. These recommendations do not include training recommendations, aside from recommendations about remedial training for hospitalists who do not pass certification. Training is a means to competence but does not guarantee it. We believe that training recommendations ought to be considered separately.

METHODS

Working Group Formation

In January 2015, the SHM Board of Directors asked the SHM Education Committee to convene the POCUS Task Force. The purpose of the task force was to develop recommendations on ultrasound guidance for bedside procedures. The SHM Education Committee appointed 3 chairs of the task force: 1 senior member of the SHM Education Committee and 2 POCUS experts. The chairs assembled a task force of 31 members that included 5 working groups, a multispecialty peer review group, and a guideline methodologist (supplemental Appendix 1). Invitation was based on members’ past contributions to SHM POCUS-related activities, up-front commitment, and declared conflicts of interest. Working group members self-identified as “hospitalists,” whereas peer reviewers were nonhospitalists but nationally recognized POCUS physician-leaders specializing in emergency medicine, cardiology, critical care medicine, and anesthesiology. Task force membership was vetted by a chair of the SHM POCUS Task Force and the Director of Education before work began. This position statement was authored by the Credentialing Working Group together with the chairs of the other 4 working groups and a guideline methodologist.

 

 

Disclosures

Signed disclosure statements of all task force members were reviewed prior to inclusion on the task force (supplemental Appendix 2); no members received honoraria for participation. Industry representatives did not contribute to the development of the guidelines nor to any conference calls or meetings.

Literature Search Strategy

A literature search was conducted by a biomedical librarian. Records from 1979 to January of 2017 were searched in Medline, Embase, CINAHL, Cochrane, and Google Scholar (supplemental Appendix 3). Search limiters were English language and adults. Articles were manually screened to exclude nonhuman or endoscopic ultrasound applications. Final article selection was based on working group consensus.

Draft Pathways

The Credentialing Working Group drafted initial and ongoing certification pathways (Figure 1 and Figure 2). The other 4 working groups from the task force were surveyed about the elements and overall appropriateness of these draft pathways. This survey and its results have already been published.12 The Credentialing Working Group then revised the certification pathways by using these survey results and codified individual aspects of these pathways into recommendations.

Development of Position Statement

Based on the Grading of Recommendation Assessment Development and Evaluation methodology, all final article selections were initially rated as either low-quality (observational studies) or unclassifiable (expert opinion).16 These initial ratings were downgraded further because of indirectness, because none of the articles involved the intervention of interest (a credentialing pathway) in a population of interest (hospitalists) measuring the outcomes of interest (patient-level outcomes).17 Given the universal low-quality evidence ratings, we altered the task force strategy of developing guidelines, which the other 4 working groups are writing, and instead developed a position statement by using consensus gathering in 3 steps.

First, the Credentialing Working Group drafted an initial position statement composed of recommendations for credentialing pathways and other general aspects of credentialing. All final article selections were incorporated as references in a draft of the position statement and compiled in a full-text compendium. Second, feedback was provided by the other 4 task force working groups, the task force peer reviewers, and the SHM Education Committee. Feedback was incorporated by the authors of this statement who were the Credentialing Working Group, the chairs of the other 4 working groups, and a guideline methodologist. Third, final suggestions from all members of the SHM POCUS Task Force and SHM Education Committee were incorporated before final approval by the SHM Board of Directors in September 2017.

RESULTS

A total of 1438 references were identified in the original search. Manual selection led to 101 articles, which were incorporated into the following 4 domains with 16 recommendations.

General Credentialing Process

Basic Cognitive Competence Can Be Certified with Written or Oral Examinations

The ABIM defines cognitive competence as having 3 abilities: “(1) to explain indications, contraindications, patient preparation methods, sterile techniques, pain management, proper techniques for handling specimens and fluids obtained, and test results; (2) to recognize and manage complications; and, (3) to clearly explain to a patient all facets of the procedure necessary to obtain informed consent.”1 These abilities can be assessed with written or oral examinations that may be integrated into simulation- or patient-based assessments.18-21

Minimum Thresholds of Experience to Trigger the Timing of a Patient-Based Assessment Should Be Determined by Empirical Methods

Learning curves are highly variable22-25 and even plateaus may not herald basic competence.26 Expert opinions27 can be used to establish minimum thresholds of experience, but such opinions may paradoxically exceed the current thresholds of experts’ own hospitals.12 Thus, empirical methods, such as those based on cumulative sum analysis28-30 or local learning curves,31,32 are preferred. If such methods are not available, a recent survey of hospitalist experts may provide guidance.12 Regardless, once established, minimum thresholds are necessary but not sufficient to determine competency (see “Basic manual competence must be certified through patient-based assessments” section).

Hospitalists Should Formally Log All of Their Attempted Procedures, Ideally in an Electronic Medical Record

Simple self-reported numbers of procedures performed often misrepresent actual experience33,34 and do not include periprocedural complications.35,36 Thus, hospitalists should report their experience with logs of all attempted procedures, both successful and unsuccessful. Such logs must include information about supervising providers (if applicable) and patient outcomes, including periprocedural adverse events,37 but they must also remain compliant with the Health Insurance Portability and Accountability Act.

Health Information Technology Service Should Routinely Pull Collations of All Attempted Procedures from Comprehensive Electronic Medical Records

Active surveillance may reduce complications by identifying hospitalists who may benefit from further training.38 In order to facilitate active surveillance systems, documentation (such as a procedure note) should be both integrated into an electronic medical record and protocol driven,39 including procedure technique, ultrasound findings, and any safety events (both near misses and adverse events).

 

 

Basic Manual Competence Must Be Certified Through Patient-Based Assessments

Multiple interacting factors, including environment, patients, baseline skills, training, experience, and skills decay, affect manual competence. Certifications that are based solely on reaching minimum thresholds of experience, even when accurate, are not valid reflections of manual competence,15,40-43 and neither are those based on self-perception.44 Patient-based assessments are, thus, necessary to ensure manual competence.45-48

Certification Assessments of Manual Competence Should Combine 2 Types of Structured Instruments: Checklists and Overall Scores

Assessments based on direct observation are more reliable when formally structured.49,50 Though checklists used in observed structured clinical examinations capture many important manual skills,51-56 they do not completely reflect a hospitalist’s manual competence;57 situations may occur in which a hospitalist meets all the individual items on a checklist but cannot perform an entire procedure with basic competence. Therefore, checklists should be paired with overall scores.58-61 Both checklists and overall scores ought to be obtained from reliable and valid instruments.

Certification Assessments Should Include Feedback

Assessments without feedback are missed learning opportunities.62 Both simulation-63 and patient-based assessments should provide feedback in real time to reinforce effective behaviors and remedy faulty ones.

If Remedial Training is Needed, Simulator-Based Training Can Supplement but Not Replace Patient-Based Training

Supervised simulator-based training allows hospitalists to master basic components of a procedure64 (including orientation to equipment, sequence of operations, dexterity, ultrasound anatomy, and real-time guidance technique) while improving both cognitive and manual skills.42,43,65-71 In addition to their role in basic training (which is outside the scope of this position statement), simulators can be useful for remedial training. To be sufficient for hospitalists who do not pass their patient-based assessments, however, remedial training that begins with simulation must also include patient-based training and assessment.72-75

Initial Credentialing Process

A Minimum Threshold of Experience Should Be Reached before Patient-Based Assessments are Conducted (Figure 1)

Recent experience, such as the number of successful procedures performed on a representative sample of patients61,76,77 in the last 2 years, should meet a minimum threshold (see “Minimum thresholds of experience to trigger the timing of a patient-based assessment should be determined by empirical methods” section) before a patient-based assessment for intramural certification occurs.31,78 Such procedures should be supervised unless performed with privileges, for example, at another hospital. After reaching both a minimum threshold of experience and passing an observed patient-based assessment, which includes assessments of both cognitive and manual skills, hospitalists can be considered intramurally certified for initial credentialing. The hospitalist may begin to independently perform ultrasound-guided procedures if all credentialing requirements are met and privileges are granted.

Initial Certification Assessments Should Ideally Begin on Simulators

Simulators allow the assurance of safe manual skills, including proper needle insertion techniques and disposal of sharp objects.3,79 If simulators are not available, however, then patient-based training and assessments can still be performed under direct observation. Safe performance of ultrasound-guided procedures during patient-based assessments (without preceding simulator-based assessments) is sufficient to certify manual competence.

Ongoing Credentialing

Certification to Perform Ultrasound-Guided Procedures Should Be Routinely Re-Evaluated During Ongoing Credentialing (Figure 2)

Ongoing certifications are needed because skills decay.80,81 They should be routine, perhaps coinciding with the usual reprivileging cycle (often biennially). When feasible,82 maintenance of manual competence is best ensured by directly observed patient-based assessments; when not feasible, performance reviews are acceptable.

Observed Patient-Based Assessments Should Occur When a Periprocedural Safety Event Occurs that is Potentially Caused by “Provider Error”

Safety events include both near misses and adverse events. Information about both is ideally “flagged” and “pushed” to hospitalist group leaders by active surveillance and reporting systems. Once reviewed, if a safety event is considered to potentially have been caused by provider error (including knowledge- and skill-based errors),83 then the provider who performed the procedure should undergo an observed patient-based assessment.
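In a simplified, hypothetical form, such a surveillance rule might look like the sketch below; the field names and trigger logic are illustrative assumptions, not a specified reporting standard.

```python
from dataclasses import dataclass

@dataclass
class SafetyEvent:
    provider_id: str
    procedure: str
    near_miss: bool                       # near misses count as safety events too
    reviewed: bool                        # event has undergone local review
    attributed_to_provider_error: bool    # knowledge- or skill-based error

def requires_observed_assessment(event: SafetyEvent) -> bool:
    """Flag the performing provider for an observed patient-based assessment when
    a reviewed event (near miss or adverse event) is attributed to provider error."""
    return event.reviewed and event.attributed_to_provider_error

# Example: a reviewed near miss attributed to a skill-based error triggers reassessment.
event = SafetyEvent("hospitalist-042", "thoracentesis",
                    near_miss=True, reviewed=True, attributed_to_provider_error=True)
print(requires_observed_assessment(event))  # True
```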

Simulation-Based Practice Can Supplement Patient-Based Experience for Ongoing Credentialing

When hospitalists do not achieve a minimum threshold of patient-based experience since their most recent certification, simulation-based training can supplement their patient-based experience.84 In these cases, however, an observed patient-based assessment must still occur. Another consideration is whether the privilege should be relinquished because of infrequent need.
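A hypothetical decision helper can summarize this ongoing pathway, including the routine reassessment described above and the option to relinquish a rarely needed privilege; the parameters and return labels are assumptions for illustration only.

```python
def ongoing_recertification_plan(procedures_since_last_certification: int,
                                 minimum_threshold: int,
                                 privilege_still_needed: bool,
                                 observed_assessment_feasible: bool) -> str:
    """Sketch of the ongoing-credentialing decisions described above."""
    if not privilege_still_needed:
        # Infrequent need may argue for relinquishing the privilege altogether.
        return "consider relinquishing the privilege"
    if procedures_since_last_certification < minimum_threshold:
        # Simulation can supplement sparse patient-based experience, but an
        # observed patient-based assessment must still occur.
        return "simulation-based practice plus observed patient-based assessment"
    if observed_assessment_feasible:
        return "directly observed patient-based assessment"
    return "structured performance review"
```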

Credentialing Infrastructure

Hospitalists Themselves Should Not Bear the Financial Costs of Developing and Maintaining Training and Certification Programs for Ultrasound-Guided Procedures

Equipment and personnel costs85,86 commonly impede ultrasound-guided procedure programs.4,87,88 Hospitalists whose job descriptions include the performance of ultrasound-guided procedures should not be expected to bear the costs of ultrasound machines, image archival software, equipment maintenance, and initial and ongoing training and certification.

Assessors Should Be Unbiased Expert Providers Who Have Demonstrated Mastery in Performance of the Procedure Being Assessed and Regularly Perform It in a Similar Practice Environment

 

 

Assessors should be expert providers who regularly perform the ultrasound-guided procedure in a similar practice environment.9,89-94 For example, providers who are not hospitalists but who are experts in an ultrasound-guided procedure and commonly perform it on the hospital wards would be acceptable assessors. However, a radiologist who only performs that procedure in a fully staffed interventional radiology suite with fluoroscopy or computed tomography guidance would not be an acceptable assessor. More than 1 assessor may balance idiosyncratic assessments;95 but when assessments are well structured, additional assessors are generally not needed.18 Candidate assessors should be vetted by the hospitalist group leader and the hospital privileging committee.

If Intramural Assessors Are Not Available, Extramural Assessors May Be Considered

Intramural assessors are generally preferred because of familiarity with the local practice environment, including the available procedure kits and typical patient characteristics. Nevertheless, extramural assessors27,77,85,96 may theoretically provide even more valid assessments than intramural ones because extramural assessors are neither influenced by relationships with local hospitalists nor biased by local hospitalists’ skills.97,98 Remote performance assessment through video recordings99 or live-video streaming is another option100 but is not sufficient unless a room camera is available to simultaneously view probe movement and the ultrasound screen.101 In addition, remote assessment does not allow the assessor to physically assume control of the procedure to either salvage it or perhaps, in some cases, prevent a complication.

DISCUSSION

There are no high-quality randomized trials in support of a single credentialing pathway over any other.94,102 The credentialing pathways at the center of this position statement are based on expert opinion. Our methods can be criticized straightaway, therefore, for reliance on the experience and expertise of our working group and task force. Any position statement written without high-quality supportive evidence would be appropriately subject to the same criticism. Without evidence in support of an overall pathway, we codified specific aspects of the pathways into 16 individual recommendations.

Patient-level outcomes do not back these recommendations. Consider, for example, our recommendation that certification assessments be made from structured instruments and not simply from an assessor’s gestalt. Here, the basis is not improved patient-level outcomes from a trial (such as reduced complications or increased procedural success) but improved psychometric performance from reliability studies. The body of evidence for our recommendations is similarly indirect, mostly because the outcomes studied are more proximate and, thus, less meaningful than patient-level outcomes, which are the outcomes of greatest interest but are woefully understudied for clinical competence.17,97,103

The need for high-quality evidence is most pronounced in distinguishing how recommendations should be modified for various settings. Wide variations in resources and patient mix will make some recommendations impracticable, meaning that they could not be carried out with available resources. For example, our recommendation that credentialing decisions should ultimately rely on certifications made by assessors during patient-based assessments may not be practicable at small, rural hospitals. Such hospitals may not have access to local assessors, and they may not admit enough patients who need the types of ultrasound-guided procedures for which hospitalists seek certification (especially given the need to coordinate the schedules of patients, procedure-performing hospitalists, and assessors). Collaborative efforts between hospitals for regional certification may be a potential solution to consider. But if recommendations are truly impracticable, the task force recognizes they may need to be modified. Given the low quality of evidence supporting our recommendations, such modifications would be readily defensible, especially if they emerged from collaborative discussions between privileging committees, hospitalist directors, and local experts.

One way for hospitals to implement our recommendations may be to follow a recommendation proposed by the authors of the original hospitalist core competencies over a decade ago: “The presence of a procedural skill in the Core Competencies does not necessarily indicate that every hospitalist will perform or be proficient in that procedure.”104 In other words, bedside procedures may be delegated to some but not all hospitalists. Such “proceduralists” would have some proportion of their clinical responsibility dedicated to performing procedures. Delineation of this job description must be made locally because it balances 2 hospital-specific factors: patients’ needs for procedures and the availability of providers with basic competence to perform them, which includes hospitalists but also emergency medicine physicians, specialists, and interventional radiologists. An added benefit for hospitals is that hospitalists who are not proceduralists would not need to undergo certification in basic competence for the bedside procedures they will not be performing.

Regardless of whether some or all hospitalists at a particular hospital are expected to perform bedside procedures, technology may help to improve the practicability of our recommendations. For example, simulators may evolve to replace actual patient-level experience in achieving minimum thresholds. Certification assessments of manual skills may even someday occur entirely on simulators. Real-time high-definition video streaming enhanced with multiple cameras may allow for remote assessments. Until such advances mature, high-quality patient-level data should be sought through additional research to refine our current recommendations.

We hope that these recommendations will improve how basic competence in ultrasound-guided bedside procedures is assessed. Our ultimate goal is to improve how hospitalists perform these procedures. Patient safety, therefore, takes precedence over cost. Nevertheless, the hospital administrative leaders and privileging committee members on our Task Force concluded that many hospitals have been seeking guidance on credentialing for bedside procedures, and the likely difficulties of implementing our recommendations (including cost) would not be prohibitive at most hospitals, especially given recognition that these recommendations can be tailored to each setting.

 

 

Acknowledgments

Collaborators from SHM POCUS Task Force are Saaid Abdel-Ghani, Michael Blaivas, Dan Brotman, Carolina Candotti, Jagriti Chadha, Joel Cho, Ria Dancel, Ricardo Franco, Richard Hoppmann, Susan Hunt, Venkat Kalidindi, Ketino Kobaidze, Josh Lenchus, Benji Mathews, Satyen Nichani, Vicki Noble, Martin Perez, Nitin Puri, Aliaksei Pustavoitau, Sophia Rodgers, Gerard Salame, Daniel Schnobrich, Kirk Spencer, Vivek Tayal, Jeff Bates, Anjali Bhagra, Kreegan Reierson, Robert Arntfield, Paul Mayo, Loretta Grikis.

Disclosure

Brian P. Lucas received funding from the Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development and Dartmouth SYNERGY, National Institutes of Health, and National Center for Translational Science (UL1TR001086). Nilam Soni received funding from the Department of Veterans Affairs, Quality Enhancement Research Initiative (QUERI) Partnered Evaluation Initiative (HX002263-01A1). The contents of this publication do not represent the views of the United States Department of Veterans Affairs or the United States Government.


References

1. American Board of Internal Medicine. Policies and procedures for certification. Philadelphia: American Board of Internal Medicine; 2006.
2. Nichani S, Fitterman N, Lukela M, Crocker J; Society of Hospital Medicine. The Core Competencies in Hospital Medicine 2017 Revision. Section 2: Procedures. J Hosp Med. 2017;12(4 Suppl 1):S44-S54 PubMed
3. Lucas BP, Asbury JK, Franco-Sadud R. Training future hospitalists with simulators: a needed step toward accessible, expertly performed bedside procedures. J Hosp Med. 2009;4(7):395-396. PubMed
4. Schnobrich DJ, Gladding S, Olson APJ, Duran-Nelson A. Point-of-care ultrasound in internal medicine: a national survey of educational leadership. J Grad Med Educ. 2013;5(3):498-502. PubMed
5. Brown GM, Otremba M, Devine LA, Gray C, Millington SJ, Ma IW. Defining competencies for ultrasound-guided bedside procedures: consensus opinions from Canadian physicians. J Ultrasound Med. 2016;35(1):129-141. PubMed
6. Vaisman A, Cram P. Procedural competence among faculty in academic health centers: challenges and future directions. Acad Med. 2017;92(1):31-34. PubMed
7. Kreisman RD. With ED ultrasound, credentialing is at issue. ED Legal Letter. 2010;21:102-103. 
8. Goudie AM. Credentialing a new skill: what should the standard be for emergency department ultrasound in Australasia? Emerg Med Australas. 2010;22:263-264. PubMed
9. Maizel J, Guyomarc HL, Henon P, et al. Residents learning ultrasound-guided catheterization are not sufficiently skilled to use landmarks. Crit Care. 2014;18(1):R36. doi:10.1186/cc13741. PubMed
10. American College of Emergency Physicians. Ultrasound guidelines: emergency, point-of-care, and clinical ultrasound guidelines in medicine. Ann Emerg Med. 2017;69(5):e27-e54. PubMed
11. Amini R, Adhikari S, Fiorello A. Ultrasound competency assessment in emergency medicine residency programs. Acad Emerg Med. 2014;21(7):799-801. PubMed
12. Jensen T, Soni NJ, Tierney DM, Lucas BP. Hospital privileging practices for bedside procedures: a survey of hospitalist experts. J Hosp Med. 2017;12(10):836-839. PubMed
13. Chang W. Is hospitalist proficiency in bedside procedures in decline? The Hospitalist. 2012. http://www.the-hospitalist.org/hospitalist/article/125236/patient-safety/hospitalist-proficiency-bedside-procedures-decline. Accessed September 30, 2017.
14. Barsuk JH, Feinglass J, Kozmic SE, Hohmann SF, Ganger D, Wayne DB. Specialties Performing Paracentesis Procedures at University Hospitals: Implications for Training and Certification. J Hosp Med. 2014;9(3):162-168. PubMed
15. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Residents’ Procedural Experience Does Not Ensure Competence: A Research Synthesis. J Grad Med Educ. 2017;9(2):201-208. PubMed
16. Balshem H, Helfand M, Schunemann HJ, et al. GRADE guidelines: 3. Rating the quality of evidence. J Clin Epidemiol. 2011;64(4):401-406. PubMed
17. Guyatt GH, Oxman AD, Kunz R, et al. GRADE guidelines: 8. Rating the quality of evidence—indirectness. J Clin Epidemiol. 2011;64(12):1303-1310. PubMed
18. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63-S67. PubMed
19. Grover S, Currier PF, Elinoff JM, Mouchantaf KJ, Katz JT, McMahon GT. Development of a test to evaluate residents knowledge of medical procedures. J Hosp Med. 2009;4(7):430-432. PubMed
20. Millington SJ, Wong RY, Kassen BO, Roberts JM, Ma IWY. Improving internal medicine residents’ performance, knowledge, and confidence in central venous catheterization using simulators. J Hosp Med. 2009;4(7):410-416. PubMed
21. Lenchus JD, Carvalho CM, Ferreri K, et al. Filling the void: defining invasive bedside procedural competency for internal medicine residents. J Grad Med Educ. 2013;5(4):605-612. PubMed
22. Heegeman DJ, Kieke B Jr. Learning curves, credentialing, and the need for ultrasound fellowships. Acad Emerg Med. 2003;10:404-405. PubMed
23. Jang TB, Ruggeri W, Dyne P, Kaji AH. The learning curve of resident physicians using emergency ultrasonography for cholelithiasis and cholecystitis. Acad Emerg Med. 2010;17(11):1247-1252. PubMed
24. Akhtar MI, Hamid M. Ultrasound guided central venous access; a review of literature. Anaesth Pain Intensive Care. 2015;19:317-322. 
25. Bahl A, Yunker A. Assessment of the numbers–based model for evaluation of resident competency in emergency ultrasound core applications. J Emerg Med Trauma Acute Care. 2015;2015(5). doi:10.5339/jemtac.2015.5 
26. Cazes N, Desmots F, Geffroy Y, Renard A, Leyral J, Chaumoitre K. Emergency ultrasound: a prospective study on sufficient adequate training for military doctors. Diagn Interv Imaging. 2013;94(11):1109-1115. PubMed
27. Arntfield RT, Millington SJ, Ainsworth CD, et al. Canadian recommendations for critical care ultrasound training and competency for the Canadian critical care society. Can Respir J. 2014;21(16):341-345. 
28. Bolsin S, Colson M. The use of the Cusum technique in the assessment of trainee competence in new procedures. Int J Qual Health Care. 2000;12(5):433-438. PubMed
29. de Oliveira Filho GR, Helayel PE, da Conceição DB, Garzel IS, Pavei P, Ceccon MS. Learning curves and mathematical models for interventional ultrasound basic skills. Anaesth Analg. 2008;106(2):568-573. PubMed
30. Starkie T, Drake EJ. Assessment of procedural skills training and performance in anesthesia using cumulative sum analysis (cusum). Can J Anaesth. 2013;60(12):1228-1239. PubMed
31. Tierney D. Competency cut-point identification derived from a mastery learning cohort approach: A hybrid model. Ultrasound Med Biol. 2015;41:S19. 
32. Rankin JH, Elkhunovich MA, Rangarajan V, Chilstrom M, Mailhot T. Learning Curves for Ultrasound Assessment of Lumbar Puncture Insertion Sites: When is Competency Established? J Emerg Med. 2016;51(1):55-62. PubMed
33. Klasko SK, Cummings RV, Glazerman LR. Resident data collection: Do the numbers add up? Am J Obstet Gynecol. 1995;172(4 Pt 1):1312-1316. PubMed
34. Tierney D. Development & analysis of a mobile POCUS tracking tool. Ultrasound Med Biol. 2015;41(suppl 4):S31. 
35. Sethi MV, Zimmer J, Ure B, Lacher M. Prospective assessment of complications on a daily basis is essential to determine morbidity and mortality in routine pediatric surgery. J Pediatr Surg. 2016;51(4):630-633. PubMed
36. Fisher JC, Kuenzler KA, Tomita SS, Sinha P, Shah P, Ginsburg HB. Increased capture of pediatric surgical complications utilizing a novel case-log web application to enhance quality improvement. J Pediatr Surg. 2017;52(1):166-171. PubMed
37. Rethans JJ, Norcini JJ, Barón-Maldonado M, et al. The relationship between competence and performance: implications for assessing practice performance. Med Educ. 2002;36(10):901-909. PubMed
38. Duncan DR, Morgenthaler TI, Ryu JH, Daniels CE. Reducing iatrogenic risk in thoracentesis: establishing best practice via experiential training in a zero-risk environment. Chest. 2009;135(5):1315-1320. PubMed
39. Society of Critical Care Medicine Ultrasound Certification Task Force. Recommendations for achieving and maintaining competence and credentialing in critical care ultrasound with focused cardiac ultrasound and advanced critical care echocardiography. http://journals.lww.com/ccmjournal/Documents/Critical%20Care%20Ultrasound.pdf Published 2013. Accessed February 2, 2017.
40. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002;77(5):361-367. PubMed
41. Clark EG, Paparello JJ, Wayne DB, et al. Use of a national continuing medical education meeting to provide simulation-based training in temporary hemodialysis catheter insertion skills: a pre-test post-test study. Can J Kidney Health Dis. 2014;1:25-31. PubMed
42. Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation-based education with mastery learning improves residents’ lumbar puncture skills. Neurology. 2012;79(2):132-137. PubMed
43. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37(10):2697-2701. PubMed
44. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094-1102. PubMed
45. Shah J, Darzi A. Surgical skills assessment: an ongoing debate. BJU Int. 2001;88(7):655-660. PubMed
46. Lamperti M, Bodenham AR, Pittiruti M, et al. International evidence-based recommendations on ultrasound-guided vascular access. Intensive Care Med. 2012;38(7):1105-1117. PubMed
47. Tolsgaard MG, Todsen T, Sorensen JL, et al. International multispecialty consensus on how to evaluate ultrasound competence: a Delphi consensus survey. PLOS One. 2013;8(2):e57687. doi:10.1371/journal.pone.0057687 PubMed
48. Moureau N, Laperti M, Kelly LJ, et al. Evidence-based consensus on the insertion of central venous access devices: definition of minimal requirements for training. Br J Anaesth. 2013;110(3):347-356. PubMed

49. Feldman LS, Hagarty S, Ghitulescu G, Stanbridge D, Fried GM. Relationship between objective assessment of technical skills and subjective in-training evaluations in surgical residents. J Am Coll Surg. 2004;198(1):105-110. PubMed
50. Baker S, Willey B, Mitchell C. The attempt to standardize technical and analytic competence in sonography education. J Diagn Med Sonogr. 2011;27(5):203-211. 
51. Tolsgaard MG, Ringsted C, Dreisler E, et al. Reliable and valid assessment of ultrasound operator competence in obstetrics and gynecology. Ultrasound Obstet Gynecol. 2014;43(4):437-443. PubMed
52. Rice J, Crichlow A, Baker M, et al. An assessment tool for the placement of ultrasound-guided peripheral intravenous access. J Grad Med Educ. 2016;8(2):202-207. PubMed
53. Hartman N, Wittler M, Askew K, Hiestand B, Manthey D. Validation of a performance checklist for ultrasound-guided internal jugular central lines for use in procedural instruction and assessment. Postgrad Med J. 2017;93(1096):67-70. PubMed
54. Primdahl SC, Todsen T, Clemmesen L, et al. Rating scale for the assessment of competence in ultrasound-guided peripheral vascular access—a Delphi Consensus Study. J Vasc Access. 2016;17(5):440-445. 
55. Berg D, Berg K, Riesenberg LA, et al. The development of a validated checklist for thoracentesis: preliminary results. Am J Med Qual. 2013;28(3):220-226. PubMed
56. Berg K, Riesenberg LA, Berg D, et al. The development of a validated checklist for radial arterial line placement: preliminary results. Am J Med Qual. 2014;29(3):242-246. PubMed
57. Walzak A, Bacchus M, Schaefer MP, et al. Diagnosing technical competence in six bedside procedures: comparing checklists and a global rating scale in the assessment of resident performance. Acad Med. 2015;90(8):1100-1108. PubMed
58. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for femoral venous catheterization: preliminary results. Am J Med Qual. 2014;29(5):445-450. PubMed
59. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for paracentesis: preliminary results. Am J Med Qual. 2013;28(3):227-231. PubMed
60. Huang GC, Newman LR, Schwartzstein RM, et al. Procedural competence in internal medicine residents: validity of a central venous catheter insertion assessment instrument. Acad Med. 2009;84(8):1127-1134. PubMed
61. Salamonsen M, McGrath D, Steiler G, et al. A new instrument to assess physician skill at thoracic ultrasound, including pleural effusion markup. Chest. 2013;144(3):930-934. PubMed
62. Boniface K, Yarris LM. Emergency ultrasound: Leveling the training and assessment landscape. Acad Emerg Med. 2014;21(7):803-805. PubMed
63. Boyle E, O’Keeffe D, Naughton P, Hill A, McDonnell C, Moneley D. The importance of expert feedback during endovascular simulator training. J Vasc Surg. 2011;54(1):240-248.e1. PubMed
64. Langhan TS, Rigby IJ, Walker IW, Howes D, Donnon T, Lord JA. Simulation-based training in critical resuscitation procedures improves residents’ competence. CJEM. 2009;11(6):535-539. PubMed
65. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4(7):397-403. PubMed
66. Lenchus JD. End of the “see one, do one, teach one” era: the next generation of invasive bedside procedural instruction. J Am Osteopath Assoc. 2010;110(6):340-346. PubMed
67. Barsuk JH, Cohen ER, Vozenilek JA, O’Connor LM, McGaghie WC, Wayne DB. Simulation-based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012;4(1):23-27. PubMed
68. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86(6):706-711. PubMed
69. Ross JG. Simulation and psychomotor skill acquisition: A review of the literature. Clin Simul Nurs. 2012;8(9):e429-e435. 
70. Barsuk JH, Cohen ER, Potts S, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014;23(9):749-756. PubMed
71. McSparron JI, Michaud GC, Gordan PL, et al. Simulation for skills-based education in pulmonary and critical care medicine. Ann Am Thorac Soc. 2015;12(4):579-586. PubMed
72. Kneebone RL, Scott W, Darzi A, Horrocks M. Simulation and clinical practice: strengthening the relationship. Med Educ. 2004;38(10):1095-1102. PubMed
73. Mema B, Harris I. The barriers and facilitators to transfer of ultrasound-guided central venous line skills from simulation to practice: exploring perceptions of learners and supervisors. Teach Learn Med. 2016;28(2):115-124. PubMed
74. Castanelli DJ. The rise of simulation in technical skills teaching and the implications for training novices in anaesthesia. Anaesth Intensive Care. 2009;37(6):903-910. PubMed
75. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014;48(4):375-385. PubMed
76. Langlois SLP. Focused ultrasound training for clinicians. Crit Care Med. 2007;35(5 suppl):S138-S143.
77. Price S, Via G, Sloth E, et al. Echocardiography practice, training and accreditation in the intensive care: document for the World Interactive Network Focused on Critical Ultrasound (WINFOCUS). Cardiovasc Ultrasound. 2008;6:49-83. PubMed
78. Blehar DJ, Barton B, Gaspari RJ. Learning curves in emergency ultrasound education. Acad Emerg Med. 2015;22(5):574-582. PubMed
79. Ault MJ, Rosen BT, Ault B. The use of tissue models for vascular access training. Phase I of the procedural patient safety initiative. J Gen Intern Med. 2006;21(5):514-517. PubMed
80. Barsuk JH, Cohen ER, McGaghie WC, Wayne DB. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med. 2010;85(10 Suppl):S9-S12. PubMed
81. Sliman S, Amundson S, Shaw D, Phan JN, Waalen J, Kimura B. Recently-acquired cardiac ultrasound skills are rapidly lost when not used: implications for competency in physician imaging. J Am Coll Cardiol. 2016;67(13S):1569. 
82. Kessler CS, Leone KA. The current state of core competency assessment in emergency medicine and a future research agenda: recommendations of the working group on assessment of observable learner performance. Acad Emerg Med. 2012;19(12):1354-1359. PubMed
83. Chang A, Schyve PM, Croteau RJ, O’Leary DS, Loeb JM. The JCAHO patient safety event taxonomy: a standardized terminology and classification schema for near misses and adverse events. Int J Qual Health Care. 2005;17(2):95-105. PubMed
84. Sawyer T, White M, Zaveri P, et al. Learn, see, practice, prove, do, maintain: an evidence-based pedagogical framework for procedural skill training in medicine. Acad Med. 2015;90(8):1025-1033. PubMed
85. Das D, Kapoor M, Brown C, Ndubuisi A, Gupta S. Current status of emergency department attending physician ultrasound credentialing and quality assurance in the United States. Crit Ultrasound J. 2016;8(1):6-12. PubMed
86. Ndubuisi AK, Gupta S, Brown C, Das D. Current status and future issues in emergency department attending physician ultrasound credentialing. Ann Emerg Med. 2014;64(45):S27-S28. 
87. Tandy Tk, Hoffenberg S. Emergency department ultrasound services by emergency physicians: model for gaining hospital approval. Ann Emerg Med. 1997;29(3):367-374. PubMed
88. Lewiss RE, Saul T, Del Rios M. Acquiring credentials in bedside ultrasound: a cross-sectional survey. BMJ Open. 2013;3:e003502. doi:10.1136/bmjopen-2013-003502 PubMed
89. Lanoix R. Credentialing issues in emergency ultrasonography. Emerg Med Clin North Am. 1997;15(4):913-920. PubMed
90. Scalea T, Rodriquez A, Chiu WC, et al. Focused assessment with sonography for trauma (FAST): results from an international consensus conference. J Trauma. 1999;46(3):466-472. PubMed
91. Hertzberg BS, Kliewer MA, Bowie JD, et al. Physician training requirements in sonography: how many cases are needed for competence? AJR. 2000;174(5):1221-1227. PubMed
92. Blaivas M, Theodoro DL, Sierzenski P. Proliferation of ultrasound fellowships in emergency medicine: how do we ensure future experts are expertly trained? Acad Emerg Med. 2002;9(8):863-864. PubMed
93. Bodenham AR. Editorial II: Ultrasound imaging by anaesthetists: training and accreditation issues. Br J Anaesth. 2006;96(4):414-417. PubMed
94. Williamson JP, Twaddell SH, Lee YCG, et al. Thoracic ultrasound recognition of competence: A position paper of the Thoracic Society of Australia and New Zealand. Respirology. 2017;22(2):405-408. PubMed
95. Harrison G. Summative clinical competency assessment: a survey of ultrasound practitioners’ views. Ultrasound. 2015;23(1):11-17. PubMed
96. Evans LV, Morse JL, Hamann CJ, Osborne M, Lin Z, D'Onofrio G. The development of an independent rater system to assess residents' competence in invasive procedures. Acad Med. 2009;84(8):1135-1143. PubMed
97. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357(9260):945-949. PubMed
98. Arntfield RT. The utility of remote supervision with feedback as a method to deliver high-volume critical care ultrasound training. J Crit Care. 2015;30(2):441.e1-e6. PubMed
99. Akhtar S, Theodoro D, Gaspari R, et al. Resident training in emergency ultrasound: consensus recommendations from the 2008 Council of Emergency Residency Directors Conference. Acad Emerg Med. 2009;16:S32-S36. PubMed
100. Yu E. The assessment of technical skills in a cardiology training program: is the ITER sufficient? Can J Cardiol. 2000;16(4):457-462. PubMed
101. Todsen T, Tolsgaard MG, Olsen BH, et al. Reliable and valid assessment of point-of-care ultrasonography. Ann Surg. 2015;261(2):309-315. PubMed
102. Stein JC, Nobay F. Emergency department ultrasound credentialing: a sample policy and procedure. J Emerg Med. 2009;37(2):153-159. PubMed
103. Chen FM, Burstin H, Huntington J. The importance of clinical outcomes in medical education research. Med Educ. 2005;39(4):350-351. PubMed
104. Dressler DD, Pistoria MJ, Budnitz TL, McKean SCW, Amin AN. Core competencies in hospital medicine: development and methodology. J Hosp Med. 2006;1:48-56. PubMed
105. ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5(1):157-158. PubMed
106. Castillo J, Caruana CJ, Wainwright D. The changing concept of competence and categorisation of learning outcomes in Europe: Implications for the design of higher education radiography curricula at the European level. Radiography. 2011;17(3):230-234. 
107. Goldstein SR. Accreditation, certification: why all the confusion? Obstet Gynecol. 2007;110(6):1396-1398. PubMed
108. Moore CL. Credentialing and reimbursement in point-of-care ultrasound. Clin Pediatr Emerg Med. 2011;12(1):73-77. PubMed
109. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542-547. PubMed
110. Abuhamad AZ, Benacerraf BR, Woletz P, Burke BL. The accreditation of ultrasound practices: impact on compliance with minimum performance guidelines. J Ultrasound Med. 2004;23(8):1023-1029. PubMed

 

 

References

1. American Board of Internal Medicine. Policies and procedures for certification. Philadelphia: American Board of Internal Medicine; 2006.
2. Nichani S, Fitterman N, Lukela M, Crocker J; Society of Hospital Medicine. The Core Competencies in Hospital Medicine 2017 Revision. Section 2: Procedures. J Hosp Med. 2017;12(4 Suppl 1):S44-S54 PubMed
3. Lucas BP, Asbury JK, Franco-Sadud R. Training future hospitalists with simulators: a needed step toward accessible, expertly performed bedside procedures. J Hosp Med. 2009;4(7):395-396. PubMed
4. Schnobrich DJ, Gladding S, Olson APJ, Duran-Nelson A. Point-of-care ultrasound in internal medicine: a national survey of educational leadership. J Grad Med Educ. 2013;5(3):498-502. PubMed
5. Brown GM, Otremba M, Devine LA, Gray C, Millington SJ, Ma IW. Defining competencies for ultrasound-guided bedside procedures: consensus opinions from Canadian physicians. J Ultrasound Med. 2016;35(1):129-141. PubMed
6. Vaisman A, Cram P. Procedural competence among faculty in academic health centers: challenges and future directions. Acad Med. 2017;92(1):31-34. PubMed
7. Kreisman RD. With ED ultrasound, credentialing is at issue. ED Legal Letter. 2010;21:102-103. 
8. Goudie AM. Credentialing a new skill: what should the standard be for emergency department ultrasound in Australasia? Emerg Med Australas. 2010;22:263-264. PubMed
9. Maizel J, Guyomarc HL, Henon P, et al. Residents learning ultrasound-guided catheterization are not sufficiently skilled to use landmarks. Crit Care. 2014;18(1):R36. doi:10.1186/cc13741. PubMed
10. American College of Emergency Physicians. Ultrasound guidelines: emergency, point-of-care, and clinical ultrasound guidelines in medicine. Ann Emerg Med. 2017;69(5):e27-e54. PubMed
11. Amini R, Adhikari S, Fiorello A. Ultrasound competency assessment in emergency medicine residency programs. Acad Emerg Med. 2014;21(7):799-801. PubMed
12. Jensen T, Soni NJ, Tierney DM, Lucas BP. Hospital privileging practices for bedside procedures: a survey of hospitalist experts. J Hosp Med. 2017;12(10):836-839. PubMed
13. Chang W. Is hospitalist proficiency in bedside procedures in decline? The Hospitalist. 2012. http://www.the-hospitalist.org/hospitalist/article/125236/patient-safety/hospitalist-proficiency-bedside-procedures-decline. Accessed September 30, 2017.
14. Barsuk JH, Feinglass J, Kozmic SE, Hohmann SF, Ganger D, Wayne DB. Specialties Performing Paracentesis Procedures at University Hospitals: Implications for Training and Certification. J Hosp Med. 2014;9(3):162-168. PubMed
15. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Residents’ Procedural Experience Does Not Ensure Competence: A Research Synthesis. J Grad Med Educ. 2017;9(2):201-208. PubMed
16. Balshem H, Helfand M, Schunemann HJ, et al. GRADE guidelines: 3. Rating the quality of evidence. J Clin Epidemiol. 2011;64(4):401-406. PubMed
17. Guyatt GH, Oxman AD, Kunz R, et al. GRADE guidelines: 8. Rating the quality of evidence—indirectness. J Clin Epidemiol. 2011;64(12):1303-1310. PubMed
18. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63-S67. PubMed
19. Grover S, Currier PF, Elinoff JM, Mouchantaf KJ, Katz JT, McMahon GT. Development of a test to evaluate residents knowledge of medical procedures. J Hosp Med. 2009;4(7):430-432. PubMed
20. Millington SJ, Wong RY, Kassen BO, Roberts JM, Ma IWY. Improving internal medicine residents’ performance, knowledge, and confidence in central venous catheterization using simulators. J Hosp Med. 2009;4(7):410-416. PubMed
21. Lenchus JD, Carvalho CM, Ferreri K, et al. Filling the void: defining invasive bedside procedural competency for internal medicine residents. J Grad Med Educ. 2013;5(4):605-612.
22. Heegeman DJ, Kieke B Jr. Learning curves, credentialing, and the need for ultrasound fellowships. Acad Emerg Med. 2003;10:404-405.
23. Jang TB, Ruggeri W, Dyne P, Kaji AH. The learning curve of resident physicians using emergency ultrasonography for cholelithiasis and cholecystitis. Acad Emerg Med. 2010;17(11):1247-1252.
24. Akhtar MI, Hamid M. Ultrasound guided central venous access; a review of literature. Anaesth Pain Intensive Care. 2015;19:317-322.
25. Bahl A, Yunker A. Assessment of the numbers-based model for evaluation of resident competency in emergency ultrasound core applications. J Emerg Med Trauma Acute Care. 2015;2015(5). doi:10.5339/jemtac.2015.5.
26. Cazes N, Desmots F, Geffroy Y, Renard A, Leyral J, Chaumoitre K. Emergency ultrasound: a prospective study on sufficient adequate training for military doctors. Diagn Interv Imaging. 2013;94(11):1109-1115.
27. Arntfield RT, Millington SJ, Ainsworth CD, et al. Canadian recommendations for critical care ultrasound training and competency for the Canadian Critical Care Society. Can Respir J. 2014;21(16):341-345.
28. Bolsin S, Colson M. The use of the Cusum technique in the assessment of trainee competence in new procedures. Int J Qual Health Care. 2000;12(5):433-438.
29. de Oliveira Filho GR, Helayel PE, da Conceição DB, Garzel IS, Pavei P, Ceccon MS. Learning curves and mathematical models for interventional ultrasound basic skills. Anesth Analg. 2008;106(2):568-573.
30. Starkie T, Drake EJ. Assessment of procedural skills training and performance in anesthesia using cumulative sum analysis (cusum). Can J Anaesth. 2013;60(12):1228-1239.
31. Tierney D. Competency cut-point identification derived from a mastery learning cohort approach: a hybrid model. Ultrasound Med Biol. 2015;41:S19.
32. Rankin JH, Elkhunovich MA, Rangarajan V, Chilstrom M, Mailhot T. Learning curves for ultrasound assessment of lumbar puncture insertion sites: when is competency established? J Emerg Med. 2016;51(1):55-62.
33. Klasko SK, Cummings RV, Glazerman LR. Resident data collection: do the numbers add up? Am J Obstet Gynecol. 1995;172(4 Pt 1):1312-1316.
34. Tierney D. Development & analysis of a mobile POCUS tracking tool. Ultrasound Med Biol. 2015;41(suppl 4):S31.
35. Sethi MV, Zimmer J, Ure B, Lacher M. Prospective assessment of complications on a daily basis is essential to determine morbidity and mortality in routine pediatric surgery. J Pediatr Surg. 2016;51(4):630-633.
36. Fisher JC, Kuenzler KA, Tomita SS, Sinha P, Shah P, Ginsburg HB. Increased capture of pediatric surgical complications utilizing a novel case-log web application to enhance quality improvement. J Pediatr Surg. 2017;52(1):166-171.
37. Rethans JJ, Norcini JJ, Barón-Maldonado M, et al. The relationship between competence and performance: implications for assessing practice performance. Med Educ. 2002;36(10):901-909.
38. Duncan DR, Morgenthaler TI, Ryu JH, Daniels CE. Reducing iatrogenic risk in thoracentesis: establishing best practice via experiential training in a zero-risk environment. Chest. 2009;135(5):1315-1320.
39. Society of Critical Care Medicine Ultrasound Certification Task Force. Recommendations for achieving and maintaining competence and credentialing in critical care ultrasound with focused cardiac ultrasound and advanced critical care echocardiography. http://journals.lww.com/ccmjournal/Documents/Critical%20Care%20Ultrasound.pdf. Published 2013. Accessed February 2, 2017.
40. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002;77(5):361-367.
41. Clark EG, Paparello JJ, Wayne DB, et al. Use of a national continuing medical education meeting to provide simulation-based training in temporary hemodialysis catheter insertion skills: a pre-test post-test study. Can J Kidney Health Dis. 2014;1:25-31.
42. Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation-based education with mastery learning improves residents' lumbar puncture skills. Neurology. 2012;79(2):132-137.
43. Barsuk JH, McGaghie WC, Cohen ER, O'Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37(10):2697-2701.
44. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094-1102.
45. Shah J, Darzi A. Surgical skills assessment: an ongoing debate. BJU Int. 2001;88(7):655-660.
46. Lamperti M, Bodenham AR, Pittiruti M, et al. International evidence-based recommendations on ultrasound-guided vascular access. Intensive Care Med. 2012;38(7):1105-1117.
47. Tolsgaard MG, Todsen T, Sorensen JL, et al. International multispecialty consensus on how to evaluate ultrasound competence: a Delphi consensus survey. PLoS One. 2013;8(2):e57687. doi:10.1371/journal.pone.0057687.
48. Moureau N, Lamperti M, Kelly LJ, et al. Evidence-based consensus on the insertion of central venous access devices: definition of minimal requirements for training. Br J Anaesth. 2013;110(3):347-356.
49. Feldman LS, Hagarty S, Ghitulescu G, Stanbridge D, Fried GM. Relationship between objective assessment of technical skills and subjective in-training evaluations in surgical residents. J Am Coll Surg. 2004;198(1):105-110.
50. Baker S, Willey B, Mitchell C. The attempt to standardize technical and analytic competence in sonography education. J Diagn Med Sonogr. 2011;27(5):203-211.
51. Tolsgaard MG, Ringsted C, Dreisler E, et al. Reliable and valid assessment of ultrasound operator competence in obstetrics and gynecology. Ultrasound Obstet Gynecol. 2014;43(4):437-443.
52. Rice J, Crichlow A, Baker M, et al. An assessment tool for the placement of ultrasound-guided peripheral intravenous access. J Grad Med Educ. 2016;8(2):202-207.
53. Hartman N, Wittler M, Askew K, Hiestand B, Manthey D. Validation of a performance checklist for ultrasound-guided internal jugular central lines for use in procedural instruction and assessment. Postgrad Med J. 2017;93(1096):67-70.
54. Primdahl SC, Todsen T, Clemmesen L, et al. Rating scale for the assessment of competence in ultrasound-guided peripheral vascular access—a Delphi consensus study. J Vasc Access. 2016;17(5):440-445.
55. Berg D, Berg K, Riesenberg LA, et al. The development of a validated checklist for thoracentesis: preliminary results. Am J Med Qual. 2013;28(3):220-226.
56. Berg K, Riesenberg LA, Berg D, et al. The development of a validated checklist for radial arterial line placement: preliminary results. Am J Med Qual. 2014;29(3):242-246.
57. Walzak A, Bacchus M, Schaefer MP, et al. Diagnosing technical competence in six bedside procedures: comparing checklists and a global rating scale in the assessment of resident performance. Acad Med. 2015;90(8):1100-1108.
58. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for femoral venous catheterization: preliminary results. Am J Med Qual. 2014;29(5):445-450.
59. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for paracentesis: preliminary results. Am J Med Qual. 2013;28(3):227-231.
60. Huang GC, Newman LR, Schwartzstein RM, et al. Procedural competence in internal medicine residents: validity of a central venous catheter insertion assessment instrument. Acad Med. 2009;84(8):1127-1134.
61. Salamonsen M, McGrath D, Steiler G, et al. A new instrument to assess physician skill at thoracic ultrasound, including pleural effusion markup. Chest. 2013;144(3):930-934.
62. Boniface K, Yarris LM. Emergency ultrasound: leveling the training and assessment landscape. Acad Emerg Med. 2014;21(7):803-805.
63. Boyle E, O'Keeffe D, Naughton P, Hill A, McDonnell C, Moneley D. The importance of expert feedback during endovascular simulator training. J Vasc Surg. 2011;54(1):240-248.e1.
64. Langhan TS, Rigby IJ, Walker IW, Howes D, Donnon T, Lord JA. Simulation-based training in critical resuscitation procedures improves residents' competence. CJEM. 2009;11(6):535-539.
65. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4(7):397-403.
66. Lenchus JD. End of the "see one, do one, teach one" era: the next generation of invasive bedside procedural instruction. J Am Osteopath Assoc. 2010;110(6):340-346.
67. Barsuk JH, Cohen ER, Vozenilek JA, O'Connor LM, McGaghie WC, Wayne DB. Simulation-based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012;4(1):23-27.
68. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86(6):706-711.
69. Ross JG. Simulation and psychomotor skill acquisition: a review of the literature. Clin Simul Nurs. 2012;8(9):e429-e435.
70. Barsuk JH, Cohen ER, Potts S, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014;23(9):749-756.
71. McSparron JI, Michaud GC, Gordan PL, et al. Simulation for skills-based education in pulmonary and critical care medicine. Ann Am Thorac Soc. 2015;12(4):579-586.
72. Kneebone RL, Scott W, Darzi A, Horrocks M. Simulation and clinical practice: strengthening the relationship. Med Educ. 2004;38(10):1095-1102.
73. Mema B, Harris I. The barriers and facilitators to transfer of ultrasound-guided central venous line skills from simulation to practice: exploring perceptions of learners and supervisors. Teach Learn Med. 2016;28(2):115-124.
74. Castanelli DJ. The rise of simulation in technical skills teaching and the implications for training novices in anaesthesia. Anaesth Intensive Care. 2009;37(6):903-910.
75. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014;48(4):375-385.
76. Langlois SLP. Focused ultrasound training for clinicians. Crit Care Med. 2007;35(5 suppl):S138-S143.
77. Price S, Via G, Sloth E, et al. Echocardiography practice, training and accreditation in the intensive care: document for the World Interactive Network Focused on Critical Ultrasound (WINFOCUS). Cardiovasc Ultrasound. 2008;6:49-83.
78. Blehar DJ, Barton B, Gaspari RJ. Learning curves in emergency ultrasound education. Acad Emerg Med. 2015;22(5):574-582.
79. Ault MJ, Rosen BT, Ault B. The use of tissue models for vascular access training. Phase I of the procedural patient safety initiative. J Gen Intern Med. 2006;21(5):514-517.
80. Barsuk JH, Cohen ER, McGaghie WC, Wayne DB. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med. 2010;85(10 Suppl):S9-S12.
81. Sliman S, Amundson S, Shaw D, Phan JN, Waalen J, Kimura B. Recently-acquired cardiac ultrasound skills are rapidly lost when not used: implications for competency in physician imaging. J Am Coll Cardiol. 2016;67(13S):1569.
82. Kessler CS, Leone KA. The current state of core competency assessment in emergency medicine and a future research agenda: recommendations of the working group on assessment of observable learner performance. Acad Emerg Med. 2012;19(12):1354-1359.
83. Chang A, Schyve PM, Croteau RJ, O'Leary DS, Loeb JM. The JCAHO patient safety event taxonomy: a standardized terminology and classification schema for near misses and adverse events. Int J Qual Health Care. 2005;17(2):95-105.
84. Sawyer T, White M, Zaveri P, et al. Learn, see, practice, prove, do, maintain: an evidence-based pedagogical framework for procedural skill training in medicine. Acad Med. 2015;90(8):1025-1033.
85. Das D, Kapoor M, Brown C, Ndubuisi A, Gupta S. Current status of emergency department attending physician ultrasound credentialing and quality assurance in the United States. Crit Ultrasound J. 2016;8(1):6-12.
86. Ndubuisi AK, Gupta S, Brown C, Das D. Current status and future issues in emergency department attending physician ultrasound credentialing. Ann Emerg Med. 2014;64(45):S27-S28.
87. Tandy TK, Hoffenberg S. Emergency department ultrasound services by emergency physicians: model for gaining hospital approval. Ann Emerg Med. 1997;29(3):367-374.
88. Lewiss RE, Saul T, Del Rios M. Acquiring credentials in bedside ultrasound: a cross-sectional survey. BMJ Open. 2013;3:e003502. doi:10.1136/bmjopen-2013-003502.
89. Lanoix R. Credentialing issues in emergency ultrasonography. Emerg Med Clin North Am. 1997;15(4):913-920.
90. Scalea T, Rodriguez A, Chiu WC, et al. Focused assessment with sonography for trauma (FAST): results from an international consensus conference. J Trauma. 1999;46(3):466-472.
91. Hertzberg BS, Kliewer MA, Bowie JD, et al. Physician training requirements in sonography: how many cases are needed for competence? AJR Am J Roentgenol. 2000;174(5):1221-1227.
92. Blaivas M, Theodoro DL, Sierzenski P. Proliferation of ultrasound fellowships in emergency medicine: how do we ensure future experts are expertly trained? Acad Emerg Med. 2002;9(8):863-864.
93. Bodenham AR. Editorial II: Ultrasound imaging by anaesthetists: training and accreditation issues. Br J Anaesth. 2006;96(4):414-417.
94. Williamson JP, Twaddell SH, Lee YCG, et al. Thoracic ultrasound recognition of competence: a position paper of the Thoracic Society of Australia and New Zealand. Respirology. 2017;22(2):405-408.
95. Harrison G. Summative clinical competency assessment: a survey of ultrasound practitioners' views. Ultrasound. 2015;23(1):11-17.
96. Evans LV, Morse JL, Hamann CJ, Osborne M, Lin Z, D'Onofrio G. The development of an independent rater system to assess residents' competence in invasive procedures. Acad Med. 2009;84(8):1135-1143.
97. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357(9260):945-949.
98. Arntfield RT. The utility of remote supervision with feedback as a method to deliver high-volume critical care ultrasound training. J Crit Care. 2015;30(2):441.e1-e6.
99. Akhtar S, Theodoro D, Gaspari R, et al. Resident training in emergency ultrasound: consensus recommendations from the 2008 Council of Emergency Residency Directors Conference. Acad Emerg Med. 2009;16:S32-S36.
100. Yu E. The assessment of technical skills in a cardiology training program: is the ITER sufficient? Can J Cardiol. 2000;16(4):457-462.
101. Todsen T, Tolsgaard MG, Olsen BH, et al. Reliable and valid assessment of point-of-care ultrasonography. Ann Surg. 2015;261(2):309-315.
102. Stein JC, Nobay F. Emergency department ultrasound credentialing: a sample policy and procedure. J Emerg Med. 2009;37(2):153-159.
103. Chen FM, Burstin H, Huntington J. The importance of clinical outcomes in medical education research. Med Educ. 2005;39(4):350-351.
104. Dressler DD, Pistoria MJ, Budnitz TL, McKean SCW, Amin AN. Core competencies in hospital medicine: development and methodology. J Hosp Med. 2006;1:48-56.
105. ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5(1):157-158.
106. Castillo J, Caruana CJ, Wainwright D. The changing concept of competence and categorisation of learning outcomes in Europe: implications for the design of higher education radiography curricula at the European level. Radiography. 2011;17(3):230-234.
107. Goldstein SR. Accreditation, certification: why all the confusion? Obstet Gynecol. 2007;110(6):1396-1398.
108. Moore CL. Credentialing and reimbursement in point-of-care ultrasound. Clin Pediatr Emerg Med. 2011;12(1):73-77.
109. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542-547.
110. Abuhamad AZ, Benacerraf BR, Woletz P, Burke BL. The accreditation of ultrasound practices: impact on compliance with minimum performance guidelines. J Ultrasound Med. 2004;23(8):1023-1029.
Journal of Hospital Medicine. 2018;13(2):126-135. Published online first January 17, 2018.
© 2018 Society of Hospital Medicine
Correspondence: Brian P. Lucas, MD, MS, 215 N Main Street, White River Junction, VT; Telephone: 802-295-9363, extension 4314; Fax: 802-296-6325; E-mail: brian.p.lucas@dartmouth.edu.