Point-of-Care Ultrasound for Hospitalists: A Position Statement of the Society of Hospital Medicine


Many hospitalists incorporate point-of-care ultrasound (POCUS) into their daily practice because it adds value to their bedside evaluation of patients. However, standards for training and assessing hospitalists in POCUS have not yet been established. Other acute care specialties, including emergency medicine and critical care medicine, have already incorporated POCUS into their graduate medical education training programs, but most internal medicine residency programs are only beginning to provide POCUS training.1

Several features distinguish POCUS from comprehensive ultrasound examinations. First, POCUS is designed to answer focused questions, whereas comprehensive ultrasound examinations evaluate all organs in an anatomical region; for example, an abdominal POCUS exam may evaluate only for presence or absence of intraperitoneal free fluid, whereas a comprehensive examination of the right upper quadrant will evaluate the liver, gallbladder, and biliary ducts. Second, POCUS examinations are generally performed by the same clinician who generates the relevant clinical question to answer with POCUS and ultimately integrates the findings into the patient’s care.2 By contrast, comprehensive ultrasound examinations involve multiple providers and steps: a clinician generates a relevant clinical question and requests an ultrasound examination that is acquired by a sonographer, interpreted by a radiologist, and reported back to the requesting clinician. Third, POCUS is often used to evaluate multiple body systems. For example, to evaluate a patient with undifferentiated hypotension, a multisystem POCUS examination of the heart, inferior vena cava, lungs, abdomen, and lower extremity veins is typically performed. Finally, POCUS examinations can be performed serially to investigate changes in clinical status or evaluate response to therapy, such as monitoring the heart, lungs, and inferior vena cava during fluid resuscitation.

The purpose of this position statement is to inform a broad audience about how hospitalists are using diagnostic and procedural applications of POCUS. This position statement does not mandate that hospitalists use POCUS. Rather, it is intended to provide guidance on the safe and effective use of POCUS by the hospitalists who use it and the administrators who oversee its use. We discuss POCUS (1) applications, (2) training, (3) assessments, and (4) program management. This position statement was reviewed and approved by the Society of Hospital Medicine (SHM) Executive Committee in March 2018.


APPLICATIONS

Common diagnostic and procedural applications of POCUS used by hospitalists are listed in Table 1. Selected evidence supporting the use of these applications is described in the supplementary online content (Appendices 1–8 available at http://journalofhospitalmedicine.com) and SHM position statements on specific ultrasound-guided bedside procedures.3,4 Additional applications not listed in Table 1 that may be performed by some hospitalists include assessment of the eyes, stomach, bowels, ovaries, pregnancy, and testicles, as well as performance of regional anesthesia. Moreover, hospitalists caring for pediatric and adolescent patients may use additional applications besides those listed here. Currently, many hospitalists already perform more complex and sophisticated POCUS examinations than those listed in Table 1. The scope of POCUS use by hospitalists continues to expand, and this position statement should not restrict that expansion.

As outlined in our earlier position statements,3,4 ultrasound guidance lowers complication rates and increases success rates of invasive bedside procedures. Diagnostic POCUS can guide clinical decision making prior to bedside procedures. For instance, hospitalists may use POCUS to assess the size and character of a pleural effusion to help determine the most appropriate management strategy: observation, medical treatment, thoracentesis, chest tube placement, or surgical therapy. Furthermore, diagnostic POCUS can be used to rapidly assess for immediate postprocedural complications, such as pneumothorax, or if the patient develops new symptoms.

TRAINING

Basic Knowledge

Basic knowledge includes fundamentals of ultrasound physics; safety;5 anatomy; physiology; and device operation, including maintenance and cleaning. Basic knowledge can be taught by multiple methods, including live or recorded lectures, online modules, or directed readings.

Image Acquisition

Training should occur across multiple types of patients (eg, obese, cachectic, postsurgical) and clinical settings (eg, intensive care unit, general medicine wards, emergency department) when available. Training is largely hands-on because the relevant skills involve integration of 3D anatomy with spatial manipulation, hand-eye coordination, and fine motor movements. Virtual reality ultrasound simulators may accelerate mastery, particularly for cardiac image acquisition, and expose learners to standardized sets of pathologic findings. Real-time bedside feedback on image acquisition is ideal because understanding how ultrasound probe manipulation affects the images acquired is essential to learning.

Image Interpretation

Training in image interpretation relies on visual pattern recognition of normal and abnormal findings. Therefore, the spectrum of normal and abnormal findings reviewed during training should be broad, and learners should maintain a log of the abnormalities they have identified. Real-time feedback at the bedside is ideal because of the close connection between image acquisition and interpretation. Image interpretation can also be taught through didactic sessions, image review sessions, or review of teaching files with annotated images.

Clinical Integration

Learners must interpret and integrate image findings with other clinical data, taking into account the image quality, patient characteristics, and changing physiology. Clinical integration should be taught by instructors who share a clinical background similar to that of the learners. Although sonographers are well suited to teach image acquisition, they should not be the sole instructors teaching hospitalists how to integrate ultrasound findings into clinical decision making. Likewise, emphasis should be placed on the appropriate use of POCUS within a provider’s skill set. Learners must appreciate the clinical significance of POCUS findings, including recognition of incidental findings that may require further workup. Supplemental training in clinical integration can occur through didactics that include complex patient scenarios.


Pathways

Clinical competency can be achieved through training that adheres to five criteria. First, the training environment should be similar to where the trainee will practice. Second, training and feedback should occur in real time. Third, specific applications should be taught rather than broad training in “hospitalist POCUS.” Each application requires unique skills and knowledge, including image acquisition pitfalls and artifacts. Fourth, clinical competence must be demonstrated, not assumed from experience alone. Fifth, once competency is achieved, continued education and feedback are necessary to ensure it is maintained.

Residency-based POCUS training pathways can best fulfill these criteria. They may eventually become commonplace, but until then, alternative pathways must exist for hospitalist providers who are already in practice. Such pathways have three important attributes. First, administrators’ expectations about learners’ clinical productivity must be realistically, but only temporarily, relaxed; otherwise, competing demands on time will likely overwhelm learners and subvert training. Second, training should begin through a local or national hands-on training program. The SHM POCUS certificate program consolidates training for common diagnostic POCUS applications for hospitalists.6 Other medical societies offer training for their respective clinical specialties.7 Third, once basic POCUS training has begun, longitudinal training should continue, ideally with a local hospitalist POCUS expert.

In some settings, a subgroup of hospitalists may not desire, or be able to achieve, competency in the manual skills of POCUS image acquisition. Nevertheless, hospitalists may still find value in understanding POCUS nomenclature, image pattern recognition, and the evidence and pitfalls behind clinical integration of specific POCUS findings. This subset of POCUS skills allows hospitalists to communicate effectively with and understand the clinical decisions made by their colleagues who are competent in POCUS use.

The minimum skills a hospitalist should possess to serve as a POCUS trainer include proficiency in the basic knowledge, image acquisition, image interpretation, and clinical integration of the POCUS applications being taught; effectiveness as a hands-on instructor of image acquisition skills; and an in-depth understanding of common POCUS pitfalls and limitations.

ASSESSMENTS

Assessment methods for POCUS can include the following: knowledge-based questions, image acquisition using task-specific checklists on human or simulation models, image interpretation using a series of videos or still images with normal and abnormal findings, clinical integration using “next best step” in a multiple choice format with POCUS images, and simulation-based clinical scenarios. Assessment methods should be aligned with local availability of resources and trainers.

Basic Knowledge

Basic knowledge can be assessed via multiple choice questions assessing knowledge of ultrasound physics, image optimization, relevant anatomy, and limitations of POCUS imaging. Basic knowledge lies primarily in the cognitive domain and does not assess manual skills.

Image Acquisition

Image acquisition can be assessed by observation and rating of image quality. Where resources allow, assessment of image acquisition is likely best done through a combination of developing an image portfolio with a minimum number of high-quality images and direct observation of image acquisition by an expert. Various programs have used minimum numbers of acquired images to help define competence in image acquisition skills.6–8 Although minimums may be a necessary step toward competence, using them as the sole means to determine competence does not account for variable learning curves.9 As with other manual skills in hospital medicine, such as ultrasound-guided bedside procedures, minimum numbers are best used as a starting point for assessments.3,10 In this regard, portfolio development with meticulous attention to the gain, depth, and proper tomographic plane of images can monitor a hospitalist’s progress toward competence by providing objective assessments and feedback. Simulation may also be used because it allows assessment of image acquisition skills and provides an opportunity for real-time feedback, similar to direct observation but without actual patients.


Image Interpretation

Image interpretation is best assessed by an expert observing the learner at the bedside; however, when bedside assessment is not possible, image interpretation skills may be assessed using multiple choice or free-text interpretation of archived ultrasound images with normal and abnormal findings. This assessment is often incorporated into the portfolio development portion of a training program, as learners can submit their image interpretation along with the video clip. Both normal and abnormal images can be used to assess anatomic recognition and interpretation. Emphasis should be placed on determining when an image is suboptimal for diagnosis (eg, incomplete exam or poor-quality images). Quality assurance programs should incorporate structured feedback sessions.

Clinical Integration

Assessment of clinical integration can be completed through case scenarios, often delivered via computer-based assessment, that test knowledge, interpretation of images, and integration of findings into clinical decision making. Assessments should combine specific POCUS applications to evaluate common clinical problems in hospital medicine, such as undifferentiated hypotension and dyspnea. High-fidelity simulators can be used to blend clinical case scenarios with image acquisition, image interpretation, and clinical integration. When feasible, comprehensive feedback on how providers acquire, interpret, and apply ultrasound at the bedside is likely the best mechanism to assess clinical integration. This process can be done with a hospitalist’s own patients.

General Assessment

A general assessment that includes a summative knowledge assessment and a hands-on skills assessment using task-specific checklists can be performed upon completion of training. A high-fidelity simulator with dynamic or virtual anatomy can provide reproducible, standardized assessments with variation in the type and difficulty of cases. When available, we encourage dynamic assessments on actual patients with both normal and abnormal ultrasound findings because simulated patient scenarios have limitations, even with high-fidelity simulators. Programs should use both formative and summative assessments for evaluation. Quantitative scoring systems using checklists are likely the best framework.11,12

CERTIFICATES AND CERTIFICATION

A certificate of completion is proof of a provider’s participation in an educational activity; it does not equate with competency, though it may be a step toward it. Most POCUS training workshops and short courses provide certificates of completion. Certification of competency is an attestation of a hospitalist’s basic competence within a defined scope of practice (Table 2).13 However, without longitudinal supervision and feedback, skills can decay; therefore, we recommend a longitudinal training program that provides mentored feedback and incorporates periodic competency assessments. At present, no national board certification in POCUS is available to grant external certification of competency for hospitalists.

External Certificate

Certificates of completion can be granted externally by a national organization. One external certificate of completion designed for hospitalists is the POCUS Certificate of Completion offered by SHM in collaboration with CHEST.6 This certificate program provides regional training options and longitudinal portfolio development. Other external certificates are also available to hospitalists.7,14,15

Most hospitalists are boarded by the American Board of Internal Medicine (ABIM) or the American Board of Family Medicine. These boards do not yet include certification of competency in POCUS. Other specialty boards do include POCUS competency; in emergency medicine, for example, completion of an accredited residency training program and national board certification includes POCUS competency.


Internal Certificate

There are a few examples of successful local institutional programs that have provided internal certificates of competency.12,16 Competency assessments require significant resources, including time investment by both faculty and learners. Ongoing evaluation of competency should be based on quality assurance processes.

Credentialing and Privileging

In 1999, the American Medical Association (AMA) House of Delegates passed a resolution (AMA HR. 802) recommending that hospitals follow specialty-specific guidelines for privileging decisions related to POCUS use.17 The resolution included the statement that “ultrasound imaging is within the scope of practice of appropriately trained physicians.”

Some institutions have begun to rely on a combination of internal and external certificate programs to grant privileges to hospitalists.10 Although specific privileges for POCUS may not be required in some hospitals, some institutions may require certification of training and assessments prior to granting permission to use POCUS.

Hospitalist programs are encouraged to evaluate ongoing POCUS use by their providers after granting initial permission. If privileging is instituted by a hospital, hospitalists must play a significant role in determining the requirements for privileging and ongoing maintenance of skills.

Maintenance of Skills

All medical skills can decay with disuse, including those associated with POCUS.12,18 Thus, POCUS users should continue using POCUS regularly in clinical practice and participate in POCUS continuing medical education activities, ideally with ongoing assessments. Maintenance of skills may be confirmed through routine participation in a quality assurance program.

PROGRAM MANAGEMENT

Use of POCUS in hospital medicine has unique considerations, and hospitalists should be integrally involved in decision making surrounding institutional POCUS program management. Appointing a dedicated POCUS director can help a program succeed.8

Equipment and Image Archiving

Several factors are important to consider when selecting an ultrasound machine: portability, screen size, and ease of use; integration with the electronic medical record and options for image archiving; manufacturer’s service plan, including technical and clinical support; and compliance with local infection control policies. The ability to easily archive and retrieve images is essential for quality assurance, continuing education, institutional quality improvement, documentation, and reimbursement. In certain scenarios, image archiving may not be possible (such as with personal handheld devices or in emergency situations) or necessary (such as with frequent serial examinations during fluid resuscitation). An image archive is ideally linked to reports, orders, and billing software.10,19 If such linkages are not feasible, parallel external storage that complies with regulatory standards (ie, HIPAA compliance) may be suitable.20

Documentation and Billing

Components of documentation include the indication and type of ultrasound examination performed, date and time of the examination, patient identifying information, name of provider(s) acquiring and interpreting the images, specific scanning protocols used, patient position, probe used, and findings. Documentation can occur through a standalone note or as part of another note, such as a progress note. Whenever possible, documentation should be timely to facilitate communication with other providers.

Billing is supported through the AMA Current Procedural Terminology codes for “focused” or “limited” ultrasound examinations (Appendix 9). The following three criteria must be satisfied for billing. First, images must be permanently stored. Specific requirements vary by insurance policy, though current practice suggests a minimum of one image demonstrating relevant anatomy and pathology for the ultrasound examination coded. For ultrasound-guided procedures that require needle insertion, images should be captured at the point of interest, and a procedure note should reflect that the needle was guided and visualized under ultrasound.21 Second, proper documentation must be entered in the medical record. Third, local institutional privileges for POCUS must be considered. Although privileges are not required to bill, some hospitals or payers may require them.


Quality Assurance

Published guidelines on quality assurance in POCUS are available from different specialty organizations, including emergency medicine, pediatric emergency medicine, critical care, anesthesiology, obstetrics, and cardiology.8,22–28 Quality assurance is aimed at ensuring that physicians maintain basic competency in using POCUS to influence bedside decisions.

Quality assurance should be carried out by an individual or committee with expertise in POCUS. Multidisciplinary quality assurance programs in which hospital medicine providers work collaboratively with other POCUS providers have been shown to be highly effective.10 Oversight includes ensuring that providers using POCUS are appropriately trained,10,22,28 are using the equipment correctly,8,26,28 and are documenting properly. Some programs have implemented mechanisms to review and provide feedback on image acquisition, interpretation, and clinical integration.8,10 Other programs have compared POCUS findings with referral studies, such as comprehensive ultrasound examinations.

CONCLUSIONS

Practicing hospitalists must continue to collaborate with their institutions to build POCUS capabilities. In particular, they must work with their local privileging body to determine what credentials are required. The distinction between certificates of completion and certificates of competency, including whether those certificates are internal or external, is important in the credentialing process.

External certificates of competency are currently unavailable for most practicing hospitalists because ABIM certification does not include POCUS-related competencies. As internal medicine residency training programs begin to adopt POCUS training and certification into their educational curricula, we foresee a need to update the ABIM Policies and Procedures for Certification. Until then, we recommend that certificates of competency be defined and granted internally by local hospitalist groups.

Given the many advantages of POCUS over traditional tools, we anticipate its increasing implementation among hospitalists in the future. As with all medical technology, its role in clinical care should be continuously reexamined and redefined through health services research. Such information will be useful in developing practice guidelines, educational curricula, and training standards.

Acknowledgments

The authors would like to thank all members that participated in the discussion and finalization of this position statement during the Point-of-care Ultrasound Faculty Retreat at the 2018 Society of Hospital Medicine Annual Conference: Saaid Abdel-Ghani, Brandon Boesch, Joel Cho, Ria Dancel, Renee Dversdal, Ricardo Franco-Sadud, Benjamin Galen, Trevor P. Jensen, Mohit Jindal, Gordon Johnson, Linda M. Kurian, Gigi Liu, Charles M. LoPresti, Brian P. Lucas, Venkat Kalidindi, Benji Matthews, Anna Maw, Gregory Mints, Kreegan Reierson, Gerard Salame, Richard Schildhouse, Daniel Schnobrich, Nilam Soni, Kirk Spencer, Hiromizu Takahashi, David M. Tierney, Tanping Wong, and Toru Yamada.

References

1. Schnobrich DJ, Mathews BK, Trappey BE, Muthyala BK, Olson APJ. Entrusting internal medicine residents to use point of care ultrasound: Towards improved assessment and supervision. Med Teach. 2018:1-6. doi:10.1080/0142159X.2018.1457210.
2. Soni NJ, Lucas BP. Diagnostic point-of-care ultrasound for hospitalists. J Hosp Med. 2015;10(2):120-124. doi:10.1002/jhm.2285.
3. Lucas BP, Tierney DM, Jensen TP, et al. Credentialing of hospitalists in ultrasound-guided bedside procedures: a position statement of the society of hospital medicine. J Hosp Med. 2018;13(2):117-125. doi:10.12788/jhm.2917.
4. Dancel R, Schnobrich D, Puri N, et al. Recommendations on the use of ultrasound guidance for adult thoracentesis: a position statement of the society of hospital medicine. J Hosp Med. 2018;13(2):126-135. doi:10.12788/jhm.2940.
5. National Council on Radiation Protection and Measurements. Implementation of the Principle of As Low As Reasonably Achievable (ALARA) for Medical and Dental Personnel; 1990.
6. Society of Hospital Medicine. Point of Care Ultrasound course: https://www.hospitalmedicine.org/clinical-topics/ultrasonography-cert/. Accessed February 6, 2018.
7. Critical Care Ultrasonography Certificate of Completion Program. CHEST. American College of Chest Physicians. http://www.chestnet.org/Education/Advanced-Clinical-Training/Certificate-of-Completion-Program/Critical-Care-Ultrasonography. Accessed February 6, 2018.
8. American College of Emergency Physicians Policy Statement: Emergency Ultrasound Guidelines. 2016. https://www.acep.org/Clinical---Practice-Management/ACEP-Ultrasound-Guidelines/. Accessed February 6, 2018.
9. Blehar DJ, Barton B, Gaspari RJ. Learning curves in emergency ultrasound education. Acad Emerg Med. 2015;22(5):574-582. doi:10.1111/acem.12653.
10. Mathews BK, Zwank M. Hospital medicine point of care ultrasound credentialing: an example protocol. J Hosp Med. 2017;12(9):767-772. doi:10.12788/jhm.2809.
11. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4(7):397-403. doi:10.1002/jhm.468.
12. Mathews BK, Reierson K, Vuong K, et al. The design and evaluation of the Comprehensive Hospitalist Assessment and Mentorship with Portfolios (CHAMP) ultrasound program. J Hosp Med. 2018;13(8):544-550. doi:10.12788/jhm.2938.
13. Soni NJ, Tierney DM, Jensen TP, Lucas BP. Certification of point-of-care ultrasound competency. J Hosp Med. 2017;12(9):775-776. doi:10.12788/jhm.2812.
14. Ultrasound Certification for Physicians. Alliance for Physician Certification and Advancement. APCA. https://apca.org/. Accessed February 6, 2018.
15. National Board of Echocardiography, Inc. https://www.echoboards.org/EchoBoards/News/2019_Adult_Critical_Care_Echocardiography_Exam.aspx. Accessed June 18, 2018.
16. Tierney DM. Internal Medicine Bedside Ultrasound Program (IMBUS). Abbott Northwestern. http://imbus.anwresidency.com/index.html. Accessed February 6, 2018.
17. American Medical Association House of Delegates Resolution H-230.960: Privileging for Ultrasound Imaging. Resolution 802. Policy Finder Website. http://search0.ama-assn.org/search/pfonline. Published 1999. Accessed February 18, 2018.
18. Kelm D, Ratelle J, Azeem N, et al. Longitudinal ultrasound curriculum improves long-term retention among internal medicine residents. J Grad Med Educ. 2015;7(3):454-457. doi:10.4300/JGME-14-00284.1.
19. Flannigan MJ, Adhikari S. Point-of-care ultrasound work flow innovation: impact on documentation and billing. J Ultrasound Med. 2017;36(12):2467-2474. doi:10.1002/jum.14284.
20. Emergency Ultrasound: Workflow White Paper. https://www.acep.org/uploadedFiles/ACEP/memberCenter/SectionsofMembership/ultra/Workflow%20White%20Paper.pdf. Published 2013. Accessed February 18, 2018.
21. Ultrasound Coding and Reimbursement Document 2009. Emergency Ultrasound Section. American College of Emergency Physicians. http://emergencyultrasoundteaching.com/assets/2009_coding_update.pdf. Published 2009. Accessed February 18, 2018.
22. Mayo PH, Beaulieu Y, Doelken P, et al. American College of Chest Physicians/La Societe de Reanimation de Langue Francaise statement on competence in critical care ultrasonography. Chest. 2009;135(4):1050-1060. doi:10.1378/chest.08-2305.
23. Frankel HL, Kirkpatrick AW, Elbarbary M, et al. Guidelines for the appropriate use of bedside general and cardiac ultrasonography in the evaluation of critically ill patients-part I: general ultrasonography. Crit Care Med. 2015;43(11):2479-2502. doi:10.1097/ccm.0000000000001216.
24. Levitov A, Frankel HL, Blaivas M, et al. Guidelines for the appropriate use of bedside general and cardiac ultrasonography in the evaluation of critically ill patients-part ii: cardiac ultrasonography. Crit Care Med. 2016;44(6):1206-1227. doi:10.1097/ccm.0000000000001847.
25. ACR–ACOG–AIUM–SRU Practice Parameter for the Performance of Obstetrical Ultrasound. https://www.acr.org/-/media/ACR/Files/Practice-Parameters/us-ob.pdf. Published 2013. Accessed February 18, 2018.
26. AIUM practice guideline for documentation of an ultrasound examination. J Ultrasound Med. 2014;33(6):1098-1102. doi:10.7863/ultra.33.6.1098.
27. Marin JR, Lewiss RE. Point-of-care ultrasonography by pediatric emergency medicine physicians. Pediatrics. 2015;135(4):e1113-e1122. doi:10.1542/peds.2015-0343.
28. Spencer KT, Kimura BJ, Korcarz CE, Pellikka PA, Rahko PS, Siegel RJ. Focused cardiac ultrasound: recommendations from the American Society of Echocardiography. J Am Soc Echocardiogr. 2013;26(6):567-581. doi:10.1016/j.echo.2013.04.001.

Author and Disclosure Information

1Division of General & Hospital Medicine, The University of Texas Health San Antonio, San Antonio, Texas; 2Section of Hospital Medicine, South Texas Veterans Health Care System, San Antonio, Texas; 3Divisions of General Internal Medicine and Hospital Pediatrics, University of Minnesota, Minneapolis, Minnesota; 4Department of Hospital Medicine, HealthPartners Medical Group, Regions Hospital, St. Paul, Minnesota; 5Department of Medical Education, Abbott Northwestern Hospital, Minneapolis, Minnesota; 6Division of Hospital Medicine, Department of Medicine, University of California San Francisco, San Francisco, California; 7Division of Hospital Medicine, Department of Medicine, University of North Carolina, Chapel Hill, North Carolina; 8Division of General Pediatrics and Adolescent Medicine, Department of Pediatrics, University of North Carolina, Chapel Hill, North Carolina; 9Department of Hospital Medicine, Kaiser Permanente San Francisco Medical Center, San Francisco, California; 10Division of Hospital Medicine, Oregon Health & Science University, Portland, Oregon; 11Division of Hospital Medicine, Weill Cornell Medicine, New York, New York; 12Division of General Internal Medicine, Mayo Clinic, Rochester, Minnesota; 13Division of Hospital Medicine, Zucker School of Medicine at Hofstra Northwell, New Hyde Park, New York; 14Hospitalist Program, Division of General Internal Medicine, Department of Medicine, Johns Hopkins School of Medicine, Baltimore, Maryland; 15Division of Hospital Medicine, University of California Davis, Davis, California; 16Division of Hospital Medicine, Alameda Health System-Highland Hospital, Oakland, California; 17Louis Stokes Cleveland Veterans Affairs Hospital, Cleveland, Ohio; 18Case Western Reserve University School of Medicine, Cleveland, Ohio; 19Division of Hospital Medicine, University of Miami, Miami, Florida; 20Division of Hospital Medicine, Legacy Healthcare System, Portland, Oregon; 21Division of Hospital Medicine, University of Colorado, Aurora, Colorado; 22Department of Medicine, University of Central Florida, Naples, Florida; 23White River Junction VA Medical Center, White River Junction, Vermont; 24Geisel School of Medicine at Dartmouth College, Hanover, New Hampshire.

Funding

Nilam Soni: Department of Veterans Affairs, Quality Enhancement Research Initiative (QUERI) Partnered Evaluation Initiative Grant (HX002263-01A1). Brian P. Lucas: Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development and Dartmouth SYNERGY, National Institutes of Health, National Center for Translational Science (UL1TR001086).

Disclaimer

The contents of this publication do not represent the views of the US Department of Veterans Affairs or the United States Government.


Many hospitalists incorporate point-of-care ultrasound (POCUS) into their daily practice because it adds value to their bedside evaluation of patients. However, standards for training and assessing hospitalists in POCUS have not yet been established. Other acute care specialties, including emergency medicine and critical care medicine, have already incorporated POCUS into their graduate medical education training programs, but most internal medicine residency programs are only beginning to provide POCUS training.1

Several features distinguish POCUS from comprehensive ultrasound examinations. First, POCUS is designed to answer focused questions, whereas comprehensive ultrasound examinations evaluate all organs in an anatomical region; for example, an abdominal POCUS exam may evaluate only for presence or absence of intraperitoneal free fluid, whereas a comprehensive examination of the right upper quadrant will evaluate the liver, gallbladder, and biliary ducts. Second, POCUS examinations are generally performed by the same clinician who generates the relevant clinical question to answer with POCUS and ultimately integrates the findings into the patient’s care.2 By contrast, comprehensive ultrasound examinations involve multiple providers and steps: a clinician generates a relevant clinical question and requests an ultrasound examination that is acquired by a sonographer, interpreted by a radiologist, and reported back to the requesting clinician. Third, POCUS is often used to evaluate multiple body systems. For example, to evaluate a patient with undifferentiated hypotension, a multisystem POCUS examination of the heart, inferior vena cava, lungs, abdomen, and lower extremity veins is typically performed. Finally, POCUS examinations can be performed serially to investigate changes in clinical status or evaluate response to therapy, such as monitoring the heart, lungs, and inferior vena cava during fluid resuscitation.

The purpose of this position statement is to inform a broad audience about how hospitalists are using diagnostic and procedural applications of POCUS. This position statement does not mandate that hospitalists use POCUS. Rather, it is intended to provide guidance on the safe and effective use of POCUS by the hospitalists who use it and the administrators who oversee its use. We discuss POCUS (1) applications, (2) training, (3) assessments, and (4) program management. This position statement was reviewed and approved by the Society of Hospital Medicine (SHM) Executive Committee in March 2018.

 

 

APPLICATIONS

Common diagnostic and procedural applications of POCUS used by hospitalists are listed in Table 1. Selected evidence supporting the use of these applications is described in the supplementary online content (Appendices 1–8 available at http://journalofhospitalmedicine.com) and SHM position statements on specific ultrasound-guided bedside procedures.3,4 Additional applications not listed in Table 1 that may be performed by some hospitalists include assessment of the eyes, stomach, bowels, ovaries, pregnancy, and testicles, as well as performance of regional anesthesia. Moreover, hospitalists caring for pediatric and adolescent patients may use additional applications besides those listed here. Currently, many hospitalists already perform more complex and sophisticated POCUS examinations than those listed in Table 1. The scope of POCUS use by hospitalists continues to expand, and this position statement should not restrict that expansion.

As outlined in our earlier position statements,3,4 ultrasound guidance lowers complication rates and increases success rates of invasive bedside procedures. Diagnostic POCUS can guide clinical decision making prior to bedside procedures. For instance, hospitalists may use POCUS to assess the size and character of a pleural effusion to help determine the most appropriate management strategy: observation, medical treatment, thoracentesis, chest tube placement, or surgical therapy. Furthermore, diagnostic POCUS can be used to rapidly assess for immediate postprocedural complications, such as pneumothorax, or if the patient develops new symptoms.

TRAINING

Basic Knowledge

Basic knowledge includes fundamentals of ultrasound physics; safety;4 anatomy; physiology; and device operation, including maintenance and cleaning. Basic knowledge can be taught by multiple methods, including live or recorded lectures, online modules, or directed readings.

Image Acquisition

Training should occur across multiple types of patients (eg, obese, cachectic, postsurgical) and clinical settings (eg, intensive care unit, general medicine wards, emergency department) when available. Training is largely hands-on because the relevant skills involve integration of 3D anatomy with spatial manipulation, hand-eye coordination, and fine motor movements. Virtual reality ultrasound simulators may accelerate mastery, particularly for cardiac image acquisition, and expose learners to standardized sets of pathologic findings. Real-time bedside feedback on image acquisition is ideal because understanding how ultrasound probe manipulation affects the images acquired is essential to learning.

Image Interpretation

Training in image interpretation relies on visual pattern recognition of normal and abnormal findings. Therefore, the normal to abnormal spectrum should be broad, and learners should maintain a log of what abnormalities have been identified. Giving real-time feedback at the bedside is ideal because of the connection between image acquisition and interpretation. Image interpretation can be taught through didactic sessions, image review sessions, or review of teaching files with annotated images.

Clinical Integration

Learners must interpret and integrate image findings with other clinical data considering the image quality, patient characteristics, and changing physiology. Clinical integration should be taught by instructors that share similar clinical knowledge as learners. Although sonographers are well suited to teach image acquisition, they should not be the sole instructors to teach hospitalists how to integrate ultrasound findings in clinical decision making. Likewise, emphasis should be placed on the appropriate use of POCUS within a provider’s skill set. Learners must appreciate the clinical significance of POCUS findings, including recognition of incidental findings that may require further workup. Supplemental training in clinical integration can occur through didactics that include complex patient scenarios.

 

 

Pathways

Clinical competency can be achieved with training adherent to five criteria. First, the training environment should be similar to where the trainee will practice. Second, training and feedback should occur in real time. Third, specific applications should be taught rather than broad training in “hospitalist POCUS.” Each application requires unique skills and knowledge, including image acquisition pitfalls and artifacts. Fourth, clinical competence must be achieved and demonstrated; it is not necessarily gained through experience. Fifth, once competency is achieved, continued education and feedback are necessary to ensure it is maintained.

Residency-based POCUS training pathways can best fulfill these criteria. They may eventually become commonplace, but until then alternative pathways must exist for hospitalist providers who are already in practice. There are three important attributes of such pathways. First, administrators’ expectations about learners’ clinical productivity must be realistically, but only temporarily, relaxed; otherwise, competing demands on time will likely overwhelm learners and subvert training. Second, training should begin through a local or national hands-on training program. The SHM POCUS certificate program consolidates training for common diagnostic POCUS applications for hospitalists.6 Other medical societies offer training for their respective clinical specialties.7 Third, once basic POCUS training has begun, longitudinal training should continue ideally with a local hospitalist POCUS expert.

In some settings, a subgroup of hospitalists may not desire, or be able to achieve, competency in the manual skills of POCUS image acquisition. Nevertheless, hospitalists may still find value in understanding POCUS nomenclature, image pattern recognition, and the evidence and pitfalls behind clinical integration of specific POCUS findings. This subset of POCUS skills allows hospitalists to communicate effectively with and understand the clinical decisions made by their colleagues who are competent in POCUS use.

The minimal skills a hospitalist should possess to serve as a POCUS trainer include proficiency of basic knowledge, image acquisition, image interpretation, and clinical integration of the POCUS applications being taught; effectiveness as a hands-on instructor to teach image acquisition skills; and an in-depth understanding of common POCUS pitfalls and limitations.

ASSESSMENTS

Assessment methods for POCUS can include the following: knowledge-based questions, image acquisition using task-specific checklists on human or simulation models, image interpretation using a series of videos or still images with normal and abnormal findings, clinical integration using “next best step” in a multiple choice format with POCUS images, and simulation-based clinical scenarios. Assessment methods should be aligned with local availability of resources and trainers.

Basic Knowledge

Basic knowledge can be assessed via multiple choice questions assessing knowledge of ultrasound physics, image optimization, relevant anatomy, and limitations of POCUS imaging. Basic knowledge lies primarily in the cognitive domain and does not assess manual skills.

Image Acquisition

Image acquisition can be assessed by observation and rating of image quality. Where resources allow, assessment of image acquisition is likely best done through a combination of developing an image portfolio with a minimum number of high quality images, plus direct observation of image acquisition by an expert. Various programs have utilized minimum numbers of images acquired to help define competence with image acquisition skills.6–8 Although minimums may be a necessary step to gain competence, using them as a sole means to determine competence does not account for variable learning curves.9 As with other manual skills in hospital medicine, such as ultrasound-guided bedside procedures, minimum numbers are best used as a starting point for assessments.3,10 In this regard, portfolio development with meticulous attention to the gain, depth, and proper tomographic plane of images can monitor a hospitalist’s progress toward competence by providing objective assessments and feedback. Simulation may also be used as it allows assessment of image acquisition skills and an opportunity to provide real-time feedback, similar to direct observation but without actual patients.

 

 

Image Interpretation

Image interpretation is best assessed by an expert observing the learner at bedside; however, when bedside assessment is not possible, image interpretation skills may be assessed using multiple choice or free text interpretation of archived ultrasound images with normal and abnormal findings. This is often incorporated into the portfolio development portion of a training program, as learners can submit their image interpretation along with the video clip. Both normal and abnormal images can be used to assess anatomic recognition and interpretation. Emphasis should be placed on determining when an image is suboptimal for diagnosis (eg, incomplete exam or poor-quality images). Quality assurance programs should incorporate structured feedback sessions.

Clinical Integration

Assessment of clinical integration can be completed through case scenarios that assess knowledge, interpretation of images, and integration of findings into clinical decision making, which is often delivered via a computer-based assessment. Assessments should combine specific POCUS applications to evaluate common clinical problems in hospital medicine, such as undifferentiated hypotension and dyspnea. High-fidelity simulators can be used to blend clinical case scenarios with image acquisition, image interpretation, and clinical integration. When feasible, comprehensive feedback on how providers acquire, interpret, and apply ultrasound at the bedside is likely the best mechanism to assess clinical integration. This process can be done with a hospitalist’s own patients.

General Assessment

A general assessment that includes a summative knowledge and hands-on skills assessment using task-specific checklists can be performed upon completion of training. A high-fidelity simulator with dynamic or virtual anatomy can provide reproducible standardized assessments with variation in the type and difficulty of cases. When available, we encourage the use of dynamic assessments on actual patients that have both normal and abnormal ultrasound findings because simulated patient scenarios have limitations, even with the use of high-fidelity simulators. Programs are recommended to use formative and summative assessments for evaluation. Quantitative scoring systems using checklists are likely the best framework.11,12

CERTIFICATES AND CERTIFICATION

A certificate of completion is proof of a provider’s participation in an educational activity; it does not equate with competency, though it may be a step toward it. Most POCUS training workshops and short courses provide certificates of completion. Certification of competency is an attestation of a hospitalist’s basic competence within a defined scope of practice (Table 2).13 However, without longitudinal supervision and feedback, skills can decay; therefore, we recommend a longitudinal training program that provides mentored feedback and incorporates periodic competency assessments. At present, no national board certification in POCUS is available to grant external certification of competency for hospitalists.

External Certificate

Certificates of completion can be external through a national organization. An external certificate of completion designed for hospitalists includes the POCUS Certificate of Completion offered by SHM in collaboration with CHEST.6 This certificate program provides regional training options and longitudinal portfolio development. Other external certificates are also available to hospitalists.7,14,15

Most hospitalists are boarded by the American Board of Internal Medicine or the American Board of Family Medicine. These boards do not yet include certification of competency in POCUS. Other specialty boards, such as emergency medicine, include competency in POCUS. For emergency medicine, completion of an accredited residency training program and certification by the national board includes POCUS competency.

 

 

Internal Certificate

There are a few examples of successful local institutional programs that have provided internal certificates of competency.12,14 Competency assessments require significant resources including investment by both faculty and learners. Ongoing evaluation of competency should be based on quality assurance processes.

Credentialing and Privileging

The American Medical Association (AMA) House of Delegates in 1999 passed a resolution (AMA HR. 802) recommending hospitals follow specialty-specific guidelines for privileging decisions related to POCUS use.17 The resolution included a statement that, “ultrasound imaging is within the scope of practice of appropriately trained physicians.”

Some institutions have begun to rely on a combination of internal and external certificate programs to grant privileges to hospitalists.10 Although specific privileges for POCUS may not be required in some hospitals, some institutions may require certification of training and assessments prior to granting permission to use POCUS.

Hospitalist programs are encouraged to evaluate ongoing POCUS use by their providers after granting initial permission. If privileging is instituted by a hospital, hospitalists must play a significant role in determining the requirements for privileging and ongoing maintenance of skills.

Maintenance of Skills

All medical skills can decay with disuse, including those associated with POCUS.12,18 Thus, POCUS users should continue using POCUS regularly in clinical practice and participate in POCUS continuing medical education activities, ideally with ongoing assessments. Maintenance of skills may be confirmed through routine participation in a quality assurance program.

PROGRAM MANAGEMENT

Use of POCUS in hospital medicine has unique considerations, and hospitalists should be integrally involved in decision making surrounding institutional POCUS program management. Appointing a dedicated POCUS director can help a program succeed.8

Equipment and Image Archiving

Several factors are important to consider when selecting an ultrasound machine: portability, screen size, and ease of use; integration with the electronic medical record and options for image archiving; manufacturer’s service plan, including technical and clinical support; and compliance with local infection control policies. The ability to easily archive and retrieve images is essential for quality assurance, continuing education, institutional quality improvement, documentation, and reimbursement. In certain scenarios, image archiving may not be possible (such as with personal handheld devices or in emergency situations) or necessary (such as with frequent serial examinations during fluid resuscitation). An image archive is ideally linked to reports, orders, and billing software.10,19 If such linkages are not feasible, parallel external storage that complies with regulatory standards (ie, HIPAA compliance) may be suitable.20

Documentation and Billing

Components of documentation include the indication and type of ultrasound examination performed, date and time of the examination, patient identifying information, name of provider(s) acquiring and interpreting the images, specific scanning protocols used, patient position, probe used, and findings. Documentation can occur through a standalone note or as part of another note, such as a progress note. Whenever possible, documentation should be timely to facilitate communication with other providers.

Billing is supported through the AMA Current Procedural Terminology codes for “focused” or “limited” ultrasound examinations (Appendix 9). The following three criteria must be satisfied for billing. First, images must be permanently stored. Specific requirements vary by insurance policy, though current practice suggests a minimum of one image demonstrating relevant anatomy and pathology for the ultrasound examination coded. For ultrasound-guided procedures that require needle insertion, images should be captured at the point of interest, and a procedure note should reflect that the needle was guided and visualized under ultrasound.21 Second, proper documentation must be entered in the medical record. Third, local institutional privileges for POCUS must be considered. Although privileges are not required to bill, some hospitals or payers may require them.

 

 

Quality Assurance

Published guidelines on quality assurance in POCUS are available from different specialty organizations, including emergency medicine, pediatric emergency medicine, critical care, anesthesiology, obstetrics, and cardiology.8,22–28 Quality assurance is aimed at ensuring that physicians maintain basic competency in using POCUS to influence bedside decisions.

Quality assurance should be carried out by an individual or committee with expertise in POCUS. Multidisciplinary QA programs in which hospital medicine providers are working collaboratively with other POCUS providers have been demonstrated to be highly effective.10 Oversight includes ensuring that providers using POCUS are appropriately trained,10,22,28 using the equipment correctly,8,26,28 and documenting properly. Some programs have implemented mechanisms to review and provide feedback on image acquisition, interpretation, and clinical integration.8,10 Other programs have compared POCUS findings with referral studies, such as comprehensive ultrasound examinations.

CONCLUSIONS

Practicing hospitalists must continue to collaborate with their institutions to build POCUS capabilities. In particular, they must work with their local privileging body to determine what credentials are required. The distinction between certificates of completion and certificates of competency, including whether those certificates are internal or external, is important in the credentialing process.

External certificates of competency are currently unavailable for most practicing hospitalists because ABIM certification does not include POCUS-related competencies. As internal medicine residency training programs begin to adopt POCUS training and certification into their educational curricula, we foresee a need to update the ABIM Policies and Procedures for Certification. Until then, we recommend that certificates of competency be defined and granted internally by local hospitalist groups.

Given the many advantages of POCUS over traditional tools, we anticipate its increasing implementation among hospitalists in the future. As with all medical technology, its role in clinical care should be continuously reexamined and redefined through health services research. Such information will be useful in developing practice guidelines, educational curricula, and training standards.

Acknowledgments

The authors would like to thank all members who participated in the discussion and finalization of this position statement during the Point-of-care Ultrasound Faculty Retreat at the 2018 Society of Hospital Medicine Annual Conference: Saaid Abdel-Ghani, Brandon Boesch, Joel Cho, Ria Dancel, Renee Dversdal, Ricardo Franco-Sadud, Benjamin Galen, Trevor P. Jensen, Mohit Jindal, Gordon Johnson, Linda M. Kurian, Gigi Liu, Charles M. LoPresti, Brian P. Lucas, Venkat Kalidindi, Benji Matthews, Anna Maw, Gregory Mints, Kreegan Reierson, Gerard Salame, Richard Schildhouse, Daniel Schnobrich, Nilam Soni, Kirk Spencer, Hiromizu Takahashi, David M. Tierney, Tanping Wong, and Toru Yamada.

References

1. Schnobrich DJ, Mathews BK, Trappey BE, Muthyala BK, Olson APJ. Entrusting internal medicine residents to use point of care ultrasound: Towards improved assessment and supervision. Med Teach. 2018:1-6. doi:10.1080/0142159X.2018.1457210.
2. Soni NJ, Lucas BP. Diagnostic point-of-care ultrasound for hospitalists. J Hosp Med. 2015;10(2):120-124. doi:10.1002/jhm.2285.
3. Lucas BP, Tierney DM, Jensen TP, et al. Credentialing of hospitalists in ultrasound-guided bedside procedures: a position statement of the society of hospital medicine. J Hosp Med. 2018;13(2):117-125. doi:10.12788/jhm.2917.
4. Dancel R, Schnobrich D, Puri N, et al. Recommendations on the use of ultrasound guidance for adult thoracentesis: a position statement of the society of hospital medicine. J Hosp Med. 2018;13(2):126-135. doi:10.12788/jhm.2940.
5. National Council on Radiation Protection and Measurements. Implementation of the Principle of As Low As Reasonably Achievable (ALARA) for Medical and Dental Personnel. The Council; 1990.
6. Society of Hospital Medicine. Point of Care Ultrasound course: https://www.hospitalmedicine.org/clinical-topics/ultrasonography-cert/. Accessed February 6, 2018.
7. Critical Care Ultrasonography Certificate of Completion Program. CHEST. American College of Chest Physicians. http://www.chestnet.org/Education/Advanced-Clinical-Training/Certificate-of-Completion-Program/Critical-Care-Ultrasonography. Accessed February 6, 2018.
8. American College of Emergency Physicians Policy Statement: Emergency Ultrasound Guidelines. 2016. https://www.acep.org/Clinical---Practice-Management/ACEP-Ultrasound-Guidelines/. Accessed February 6, 2018.
9. Blehar DJ, Barton B, Gaspari RJ. Learning curves in emergency ultrasound education. Acad Emerg Med. 2015;22(5):574-582. doi:10.1111/acem.12653.
10. Mathews BK, Zwank M. Hospital medicine point of care ultrasound credentialing: an example protocol. J Hosp Med. 2017;12(9):767-772. doi:10.12788/jhm.2809.
11. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4(7):397-403. doi:10.1002/jhm.468.
12. Mathews BK, Reierson K, Vuong K, et al. The design and evaluation of the Comprehensive Hospitalist Assessment and Mentorship with Portfolios (CHAMP) ultrasound program. J Hosp Med. 2018;13(8):544-550. doi:10.12788/jhm.2938.
13. Soni NJ, Tierney DM, Jensen TP, Lucas BP. Certification of point-of-care ultrasound competency. J Hosp Med. 2017;12(9):775-776. doi:10.12788/jhm.2812.
14. Ultrasound Certification for Physicians. Alliance for Physician Certification and Advancement. APCA. https://apca.org/. Accessed February 6, 2018.
15. National Board of Echocardiography, Inc. https://www.echoboards.org/EchoBoards/News/2019_Adult_Critical_Care_Echocardiography_Exam.aspx. Accessed June 18, 2018.
16. Tierney DM. Internal Medicine Bedside Ultrasound Program (IMBUS). Abbott Northwestern. http://imbus.anwresidency.com/index.html. Accessed February 6, 2018.
17. American Medical Association House of Delegates Resolution H-230.960: Privileging for Ultrasound Imaging. Resolution 802. Policy Finder Website. http://search0.ama-assn.org/search/pfonline. Published 1999. Accessed February 18, 2018.
18. Kelm D, Ratelle J, Azeem N, et al. Longitudinal ultrasound curriculum improves long-term retention among internal medicine residents. J Grad Med Educ. 2015;7(3):454-457. doi:10.4300/JGME-14-00284.1.
19. Flannigan MJ, Adhikari S. Point-of-care ultrasound work flow innovation: impact on documentation and billing. J Ultrasound Med. 2017;36(12):2467-2474. doi:10.1002/jum.14284.
20. Emergency Ultrasound: Workflow White Paper. https://www.acep.org/uploadedFiles/ACEP/memberCenter/SectionsofMembership/ultra/Workflow%20White%20Paper.pdf. Published 2013. Accessed February 18, 2018.
21. Ultrasound Coding and Reimbursement Document 2009. Emergency Ultrasound Section. American College of Emergency Physicians. http://emergencyultrasoundteaching.com/assets/2009_coding_update.pdf. Published 2009. Accessed February 18, 2018.
22. Mayo PH, Beaulieu Y, Doelken P, et al. American College of Chest Physicians/La Societe de Reanimation de Langue Francaise statement on competence in critical care ultrasonography. Chest. 2009;135(4):1050-1060. doi:10.1378/chest.08-2305.
23. Frankel HL, Kirkpatrick AW, Elbarbary M, et al. Guidelines for the appropriate use of bedside general and cardiac ultrasonography in the evaluation of critically ill patients-part I: general ultrasonography. Crit Care Med. 2015;43(11):2479-2502. doi:10.1097/ccm.0000000000001216.
24. Levitov A, Frankel HL, Blaivas M, et al. Guidelines for the appropriate use of bedside general and cardiac ultrasonography in the evaluation of critically ill patients-part ii: cardiac ultrasonography. Crit Care Med. 2016;44(6):1206-1227. doi:10.1097/ccm.0000000000001847.
25. ACR–ACOG–AIUM–SRU Practice Parameter for the Performance of Obstetrical Ultrasound. https://www.acr.org/-/media/ACR/Files/Practice-Parameters/us-ob.pdf. Published 2013. Accessed February 18, 2018.
26. AIUM practice guideline for documentation of an ultrasound examination. J Ultrasound Med. 2014;33(6):1098-1102. doi:10.7863/ultra.33.6.1098.
27. Marin JR, Lewiss RE. Point-of-care ultrasonography by pediatric emergency medicine physicians. Pediatrics. 2015;135(4):e1113-e1122. doi:10.1542/peds.2015-0343.
28. Spencer KT, Kimura BJ, Korcarz CE, Pellikka PA, Rahko PS, Siegel RJ. Focused cardiac ultrasound: recommendations from the American Society of Echocardiography. J Am Soc Echocardiogr. 2013;26(6):567-581. doi:10.1016/j.echo.2013.04.001.

© 2019 Society of Hospital Medicine

Published Online Only January 2, 2019. doi: 10.12788/jhm.3079

Corresponding Author: Nilam J. Soni, MD, MS; E-mail: sonin@uthscsa.edu; Telephone: 210-743-6030.

Credentialing of Hospitalists in Ultrasound-Guided Bedside Procedures: A Position Statement of the Society of Hospital Medicine

The American Board of Internal Medicine (ABIM) changed its certification policy for bedside procedures over a decade ago.1 Acquiring manual competence in abdominal paracentesis, arterial catheter placement, arthrocentesis, central venous catheter placement, lumbar puncture, and thoracentesis is no longer an expectation of residency training. ABIM diplomates should “know” these procedures but not necessarily “do” them. Hospitalists, most of whom are themselves ABIM diplomates, are still, however, expected to do them as core competencies,2 perhaps because hospitalists are often available off-hours, when roughly half of bedside procedures are performed.3

Hospitalists increasingly perform bedside procedures with ultrasound guidance.4 Yet training in ultrasound guidance is also highly variable,5 largely because point-of-care ultrasound (POCUS) has only recently become widespread.6 And though some skills are transferable from landmark-guided to ultrasound-guided procedures, many are not.7-10 Furthermore, ultrasound guidance is often not explicitly delineated on the privileging forms used by hospitals,11 even where ultrasound guidance has become standard.12

Given the variability in training for both ultrasound- and landmark-guided procedures, and given the lack of a universal standard for certification, local hospitals often ask their respective hospitalist group leaders to certify hospitalists’ basic competence as part of credentialing (see the Table for definitions). How hospitalist group leaders should certify competence, however, is not clear. The importance of this gap has recently increased, as hospitalists continue to perform procedures despite not having clear answers to questions about basic competence.13-15

Therefore, the Society of Hospital Medicine (SHM) Education Committee convened a group of experts and conducted a systematic literature review in order to provide recommendations for credentialing hospitalist physicians in ultrasound-guided bedside procedures. These recommendations do not include training recommendations, aside from recommendations about remedial training for hospitalists who do not pass certification. Training is a means to competence but does not guarantee it. We believe that training recommendations ought to be considered separately.

METHODS

Working Group Formation

In January 2015, the SHM Board of Directors asked the SHM Education Committee to convene the POCUS Task Force. The purpose of the task force was to develop recommendations on ultrasound guidance for bedside procedures. The SHM Education Committee appointed 3 chairs of the task force: 1 senior member of the SHM Education Committee and 2 POCUS experts. The chairs assembled a task force of 31 members that included 5 working groups, a multispecialty peer review group, and a guideline methodologist (supplemental Appendix 1). Invitation was based on members’ past contributions to SHM POCUS-related activities, up-front commitment, and declared conflicts of interest. Working group members self-identified as “hospitalists,” whereas peer reviewers were nonhospitalists but nationally recognized POCUS physician-leaders specializing in emergency medicine, cardiology, critical care medicine, and anesthesiology. Task force membership was vetted by a chair of the SHM POCUS Task Force and the Director of Education before work began. This position statement was authored by the Credentialing Working Group together with the chairs of the other 4 working groups and a guideline methodologist.

Disclosures

Signed disclosure statements of all task force members were reviewed prior to inclusion on the task force (supplemental Appendix 2); no members received honoraria for participation. Industry representatives did not contribute to the development of the guidelines nor to any conference calls or meetings.

Literature Search Strategy

A literature search was conducted by a biomedical librarian. Records from 1979 to January of 2017 were searched in Medline, Embase, CINAHL, Cochrane, and Google Scholar (supplemental Appendix 3). Search limiters were English language and adults. Articles were manually screened to exclude nonhuman or endoscopic ultrasound applications. Final article selection was based on working group consensus.

Draft Pathways

The Credentialing Working Group drafted initial and ongoing certification pathways (Figure 1 and Figure 2). The other 4 working groups from the task force were surveyed about the elements and overall appropriateness of these draft pathways. This survey and its results have already been published.12 The Credentialing Working Group then revised the certification pathways by using these survey results and codified individual aspects of these pathways into recommendations.

Development of Position Statement

Based on the Grading of Recommendations Assessment, Development and Evaluation (GRADE) methodology, all final article selections were initially rated as either low-quality (observational studies) or unclassifiable (expert opinion).16 These initial ratings were downgraded further for indirectness because none of the articles involved the intervention of interest (a credentialing pathway) in a population of interest (hospitalists) measuring the outcomes of interest (patient-level outcomes).17 Given the universal low-quality evidence ratings, we altered the task force strategy of developing guidelines, which the other 4 working groups are writing, and instead developed a position statement by using consensus gathering in 3 steps.

First, the Credentialing Working Group drafted an initial position statement composed of recommendations for credentialing pathways and other general aspects of credentialing. All final article selections were incorporated as references in a draft of the position statement and compiled in a full-text compendium. Second, feedback was provided by the other 4 task force working groups, the task force peer reviewers, and the SHM Education Committee. Feedback was incorporated by the authors of this statement who were the Credentialing Working Group, the chairs of the other 4 working groups, and a guideline methodologist. Third, final suggestions from all members of the SHM POCUS Task Force and SHM Education Committee were incorporated before final approval by the SHM Board of Directors in September 2017.

RESULTS

A total of 1438 references were identified in the original search. Manual selection led to 101 articles, which were incorporated into the following 4 domains with 16 recommendations.

General Credentialing Process

Basic Cognitive Competence Can Be Certified with Written or Oral Examinations

The ABIM defines cognitive competence as having 3 abilities: “(1) to explain indications, contraindications, patient preparation methods, sterile techniques, pain management, proper techniques for handling specimens and fluids obtained, and test results; (2) to recognize and manage complications; and, (3) to clearly explain to a patient all facets of the procedure necessary to obtain informed consent.”1 These abilities can be assessed with written or oral examinations that may be integrated into simulation- or patient-based assessments.18-21

Minimum Thresholds of Experience to Trigger the Timing of a Patient-Based Assessment Should Be Determined by Empirical Methods

Learning curves are highly variable22-25 and even plateaus may not herald basic competence.26 Expert opinions27 can be used to establish minimum thresholds of experience, but such opinions may paradoxically exceed the current thresholds of experts’ own hospitals.12 Thus, empirical methods, such as those based on cumulative sum analysis28-30 or local learning curves,31,32 are preferred. If such methods are not available, a recent survey of hospitalist experts may provide guidance.12 Regardless, once established, minimum thresholds are necessary but not sufficient to determine competency (see “Basic manual competence must be certified through patient-based assessments” section).

Hospitalists Should Formally Log All of Their Attempted Procedures, Ideally in an Electronic Medical Record

Simple self-reported numbers of procedures performed often misrepresent actual experience33,34 and do not include periprocedural complications.35,36 Thus, hospitalists should report their experience with logs of all attempted procedures, both successful and unsuccessful. Such logs must include information about supervising providers (if applicable) and patient outcomes, including periprocedural adverse events,37 but they must also remain compliant with the Health Insurance Portability and Accountability Act.

Health Information Technology Service Should Routinely Pull Collations of All Attempted Procedures from Comprehensive Electronic Medical Records

Active surveillance may reduce complications by identifying hospitalists who may benefit from further training.38 In order to facilitate active surveillance systems, documentation (such as a procedure note) should be both integrated into an electronic medical record and protocol driven,39 including procedure technique, ultrasound findings, and any safety events (both near misses and adverse events).

Basic Manual Competence Must Be Certified Through Patient-Based Assessments

Multiple interacting factors, including environment, patients, baseline skills, training, experience, and skills decay, affect manual competence. Certifications that are based solely on reaching minimum thresholds of experience, even when accurate, are not valid reflections of manual competence,15,40-43 and neither are those based on self-perception.44 Patient-based assessments are, thus, necessary to ensure manual competence.45-48

Certification Assessments of Manual Competence Should Combine 2 Types of Structured Instruments: Checklists and Overall Scores

Assessments based on direct observation are more reliable when formally structured.49,50 Though checklists used in objective structured clinical examinations capture many important manual skills,51-56 they do not completely reflect a hospitalist’s manual competence;57 situations may occur in which a hospitalist meets all the individual items on a checklist but cannot perform an entire procedure with basic competence. Therefore, checklists should be paired with overall scores.58-61 Both checklists and overall scores ought to be obtained from reliable and valid instruments.

Certification Assessments Should Include Feedback

Assessments without feedback are missed learning opportunities.62 Both simulation-63 and patient-based assessments should provide feedback in real time to reinforce effective behaviors and remedy faulty ones.

If Remedial Training is Needed, Simulator-Based Training Can Supplement but Not Replace Patient-Based Training

Supervised simulator-based training allows hospitalists to master basic components of a procedure64 (including orientation to equipment, sequence of operations, dexterity, ultrasound anatomy, and real-time guidance technique) while improving both cognitive and manual skills.42,43,65-71 In addition to their role in basic training (which is outside the scope of this position statement), simulators can be useful for remedial training. To be sufficient for hospitalists who do not pass their patient-based assessments, however, remedial training that begins with simulation must also include patient-based training and assessment.72-75

Initial Credentialing Process

A Minimum Threshold of Experience Should Be Reached before Patient-Based Assessments are Conducted (Figure 1)

Recent experience, such as the number of successful procedures performed on a representative sample of patients61,76,77 in the last 2 years, should meet a minimum threshold (see “Minimum thresholds of experience to trigger the timing of a patient-based assessment should be determined by empirical methods” section) before a patient-based assessment for intramural certification occurs.31,78 Such procedures should be supervised unless performed with privileges, for example, at another hospital. After reaching both a minimum threshold of experience and passing an observed patient-based assessment, which includes assessments of both cognitive and manual skills, hospitalists can be considered intramurally certified for initial credentialing. The hospitalist may begin to independently perform ultrasound-guided procedures if all credentialing requirements are met and privileges are granted.

Initial Certification Assessments Should Ideally Begin on Simulators

Simulators allow the assurance of safe manual skills, including proper needle insertion techniques and disposal of sharp objects.3,79 If simulators are not available, however, then patient-based training and assessments can still be performed under direct observation. Safe performance of ultrasound-guided procedures during patient-based assessments (without preceding simulator-based assessments) is sufficient to certify manual competence.

Ongoing Credentialing

Certification to Perform Ultrasound-Guided Procedures Should Be Routinely Re-Evaluated During Ongoing Credentialing (Figure 2)

Ongoing certifications are needed because skills decay.80,81 They should be routine, perhaps coinciding with the usual reprivileging cycle (often biennially). When feasible,82 maintenance of manual competence is best ensured by directly observed patient-based assessments; when not feasible, performance reviews are acceptable.

Observed Patient-Based Assessments Should Occur When a Periprocedural Safety Event Occurs that is Potentially Caused by “Provider Error”

Safety events include both near misses and adverse events. Information about both is ideally “flagged” and “pushed” to hospitalist group leaders by active surveillance and reporting systems. Once reviewed, if a safety event is considered to potentially have been caused by provider error (including knowledge- and skill-based errors),83 then the provider who performed the procedure should undergo an observed patient-based assessment.

Simulation-Based Practice Can Supplement Patient-Based Experience for Ongoing Credentialing

When hospitalists do not achieve a minimum threshold of patient-based experience since the antecedent certification, simulation-based training can supplement their patient-based experience.84 In these cases, however, an observed patient-based assessment must occur. Another consideration is whether or not the privilege should be relinquished because of an infrequent need.

Credentialing Infrastructure

Hospitalists Themselves Should Not Bear the Financial Costs of Developing and Maintaining Training and Certification Programs for Ultrasound-Guided Procedures

Equipment and personnel costs85,86 commonly impede ultrasound-guided procedure programs.4,87,88 Hospitalists whose job descriptions include the performance of ultrasound-guided procedures should not be expected to bear the costs of ultrasound machines, image archival software, equipment maintenance, and initial and ongoing training and certification.

Assessors Should Be Unbiased Expert Providers Who Have Demonstrated Mastery in Performance of the Procedure Being Assessed and Regularly Perform It in a Similar Practice Environment

Assessors should be expert providers who regularly perform the ultrasound-guided procedure in a similar practice environment.9,89-94 For example, providers who are not hospitalists but who are experts in an ultrasound-guided procedure and commonly perform it on the hospital wards would be acceptable assessors. However, a radiologist who only performs that procedure in a fully staffed interventional radiology suite with fluoroscopy or computed tomography guidance would not be an acceptable assessor. More than 1 assessor may balance idiosyncratic assessments;95 but when assessments are well structured, additional assessors are generally not needed.18 Candidate assessors should be vetted by the hospitalist group leader and the hospital privileging committee.

If Intramural Assessors Are Not Available, Extramural Assessors May Be Considered

Intramural assessors are generally preferred because of familiarity with the local practice environment, including the available procedure kits and typical patient characteristics. Nevertheless, extramural assessors27,77,85,96 may theoretically provide even more valid assessments than intramural ones because extramural assessors are neither influenced by relationships with local hospitalists nor biased by local hospitalists’ skills.97,98 Remote performance assessment through video recordings99 or live-video streaming is another option100 but is not sufficient unless a room camera is available to simultaneously view probe movement and the ultrasound screen.101 In addition, remote assessment does not allow the assessor to physically assume control of the procedure to either salvage it or perhaps, in some cases, prevent a complication.

DISCUSSION

There are no high-quality randomized trials in support of a single credentialing pathway over any other.94,102 The credentialing pathways at the center of this position statement are based on expert opinion. Our methods can be criticized straightaway, therefore, for reliance on the experience and expertise of our working group and task force. Any position statement written without high-quality supportive evidence would be appropriately subject to the same criticism. Without evidence in support of an overall pathway, we codified specific aspects of the pathways into 16 individual recommendations.

Patient-level outcomes do not back these recommendations. Consider, for example, our recommendation that certification assessments be made from structured instruments and not simply from an assessor’s gestalt. Here, the basis is not improved patient-level outcomes from a trial (such as reduced complications or increased procedural success) but improved psychometric performance from reliability studies. The body of evidence for our recommendations is similarly indirect, mostly because the outcomes studied are more proximate and, thus, less meaningful than patient-level outcomes, which are the outcomes of greatest interest but are woefully understudied for clinical competence.17,97,103

The need for high-quality evidence is most pronounced in distinguishing how recommendations should be modified for various settings. Wide variations in resources and patient-mix will make some recommendations impracticable, meaning that they could not be carried out with available resources. For example, our recommendation that credentialing decisions should ultimately rely on certifications made by assessors during patient-based assessments may not be practicable at small, rural hospitals. Such hospitals may not have access to local assessors, and they may not admit enough patients who need the types of ultrasound-guided procedures for which hospitalists seek certification (especially given the need to coordinate the schedules of patients, procedure-performing hospitalists, and assessors). Collaborative efforts between hospitals for regional certification may be a potential solution to consider. But if recommendations are truly impracticable, the task force recognizes they may need to be modified. Given the low quality of evidence supporting our recommendations, such modifications would be readily defendable, especially if they emerged from collaborative discussions between privileging committees, hospitalist directors, and local experts.

One way for hospitals to implement our recommendations may be to follow a recommendation proposed by the authors of the original hospitalist core competencies over a decade ago: “The presence of a procedural skill in the Core Competencies does not necessarily indicate that every hospitalist will perform or be proficient in that procedure.”104 In other words, bedside procedures may be delegated to some but not all hospitalists. Such “proceduralists” would have some proportion of their clinical responsibility dedicated to performing procedures. Delineation of this job description must be made locally because it balances 2 hospital-specific characteristics: patients’ needs for procedures against the availability of providers with basic competence to perform them, which includes hospitalists but also emergency medicine physicians, specialists, and interventional radiologists. A salutary benefit for hospitals is that hospitalists who are not proceduralists would not need to undergo certification in basic competence for the bedside procedures they will not be performing.

Regardless of whether some or all hospitalists at a particular hospital are expected to perform bedside procedures, technology may help to improve the practicability of our recommendations. For example, simulators may evolve to replace actual patient-level experience in achieving minimum thresholds. Certification assessments of manual skills may even someday occur entirely on simulators. Real-time high-definition video streaming enhanced with multiple cameras may allow for remote assessments. Until such advances mature, high-quality patient-level data should be sought through additional research to refine our current recommendations.

We hope that these recommendations will improve how basic competence in ultrasound-guided bedside procedures is assessed. Our ultimate goal is to improve how hospitalists perform these procedures, and we therefore considered patient safety paramount to cost. Nevertheless, the hospital administrative leaders and privileging committee members on our task force concluded that many hospitals have been seeking guidance on credentialing for bedside procedures and that the likely difficulties of implementing our recommendations (including cost) would not be prohibitive at most hospitals, especially because these recommendations can be tailored to each setting.

 

 

Acknowledgments

Collaborators from SHM POCUS Task Force are Saaid Abdel-Ghani, Michael Blaivas, Dan Brotman, Carolina Candotti, Jagriti Chadha, Joel Cho, Ria Dancel, Ricardo Franco, Richard Hoppmann, Susan Hunt, Venkat Kalidindi, Ketino Kobaidze, Josh Lenchus, Benji Mathews, Satyen Nichani, Vicki Noble, Martin Perez, Nitin Puri, Aliaksei Pustavoitau, Sophia Rodgers, Gerard Salame, Daniel Schnobrich, Kirk Spencer, Vivek Tayal, Jeff Bates, Anjali Bhagra, Kreegan Reierson, Robert Arntfield, Paul Mayo, Loretta Grikis.

Disclosure

Brian P. Lucas received funding from the Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development and Dartmouth SYNERGY, National Institutes of Health, and National Center for Translational Science (UL1TR001086). Nilam Soni received funding from the Department of Veterans Affairs, Quality Enhancement Research Initiative (QUERI) Partnered Evaluation Initiative (HX002263-01A1). The contents of this publication do not represent the views of the United States Department of Veterans Affairs or the United States Government.

References

1. American Board of Internal Medicine. Policies and procedures for certification. Philadelphia: American Board of Internal Medicine; 2006.
2. Nichani S, Fitterman N, Lukela M, Crocker J; Society of Hospital Medicine. The Core Competencies in Hospital Medicine 2017 Revision. Section 2: Procedures. J Hosp Med. 2017;12(4 Suppl 1):S44-S54. PubMed
3. Lucas BP, Asbury JK, Franco-Sadud R. Training future hospitalists with simulators: a needed step toward accessible, expertly performed bedside procedures. J Hosp Med. 2009;4(7):395-396. PubMed
4. Schnobrich DJ, Gladding S, Olson APJ, Duran-Nelson A. Point-of-care ultrasound in internal medicine: a national survey of educational leadership. J Grad Med Educ. 2013;5(3):498-502. PubMed
5. Brown GM, Otremba M, Devine LA, Gray C, Millington SJ, Ma IW. Defining competencies for ultrasound-guided bedside procedures: consensus opinions from Canadian physicians. J Ultrasound Med. 2016;35(1):129-141. PubMed
6. Vaisman A, Cram P. Procedural competence among faculty in academic health centers: challenges and future directions. Acad Med. 2017;92(1):31-34. PubMed
7. Kreisman RD. With ED ultrasound, credentialing is at issue. ED Legal Letter. 2010;21:102-103. 
8. Goudie AM. Credentialing a new skill: what should the standard be for emergency department ultrasound in Australasia? Emerg Med Australas. 2010;22:263-264. PubMed
9. Maizel J, Guyomarc'h L, Henon P, et al. Residents learning ultrasound-guided catheterization are not sufficiently skilled to use landmarks. Crit Care. 2014;18(1):R36. doi:10.1186/cc13741. PubMed
10. American College of Emergency Physicians. Ultrasound guidelines: emergency, point-of-care, and clinical ultrasound guidelines in medicine. Ann Emerg Med. 2017;69(5):e27-e54. PubMed
11. Amini R, Adhikari S, Fiorello A. Ultrasound competency assessment in emergency medicine residency programs. Acad Emerg Med. 2014;21(7):799-801. PubMed
12. Jensen T, Soni NJ, Tierney DM, Lucas BP. Hospital privileging practices for bedside procedures: a survey of hospitalist experts. J Hosp Med. 2017;12(10):836-839. PubMed
13. Chang W. Is hospitalist proficiency in bedside procedures in decline? The Hospitalist. 2012. http://www.the-hospitalist.org/hospitalist/article/125236/patient-safety/hospitalist-proficiency-bedside-procedures-decline. Accessed September 30, 2017.
14. Barsuk JH, Feinglass J, Kozmic SE, Hohmann SF, Ganger D, Wayne DB. Specialties Performing Paracentesis Procedures at University Hospitals: Implications for Training and Certification. J Hosp Med. 2014;9(3):162-168. PubMed
15. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Residents’ Procedural Experience Does Not Ensure Competence: A Research Synthesis. J Grad Med Educ. 2017;9(2):201-208. PubMed
16. Balshem H, Helfand M, Schunemann HJ, et al. GRADE guidelines: 3. Rating the quality of evidence. J Clin Epidemiol. 2011;64(4):401-406. PubMed
17. Guyatt GH, Oxman AD, Kunz R, et al. GRADE guidelines: 8. Rating the quality of evidence—indirectness. J Clin Epidemiol. 2011;64(12):1303-1310. PubMed
18. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63-S67. PubMed
19. Grover S, Currier PF, Elinoff JM, Mouchantaf KJ, Katz JT, McMahon GT. Development of a test to evaluate residents' knowledge of medical procedures. J Hosp Med. 2009;4(7):430-432. PubMed
20. Millington SJ, Wong RY, Kassen BO, Roberts JM, Ma IWY. Improving internal medicine residents’ performance, knowledge, and confidence in central venous catheterization using simulators. J Hosp Med. 2009;4(7):410-416. PubMed
21. Lenchus JD, Carvalho CM, Ferreri K, et al. Filling the void: defining invasive bedside procedural competency for internal medicine residents. J Grad Med Educ. 2013;5(4):605-612. PubMed
22. Heegeman DJ, Kieke B Jr. Learning curves, credentialing, and the need for ultrasound fellowships. Acad Emerg Med. 2003;10:404-405. PubMed
23. Jang TB, Ruggeri W, Dyne P, Kaji AH. The learning curve of resident physicians using emergency ultrasonography for cholelithiasis and cholecystitis. Acad Emerg Med. 2010;17(11):1247-1252. PubMed
24. Akhtar MI, Hamid M. Ultrasound guided central venous access; a review of literature. Anaesth Pain Intensive Care. 2015;19:317-322. 
25. Bahl A, Yunker A. Assessment of the numbers-based model for evaluation of resident competency in emergency ultrasound core applications. J Emerg Med Trauma Acute Care. 2015;2015(5). doi:10.5339/jemtac.2015.5 
26. Cazes N, Desmots F, Geffroy Y, Renard A, Leyral J, Chaumoitre K. Emergency ultrasound: a prospective study on sufficient adequate training for military doctors. Diagn Interv Imaging. 2013;94(11):1109-1115. PubMed
27. Arntfield RT, Millington SJ, Ainsworth CD, et al. Canadian recommendations for critical care ultrasound training and competency for the Canadian critical care society. Can Respir J. 2014;21(16):341-345. 
28. Bolsin S, Colson M. The use of the Cusum technique in the assessment of trainee competence in new procedures. Int J Qual Health Care. 2000;12(5):433-438. PubMed
29. de Oliveira Filho GR, Helayel PE, da Conceição DB, Garzel IS, Pavei P, Ceccon MS. Learning curves and mathematical models for interventional ultrasound basic skills. Anaesth Analg. 2008;106(2):568-573. PubMed
30. Starkie T, Drake EJ. Assessment of procedural skills training and performance in anesthesia using cumulative sum analysis (cusum). Can J Anaesth. 2013;60(12):1228-1239. PubMed
31. Tierney D. Competency cut-point identification derived from a mastery learning cohort approach: A hybrid model. Ultrasound Med Biol. 2015;41:S19. 
32. Rankin JH, Elkhunovich MA, Rangarajan V, Chilstrom M, Mailhot T. Learning Curves for Ultrasound Assessment of Lumbar Puncture Insertion Sites: When is Competency Established? J Emerg Med. 2016;51(1):55-62. PubMed
33. Klasko SK, Cummings RV, Glazerman LR. Resident data collection: Do the numbers add up? Am J Obstet Gynecol. 1995;172(4 Pt 1):1312-1316. PubMed
34. Tierney D. Development & analysis of a mobile POCUS tracking tool. Ultrasound Med Biol. 2015;41(suppl 4):S31. 
35. Sethi MV, Zimmer J, Ure B, Lacher M. Prospective assessment of complications on a daily basis is essential to determine morbidity and mortality in routine pediatric surgery. J Pediatr Surg. 2016;51(4):630-633. PubMed
36. Fisher JC, Kuenzler KA, Tomita SS, Sinha P, Shah P, Ginsburg HB. Increased capture of pediatric surgical complications utilizing a novel case-log web application to enhance quality improvement. J Pediatr Surg. 2017;52(1):166-171. PubMed
37. Rethans JJ, Norcini JJ, Barón-Maldonado M, et al. The relationship between competence and performance: implications for assessing practice performance. Med Educ. 2002;36(10):901-909. PubMed
38. Duncan DR, Morgenthaler TI, Ryu JH, Daniels CE. Reducing iatrogenic risk in thoracentesis: establishing best practice via experiential training in a zero-risk environment. Chest. 2009;135(5):1315-1320. PubMed
39. Society of Critical Care Medicine Ultrasound Certification Task Force. Recommendations for achieving and maintaining competence and credentialing in critical care ultrasound with focused cardiac ultrasound and advanced critical care echocardiography. http://journals.lww.com/ccmjournal/Documents/Critical%20Care%20Ultrasound.pdf Published 2013. Accessed February 2, 2017.
40. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002;77(5):361-367. PubMed
41. Clark EG, Paparello JJ, Wayne DB, et al. Use of a national continuing medical education meeting to provide simulation-based training in temporary hemodialysis catheter insertion skills: a pre-test post-test study. Can J Kidney Health Dis. 2014;1:25-31. PubMed
42. Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation-based education with mastery learning improves residents’ lumbar puncture skills. Neurology. 2012;79(2):132-137. PubMed
43. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37(10):2697-2701. PubMed
44. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094-1102. PubMed
45. Shah J, Darzi A. Surgical skills assessment: an ongoing debate. BJU Int. 2001;88(7):655-660. PubMed
46. Lamperti M, Bodenham AR, Pittiruti M, et al. International evidence-based recommendations on ultrasound-guided vascular access. Intensive Care Med. 2012;38(7):1105-1117. PubMed
47. Tolsgaard MG, Todsen T, Sorensen JL, et al. International multispecialty consensus on how to evaluate ultrasound competence: a Delphi consensus survey. PLOS One. 2013;8(2):e57687. doi:10.1371/journal.pone.0057687 PubMed
48. Moureau N, Lamperti M, Kelly LJ, et al. Evidence-based consensus on the insertion of central venous access devices: definition of minimal requirements for training. Br J Anaesth. 2013;110(3):347-356. PubMed
49. Feldman LS, Hagarty S, Ghitulescu G, Stanbridge D, Fried GM. Relationship between objective assessment of technical skills and subjective in-training evaluations in surgical residents. J Am Coll Surg. 2004;198(1):105-110. PubMed
50. Baker S, Willey B, Mitchell C. The attempt to standardize technical and analytic competence in sonography education. J Diagn Med Sonogr. 2011;27(5):203-211. 
51. Tolsgaard MG, Ringsted C, Dreisler E, et al. Reliable and valid assessment of ultrasound operator competence in obstetrics and gynecology. Ultrasound Obstet Gynecol. 2014;43(4):437-443. PubMed
52. Rice J, Crichlow A, Baker M, et al. An assessment tool for the placement of ultrasound-guided peripheral intravenous access. J Grad Med Educ. 2016;8(2):202-207. PubMed
53. Hartman N, Wittler M, Askew K, Hiestand B, Manthey D. Validation of a performance checklist for ultrasound-guided internal jugular central lines for use in procedural instruction and assessment. Postgrad Med J. 2017;93(1096):67-70. PubMed
54. Primdahl SC, Todsen T, Clemmesen L, et al. Rating scale for the assessment of competence in ultrasound-guided peripheral vascular access—a Delphi Consensus Study. J Vasc Access. 2016;17(5):440-445. 
55. Berg D, Berg K, Riesenberg LA, et al. The development of a validated checklist for thoracentesis: preliminary results. Am J Med Qual. 2013;28(3):220-226. PubMed
56. Berg K, Riesenberg LA, Berg D, et al. The development of a validated checklist for radial arterial line placement: preliminary results. Am J Med Qual. 2014;29(3):242-246. PubMed
57. Walzak A, Bacchus M, Schaefer MP, et al. Diagnosing technical competence in six bedside procedures: comparing checklists and a global rating scale in the assessment of resident performance. Acad Med. 2015;90(8):1100-1108. PubMed
58. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for femoral venous catheterization: preliminary results. Am J Med Qual. 2014;29(5):445-450. PubMed
59. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for paracentesis: preliminary results. Am J Med Qual. 2013;28(3):227-231. PubMed
60. Huang GC, Newman LR, Schwartzstein RM, et al. Procedural competence in internal medicine residents: validity of a central venous catheter insertion assessment instrument. Acad Med. 2009;84(8):1127-1134. PubMed
61. Salamonsen M, McGrath D, Steiler G, et al. A new instrument to assess physician skill at thoracic ultrasound, including pleural effusion markup. Chest. 2013;144(3):930-934. PubMed
62. Boniface K, Yarris LM. Emergency ultrasound: Leveling the training and assessment landscape. Acad Emerg Med. 2014;21(7):803-805. PubMed
63. Boyle E, O’Keeffe D, Naughton P, Hill A, McDonnell C, Moneley D. The importance of expert feedback during endovascular simulator training. J Vasc Surg. 2011;54(1):240-248.e1. PubMed
64. Langhan TS, Rigby IJ, Walker IW, Howes D, Donnon T, Lord JA. Simulation-based training in critical resuscitation procedures improves residents’ competence. CJEM. 2009;11(6):535-539. PubMed
65. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4(7):397-403. PubMed
66. Lenchus JD. End of the “see one, do one, teach one” era: the next generation of invasive bedside procedural instruction. J Am Osteopath Assoc. 2010;110(6):340-346. PubMed
67. Barsuk JH, Cohen ER, Vozenilek JA, O’Connor LM, McGaghie WC, Wayne DB. Simulation-based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012;4(1):23-27. PubMed
68. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86(6):706-711. PubMed
69. Ross JG. Simulation and psychomotor skill acquisition: A review of the literature. Clin Simul Nurs. 2012;8(9):e429-e435. 
70. Barsuk JH, Cohen ER, Potts S, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014;23(9):749-756. PubMed
71. McSparron JI, Michaud GC, Gordan PL, et al. Simulation for skills-based education in pulmonary and critical care medicine. Ann Am Thorac Soc. 2015;12(4):579-586. PubMed
72. Kneebone RL, Scott W, Darzi A, Horrocks M. Simulation and clinical practice: strengthening the relationship. Med Educ. 2004;38(10):1095-1102. PubMed
73. Mema B, Harris I. The barriers and facilitators to transfer of ultrasound-guided central venous line skills from simulation to practice: exploring perceptions of learners and supervisors. Teach Learn Med. 2016;28(2):115-124. PubMed
74. Castanelli DJ. The rise of simulation in technical skills teaching and the implications for training novices in anaesthesia. Anaesth Intensive Care. 2009;37(6):903-910. PubMed
75. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014;48(4):375-385. PubMed
76. Langlois SLP. Focused ultrasound training for clinicians. Crit Care Med. 2007;35(5 suppl):S138-S143.
77. Price S, Via G, Sloth E, et al. Echocardiography practice, training and accreditation in the intensive care: document for the World Interactive Network Focused on Critical Ultrasound (WINFOCUS). Cardiovasc Ultrasound. 2008;6:49-83. PubMed
78. Blehar DJ, Barton B, Gaspari RJ. Learning curves in emergency ultrasound education. Acad Emerg Med. 2015;22(5):574-582. PubMed
79. Ault MJ, Rosen BT, Ault B. The use of tissue models for vascular access training. Phase I of the procedural patient safety initiative. J Gen Intern Med. 2006;21(5):514-517. PubMed
80. Barsuk JH, Cohen ER, McGaghie WC, Wayne DB. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med. 2010;85(10 Suppl):S9-S12. PubMed
81. Sliman S, Amundson S, Shaw D, Phan JN, Waalen J, Kimura B. Recently-acquired cardiac ultrasound skills are rapidly lost when not used: implications for competency in physician imaging. J Am Coll Cardiol. 2016;67(13S):1569. 
82. Kessler CS, Leone KA. The current state of core competency assessment in emergency medicine and a future research agenda: recommendations of the working group on assessment of observable learner performance. Acad Emerg Med. 2012;19(12):1354-1359. PubMed
83. Chang A, Schyve PM, Croteau RJ, O’Leary DS, Loeb JM. The JCAHO patient safety event taxonomy: a standardized terminology and classification schema for near misses and adverse events. Int J Qual Health Care. 2005;17(2):95-105. PubMed
84. Sawyer T, White M, Zaveri P, et al. Learn, see, practice, prove, do, maintain: an evidence-based pedagogical framework for procedural skill training in medicine. Acad Med. 2015;90(8):1025-1033. PubMed
85. Das D, Kapoor M, Brown C, Ndubuisi A, Gupta S. Current status of emergency department attending physician ultrasound credentialing and quality assurance in the United States. Crit Ultrasound J. 2016;8(1):6-12. PubMed
86. Ndubuisi AK, Gupta S, Brown C, Das D. Current status and future issues in emergency department attending physician ultrasound credentialing. Ann Emerg Med. 2014;64(45):S27-S28. 
87. Tandy TK, Hoffenberg S. Emergency department ultrasound services by emergency physicians: model for gaining hospital approval. Ann Emerg Med. 1997;29(3):367-374. PubMed
88. Lewiss RE, Saul T, Del Rios M. Acquiring credentials in bedside ultrasound: a cross-sectional survey. BMJ Open. 2013;3:e003502. doi:10.1136/bmjopen-2013-003502 PubMed
89. Lanoix R. Credentialing issues in emergency ultrasonography. Emerg Med Clin North Am. 1997;15(4):913-920. PubMed
90. Scalea T, Rodriguez A, Chiu WC, et al. Focused assessment with sonography for trauma (FAST): results from an international consensus conference. J Trauma. 1999;46(3):466-472. PubMed
91. Hertzberg BS, Kliewer MA, Bowie JD, et al. Physician training requirements in sonography: how many cases are needed for competence? AJR. 2000;174(5):1221-1227. PubMed
92. Blaivas M, Theodoro DL, Sierzenski P. Proliferation of ultrasound fellowships in emergency medicine: how do we ensure future experts are expertly trained? Acad Emerg Med. 2002;9(8):863-864. PubMed
93. Bodenham AR. Editorial II: Ultrasound imaging by anaesthetists: training and accreditation issues. Br J Anaesth. 2006;96(4):414-417. PubMed
94. Williamson JP, Twaddell SH, Lee YCG, et al. Thoracic ultrasound recognition of competence: A position paper of the Thoracic Society of Australia and New Zealand. Respirology. 2017;22(2):405-408. PubMed
95. Harrison G. Summative clinical competency assessment: a survey of ultrasound practitioners’ views. Ultrasound. 2015;23(1):11-17. PubMed
96. Evans LV, Morse JL, Hamann CJ, Osborne M, Lin Z, D'Onofrio G. The development of an independent rater system to assess residents' competence in invasive procedures. Acad Med. 2009;84(8):1135-1143. PubMed
97. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357(9260):945-949. PubMed
98. Arntfield RT. The utility of remote supervision with feedback as a method to deliver high-volume critical care ultrasound training. J Crit Care. 2015;30(2):441.e1-e6. PubMed
99. Akhtar S, Theodoro D, Gaspari R, et al. Resident training in emergency ultrasound: consensus recommendations from the 2008 Council of Emergency Residency Directors Conference. Acad Emerg Med. 2009;16:S32-S36. PubMed
100. Yu E. The assessment of technical skills in a cardiology training program: is the ITER sufficient? Can J Cardiol. 2000;16(4):457-462. PubMed
101. Todsen T, Tolsgaard MG, Olsen BH, et al. Reliable and valid assessment of point-of-care ultrasonography. Ann Surg. 2015;261(2):309-315. PubMed
102. Stein JC, Nobay F. Emergency department ultrasound credentialing: a sample policy and procedure. J Emerg Med. 2009;37(2):153-159. PubMed
103. Chen FM, Burstin H, Huntington J. The importance of clinical outcomes in medical education research. Med Educ. 2005;39(4):350-351. PubMed
104. Dressler DD, Pistoria MJ, Budnitz TL, McKean SCW, Amin AN. Core competencies in hospital medicine: development and methodology. J Hosp Med. 2006;1:48-56. PubMed
105. ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5(1):157-158. PubMed
106. Castillo J, Caruana CJ, Wainwright D. The changing concept of competence and categorisation of learning outcomes in Europe: Implications for the design of higher education radiography curricula at the European level. Radiography. 2011;17(3):230-234. 
107. Goldstein SR. Accreditation, certification: why all the confusion? Obstet Gynecol. 2007;110(6):1396-1398. PubMed
108. Moore CL. Credentialing and reimbursement in point-of-care ultrasound. Clin Pediatr Emerg Med. 2011;12(1):73-77. PubMed
109. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542-547. PubMed
110. Abuhamad AZ, Benacerraf BR, Woletz P, Burke BL. The accreditation of ultrasound practices: impact on compliance with minimum performance guidelines. J Ultrasound Med. 2004;23(8):1023-1029. PubMed

 

 


The American Board of Internal Medicine (ABIM) changed its certification policy for bedside procedures over a decade ago.1 Acquiring manual competence in abdominal paracentesis, arterial catheter placement, arthrocentesis, central venous catheter placement, lumbar puncture, and thoracentesis is no longer an expectation of residency training. ABIM diplomates should “know” these procedures but not necessarily “do” them. Hospitalists, most of whom are themselves ABIM diplomates, are still, however, expected to do them as core competencies,2perhaps because hospitalists are often available off-hours, when roughly half of bedside procedures are performed.3

Hospitalists increasingly perform bedside procedures with ultrasound guidance.4 Yet training in ultrasound guidance is significantly varied as well,5 simply because point-of-care ultrasound (POCUS) has only recently become widespread.6 And though some skills are transferrable from landmark-guided to ultrasound -guided procedures, many are not.7-10 Furthermore, ultrasound guidance is often not explicitly delineated on the privileging forms used by hospitals,11 even where ultrasound guidance has become standard.12

Given the variability in training for both ultrasound- and landmark-guided procedures, and given the lack of a universal standard for certification, local hospitals often ask their respective hospitalist group leaders to certify hospitalists’ basic competence as part of credentialing (see the Table for definitions). How hospitalist group leaders should certify competence, however, is not clear. The importance of this gap has recently increased, as hospitalists continue to perform procedures despite not having clear answers to questions about basic competence.13-15

Therefore, the Society of Hospital Medicine (SHM) Education Committee convened a group of experts and conducted a systematic literature review in order to provide recommendations for credentialing hospitalist physicians in ultrasound-guided bedside procedures. These recommendations do not include training recommendations, aside from recommendations about remedial training for hospitalists who do not pass certification. Training is a means to competence but does not guarantee it. We believe that training recommendations ought to be considered separately.

METHODS

Working Group Formation

In January 2015, the SHM Board of Directors asked the SHM Education Committee to convene the POCUS Task Force. The purpose of the task force was to develop recommendations on ultrasound guidance for bedside procedures. The SHM Education Committee appointed 3 chairs of the task force: 1 senior member of the SHM Education Committee and 2 POCUS experts. The chairs assembled a task force of 31 members that included 5 working groups, a multispecialty peer review group, and a guideline methodologist (supplemental Appendix 1). Invitation was based on members’ past contributions to SHM POCUS-related activities, up-front commitment, and declared conflicts of interest. Working group members self-identified as “hospitalists,” whereas peer reviewers were nonhospitalists but nationally recognized POCUS physician-leaders specializing in emergency medicine, cardiology, critical care medicine, and anesthesiology. Task force membership was vetted by a chair of the SHM POCUS Task Force and the Director of Education before work began. This position statement was authored by the Credentialing Working Group together with the chairs of the other 4 working groups and a guideline methodologist.

 

 

Disclosures

Signed disclosure statements of all task force members were reviewed prior to inclusion on the task force (supplemental Appendix 2); no members received honoraria for participation. Industry representatives did not contribute to the development of the guidelines nor to any conference calls or meetings.

Literature Search Strategy

A literature search was conducted by a biomedical librarian. Records from 1979 to January of 2017 were searched in Medline, Embase, CINAHL, Cochrane, and Google Scholar (supplemental Appendix 3). Search limiters were English language and adults. Articles were manually screened to exclude nonhuman or endoscopic ultrasound applications. Final article selection was based on working group consensus.

Draft Pathways

The Credentialing Working Group drafted initial and ongoing certification pathways (Figure 1 and Figure 2). The other 4 working groups from the task force were surveyed about the elements and overall appropriateness of these draft pathways. This survey and its results have already been published.12 The Credentialing Working Group then revised the certification pathways by using these survey results and codified individual aspects of these pathways into recommendations.

Development of Position Statement

Based on the Grading of Recommendation Assessment Development and Evaluation methodology, all final article selections were initially rated as either low-quality (observational studies) or unclassifiable (expert opinion).16 These initial ratings were downgraded further because of indirectness, because none of the articles involved the intervention of interest (a credentialing pathway) in a population of interest (hospitalists) measuring the outcomes of interest (patient-level outcomes).17 Given the universal low-quality evidence ratings, we altered the task force strategy of developing guidelines, which the other 4 working groups are writing, and instead developed a position statement by using consensus gathering in 3 steps.

First, the Credentialing Working Group drafted an initial position statement composed of recommendations for credentialing pathways and other general aspects of credentialing. All final article selections were incorporated as references in a draft of the position statement and compiled in a full-text compendium. Second, feedback was provided by the other 4 task force working groups, the task force peer reviewers, and the SHM Education Committee. Feedback was incorporated by the authors of this statement who were the Credentialing Working Group, the chairs of the other 4 working groups, and a guideline methodologist. Third, final suggestions from all members of the SHM POCUS Task Force and SHM Education Committee were incorporated before final approval by the SHM Board of Directors in September 2017.

RESULTS

A total of 1438 references were identified in the original search. Manual selection led to 101 articles, which were incorporated into the following 4 domains with 16 recommendations.

General Credentialing Process

Basic Cognitive Competence Can Be Certified with Written or Oral Examinations

The ABIM defines cognitive competence as having 3 abilities: “(1) to explain indications, contraindications, patient preparation methods, sterile techniques, pain management, proper techniques for handling specimens and fluids obtained, and test results; (2) to recognize and manage complications; and, (3) to clearly explain to a patient all facets of the procedure necessary to obtain informed consent.”1 These abilities can be assessed with written or oral examinations that may be integrated into simulation- or patient-based assessments.18-21

Minimum Thresholds of Experience to Trigger the Timing of a Patient-Based Assessment Should Be Determined by Empirical Methods

Learning curves are highly variable22-25 and even plateaus may not herald basic competence.26 Expert opinions27 can be used to establish minimum thresholds of experience, but such opinions may paradoxically exceed the current thresholds of experts’ own hospitals.12 Thus, empirical methods, such as those based on cumulative sum analysis28-30 or local learning curves,31,32 are preferred. If such methods are not available, a recent survey of hospitalist experts may provide guidance.12 Regardless, once established, minimum thresholds are necessary but not sufficient to determine competency (see “Basic manual competence must be certified through patient-based assessments” section).

Hospitalists Should Formally Log All of Their Attempted Procedures, Ideally in an Electronic Medical Record

Simple self-reported numbers of procedures performed often misrepresent actual experience33,34 and do not include periprocedural complications.35,36 Thus, hospitalists should report their experience with logs of all attempted procedures, both successful and unsuccessful. Such logs must include information about supervising providers (if applicable) and patient outcomes, including periprocedural adverse events,37 but they must also remain compliant with the Health Insurance Portability and Accountability Act.

Health Information Technology Service Should Routinely Pull Collations of All Attempted Procedures from Comprehensive Electronic Medical Records

Active surveillance may reduce complications by identifying hospitalists who may benefit from further training.38 In order to facilitate active surveillance systems, documentation (such as a procedure note) should be both integrated into an electronic medical record and protocol driven,39 including procedure technique, ultrasound findings, and any safety events (both near misses and adverse events).

 

 

Basic Manual Competence Must Be Certified Through Patient-Based Assessments

Multiple interacting factors, including environment, patients, baseline skills, training, experience, and skills decay, affect manual competence. Certifications that are based solely on reaching minimum thresholds of experience, even when accurate, are not valid reflections of manual competence,15,40-43 and neither are those based on self-perception.44 Patient-based assessments are, thus, necessary to ensure manual competence.45-48

Certification Assessments of Manual Competence Should Combine 2 Types of Structured Instruments: Checklists and Overall Scores

Assessments based on direct observation are more reliable when formally structured.49,50 Though checklists used in observed structured clinical examinations capture many important manual skills,51-56 they do not completely reflect a hospitalist’s manual competence;57 situations may occur in which a hospitalist meets all the individual items on a checklist but cannot perform an entire procedure with basic competence. Therefore, checklists should be paired with overall scores.58-61 Both checklists and overall scores ought to be obtained from reliable and valid instruments.

Certification Assessments Should Include Feedback

Assessments without feedback are missed learning opportunities.62 Both simulation-63 and patient-based assessments should provide feedback in real time to reinforce effective behaviors and remedy faulty ones.

If Remedial Training is Needed, Simulator-Based Training Can Supplement but Not Replace Patient-Based Training

Supervised simulator-based training allows hospitalists to master basic components of a procedure64 (including orientation to equipment, sequence of operations, dexterity, ultrasound anatomy, and real-time guidance technique) while improving both cognitive and manual skills.42,43,65-71 In addition to their role in basic training (which is outside the scope of this position statement), simulators can be useful for remedial training. To be sufficient for hospitalists who do not pass their patient-based assessments, however, remedial training that begins with simulation must also include patient-based training and assessment.72-75

Initial Credentialing Process

A Minimum Threshold of Experience Should Be Reached Before Patient-Based Assessments Are Conducted (Figure 1)

Recent experience, such as the number of successful procedures performed on a representative sample of patients61,76,77 in the last 2 years, should meet a minimum threshold (see “Minimum thresholds of experience to trigger the timing of a patient-based assessment should be determined by empirical methods” section) before a patient-based assessment for intramural certification occurs.31,78 Such procedures should be supervised unless performed with privileges, for example, at another hospital. After reaching both a minimum threshold of experience and passing an observed patient-based assessment, which includes assessments of both cognitive and manual skills, hospitalists can be considered intramurally certified for initial credentialing. The hospitalist may begin to independently perform ultrasound-guided procedures if all credentialing requirements are met and privileges are granted.
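
The gating logic described above might be expressed as follows; the 2-year look-back reflects the text, whereas the threshold value and field names are hypothetical and should come from locally determined, empirically derived standards.

```python
from datetime import date, timedelta

def eligible_for_patient_based_assessment(log_entries, procedure,
                                          minimum_threshold,
                                          today=None, lookback_years=2):
    """True when recent supervised experience meets the locally determined
    minimum threshold for the given procedure.

    log_entries: iterable of dicts with hypothetical keys 'procedure',
    'performed_on' (date), 'successful', and 'supervised'.
    """
    today = today or date.today()
    window_start = today - timedelta(days=365 * lookback_years)
    recent_successes = sum(
        1 for e in log_entries
        if e["procedure"] == procedure
        and e["performed_on"] >= window_start
        and e["successful"]
        # supervised unless already performed with privileges elsewhere
        and e["supervised"]
    )
    return recent_successes >= minimum_threshold
```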

Initial Certification Assessments Should Ideally Begin on Simulators

Simulators allow the assurance of safe manual skills, including proper needle insertion techniques and disposal of sharp objects.3,79 If simulators are not available, however, then patient-based training and assessments can still be performed under direct observation. Safe performance of ultrasound-guided procedures during patient-based assessments (without preceding simulator-based assessments) is sufficient to certify manual competence.

Ongoing Credentialing

Certification to Perform Ultrasound-Guided Procedures Should Be Routinely Re-Evaluated During Ongoing Credentialing (Figure 2)

Ongoing certifications are needed because skills decay.80,81 They should be routine, perhaps coinciding with the usual reprivileging cycle (often biennial). When feasible,82 maintenance of manual competence is best ensured by directly observed patient-based assessments; when not feasible, performance reviews are acceptable.

Observed Patient-Based Assessments Should Occur When a Periprocedural Safety Event Occurs That Is Potentially Caused by “Provider Error”

Safety events include both near misses and adverse events. Information about both is ideally “flagged” and “pushed” to hospitalist group leaders by active surveillance and reporting systems. Once reviewed, if a safety event is considered to potentially have been caused by provider error (including knowledge- and skill-based errors),83 then the provider who performed the procedure should undergo an observed patient-based assessment.
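
For illustration, the review-and-trigger step described above might reduce to a check like the one below; the event fields are hypothetical and would map to whatever categories the local reporting system captures.

```python
PROVIDER_ERROR_CAUSES = {"knowledge-based error", "skill-based error"}

def triggers_observed_assessment(safety_event):
    """True when a reviewed safety event (near miss or adverse event) is
    attributed, at least potentially, to provider error."""
    return (safety_event.get("reviewed", False)
            and safety_event.get("attributed_cause") in PROVIDER_ERROR_CAUSES)

event = {"type": "near miss", "reviewed": True,
         "attributed_cause": "skill-based error"}
print(triggers_observed_assessment(event))  # True
```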

Simulation-Based Practice Can Supplement Patient-Based Experience for Ongoing Credentialing

When hospitalists do not achieve a minimum threshold of patient-based experience since the antecedent certification, simulation-based training can supplement their patient-based experience.84 In these cases, however, an observed patient-based assessment must occur. Another consideration is whether the privilege should be relinquished because of infrequent need.

Credentialing Infrastructure

Hospitalists Themselves Should Not Bear the Financial Costs of Developing and Maintaining Training and Certification Programs for Ultrasound-Guided Procedures

Equipment and personnel costs85,86 commonly impede ultrasound-guided procedure programs.4,87,88 Hospitalists whose job descriptions include the performance of ultrasound-guided procedures should not be expected to bear the costs of ultrasound machines, image archival software, equipment maintenance, and initial and ongoing training and certification.

Assessors Should Be Unbiased Expert Providers Who Have Demonstrated Mastery in Performance of the Procedure Being Assessed and Regularly Perform It in a Similar Practice Environment

Assessors should be expert providers who regularly perform the ultrasound-guided procedure in a similar practice environment.9,89-94 For example, providers who are not hospitalists but who are experts in an ultrasound-guided procedure and commonly perform it on the hospital wards would be acceptable assessors. However, a radiologist who only performs that procedure in a fully-staffed interventional radiology suite with fluoroscopy or computed tomography guidance would not be an acceptable assessor. More than 1 assessor may balance idiosyncratic assessments,95 but when assessments are well structured, additional assessors are generally not needed.18 Candidate assessors should be vetted by the hospitalist group leader and the hospital privileging committee.

If Intramural Assessors Are Not Available, Extramural Assessors May Be Considered

Intramural assessors are generally preferred because of familiarity with the local practice environment, including the available procedure kits and typical patient characteristics. Nevertheless, extramural assessors27,77,85,96 may theoretically provide even more valid assessments than intramural ones because extramural assessors are neither influenced by relationships with local hospitalists nor biased by local hospitalists’ skills.97,98 Remote performance assessment through video recordings99 or live-video streaming is another option100 but is not sufficient unless a room camera is available to simultaneously view probe movement and the ultrasound screen.101 In addition, remote assessment does not allow the assessor to physically assume control of the procedure to either salvage it or perhaps, in some cases, prevent a complication.

DISCUSSION

There are no high-quality randomized trials in support of a single credentialing pathway over any other.94,102 The credentialing pathways at the center of this position statement are based on expert opinion. Our methods can be criticized straightaway, therefore, for reliance on the experience and expertise of our working group and task force. Any position statement written without high-quality supportive evidence would be appropriately subject to the same criticism. Without evidence in support of an overall pathway, we codified specific aspects of the pathways into 16 individual recommendations.

Patient-level outcomes do not back these recommendations. Consider, for example, our recommendation that certification assessments be made from structured instruments and not simply from an assessor’s gestalt. Here, the basis is not improved patient-level outcomes from a trial (such as reduced complications or increased procedural success) but improved psychometric performance from reliability studies. The body of evidence for our recommendations is similarly indirect, mostly because the outcomes studied are more proximate and, thus, less meaningful than patient-level outcomes, which are the outcomes of greatest interest but are woefully understudied for clinical competence.17,97,103

The need for high-quality evidence is most pronounced in distinguishing how recommendations should be modified for various settings. Wide variations in resources and patient mix will make some recommendations impracticable, meaning that they cannot be carried out with available resources. For example, our recommendation that credentialing decisions should ultimately rely on certifications made by assessors during patient-based assessments may not be practicable at small, rural hospitals. Such hospitals may not have access to local assessors, and they may not admit enough patients who need the types of ultrasound-guided procedures for which hospitalists seek certification (especially given the need to coordinate the schedules of patients, procedure-performing hospitalists, and assessors). Collaborative efforts between hospitals for regional certification may be a solution worth considering. But if recommendations are truly impracticable, the task force recognizes they may need to be modified. Given the low quality of evidence supporting our recommendations, such modifications would be readily defendable, especially if they emerged from collaborative discussions between privileging committees, hospitalist directors, and local experts.

One way for hospitals to implement our recommendations may be to follow a recommendation proposed by the authors of the original hospitalist core competencies over a decade ago: “The presence of a procedural skill in the Core Competencies does not necessarily indicate that every hospitalist will perform or be proficient in that procedure.”104 In other words, bedside procedures may be delegated to some but not all hospitalists. Such “proceduralists” would have some proportion of their clinical responsibility dedicated to performing procedures. Delineation of this job description must be made locally because it balances 2 hospital-specific characteristics: patients’ needs for procedures against the availability of providers with basic competence to perform them, which includes not only hospitalists but also emergency medicine physicians, specialists, and interventional radiologists. An added benefit for hospitals is that hospitalists who are not proceduralists would not need to undergo certification in basic competence for the bedside procedures they will not be performing.

Regardless of whether some or all hospitalists at a particular hospital are expected to perform bedside procedures, technology may help to improve the practicability of our recommendations. For example, simulators may evolve to replace actual patient-level experience in achieving minimum thresholds. Certification assessments of manual skills may even someday occur entirely on simulators. Real-time high-definition video streaming enhanced with multiple cameras may allow for remote assessments. Until such advances mature, high-quality patient-level data should be sought through additional research to refine our current recommendations.

We hope that these recommendations will improve how basic competence in ultrasound-guided bedside procedures is assessed. Our ultimate goal is to improve how hospitalists perform these procedures. Patient safety is, therefore, considered paramount to cost. Nevertheless, the hospital administrative leaders and privileging committee members on our Task Force concluded that many hospitals have been seeking guidance on credentialing for bedside procedures, and the likely difficulties of implementing our recommendations (including cost) would not be prohibitive at most hospitals, especially given recognition that these recommendations can be tailored to each setting.

Acknowledgments

Collaborators from SHM POCUS Task Force are Saaid Abdel-Ghani, Michael Blaivas, Dan Brotman, Carolina Candotti, Jagriti Chadha, Joel Cho, Ria Dancel, Ricardo Franco, Richard Hoppmann, Susan Hunt, Venkat Kalidindi, Ketino Kobaidze, Josh Lenchus, Benji Mathews, Satyen Nichani, Vicki Noble, Martin Perez, Nitin Puri, Aliaksei Pustavoitau, Sophia Rodgers, Gerard Salame, Daniel Schnobrich, Kirk Spencer, Vivek Tayal, Jeff Bates, Anjali Bhagra, Kreegan Reierson, Robert Arntfield, Paul Mayo, Loretta Grikis.

Disclosure

Brian P. Lucas received funding from the Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development and Dartmouth SYNERGY, National Institutes of Health, and National Center for Translational Science (UL1TR001086). Nilam Soni received funding from the Department of Veterans Affairs, Quality Enhancement Research Initiative (QUERI) Partnered Evaluation Initiative (HX002263-01A1). The contents of this publication do not represent the views of the United States Department of Veterans Affairs or the United States Government.

References

1. American Board of Internal Medicine. Policies and procedures for certification. Philadelphia: American Board of Internal Medicine; 2006.
2. Nichani S, Fitterman N, Lukela M, Crocker J; Society of Hospital Medicine. The Core Competencies in Hospital Medicine 2017 Revision. Section 2: Procedures. J Hosp Med. 2017;12(4 Suppl 1):S44-S54 PubMed
3. Lucas BP, Asbury JK, Franco-Sadud R. Training future hospitalists with simulators: a needed step toward accessible, expertly performed bedside procedures. J Hosp Med. 2009;4(7):395-396. PubMed
4. Schnobrich DJ, Gladding S, Olson APJ, Duran-Nelson A. Point-of-care ultrasound in internal medicine: a national survey of educational leadership. J Grad Med Educ. 2013;5(3):498-502. PubMed
5. Brown GM, Otremba M, Devine LA, Gray C, Millington SJ, Ma IW. Defining competencies for ultrasound-guided bedside procedures: consensus opinions from Canadian physicians. J Ultrasound Med. 2016;35(1):129-141. PubMed
6. Vaisman A, Cram P. Procedural competence among faculty in academic health centers: challenges and future directions. Acad Med. 2017;92(1):31-34. PubMed
7. Kreisman RD. With ED ultrasound, credentialing is at issue. ED Legal Letter. 2010;21:102-103. 
8. Goudie AM. Credentialing a new skill: what should the standard be for emergency department ultrasound in Australasia? Emerg Med Australas. 2010;22:263-264. PubMed
9. Maizel J, Guyomarc HL, Henon P, et al. Residents learning ultrasound-guided catheterization are not sufficiently skilled to use landmarks. Crit Care. 2014;18(1):R36. doi:10.1186/cc13741. PubMed
10. American College of Emergency Physicians. Ultrasound guidelines: emergency, point-of-care, and clinical ultrasound guidelines in medicine. Ann Emerg Med. 2017;69(5):e27-e54. PubMed
11. Amini R, Adhikari S, Fiorello A. Ultrasound competency assessment in emergency medicine residency programs. Acad Emerg Med. 2014;21(7):799-801. PubMed
12. Jensen T, Soni NJ, Tierney DM, Lucas BP. Hospital privileging practices for bedside procedures: a survey of hospitalist experts. J Hosp Med. 2017;12(10):836-839. PubMed
13. Chang W. Is hospitalist proficiency in bedside procedures in decline? The Hospitalist. 2012. http://www.the-hospitalist.org/hospitalist/article/125236/patient-safety/hospitalist-proficiency-bedside-procedures-decline. Accessed September 30, 2017.
14. Barsuk JH, Feinglass J, Kozmic SE, Hohmann SF, Ganger D, Wayne DB. Specialties Performing Paracentesis Procedures at University Hospitals: Implications for Training and Certification. J Hosp Med. 2014;9(3):162-168. PubMed
15. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Residents’ Procedural Experience Does Not Ensure Competence: A Research Synthesis. J Grad Med Educ. 2017;9(2):201-208. PubMed
16. Balshem H, Helfand M, Schunemann HJ, et al. GRADE guidelines: 3. Rating the quality of evidence. J Clin Epidemiol. 2011;64(4):401-406. PubMed
17. Guyatt GH, Oxman AD, Kunz R, et al. GRADE guidelines: 8. Rating the quality of evidence—indirectness. J Clin Epidemiol. 2011;64(12):1303-1310. PubMed
18. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63-S67. PubMed
19. Grover S, Currier PF, Elinoff JM, Mouchantaf KJ, Katz JT, McMahon GT. Development of a test to evaluate residents knowledge of medical procedures. J Hosp Med. 2009;4(7):430-432. PubMed
20. Millington SJ, Wong RY, Kassen BO, Roberts JM, Ma IWY. Improving internal medicine residents’ performance, knowledge, and confidence in central venous catheterization using simulators. J Hosp Med. 2009;4(7):410-416. PubMed
21. Lenchus JD, Carvalho CM, Ferreri K, et al. Filling the void: defining invasive bedside procedural competency for internal medicine residents. J Grad Med Educ. 2013;5(4):605-612. PubMed
22. Heegeman DJ, Kieke B Jr. Learning curves, credentialing, and the need for ultrasound fellowships. Acad Emerg Med. 2003;10:404-405. PubMed
23. Jang TB, Ruggeri W, Dyne P, Kaji AH. The learning curve of resident physicians using emergency ultrasonography for cholelithiasis and cholecystitis. Acad Emerg Med. 2010;17(11):1247-1252. PubMed
24. Akhtar MI, Hamid M. Ultrasound guided central venous access; a review of literature. Anaesth Pain Intensive Care. 2015;19:317-322. 
25. Bahl A, Yunker A. Assessment of the numbers–based model for evaluation of resident competency in emergency ultrasound core applications. J Emerg Med Trauma Acute Care. 2015;2015(5). doi:10.5339/jemtac.2015.5 
26. Cazes N, Desmots F, Geffroy Y, Renard A, Leyral J, Chaumoitre K. Emergency ultrasound: a prospective study on sufficient adequate training for military doctors. Diagn Interv Imaging. 2013;94(11):1109-1115. PubMed
27. Arntfield RT, Millington SJ, Ainsworth CD, et al. Canadian recommendations for critical care ultrasound training and competency for the Canadian critical care society. Can Respir J. 2014;21(16):341-345. 
28. Bolsin S, Colson M. The use of the Cusum technique in the assessment of trainee competence in new procedures. Int J Qual Health Care. 2000;12(5):433-438. PubMed
29. de Oliveira Filho GR, Helayel PE, da Conceição DB, Garzel IS, Pavei P, Ceccon MS. Learning curves and mathematical models for interventional ultrasound basic skills. Anesth Analg. 2008;106(2):568-573. PubMed
30. Starkie T, Drake EJ. Assessment of procedural skills training and performance in anesthesia using cumulative sum analysis (cusum). Can J Anaesth. 2013;60(12):1228-1239. PubMed
31. Tierney D. Competency cut-point identification derived from a mastery learning cohort approach: A hybrid model. Ultrasound Med Biol. 2015;41:S19. 
32. Rankin JH, Elkhunovich MA, Rangarajan V, Chilstrom M, Mailhot T. Learning Curves for Ultrasound Assessment of Lumbar Puncture Insertion Sites: When is Competency Established? J Emerg Med. 2016;51(1):55-62. PubMed
33. Klasko SK, Cummings RV, Glazerman LR. Resident data collection: Do the numbers add up? Am J Obstet Gynecol. 1995;172(4 Pt 1):1312-1316. PubMed
34. Tierney D. Development & analysis of a mobile POCUS tracking tool. Ultrasound Med Biol. 2015;41(suppl 4):S31. 
35. Sethi MV, Zimmer J, Ure B, Lacher M. Prospective assessment of complications on a daily basis is essential to determine morbidity and mortality in routine pediatric surgery. J Pediatr Surg. 2016;51(4):630-633. PubMed
36. Fisher JC, Kuenzler KA, Tomita SS, Sinha P, Shah P, Ginsburg HB. Increased capture of pediatric surgical complications utilizing a novel case-log web application to enhance quality improvement. J Pediatr Surg. 2017;52(1):166-171. PubMed
37. Rethans JJ, Norcini JJ, Barón-Maldonado M, et al. The relationship between competence and performance: implications for assessing practice performance. Med Educ. 2002;36(10):901-909. PubMed
38. Duncan DR, Morgenthaler TI, Ryu JH, Daniels CE. Reducing iatrogenic risk in thoracentesis: establishing best practice via experiential training in a zero-risk environment. Chest. 2009;135(5):1315-1320. PubMed
39. Society of Critical Care Medicine Ultrasound Certification Task Force. Recommendations for achieving and maintaining competence and credentialing in critical care ultrasound with focused cardiac ultrasound and advanced critical care echocardiography. http://journals.lww.com/ccmjournal/Documents/Critical%20Care%20Ultrasound.pdf. Published 2013. Accessed February 2, 2017.
40. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002;77(5):361-367. PubMed
41. Clark EG, Paparello JJ, Wayne DB, et al. Use of a national continuing medical education meeting to provide simulation-based training in temporary hemodialysis catheter insertion skills: a pre-test post-test study. Can J Kidney Health Dis. 2014;1:25-31. PubMed
42. Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation-based education with mastery learning improves residents’ lumbar puncture skills. Neurology. 2012;79(2):132-137. PubMed
43. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37(10):2697-2701. PubMed
44. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094-1102. PubMed
45. Shah J, Darzi A. Surgical skills assessment: an ongoing debate. BJU Int. 2001;88(7):655-660. PubMed
46. Lamperti M, Bodenham AR, Pittiruti M, et al. International evidence-based recommendations on ultrasound-guided vascular access. Intensive Care Med. 2012;38(7):1105-1117. PubMed
47. Tolsgaard MG, Todsen T, Sorensen JL, et al. International multispecialty consensus on how to evaluate ultrasound competence: a Delphi consensus survey. PLOS One. 2013;8(2):e57687. doi:10.1371/journal.pone.0057687 PubMed
48. Moureau N, Laperti M, Kelly LJ, et al. Evidence-based consensus on the insertion of central venous access devices: definition of minimal requirements for training. Br J Anaesth. 2013;110(3):347-356. PubMed
49. Feldman LS, Hagarty S, Ghitulescu G, Stanbridge D, Fried GM. Relationship between objective assessment of technical skills and subjective in-training evaluations in surgical residents. J Am Coll Surg. 2004;198(1):105-110. PubMed
50. Baker S, Willey B, Mitchell C. The attempt to standardize technical and analytic competence in sonography education. J Diagn Med Sonogr. 2011;27(5):203-211. 
51. Tolsgaard MG, Ringsted C, Dreisler E, et al. Reliable and valid assessment of ultrasound operator competence in obstetrics and gynecology. Ultrasound Obstet Gynecol. 2014;43(4):437-443. PubMed
52. Rice J, Crichlow A, Baker M, et al. An assessment tool for the placement of ultrasound-guided peripheral intravenous access. J Grad Med Educ. 2016;8(2):202-207. PubMed
53. Hartman N, Wittler M, Askew K, Hiestand B, Manthey D. Validation of a performance checklist for ultrasound-guided internal jugular central lines for use in procedural instruction and assessment. Postgrad Med J. 2017;93(1096):67-70. PubMed
54. Primdahl SC, Todsen T, Clemmesen L, et al. Rating scale for the assessment of competence in ultrasound-guided peripheral vascular access—a Delphi Consensus Study. J Vasc Access. 2016;17(5):440-445. 
55. Berg D, Berg K, Riesenberg LA, et al. The development of a validated checklist for thoracentesis: preliminary results. Am J Med Qual. 2013;28(3):220-226. PubMed
56. Berg K, Riesenberg LA, Berg D, et al. The development of a validated checklist for radial arterial line placement: preliminary results. Am J Med Qual. 2014;29(3):242-246. PubMed
57. Walzak A, Bacchus M, Schaefer MP, et al. Diagnosing technical competence in six bedside procedures: comparing checklists and a global rating scale in the assessment of resident performance. Acad Med. 2015;90(8):1100-1108. PubMed
58. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for femoral venous catheterization: preliminary results. Am J Med Qual. 2014;29(5):445-450. PubMed
59. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for paracentesis: preliminary results. Am J Med Qual. 2013;28(3):227-231. PubMed
60. Huang GC, Newman LR, Schwartzstein RM, et al. Procedural competence in internal medicine residents: validity of a central venous catheter insertion assessment instrument. Acad Med. 2009;84(8):1127-1134. PubMed
61. Salamonsen M, McGrath D, Steiler G, et al. A new instrument to assess physician skill at thoracic ultrasound, including pleural effusion markup. Chest. 2013;144(3):930-934. PubMed
62. Boniface K, Yarris LM. Emergency ultrasound: Leveling the training and assessment landscape. Acad Emerg Med. 2014;21(7):803-805. PubMed
63. Boyle E, O’Keeffe D, Naughton P, Hill A, McDonnell C, Moneley D. The importance of expert feedback during endovascular simulator training. J Vasc Surg. 2011;54(1):240-248.e1. PubMed
64. Langhan TS, Rigby IJ, Walker IW, Howes D, Donnon T, Lord JA. Simulation-based training in critical resuscitation procedures improves residents’ competence. CJEM. 2009;11(6):535-539. PubMed
65. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4(7):397-403. PubMed
66. Lenchus JD. End of the “see one, do one, teach one” era: the next generation of invasive bedside procedural instruction. J Am Osteopath Assoc. 2010;110(6):340-346. PubMed
67. Barsuk JH, Cohen ER, Vozenilek JA, O’Connor LM, McGaghie WC, Wayne DB. Simulation-based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012;4(1):23-27. PubMed
68. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86(6):706-711. PubMed
69. Ross JG. Simulation and psychomotor skill acquisition: A review of the literature. Clin Simul Nurs. 2012;8(9):e429-e435. 
70. Barsuk JH, Cohen ER, Potts S, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014;23(9):749-756. PubMed
71. McSparron JI, Michaud GC, Gordan PL, et al. Simulation for skills-based education in pulmonary and critical care medicine. Ann Am Thorac Soc. 2015;12(4):579-586. PubMed
72. Kneebone RL, Scott W, Darzi A, Horrocks M. Simulation and clinical practice: strengthening the relationship. Med Educ. 2004;38(10):1095-1102. PubMed
73. Mema B, Harris I. The barriers and facilitators to transfer of ultrasound-guided central venous line skills from simulation to practice: exploring perceptions of learners and supervisors. Teach Learn Med. 2016;28(2):115-124. PubMed
74. Castanelli DJ. The rise of simulation in technical skills teaching and the implications for training novices in anaesthesia. Anaesth Intensive Care. 2009;37(6):903-910. PubMed
75. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014;48(4):375-385. PubMed
76. Langlois SLP. Focused ultrasound training for clinicians. Crit Care Med. 2007;35(5 suppl):S138-S143.
77. Price S, Via G, Sloth E, et al. Echocardiography practice, training and accreditation in the intensive care: document for the World Interactive Network Focused on Critical Ultrasound (WINFOCUS). Cardiovasc Ultrasound. 2008;6:49-83. PubMed
78. Blehar DJ, Barton B, Gaspari RJ. Learning curves in emergency ultrasound education. Acad Emerg Med. 2015;22(5):574-582. PubMed
79. Ault MJ, Rosen BT, Ault B. The use of tissue models for vascular access training. Phase I of the procedural patient safety initiative. J Gen Intern Med. 2006;21(5):514-517. PubMed
80. Barsuk JH, Cohen ER, McGaghie WC, Wayne DB. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med. 2010;85(10 Suppl):S9-S12. PubMed
81. Sliman S, Amundson S, Shaw D, Phan JN, Waalen J, Kimura B. Recently-acquired cardiac ultrasound skills are rapidly lost when not used: implications for competency in physician imaging. J Am Coll Cardiol. 2016;67(13S):1569. 
82. Kessler CS, Leone KA. The current state of core competency assessment in emergency medicine and a future research agenda: recommendations of the working group on assessment of observable learner performance. Acad Emerg Med. 2012;19(12):1354-1359. PubMed
83. Chang A, Schyve PM, Croteau RJ, O’Leary DS, Loeb JM. The JCAHO patient safety event taxonomy: a standardized terminology and classification schema for near misses and adverse events. Int J Qual Health Care. 2005;17(2):95-105. PubMed
84. Sawyer T, White M, Zaveri P, et al. Learn, see, practice, prove, do, maintain: an evidence-based pedagogical framework for procedural skill training in medicine. Acad Med. 2015;90(8):1025-1033. PubMed
85. Das D, Kapoor M, Brown C, Ndubuisi A, Gupta S. Current status of emergency department attending physician ultrasound credentialing and quality assurance in the United States. Crit Ultrasound J. 2016;8(1):6-12. PubMed
86. Ndubuisi AK, Gupta S, Brown C, Das D. Current status and future issues in emergency department attending physician ultrasound credentialing. Ann Emerg Med. 2014;64(45):S27-S28. 
87. Tandy TK, Hoffenberg S. Emergency department ultrasound services by emergency physicians: model for gaining hospital approval. Ann Emerg Med. 1997;29(3):367-374. PubMed
88. Lewiss RE, Saul T, Del Rios M. Acquiring credentials in bedside ultrasound: a cross-sectional survey. BMJ Open. 2013;3:e003502. doi:10.1136/bmjopen-2013-003502 PubMed
89. Lanoix R. Credentialing issues in emergency ultrasonography. Emerg Med Clin North Am. 1997;15(4):913-920. PubMed
90. Scalea T, Rodriquez A, Chiu WC, et al. Focused assessment with sonography for trauma (FAST): results from an international consensus conference. J Trauma. 1999;46(3):466-472. PubMed
91. Hertzberg BS, Kliewer MA, Bowie JD, et al. Physician training requirements in sonography: how many cases are needed for competence? AJR. 2000;174(5):1221-1227. PubMed
92. Blaivas M, Theodoro DL, Sierzenski P. Proliferation of ultrasound fellowships in emergency medicine: how do we ensure future experts are expertly trained? Acad Emerg Med. 2002;9(8):863-864. PubMed
93. Bodenham AR. Editorial II: Ultrasound imaging by anaesthetists: training and accreditation issues. Br J Anaesth. 2006;96(4):414-417. PubMed
94. Williamson JP, Twaddell SH, Lee YCG, et al. Thoracic ultrasound recognition of competence: A position paper of the Thoracic Society of Australia and New Zealand. Respirology. 2017;22(2):405-408. PubMed
95. Harrison G. Summative clinical competency assessment: a survey of ultrasound practitioners’ views. Ultrasound. 2015;23(1):11-17. PubMed
96. Evans LV, Morse JL, Hamann CJ, Osborne M, Lin Z, D'Onofrio G. The development of an independent rater system to assess residents' competence in invasive procedures. Acad Med. 2009;84(8):1135-1143. PubMed
97. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357(9260):945-949. PubMed
98. Arntfield RT. The utility of remote supervision with feedback as a method to deliver high-volume critical care ultrasound training. J Crit Care. 2015;30(2):441.e1-e6. PubMed
99. Akhtar S, Theodoro D, Gaspari R, et al. Resident training in emergency ultrasound: consensus recommendations from the 2008 Council of Emergency Residency Directors Conference. Acad Emerg Med. 2009;16:S32-S36. PubMed
100. Yu E. The assessment of technical skills in a cardiology training program: is the ITER sufficient? Can J Cardiol. 2000;16(4):457-462. PubMed
101. Todsen T, Tolsgaard MG, Olsen BH, et al. Reliable and valid assessment of point-of-care ultrasonography. Ann Surg. 2015;261(2):309-315. PubMed
102. Stein JC, Nobay F. Emergency department ultrasound credentialing: a sample policy and procedure. J Emerg Med. 2009;37(2):153-159. PubMed
103. Chen FM, Burstin H, Huntington J. The importance of clinical outcomes in medical education research. Med Educ. 2005;39(4):350-351. PubMed
104. Dressler DD, Pistoria MJ, Budnitz TL, McKean SCW, Amin AN. Core competencies in hospital medicine: development and methodology. J Hosp Med. 2006;1:48-56. PubMed
105. ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5(1):157-158. PubMed
106. Castillo J, Caruana CJ, Wainwright D. The changing concept of competence and categorisation of learning outcomes in Europe: Implications for the design of higher education radiography curricula at the European level. Radiography. 2011;17(3):230-234. 
107. Goldstein SR. Accreditation, certification: why all the confusion? Obstet Gynecol. 2007;110(6):1396-1398. PubMed
108. Moore CL. Credentialing and reimbursement in point-of-care ultrasound. Clin Pediatr Emerg Med. 2011;12(1):73-77. PubMed
109. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542-547. PubMed
110. Abuhamad AZ, Benacerraf BR, Woletz P, Burke BL. The accreditation of ultrasound practices: impact on compliance with minimum performance guidelines. J Ultrasound Med. 2004;23(8):1023-1029. PubMed

Issue
Journal of Hospital Medicine 13(2)
Page Number
126-135. Published online first January 17, 2018

© 2018 Society of Hospital Medicine

Correspondence Location
Current Affiliation - Brian P. Lucas, MD, MS, 215 N Main Street, White River Junction, VT; Telephone: 802-295-9363 extension 4314; Fax: 802-296-6325; E-mail: brian.p.lucas@dartmouth.edu

Hospital Privileging Practices for Bedside Procedures: A Survey of Hospitalist Experts

Article Type
Changed
Fri, 12/14/2018 - 07:53

Performance of 6 bedside procedures (paracentesis, thoracentesis, lumbar puncture, arthrocentesis, central venous catheter [CVC] placement, and arterial line placement) are considered core competencies for hospitalists.1 Yet, the American Board of Internal Medicine (ABIM) no longer requires demonstration of manual competency for bedside procedures, and graduates may enter the workforce with minimal or no experience performing such procedures.2 As such, the burden falls on hospital privileging committees to ensure providers have the necessary training and experience to competently perform invasive procedures before granting institutional privileges to perform them.3 Although recommendations for privileging to perform certain surgical procedures have been proposed,4,5 there are no widely accepted guidelines for initial or ongoing privileging of common invasive bedside procedures performed by hospitalists, and current privileging practices vary significantly.

In 2015, the Society of Hospital Medicine (SHM) set up a Point-of-Care Ultrasound (POCUS) Task Force to draft evidence-based guidelines on the use of ultrasound to perform bedside procedures. The recommendations for certification of competency in ultrasound-guided procedures may guide institutional privileging. The purpose of this study was to better understand current hospital privileging practices for invasive bedside procedures both with and without ultrasound guidance and how current practices are perceived by experts.

METHODS

Study Design, Setting, and Participants

After approval by the University of Texas Health Science Center at San Antonio Institutional Review Board, we conducted a survey of hospital privileging processes for bedside procedures from a convenience sample of hospitalist procedure experts on the SHM POCUS Task Force. All 21 hospitalists on the task force were invited to participate, including the authors of this article. These hospitalists represent 21 unique institutions, and all have clinical, educational, and/or research expertise in ultrasound-guided bedside procedures.

Survey Design

A 26-question, electronic survey on privileging for bedside procedures was conducted (Appendix A). Twenty questions addressed procedures in general, such as minimum numbers of procedures required and use of simulation. Six questions focused on the use of ultrasound guidance. To provide context, many questions were framed to assess a privileging process being drafted by the task force. Answers were either multiple choice or free text.

Data Collection and Analysis

All members of the task force were invited to complete the survey by e-mail during November 2016. A reminder e-mail was sent on the day after initial distribution. No compensation was offered, and participation was not required. Survey results were compiled electronically through Research Electronic Data Capture (REDCap; Nashville, Tennessee), and data analysis was performed with Stata version 14 (College Station, Texas). Means of current and recommended minimum thresholds were calculated by excluding responses of “I don’t know,” and responses of “no minimum number threshold” were coded as 0.
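
For example, the coding rule described above (excluding “I don’t know” and coding “no minimum number threshold” as 0) could be reproduced with a few lines of code; the snippet below is illustrative only, as the actual analysis was performed in Stata.

```python
def mean_threshold(responses):
    """Mean of reported minimum thresholds, excluding "I don't know"
    responses and coding "no minimum number threshold" as 0."""
    coded = [0 if r == "no minimum number threshold" else r
             for r in responses
             if r != "I don't know"]
    return sum(coded) / len(coded) if coded else None

print(mean_threshold([5, 10, "no minimum number threshold", "I don't know", 3]))
# (5 + 10 + 0 + 3) / 4 = 4.5
```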

RESULTS

The survey response rate was 100% (21 of 21). All experts were hospitalists, but 2 also identified themselves as intensivists. Experts practiced in a variety of hospital settings, including private university hospitals (43%), public university hospitals (19%), Veterans Affairs teaching hospitals (14%), community teaching hospitals (14%), and community nonteaching hospitals (10%). Most hospitals (90%) were teaching hospitals for internal medicine trainees. All experts personally performed bedside procedures on a regular basis, and most (86%) had leadership roles in teaching procedures to students, residents, fellows, physician assistants, nurse practitioners, and/or physicians. Approximately half (57%) were involved in granting privileges for bedside procedures at their institutions.

Most hospitals do not require the use of ultrasound guidance for the privileging of any procedure, but ultrasound guidance was reported to be routinely used for paracentesis (100%), thoracentesis (95%), and CVC placement (95%). Ultrasound guidance was less common for arterial line placement (57%), lumbar puncture (33%), and arthrocentesis (29%). There was strong agreement that ultrasound guidance ought to be required for initial and ongoing privileging of CVC placement, thoracentesis, and paracentesis. But there was less agreement for arterial line placement, arthrocentesis, and lumbar puncture (Figure 1).

Only half of the experts reported that their hospitals required a minimum number of procedures to earn initial (48%) or ongoing (52%) privileges to perform bedside procedures. Nevertheless, most experts thought there ought to be minimum numbers of procedures for initial (81%) and ongoing (81%) privileging, recommending higher minimums for both initial and ongoing privileging than are currently required at their hospitals (Figure 2).

The average difference between suggested and current minimum numbers of procedures required for initial privileging was 4.7 for paracentesis, 5.8 for thoracentesis, 5.8 for CVC insertion, 5.4 for lumbar puncture, 4.8 for arterial line insertion, and 3.6 for arthrocentesis. The average difference between suggested and current minimum numbers of yearly procedures required for ongoing privileging was 2.0 for paracentesis, 2.8 for thoracentesis, 2.9 for CVC insertion, 1.9 for lumbar puncture, 2.1 for arterial line insertion, and 2.5 for arthrocentesis (Appendix B).

Most hospitalist procedure experts thought that simulation training (67%) and direct observation of procedural skills (71%) should be core components of an initial privileging process. Many of the experts who did not agree with direct observation or simulation training as core components of initial privileging had concerns about feasibility with respect to manpower, availability of simulation equipment, and costs. In contrast, the majority (67%) did not think it was necessary to directly observe providers for ongoing privileging as long as routine monitoring for periprocedural complications, which all experts (100%) agreed should be in place, was being performed.


DISCUSSION

Our survey identified 3 distinct differences between hospitalist procedure experts’ recommendations and their own hospitals’ current privileging practices. First, whereas experts recommended ultrasound guidance for thoracentesis, paracentesis, and CVC placement, it is rarely a current requirement. Second, experts recommended requiring minimum numbers of procedures for both initial and ongoing privileging even though such minimums are not currently required at half of their hospitals. Third, recommended minimum numbers were generally higher than those currently in place.

The routine use of ultrasound guidance for thoracentesis, paracentesis, and CVC placement is likely a result of increased adoption based on the literature showing clinical benefits.6-9 Thus, the expert recommendations for required use of ultrasound guidance for these procedures seem both appropriate and feasible. The procedure minimums identified in our study are similar to prior ABIM guidelines when manual competency was required for board certification in internal medicine and are comparable to recent minimums proposed by the Society of Critical Care Medicine, both of which recommended a minimum of 5 to 10 per procedure.10,11 Nevertheless, no commonly agreed-upon minimum number of procedures currently exists for certification of competency, and the variability seen in the experts’ responses further supports the idea that no specific number will guarantee competence. Thus, while requiring minimum numbers of procedures was generally considered necessary by our experts, minimums alone were also considered insufficient for initial privileging because most recommended that direct observation and simulation should be part of an initial privileging process.

These findings encourage more rigorous requirements for both initial and ongoing privileging of procedures. Nevertheless, our findings were rarely unanimous. The most frequently cited reasons for disagreement were the feasibility of and capacity for direct observation and the absence of ultrasound equipment or simulators, particularly in resource-limited clinical environments.

Our study has several strengths and limitations. One strength is the recruitment of a panel of hospitalist procedure experts from diverse geographic and hospital settings; nonetheless, we acknowledge that our findings may not be generalizable to other specialties. Another strength is that we obtained 100% participation from the experts surveyed. Weaknesses of this study include the relatively small number of experts, who are likely to be biased in favor of both the use of ultrasound guidance and higher standards for privileging. We also relied on self-reported data about privileging processes rather than direct observation of those practices. Finally, questions were framed in the context of only 1 possible privileging pathway, and experts may have responded differently to a different framing.

CONCLUSION

Our findings may guide the development of more standardized frameworks for initial and ongoing privileging of hospitalists for invasive bedside procedures. In particular, additional privileging requirements may include the routine use of ultrasound guidance for paracentesis, thoracentesis, and CVC insertion; simulation preceding direct observation of manual skills if possible; and higher required minimums of procedures for both initial and ongoing privileging. The goal of a standardized framework for privileging should be directed at improving the quality and safety of bedside procedures but must consider feasibility in diverse clinical settings where hospitalists work.

Acknowledgments

The authors thank the hospitalists on the SHM POCUS Task Force who provided data about their institutions’ privileging processes and requirements. They are also grateful to Loretta M. Grikis, MLS, AHIP, at the White River Junction Veterans Affairs Medical Center for her assistance as a medical librarian.

Disclosure

Brian P. Lucas (U.S. Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development, Dartmouth SYNERGY, and the National Institutes of Health National Center for Advancing Translational Sciences [UL1TR001086]). Nilam Soni (U.S. Department of Veterans Affairs and Quality Enhancement Research Initiative Partnered Evaluation Initiative grant [HX002263-01A1]). The contents of this publication do not represent the views of the U.S. Department of Veterans Affairs or the U.S. Government.

References

1. Nichani S, Crocker J, Fitterman N, Lukela M. Updating the core competencies in hospital medicine—2017 revision: introduction and methodology. J Hosp Med. 2017;12(4):283-287. PubMed
2. American Board of Internal Medicine. Policies and Procedures for Certification. http://www.abim.org/certification/policies/imss/im.aspx - procedures. Published July 2016. Accessed November 8, 2016.
3. Department of Health & Human Services. Centers for Medicare & Medicaid Services (CMS) Requirements for Hospital Medical Staff Privileging. Centers for Medicare and Medicaid Services website. https://www.cms.gov/Medicare/Provider-Enrollment-and-Certification/SurveyCertificationGenInfo/downloads/SCLetter05-04.pdf. Published November 12, 2004. Accessed November 8, 2016.
4. Blackmon SH, Cooke DT, Whyte R, et al. The Society of Thoracic Surgeons Expert Consensus Statement: A tool kit to assist thoracic surgeons seeking privileging to use new technology and perform advanced procedures in general thoracic surgery. Ann Thorac Surg. 2016;101(3):1230-1237. PubMed
5. Bhora FY, Al-Ayoubi AM, Rehmani SS, Forleiter CM, Raad WN, Belsley SG. Robotically assisted thoracic surgery: proposed guidelines for privileging and credentialing. Innovations (Phila). 2016;11(6):386-389. PubMed
6. Mercaldi CJ, Lanes SF. Ultrasound guidance decreases complications and improves the cost of care among patients undergoing thoracentesis and paracentesis. Chest. 2013;143:532-538. PubMed
7. Patel PA, Ernst FR, Gunnarsson CL. Ultrasonography guidance reduces complications and costs associated with thoracentesis procedures. J Clin Ultrasound. 2012;40:135-141. PubMed
8. Brass P, Hellmich M, Kolodziej L, Schick G, Smith AF. Ultrasound guidance versus anatomical landmarks for internal jugular vein catheterization. Cochrane Database Syst Rev. 2015;1:CD006962. DOI: 10.1002/14651858.CD006962.pub2. PubMed
9. Brass P, Hellmich M, Kolodziej L, Schick G, Smith AF. Ultrasound guidance versus anatomical landmarks for subclavian or femoral vein catheterization. Cochrane Database Syst Rev. 2015;1:CD011447. DOI: 10.1002/14651858.CD011447. PubMed
10. American Board of Internal Medicine. Policies and Procedures. Philadelphia, PA; July 1990.
11. Society of Critical Care Medicine Ultrasound Certification Task Force. Recommendations for Achieving and Maintaining Competence and Credentialing in Critical Care Ultrasound with Focused Cardiac Ultrasound and Advanced Critical Care Echocardiography. http://journals.lww.com/ccmjournal/Documents/Critical%20Care%20Ultrasound.pdf. Published 2013. Accessed November 8, 2016. 






Issue
Journal of Hospital Medicine 12(10)
Page Number
836-839. Published online first September 6, 2017.
Article Source

© 2017 Society of Hospital Medicine

Correspondence Location
Trevor P. Jensen, MD, MS, Assistant Clinical Professor, UCSF Division of Hospital Medicine, 533 Parnassus Ave., UC Hall, San Francisco, CA 94143; Telephone: XXX-XXX-XXXX; Fax: XXX-XXX-XXXX; E-mail: Trevor.Jensen@ucsf.edu

Certification of Point-of-Care Ultrasound Competency

Article Type
Changed
Fri, 12/14/2018 - 08:03

Any conversation about point-of-care ultrasound (POCUS) inevitably brings up discussion about credentialing, privileging, and certification. While credentialing and privileging are institution-specific processes, competency certification can be extramural through a national board or intramural through an institutional process.

Currently, no broadly accepted national board certification for POCUS exists; however, some specialty boards, such as emergency medicine, already include competency in POCUS. Thus, many institutions grant POCUS privileges to emergency medicine physicians based solely on their national board certification. In contrast, most hospitalists are certified by the American Board of Internal Medicine, which does not include competency in POCUS. Some hospitalists have pursued extramural certificate programs offered by professional organizations, such as the American College of Chest Physicians. The currently available extramural certificate programs can certify basic competency in POCUS knowledge and skills, but none of them can deem a provider competent in POCUS, which requires mastery of knowledge, image acquisition, image interpretation, and clinical integration (Figure). Image acquisition and interpretation skills are learned at varying rates, and those skills, along with an understanding of how to integrate POCUS findings into the clinical care of patients, cannot be acquired in a weekend training course.1

Some institutions have begun to develop intramural certification pathways for POCUS competency in order to grant privileges to hospitalists. In this edition of the Journal of Hospital Medicine, Mathews and Zwank2 describe a multidisciplinary collaboration to provide POCUS training, intramural certification, and quality assurance for hospitalists at one hospital in Minnesota. This model serves as a real-world example of how institutions are addressing the need to certify hospitalists in basic POCUS competency. After engaging stakeholders from radiology, critical care, emergency medicine, and cardiology, institutional standards were developed and hospitalists were assessed for basic POCUS competency. Certification included assessments of hospitalists’ knowledge, image acquisition, and image interpretation skills. The model described by Mathews and Zwank did not assess competency in clinical integration but laid the groundwork for future evaluation of clinical outcomes in the cohort of certified hospitalists.

Although experts may not agree on all aspects of competency in POCUS, most will agree with the basic principles outlined by Mathews and Zwank. Initial certification should be based on training and an initial assessment of competency. Components of training should include ultrasound didactics, mentored hands-on practice, independent hands-on practice, and image interpretation practice. Ongoing certification should be based on quality assurance incorporated with an ongoing assessment of skills. Additionally, most experts will agree that competency can be recognized and that formative and summative assessments combining a gestalt of provider skills with checklist-based quantitative scoring systems are likely the best approach.

The real question is, what is the goal of certification of POCUS competency? Development of an institutional certification process demands substantive resources from the institution and time from providers. Given the large number of providers that use POCUS, institutions would have to invest in equipment and staff to operate a full-time certification program and justify why substantive resources are being dedicated to certifying POCUS skills and not others. Providers may be dissuaded from using POCUS if certification requirements are burdensome, which has potential negative consequences, such as reverting back to performing bedside procedures without ultrasound guidance or referring all patients to interventional radiology.

Conceptually, one may speculate that certification is required for providers to bill for POCUS exams; in fact, certification is not required to bill, although institutions may require certification before granting privileges to use POCUS. However, based on the experience of emergency medicine, a specialty that has been using POCUS for more than 20 years, billing may not be the main driver of POCUS use. A recent review of 2012 Medicare data revealed that <1% of emergency medicine providers received reimbursement for limited ultrasound exams.3 Despite the Accreditation Council for Graduate Medical Education (ACGME) requirement for POCUS competency of all graduating emergency medicine residents since 2001 and the increasing POCUS use reported by emergency medicine physicians,4,5 most emergency medicine physicians are not billing for POCUS exams. Perhaps use of POCUS as a “quick look” or an extension of the physical examination is more common than previously thought. Although billing for POCUS exams can generate some clinical revenue, the benefits to the healthcare system of expediting care,6,7 reducing ancillary testing,8,9 and reducing procedural complications10,11 likely outweigh the small gains from billing for limited ultrasound exams. As healthcare payment models evolve to reward healthcare systems for achieving good outcomes rather than for services rendered, certification for the sole purpose of billing may become obsolete. Furthermore, concerns that billing will increase the medical liability of using POCUS are likely overstated because few lawsuits have resulted from missed diagnoses by POCUS, and most lawsuits have resulted from failure to perform a POCUS exam in a timely manner.12,13

Many medical students graduating today have had some training in POCUS14 and, as this new generation of physicians enters the workforce, they will likely view POCUS as part of their routine bedside evaluation of patients. If POCUS training is integrated into medical school and residency curricula, and national board certification incorporates basic POCUS competency, then most institutions may no longer feel obligated to certify POCUS competency locally, and institutional certification programs, such as the one described by Mathews and Zwank, would become obsolete.

For now, until all providers enter the workforce with basic competency in POCUS and medical culture accepts that ultrasound is a diagnostic tool available to any trained provider, hospitalists may need to provide proof of their competence through intramural or extramural certification. The work of Mathews and Zwank provides an example of how local certification processes can be established. In a future edition of the Journal of Hospital Medicine, the Society of Hospital Medicine Point-of-Care Ultrasound Task Force will present a position statement with recommendations for certification of competency in bedside ultrasound-guided procedures.


Disclosure

Nilam Soni receives support from the U.S. Department of Veterans Affairs, Quality Enhancement Research Initiative (QUERI) Partnered Evaluation Initiative Grant (HX002263-01A1). Brian P. Lucas receives support from the Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development and Dartmouth SYNERGY, National Institutes of Health, National Center for Translational Science (UL1TR001086). The contents of this publication do not represent the views of the U.S. Department of Veterans Affairs or the United States Government.

References

1. Bahner DP, Hughes D, Royall NA. I-AIM: a novel model for teaching and performing focused sonography. J Ultrasound Med. 2012;31:295-300. PubMed

2. Mathews BK, Zwank M. Hospital Medicine Point of Care Ultrasound Credentialing: An Example Protocol. J Hosp Med. 2017;12(9):767-772. PubMed

3. Hall MK, Hall J, Gross CP, et al. Use of Point-of-Care Ultrasound in the Emergency Department: Insights From the 2012 Medicare National Payment Data Set. J Ultrasound Med. 2016;35:2467-2474. PubMed

4. Amini R, Wyman MT, Hernandez NC, Guisto JA, Adhikari S. Use of Emergency Ultrasound in Arizona Community Emergency Departments. J Ultrasound Med. 2017;36(5):913-921. PubMed

5. Herbst MK, Camargo CA, Jr., Perez A, Moore CL. Use of Point-of-Care Ultrasound in Connecticut Emergency Departments. J Emerg Med. 2015;48:191-196. PubMed

6. Kory PD, Pellecchia CM, Shiloh AL, Mayo PH, DiBello C, Koenig S. Accuracy of ultrasonography performed by critical care physicians for the diagnosis of DVT. Chest. 2011;139:538-542. PubMed

7. Lucas BP, Candotti C, Margeta B, et al. Hand-carried echocardiography by hospitalists: a randomized trial. Am J Med. 2011;124:766-774. PubMed

8. Oks M, Cleven KL, Cardenas-Garcia J, et al. The effect of point-of-care ultrasonography on imaging studies in the medical ICU: a comparative study. Chest. 2014;146:1574-1577. PubMed

9. Koenig S, Chandra S, Alaverdian A, Dibello C, Mayo PH, Narasimhan M. Ultrasound assessment of pulmonary embolism in patients receiving CT pulmonary angiography. Chest. 2014;145:818-823. PubMed

10. Mercaldi CJ, Lanes SF. Ultrasound guidance decreases complications and improves the cost of care among patients undergoing thoracentesis and paracentesis. Chest. 2013;143:532-538. PubMed

11. Patel PA, Ernst FR, Gunnarsson CL. Ultrasonography guidance reduces complications and costs associated with thoracentesis procedures. J Clin Ultrasound. 2012;40:135-141. PubMed

12. Stolz L, O’Brien KM, Miller ML, Winters-Brown ND, Blaivas M, Adhikari S. A review of lawsuits related to point-of-care emergency ultrasound applications. West J Emerg Med. 2015;16:1-4. PubMed

13. Blaivas M, Pawl R. Analysis of lawsuits filed against emergency physicians for point-of-care emergency ultrasound examination performance and interpretation over a 20-year period. Am J Emerg Med. 2012;30:338-341. PubMed

14. Bahner DP, Goldman E, Way D, Royall NA, Liu YT. The state of ultrasound education in U.S. medical schools: results of a national survey. Acad Med. 2014;89:1681-1686. PubMed

Issue
Journal of Hospital Medicine 12(9)
Page Number
775-776


Any conversation about point-of-care ultrasound (POCUS) inevitably brings up discussion about credentialing, privileging, and certification. While credentialing and privileging are institution-specific processes, competency certification can be extramural through a national board or intramural through an institutional process.

Currently, no broadly accepted national board certification for POCUS exists; however, some specialty boards, such as emergency medicine, already include competency in POCUS. Thus, many institutions grant POCUS privileges to emergency medicine physicians based solely on their national board certification. In contrast, most hospitalists are certified by the American Board of Internal Medicine, which does not include competency in POCUS. Some hospitalists have pursued extramural certificate programs offered by professional organizations, such as the American College of Chest Physicians. The currently available extramural certificate programs can certify basic competency in POCUS knowledge and skills. But none of them can deem a provider competent in POCUS, which requires mastery of knowledge, image acquisition, image interpretation, and clinical integration (Figure). Image acquisition and interpretation skills are learned at varying rates. Those skills, followed by an understanding of how to integrate POCUS findings into clinical care of patients, are ones that cannot be acquired after a weekend training course.1

Some institutions have begun to develop intramural certification pathways for POCUS competency in order to grant privileges to hospitalists. In this edition of the Journal of Hospital Medicine, Mathews and Zwank2 describe a multidisciplinary collaboration to provide POCUS training, intramural certification, and quality assurance for hospitalists at one hospital in Minnesota. This model serves as a real-world example of how institutions are addressing the need to certify hospitalists in basic POCUS competency. After engaging stakeholders from radiology, critical care, emergency medicine, and cardiology, institutional standards were developed and hospitalists were assessed for basic POCUS competency. Certification included assessments of hospitalists’ knowledge, image acquisition, and image interpretation skills. The model described by Mathews and Zwank did not assess competency in clinical integration, but it laid the groundwork for future evaluation of clinical outcomes in the cohort of certified hospitalists.

Although experts may not agree on all aspects of competency in POCUS, most will agree with the basic principles outlined by Mathews and Zwank. Initial certification should be based on training and an initial assessment of competency. Components of training should include ultrasound didactics, mentored hands-on practice, independent hands-on practice, and image interpretation practice. Ongoing certification should be based on quality assurance incorporated into an ongoing assessment of skills. Additionally, most experts will agree that competency can be recognized and that formative and summative assessments combining a gestalt of provider skills with quantitative, checklist-based scoring systems are likely the best approach.

The real question is, what is the goal of certification of POCUS competency? Development of an institutional certification process demands substantive resources from the institution and time from the providers. Institutions would have to invest in equipment and staff to operate a full-time certification program, given the large number of providers who use POCUS, and would have to justify why substantive resources are being dedicated to certifying POCUS skills and not others. Providers may be dissuaded from using POCUS if certification requirements are burdensome, which has potential negative consequences, such as reverting to performing bedside procedures without ultrasound guidance or referring all patients to interventional radiology.

Conceptually, one may speculate that certification is required for providers to bill for POCUS exams. In fact, certification is not required to bill, although institutions may require certification before granting privileges to use POCUS. However, based on the experience of emergency medicine, a specialty that has used POCUS for more than 20 years, billing may not be the main driver of POCUS use. A recent review of 2012 Medicare data revealed that <1% of emergency medicine providers received reimbursement for limited ultrasound exams.3 Despite the Accreditation Council for Graduate Medical Education (ACGME) requirement for POCUS competency of all graduating emergency medicine residents since 2001 and the increasing POCUS use reported by emergency medicine physicians,4,5 most emergency medicine physicians are not billing for POCUS exams. Perhaps use of POCUS as a “quick look” or extension of the physical examination is more common than previously thought. Although billing for POCUS exams can generate some clinical revenue, the benefits to the healthcare system of expediting care,6,7 reducing ancillary testing,8,9 and reducing procedural complications10,11 likely outweigh the small gains from billing for limited ultrasound exams. As healthcare payment models evolve to reward healthcare systems for achieving good outcomes rather than for services rendered, certification for the sole purpose of billing may become obsolete. Furthermore, concerns that billing for POCUS will increase medical liability are likely overstated: few lawsuits have resulted from missed diagnoses by POCUS, and most lawsuits have stemmed from failure to perform a POCUS exam in a timely manner.12,13

Many medical students graduating today have had some training in POCUS14 and, as this new generation of physicians enters the workforce, they will likely view POCUS as part of their routine bedside evaluation of patients. If POCUS training is integrated into medical school and residency curricula, and national board certification incorporates basic POCUS competency, then most institutions may no longer feel obligated to certify POCUS competency locally, and institutional certification programs, such as the one described by Mathews and Zwank, would become obsolete.

For now, until all providers enter the workforce with basic competency in POCUS and medical culture accepts that ultrasound is a diagnostic tool available to any trained provider, hospitalists may need to provide proof of their competence through intramural or extramural certification. The work of Mathews and Zwank provides an example of how local certification processes can be established. In a future edition of the Journal of Hospital Medicine, the Society of Hospital Medicine Point-of-Care Ultrasound Task Force will present a position statement with recommendations for certification of competency in bedside ultrasound-guided procedures.

 

 

Disclosure

Nilam Soni receives support from the U.S. Department of Veterans Affairs, Quality Enhancement Research Initiative (QUERI) Partnered Evaluation Initiative Grant (HX002263-01A1). Brian P. Lucas receives support from the Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development and Dartmouth SYNERGY, National Institutes of Health, National Center for Translational Science (UL1TR001086). The contents of this publication do not represent the views of the U.S. Department of Veterans Affairs or the United States Government.

References

1. Bahner DP, Hughes D, Royall NA. I-AIM: a novel model for teaching and performing focused sonography. J Ultrasound Med. 2012;31:295-300. PubMed

2. Mathews BK, Zwank M. Hospital Medicine Point of Care Ultrasound Credentialing: An Example Protocol. J Hosp Med. 2017;12(9):767-772. PubMed

3. Hall MK, Hall J, Gross CP, et al. Use of Point-of-Care Ultrasound in the Emergency Department: Insights From the 2012 Medicare National Payment Data Set. J Ultrasound Med. 2016;35:2467-2474. PubMed

4. Amini R, Wyman MT, Hernandez NC, Guisto JA, Adhikari S. Use of Emergency Ultrasound in Arizona Community Emergency Departments. J Ultrasound Med. 2017;36(5):913-921. PubMed

5. Herbst MK, Camargo CA, Jr., Perez A, Moore CL. Use of Point-of-Care Ultrasound in Connecticut Emergency Departments. J Emerg Med. 2015;48:191-196. PubMed

6. Kory PD, Pellecchia CM, Shiloh AL, Mayo PH, DiBello C, Koenig S. Accuracy of ultrasonography performed by critical care physicians for the diagnosis of DVT. Chest. 2011;139:538-542. PubMed

7. Lucas BP, Candotti C, Margeta B, et al. Hand-carried echocardiography by hospitalists: a randomized trial. Am J Med. 2011;124:766-774. PubMed

8. Oks M, Cleven KL, Cardenas-Garcia J, et al. The effect of point-of-care ultrasonography on imaging studies in the medical ICU: a comparative study. Chest. 2014;146:1574-1577. PubMed

9. Koenig S, Chandra S, Alaverdian A, Dibello C, Mayo PH, Narasimhan M. Ultrasound assessment of pulmonary embolism in patients receiving CT pulmonary angiography. Chest. 2014;145:818-823. PubMed

10. Mercaldi CJ, Lanes SF. Ultrasound guidance decreases complications and improves the cost of care among patients undergoing thoracentesis and paracentesis. Chest. 2013;143:532-538. PubMed

11. Patel PA, Ernst FR, Gunnarsson CL. Ultrasonography guidance reduces complications and costs associated with thoracentesis procedures. J Clin Ultrasound. 2012;40:135-141. PubMed

12. Stolz L, O’Brien KM, Miller ML, Winters-Brown ND, Blaivas M, Adhikari S. A review of lawsuits related to point-of-care emergency ultrasound applications. West J Emerg Med. 2015;16:1-4. PubMed

13. Blaivas M, Pawl R. Analysis of lawsuits filed against emergency physicians for point-of-care emergency ultrasound examination performance and interpretation over a 20-year period. Am J Emerg Med. 2012;30:338-341. PubMed

14. Bahner DP, Goldman E, Way D, Royall NA, Liu YT. The state of ultrasound education in U.S. medical schools: results of a national survey. Acad Med. 2014;89:1681-1686. PubMed

Issue
Journal of Hospital Medicine 12 (9)
Page Number
775-776
Article Source
© 2017 Society of Hospital Medicine
Correspondence Location
Nilam J. Soni, MD, MS, 7703 Floyd Curl Drive, MC 7982, San Antonio, TX 78229; Telephone: 210-743-6030; Fax: 210-358-0647; E-mail: sonin@uthscsa.edu

Ultrasound Measurement to Estimate CVP

Article Type
Changed
Wed, 05/23/2018 - 14:49
Display Headline
Diagnostic accuracy of a simple ultrasound measurement to estimate central venous pressure in spontaneously breathing, critically ill patients

Severe sepsis and septic shock account for more than 750,000 hospital admissions and 215,000 deaths per year.1 Early fluid resuscitation is the cornerstone of treatment, and early goal‐directed therapy (EGDT), which includes a target central venous pressure (CVP) of 8 to 12 mm Hg, has been shown to improve outcomes, including mortality and length of stay.2 This goal allows appropriate initial resuscitation and may decrease the risk of excess fluid administration, which is related to adverse outcomes in critically ill patients.3 However, nonintensivists may not start early aggressive fluid resuscitation because of inability to accurately assess intravascular volume, concerns for inadvertent volume overload, or the difficulty of recognizing insidious illness. Assessment of volume status, primarily from inspection of the internal jugular vein to estimate CVP, is difficult to perform by clinical examination alone, especially if CVP is very low.4 Inspection of the external jugular vein is perhaps easier than inspecting the internal jugular vein and appears to accurately estimate CVP,5 but it does not allow the degree of precision necessary for EGDT. Echocardiography can estimate CVP based on respirophasic variation or collapsibility index, but this technique requires expensive equipment and sonographic expertise. The current gold standard technique for measuring CVP requires an invasive central venous catheter, which can delay timely resuscitation and is associated with complications.6

An alternative technique to guide resuscitation efforts should be accurate, safe, rapid, and easy to perform at the bedside, while providing real‐time measurement results. We hypothesized that CVP can be accurately assessed using noninvasive ultrasound imaging of the internal jugular vein, since jugular venous pressure is essentially equal to CVP.7 Specifically, our study estimated the diagnostic accuracy of ultrasound measurement of the aspect ratio (height/width) of the internal jugular vein compared with the invasively measured CVP target for EGDT. We expected that a lower aspect ratio would correlate with a lower CVP and a higher aspect ratio would correlate with a higher CVP.

Methods

Volunteers were enrolled at Saint Mary's Hospital (Mayo Clinic) in Rochester, MN, from January to March 2006, and patients were enrolled at Saint Mary's Hospital and at Abbott Northwestern Hospital (Allina Hospitals and Clinics) in Minneapolis, MN, from May 2006 to October 2007. The study was approved by the Institutional Review Boards of Mayo Clinic and Allina and had 2 phases. The first phase comprised ultrasound measurements of internal jugular vein aspect ratio and determination of intraobserver and interobserver agreement in healthy volunteers. The second phase involved measurement of internal jugular vein aspect ratio and invasive CVP in a convenience sample of 44 spontaneously breathing patients admitted to medical intensive care units: 9 patients at Saint Marys Hospital and 35 patients at Abbott Northwestern Hospital. Patients were enrolled only when study members were on duty in the intensive care unit and able to perform study measurements. As a result, a high proportion of patients who may have been eligible were not asked to participate.

Each volunteer was deemed euvolemic on the basis of normal orthostatic measurements and normal oral intake with no vomiting or diarrhea in the previous 5 days. Measurements of 19 volunteers were made by 1 author (A.S.K.), with subsequent measurements of 15 of the volunteers made by another author (O.G.) to determine interobserver variability; 4 participants did not undergo a second measurement because of scheduling conflicts.

Inclusion and exclusion criteria for the critically ill patients are provided in Table 1. Recruitment was based on presenting symptoms and test results that led the intensive care unit physicians to decide to place a CVP monitor. All the enrolled patients had invasive CVP measurement performed approximately 30 to 40 minutes after ultrasound measurement of the internal jugular vein; this delay was the time required to place the central line and obtain the measurement. All patients who were invited to participate in the study were included. No patients were excluded on the basis of the exclusion criteria or because of inability to place a central line. No complications related to central line placement occurred.

Study Inclusion and Exclusion Criteria for Critically Ill Patients
Inclusion criteria
1. Aged 18 years or older
2. Admission to the intensive care unit
3. Spontaneously breathing (not intubated/ventilated)
4. Planned insertion of a central venous pressure monitor for therapy
Exclusion criteria
1. Known cervical spine injuries or fusion
2. Nonremovable cervical collars
3. Surgical dressings that would prevent visualization of the internal jugular vein
4. Inability of the patient to be properly positioned
5. A code situation

We followed a prescribed measurement technique (Table 2) to determine the internal jugular vein aspect ratio in all volunteers and patients. Measurements of the volunteers were made with a Site‐Rite 3 Ultrasound System (Bard Access Systems, Inc., Salt Lake City, UT) using a 9.0‐MHz transducer. Measurements of the critically ill patients were made with a SonoSite MicroMaxx ultrasound system (SonoSite, Inc., Bothell, WA) using a 10.5‐MHz transducer. Study team physicians initially were blinded to actual measured CVP. Internal jugular vein aspect ratio and CVP were measured at tidal volume end‐expiration for all patients. One measurement was obtained for each patient, with measurements being made by 1 of 4 physicians (2 intensivists, 1 critical care fellow, and 1 chief medicine resident). With no specific ultrasound training and with only minimal practice, the physicians could obtain the optimal aspect ratio within a few seconds (Figure 1).

Figure 1
Measurement of aspect ratio. Cross‐sectional transverse‐plane ultrasound image shows the right internal jugular vein and the common carotid artery. The internal jugular vein aspect ratio (height/width) in this example is 0.77.
Internal Jugular Vein Measurement Process
1. Position the patient supine (0°) with head and legs flat, ensuring overall comfort. A small pillow can be used to help keep head, neck, and trunk aligned
2. Have the patient rotate his or her head slightly to the side (<30°) to expose the internal jugular vein
3. Place the transducer transversely on the patient's neck over the expected location of the internal jugular vein. The transducer should be perpendicular to the patient's neck
4. Apply slight pressure to the transducer to locate the internal jugular vein on the view screen. Use the minimum pressure necessary to obtain a good quality ultrasound image
5. Once the internal jugular vein is found, adjust the position of the transducer over the vein to obtain the most circular cross‐sectional image
6. Have the patient breathe normally, then ask him or her to briefly stop breathing at normal (tidal volume) end‐expiration
7. Store the best end‐expiration image (in which the internal jugular vein appears most circular) and have the patient resume normal breathing
8. Measure the height and width of the internal jugular vein using the built‐in cursor function or a ruler

This was an exploratory prospective study, and all methods of data collection were designed before patient enrollment. However, the ultrasound‐derived aspect ratio of 0.83 (which defined a CVP of 8 mm Hg) was determined post hoc to maximize sensitivity and specificity and was based on the aspect ratio of the euvolemic volunteers and the inflection point of the CVP vs aspect ratio curve for the critically ill patients.
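
To make the cutoff concrete, the short Python sketch below illustrates how a measured height and width translate into an aspect ratio and how the post hoc 0.83 threshold would be applied. This is only an illustration; the function name and the example measurements are hypothetical and are not taken from the study data.

# Illustrative sketch: compute the internal jugular vein aspect ratio
# (height/width) and compare it with the post hoc 0.83 cutoff that
# corresponded to an invasively measured CVP of 8 mm Hg in this study.
def classify_aspect_ratio(height_cm: float, width_cm: float, cutoff: float = 0.83) -> str:
    """Return a qualitative CVP estimate from an internal jugular vein aspect ratio."""
    if width_cm <= 0:
        raise ValueError("Width must be positive")
    aspect_ratio = height_cm / width_cm
    if aspect_ratio < cutoff:
        return f"aspect ratio {aspect_ratio:.2f}: suggests CVP < 8 mm Hg (further fluid resuscitation may be indicated)"
    return f"aspect ratio {aspect_ratio:.2f}: suggests CVP >= 8 mm Hg (less likely to benefit from fluids)"

# Hypothetical end-expiration measurements taken from a stored image
print(classify_aspect_ratio(height_cm=0.9, width_cm=1.3))   # ratio ~0.69, below the cutoff

In practice, the height and width would be measured on the stored end-expiration image obtained with the protocol in Table 2.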

Statistical Analysis

Groups were compared using the χ2 test for differences in proportions and the Wilcoxon rank sum test for continuous data. P < 0.05 was considered statistically significant. Bland‐Altman plots were used to describe the bias and variability of the aspect ratio within and between observers.8 This technique compares 2 methods of measurement to determine agreement and repeatability by plotting the mean of the differences (which should be zero) and the upper and lower limits of agreement (1.96 standard deviations [SDs] of those differences above and below the mean). Results were calculated using the available data; there was no adjustment for missing data. Analyses were performed using S-PLUS and SAS/STAT software (SAS Institute, Inc., Cary, NC).
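
As a worked illustration of the Bland‐Altman approach described above, the following sketch computes the bias (mean difference) and the 1.96-SD limits of agreement for two paired sets of aspect ratio measurements. The numbers are invented for illustration and do not reproduce the study data.

# Minimal Bland-Altman sketch for paired aspect ratio measurements
# (e.g., two observers measuring the same volunteers). Values are hypothetical.
import statistics

obs1 = [0.80, 0.85, 0.78, 0.90, 0.82, 0.79, 0.88]
obs2 = [0.83, 0.84, 0.75, 0.92, 0.80, 0.81, 0.85]

diffs = [a - b for a, b in zip(obs1, obs2)]
mean_diff = statistics.mean(diffs)        # bias; should be near zero if the observers agree
sd_diff = statistics.stdev(diffs)         # standard deviation of the differences
upper = mean_diff + 1.96 * sd_diff        # upper limit of agreement
lower = mean_diff - 1.96 * sd_diff        # lower limit of agreement

print(f"bias = {mean_diff:.3f}, limits of agreement = ({lower:.3f}, {upper:.3f})")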

Results

We first evaluated 19 white volunteers: 12 women and 7 men. Mean (SD) age was 42 (11) years and mean body mass index was 26.6 (4.5) kg/m2. Mean arterial pressure was 89 (13) mm Hg and mean heart rate was 71 (15) beats/minute. Mean aspect ratio of the right and left internal jugular vein for all volunteers was 0.82 (0.07). There was no difference in aspect ratio between the right (0.83 [0.10]) and left (0.81 [0.13]) vein (P > 0.10). Also, no difference in the aspect ratio was seen between men (0.81 [0.08]) and women (0.83 [0.07]) (P = 0.77). Bland‐Altman analysis indicated moderate intraobserver and interobserver agreement for the aspect ratio measurements (Figure 2).

Figure 2
Bland‐Altman analysis. (A,B) Intraobserver reliability for ultrasound measurements of the aspect ratio for the (A) right and (B) left internal jugular vein made by 1 observer (A.S.K.) in 19 volunteers. (C,D) Interobserver reliability for measurements of the (C) right and (D) left internal jugular vein by 2 observers (A.S.K. and O.G.) in 15 of the volunteers. Solid line represents the mean of the difference in aspect ratio; dotted lines represent the variability of the difference. Vertical line on each graph indicates an aspect ratio of 0.83.

We then compared the aspect ratio measured using ultrasound and CVP measured with an invasive monitor for 44 spontaneously breathing critically ill patients (22 women and 22 men; 38 were white). Mean (SD) age was 66 (14) years and mean body mass index was 28.8 (9.1) kg/m2. Mean arterial pressure (n = 36) was 67 (12) mm Hg and mean heart rate (n = 34) was 92 (22) beats/minute. Systemic inflammatory response syndrome (SIRS) criteria were present in 23 of 40 patients; complete data were unavailable for the other 4 patients. Of these 40 patients, 20 had sepsis, 15 had severe sepsis, and 5 had septic shock. The most common diagnoses were gastrointestinal tract bleeding in 6 patients and congestive heart failure in 4 patients. Acute Physiology and Chronic Health Evaluation (APACHE III) score, available for 8 of the 9 patients at Saint Marys Hospital, was 63 (10).

Figure 3 shows measured aspect ratios vs. invasively measured CVP for the critically ill patients. The curvilinear result is consistent with venous and right ventricular compliance (Δvolume/Δpressure) characteristics. Note that the inflection point (beginning of the increased slope) of the curve corresponds to a CVP of about 8 mm Hg. Furthermore, the aspect ratio (approximately 0.8) at this point is the same as that seen in the euvolemic volunteers. These findings suggest that, in spontaneously breathing patients, a CVP of about 8 mm Hg and an aspect ratio of about 0.8 each defines the beginning of the plateau on the cardiac Frank‐Starling curve.

Figure 3
Measurements in spontaneously breathing critically ill patients. Plot of the ultrasound‐measured aspect ratio of the internal jugular vein (x‐axis) vs. the invasively‐measured end‐expiration central venous pressure (CVP) (y‐axis) for each patient (n = 44). The horizontal line indicates a CVP of 8 mm Hg, and the vertical line indicates an internal jugular vein aspect ratio of 0.83. Solid line represents a loess fit to the data.

Ultrasound imaging of the internal jugular vein aspect ratio accurately estimated the CVP target of 8 mm Hg based on the area under the receiver operating characteristics curve of 0.84 (95% confidence interval [CI], 0.72‐0.96) (Figure 4). For an invasively measured CVP of less than 8 mm Hg, the likelihood ratio for a positive ultrasound test result (aspect ratio < 0.83) was 3.5 (95% CI, 1.4‐8.4) and for a negative test result (aspect ratio ≥ 0.83) was 0.30 (95% CI, 0.14‐0.62). Clinically, this means that patients with a measured aspect ratio of less than 0.83 require further fluid resuscitation, whereas patients with a measured aspect ratio of 0.83 or greater are less likely to benefit from fluid resuscitation.

Figure 4
Receiver operating characteristics curve. Sensitivity (y‐axis) is plotted vs. 1 − specificity (x‐axis) for the 42 unique internal jugular vein aspect ratios among 44 patients. Area under the curve is 0.84 (95% CI, 0.72‐0.96). The “shoulder” indicates the point of maximum sensitivity (0.78) and specificity (0.77) that corresponds to the aspect ratio of 0.83 (*).
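
The reported likelihood ratios follow directly from the sensitivity and specificity at the 0.83 cutoff shown in Figure 4 (LR+ = sensitivity / [1 − specificity]; LR− = [1 − sensitivity] / specificity). A minimal sketch of that arithmetic is shown below; the small differences from the published values of 3.5 and 0.30 reflect rounding of the sensitivity and specificity.

# Relationship between sensitivity, specificity, and likelihood ratios
# at the 0.83 aspect ratio cutoff (sensitivity 0.78 and specificity 0.77, from Figure 4).
sensitivity = 0.78
specificity = 0.77

lr_positive = sensitivity / (1 - specificity)   # LR for aspect ratio < 0.83 when CVP < 8 mm Hg
lr_negative = (1 - sensitivity) / specificity   # LR for aspect ratio >= 0.83 when CVP < 8 mm Hg

print(f"LR+ = {lr_positive:.2f}")   # ~3.4, close to the reported 3.5
print(f"LR- = {lr_negative:.2f}")   # ~0.29, close to the reported 0.30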

Discussion

This study demonstrated that the EGDT CVP target of 8 to 12 mm Hg can be accurately estimated (referenced to invasive CVP monitoring) using noninvasive ultrasound measurement of the internal jugular vein in spontaneously breathing critically ill patients. The measurement process is simple to perform at the bedside and moderately reliable when performed by different observers; also, the results appear to be equivalent for both sides and for males or females. Images can be stored electronically for serial comparisons and for viewing by other caregivers. Because the aspect ratio is essentially constant over the length of the internal jugular vein, unlike diameter, measurements can be performed anywhere along the vein. Also, ultrasound imaging allows visualization of the internal jugular vein despite anatomic variation.9

Previous attempts at noninvasive hemodynamic monitoring using plethysmography, thoracic electrical bioimpedance, and external Doppler probes have shown these methods to be cumbersome or inaccurate.10-13 Other investigators have used echocardiography14,15 and handheld ultrasound16 to image the diameter of the inferior vena cava in order to assess intravascular volume status, but these techniques require expertise in sonographic imaging. An alternative technique is to measure peripheral venous pressure, which correlates with CVP.17 This method, however, requires technical expertise to zero the monitor and is not yet widely used for critically ill patients. A literature search found 1 letter to the editor suggesting that real‐time ultrasound imaging of the internal jugular vein could be used to qualitatively determine jugular venous pressure18 and 3 studies using ultrasound in conjunction with a pressure transducer or manometer to determine the pressure needed to collapse the vein (either the internal jugular or a peripheral vein), with subsequent correlation to CVP.19-21 These latter techniques appear to be cumbersome and require custom equipment that is not readily available in most hospitals.

Any measurement of CVP, including our technique, assumes correlation with volume responsiveness as a surrogate for intravascular volume. However, CVP is governed by multiple physiologic and pathologic factors, including intravascular volume, vascular and ventricular compliance, ventricular function, tricuspid stenosis and regurgitation, cardiac tamponade, and atrioventricular dissociation.22,23 Therefore, CVP alone may not be an accurate measure of volume responsiveness (intravascular volume). CVP may also have spontaneous variation similar to pulmonary capillary wedge pressure, which can be as high as 7 mm Hg in any given patient.24 Furthermore, invasive CVP monitors also have limitations, and the overall accuracy of the Philips system used at Saint Marys Hospital is ±4% of the reading or ±4 mm Hg, whichever is greater.25 Nonetheless, the EGDT algorithm that incorporates CVP measurement with a target of 8 to 12 mm Hg in spontaneously breathing patients and ≥12 mm Hg in mechanically ventilated patients has resulted in decreased mortality among patients with severe sepsis and is recommended by the Surviving Sepsis Campaign guidelines26 and the Institute for Healthcare Improvement.27

These study results are important because nonintensivists such as hospitalists and emergency department physicians can use this technique to provide rapid fluid resuscitation early in the course of severe sepsis and septic shock, when aggressive fluid resuscitation is most effective. Ultrasound imaging of the internal jugular vein is easy to perform without formal training, and the equipment is readily available in most hospitals. Future studies will evaluate outcomes in spontaneously breathing and ventilated patients to determine the accuracy of this measurement technique in volume‐depleted and volume‐overloaded states. If validated in different patient populations, ultrasound measurement of the internal jugular vein could substitute for the EGDT CVP target in critically ill patients and allow early aggressive fluid resuscitation before a central venous catheter is placed.

Limitations

This exploratory study enrolled a small convenience sample of primarily white patients. The convenience sample is potentially prone to selection bias since a majority of patients who may have been eligible were never asked to participate. Also, not all patients had sepsis syndrome; our intention was to measure CVP and aspect ratio for available critically ill patients. Accordingly, results may be different depending on severity of illness. In addition, some of the patients were transferred from outside medical centers or from emergency departments and therefore may have already been partly resuscitated. Another limitation is that the intraobserver and interobserver variability for the healthy volunteers showed only moderate agreement, possibly indicating limited repeatability, although these results could be due to the small sample size. Also, we did not determine intraobserver and interobserver variability for the critically ill patients; results may be different from those of the healthy volunteers. Furthermore, underlying conditions such as tricuspid stenosis or regurgitation and cardiac tamponade may affect measurement results, but we included all patients without formal assessment, since treatment was performed on an urgent/emergent basis as would happen in real clinical settings.

Acknowledgements

The authors dedicate this work to their patients with severe sepsis. They thank Lisa Kirkland, MD, and Murat Yilmaz, MD, for their assistance with this study. They also thank the Mayo Clinic Divisions of General Internal Medicine and Pulmonary and Critical Care Medicine for funding.

References
1. Angus DC, Linde‐Zwirble WT, Lidicker J, Clermont G, Carcillo J, Pinsky MR. Epidemiology of severe sepsis in the United States: analysis of incidence, outcome, and associated costs of care. Crit Care Med. 2001;29(7):1303-1310.
2. Rivers E, Nguyen B, Havstad S, et al; Early Goal‐Directed Therapy Collaborative Group. Early goal‐directed therapy in the treatment of severe sepsis and septic shock. N Engl J Med. 2001;345(19):1368-1377.
3. Durairaj L, Schmidt GA. Fluid therapy in resuscitated sepsis: less is more. Chest. 2008;133(1):252-263.
4. Cook DJ, Simel DL. The rational clinical examination: does this patient have abnormal central venous pressure? JAMA. 1996;275(8):630-634.
5. Vinayak AG, Levitt J, Gehlbach B, Pohlman AS, Hall JB, Kress JP. Usefulness of the external jugular vein examination in detecting abnormal central venous pressure in critically ill patients. Arch Intern Med. 2006;166(19):2132-2137.
6. Taylor RW, Palagiri AV. Central venous catheterization. Crit Care Med. 2007;35(5):1390-1396.
7. Magder S. How to use central venous pressure measurements. Curr Opin Crit Care. 2005;11(3):264-270.
8. Bland JM, Altman DG. Statistical methods for assessing agreement between two methods of clinical measurement. Lancet. 1986;1(8476):307-310.
9. Denys BG, Uretsky BF. Anatomical variations of internal jugular vein location: impact on central venous access. Crit Care Med. 1991;19(12):1516-1519.
10. Bloch KE, Krieger BP, Sackner MA. Noninvasive measurement of central venous pressure by neck inductive plethysmography. Chest. 1991;100(2):371-375.
11. Ward KR, Tiba MH, Barbee RW, et al. A new noninvasive method to determine central venous pressure. Resuscitation. 2006;70(2):238-246.
12. Barie PS. Advances in critical care monitoring. Arch Surg. 1997;132(7):734-739.
13. Chandraratna PA, Brar R, Vijayasekaran S, et al. Continuous recording of pulmonary artery diastolic pressure and cardiac output using a novel ultrasound transducer. J Am Soc Echocardiogr. 2002;15(11):1381-1386.
14. Duvekot JJ, Cheriex EC, Tan WD, Heidendal GA, Peeters LL. Measurement of anterior‐posterior diameter of inferior vena cava by ultrasonography: a new non‐invasive method to assess acute changes in vascular filling state. Cardiovasc Res. 1994;28(8):1269-1272.
15. Yanagiba S, Ando Y, Kusano E, Asano Y. Utility of the inferior vena cava diameter as a marker of dry weight in nonoliguric hemodialyzed patients. ASAIO J. 2001;47(5):528-532.
16. Brennan JM, Ronan A, Goonewardena S, et al. Handcarried ultrasound measurement of the inferior vena cava for assessment of intravascular volume status in the outpatient hemodialysis clinic. Clin J Am Soc Nephrol. 2006;1(4):749-753.
17. Charalambous C, Barker TA, Zipitis CS, et al. Comparison of peripheral and central venous pressures in critically ill patients. Anaesth Intensive Care. 2003;31(1):34-39.
18. Lipton BM. Determination of elevated jugular venous pressure by real‐time ultrasound. Ann Emerg Med. 1999;34(1):115.
19. Aggarwal V, Chatterjee A, Cho Y, Cheung D. Ultrasound‐guided noninvasive measurement of a patient's central venous pressure. Conf Proc IEEE Eng Med Biol Soc. 2006;1:3843-3849.
20. Thalhammer C, Aschwanden M, Odermatt A, et al. Noninvasive central venous pressure measurement by controlled compression sonography at the forearm. J Am Coll Cardiol. 2007;50(16):1584-1589.
21. Baumann UA, Marquis C, Stoupis C, Willenberg TA, Takala J, Jakob SM. Estimation of central venous pressure by ultrasound. Resuscitation. 2005;64(2):193-199.
22. Stephan F, Novara A, Tournier B, et al. Determination of total effective vascular compliance in patients with sepsis syndrome. Am J Respir Crit Care Med. 1998;157(1):50-56.
23. Smith T, Grounds RM, Rhodes A. Central venous pressure: uses and limitations. In: Pinsky MR, Payen D, eds. Functional Hemodynamic Monitoring. Berlin, Germany: Springer‐Verlag Berlin Heidelberg; 2006:101.
24. Nemens EJ, Woods SL. Normal fluctuations in pulmonary artery and pulmonary capillary wedge pressures in acutely ill patients. Heart Lung. 1982;11(5):393-398.
25. Philips M3012A Data Sheet. Hemodynamic extension to the multi‐measurement server. Amsterdam: Koninklijke Philips Electronics N.V.; 2003.
26. Dellinger RP, Carlet JM, Masur H, et al; Surviving Sepsis Campaign Management Guidelines Committee. Surviving Sepsis Campaign guidelines for management of severe sepsis and septic shock. Crit Care Med. 2004;32(3):858-873. [Erratum in: Crit Care Med. 2004;32(6):1448. Correction of dosage error in text. Crit Care Med. 2004;32(10):2169-2170.]
27. Institute for Healthcare Improvement. Sepsis. Cambridge, MA: Institute for Healthcare Improvement. Available at: http://www.ihi.org/IHI/Topics/CriticalCare/Sepsis. Accessed March 2009.
Issue
Journal of Hospital Medicine - 4(6)
Page Number
350-355
Legacy Keywords
central venous pressure, early goal‐directed therapy, internal jugular vein, sensitivity, septic shock, severe sepsis, specificity, ultrasound imaging

Severe sepsis and septic shock account for more than 750,000 hospital admissions and 215,000 deaths per year.1 Early fluid resuscitation is the cornerstone of treatment, and early goal‐directed therapy (EGDT), which includes a target central venous pressure (CVP) of 8 to 12 mm Hg, has been shown to improve outcomes, including mortality and length of stay.2 This goal allows appropriate initial resuscitation and may decrease the risk of excess fluid administration, which is related to adverse outcomes in critically ill patients.3 However, nonintensivists may not start early aggressive fluid resuscitation because of inability to accurately assess intravascular volume, concerns for inadvertent volume overload, or the difficulty of recognizing insidious illness. Assessment of volume status, primarily from inspection of the internal jugular vein to estimate CVP, is difficult to perform by clinical examination alone, especially if CVP is very low.4 Inspection of the external jugular vein is perhaps easier than inspecting the internal jugular vein and appears to accurately estimate CVP,5 but it does not allow the degree of precision necessary for EGDT. Echocardiography can estimate CVP based on respirophasic variation or collapsibility index, but this technique requires expensive equipment and sonographic expertise. The current gold standard technique for measuring CVP requires an invasive central venous catheter, which can delay timely resuscitation and is associated with complications.6

An alternative technique to guide resuscitation efforts should be accurate, safe, rapid, and easy to perform at the bedside, while providing real‐time measurement results. We hypothesized that CVP can be accurately assessed using noninvasive ultrasound imaging of the internal jugular vein, since jugular venous pressure is essentially equal to CVP.7 Specifically, our study estimated the diagnostic accuracy of ultrasound measurement of the aspect ratio (height/width) of the internal jugular vein compared with the invasively measured CVP target for EGDT. We expected that a lower aspect ratio would correlate with a lower CVP and a higher aspect ratio would correlate with a higher CVP.

Methods

Volunteers were enrolled at Saint Mary's Hospital (Mayo Clinic) in Rochester, MN, from January to March 2006, and patients were enrolled at Saint Mary's Hospital and at Abbott Northwestern Hospital (Allina Hospitals and Clinics) in Minneapolis, MN, from May 2006 to October 2007. The study was approved by the Institutional Review Boards of Mayo Clinic and Allina and had 2 phases. The first phase comprised ultrasound measurements of internal jugular vein aspect ratio and determination of intraobserver and interobserver agreement in healthy volunteers. The second phase involved measurement of internal jugular vein aspect ratio and invasive CVP in a convenience sample of 44 spontaneously breathing patients admitted to medical intensive care units: 9 patients at Saint Marys Hospital and 35 patients at Abbott Northwestern Hospital. Patients were enrolled only when study members were on duty in the intensive care unit and able to perform study measurements. As a result, a high proportion of patients who may have been eligible were not asked to participate.

Each volunteer was deemed euvolemic on the basis of normal orthostatic measurements and normal oral intake with no vomiting or diarrhea in the previous 5 days. Measurements of 19 volunteers were made by 1 author (A.S.K.), with subsequent measurements of 15 of the volunteers made by another author (O.G.) to determine interobserver variability; 4 participants did not undergo a second measurement because of scheduling conflicts.

Inclusion and exclusion criteria for the critically ill patients are provided in Table 1. Recruitment was based on presenting symptoms and test results that led the intensive care unit physicians to decide to place a CVP monitor. All the enrolled patients had invasive CVP measurement performed approximately 30 to 40 minutes after ultrasound measurement of the internal jugular vein; this delay was the time required to place the central line and obtain the measurement. All patients who were invited to participate in the study were included. No patients were excluded on the basis of the exclusion criteria or because of inability to place a central line. No complications related to central line placement occurred.

Table 1. Study Inclusion and Exclusion Criteria for Critically Ill Patients
Inclusion criteria
1. Aged 18 years or older
2. Admission to the intensive care unit
3. Spontaneously breathing (not intubated/ventilated)
4. Planned insertion of a central venous pressure monitor for therapy
Exclusion criteria
1. Known cervical spine injuries or fusion
2. Nonremovable cervical collars
3. Surgical dressings that would prevent visualization of the internal jugular vein
4. Inability of the patient to be properly positioned
5. A code situation

We followed a prescribed measurement technique (Table 2) to determine the internal jugular vein aspect ratio in all volunteers and patients. Measurements of the volunteers were made with a Site‐Rite 3 Ultrasound System (Bard Access Systems, Inc., Salt Lake City, UT) using a 9.0‐MHz transducer. Measurements of the critically ill patients were made with a SonoSite MicroMaxx ultrasound system (SonoSite, Inc., Bothell, WA) using a 10.5‐MHz transducer. Study team physicians were initially blinded to the invasively measured CVP. Internal jugular vein aspect ratio and CVP were measured at tidal volume end‐expiration for all patients. One measurement was obtained for each patient, each made by 1 of 4 physicians (2 intensivists, 1 critical care fellow, and 1 chief medicine resident). With no specific ultrasound training and only minimal practice, the physicians could obtain an optimal image for measuring the aspect ratio within a few seconds (Figure 1).

Figure 1
Measurement of aspect ratio. Cross‐sectional transverse‐plane ultrasound image shows the right internal jugular vein and the common carotid artery. The internal jugular vein aspect ratio (height/width) in this example is 0.77.
Table 2. Internal Jugular Vein Measurement Process
1. Position the patient supine (0°) with head and legs flat, ensuring overall comfort. A small pillow can be used to help keep the head, neck, and trunk aligned
2. Have the patient rotate his or her head slightly to the side (<30°) to expose the internal jugular vein
3. Place the transducer transversely on the patient's neck over the expected location of the internal jugular vein. The transducer should be perpendicular to the patient's neck
4. Apply slight pressure to the transducer to locate the internal jugular vein on the view screen. Use the minimum pressure necessary to obtain a good quality ultrasound image
5. Once the internal jugular vein is found, adjust the position of the transducer over the vein to obtain the most circular cross‐sectional image
6. Have the patient breathe normally, then ask him or her to briefly stop breathing at normal (tidal volume) end‐expiration
7. Store the best end‐expiration image (in which the internal jugular vein appears most circular) and have the patient resume normal breathing
8. Measure the height and width of the internal jugular vein using the built‐in cursor function or a ruler

This was an exploratory prospective study, and all methods of data collection were designed before patient enrollment. However, the ultrasound‐derived aspect ratio of 0.83 (which defined a CVP of 8 mm Hg) was determined post hoc to maximize sensitivity and specificity and was based on the aspect ratio of the euvolemic volunteers and the inflection point of the CVP vs aspect ratio curve for the critically ill patients.
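The cutoff arithmetic is straightforward; the brief Python sketch below (hypothetical function and variable names, with only the 0.83 cutoff taken from this study) shows how a single height/width measurement would be converted to an aspect ratio and compared with that threshold.

```python
def classify_aspect_ratio(height_cm: float, width_cm: float, cutoff: float = 0.83) -> dict:
    """Compute the internal jugular vein aspect ratio (height/width) and
    compare it with the post hoc 0.83 cutoff used in this study.

    A ratio below the cutoff corresponds to an estimated CVP < 8 mm Hg,
    i.e., a patient who is likely to need further fluid resuscitation.
    """
    ratio = height_cm / width_cm
    return {
        "aspect_ratio": round(ratio, 2),
        "estimated_cvp_below_8_mmHg": ratio < cutoff,
    }

# Hypothetical height and width values chosen to reproduce the 0.77 ratio shown in Figure 1
print(classify_aspect_ratio(height_cm=0.77, width_cm=1.00))
# {'aspect_ratio': 0.77, 'estimated_cvp_below_8_mmHg': True}
```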

Statistical Analysis

Groups were compared using the χ2 test for differences in proportions and the Wilcoxon rank sum test for continuous data. P < 0.05 was considered statistically significant. Bland‐Altman plots were used to describe the bias and variability of the aspect ratio within and between observers.8 This technique compares 2 methods of measurement to determine agreement and repeatability by plotting the mean of the differences (which should be zero) and the upper and lower limits of agreement (1.96 standard deviations [SDs] of those differences above and below the mean). Results were calculated using the available data; there was no adjustment for missing data. Analyses were performed using S-PLUS and SAS/STAT software (SAS Institute, Inc., Cary, NC).
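For readers who want to reproduce the agreement analysis, a minimal Bland-Altman sketch in Python is shown below; the paired measurements are hypothetical, since the individual volunteer data are not tabulated here.

```python
import numpy as np

def bland_altman_limits(a: np.ndarray, b: np.ndarray):
    """Return the Bland-Altman bias (mean of the paired differences) and
    the 95% limits of agreement (bias +/- 1.96 SD of the differences)."""
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired aspect-ratio measurements by two observers
obs1 = np.array([0.81, 0.84, 0.79, 0.88, 0.83])
obs2 = np.array([0.80, 0.86, 0.77, 0.90, 0.82])
bias, lower, upper = bland_altman_limits(obs1, obs2)
print(f"bias = {bias:.3f}, limits of agreement = ({lower:.3f}, {upper:.3f})")
```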

Results

We first evaluated 19 white volunteers: 12 women and 7 men. Mean (SD) age was 42 (11) years and mean body mass index was 26.6 (4.5) kg/m2. Mean arterial pressure was 89 (13) mm Hg and mean heart rate was 71 (15) beats/minute. Mean aspect ratio of the right and left internal jugular vein for all volunteers was 0.82 (0.07). There was no difference in aspect ratio between the right (0.83 [0.10]) and left (0.81 [0.13]) vein (P > 0.10). Also, no difference in the aspect ratio was seen between men (0.81 [0.08]) and women (0.83 [0.07]) (P = 0.77). Bland‐Altman analysis indicated moderate intraobserver and interobserver agreement for the aspect ratio measurements (Figure 2).

Figure 2
Bland‐Altman analysis. (A,B) Intraobserver reliability for ultrasound measurements of the aspect ratio for the (A) right and (B) left internal jugular vein made by 1 observer (A.S.K.) in 19 volunteers. (C,D) Interobserver reliability for measurements of the (C) right and (D) left internal jugular vein by 2 observers (A.S.K. and O.G.) in 15 of the volunteers. Solid line represents the mean of the difference in aspect ratio; dotted lines represent the variability of the difference. Vertical line on each graph indicates an aspect ratio of 0.83.

We then compared the aspect ratio measured using ultrasound and CVP measured with an invasive monitor for 44 spontaneously breathing critically ill patients (22 women and 22 men; 38 were white). Mean (SD) age was 66 (14) years and mean body mass index was 28.8 (9.1) kg/m2. Mean arterial pressure (n = 36) was 67 (12) mm Hg and mean heart rate (n = 34) was 92 (22) beats/minute. Systemic inflammatory response syndrome (SIRS) criteria were present in 23 of 40 patients; complete data were unavailable for the other 4 patients. Of these 40 patients, 20 had sepsis, 15 had severe sepsis, and 5 had septic shock. The most common diagnoses were gastrointestinal tract bleeding in 6 patients and congestive heart failure in 4 patients. Acute Physiology and Chronic Health Evaluation (APACHE III) score, available for 8 of the 9 patients at Saint Marys Hospital, was 63 (10).

Figure 3 shows measured aspect ratios vs. invasively measured CVP for the critically ill patients. The curvilinear result is consistent with venous and right ventricular compliance (Δvolume/Δpressure) characteristics. Note that the inflection point (beginning of the increased slope) of the curve corresponds to a CVP of about 8 mm Hg. Furthermore, the aspect ratio (0.8) at this point is the same as that seen in the euvolemic volunteers. These findings suggest that, in spontaneously breathing patients, a CVP of about 8 mm Hg and an aspect ratio of about 0.8 each defines the beginning of the plateau on the cardiac Frank‐Starling curve.
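The fitted curve in Figure 3 can be approximated with any local-regression routine; the sketch below uses the lowess smoother from statsmodels on hypothetical aspect-ratio/CVP pairs (not the study data) simply to illustrate how such a curvilinear fit is produced.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

# Hypothetical aspect-ratio / CVP pairs illustrating a curvilinear shape
aspect_ratio = np.array([0.45, 0.55, 0.65, 0.75, 0.83, 0.90, 0.95, 1.00])
cvp_mmhg = np.array([2.0, 3.0, 4.0, 6.0, 8.0, 12.0, 15.0, 18.0])

# lowess returns the smoothed (x, fitted y) pairs sorted by x
smoothed = lowess(cvp_mmhg, aspect_ratio, frac=0.8)
for x, y in smoothed:
    print(f"aspect ratio {x:.2f} -> estimated CVP {y:.1f} mm Hg")
```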

Figure 3
Measurements in spontaneously breathing critically ill patients. Plot of the ultrasound‐measured aspect ratio of the internal jugular vein (x‐axis) vs. the invasively‐measured end‐expiration central venous pressure (CVP) (y‐axis) for each patient (n = 44). The horizontal line indicates a CVP of 8 mm Hg, and the vertical line indicates an internal jugular vein aspect ratio of 0.83. Solid line represents a loess fit to the data.

Ultrasound imaging of the internal jugular vein aspect ratio accurately estimated the CVP target of 8 mm Hg, with an area under the receiver operating characteristics curve of 0.84 (95% confidence interval [CI], 0.72-0.96) (Figure 4). For an invasively measured CVP of less than 8 mm Hg, the likelihood ratio for a positive ultrasound test result (aspect ratio < 0.83) was 3.5 (95% CI, 1.4-8.4), and the likelihood ratio for a negative test result (aspect ratio ≥ 0.83) was 0.30 (95% CI, 0.14-0.62). Clinically, this means that patients with a measured aspect ratio of less than 0.83 are likely to require further fluid resuscitation, whereas patients with an aspect ratio of 0.83 or greater are less likely to benefit from additional fluid.
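The reported likelihood ratios follow directly from the sensitivity and specificity at the 0.83 cutoff; the short sketch below reproduces that arithmetic from the point estimates shown in Figure 4 (confidence intervals omitted).

```python
def likelihood_ratios(sensitivity: float, specificity: float):
    """Positive and negative likelihood ratios for a binary test:
    LR+ = sensitivity / (1 - specificity)
    LR- = (1 - sensitivity) / specificity
    """
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

# Point estimates at the 0.83 aspect-ratio cutoff (Figure 4)
lr_pos, lr_neg = likelihood_ratios(sensitivity=0.78, specificity=0.77)
print(f"LR+ = {lr_pos:.1f}, LR- = {lr_neg:.2f}")
# ~3.4 and ~0.29, consistent with the reported 3.5 and 0.30
```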

Figure 4
Receiver operating characteristics curve. Sensitivity (y‐axis) is plotted vs. 1 − specificity (x‐axis) for the 42 unique internal jugular vein aspect ratios among 44 patients. Area under the curve is 0.84 (95% CI, 0.72‐0.96). The “shoulder” indicates the point of maximum sensitivity (0.78) and specificity (0.77) that corresponds to the aspect ratio of 0.83 (*).

Discussion

This study demonstrated that the EGDT CVP target of 8 to 12 mm Hg can be accurately estimated (referenced to invasive CVP monitoring) using noninvasive ultrasound measurement of the internal jugular vein in spontaneously breathing critically ill patients. The measurement process is simple to perform at the bedside and moderately reliable when performed by different observers; also, the results appear to be equivalent for both sides and for males or females. Images can be stored electronically for serial comparisons and for viewing by other caregivers. Because the aspect ratio is essentially constant over the length of the internal jugular vein, unlike diameter, measurements can be performed anywhere along the vein. Also, ultrasound imaging allows visualization of the internal jugular vein despite anatomic variation.9

Previous attempts at noninvasive hemodynamic monitoring using plethysmography, thoracic electrical bioimpedance, and external Doppler probes have shown these methods to be cumbersome or inaccurate.10-13 Other investigators have used echocardiography14,15 and handheld ultrasound16 to image the diameter of the inferior vena cava in order to assess intravascular volume status, but these techniques require expertise in sonographic imaging. An alternative technique is to measure peripheral venous pressure, which correlates with CVP.17 This method, however, requires technical expertise to zero the monitor and is not yet widely used for critically ill patients. A literature search found 1 letter to the editor suggesting that real‐time ultrasound imaging of the internal jugular vein could be used to qualitatively determine jugular venous pressure18 and 3 studies using ultrasound in conjunction with a pressure transducer or manometer to determine the pressure needed to collapse the vein (either the internal jugular or a peripheral vein), with subsequent correlation to CVP.19-21 These latter techniques appear to be cumbersome and require custom equipment that is not readily available in most hospitals.

Any measurement of CVP, including our technique, assumes correlation with volume responsiveness as a surrogate for intravascular volume. However, CVP is governed by multiple physiologic and pathologic factors, including intravascular volume, vascular and ventricular compliance, ventricular function, tricuspid stenosis and regurgitation, cardiac tamponade, and atrioventricular dissociation.22,23 Therefore, CVP alone may not be an accurate measure of volume responsiveness (intravascular volume). CVP may also have spontaneous variation similar to pulmonary capillary wedge pressure, which can be as high as 7 mm Hg in any given patient.24 Furthermore, invasive CVP monitors have their own limitations; the overall accuracy of the Philips system used at Saint Marys Hospital is ±4% of the reading or ±4 mm Hg, whichever is greater.25 Nonetheless, the EGDT algorithm that incorporates CVP measurement, with a target of 8 to 12 mm Hg in spontaneously breathing patients and 12 to 15 mm Hg in mechanically ventilated patients, has resulted in decreased mortality among patients with severe sepsis and is recommended by the Surviving Sepsis Campaign guidelines26 and the Institute for Healthcare Improvement.27

These study results are important because nonintensivists such as hospitalists and emergency department physicians can use this technique to provide rapid fluid resuscitation early in the course of severe sepsis and septic shock, when aggressive fluid resuscitation is most effective. Ultrasound imaging of the internal jugular vein is easy to perform without formal training, and the equipment is readily available in most hospitals. Future studies will evaluate outcomes in spontaneously breathing and ventilated patients to determine the accuracy of this measurement technique in volume‐depleted and volume‐overloaded states. If validated in different patient populations, ultrasound measurement of the internal jugular vein could substitute for the EGDT CVP target in critically ill patients and allow early aggressive fluid resuscitation before a central venous catheter is placed.

Limitations

This exploratory study enrolled a small convenience sample of primarily white patients. The convenience sample is potentially prone to selection bias since a majority of patients who may have been eligible were never asked to participate. Also, not all patients had sepsis syndrome; our intention was to measure CVP and aspect ratio for available critically ill patients. Accordingly, results may be different depending on severity of illness. In addition, some of the patients were transferred from outside medical centers or from emergency departments and therefore may have already been partly resuscitated. Another limitation is that the intraobserver and interobserver variability for the healthy volunteers showed only moderate agreement, possibly indicating limited repeatability, although these results could be due to the small sample size. Also, we did not determine intraobserver and interobserver variability for the critically ill patients; results may be different from those of the healthy volunteers. Furthermore, underlying conditions such as tricuspid stenosis or regurgitation and cardiac tamponade may affect measurement results, but we included all patients without formal assessment, since treatment was performed on an urgent/emergent basis as would happen in real clinical settings.

Acknowledgements

The authors dedicate this work to their patients with severe sepsis. They thank Lisa Kirkland, MD, and Murat Yilmaz, MD, for their assistance with this study. They also thank the Mayo Clinic Divisions of General Internal Medicine and Pulmonary and Critical Care Medicine for funding.

References
  1. Angus DC, Linde‐Zwirble WT, Lidicker J, Clermont G, Carcillo J, Pinsky MR. Epidemiology of severe sepsis in the United States: analysis of incidence, outcome, and associated costs of care. Crit Care Med. 2001;29(7):1303-1310.
  2. Rivers E, Nguyen B, Havstad S, et al; Early Goal‐Directed Therapy Collaborative Group. Early goal‐directed therapy in the treatment of severe sepsis and septic shock. N Engl J Med. 2001;345(19):1368-1377.
  3. Durairaj L, Schmidt GA. Fluid therapy in resuscitated sepsis: less is more. Chest. 2008;133(1):252-263.
  4. Cook DJ, Simel DL. The rational clinical examination: does this patient have abnormal central venous pressure? JAMA. 1996;275(8):630-634.
  5. Vinayak AG, Levitt J, Gehlbach B, Pohlman AS, Hall JB, Kress JP. Usefulness of the external jugular vein examination in detecting abnormal central venous pressure in critically ill patients. Arch Intern Med. 2006;166(19):2132-2137.
  6. Taylor RW, Palagiri AV. Central venous catheterization. Crit Care Med. 2007;35(5):1390-1396.
  7. Magder S. How to use central venous pressure measurements. Curr Opin Crit Care. 2005;11(3):264-270.
  8. Bland JM, Altman DG. Statistical methods for assessing agreement between two methods of clinical measurement. Lancet. 1986;1(8476):307-310.
  9. Denys BG, Uretsky BF. Anatomical variations of internal jugular vein location: impact on central venous access. Crit Care Med. 1991;19(12):1516-1519.
  10. Bloch KE, Krieger BP, Sackner MA. Noninvasive measurement of central venous pressure by neck inductive plethysmography. Chest. 1991;100(2):371-375.
  11. Ward KR, Tiba MH, Barbee RW, et al. A new noninvasive method to determine central venous pressure. Resuscitation. 2006;70(2):238-246.
  12. Barie PS. Advances in critical care monitoring. Arch Surg. 1997;132(7):734-739.
  13. Chandraratna PA, Brar R, Vijayasekaran S, et al. Continuous recording of pulmonary artery diastolic pressure and cardiac output using a novel ultrasound transducer. J Am Soc Echocardiogr. 2002;15(11):1381-1386.
  14. Duvekot JJ, Cheriex EC, Tan WD, Heidendal GA, Peeters LL. Measurement of anterior‐posterior diameter of inferior vena cava by ultrasonography: a new non‐invasive method to assess acute changes in vascular filling state. Cardiovasc Res. 1994;28(8):1269-1272.
  15. Yanagiba S, Ando Y, Kusano E, Asano Y. Utility of the inferior vena cava diameter as a marker of dry weight in nonoliguric hemodialyzed patients. ASAIO J. 2001;47(5):528-532.
  16. Brennan JM, Ronan A, Goonewardena S, et al. Handcarried ultrasound measurement of the inferior vena cava for assessment of intravascular volume status in the outpatient hemodialysis clinic. Clin J Am Soc Nephrol. 2006;1(4):749-753.
  17. Charalambous C, Barker TA, Zipitis CS, et al. Comparison of peripheral and central venous pressures in critically ill patients. Anaesth Intensive Care. 2003;31(1):34-39.
  18. Lipton BM. Determination of elevated jugular venous pressure by real‐time ultrasound. Ann Emerg Med. 1999;34(1):115.
  19. Aggarwal V, Chatterjee A, Cho Y, Cheung D. Ultrasound‐guided noninvasive measurement of a patient's central venous pressure. Conf Proc IEEE Eng Med Biol Soc. 2006;1:3843-3849.
  20. Thalhammer C, Aschwanden M, Odermatt A, et al. Noninvasive central venous pressure measurement by controlled compression sonography at the forearm. J Am Coll Cardiol. 2007;50(16):1584-1589.
  21. Baumann UA, Marquis C, Stoupis C, Willenberg TA, Takala J, Jakob SM. Estimation of central venous pressure by ultrasound. Resuscitation. 2005;64(2):193-199.
  22. Stephan F, Novara A, Tournier B, et al. Determination of total effective vascular compliance in patients with sepsis syndrome. Am J Respir Crit Care Med. 1998;157(1):50-56.
  23. Smith T, Grounds RM, Rhodes A. Central venous pressure: uses and limitations. In: Pinsky MR, Payen D, eds. Functional Hemodynamic Monitoring. Berlin, Germany: Springer‐Verlag Berlin Heidelberg; 2006:101.
  24. Nemens EJ, Woods SL. Normal fluctuations in pulmonary artery and pulmonary capillary wedge pressures in acutely ill patients. Heart Lung. 1982;11(5):393-398.
  25. Philips M3012A data sheet. Hemodynamic extension to the multi‐measurement server. Amsterdam: Koninklijke Philips Electronics N.V.; 2003.
  26. Dellinger RP, Carlet JM, Masur H, et al; Surviving Sepsis Campaign Management Guidelines Committee. Surviving Sepsis Campaign guidelines for management of severe sepsis and septic shock. Crit Care Med. 2004;32(3):858-873. [Erratum in: Crit Care Med. 2004;32(6):1448. Correction of dosage error in text: Crit Care Med. 2004;32(10):2169-2170.]
  27. Institute for Healthcare Improvement. Sepsis. Cambridge, MA: Institute for Healthcare Improvement. Available at: http://www.ihi.org/IHI/Topics/CriticalCare/Sepsis. Accessed March 2009.
Issue
Journal of Hospital Medicine - 4(6)
Page Number
350-355
Display Headline
Diagnostic accuracy of a simple ultrasound measurement to estimate central venous pressure in spontaneously breathing, critically ill patients
Legacy Keywords
central venous pressure, early goal‐directed therapy, internal jugular vein, sensitivity, septic shock, severe sepsis, specificity, ultrasound imaging
Article Source
Copyright © 2009 Society of Hospital Medicine
Correspondence Location
Division of Hospital Internal Medicine, Mayo Clinic, 200 First Street SW, Rochester, MN 55905