Development of a multisystem point of care ultrasound skills assessment checklist

Abstract

Background

Many institutions are training clinicians in point-of-care ultrasound (POCUS), but few POCUS skills checklists have been developed and validated. We developed a consensus-based multispecialty POCUS skills checklist with anchoring references for basic cardiac, lung, abdominal, and vascular ultrasound, and peripheral intravenous line (PIV) insertion.

Methods

A POCUS expert panel of 14 physicians specializing in emergency, critical care, and internal/hospital medicine participated in a modified Delphi approach to develop a basic POCUS skills checklist by group consensus. Three rounds of voting were conducted, and consensus was defined by ≥ 80% agreement. Items achieving < 80% consensus were discussed and considered for up to two additional rounds of voting.

Results

Thirteen POCUS experts (93%) completed all three rounds of voting. The cardiac, lung, abdominal, and vascular ultrasound checklists included probe location and control, basic machine setup, image quality and optimization, and identification of anatomical structures. The PIV insertion checklist included additional items for needle-tip tracking. During the first round of voting, 136 (82%) items achieved consensus, and after revision and revoting, an additional 21 items achieved consensus. A total of 153 (92%) items were included in the final checklist.

Conclusions

We have developed a consensus-based, multispecialty POCUS checklist to evaluate skills in image acquisition and anatomy identification for basic cardiac, lung, abdominal, and vascular ultrasound, and PIV insertion.

Background

Point-of-care ultrasound (POCUS) training is required for a growing list of specialties, including emergency medicine, critical care, and anesthesiology [1]. Practicing physicians have been obtaining training through local and national continuing medical education courses that provide hands-on instruction and have been shown to be effective [2, 3].

Despite an increase in POCUS training, a critical gap remains in the ability to determine a physician’s competency in POCUS use due to variability in training standards and definitions of competency [4]. Several checklists and global rating scales have been published to evaluate POCUS skills [5,6,7,8,9,10,11,12,13,14]. Most published checklists are limited to a single organ system or specialty, and no multispecialty, multisystem checklists for evaluation of common POCUS applications of the lungs, heart, abdomen, and lower extremity veins have been published. Hospitals and healthcare systems are seeking validated multisystem POCUS checklists that can be applied across specialties to certify physician skills and maintain standards for POCUS use.

We describe the development of a multispecialty, multisystem POCUS skills checklist based on group consensus of national POCUS faculty from distinct institutions as the initial step toward creating a validated checklist.

Methods

We conducted a prospective observational study using consensus-based methods in two phases. The University of Pittsburgh Institutional Review Board approved this project (IRB # PRO18050302).

The initial POCUS skills checklist was developed by group consensus of POCUS experts from emergency medicine, critical care medicine, and hospital medicine during a 3-day in-person meeting dedicated to developing a national POCUS training course for physicians practicing in the Department of Veterans Affairs (VA). Three diagnostic applications (heart, lungs, and abdomen) and one procedural application (peripheral intravenous line [PIV] insertion) were included in the checklist based on current evidence and applicability to multiple specialties. We piloted the initial checklist to evaluate novice learners from 2017 to 2019. Based on faculty feedback, the initial checklist was revised to include one additional diagnostic application (lower extremity deep venous thrombosis [DVT]).

To gather formal consensus, an expert panel of 14 national POCUS faculty from emergency, critical care, and hospital medicine was convened, including the experts who developed the initial POCUS skills checklist. Experts were defined as individuals who regularly used POCUS in clinical practice; taught POCUS courses locally or nationally; and either had completed a dedicated POCUS fellowship, held a national professional society leadership role in POCUS, or had previously published on POCUS topics. All experts were required to disclose any conflicts of interest.

The checklist was divided into five sections (cardiac, lung, abdomen, lower extremity DVT, and PIV) and entered into a web-based electronic data collection instrument (Research Electronic Data Capture [REDCap™]) hosted by the University of Texas Health Science Center at San Antonio. Expert panel members rated whether each item should be required for basic competency and were encouraged to provide feedback in free-text boxes for each item.

A modified Delphi approach was used to assess the level of agreement among experts. Three rounds of electronic voting, each followed by group discussion via videoconference, were conducted between May 2020 and December 2020. Consensus was defined by ≥ 80% of experts agreeing to include an item. Items achieving < 80% consensus for inclusion were discussed, revised, and considered for up to two additional rounds of voting.
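
To make the consensus rule concrete, a minimal sketch in Python of the per-item tally is shown below; the items, ballots, and counts are hypothetical illustrations, not the study's actual voting data.

# Illustrative tally of the >= 80% consensus rule described above.
# Items and ballots are hypothetical, not the study's actual votes.
CONSENSUS_THRESHOLD = 0.80

def tally_round(votes: dict[str, list[bool]]) -> tuple[list[str], list[str]]:
    """Split items into those achieving consensus for inclusion and
    those requiring discussion, revision, and revoting."""
    included, revisit = [], []
    for item, ballots in votes.items():
        agreement = sum(ballots) / len(ballots)
        (included if agreement >= CONSENSUS_THRESHOLD else revisit).append(item)
    return included, revisit

# Example: 13 experts voting on two hypothetical items.
round1 = {
    "Probe orientation marker positioned correctly": [True] * 12 + [False],  # 12/13 = 92%
    "Exam performed with speed and efficiency": [True] * 9 + [False] * 4,    # 9/13 = 69%
}
included, revisit = tally_round(round1)  # first item included, second revisited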

To finalize the checklist, we pilot tested it on pre-recorded skills examinations of 18 learners who were categorized as novice, intermediate, or experienced POCUS users based on their prior training and current use. Each POCUS expert reviewed a minimum of 40 videos that were randomized by learner and expert reviewer, and each video was rated by at least five different experts. Feedback from raters was incorporated into the checklist to add anchors and clarify wording. Formal validation of the checklist is planned once the COVID-19 pandemic subsides and live, in-person POCUS training events are again permitted.
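
The balanced, randomized rating assignment described above can be sketched as follows. The study reports 18 learners, at least 40 videos per expert, and at least five expert ratings per video; the per-learner exam count and the assignment algorithm below are assumptions for illustration only, not the study's actual procedure.

import random

MIN_RATERS = 5    # each video rated by at least 5 experts (per the study)
MIN_VIDEOS = 40   # each expert reviews at least 40 videos (per the study)

def assign(videos: list[str], experts: list[str], seed: int = 0) -> dict[str, list[str]]:
    """Randomly assign each video to the MIN_RATERS least-loaded experts,
    shuffling before each assignment so ties are broken at random."""
    rng = random.Random(seed)
    workload: dict[str, list[str]] = {e: [] for e in experts}
    for video in videos:
        pool = experts[:]
        rng.shuffle(pool)
        pool.sort(key=lambda e: len(workload[e]))  # stable sort keeps random tie-break
        for expert in pool[:MIN_RATERS]:
            workload[expert].append(video)
    return workload

# Hypothetical counts: 18 learners x 7 recorded exams each = 126 videos,
# rated by a panel of 14 experts.
videos = [f"learner{n:02d}_exam{e}" for n in range(1, 19) for e in range(1, 8)]
plan = assign(videos, [f"expert{i:02d}" for i in range(1, 15)])
assert all(len(v) >= MIN_VIDEOS for v in plan.values())  # 126 * 5 / 14 = 45 each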

Results

Fourteen POCUS experts participated, and 13 (93%) completed all three rounds of voting. Characteristics of the POCUS expert panel are displayed in Additional file 1: Table S1.

The original skills checklist included a total of 166 items for five different POCUS applications. The cardiac, lung, abdominal, and lower extremity DVT ultrasound checklists included sections for probe type, location, and control; basic machine setup; image quality and optimization; and identification of anatomical structures. The checklist for PIV insertion included additional items for needle-tip tracking.

After the first round of voting, 136 (82%) checklist items achieved consensus based on ≥ 80% agreement for inclusion (Additional file 2: Table S2). Thirty items did not achieve consensus, drawn from the cardiac (17), lung (6), PIV insertion (3), abdomen (2), and lower extremity DVT (2) checklists. A checklist item for speed and efficiency did not achieve consensus for any application and was removed after the follow-up panel discussion.

The second round of voting included 19 checklist items. Prior to voting, the checklist items for optimization of image depth and gain were revised to read, “Image depth (or gain) optimized appropriately.” Differences in screen-marker and image-orientation conventions for the subcostal 4-chamber and inferior vena cava views were discussed and clarified on the checklist during the second round of voting. We chose to allow some flexibility and stated, “exam preset and orientation can vary based on specialty or local convention.” An additional 15 items reached consensus after the second round of voting.

The third round of voting focused on cardiac subcostal views. In emergency medicine, the subcostal views are often obtained as part of a focused assessment with sonography in trauma (FAST) exam using a curvilinear probe and an abdominal exam preset, whereas in internal medicine and critical care, these views are most often obtained as part of a cardiac evaluation using a phased-array probe and cardiac exam preset. The group felt strongly that these items should not be removed, and both probe types and exam presets were included in the revised checklist. All three items in the third round of voting achieved consensus. A total of 153 items were included in the final checklist (Tables 1, 2, 3 and 4).

Table 1 Cardiac POCUS checklist for basic competency in image acquisition and anatomy identification
Table 2 Lung POCUS checklist for basic competency in image acquisition and anatomy identification
Table 3 Abdominal and pelvic POCUS checklists for basic competency in image acquisition and anatomy identification
Table 4 Vascular POCUS checklist for basic competency in image acquisition and anatomy identification

The checklist was pilot tested using pre-recorded videos to identify unclear or ambiguous checklist items that could have varying interpretations. Anchors and explanatory statements were added to clarify certain checklist items based on group discussion (Additional file 3: Table S3).

Discussion

We have developed a consensus-based multisystem POCUS skills checklist to assess basic competency in image acquisition and anatomy identification. The checklist includes 153 items evaluating the skills needed to perform basic cardiac, lung, abdominal, and vascular ultrasound applications, including PIV insertion, that are commonly used in emergency medicine, critical care, and hospital medicine.

Our POCUS skills assessment checklist has noteworthy differences from other checklists. Most published POCUS skills checklists focus on assessing image acquisition skills for a single organ system, such as cardiac [5,6,7,8], thoracic [12,13,14], the FAST exam [9,10,11], vascular [5], neuromuscular [15], musculoskeletal [16], or procedures [17], or on assessing skills of clinicians from a single specialty, such as emergency medicine [10], surgery [9], or critical care [5]. In contrast, our checklist was based on consensus from 14 POCUS experts from emergency medicine (5), critical care (5), and hospital medicine (4) who practice at different medical centers across the United States. The value of our consolidated checklist is that it establishes a common standard for assessing skills in image acquisition and anatomy identification for basic, common POCUS applications across specialties. Institutions seeking tools to assess POCUS skills prior to granting privileges to use POCUS for clinical decision-making can use our checklist to efficiently evaluate physicians from different specialties.

Our multisystem POCUS skills checklist combines the use of checklist items and a global rating scale. Checklists use task-specific items that can provide both evaluative scoring with cutoff levels for “passing” and formative feedback. Checklists are perceived as being easier to use, especially for non-expert assessors, and as having better interrater reliability [17]. However, checklists may emphasize thoroughness over overall competency and may not capture a summative assessment of performance [18, 19]. One approach to overcome this limitation is increasing the point-value of critical checklist items or identifying checklist items that result in immediate disqualification from competency if performed incorrectly [18, 20]. By comparison, global rating scales provide an overall assessment of a learner’s skills and can differentiate learner levels with high reliability and sensitivity, particularly when applied by content experts [21,22,23]. For these reasons, a final global rating question was included to determine whether the learner has demonstrated the minimum skills in image acquisition and anatomy identification to be considered competent to perform the specified POCUS exam on patients.
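
As an illustration of how such a hybrid instrument could be scored, the sketch below combines weighted items, a disqualifying critical item, and a summative global rating. The items, weights, and pass cutoff are invented for this example, since the published checklist deliberately defers weighting (see the limitations below).

from dataclasses import dataclass

@dataclass
class Item:
    text: str
    performed: bool
    weight: int = 1         # critical items can carry extra point-value...
    critical: bool = False  # ...or disqualify immediately if missed

def passes(items: list[Item], globally_competent: bool, cutoff: float = 0.8) -> bool:
    """Hypothetical pass rule: no missed critical items, a weighted score
    at or above the cutoff, and a positive global rating."""
    if any(it.critical and not it.performed for it in items):
        return False  # missed critical item -> immediate disqualification
    earned = sum(it.weight for it in items if it.performed)
    possible = sum(it.weight for it in items)
    return earned / possible >= cutoff and globally_competent

# Invented example: 3/4 weighted points (75%) falls below the 80% cutoff.
items = [
    Item("Correct probe selected", performed=True, weight=2, critical=True),
    Item("Image depth optimized appropriately", performed=True),
    Item("Gain optimized appropriately", performed=False),
]
print(passes(items, globally_competent=True))  # False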

A rigorous multi-step process was conducted to develop our checklist from 2017 to 2021. Initially, speed and efficiency of image acquisition were included in the checklist. However, after pilot testing the initial version with novices, we noted substantial variability among expert faculty in the interpretation and application of these items and removed them, because consensus could not be achieved on their specific wording, anchoring, and scoring. In the final phase of checklist development, a standardized set of recorded skills exams from novice, intermediate, and experienced learners was independently reviewed and scored by the expert panel members, which led to the insertion of additional anchors to clarify some checklist items.

Our consensus-based multisystem checklist has limitations. First, POCUS competency requires mastery of image acquisition, image interpretation, and integration of findings into clinical decision-making, spanning the cognitive, psychomotor, and affective domains of learning [24, 25]. Our POCUS checklist assesses image acquisition skills and identification of normal structures, while additional assessment is needed for the cognitive domain. Second, we were unable to assess the interrater reliability of our checklist due to the cancellation of live, in-person courses during the COVID-19 pandemic. We plan to validate our checklist with learners after live, in-person POCUS courses resume. Third, we had to balance completeness against efficiency when selecting views for a multisystem POCUS skills checklist, and certain views, such as the left upper quadrant, were not included based on group consensus despite their importance. Finally, we have postponed weighting of critical checklist items until our checklist has been validated prospectively. We anticipate greater weighting of the final global rating question on competency for granting privileges.

Conclusions

We have developed a consensus-based multispecialty, multisystem POCUS checklist to assess basic competency in image acquisition and anatomy identification of cardiac, lung, abdominal, and vascular ultrasound, and PIV insertion. This checklist was designed to assess the skills of novice POCUS users from a wide range of specialties. Future steps include validating our checklist with learners during live in-person POCUS courses and determining its interrater reliability.

Availability of data and materials

Data are available upon request.

Abbreviations

ACGME: Accreditation Council for Graduate Medical Education

DVT: Deep venous thrombosis

FAST: Focused assessment with sonography in trauma

PIV: Peripheral intravenous line

POCUS: Point-of-care ultrasound

References

  1. Accreditation Council for Graduate Medical Education (ACGME). http://www.acgme.org. Accessed 11 Apr 2021

  2. Greenstein YY, Littauer R, Narasimhan M, Mayo PH, Koenig SJ (2016) Effectiveness of a critical care ultrasonography course. Chest 148:459A

  3. Schott CK, LoPresti M, Boyd JS et al (2021) Retention of point-of-care ultrasound skills among practicing physicians: findings of the VA national point-of-care ultrasound training program. Am J Med 134:391–399

  4. Wong A, Galarza L, Duska F (2019) Critical care ultrasound: a systematic review of international training competencies and program. Crit Care Med 47(3):e256–e262

  5. Patrawalla P, Eisen LA, Shiloh A et al (2015) Development and validation of an assessment tool for competency in critical care ultrasound. J Grad Med Educ 7(4):567–573

  6. Gaudet J, Waechter J, McLaughlin K et al (2016) Focused critical care echocardiography: development and evaluation of an image acquisition assessment tool. Crit Care Med 44(6):e329–e335

  7. Millington SJ, Arntfield RT, Hewak M et al (2016) The rapid assessment of competency in echocardiography scale: validation of a tool for point-of-care ultrasound. J Ultrasound Med 35(7):1457–1463

  8. Adamson R, Morris AE, Sun Woan J, Ma IWY, Schnobrich D, Soni NJ (2020) Development of a focused cardiac ultrasound image acquisition assessment tool. ATS Sch 1(3):260–277

  9. Ziesmann MT, Park J, Unger BJ et al (2015) Validation of the quality of ultrasound imaging and competence (QUICk) score as an objective assessment tool for the FAST examination. J Trauma Acute Care Surg 78(5):1008–1013

  10. Bell CR, McKaigney CJ, Holden M, Fichtinger G, Rang L (2017) Sonographic accuracy as a novel tool for point-of-care ultrasound competency assessment. AEM Educ Train 1(4):316–324

  11. Russell L, Østergaard ML, Nielsen MB, Konge L, Nielsen KR (2018) Standardised assessment of competence in focused assessment with sonography for trauma. Acta Anaesthesiol Scand 62:1154–1160

  12. Skaarup SH, Laursen CB, Bjerrum AS, Hilberg O (2017) Objective and structured assessment of lung ultrasound competence. A multispecialty delphi consensus and construct validity study. Ann Am Thorac Soc 14(4):555–560

  13. Millington SJ, Arntfield RT, Guo RJ et al (2017) The Assessment of Competency in Thoracic Sonography (ACTS) scale: validation of a tool for point-of-care ultrasound. Crit Ultrasound J 9(1):25

  14. Di Pietro S, Mascolo M, Falaschi F et al (2021) Lung-ultrasound objective structured assessment of technical skills (LUS-OSAUS): utility in the assessment of lung-ultrasound trained medical undergraduates. J Ultrasound 24(1):57–65

  15. Tawfik EA, Cartwright MS, Grimm A et al (2021) Neuromuscular ultrasound competency assessment: consensus-based survey. Muscle Nerve 63(5):651–656

  16. Kissin EY, Niu J, Balint P et al (2013) Musculoskeletal ultrasound training and competency assessment program for rheumatology fellows. J Ultrasound Med 32(10):1735–1743

  17. Kahr Rasmussen N, Nayahangan LJ, Carlsen J et al (2021) Evaluation of competence in ultrasound-guided procedures-a generic assessment tool developed through the Delphi method. Eur Radiol 31(6):4203–4211

  18. Desy J, Noble VE, Woo MY, Walsh M, Kirkpatrick AW, Ma IWY (2021) Use of critical items in determining point-of-care ultrasound competence. Eval Health Prof 44(3):220–225

  19. Walzak A, Bacchus M, Schaefer JP et al (2015) Diagnosing technical competence in six bedside procedures: comparing checklists and a global rating scale in the assessment of resident performance. Acad Med 90(8):1100–1108

  20. Payne NJ, Bradley EB, Heald EB et al (2008) Sharpening the eye of the OSCE with critical action analysis. Acad Med 83(10):900–905

  21. Hodges B, McIlroy JH (2003) Analytic global OSCE ratings are sensitive to level of training. Med Educ 37(11):1012–1016

  22. Hodges B, Regehr G, McNaughton N, Tiberius R, Hanson M (1999) OSCE checklists do not capture increasing levels of expertise. Acad Med 74(10):1129–1134

  23. Regehr G, MacRae H, Reznick RK, Szalay D (1998) Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Acad Med 73(9):993–997

  24. Soni NJ, Tierney DM, Jensen TP, Lucas BP (2017) Certification of point-of-care ultrasound competency. J Hosp Med 12(9):775–776

  25. Menix KD (1996) Domains of learning: interdependent components of achievable learning outcomes. J Contin Educ Nurs 27(5):200–208

Acknowledgements

The authors would like to thank the faculty and staff of The Winter Institute for Simulation, Education, and Research (WISER) center (Pittsburgh, PA) for their time, space, and resources to conduct this study.

Funding

Drs. Schott and LoPresti received funding for this work from an Innovation Grant by the Alliance for Academic Medicine. Dr. Soni receives funding from the Department of Veterans Affairs Quality Enhancement Research Initiative (QUERI) Partnered Evaluation Initiative (I50 HX002263-01A1) and National Center for Patient Safety. None of the funding agencies supporting this work were involved with the study design; collection, analysis, and interpretation of data; writing of the report; or the decision to submit the article for publication. The contents of this publication do not represent the views of the U.S. Department of Veterans Affairs or the United States Government.

Author information

Authors and Affiliations

Authors

Contributions

NJS, CKS, MM, MA, JSB, JP, EPF, and CLM conceived and designed the study, including the initial skills checklist. NJS, RN, MA, RK, KV, KK, JSB, CLM, DR, ZB, JW, BB, HS, EW, and CKS piloted the checklist, served on the expert voting panel, collected data, and finalized the checklist. MM, EKH, NS, JP, and EPF performed data cleaning and analysis and prepared the tables and figures for the manuscript. NJS, RN, MA, RK, KV, KK, JSB, CLM, DR, ZB, JW, BB, HS, EW, and CKS contributed to drafting the manuscript, and all authors contributed substantially to revisions and finalization. NJS and CKS take primary responsibility for the data presented in this manuscript. NJS takes responsibility as the corresponding author and for the manuscript as a whole. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Nilam J. Soni.

Ethics declarations

Ethics approval and consent to participate

The University of Pittsburgh Institutional Review Board approved this project (IRB # PRO18050302).

Consent for publication

All authors agree to publication of the manuscript in The Ultrasound Journal.

Competing interests

All authors report no competing or conflicting interests related to this manuscript.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Table S1. Characteristics of the Point-of-care Ultrasound Expert Panel.

Additional file 2: Table S2. Voting Results of Point-of-care Ultrasound Expert Panel.

Additional file 3: Table S3. Multisystem Point-of-care Ultrasound Checklist for Basic Competency in Image Acquisition and Anatomy Identification. A total of 153 checklist items reached consensus and include four diagnostic applications (cardiac, lung, abdominal, and vascular ultrasound) and one procedural application (peripheral intravenous line insertion).

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Soni, N.J., Nathanson, R., Andreae, M. et al. Development of a multisystem point of care ultrasound skills assessment checklist. Ultrasound J 14, 17 (2022). https://doi.org/10.1186/s13089-022-00268-4
