Short communication · Open access

The association of attentional foci and image interpretation accuracy in novices interpreting lung ultrasound images: an eye-tracking study


It is unclear where learners focus their attention when interpreting point-of-care ultrasound (POCUS) images. This study sought to determine the relationship between attentional foci metrics and lung ultrasound (LUS) interpretation accuracy in novice medical learners. A convenience sample of 14 medical residents with minimal LUS training viewed 8 LUS cineloops while their eye-tracking patterns were recorded. Areas of interest (AOI) for each cineloop were mapped independently by two experts and externally validated by a third expert. The primary outcome of interest was image interpretation accuracy, presented as a percentage. Eye-tracking data were captured for 10 of the 14 participants (71%) who completed the study. Participants spent a mean total of 8 min 44 s [standard deviation (SD) 3 min 8 s] on the cineloops, of which 1 min 14 s (SD 34 s) was spent fixated in the AOI. The mean accuracy score was 54.0% (SD 16.8%). In regression analyses, fixation duration within the AOI was positively associated with accuracy [beta-coefficient 28.9, standard error (SE) 6.42, P = 0.002]. Total time spent viewing the videos was also significantly associated with accuracy (beta-coefficient 5.08, SE 0.59, P < 0.0001). For each additional minute spent fixating within the AOI, accuracy scores increased by 28.9%; for each additional minute spent viewing the video, accuracy scores increased by only 5.1%. Interpretation accuracy is strongly associated with time spent fixating within the AOI. Image interpretation training should consider targeting AOIs.


Point-of-care ultrasound (POCUS) can be used at the bedside when assessing patients with heart failure/acute dyspnea to increase diagnostic accuracy [1, 2] and provide important prognostic information [3,4,5,6]. The need to incorporate POCUS into the practice of internal medicine is increasingly recognized internationally [7, 8]. However, POCUS skills are complex to teach, involving image acquisition, interpretation, and clinical integration [7,8,9]. Despite image interpretation being a fundamental skill, few studies exist to guide educators on how to teach it [10]. In diagnostic imaging studies, eye-tracking technology has provided educators with a better understanding of what the image interpretation task involves and its associated errors [11,12,13]. Most of these studies involved radiologists interpreting radiographs and computed tomography; few involved ultrasound. Where eye-tracking studies were conducted on POCUS [14,15,16,17], differences in eye movement between experts and novices were noted. However, the relationship between interpretation accuracy and eye movement remains undefined. We hypothesize that POCUS interpretation accuracy is related to the learners’ attentional foci on the areas of interest (AOI) relevant to the diagnosis. If this relationship proves true, educators could consider targeting training to AOIs in their educational interventions for those learning image interpretation.


Between January 2020 and January 2021, we invited a convenience sample of 14 internal medicine residents with at least some prior lung ultrasound (LUS) training to participate in this cross-sectional study. We excluded those with no prior LUS training, as tracking uninformed eye movements during image interpretation may not yield helpful information.

After performing eye-tracking calibration in a seated position, consenting participants viewed and interpreted 8 LUS videos on a standardized laptop (Asus ROG Strix, GL503V) with an eye-tracking system (Tobii Tech, Danderyd, Sweden) mounted on the laptop. The eight videos were created from 6-s cineloops from our program’s anonymized teaching bank. These cineloops were played in a continuous loop for 30 s, portraying the following common LUS findings: normal lung (× 2), absent lung sliding, pleural effusion, mirror image artifact with a negative spine sign, pleural irregularity with B-lines, M-mode demonstrating absent lung sliding, and presence of B-lines and A-lines. Each video was accompanied by 1–3 questions regarding the findings and diagnosis (see Additional file 1). Participants were instructed to read the paper-based questions for each video prior to viewing, so that they were aware of what findings to anticipate. Participants had the option to exit a video early or to view it one additional time within 1 min (maximum allotted duration 2 min).

Defining areas of interest (AOI)

AOIs for each ultrasound video were defined as areas on the ultrasound image that required evaluation to rule in or rule out a specific finding. For example, evaluation of the spine in the far field of a coronal image of the lung base is important to rule in or rule out a pleural effusion, and examination of the pleural line is important to determine whether pleural sliding is present [18]. AOIs for each video were mapped independently in March 2021 by two experts (IM, JD), both certified by the American Registry for Diagnostic Medical Sonography. Three discrepancies in AOI mapping were resolved by discussion. One discrepancy involved evaluation of the lung zone labelling for a normal lung cineloop and a second involved evaluating the depth scale in a cineloop for B-lines; after discussion, the experts agreed that neither was definitively critical to the cineloops’ diagnosis. The third discrepancy involved evaluation of the far-field findings deep to a non-sliding pleura, which was agreed to be an important area to evaluate. Both experts were blinded to the participant data, which were collected by the resident investigator (ML). The AOIs were then externally validated using eye movement data of a third expert (ACT) external to our institution, whose eye movements were captured in September 2019 during a site visit.

Outcome variables

Total fixation duration was defined as the duration of all fixations within the AOI (I-VT filter, default settings, minimum fixation duration of 60 ms; User’s manual, Tobii Studio, version 3.4.8, 2017, pp. 54–57). Total time spent viewing the videos was the time spent both within and outside the AOI. Gaze plots were created using Tobii Studio software and examined qualitatively (Fig. 1).
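As a rough illustration of the metric defined above (not the Tobii Studio implementation), total fixation duration within an AOI amounts to summing the durations of fixations whose coordinates fall inside any AOI rectangle. The data layout and function name here are hypothetical:

```python
# Hypothetical sketch: summing fixation durations inside rectangular AOIs.
# Real Tobii exports have a richer schema; this only shows the aggregation.

def total_fixation_in_aoi(fixations, aois):
    """Sum durations (ms) of fixations whose centroid lies in any AOI.

    fixations: list of (x, y, duration_ms) tuples
    aois: list of (x_min, y_min, x_max, y_max) rectangles in screen pixels
    """
    total = 0
    for x, y, dur in fixations:
        if any(x0 <= x <= x1 and y0 <= y <= y1 for x0, y0, x1, y1 in aois):
            total += dur
    return total

fixations = [(100, 200, 250), (400, 420, 180), (105, 210, 90)]
aois = [(80, 180, 160, 260)]  # e.g., a box over the pleural line
print(total_fixation_in_aoi(fixations, aois))  # 250 + 90 = 340 ms
```

Total viewing time, by contrast, would be the sum over all fixations regardless of whether they fall in an AOI.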

Fig. 1

Gaze plots of expert (green, top left) and novices (purple) in identifying the presence of a positive spine sign and the pleural effusion. Top right: gaze plot of a novice who scored 0% on the video with minimal gaze on the spine. Bottom left: gaze plot of a different novice who scored 50% on the video. Bottom right: gaze plot of a third novice who correctly identified both findings and scored 100%

Accuracy score was calculated as the number of correct responses on image interpretation (out of 15), presented as a percentage. Validity evidence for the questionnaire was evaluated in two ways. First, in September 2019, for content validity, the questionnaire was reviewed and completed independently by two education experts (JD, ACT) not involved in test construction; both scored 100%. Second, we evaluated the internal reliability of the questionnaire using Cronbach’s alpha (alpha = 0.68).
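The internal-reliability statistic reported above can be sketched in a few lines. The formula is the standard one for Cronbach’s alpha; the item data and function name here are purely illustrative:

```python
# Illustrative sketch (not the study's code): Cronbach's alpha for a
# k-item questionnaire, alpha = k/(k-1) * (1 - sum(item var)/var(total)).

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    n = len(items[0])

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(pvar(i) for i in items) / pvar(totals))

# Two perfectly consistent items yield alpha = 1.0
print(cronbach_alpha([[1, 0, 1, 1], [1, 0, 1, 1]]))
```

Values near the study’s 0.68 arise when items correlate only moderately, which is why a longer or more homogeneous questionnaire tends to raise alpha.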

Statistical analyses

Standard descriptive statistics are reported. The independent association between eye-tracking variables and accuracy score was explored using univariate linear regression analyses. A two-sided p value < 0.05 was considered to indicate statistical significance. All analyses were performed using SAS version 9.4 (SAS Institute Inc., Cary, NC) and STATA 17.0 (StataCorp, College Station, TX).
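As a minimal sketch of the univariate analysis (the actual analyses were run in SAS and STATA), the reported beta-coefficient and its standard error for a single predictor can be computed as follows; the toy data are hypothetical:

```python
import math

# Hypothetical sketch of a univariate OLS fit, returning the quantities
# reported in the text: slope (beta), its standard error, and t-statistic.
# A p value would come from a t distribution with n-2 degrees of freedom
# (e.g., scipy.stats.t.sf), omitted here to stay stdlib-only.

def univariate_ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    beta = sxy / sxx                          # slope
    intercept = my - beta * mx
    resid_ss = sum((yi - (intercept + beta * xi)) ** 2
                   for xi, yi in zip(x, y))
    se = math.sqrt(resid_ss / (n - 2) / sxx)  # standard error of the slope
    return beta, se, beta / se

# Toy data: predictor (e.g., minutes fixated) vs. an accuracy-like score
beta, se, t = univariate_ols([0, 1, 2, 3], [1, 3, 4, 7])
```

With this framing, the study’s β of 28.9 reads directly as the expected change in accuracy score per one-unit (one-minute) change in fixation duration.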


All invited internal medicine residents completed the study (n = 14). However, eye tracking for four participants (29%) was not captured by the system and the data for these were excluded. Of the remaining 10 participants, five (50%) were female; all ten were first year residents. Nine (90%) reported using POCUS for under 1 year, while one (10%) reported 1–2 years of POCUS use.

Participants spent an average of 8 min 44 s [standard deviation (SD) 3 min 8 s] viewing the videos, of which the average total fixation duration in the AOI was 1 min 14 s (SD 30 s). The mean accuracy score was 54.0% (SD 16.8%; range 33.3–80.0%).

Total fixation duration was significantly associated with accuracy score [beta-coefficient (β) 28.9, standard error (SE) 6.42, P = 0.002]. Total time spent viewing the videos was also associated with accuracy (β = 5.08, SE 0.59, P < 0.0001), but less strongly than total fixation duration. Figure 1 illustrates representative gaze plots, demonstrating qualitative differences between the gaze plots of an expert and those of participants; Fig. 2 plots accuracy scores against total fixation duration within the AOI.

Fig. 2

Scatter plot of image interpretation accuracy score, presented as a percent, vs. total fixation duration within areas of interest (seconds), with fitted line shown in black and the confidence interval for the mean shown in gray


In this eye-tracking study of medical learners, total time spent fixating in the AOI, as well as time spent viewing the videos, was associated with interpretation accuracy. For every additional minute spent fixating in the AOI, accuracy score increased by 28.9%, while every additional minute spent viewing the videos in general increased the score by only 5.1%. Our results support the hypothesis that accuracy is associated with attentional foci within the AOI.

Our results extend prior eye-tracking studies in POCUS, most of which explored expert-novice differences. Of two studies on ultrasound-guided regional anesthesia, one found that novices spent more gaze time outside the AOI than experts [16], and in the other, fixation patterns differed qualitatively between one expert and one novice [17]. Two studies evaluated the interpretation of abdominal free fluid, and both identified significant differences between experts and novices in their fixations within the AOI [14, 15]. While helpful, identifying expert-novice differences may not be sufficient validity evidence [19]. Our study adds to this body of literature by demonstrating an additional source of validity evidence: relations to other variables [20, 21], namely, interpretation accuracy.

How can eye-tracking data assist an educator? From an assessment perspective [20, 21], eye-tracking data can provide evidence for the response process of trainees. From a training perspective, eye-tracking data may provide feedback to learners [22] by demonstrating where errors in attention may lie. Prior studies on non-ultrasound imaging suggest that training using eye movement feedback data may increase interpretation accuracy [23, 24] and improve decision time [23, 25]. One randomized study that used eye movement technology to train learners where to look in ultrasound videos found higher interpretation accuracy [26]. For programs without eye-tracking technology, learners may still benefit from being taught the key AOIs to attend to during image interpretation.

Our study has some limitations. First, this is a single-institution study, which limits the generalizability of our conclusions. Second, despite finding a significant association, our study has a small sample size, compounded by the loss of four participants whose eye movement data were not captured even though they completed the study. Third, our questionnaire’s internal reliability was 0.68, lower than the frequently cited threshold of 0.7 [27]. It is possible that interpretation competence is multidimensional [28], which would account for the low internal reliability. Alternatively, a longer questionnaire may be needed to demonstrate higher internal reliability. Fourth, one of the participants had more POCUS experience than the rest of the cohort. Reassuringly, however, removing this participant’s data did not materially change our conclusions: total fixation duration remained significantly associated with accuracy score (β = 25.5, SE 6.79, P = 0.007), as did total time spent viewing the videos (β = 4.68, SE 0.60, P = 0.0001).


For novices interpreting LUS videos, total time spent fixating in the AOI was strongly and positively associated with interpretation accuracy. Novices may benefit from explicit instruction on the key areas to look at during image interpretation.

Availability of data and materials

The data sets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

AOI: Area of interest

LUS: Lung ultrasound

POCUS: Point-of-care ultrasound

SD: Standard deviation

SE: Standard error


  1. Maw AM, Hassanin A, Ho PM et al (2019) Diagnostic accuracy of point-of-care lung ultrasonography and chest radiography in adults with symptoms suggestive of acute decompensated heart failure: a systematic review and meta-analysis. JAMA Netw Open 2(3):e190703
  2. Qaseem A, Etxeandia-Ikobaltzeta I, Mustafa RA, Kansagara D, Fitterman N, Wilt TJ (2021) Appropriate use of point-of-care ultrasonography in patients with acute dyspnea in emergency department or inpatient settings: a clinical guideline from the American College of Physicians. Ann Intern Med 174(7):985–993
  3. Araiza-Garaygordobil D, Gopar-Nieto R, Martínez-Amezcua P et al (2021) Point-of-care lung ultrasound predicts in-hospital mortality in acute heart failure. QJM 114(2):111–116
  4. Platz E, Lewis EF, Uno H et al (2016) Detection and prognostic value of pulmonary congestion by lung ultrasound in ambulatory heart failure patients. Eur Heart J 37(15):1244–1251
  5. Gargani L, Pugliese NR, Frassi F et al (2021) Prognostic value of lung ultrasound in patients hospitalized for heart disease irrespective of symptoms and ejection fraction. ESC Heart Fail 8(4):2660–2669
  6. Gargani L, Pang PS, Frassi F et al (2015) Persistent pulmonary congestion before discharge predicts rehospitalization in heart failure: a lung ultrasound study. Cardiovasc Ultrasound 13:40
  7. Soni NJ, Schnobrich D, Matthews BK et al (2019) Point-of-care ultrasound for hospitalists: a position statement of the Society of Hospital Medicine. J Hosp Med 14:E1–E6
  8. Torres-Macho J, Aro T, Bruckner I et al (2020) Point-of-care ultrasound in internal medicine: a position paper by the ultrasound working group of the European Federation of Internal Medicine. Eur J Intern Med 73:67–71
  9. LoPresti CM, Jensen TP, Dversdal RK, Astiz DJ (2019) Point of care ultrasound for internal medicine residency training: a position statement from the Alliance of Academic Internal Medicine. Am J Med 132(11):1356–1360
  10. Moses A, Weng W, Orchanian-Cheff A, Cavalcanti RB (2020) Teaching point-of-care ultrasound in medicine. Can J Gen Intern Med 15:13–29
  11. Wu C-C, Wolfe JM (2019) Eye movements in medical image perception: a selective review of past, present and future. Vision 3(2):32
  12. Brunyé TT, Drew T, Weaver DL, Elmore JG (2019) A review of eye tracking for understanding and improving diagnostic interpretation. Cogn Res Princ Implic 4(1):7
  13. Van der Gijp A, Ravesloot C, Jarodzka H et al (2017) How visual search relates to visual diagnostic performance: a narrative systematic review of eye-tracking research in radiology. Adv Health Sci Educ Theory Pract 22(3):765–787
  14. Lee WF, Chenkin J (2021) Exploring eye-tracking technology as an assessment tool for point-of-care ultrasound training. AEM Educ Train 5(2):e10508
  15. Bell CR, Szulewski A, Walker M et al (2021) Differences in gaze fixation location and duration between resident and fellowship sonographers interpreting a focused assessment with sonography in trauma. AEM Educ Train 5(1):28–36
  16. Borg LK, Harrison TK, Kou A et al (2018) Preliminary experience using eye-tracking technology to differentiate novice and expert image interpretation for ultrasound-guided regional anesthesia. J Ultrasound Med 37(2):329–336
  17. Harrison TK, Kim TE, Kou A et al (2016) Feasibility of eye-tracking technology to quantify expertise in ultrasound-guided regional anesthesia. J Anesth 30(3):530–533
  18. Desy J, Noble VE, Liteplo AS et al (2021) Minimal criteria for lung ultrasonography in internal medicine. Can J Gen Intern Med 16(2):6–13
  19. Cook DA (2015) Much ado about differences: why expert-novice comparisons add little to the validity argument. Adv Health Sci Educ Theory Pract 20(3):829–834
  20. Yudkowsky R, Park Y, Downing S (2020) Assessment in health professions education, 2nd edn. Routledge, New York
  21. American Educational Research Association, American Psychological Association, National Council on Measurement in Education (2014) Standards for educational and psychological testing. American Educational Research Association, Washington, DC
  22. Ashraf H, Sodergren MH, Merali N, Mylonas G, Singh H, Darzi A (2018) Eye-tracking technology in medical education: a systematic review. Med Teach 40(1):62–69
  23. Litchfield D, Ball L, Donovan T, Manning D, Crawford T (2008) Learning from others: effects of viewing another person’s eye movements while searching for chest nodules. In: Medical Imaging 2008: Image Perception, Observer Performance, and Technology Assessment
  24. Gegenfurtner A, Lehtinen E, Jarodzka H, Säljö R (2017) Effects of eye movement modeling examples on adaptive expertise in medical image diagnosis. Comput Educ 113:212–225
  25. Quen MTZ, Mountstephens J, Teh YG, Teo J (2021) Medical image interpretation training with a low-cost eye tracking and feedback system: a preliminary study. Healthc Technol Lett 1:1–7
  26. Darici D, Masthoff M, Rischen R, Schmitz M, Ohlenburg H, Missler M (2023) Medical imaging training with eye movement modeling examples: a randomized controlled study. Med Teach
  27. Taber KS (2018) The use of Cronbach’s alpha when developing and reporting research instruments in science education. Res Sci Educ 48:1273–1296
  28. Cortina JM (1993) What is coefficient alpha? An examination of theory and applications. J Appl Psychol 78:98–104



The authors wish to thank all participants in this study, as well as Sarah Simmons, Julia Kupis and Michelle Yee-Yan Cheng from the W21C for their assistance. Preliminary results were presented at the Canadian Society of Internal Medicine Virtual Annual Meeting, Research Poster Presentations, November 17, 2021.


This study was funded by The John A. Buchanan Chair in General Internal Medicine at the University of Calgary. The funder had no role in the design and conduct of the study, nor the decision to prepare and submit the manuscript for publication.

Author information


ML and JD contributed to the conception and design of the work; acquisition and interpretation of data; and drafted and substantively revised the work. MHW contributed to the conception and design of the work; and drafted and substantively revised the work. ACT contributed to the analysis and interpretation of data; drafted and substantively revised the work. IWYM contributed to the conception and design of the work; acquisition, analysis, and interpretation of data; and drafted and substantively revised the work. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Irene W. Y. Ma.

Ethics declarations

Ethics approval and consent to participate

The University of Calgary Conjoint Health Research Ethics Board approved this study (REB-17-0428). All participants provided informed written consent prior to enrollment.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1:

Supplement 1, Eye Tracking Questions.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit

Reprints and permissions

About this article


Cite this article

Lee, M., Desy, J., Tonelli, A.C. et al. The association of attentional foci and image interpretation accuracy in novices interpreting lung ultrasound images: an eye-tracking study. Ultrasound J 15, 36 (2023).
