Paper 13412-2
Echocardiographic view classification using ECG-assisted multi-phase image sampling
18 February 2025 • 10:50 AM - 11:10 AM PST | Palm 2
Abstract
Transthoracic echocardiography (TTE) is the most common non-invasive imaging technique used to diagnose cardiac disease, capturing video assessments from up to 27 unique views. Accurate view classification is crucial for downstream tasks that use TTE data. Traditional methods often rely on random or fixed video-frame sampling, leading to misaligned training data. This study introduces a novel deep learning approach that uses pixel-embedded ECG signals to enhance TTE view classification. We developed an ensemble of three cardiac phase-specific models, each a convolutional neural network (CNN) based on EfficientNet-B2 and trained on images sampled at the R-wave (early systole), late systole, and diastole. The individual model predictions were combined by majority vote, achieving an accuracy of 92.4% and an AUC of 99% across 11 TTE views and offering a robust and efficient solution for view classification. Further validation is needed to confirm generalizability across larger cohorts and variable data inputs.
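The sketch below illustrates the ensemble structure described in the abstract: one EfficientNet-B2 classifier per sampled cardiac phase, combined at inference time by majority vote over 11 view classes. It is a minimal illustration assuming PyTorch and torchvision; the model construction, phase names, and voting details are assumptions for exposition, not the authors' released implementation.

```python
# Hypothetical sketch of the phase-specific ensemble, assuming PyTorch + torchvision.
import torch
import torch.nn as nn
from torchvision.models import efficientnet_b2

NUM_VIEWS = 11  # number of TTE view classes reported in the abstract

def make_view_classifier(num_classes: int = NUM_VIEWS) -> nn.Module:
    """One EfficientNet-B2 backbone with its classification head resized."""
    model = efficientnet_b2(weights=None)  # ImageNet weights could be used instead
    in_features = model.classifier[1].in_features
    model.classifier[1] = nn.Linear(in_features, num_classes)
    return model

class PhaseEnsemble(nn.Module):
    """Majority vote over three cardiac phase-specific view classifiers."""
    def __init__(self):
        super().__init__()
        # One model per sampled phase: R-wave (early systole), late systole, diastole
        self.models = nn.ModuleDict({
            "r_wave": make_view_classifier(),
            "late_systole": make_view_classifier(),
            "diastole": make_view_classifier(),
        })

    @torch.no_grad()
    def predict(self, frames: dict[str, torch.Tensor]) -> torch.Tensor:
        # frames[phase]: batch of images (N, 3, H, W) sampled at that cardiac phase
        votes = torch.stack(
            [self.models[p](frames[p]).argmax(dim=1) for p in self.models],
            dim=1,
        )  # (N, 3): one predicted view index per phase-specific model
        # Majority vote; torch.mode resolves ties toward the smallest class index
        return votes.mode(dim=1).values

if __name__ == "__main__":
    # Random tensors stand in for ECG-gated, phase-sampled echo frames
    ensemble = PhaseEnsemble().eval()
    batch = {phase: torch.randn(2, 3, 288, 288) for phase in ensemble.models}
    print(ensemble.predict(batch))  # tensor of predicted view indices, shape (2,)
```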
Presenter
Zahra Ghods
Univ. of Calgary (Canada)
I am currently pursuing a Master’s degree in Computer Science at the University of Calgary, where I have focused my research on applying artificial intelligence in the medical field. My work primarily involves developing and optimizing deep learning models for cardiac imaging, including echocardiograms and MRI scans, with the goal of enhancing diagnostic accuracy and predictive capabilities.
Throughout my academic career, I have contributed to various projects, such as segmenting the left atrium of the heart, analyzing ECG signals from echocardiograms, and classifying TTE views; the latter forms the basis of the work presented at this conference. My dedication to advancing AI in biomedicine drives my efforts to create innovative solutions that could have a meaningful impact on global healthcare.
I am honored to present my research at this esteemed conference and to contribute to the ongoing advancements in this field.