16 - 20 February 2025
San Diego, California, US
Conference 13407, Paper 13407-113

A comparative study of deep learning architectures for classification of human burn wounds using visual light and multispectral SWIR imaging

19 February 2025 • 5:30 PM - 7:00 PM PST | Golden State Ballroom

Abstract

Accurate determination of burn wound depth is crucial in the resection of injured tissue to avoid infection, complications in healing, and the unnecessary removal of healthy tissue. This study aims to lay the groundwork for improving burn depth classification accuracy through multispectral short-wave infrared (SWIR) imaging and deep learning. A total of 267 regions of interest (ROIs) were imaged at wavelength bands ranging from 1200 to 2250 nm. Burn categories were classified blindly by surgeons using visual light images of the ROIs. When trained on the SWIR images alone, a MobileNet CNN model achieved average accuracies ranging from 0.18 to 0.94 in predicting the surgeons' classifications of operable burns, superficial-thickness burns, and normal skin. Classification accuracy for operable burns was significantly higher (up to 94%) than for the other classes. Ongoing work to improve accuracy includes incorporating spectral and texture-based features, histological classification, and segmentation of burns.
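As context for the per-class accuracies reported above, class-wise accuracy (the fraction of a class's samples the model predicts correctly) can be read off the diagonal of a confusion matrix. A minimal sketch follows; the matrix values, class ordering, and variable names are illustrative assumptions, not data or code from the study.

```python
import numpy as np

# Hypothetical 3-class confusion matrix (rows = surgeon label, cols = model prediction).
# Classes: 0 = operable burn, 1 = superficial-thickness burn, 2 = normal skin.
# These counts are illustrative only, not results from the study.
cm = np.array([
    [47,  2,  1],   # operable burns
    [10,  5,  5],   # superficial-thickness burns
    [ 3,  4, 13],   # normal skin
])

# Per-class accuracy = correct predictions for a class / total samples of that class.
per_class_acc = cm.diagonal() / cm.sum(axis=1)

for name, acc in zip(["operable", "superficial", "normal"], per_class_acc):
    print(f"{name}: {acc:.2f}")
```

With these made-up counts, the operable class scores 0.94 while the other classes score much lower, illustrating how a wide per-class accuracy range (such as the 0.18 to 0.94 spread reported) can coexist with strong performance on one class.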

Presenter

Mignon Frances Dumanjog
The Univ. of Texas at San Antonio (United States); MATRIX, the UTSA AI Consortium for Human Well-Being, The Univ. of Texas at San Antonio (United States)
Mignon Dumanjog is a graduating master's student in the biomedical engineering program at the University of Texas at San Antonio. She earned a bachelor's degree in physics from the University of the Philippines - Diliman. She works as a graduate research assistant in the Qutub Lab, specializing in computational biology and computer vision applied to biomedical data such as cell microscopy images. She became part of the TRC4 Initiative Project for Novel Short Wave Assessment Tool in Texas (SWATT) to Enhance Burn Tissue Viability Assessment, where she employs computer vision and deep learning techniques to classify burn wound images by depth and operability.
Application tracks: AI/ML
Presenter/Author
Mignon Frances Dumanjog
The Univ. of Texas at San Antonio (United States); MATRIX, the UTSA AI Consortium for Human Well-Being, The Univ. of Texas at San Antonio (United States)
Author
Ctr. for Organogenesis, Regeneration and Trauma, The Univ. of Texas Southwestern Medical Ctr. at Dallas (United States)
Author
Ctr. for Organogenesis, Regeneration and Trauma, The Univ. of Texas Southwestern Medical Ctr. at Dallas (United States)
Author
Univ. of Michigan (United States)
Author
Ctr. for Organogenesis, Regeneration and Trauma, The Univ. of Texas Southwestern Medical Ctr. at Dallas (United States)
Author
Omer Berenfeld
Univ. of Michigan (United States)
Author
Benjamin Levi
Ctr. for Organogenesis, Regeneration and Trauma, The Univ. of Texas Southwestern Medical Ctr. at Dallas (United States)
Author
The Univ. of Texas at San Antonio (United States); MATRIX, the UTSA AI Consortium for Human Well-Being, The Univ. of Texas at San Antonio (United States)