Language
English
Publication Date
1-1-2024
Journal
MELBA Journal: Machine Learning for Biomedical Imaging
DOI
10.59275/j.melba.2024-g93a
PMID
40453064
PMCID
PMC12123533
PubMedCentral® Posted Date
10-23-2025
PubMedCentral® Full Text Version
Author MSS
Abstract
Clinically deployed deep learning-based segmentation models are known to fail on data outside of their training distributions. While clinicians review the segmentations, these models tend to perform well in most instances, which could exacerbate automation bias. Therefore, detecting out-of-distribution images at inference is critical to warn clinicians that the model likely failed. This work applied the Mahalanobis distance (MD) post hoc to the bottleneck features of four Swin UNETR and nnU-Net models that segmented the liver on T1-weighted magnetic resonance imaging and computed tomography. By reducing the dimensionality of the bottleneck features with either principal component analysis or uniform manifold approximation and projection, images that the models failed on were detected with high performance and minimal computational load. In addition, this work explored a non-parametric alternative to the MD: a k-th nearest neighbors distance (KNN). KNN drastically improved scalability and performance over MD when both were applied to raw and average-pooled bottleneck features. Our code is available at https://github.com/mckellwoodland/dimen_reduce_mahal.
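Illustrative Code Sketch
The following is a minimal sketch, not the authors' released implementation (see the repository linked in the abstract), of the scoring approach the abstract describes: reduce bottleneck features with PCA, score test images by Mahalanobis distance to the training distribution, and compare against a k-th nearest neighbors distance as the non-parametric alternative. It assumes features have already been extracted as 2-D NumPy arrays (one row per image); the function names, the number of PCA components, and the choice of k are illustrative defaults, not the paper's settings.

# Minimal sketch of the OOD-scoring approach described in the abstract.
# Assumes bottleneck features have already been extracted from the
# segmentation model; all names and defaults here are illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestNeighbors

def mahalanobis_scores(train_feats, test_feats, n_components=32):
    """PCA-reduce features, then score test images by Mahalanobis
    distance to the training feature distribution (higher = more OOD)."""
    pca = PCA(n_components=n_components).fit(train_feats)
    train_z = pca.transform(train_feats)
    test_z = pca.transform(test_feats)
    mean = train_z.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(train_z, rowvar=False))
    diff = test_z - mean
    # Per-image squared Mahalanobis distance: diff @ cov_inv @ diff^T, row-wise.
    return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))

def knn_scores(train_feats, test_feats, k=10):
    """Non-parametric alternative: distance from each test image to its
    k-th nearest training feature (higher = more OOD)."""
    nn = NearestNeighbors(n_neighbors=k).fit(train_feats)
    dists, _ = nn.kneighbors(test_feats)
    return dists[:, -1]

Both functions return one score per test image; thresholding these scores (e.g., at a percentile of the training-set scores) would flag images on which the segmentation model is likely to fail.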
Keywords
Out-of-distribution detection, Mahalanobis distance, Nearest Neighbors, Principal component analysis, Uniform manifold approximation and projection
Published Open-Access
yes
Recommended Citation
Woodland, McKell; Patel, Nihil; Castelo, Austin; et al., "Dimensionality Reduction and Nearest Neighbors for Improving Out-of-Distribution Detection in Medical Image Segmentation" (2024). Faculty and Staff Publications. 5319.
https://digitalcommons.library.tmc.edu/baylor_docs/5319