Language

English

Publication Date

1-1-2024

Journal

MELBA Journal: Machine Learning for Biomedical Imaging

DOI

10.59275/j.melba.2024-g93a

PMID

40453064

PMCID

PMC12123533

PubMedCentral® Posted Date

10-23-2025

PubMedCentral® Full Text Version

Author MSS

Abstract

Clinically deployed deep learning-based segmentation models are known to fail on data outside of their training distributions. Although clinicians review the segmentations, these models perform well in most instances, which could exacerbate automation bias. Therefore, detecting out-of-distribution images at inference is critical to warn clinicians that the model likely failed. This work applied the Mahalanobis distance (MD) post hoc to the bottleneck features of four Swin UNETR and nnU-Net models that segmented the liver on T1-weighted magnetic resonance imaging and computed tomography. By reducing the dimensions of the bottleneck features with either principal component analysis or uniform manifold approximation and projection, images the models failed on were detected with high performance and minimal computational load. In addition, this work explored a non-parametric alternative to the MD, a k-th nearest neighbors distance (KNN). KNN drastically improved scalability and performance over MD when both were applied to raw and average-pooled bottleneck features. Our code is available at https://github.com/mckellwoodland/dimen_reduce_mahal.
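The following is a minimal sketch of the post hoc out-of-distribution scores described in the abstract: a Mahalanobis distance computed on dimensionality-reduced bottleneck features and a k-th nearest neighbors distance as the non-parametric alternative. The feature shapes, the choice of two principal components, and the value of k are illustrative assumptions, not the paper's exact settings; see the linked repository for the authors' implementation.

```python
# Sketch of the two OOD scores from the abstract, assuming flattened
# bottleneck features are already extracted as NumPy arrays.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
train_feats = rng.normal(size=(200, 768))  # in-distribution (training) bottleneck features
test_feats = rng.normal(size=(10, 768))    # features of images seen at inference

# Reduce dimensionality before fitting the Gaussian used by the Mahalanobis distance.
pca = PCA(n_components=2).fit(train_feats)
z_train = pca.transform(train_feats)
z_test = pca.transform(test_feats)

# Mahalanobis distance to the training feature distribution (shared mean and covariance).
mean = z_train.mean(axis=0)
cov_inv = np.linalg.pinv(np.cov(z_train, rowvar=False))
diff = z_test - mean
md_scores = np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))

# Non-parametric alternative: distance to the k-th nearest training feature.
k = 5
knn = NearestNeighbors(n_neighbors=k).fit(train_feats)
knn_scores = knn.kneighbors(test_feats)[0][:, -1]

# Higher scores suggest the input is out-of-distribution and the segmentation may be unreliable.
print(md_scores)
print(knn_scores)
```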

Keywords

Out-of-distribution detection, Mahalanobis distance, Nearest Neighbors, Principal component analysis, Uniform manifold approximation and projection

Published Open-Access

yes
