CheXmask-U: Quantifying uncertainty in landmark-based anatomical segmentation for X-ray images
Abstract
Uncertainty estimation in landmark-based segmentation of chest X-rays using hybrid neural network architectures improves reliability and robustness in clinical deployment.
Uncertainty estimation is essential for the safe clinical deployment of medical image segmentation systems, enabling the identification of unreliable predictions and supporting human oversight. While prior work has largely focused on pixel-level uncertainty, landmark-based segmentation offers inherent topological guarantees yet remains underexplored from an uncertainty perspective. In this work, we study uncertainty estimation for anatomical landmark-based segmentation on chest X-rays. Inspired by hybrid neural network architectures that combine standard convolutional image encoders with graph-based generative decoders, and leveraging their variational latent space, we derive two complementary measures: (i) latent uncertainty, captured directly from the learned distribution parameters, and (ii) predictive uncertainty, obtained by generating multiple stochastic output predictions from latent samples. Through controlled corruption experiments, we show that both uncertainty measures increase with perturbation severity, reflecting both global and local degradation. We demonstrate that these uncertainty signals can identify unreliable predictions by comparison with manual ground truth, and that they support out-of-distribution detection on the CheXmask dataset. More importantly, we release CheXmask-U (huggingface.co/datasets/mcosarinsky/CheXmask-U), a large-scale dataset of 657,566 chest X-ray landmark segmentations with per-node uncertainty estimates, enabling researchers to account for spatial variations in segmentation quality when using these anatomical masks. Our findings establish uncertainty estimation as a promising direction for enhancing the robustness and safe deployment of landmark-based anatomical segmentation methods in chest X-rays. A fully working interactive demo of the method is available at huggingface.co/spaces/matiasky/CheXmask-U and the source code at github.com/mcosarinsky/CheXmask-U.
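To make the two measures concrete, the sketch below shows one way they could be computed from a variational latent space: latent uncertainty read from the encoder's predicted standard deviation, and predictive uncertainty as the per-node spread of landmarks decoded from multiple latent samples. This is a minimal illustration, not the authors' implementation; the `decoder` interface and tensor shapes are assumptions.

```python
import torch


def latent_uncertainty(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    """Latent uncertainty taken directly from the learned distribution parameters.

    mu, logvar: (B, D) mean and log-variance predicted by the convolutional encoder.
    Returns one scalar score per image (average latent standard deviation).
    """
    return logvar.mul(0.5).exp().mean(dim=1)


@torch.no_grad()
def predictive_uncertainty(decoder, mu: torch.Tensor, logvar: torch.Tensor,
                           n_samples: int = 20) -> torch.Tensor:
    """Predictive uncertainty from multiple stochastic output predictions.

    decoder: hypothetical graph-based decoder mapping a latent sample (B, D)
             to landmark coordinates (B, N, 2).
    Returns per-node uncertainty (B, N): the standard deviation of the decoded
    landmark positions across samples, averaged over the x/y coordinates.
    """
    std = logvar.mul(0.5).exp()
    samples = [decoder(mu + std * torch.randn_like(std)) for _ in range(n_samples)]
    stacked = torch.stack(samples, dim=0)   # (S, B, N, 2)
    return stacked.std(dim=0).mean(dim=-1)  # (B, N)
```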
Community

We present CheXmask-U, a framework for quantifying uncertainty in landmark-based anatomical segmentation models on chest X-rays, and we release the accompanying CheXmask-U dataset, which provides per-node uncertainty estimates to support research in robust and safe medical imaging.
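For readers who want to work with the released data, a minimal loading sketch is shown below, assuming the repository can be read with the `datasets` library; the splits and column names are not confirmed here and should be checked against the dataset card.

```python
from datasets import load_dataset

# Repo id as given in the paper; splits and columns are assumptions to verify on the dataset card.
ds = load_dataset("mcosarinsky/CheXmask-U")
print(ds)  # inspect available splits

first_split = next(iter(ds.values()))
print(first_split.column_names)  # e.g. landmark coordinates and per-node uncertainty fields
print(first_split[0])            # one record: a landmark segmentation with its uncertainty estimates
```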
This is an automated message from the Librarian Bot. I found the following papers similar to this paper.
The following papers were recommended by the Semantic Scholar API.
- LatentFM: A Latent Flow Matching Approach for Generative Medical Image Segmentation (2025)
- Diffusion Model in Latent Space for Medical Image Segmentation Task (2025)
- Automated C-Arm Positioning via Conformal Landmark Localization (2025)
- Uncertainty-Aware Extreme Point Tracing for Weakly Supervised Ultrasound Image Segmentation (2025)
- Bridging spatial awareness and global context in medical image segmentation (2025)
- An Empirical Study on MC Dropout-Based Uncertainty-Error Correlation in 2D Brain Tumor Segmentation (2025)