S069 - Learning Patient Rotation Using Synthetic X-ray Images from 3D CT Volumes
Wai Yan Ryana Fok, Andreas Fieselmann, Magdalena Herbst, Dominik Eckert, Marcel Beister, Steffen Kappler, Sylvia Saalfeld
Curation of large-scale annotated clinical data for training can be challenging due to scarcity or ethical issues. As an alternative, synthetically generated data can be used to train networks to recognise basic features. In this work, we propose a novel training scheme using synthetic chest X-rays generated from 3D photon-counting CT volumes to quantify the internal patient rotation $\alpha$. This can automatically inform the technician whether and how re-exposure is needed, without the need for extensive image analysis. X-ray images were forward projected with a step size of 2$\degree$ rotation along the patient axis. A modified DenseNet-121 was trained on 1167 images and labels to predict $\alpha$. Results on 252 test images showed good correlation between true and predicted $\alpha$, with $R^2 = 0.992$ and a 95% confidence level of approximately $\pm 2\degree$.
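The abstract does not detail the forward projector or the exact network modification, so the following is only a minimal sketch of the described setup: a crude parallel-beam projection of a CT volume rotated about the patient axis, and a DenseNet-121 backbone with its classifier replaced by a single regression output for $\alpha$. All function names, the single-channel input change, and the projection geometry here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.ndimage import rotate
from torchvision.models import densenet121


def synthetic_xray(ct_volume: np.ndarray, alpha_deg: float) -> np.ndarray:
    """Crude parallel-beam projection of a CT volume rotated by alpha_deg
    about the patient (longitudinal) axis. The paper uses a real forward
    projector; summing along one in-plane axis is only an approximation."""
    rotated = rotate(ct_volume, alpha_deg, axes=(1, 2), reshape=False, order=1)
    projection = rotated.sum(axis=1)            # integrate along one in-plane axis
    projection -= projection.min()              # normalise to [0, 1] for the network
    return projection / (projection.max() + 1e-8)


class RotationRegressor(nn.Module):
    """DenseNet-121 with a single-output regression head for alpha (degrees).
    The exact modification used in the paper is not specified in the abstract;
    this is one plausible variant."""

    def __init__(self):
        super().__init__()
        self.backbone = densenet121(weights=None)
        # Accept single-channel (grayscale) X-ray images instead of RGB.
        self.backbone.features.conv0 = nn.Conv2d(
            1, 64, kernel_size=7, stride=2, padding=3, bias=False)
        # Replace the 1000-class classifier with one regression output.
        self.backbone.classifier = nn.Linear(
            self.backbone.classifier.in_features, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x).squeeze(-1)


if __name__ == "__main__":
    model = RotationRegressor()
    xray = synthetic_xray(np.random.rand(64, 128, 128), alpha_deg=4.0)
    batch = torch.from_numpy(xray).float()[None, None]    # shape (1, 1, H, W)
    print(model(batch).shape)                              # torch.Size([1])
```

Training such a regressor against the known rotation labels (e.g. with an L1 or MSE loss) would follow the standard supervised pattern; the 2$\degree$ step size of the synthetic projections sets the label granularity.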
Short paper
Schedule: Wednesday, July 12: Virtual poster session - 8:00–9:00
Poster location: Virtual only