Full Resolution Three-Dimensional Reconstruction of Non-Serial Prostate Whole-Mounts: Pilot Validation and Initial Results

D. Schouten, N. Khalili, J. van der Laak and G. Litjens

European Congress on Digital Pathology 2024.

Introduction

Three-dimensional reconstruction in histopathology can be achieved by co-registering consecutive digitized whole-slide images to reconstruct an anatomical volume of interest. Such 3D reconstructions greatly enhance the pathologist's ability to inspect tumour growth patterns and relate them to pre-operative 3D imaging, and pave the way for automated specimen analysis. Although several algorithms have been proposed for this task, they either require additional input (e.g. ex vivo MRI), are limited to low-resolution versions of the slides, or expect serial sectioning rather than the more common sparse sampling of a specimen, which severely limits clinical applicability. In this work, we present a deep-learning-based algorithm that addresses all of these limitations and demonstrate its effectiveness on 20 prostatectomy specimens with 4-9 slides each.

Materials and Methods

We tune and validate our algorithm on cohorts of 30 and 20 prostatectomy specimens, respectively, with routine sparse sampling at a 4 mm inter-slice distance, prepared as whole-mount sections and stained with H&E. The algorithm is powered by a transformer-based model (LightGlue) for feature extraction and matching, and was evaluated using the median target registration error (TRE) between automatically detected landmarks and a normalized Dice coefficient to assess shape congruency.
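The two evaluation metrics can be sketched as follows. This is a minimal illustration, not the authors' implementation: the landmark pairs are assumed to be given (in the paper they come from the feature-matching stage), and the exact normalization applied to the Dice coefficient is not specified in the abstract, so a plain Dice overlap is shown.

```python
import numpy as np

def median_tre(landmarks_fixed, landmarks_registered, mm_per_pixel=1.0):
    """Median target registration error (mm) between corresponding
    landmark pairs on adjacent slides after registration.
    Landmark detection/matching is assumed to be done upstream."""
    dists = np.linalg.norm(landmarks_fixed - landmarks_registered, axis=1)
    return float(np.median(dists) * mm_per_pixel)

def dice(mask_a, mask_b):
    """Dice overlap of two binary tissue masks; 1.0 indicates perfect
    shape congruency of adjacent sections after alignment."""
    inter = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * inter / (mask_a.sum() + mask_b.sum())
```

In practice the masks would be tissue segmentations of adjacent whole-mount sections rendered into a common coordinate frame.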

Results

The proposed algorithm achieved a correct reconstruction in 16/20 (80%) cases, with a mean TRE of 1.53 ± 0.71 mm vs. 7.75 ± 1.83 mm and a normalized Dice of 0.95 ± 0.03 vs. 0.84 ± 0.04, compared to a baseline that aligns slides by their centerpoints.
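The centerpoint baseline referenced above is straightforward: each slide is translated so that the centroid of its tissue mask coincides with a common center, with no rotation or deformation. A minimal sketch (the function name and mask representation are illustrative assumptions, not from the paper):

```python
import numpy as np

def align_by_centerpoint(masks):
    """Baseline alignment: translate each slide so the centroid of its
    binary tissue mask lands on the image center.
    Returns one (dy, dx) shift per slide."""
    h, w = masks[0].shape
    target = np.array([(h - 1) / 2.0, (w - 1) / 2.0])
    shifts = []
    for mask in masks:
        ys, xs = np.nonzero(mask)          # tissue pixel coordinates
        centroid = np.array([ys.mean(), xs.mean()])
        shifts.append(target - centroid)   # translation to apply
    return shifts
```

Because this baseline ignores rotation and tissue deformation, it serves as a lower bound that the learned registration is expected to beat, as reflected in the reported TRE and Dice values.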

Conclusion

We present a novel algorithm for automated full-resolution 3D reconstruction from sparsely sampled prostatectomy specimens.