Prostate cancer is the most common non-skin cancer, affecting 1 in 7 men in the United States. Treating patients with prostate cancer remains a difficult decision-making process that requires physicians to balance clinical benefit, life expectancy, comorbidities, and potential side effects. The Gleason score has been shown to be the best predictor of prostate cancer outcomes. Despite progress toward standardizing the grading process, there remains an approximately 30% discrepancy between the scores rendered by general pathologists and those provided by experts when reviewing needle biopsies for Gleason patterns 3 and 4, which account for more than 70% of the daily prostate tissue slides at most institutions. We therefore present computational imaging methods for prostate gland analysis, which we use to develop a reliable, automated computer-aided Gleason grading system. The project is motivated by the fact that prostate adenocarcinoma is diagnosed clinically by recognizing characteristic histology patterns. Recently, the Gleason grading criteria were updated to allow more accurate stratification and higher prognostic discrimination than the traditional grading system.
In this thesis work, we go beyond Gleason score analysis by introducing survival model assessment to predict patient outcomes. Using whole-slide images (WSIs) generated from tissue from radical prostatectomy surgical specimens, we apply deep learning approaches to discover the most promising computational image biomarkers. The proposed method differs from existing survival analysis studies, which use individual patches or manually designed protocols to select a set of patches. In contrast, we develop an end-to-end methodology that learns from patches analyzed sequentially while preserving their spatial relationships within the WSI. We treat the automatically cropped patches from a WSI as a sequence and use a recurrent neural network to generate a salient, representative computational biomarker for the WSI.
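The sequential-aggregation idea can be sketched as follows. This is a minimal NumPy illustration, not the thesis implementation: the weights are random rather than trained end-to-end, the patch features are synthetic stand-ins for CNN-encoded patches, and the class name `PatchSequenceGRU` is hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class PatchSequenceGRU:
    """Aggregate an ordered sequence of patch feature vectors into a
    single slide-level embedding with one GRU cell. Weights are random
    here; in practice they would be learned end-to-end."""

    def __init__(self, feat_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(feat_dim + hidden_dim)
        # one weight matrix per gate: update (z), reset (r), candidate (h)
        self.Wz = rng.normal(0, scale, (hidden_dim, feat_dim + hidden_dim))
        self.Wr = rng.normal(0, scale, (hidden_dim, feat_dim + hidden_dim))
        self.Wh = rng.normal(0, scale, (hidden_dim, feat_dim + hidden_dim))
        self.hidden_dim = hidden_dim

    def forward(self, patch_feats):
        """patch_feats: (num_patches, feat_dim), ordered by the patches'
        spatial positions in the WSI so that inter-patch relationships
        are carried through the recurrence."""
        h = np.zeros(self.hidden_dim)
        for x in patch_feats:
            xh = np.concatenate([x, h])
            z = sigmoid(self.Wz @ xh)                        # update gate
            r = sigmoid(self.Wr @ xh)                        # reset gate
            h_cand = np.tanh(self.Wh @ np.concatenate([x, r * h]))
            h = (1 - z) * h + z * h_cand
        return h  # slide-level representation ("computational biomarker")

# toy example: 12 patches with 64-dim features from a hypothetical encoder
feats = np.random.default_rng(1).normal(size=(12, 64))
biomarker = PatchSequenceGRU(feat_dim=64, hidden_dim=32).forward(feats)
```

The final hidden state serves as the slide-level representation; a survival head (e.g. a Cox-style risk score) would be attached to it during training.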
Automatic and accurate Gleason grading of histopathology tissue slides is crucial for reliable prostate cancer diagnosis, treatment, and prognosis. Histopathology slides from different institutions often differ in appearance because of variations in tissue preparation and staining procedures, so a predictive model learned in one domain may not transfer directly to a new domain. We propose unsupervised domain adaptation to transfer the discriminative knowledge obtained from the source domain to the target domain without requiring labeled images in the target domain. Adaptation is achieved through adversarial training to find a domain-invariant feature space, together with a proposed Siamese architecture on the target domain that adds a regularization appropriate for whole-slide images. We validate the method on two prostate cancer datasets and obtain significant improvements in Gleason score classification over the baseline models.
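The training objective behind this kind of adaptation can be illustrated with a small sketch. This is a hedged toy, not the thesis code: the function name `adaptation_losses` and all inputs are hypothetical, and only the loss computation is shown (the adversarial coupling to the feature extractor, e.g. via a gradient-reversal layer, is noted in comments rather than implemented).

```python
import numpy as np

def adaptation_losses(src_logits, src_labels,
                      src_domain_logits, tgt_domain_logits,
                      tgt_emb_a, tgt_emb_b):
    """Three terms commonly combined in unsupervised domain adaptation:
      (1) supervised Gleason-grade cross-entropy on labeled source data,
      (2) a domain-discriminator loss; the feature extractor is trained
          adversarially against it to make features domain-invariant,
      (3) a Siamese consistency term on two views of the same unlabeled
          target-domain patch, acting as a target-side regularizer."""
    def softmax(z):
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    # (1) source classification loss
    p = softmax(src_logits)
    cls_loss = -np.mean(np.log(p[np.arange(len(src_labels)), src_labels] + 1e-12))

    # (2) domain loss: discriminator labels source as 0, target as 1
    def bce(logits, y):
        s = 1.0 / (1.0 + np.exp(-logits))
        return -np.mean(y * np.log(s + 1e-12) + (1 - y) * np.log(1 - s + 1e-12))
    dom_loss = 0.5 * (bce(src_domain_logits, 0.0) + bce(tgt_domain_logits, 1.0))

    # (3) Siamese regularizer: embeddings of two augmentations of the
    # same target patch should agree
    siamese_loss = np.mean((tgt_emb_a - tgt_emb_b) ** 2)

    return cls_loss, dom_loss, siamese_loss
```

In a full system the three terms would be weighted and minimized jointly, with the discriminator's gradient reversed when it flows into the shared feature extractor.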
Finally, we explore the use of cluster computing infrastructure to speed up the analysis. The previously reported nuclei detection algorithm is extremely reliable in terms of accuracy, but takes an inordinate amount of time to run on a single machine. We address this challenge and present a parallel nuclei detection algorithm implemented on CometCloud.
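The parallelization rests on the observation that detection on each tile of a WSI is independent of the others. The sketch below shows only that decomposition, with local threads standing in for the cluster nodes that CometCloud would dispatch to; `detect_nuclei_in_tile` is a trivial hypothetical stand-in, not the actual detection algorithm.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def detect_nuclei_in_tile(tile):
    """Stand-in detector (hypothetical): count dark pixels by simple
    thresholding. The real algorithm is far more elaborate; only the
    tile-parallel structure is the point here."""
    return int((tile < 0.2).sum())

def split_into_tiles(image, tile_size):
    """Cut a 2-D grayscale image into non-overlapping tiles."""
    h, w = image.shape
    return [image[r:r + tile_size, c:c + tile_size]
            for r in range(0, h, tile_size)
            for c in range(0, w, tile_size)]

def parallel_nuclei_count(image, tile_size=256, workers=4):
    # Each tile is processed independently, so the work is
    # embarrassingly parallel; a CometCloud deployment would apply the
    # same decomposition across cluster nodes instead of local threads.
    tiles = split_into_tiles(image, tile_size)
    with ThreadPoolExecutor(max_workers=workers) as ex:
        counts = list(ex.map(detect_nuclei_in_tile, tiles))
    return sum(counts)

# synthetic 1024x1024 "slide"; 16 tiles are processed concurrently
img = np.random.default_rng(0).random((1024, 1024))
total = parallel_nuclei_count(img)
```

Because the per-tile results are simply merged at the end, the parallel count matches what a single-machine pass over the whole image would produce, while the wall-clock time scales down with the number of workers.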