IEEE Trans Pattern Anal Mach Intell. 2013 Aug;35(8):1930-43. doi: 10.1109/TPAMI.2012.277.

Stacked autoencoders for unsupervised feature learning and multiple organ detection in a pilot study using 4D patient data.

Author information

1
Institute of Cancer Research, Royal Marsden NHS Foundation Trust, Sutton, United Kingdom. hoo.shin@icr.ac.uk

Abstract

Medical image analysis remains a challenging application area for artificial intelligence. When applying machine learning, obtaining ground-truth labels for supervised learning is more difficult than in many other common applications of machine learning. This is especially so for datasets with abnormalities, as tissue types and the shapes of the organs in these datasets differ widely. However, organ detection in such an abnormal dataset may have many promising real-world applications, such as automatic diagnosis, automated radiotherapy planning, and medical image retrieval, where new multimodal medical images provide more information about the imaged tissues for diagnosis. Here, we test the application of deep learning methods to organ identification in magnetic resonance medical images, with visual and temporal hierarchical features learned to categorize object classes from an unlabeled multimodal DCE-MRI dataset, so that only weakly supervised training is required for a classifier. A probabilistic patch-based method was employed for multiple organ detection, using the features learned from the deep learning model. This shows the potential of the deep learning model for application to medical images, despite the difficulty of obtaining libraries of correctly labeled training datasets and despite the intrinsic abnormalities present in patient datasets.
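The abstract's core technique, a stacked autoencoder trained greedily layer by layer on unlabeled data, can be illustrated with a minimal sketch. This is not the paper's implementation: the layer sizes, learning rate, and tied-weight design below are illustrative assumptions, and the random data stands in for image patches.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_autoencoder(X, n_hidden, lr=0.1, epochs=200, seed=0):
    """Train one tied-weight autoencoder layer by plain gradient descent
    on the squared reconstruction error (an illustrative choice)."""
    rng = np.random.default_rng(seed)
    n_visible = X.shape[1]
    W = rng.normal(0.0, 0.1, (n_visible, n_hidden))
    b_h = np.zeros(n_hidden)   # hidden (encoder) bias
    b_v = np.zeros(n_visible)  # visible (decoder) bias
    for _ in range(epochs):
        H = sigmoid(X @ W + b_h)      # encode
        R = sigmoid(H @ W.T + b_v)    # decode with tied weights
        err = R - X                   # reconstruction error
        dR = err * R * (1 - R)        # gradient at decoder pre-activation
        dH = (dR @ W) * H * (1 - H)   # backprop to encoder pre-activation
        gW = X.T @ dH + dR.T @ H      # shared weight gets both paths
        W -= lr * gW / X.shape[0]
        b_h -= lr * dH.mean(axis=0)
        b_v -= lr * dR.mean(axis=0)
    return W, b_h

def stacked_encode(X, layers):
    """Greedy layer-wise stacking: each autoencoder is trained on the
    hidden representation of the previous one, with no labels needed."""
    H, params = X, []
    for n_hidden in layers:
        W, b = train_autoencoder(H, n_hidden)
        params.append((W, b))
        H = sigmoid(H @ W + b)  # feed encoded features to the next layer
    return H, params

# Unlabeled "patches": 50 samples of 16 features each (synthetic stand-in).
rng = np.random.default_rng(1)
X = rng.random((50, 16))
features, params = stacked_encode(X, layers=[8, 4])
```

The resulting `features` are the unsupervised representation that a downstream weakly supervised classifier (as in the abstract's patch-based detection step) would consume.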

PMID:
23787345
DOI:
10.1109/TPAMI.2012.277
[Indexed for MEDLINE]
