Comput Methods Programs Biomed. 2020 Jan 25;190:105360. doi: 10.1016/j.cmpb.2020.105360. [Epub ahead of print]

Computer-aided tumor detection in automated breast ultrasound using a 3-D convolutional neural network.

Author information

1. Department of Radiology, Seoul National University Hospital and Seoul National University College of Medicine, South Korea.
2. Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan.
3. Department of Surgery, National Taiwan University Hospital, Taipei, Taiwan.
4. Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan; Graduate Institute of Network and Multimedia, National Taiwan University, Taipei, Taiwan; Graduate Institute of Biomedical Electronics and Bioinformatics, National Taiwan University, Taipei, Taiwan; MOST Joint Research Center for AI Technology and All Vista Healthcare, Taipei, Taiwan. Electronic address: rfchang@csie.ntu.edu.tw.

Abstract

BACKGROUND AND OBJECTIVES:

Automated breast ultrasound (ABUS) is a widely used screening modality for breast cancer detection and diagnosis. In this study, an effective and fast computer-aided detection (CADe) system based on a 3-D convolutional neural network (CNN) is proposed as a second reader for physicians, in order to decrease reviewing time and the misdetection rate.

METHODS:

Our CADe system combines the sliding window method, a CNN-based determining model, and a candidate aggregation algorithm. First, the sliding window method is applied to split the ABUS volume into volumes of interest (VOIs). Afterward, VOIs are selected as tumor candidates by our determining model. To achieve higher performance, focal loss and ensemble learning are used to address data imbalance and to reduce the false positive (FP) and false negative (FN) rates. Because several selected candidates may belong to the same tumor and overlap one another, a candidate aggregation method is applied to merge the overlapping candidates into the final detection result.
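As context for the class-imbalance step above, a minimal NumPy sketch of the binary focal loss (Lin et al.) is shown below. The `alpha` and `gamma` values are the common defaults, not parameters reported in this paper; the actual system trains a 3-D CNN, which this sketch does not reproduce.

```python
import numpy as np

def focal_loss(p, y, alpha=0.25, gamma=2.0):
    """Binary focal loss: a cross-entropy variant that down-weights
    easy, well-classified examples via the (1 - p_t)**gamma factor,
    so training focuses on hard (often minority-class) candidates.

    p : predicted probability of the positive (tumor) class
    y : ground-truth label (1 = tumor, 0 = background)
    """
    p_t = np.where(y == 1, p, 1.0 - p)          # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)  # class-balancing weight
    return -np.mean(alpha_t * (1.0 - p_t) ** gamma * np.log(p_t))
```

With `gamma=2`, a confidently correct candidate (e.g. `p_t = 0.9`) contributes roughly 100x less loss than under plain cross-entropy, which is what lets the rare tumor VOIs dominate training despite the heavy background imbalance.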

RESULTS:

In the experiments, 165 and 81 cases are used for training the system and evaluating its performance, respectively. On evaluation with the 81 cases, our system achieves sensitivities of 100% (81/81), 95.3% (77/81), and 90.9% (74/81) with FPs per pass (per case) of 21.6 (126.2), 6.0 (34.8), and 4.6 (27.1), respectively. According to these results, the number of FPs per pass (per case) can be reduced by 56.8% (57.1%) at a sensitivity of 95.3% with our tumor detection model.

CONCLUSIONS:

Our CADe system, a 3-D CNN trained with focal loss and ensemble learning, shows the capability to serve as a tumor detection system for ABUS images.

KEYWORDS:

3-D convolutional neural network; Automated breast ultrasound; Breast cancer; Computer-aided detection; Ensemble learning; Focal loss

PMID: 32007838
DOI: 10.1016/j.cmpb.2020.105360

Conflict of interest statement

Declaration of Competing Interest The authors declare that they have no financial and personal relationships with other people or organizations that could inappropriately influence their work.
