
      Reducing the number of unnecessary biopsies for mammographic BI-RADS 4 lesions through a deep transfer learning method

      research-article


          Abstract

          Background

In clinical practice, reducing unnecessary biopsies for mammographic BI-RADS 4 lesions is crucial. The objective of this study was to explore the potential value of deep transfer learning (DTL) with different fine-tuning strategies for Inception V3 in reducing the number of unnecessary biopsies that residents need to perform for mammographic BI-RADS 4 lesions.

          Methods

A total of 1980 patients with breast lesions were included: 1473 benign lesions (including 185 women with bilateral breast lesions) and 692 malignant lesions, all collected and confirmed by clinical pathology or biopsy. The breast mammography images were randomly divided into three subsets, a training set, a testing set, and validation set 1, at a ratio of 8:1:1. We constructed a DTL model for the classification of breast lesions based on Inception V3 and attempted to improve its performance with 11 fine-tuning strategies. The mammography images from 362 patients with pathologically confirmed BI-RADS 4 breast lesions were employed as validation set 2. Two images from each lesion were tested, and a trial was categorized as correct if the judgement was correct for at least one of the two images. We used precision (Pr), recall rate (Rc), F1 score (F1), and the area under the receiver operating characteristic curve (AUROC) as the performance metrics of the DTL model on validation set 2.
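The 8:1:1 split and the lesion-level decision rule can be sketched in plain Python; the function names, the fixed seed, and the label encoding are illustrative assumptions, not taken from the paper:

```python
import random

def split_8_1_1(items, seed=0):
    """Randomly partition items into train / test / validation-1 at an 8:1:1 ratio."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_train, n_test = int(n * 0.8), int(n * 0.1)
    return (items[:n_train],
            items[n_train:n_train + n_test],
            items[n_train + n_test:])

def lesion_correct(pred_img1, pred_img2, truth):
    """A lesion-level trial counts as correct if at least one of its two
    image-level judgements matches the pathology label (the rule used for
    validation set 2)."""
    return pred_img1 == truth or pred_img2 == truth

def precision_recall_f1(tp, fp, fn):
    """Pr, Rc, and F1 from true-positive, false-positive, false-negative counts."""
    pr = tp / (tp + fp)
    rc = tp / (tp + fn)
    f1 = 2 * pr * rc / (pr + rc)
    return pr, rc, f1
```

For example, a model that makes 90 true-positive, 10 false-positive, and 10 false-negative lesion-level calls scores Pr = Rc = F1 = 0.90, matching the order of magnitude reported for the S5 model below.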

          Results

The S5 model achieved the best fit for the data. The Pr, Rc, F1, and AUROC of S5 were 0.90, 0.90, 0.90, and 0.86, respectively, for Category 4. The proportions of lesions downgraded by S5 were 90.73%, 84.76%, and 80.19% for categories 4A, 4B, and 4C, respectively. The overall proportion of BI-RADS 4 lesions downgraded by S5 was 85.91%. There was no significant difference between the classification results of the S5 model and the pathological diagnosis (P = 0.110).

          Conclusion

          The S5 model we proposed here can be used as an effective approach for reducing the number of unnecessary biopsies that residents need to conduct for mammographic BI-RADS 4 lesions and may have other important clinical uses.

          Supplementary Information

          The online version contains supplementary material available at 10.1186/s12880-023-01023-4.


Most cited references (27)


          Global cancer statistics 2020: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries

This article provides an update on the global cancer burden using the GLOBOCAN 2020 estimates of cancer incidence and mortality produced by the International Agency for Research on Cancer. Worldwide, an estimated 19.3 million new cancer cases (18.1 million excluding nonmelanoma skin cancer) and almost 10.0 million cancer deaths (9.9 million excluding nonmelanoma skin cancer) occurred in 2020. Female breast cancer has surpassed lung cancer as the most commonly diagnosed cancer, with an estimated 2.3 million new cases (11.7%), followed by lung (11.4%), colorectal (10.0%), prostate (7.3%), and stomach (5.6%) cancers. Lung cancer remained the leading cause of cancer death, with an estimated 1.8 million deaths (18%), followed by colorectal (9.4%), liver (8.3%), stomach (7.7%), and female breast (6.9%) cancers. Overall incidence was from 2-fold to 3-fold higher in transitioned versus transitioning countries for both sexes, whereas mortality varied <2-fold for men and little for women. Death rates for female breast and cervical cancers, however, were considerably higher in transitioning versus transitioned countries (15.0 vs 12.8 per 100,000 and 12.4 vs 5.2 per 100,000, respectively). The global cancer burden is expected to be 28.4 million cases in 2040, a 47% rise from 2020, with a larger increase in transitioning (64% to 95%) versus transitioned (32% to 56%) countries due to demographic changes, although this may be further exacerbated by increasing risk factors associated with globalization and a growing economy. Efforts to build a sustainable infrastructure for the dissemination of cancer prevention measures and provision of cancer care in transitioning countries are critical for global cancer control.
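The projected 47% rise quoted above follows directly from the 2020 and 2040 case counts; a quick arithmetic check in plain Python:

```python
# Case counts quoted in the abstract above (millions of new cancer cases).
cases_2020 = 19.3
cases_2040 = 28.4

rise_pct = (cases_2040 - cases_2020) / cases_2020 * 100
print(f"projected rise 2020 -> 2040: {rise_pct:.0f}%")  # prints "projected rise 2020 -> 2040: 47%"
```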

            Lymph Node Metastasis Prediction from Primary Breast Cancer US Images Using Deep Learning.

Background

Deep learning (DL) algorithms are gaining extensive attention for their excellent performance in image recognition tasks. DL models can automatically make a quantitative assessment of complex medical image characteristics and achieve increased accuracy in diagnosis with higher efficiency.

Purpose

To determine the feasibility of using a DL approach to predict clinically negative axillary lymph node metastasis from US images in patients with primary breast cancer.

Materials and Methods

A data set of US images in patients with primary breast cancer with clinically negative axillary lymph nodes from Tongji Hospital (974 imaging studies from 2016 to 2018, 756 patients) and an independent test set from Hubei Cancer Hospital (81 imaging studies from 2018 to 2019, 78 patients) were collected. Axillary lymph node status was confirmed with pathologic examination. Three different convolutional neural networks (CNNs) of Inception V3, Inception-ResNet V2, and ResNet-101 architectures were trained on 90% of the Tongji Hospital data set and tested on the remaining 10%, as well as on the independent test set. The performance of the models was compared with that of five radiologists. The models' performance was analyzed in terms of accuracy, sensitivity, specificity, receiver operating characteristic curves, areas under the receiver operating characteristic curve (AUCs), and heat maps.

Results

The best-performing CNN model, Inception V3, achieved an AUC of 0.89 (95% confidence interval [CI]: 0.83, 0.95) in the prediction of the final clinical diagnosis of axillary lymph node metastasis in the independent test set. The model achieved 85% sensitivity (35 of 41 images; 95% CI: 70%, 94%) and 73% specificity (29 of 40 images; 95% CI: 56%, 85%), and the radiologists achieved 73% sensitivity (30 of 41 images; 95% CI: 57%, 85%; P = .17) and 63% specificity (25 of 40 images; 95% CI: 46%, 77%; P = .34).

Conclusion

Using US images from patients with primary breast cancer, deep learning models can effectively predict clinically negative axillary lymph node metastasis. Artificial intelligence may provide an early diagnostic strategy for lymph node metastasis in patients with breast cancer with clinically negative lymph nodes. Published under a CC BY 4.0 license. Online supplemental material is available for this article. See also the editorial by Bae in this issue.
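The image-level sensitivity and specificity quoted above are plain ratios of the reported counts (the 73% specificity figure rounds 72.5% up); restated as a quick check:

```python
# Image-level counts reported for the independent test set above.
tp, n_pos = 35, 41  # metastases correctly flagged / total metastatic images
tn, n_neg = 29, 40  # correctly cleared / total non-metastatic images

sensitivity = tp / n_pos  # ~0.854, reported as 85%
specificity = tn / n_neg  # 0.725, reported as 73%
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```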

              Breast Cancer Classification from Histopathological Images with Inception Recurrent Residual Convolutional Neural Network

              The Deep Convolutional Neural Network (DCNN) is one of the most powerful and successful deep learning approaches. DCNNs have already provided superior performance in different modalities of medical imaging including breast cancer classification, segmentation, and detection. Breast cancer is one of the most common and dangerous cancers impacting women worldwide. In this paper, we have proposed a method for breast cancer classification with the Inception Recurrent Residual Convolutional Neural Network (IRRCNN) model. The IRRCNN is a powerful DCNN model that combines the strength of the Inception Network (Inception-v4), the Residual Network (ResNet), and the Recurrent Convolutional Neural Network (RCNN). The IRRCNN shows superior performance against equivalent Inception Networks, Residual Networks, and RCNNs for object recognition tasks. In this paper, the IRRCNN approach is applied for breast cancer classification on two publicly available datasets including BreakHis and Breast Cancer (BC) classification challenge 2015. The experimental results are compared against the existing machine learning and deep learning–based approaches with respect to image-based, patch-based, image-level, and patient-level classification. The IRRCNN model provides superior classification performance in terms of sensitivity, area under the curve (AUC), the ROC curve, and global accuracy compared to existing approaches for both datasets.

                Author and article information

                Contributors
                wlong_612@126.com
                shendong19741109@163.com
                Journal
BMC Med Imaging
                BMC Medical Imaging
                BioMed Central (London )
                1471-2342
Published: 13 June 2023
Volume: 23
Article number: 82
                Affiliations
[1] GRID grid.89957.3a, ISNI 0000 0000 9255 8984; Department of Radiology, The Affiliated Changzhou No 2 People’s Hospital of Nanjing Medical University, Changzhou 213164, Jiangsu Province, P. R. China
[2] GRID grid.452666.5, ISNI 0000 0004 1762 8363; Department of Radiology, The Second Affiliated Hospital of Soochow University, Suzhou 215004, Jiangsu Province, P. R. China
Article
Article ID: 1023
DOI: 10.1186/s12880-023-01023-4
PMCID: PMC10265786
PMID: 37312026
                © The Author(s) 2023

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

History
Received: 7 July 2022
Accepted: 23 May 2023
                Funding
Funded by: the Program of Bureau of Science and Technology Foundation of Changzhou
Award ID: No. CJ20220260
Funded by: Suzhou Youth Science and Technology Project
Award ID: KJXW2020021
                Categories
                Research
                Custom metadata
                © BioMed Central Ltd., part of Springer Nature 2023

Radiology & Imaging
residents, deep transfer learning, fine-tuning, mammography, breast lesions
