
Automated Brain Tumour Segmentation Using Multimodal Brain Scans: A Survey Based on Models Submitted to the BraTS 2012-2018 Challenges


Abstract

Reliable brain tumor segmentation is essential for accurate diagnosis and treatment planning. Because manual segmentation of brain tumors is time-consuming, expensive, and subjective, practical automated methods are highly desirable. However, because brain tumors are highly heterogeneous in location, shape, and size, developing automatic segmentation methods has remained challenging for decades. This paper reviews the evolution of automated models for brain tumor segmentation using multimodal MR images. To enable a fair comparison between methods, the models are studied on the best-known benchmark for brain tumor segmentation, the BraTS challenge [1]. The BraTS 2012-2018 challenges and the state-of-the-art automated models employed each year are analysed, the trend of these automated methods since 2012 is traced, and the main parameters that affect model performance are identified.
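
For context on how submissions to BraTS-style benchmarks are typically scored, the sketch below (not taken from the paper) computes the Dice coefficient, the standard overlap metric used to compare a predicted binary tumour mask against a manually annotated reference mask. The array shapes and mask values are purely illustrative.

import numpy as np

def dice_score(prediction: np.ndarray, reference: np.ndarray) -> float:
    """Dice = 2|P & R| / (|P| + |R|); defined as 1.0 when both masks are empty."""
    prediction = prediction.astype(bool)
    reference = reference.astype(bool)
    total = prediction.sum() + reference.sum()
    if total == 0:
        return 1.0
    return 2.0 * np.logical_and(prediction, reference).sum() / total

# Illustrative 3D masks (hypothetical sizes, not real BraTS data).
pred = np.zeros((128, 128, 128), dtype=bool)
ref = np.zeros((128, 128, 128), dtype=bool)
pred[40:80, 40:80, 40:80] = True   # predicted tumour region
ref[45:85, 45:85, 45:85] = True    # reference (manual) annotation
print(f"Dice overlap: {dice_score(pred, ref):.3f}")

In BraTS evaluations this kind of overlap score is reported separately for the different tumour sub-regions, which is why a single model can rank differently depending on the region considered.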


Author and article information

Journal
IEEE Reviews in Biomedical Engineering (IEEE Rev. Biomed. Eng.)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
ISSN (print): 1937-3333
ISSN (electronic): 1941-1189
Publication year: 2019
Pages: 1

Article
DOI: 10.1109/RBME.2019.2946868
PMID: 31613783
ID: e91561a6-74c5-4221-b36e-6d857a50cd80
Copyright: © 2019
