Title: Exploring the U-Net++ model for automatic brain tumor segmentation
Authors: Micallef, Neil
Seychell, Dylan
Bajada, Claude J.
Keywords: Neurology
Brain -- Tumors
Image segmentation
Optical data processing
Artificial intelligence
Issue Date: 2021
Publisher: IEEE
Citation: Micallef, N., Seychell, D., & Bajada, C. J. (2021). Exploring the U-Net++ model for automatic brain tumor segmentation. IEEE Access. DOI:10.1109/ACCESS.2021.3111131
Abstract: The accessibility and potential of deep learning techniques have increased considerably in recent years. Image segmentation is one of the many fields in which novel deep learning implementations have been developed. U-Net is a popular deep learning model designed specifically for biomedical image segmentation, initially proposed for cell segmentation. We propose a variation of the U-Net++ model, which is itself an adaptation of U-Net, and evaluate its brain tumor segmentation capabilities. The proposed approach obtained Dice Coefficient scores of 0.7192, 0.8712, and 0.7817 for the Enhancing Tumor, Whole Tumor, and Tumor Core classes of the BraTS 2019 challenge Validation Dataset. The proposed approach differs from the standard U-Net++ model in several ways, including the loss function, the number of convolutional blocks, and the method of employing deep supervision. Data augmentation and post-processing techniques were also implemented and observed to substantially improve the model predictions. Thus, this article presents a novel adaptation of the U-Net++ architecture that is both lightweight and performs comparably with peer-reviewed work evaluated on the same data.
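The Dice Coefficient reported in the abstract measures the overlap between a predicted segmentation mask and the ground-truth mask. A minimal NumPy sketch of this metric is shown below; the function name and the smoothing term `eps` are illustrative choices, not taken from the paper:

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice overlap between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    intersection = np.logical_and(pred, target).sum()
    # eps avoids division by zero when both masks are empty
    return 2.0 * intersection / (pred.sum() + target.sum() + eps)

# Example: two pixels predicted, one of them correct
pred = np.array([[1, 1], [0, 0]])
target = np.array([[1, 0], [0, 0]])
score = dice_coefficient(pred, target)  # 2*1 / (2+1) ≈ 0.667
```

A score of 1.0 indicates a perfect match, so the reported Whole Tumor score of 0.8712 corresponds to strong overlap with the reference annotation.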
Appears in Collections:Scholarly Works - FacICTAI
Scholarly Works - FacM&SPB

Files in This Item:
File: Exploring_the_U-Net_model_for_Automatic_Brain_Tumor_Segmentation.pdf
Size: 6.81 MB
Format: Adobe PDF

Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.