Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/29162
Title: A deep learning approach for detecting and correcting highlights in endoscopic images
Authors: Rodriguez-Sanchez, Antonio
Chea, Daly
Azzopardi, George
Stabinger, Sebastian
Keywords: Image processing
Imaging systems -- Algorithms
Imaging systems in medicine
Issue Date: 2017
Publisher: Institute of Electrical and Electronics Engineers
Citation: Rodriguez-Sanchez, A., Chea, D., Azzopardi, G., & Stabinger, S. (2017). A deep learning approach for detecting and correcting highlights in endoscopic images. Seventh International Conference on Image Processing Theory, Tools and Applications (IPTA 2017), IEEE, Montreal, 1-6.
Abstract: The image of an object changes dramatically depending on the lighting conditions surrounding it. Shadows, reflections and highlights can make the object very difficult for an automatic system to recognize. Moreover, images used in medical applications, such as endoscopic images and videos, contain a large number of such reflective components, which makes these images and videos harder for experts to analyse. It is therefore useful to detect, and possibly correct, the locations where such highlights occur. In this work we designed a Convolutional Neural Network for that task. We trained the network on a dataset containing ground-truth highlights, showing that these reflective elements can be learnt and thus located and extracted. We then used the trained network to localize and correct the highlights in endoscopic images from the El Salvador Atlas Gastrointestinal videos, obtaining promising results.
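
The abstract outlines a two-stage pipeline: a convolutional network trained on ground-truth highlight masks to localize specular regions, followed by a correction step over the detected pixels. The Python sketch below is a minimal illustration of that pipeline, not the authors' actual design: the tiny fully-convolutional architecture, the binary cross-entropy objective, and the OpenCV Telea inpainting used as the correction step are all assumptions introduced here for clarity.

import cv2
import numpy as np
import torch
import torch.nn as nn

class HighlightNet(nn.Module):
    # Tiny fully-convolutional detector: RGB frame -> one highlight logit per pixel.
    # Layer sizes are illustrative, not taken from the paper.
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.layers(x)

def train_step(model, optimizer, frames, masks):
    # One supervised step against ground-truth highlight masks
    # (frames: Nx3xHxW floats in [0,1]; masks: Nx1xHxW in {0,1}).
    loss = nn.functional.binary_cross_entropy_with_logits(model(frames), masks)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def detect_and_correct(model, frame_bgr, threshold=0.5):
    # Localize highlights with the trained net, then repair the masked
    # pixels with Telea inpainting (a stand-in for the paper's correction).
    x = torch.from_numpy(frame_bgr[:, :, ::-1].copy()).float()
    x = x.permute(2, 0, 1).unsqueeze(0) / 255.0
    with torch.no_grad():
        prob = torch.sigmoid(model(x))[0, 0]
    mask = (prob.numpy() > threshold).astype(np.uint8) * 255
    return cv2.inpaint(frame_bgr, mask, 3, cv2.INPAINT_TELEA)

A network of this kind would be trained on the ground-truth highlight dataset mentioned in the abstract and then applied frame by frame to the El Salvador Atlas Gastrointestinal videos.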
URI: https://www.um.edu.mt/library/oar/handle/123456789/29162
Appears in Collections: Scholarly Works - FacICTAI

Files in This Item:
File: A_deep_learning_approach_for_detecting_and_correcting_highlights_in_endoscopic_images_2017.pdf (Restricted Access)
Size: 972.35 kB
Format: Adobe PDF


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.