Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/25487
Title: Automatic segmentation of indoor and outdoor scenes from visual lifelogging
Authors: Buhagiar, Juan
Keywords: Data logging
Wearable cameras
Neural networks (Computer science)
Computer vision
Issue Date: 2017
Abstract: Visual lifelogging and indoor/outdoor scene classification have each been studied on several occasions. The focus of this research is to identify and implement the most effective methods for automatically classifying images captured from a wearable camera into indoor and outdoor scenes. We incorporated the resulting statistics into a smartphone application in which the user can monitor their activities. To classify images automatically into indoor and outdoor scenes, we used convolutional neural networks pre-trained on ImageNet and Places2 and applied transfer learning to adapt them to the problem at hand. This method proved useful immediately, obtaining accuracies above 90%. We therefore investigated ways to improve classification further by joining different features, using intermediate-layer information from the CNNs, and improving the classifiers used. When tested on the UBRug dataset, our best method, which joins the FC7 intermediate layer's output from both CNNs with a Random Forest classifier, obtained around 98% accuracy. We also compared our results with previous tests on the SUN397 dataset, which our methods outperformed: the best previously reported method obtained an accuracy of 94.2%, while our best method obtained 97.06%. Moreover, we re-trained a pre-trained CNN on the UBRug dataset, introducing the first egocentric indoor/outdoor CNN.
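The fused-feature approach described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis code: random vectors stand in for the 4096-dimensional FC7 activations that would come from the ImageNet- and Places2-trained networks (4096 is the standard FC7 width in AlexNet/VGG-style architectures and is an assumption here), and the labels are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_images, fc7_dim = 200, 4096  # assumed FC7 width; not stated in the abstract

# Placeholder FC7 activations from the two pre-trained CNNs.
# In the actual pipeline these would be extracted by a forward pass
# through the ImageNet- and Places2-trained networks.
feats_imagenet = rng.normal(size=(n_images, fc7_dim))
feats_places2 = rng.normal(size=(n_images, fc7_dim))
labels = rng.integers(0, 2, size=n_images)  # 0 = indoor, 1 = outdoor (synthetic)

# Join the two feature sets by concatenating them per image.
fused = np.concatenate([feats_imagenet, feats_places2], axis=1)

X_train, X_test, y_train, y_test = train_test_split(
    fused, labels, test_size=0.25, random_state=0)

# Random Forest classifier on the fused features, as in the thesis.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

With real FC7 features the thesis reports around 98% accuracy on UBRug; with the random placeholders above the score is naturally near chance, since the sketch only demonstrates the shape of the pipeline.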
Description: B.SC.(HONS)ICT
URI: https://www.um.edu.mt/library/oar//handle/123456789/25487
Appears in Collections:Dissertations - FacICT - 2017
Dissertations - FacICTAI - 2017

Files in This Item:
File: 17BITAI007.pdf
Description: Restricted Access
Size: 15.63 MB
Format: Adobe PDF


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.