Please use this identifier to cite or link to this item:
https://www.um.edu.mt/library/oar/handle/123456789/137832
| Title: | INSIGHT : intelligent navigation system for integrated guidance haptic technologies |
| Authors: | Gaerty, Nick (2025) |
| Keywords: | Haptic devices -- Malta; Vision disorders -- Malta; Computer vision -- Malta; Navigation -- Malta |
| Issue Date: | 2025 |
| Citation: | Gaerty, N. (2025). INSIGHT: intelligent navigation system for integrated guidance haptic technologies (Bachelor's dissertation). |
| Abstract: | This thesis presents the Intelligent Navigation System for Integrated Guidance Haptic Technologies (INSIGHT) project, developed to assist individuals with visual impairments in navigating indoor environments. The system enables object recognition through an embedded computer vision model and delivers navigational feedback using audio and vibration cues. These allow users to form a mental representation of their surroundings without relying on visual input. The application was developed natively for Android using Kotlin and integrates a TensorFlow Lite implementation of the You Only Look Once, Version 8 (YOLOv8) object detection model. A gesture-based interaction system was employed, using tap and hold gestures to switch between two operational modes: 'General Mode', which provides auditory announcements of nearby objects, and 'Specific Mode', which assists in locating a selected object using adaptive vibration intensity. Evaluation was carried out across multiple environments, lighting conditions, and object types. The model achieved up to 93% detection accuracy for commonly encountered items such as chairs, beds, and tables, while maintaining an average feedback latency of under one second. Medium-accuracy classes such as cupboards and bottles yielded detection rates between 40% and 78%, whereas low-accuracy classes such as doors and windows were rarely detected. Informal usability testing with blindfolded participants confirmed that users could operate the application solely through gesture input and multimodal feedback, without relying on visual interfaces. The results indicate that the INSIGHT system offers a viable, screen-free navigation solution for visually impaired users in everyday settings. While limitations remain, particularly in detecting low-contrast or occluded objects, the system demonstrates clear value in promoting independent movement. Future enhancements will focus on improving detection in low-light and cluttered environments, as well as expanding interaction options through voice control. |
| Description: | B.Sc. (Hons) ICT(Melit.) |
| URI: | https://www.um.edu.mt/library/oar/handle/123456789/137832 |
| Appears in Collections: | Dissertations - FacICT - 2025; Dissertations - FacICTAI - 2025 |
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| 2508ICTICT390900016335_1.PDF (Restricted Access) | | 3.75 MB | Adobe PDF |
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.
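The abstract above outlines the system's two main technical elements: on-device object detection with a TensorFlow Lite build of YOLOv8, and haptic feedback whose intensity adapts to the selected object. The Kotlin sketches below are illustrative only and are not taken from the dissertation: they assume the TensorFlow Lite Task Vision library and the standard Android Vibrator API, and the class names (`IndoorObjectDetector`, `ProximityHaptics`), model path, score threshold, and area-to-amplitude mapping are all assumptions introduced here for illustration.

```kotlin
// Illustrative sketch only: assumes a TFLite detection model packaged with
// metadata that the TensorFlow Lite Task Vision library can load. The
// dissertation's actual YOLOv8 integration is not reproduced here.
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.task.vision.detector.Detection
import org.tensorflow.lite.task.vision.detector.ObjectDetector

class IndoorObjectDetector(context: Context, modelPath: String = "model.tflite") {

    private val detector: ObjectDetector = ObjectDetector.createFromFileAndOptions(
        context,
        modelPath,
        ObjectDetector.ObjectDetectorOptions.builder()
            .setMaxResults(5)          // announce only the few most salient objects
            .setScoreThreshold(0.5f)   // drop low-confidence detections
            .build()
    )

    /** Returns the labels of objects detected in a single camera frame. */
    fun detectLabels(frame: Bitmap): List<String> =
        detector.detect(TensorImage.fromBitmap(frame))
            .mapNotNull { d: Detection -> d.categories.firstOrNull()?.label }
}
```

One plausible way to realise the "adaptive vibration intensity" of Specific Mode is to treat the detected bounding box's share of the camera frame as a proximity proxy and scale the pulse amplitude accordingly; this keeps the feedback loop entirely screen-free. The mapping below is an assumption, not the thesis implementation, and it requires the VIBRATE permission and API level 26 or higher.

```kotlin
// Illustrative sketch only: maps how much of the frame a target object occupies
// to vibration strength (larger box assumed to mean the object is closer).
import android.content.Context
import android.graphics.RectF
import android.os.VibrationEffect
import android.os.Vibrator

class ProximityHaptics(context: Context) {

    private val vibrator: Vibrator = context.getSystemService(Vibrator::class.java)

    /** Emits a short pulse whose amplitude grows with the box's area fraction. */
    fun pulseFor(box: RectF, frameWidth: Int, frameHeight: Int) {
        val areaFraction = (box.width() * box.height()) /
            (frameWidth.toFloat() * frameHeight.toFloat())
        // Scale the 0..1 area fraction into the 1..255 amplitude range.
        val amplitude = (areaFraction.coerceIn(0f, 1f) * 254f).toInt() + 1
        vibrator.vibrate(VibrationEffect.createOneShot(100L, amplitude))
    }
}
```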
