<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:dc="http://purl.org/dc/elements/1.1/" version="2.0">
  <channel>
    <title>OAR@UM Collection:</title>
    <link>https://www.um.edu.mt/library/oar/handle/123456789/66054</link>
    <description />
    <pubDate>Sun, 26 Apr 2026 01:25:23 GMT</pubDate>
    <dc:date>2026-04-26T01:25:23Z</dc:date>
    <item>
      <title>Development of image processing algorithms for taxiway line detection and tracking</title>
      <link>https://www.um.edu.mt/library/oar/handle/123456789/66253</link>
      <description>Title: Development of image processing algorithms for taxiway line detection and tracking
Abstract: In the taxiing phase of flight, the pilots are usually busy executing pre-flight checklists &#xD;
and communicating with Air Traffic Control (ATC). This may lead to a loss of &#xD;
situational awareness and can result in a collision with another aircraft, vehicle or other &#xD;
objects present in the vicinity on the aerodrome. Furthermore, pilots might miss an exit &#xD;
or turn if they are not familiar with the airport or if the visibility is not good. This thesis &#xD;
describes a novel computer vision-based method to detect taxiway lines on an &#xD;
aerodrome and to estimate the deviation of a large passenger aircraft from the taxiway &#xD;
centerline. This method processes video footage captured from an onboard camera &#xD;
located on the vertical stabilizer of an A380 aircraft.  &#xD;
 &#xD;
The proposed method can be applied as part of a larger system to increase the situational &#xD;
awareness of pilots during taxiing. It can also alert them if the aircraft deviates from the &#xD;
centerline. The method makes use of colour and edge information present in the camera &#xD;
images. The thesis proposes a Sliding Window (SW) method and clustering techniques &#xD;
to detect and process the taxiway markings.  &#xD;
 &#xD;
In the first step, the input image is transformed to a top-down view by applying the &#xD;
Homographic Transform. Then, colour and edge detection techniques are applied to the &#xD;
top-down view and a binary image of pixels belonging to taxiway lines is generated. In &#xD;
the next step, the region of the image directly in front of the aircraft is processed to &#xD;
determine whether the aircraft is turning or moving in a straight line. If it is determined &#xD;
that the aircraft is in a turn manoeuvre, the Density-Based Spatial Clustering of &#xD;
Applications with Noise (DBSCAN) clustering technique is applied to the binary image &#xD;
obtained in the previous step. Otherwise, a SW method is used to detect the taxiway centerline ahead of the aircraft and any taxiway line intersections present in the image. &#xD;
In the final step, the aircraft’s lateral deviation from the taxiway centerline (the cross-track &#xD;
error) is estimated using a template matching approach.  &#xD;
 &#xD;
The entire algorithm was tested on simulated video sequences of taxi scenarios in &#xD;
different illumination and visibility conditions. The test results obtained during daytime &#xD;
(without fog) showed that the SW method achieved a detection rate of 80% and a false &#xD;
positive rate of 3%. On the other hand, the clustering technique achieved a detection &#xD;
rate of 76% and a false positive rate of 4%. The aircraft’s deviation from the taxiway &#xD;
centerline was estimated with a mean error of 0.8 pixels and a worst case error of ±3 &#xD;
pixels.
Description: M.SC.AEROSPACE</description>
      <pubDate>Wed, 01 Jan 2020 00:00:00 GMT</pubDate>
      <guid isPermaLink="false">https://www.um.edu.mt/library/oar/handle/123456789/66253</guid>
      <dc:date>2020-01-01T00:00:00Z</dc:date>
    </item>
  </channel>
</rss>