As human beings, we interact with each other continuously by switching between various modes of verbal and non-verbal communication, such as voice, touch, and body language. Similarly, we interact with our electronic devices – smartphones, tablets, cameras, car infotainment systems, and so on – using various methods. For example, we can zoom in on a map on a tablet by pinching the screen, browse photos on our phone by swiping a finger across the screen, or look up information on the Internet by means of a voice command.
While the use of multimodal interaction – particularly via voice commands (also known as Direct Voice Input) and touch gestures – has become established in the consumer market (and increasingly in sectors such as retail, banking, and healthcare), the integration of these technologies in the flight deck of modern, large, commercial aircraft is still limited. In fact, the pilots of such aircraft still interact with aircraft avionics primarily through physical knobs, buttons, switches and other controls. The delay in introducing new interaction modes such as voice and touch is mainly due to the safety implications of using such technologies in the flight deck. If a smartphone responds incorrectly to a touch gesture or is slow to respond to a voice command, the user may become frustrated, but it is highly unlikely that there will be any serious consequences. The situation is very different in the flight deck environment, where malfunctioning equipment can potentially affect the safety of the aircraft and its passengers. Furthermore, the flight deck presents several challenges arising from operational factors such as noise, turbulence, and varying lighting conditions. Therefore, before new interaction technology can be introduced to the flight deck, it must be demonstrated that it operates accurately, reliably, securely and safely under all flight conditions.
A great deal of research and development has been carried out over the last few decades by aircraft and avionics manufacturers (such as Rockwell Collins, Honeywell, and Thales Avionics) and by research institutions worldwide, to develop concepts for touch screen interaction and voice recognition in the flight deck environment. In fact, commercial touch screen solutions are already available for the business jet market and are expected to become operational in a few years. These solutions are typically based on large multi-function touch screen displays which replace the more traditional cockpit screens. Moreover, Direct Voice Input and touch screen technologies have been in use on military fighter jets – such as the Eurofighter Typhoon and the F-35 Lightning II – for several years. Voice control allows pilots to issue commands to the aircraft without having to release the controls or look down at a particular screen, enabling them to keep their attention on other tasks.
Over the last few years, the University of Malta – through the Institute of Aerospace Technologies – has participated in a number of research projects focussing on the use of touch screen technology in the flight deck of large commercial aircraft. During Touch Flight – a project funded through the MCST FUSION R&I programme – the Institute of Aerospace Technologies developed a novel concept for the interaction between pilots and avionics systems. This concept is based on the use of a single touch screen interface which can be placed directly in front of the pilots and which allows them to monitor and control various avionics systems by means of touch screen gestures instead of using conventional knobs, switches and other control and display devices. The Institute is now developing the concept further in a new research project called Touch Flight 2. In addition to touch screen gestures, this project is also investigating the application of voice control as an alternative interaction mode between pilots and cockpit avionics. Furthermore, another objective of the project is to investigate the possibility of automating certain tasks that are normally conducted by the crew, in order to reduce workload and improve flight safety.
Touch Flight 2 is a €200,000 project which is being funded through the MCST FUSION R&I programme and is a collaboration between the University of Malta and QuAero Ltd, a Maltese aviation consultancy company. The project started in February of this year and will last for 28 months.