Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/93436
Title: Assessing task difficulty in software testing using biometric measures
Authors: Camilleri, Daryl
Micallef, Mark
Porter, Chris
Keywords: Computer software -- Verification
Computer software -- Testing
Software engineering -- Case studies
Human-computer interaction
Machine learning
Issue Date: 2022
Publisher: BCS Learning and Development Ltd
Citation: Camilleri, D., Micallef, M., & Porter, C. (2022). Assessing task difficulty in software testing using biometric measures [in press]. 35th International BCS Human-Computer Interaction Conference (HCI), Keele.
Abstract: In this paper, we set out to investigate the extent to which we can classify task difficulty in the software testing domain using psycho-physiological sensors. After reviewing the literature, we adapt the work of Fritz et al. (2014) and transpose it to the testing domain. We present the results of a study conducted with 15 professional software testers carrying out predefined tasks in a lab setting while we collected eye-tracking, brain (EEG) and skin (EDA) data. On average, each participant took part in a two-hour data collection session. Throughout our study, we captured close to fourteen gigabytes of biometric data, consisting of more than a hundred and twenty million data points.
Using this data, we trained 21 Naïve Bayes classifiers to predict task difficulty from three perspectives (“by participant”, “by task” and “by participant-task”) and using the seven possible combinations of sensors. Our results confirm that we can predict task difficulty for a new tester with a precision of 74.4% and a recall of 72.5% using just an eye-tracker, and for a new task with a precision of 72.2% and a recall of 70.0% using eye-tracking and electrodermal activity. The results achieved are largely consistent with the work of Fritz et al. (2014) among software developers. We go on to provide insights as to what combinations of sensors provide the best results and how this work could be used to develop well-being and workflow support tools in an industry setting.
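Note (illustrative sketch, not from the paper): the abstract describes training Naïve Bayes classifiers on windows of eye-tracking, EEG and EDA features to distinguish easy from difficult tasks. The minimal Python sketch below shows what such a setup could look like with scikit-learn's GaussianNB; the feature names, synthetic data and labels are placeholder assumptions, not the authors' actual pipeline or dataset.

# Minimal sketch: Gaussian Naive Bayes on hypothetical per-window biometric
# features (eye-tracking + EDA), reporting precision and recall.
# All data here is synthetic and for illustration only.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)

# Hypothetical features per time window: mean fixation duration (ms),
# mean pupil diameter (mm), skin-conductance level (microsiemens).
n = 600
X = np.column_stack([
    rng.normal(250, 40, n),   # fixation duration
    rng.normal(3.5, 0.4, n),  # pupil diameter
    rng.normal(2.0, 0.6, n),  # EDA level
])
y = rng.integers(0, 2, n)     # 0 = easy task, 1 = difficult task (placeholder labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = GaussianNB().fit(X_train, y_train)
pred = clf.predict(X_test)
print("precision:", precision_score(y_test, pred))
print("recall:", recall_score(y_test, pred))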
URI: https://www.um.edu.mt/library/oar/handle/123456789/93436
Appears in Collections: Scholarly Works - FacICTCS

Files in This Item:
File: 2022_Assessing Task Difficulty in Software Testing.pdf
Access: Restricted Access
Size: 5.45 MB
Format: Adobe PDF


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.