|Title:||Don't classify ratings of affect; rank them!|
|Authors:||Martinez, Hector P.|
Yannakakis, Georgios N.
Hallam, John
|Keywords:||Computer games -- Design|
|Publisher:||Institute of Electrical and Electronics Engineers Inc.|
|Citation:||Martinez, H. P., Yannakakis, G. N., & Hallam, J. (2014). Don't classify ratings of affect; rank them! IEEE Transactions on Affective Computing, 5(3), 314-326.|
|Abstract:||How should affect be appropriately annotated and how should machine learning best be employed to map manifestations of affect to affect annotations? What is the use of ratings of affect for the study of affective computing and how should we treat them? These are the key questions this paper attempts to address by investigating the impact of dissimilar representations of annotated affect on the efficacy of affect modelling. In particular, we compare several different binary-class and pairwise preference representations for automatically learning from ratings of affect. The representations are compared and tested on three datasets: one synthetic dataset (testing “in vitro”) and two affective datasets (testing “in vivo”). The synthetic dataset couples a number of attributes with generated rating values. The two affective datasets contain physiological and contextual user attributes, and speech attributes, respectively; these attributes are coupled with ratings of various affective and cognitive states. The main results of the paper suggest that ratings (when used) should be naturally transformed to ordinal (ranked) representations for obtaining more reliable and generalisable models of affect. The findings of this paper have a direct impact on affect annotation and modelling research but, most importantly, challenge the traditional state-of-practice in affective computing and psychometrics at large.|
|Appears in Collections:||Scholarly Works - InsDG|
Files in This Item:
|Dont_classify_ratings_of_affect_rank_them.pdf||1.42 MB||Adobe PDF|
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.