Transactions on Affective Computing

IEEE Transactions on Affective Computing (TAC) is a cross-disciplinary, international archive journal aimed at disseminating results of research on the design of systems that can recognize, interpret, and simulate human emotions and related affective phenomena.




From the October-December 2018 issue

Multimodal Depression Detection: Fusion Analysis of Paralinguistic, Head Pose and Eye Gaze Behaviors

By Sharifa Alghowinem, Roland Goecke, Michael Wagner, Julien Epps, Matthew Hyett, Gordon Parker, and Michael Breakspear

An estimated 350 million people worldwide are affected by depression. Using affective sensing technology, our long-term goal is to develop an objective multimodal system that augments clinical opinion during the diagnosis and monitoring of clinical depression. This paper steps towards developing a classification-system-oriented approach, where feature selection, classification, and fusion-based experiments are conducted to infer which types of behaviour (verbal and nonverbal) and behaviour combinations can best discriminate between depression and non-depression. Using statistical features extracted from speaking behaviour, eye activity, and head pose, we characterise the behaviour associated with major depression and examine the classification performance of the individual modalities and of their fusion. Using a real-world, clinically validated dataset of 30 severely depressed patients and 30 healthy control subjects, a Support Vector Machine is used for classification with several feature selection techniques. Given the statistical nature of the extracted features, feature selection based on T-tests performed better than other methods. Individual modality classification results were considerably higher than chance level (83 percent for speech, 73 percent for eye, and 63 percent for head). Fusing all modalities shows a remarkable improvement over the unimodal systems, which demonstrates the complementary nature of the modalities. Among the different fusion approaches used here, feature fusion performed best, with up to 88 percent average accuracy. We believe this is due to the compatible nature of the extracted statistical features.
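The feature-fusion approach the abstract describes — concatenating per-modality statistical features, selecting the most discriminative ones with a two-sample test, and classifying with an SVM — can be sketched as below. This is a minimal illustration using synthetic data in place of the clinical dataset (which is not public); the feature dimensions and scores are assumptions, and scikit-learn's `f_classif` (ANOVA F-test) stands in for the paper's T-test selection, to which it is equivalent for two classes (F = t²).

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 60  # mirroring the paper's cohort: 30 patients + 30 controls

# Stand-in statistical features per modality (dimensions are illustrative)
speech = rng.normal(size=(n, 20))
eye = rng.normal(size=(n, 10))
head = rng.normal(size=(n, 6))
y = np.array([0] * 30 + [1] * 30)  # 0 = control, 1 = depressed

# Feature-level fusion: concatenate modality features before classification
X = np.hstack([speech, eye, head])

# Select the top-k features by F-test (two-class equivalent of a T-test),
# then classify with a linear SVM
clf = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=15),
    SVC(kernel="linear"),
)
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

On the random data above, accuracy hovers around chance, as it should; decision-level fusion would instead train one classifier per modality and combine their outputs.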

View the PDF of this article | View this issue in the digital library


Editorials and Announcements

Announcements

  • According to Clarivate Analytics' 2016 Journal Citation Report, TAC has an impact factor of 3.149.
  • Heartfelt congratulations are offered to Georgios N. Yannakakis and Julian Togelius, authors of "Experience-Driven Procedural Content Generation," who were presented with TAC's Most Influential Paper Award by Editor-in-Chief Björn W. Schuller at the 6th AAAC Conference on Affective Computing and Intelligent Interaction in Xi'an, P.R. China, on 22 September 2015.

Call-for-Papers

Editorials


Guest Editorials


Reviewers List


Author Index


Access recently published TAC articles

Subscribe to the RSS feed of recently published TAC content

Sign up for e-mail notifications through IEEE Xplore Content Alerts

View TAC preprints in the Computer Society Digital Library



View the PDF of TAC's ongoing call-for-papers.

TAC is indexed in ISI.