Opportunistic and Context-Aware Affect Sensing on Smartphones

Rajib Rana, Margee Hume, John Reilly, Raja Jurdak, Jeffrey Soar

Research output: Contribution to journal › Article

10 Citations (Scopus)

Abstract

Opportunistic affect sensing offers unprecedented potential for capturing spontaneous affect, eliminating the biases inherent in controlled settings. Facial expressions and voice are two major affective displays, but most affect-sensing systems on smartphones avoid them because of their high power requirements. Encouragingly, with the recent advent of low-power DSP coprocessors and GPUs, audio and video sensing are becoming more feasible on smartphones. To make use of opportunistically captured facial expressions and voice, gathering contextual information about the dynamic audiovisual stimuli is also important. This article discusses recent advances in affect sensing on smartphones and identifies the key barriers and potential solutions to implementing opportunistic, context-aware affect sensing on smartphone platforms. Beyond the technical challenges (privacy, battery life, and robust algorithms), the authors also consider the challenges of recruiting and retaining mental health patients. Experimentation with mental health patients is difficult but crucial for demonstrating the importance and effectiveness of smartphone-centered affect sensing technology.

Original language: English
Article number: 7445775
Pages (from-to): 60-69
Number of pages: 10
Journal: IEEE Pervasive Computing
Volume: 15
Issue number: 2
Publication status: Published - 1 Apr 2016
Externally published: Yes

Keywords

  • DSP coprocessor
  • GPU
  • healthcare
  • mental health
  • mobile
  • opportunistic affect sensing
  • pervasive computing
  • smartphone