Multimodal sensing, recognizing and browsing group social dynamics

Zhiwen Yu, Zhiyong Yu, Xingshe Zhou, Yuichi Nakamura

Research output: Contribution to journal › Article › peer-review


Abstract

Group social dynamics are crucial for determining whether a meeting was well organized and its conclusions well reasoned. In this paper, we propose multimodal approaches for sensing, recognizing, and browsing social dynamics, specifically human semantic interactions and group interests, in small group meetings. Unlike physical interactions (e.g., turn-taking and addressing), the human interactions considered here incorporate semantics, i.e., the user's intention or attitude toward a topic. Group interests are defined as episodes in which participants engage in an emphatic and heated discussion. We adopt multiple sensors, such as video cameras, microphones, and motion sensors, for meeting capture. Multimodal methods based on a variety of features are proposed for recognizing human interactions and group interests. A graphical user interface, the MMBrowser, is presented for browsing group social dynamics. Experimental results demonstrate the feasibility of the proposed approaches.
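The abstract does not detail the recognition models, so the following is only a minimal illustrative sketch, not the authors' method: it assumes hypothetical per-modality feature vectors (audio, video, motion), fuses them by concatenation, and labels an interaction with a toy nearest-centroid classifier. All class labels, feature dimensions, and function names below are invented for demonstration.

```python
# Illustrative sketch only: assumes hypothetical per-modality features and a
# toy nearest-centroid recognizer; not the method described in the paper.
import numpy as np

INTERACTION_TYPES = ["propose", "comment", "acknowledge", "ask_opinion"]  # hypothetical labels

def fuse_features(audio_feat, video_feat, motion_feat):
    """Concatenate per-modality feature vectors into one multimodal vector."""
    return np.concatenate([audio_feat, video_feat, motion_feat])

class NearestCentroidRecognizer:
    """Assigns the interaction type whose training centroid is closest
    (Euclidean distance) to the fused feature vector."""
    def __init__(self):
        self.centroids = {}

    def fit(self, fused_vectors, labels):
        for label in set(labels):
            rows = [v for v, y in zip(fused_vectors, labels) if y == label]
            self.centroids[label] = np.mean(rows, axis=0)

    def predict(self, fused_vector):
        return min(self.centroids,
                   key=lambda label: np.linalg.norm(fused_vector - self.centroids[label]))

# Usage with random stand-in data (10-dim audio, 20-dim video, 5-dim motion features):
rng = np.random.default_rng(0)
train_X = [fuse_features(rng.normal(size=10), rng.normal(size=20), rng.normal(size=5))
           for _ in range(40)]
train_y = [INTERACTION_TYPES[i % 4] for i in range(40)]
model = NearestCentroidRecognizer()
model.fit(train_X, train_y)
print(model.predict(train_X[0]))
```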

Original language: English
Pages (from-to): 695-702
Number of pages: 8
Journal: Personal and Ubiquitous Computing
Volume: 14
Issue number: 8
DOIs
State: Published - Dec 2010

Keywords

  • Group interest
  • Group social dynamics
  • Human interaction
  • Multimodal
  • Smart meeting

