CSEar: Metalearning for Head Gesture Recognition Using Earphones in Internet of Healthcare Things

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

With the popularity of personal computing devices, people often keep their heads immobile for long periods in front of screens, giving rise to 'phubbers' and sedentary office workers. Early warning solutions in the Internet of Healthcare Things (IoHT) offer hope for protecting users' health and safety. However, most existing works cannot recognize different head gestures during walking, which is also a common cause of text neck and traffic accidents. In addition, they require a large amount of data to update the model for a new environment, which reduces their practicality. To solve these problems, we propose CSEar, a system based on the built-in accelerometers of off-the-shelf wireless earphones that can recognize 12 kinds of head gestures in both resting and walking states. First, an innovative algorithm is designed to detect head gesture signals, especially signals mixed with gait. Then, we propose MetaSensing, a head gesture recognition model that improves recognition with few samples compared with existing metalearning algorithms. Finally, experimental results demonstrate the effectiveness and robustness of CSEar.
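The abstract's first step, detecting gesture signals that are mixed with periodic gait motion, can be illustrated with a minimal sketch: subtract a moving-average baseline to suppress the slow gait component, then mark segments where the residual energy exceeds a threshold. All function names, window sizes, and thresholds here are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of gesture-event detection in a 1-D accelerometer
# stream: a moving average approximates the slow/periodic gait baseline,
# and thresholding the residual flags candidate gesture segments.
# Parameters (window, threshold) are illustrative, not from CSEar.

def moving_average(signal, window):
    """Centered moving average with edge padding."""
    half = window // 2
    padded = [signal[0]] * half + list(signal) + [signal[-1]] * half
    return [sum(padded[i:i + window]) / window for i in range(len(signal))]

def detect_gesture_segments(signal, window=5, threshold=0.5):
    """Return (start, end) index pairs where the detrended signal
    magnitude exceeds the threshold."""
    baseline = moving_average(signal, window)
    residual = [abs(s - b) for s, b in zip(signal, baseline)]
    segments, start = [], None
    for i, r in enumerate(residual):
        if r > threshold and start is None:
            start = i
        elif r <= threshold and start is not None:
            segments.append((start, i))
            start = None
    if start is not None:
        segments.append((start, len(residual)))
    return segments
```

For example, a flat signal with a single sharp deflection around sample 10 yields one detected segment covering that sample. A real pipeline would operate on 3-axis data and use a learned or adaptive threshold rather than a fixed constant.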

Original language: English
Pages (from-to): 23176-23187
Number of pages: 12
Journal: IEEE Internet of Things Journal
Volume: 9
Issue number: 22
DOIs
State: Published - 15 Nov 2022

Keywords

  • Earphones
  • head gesture
  • Internet of Healthcare Things (IoHT)
  • metalearning
