Automatic Storytelling from Wearable Sensor Data for Health and Wellness Applications

Ky Trung Nguyen, Jani Mäntyjärvi, Tran Thi Ngoc Nguyen

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

Abstract

Storytelling describes our daily living activities in many ways and helps us understand what we have done to advance our health and wellbeing. In this paper, we present a novel approach to generating scripts from events detected in wearable sensor data. First, we use a Deep Neural Network (DNN) to recognize semantic concepts such as gesture, activity, and location, producing a chronological sequence of events. Second, we apply a sequence-to-sequence (SEQ2SEQ) model consisting of two recurrent neural networks (RNNs) to generate human-understandable stories. The results show that our method improves script generation (SG) performance, achieving a BLEU-1 score of 0.972 with SEQ2SEQ.
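The SEQ2SEQ model described in the abstract pairs an encoder RNN (reading the chronological event sequence) with a decoder RNN (emitting the story tokens). Below is a minimal, hypothetical sketch of that encoder-decoder structure in PyTorch; the paper does not specify the RNN cell type, vocabulary sizes, or layer dimensions, so GRU cells and all names and sizes here are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Sketch of an encoder-decoder: maps a sequence of event IDs
    (e.g. recognized gesture/activity/location concepts) to word IDs."""
    def __init__(self, n_events, n_words, hidden=64):
        super().__init__()
        self.embed_in = nn.Embedding(n_events, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.embed_out = nn.Embedding(n_words, hidden)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_words)

    def forward(self, events, words):
        # Encode the chronological event sequence into a context state.
        _, context = self.encoder(self.embed_in(events))
        # Decode story tokens conditioned on that context (teacher forcing).
        dec_out, _ = self.decoder(self.embed_out(words), context)
        return self.out(dec_out)  # (batch, story_len, n_words) logits

model = Seq2Seq(n_events=10, n_words=50)
events = torch.randint(0, 10, (2, 5))  # batch of 2 event sequences, length 5
words = torch.randint(0, 50, (2, 7))   # target story tokens, length 7
logits = model(events, words)
print(tuple(logits.shape))
```

In training, the logits would be scored against the reference scripts with cross-entropy, and generated stories evaluated with BLEU as in the paper.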
Original language: English
Title of host publication: 2021 IEEE EMBS International Conference on Biomedical and Health Informatics (BHI)
Publisher: IEEE Institute of Electrical and Electronics Engineers
Pages: 1-4
Number of pages: 4
ISBN (Electronic): 978-1-6654-0358-0
ISBN (Print): 978-1-6654-4770-6
DOIs
Publication status: Published - 30 Jul 2021
MoE publication type: A4 Article in a conference publication
Event: IEEE EMBS International Conference on Biomedical and Health Informatics, BHI - Virtual, Athens, Greece
Duration: 27 Jul 2021 - 30 Jul 2021

Conference

Conference: IEEE EMBS International Conference on Biomedical and Health Informatics, BHI
Abbreviated title: IEEE BHI-BSN 2021
Country/Territory: Greece
City: Athens
Period: 27/07/21 - 30/07/21

Keywords

  • Storytelling
  • Human Activity Recognition
  • Machine Learning
  • Script Generation
