Domain: Research

PhD Research: Technology Supported Capturing and Sharing of Multifaceted Running Experience

This figure illustrates the vision of the research project. On the left, a runner wears different types of sensors to capture data, including self-reports, head motion signals, breathing and voice patterns, arm movement signals, physiological signals such as heart rate and skin conductivity, electromyography, leg movement signals, and foot pressure signals. On the right, it shows the runner's subjective and affective experience, including their perceived exertion, happiness, motivation, fatigue, discomfort, pain, attentional focus, and cognitive capability. The figure aims to illustrate how technology could support the capturing and sharing of this multifaceted running experience.

Project Timeline: 2017–2021

Project Overview

Distance running is a dynamic experiential journey in which emotions and feelings play a vital role in constructing the running experience. The current market for running technology, however, is dominated by tracking performance-based metrics. There is growing research interest in using machine learning to build automatic recognition systems that capture and share the affective and experiential aspects of running, e.g., smart wearables that detect runners’ perceived exertion or tiredness during a run. Such work shows potential not only for improving running performance and reducing injury risk, but also for enabling runners to share their experience in real time with friends or coaches, to receive real-time feedback, or to engage the audience in sports events.

In this project, we first take a qualitative approach to understand what, when, and how people would like to capture and share their running experience. We then apply an exploratory design approach to investigate different wearable sensors and means of collecting ground truth for building automatic recognition systems for the multifaceted running experience, as sketched below.
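To make this pipeline concrete, here is a minimal sketch, not the project's actual implementation: it windows multichannel wearable-sensor signals, extracts simple per-channel statistics, and trains a classifier against self-reported exertion labels as ground truth. The sensor channels, sampling rate, window length, features, and labels are all hypothetical placeholders.

```python
# Minimal sketch (hypothetical, not the project's implementation):
# classifying self-reported perceived exertion from windowed sensor signals.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def window_features(signals, fs=50, window_s=10):
    """Slice multichannel signals (n_samples, n_channels) into fixed-length
    windows and compute simple per-channel statistics as features."""
    win = fs * window_s
    n_windows = len(signals) // win
    feats = []
    for i in range(n_windows):
        w = signals[i * win:(i + 1) * win]
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0),
                                     np.ptp(w, axis=0)]))
    return np.array(feats)

# Hypothetical data: 5 minutes of 4-channel data (e.g., heart rate, EMG,
# accelerometer magnitude, skin conductance) at 50 Hz, plus one
# self-reported exertion label (low/medium/high) per 10-second window.
rng = np.random.default_rng(0)
signals = rng.normal(size=(5 * 60 * 50, 4))
X = window_features(signals)
y = rng.integers(0, 3, size=len(X))  # ground truth from self-reports

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # chance-level on random data
```

In practice, the quality of the self-reported labels is what limits such a system, which is why means of collecting ground truth are a focus of this project.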

We have published our first CHI paper on understanding the shared experience of runners and spectators during long-distance running events. We have also made a preliminary assessment of the feasibility of building wearable systems to recognise runners’ perceived exertion, pain, and other affective experiences. In addition, we are exploring how to design a chatbot that supports runners’ self-reporting, to obtain better-quality ground truth for building automatic recognition systems. These findings could inform further exploration of how to design for runners with disabilities and for people in other sports.

Outcomes

  • Bi, T., Bianchi-Berthouze, N., Singh, A., & Costanza, E. (2019, May). Understanding the shared experience of runners and spectators in long-distance running events. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1-13).
  • Bi, T. (2019, September). Wearable Sensing Technology for Capturing and Sharing Emotional Experience of Running. In 2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW) (pp. 70-74). IEEE.

Project Team

Prof. Nadia Bianchi-Berthouze

Professor & Deputy Director of UCLIC

Prof. Catherine Holloway


Dr Enrico Costanza

Associate Professor & Deputy Director of UCLIC

Dr Aneesha Singh

Lecturer in Human-Computer Interaction