ICT Project

MAGIC SHOES

Credits: -

Can body-tracking, sound-based wearables alter negative self-perception and reduce physical inactivity?

http://www.researchgate.net/project/MAGIC-SHOES-Changing-sedentary-lifestyles-by-altering-mental-body-representation-using-sensory-feedback

From Dec. 30, 2016 to Dec. 29, 2018

Keywords: wearable, sensor, health

Description of the challenges faced by the ICT-Project

The overall aim of the MAGICSHOES project is to test the feasibility and potential value of using wearable technology that integrates sensory feedback and body tracking to improve body representation, motor behaviour, emotion and, ultimately, exercise adherence in people who are physically inactive or lead sedentary lifestyles. At the intersection of neuroscience research on mental body representation (MBR), human-computer interaction (HCI) and real-life applications, MAGICSHOES will (1) develop a new wearable device that alters people's perception of their body size and capabilities as they walk or engage in other physical activity, resulting in more active motor patterns and positive emotional states, and (2) explore the potential benefits of this technology for people who are physically inactive, by addressing emotional and psychological factors related to MBR. MAGICSHOES could be used for self-management and therapy for this population.
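As a hedged illustration of the core idea only (not the project's actual implementation), one simple mapping could turn a desired shift in perceived body weight into a pitch-shift factor applied to the walker's footstep sounds. The function name, the one-octave range and the direction of the mapping below are assumptions chosen for illustration:

```python
def footstep_pitch_shift(perceived_weight_change: float) -> float:
    """Map a desired change in perceived body weight to a pitch-shift
    factor for real-time footstep-sound feedback.

    perceived_weight_change: -1.0 (feel much lighter) .. +1.0 (feel much heavier).
    Returns a multiplicative frequency factor (>1 raises pitch, <1 lowers it).

    Hypothetical mapping for illustration: higher-pitched footsteps are
    assumed here to make the body feel lighter, so a negative weight
    change raises the pitch. Range spans one octave in each direction.
    """
    if not -1.0 <= perceived_weight_change <= 1.0:
        raise ValueError("expected a value in [-1, 1]")
    # 2**(-x) gives factor 2.0 at x=-1 (lighter), 1.0 at x=0, 0.5 at x=+1.
    return 2.0 ** (-perceived_weight_change)
```

In a real system this factor would drive a real-time pitch shifter (e.g. in MAX/MSP) applied to the microphone signal picked up at the shoes.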

Brief description of technology

The artist will have access to all of the documentation explaining the project and to the related bibliography, and will be in close contact with the project team, which will provide support and guidance while the artist elaborates his/her technical approach. The artist will have access to software and hardware for recording, processing and storing audio and other digital data, as well as for capturing motion- and emotion-related signals from users and delivering audio. These include microphones, amplifiers, equalizers, a bracelet that tracks emotion-related physiological signals, and movement sensors that connect to a computer or smartphone. The artist will be able to use a dedicated computer running the MAX/MSP software for processing sound in real time, as well as a dedicated smartphone. Access to lab consumables for designing the motion/emotion capture system will also be ensured.

What the project is looking to gain from the collaboration and what kind of artist would be suitable

In MAGICSHOES we are exploring how sound can alter the experience of one's own body. This experience is tightly linked to self-identity, self-esteem, self-efficacy and emoacoustics, and to the way we interact with the environment. We are rethinking how the experience of one's body could be redesigned through sensing technology and sound feedback in order to augment human interaction with the environment and to enhance well-being. Artistic considerations may be decisive in shedding light on new concepts and methods for inducing sensory and emotional experiences that increase body awareness and/or alter the perception of one's body. We are also interested in innovative wearable solutions for tracking body signals (movement, physiology) or delivering sound while people are on the move. Through the artist's exploratory work we may gain an understanding of the social acceptability of the new concepts presented and of how the technology can address real issues.

Resources available to the artist

The artist will be based at one of the four locations that are part of the project (Spain, Estonia, UK, France; see the project link). Each partner lab offers an unparalleled environment in terms of knowledge and infrastructure for carrying out this multidisciplinary project. The following resources will be made available to the artist: office facilities for human behavioural testing; research-dedicated equipment (sensory stimulation equipment, a large catalogue of audio equipment for recording and delivering sound, motion capture, electrophysiological acquisition, lab consumables, etc.); library access; and close contact with researchers working on the project.
