SAM-Guide

Spatial Awareness for Multimodal Guidance

Designing an efficient multimodal interface to assist VIP during spatial interactions and sports.
Research
Software Engineering
C#
Unity
Human-Computer Interaction
Augmented Reality
Consortium
Began Around

May 1, 2021

Abstract

SAM-Guide’s high-level objective is to efficiently assist Visually Impaired People (VIP) in tasks that require interacting with space. It aims to develop a multimodal interface that assists VIP during different types of spatial interactions, from object reaching and large-scale navigation (indoor and outdoor) to outdoor sports activities (e.g. laser-run). It also aims to study and model how to optimally supplement vision with both auditory and tactile feedback, by reframing spatial interactions as target-reaching affordances and symbolizing spatial properties with 3D ego-centered beacons. Candidate encoding schemes will be evaluated through Augmented Reality (AR) serious games relying on motion-capture platforms and indoor localisation solutions to track the user’s movements.

SAM-Guide is an inter-disciplinary collaboration project (ANR 2021 PRC) between three sites: (1) the LPNC and GIPSA laboratories from the Grenoble-Alpes University, (2) the CMAP from Ecole Polytechnique in Paris-Saclay, and (3) the LITIS and CERREV from Normandy University.


Banner illustrating the project

The SAM-Guide project aims to help visually impaired people interact with the world around them by creating smart devices that convert visual information into sounds and vibrations they can feel and hear. Instead of trying to replace vision entirely, the project focuses on giving people the specific spatial information they need for everyday tasks like finding objects, navigating spaces, and even participating in sports activities. The team of researchers from multiple French universities is developing wearable devices like vibrating belts and audio systems that can guide users toward targets or help them understand their surroundings. They’re testing these technologies through virtual reality games and real-world activities, including a new sport called laser-run designed for people with visual impairments. The ultimate goal is to create a common “language” of sounds and vibrations that can help visually impaired people gain more independence in various activities, from simple daily tasks to recreational sports.

Note: My role in this project

1) I was a major actor in the birth of this project: I connected the consortium members and wrote most of the grant proposal (ANR AAPG 2021, 609k€ in funding).

2) I designed and participated in the development of the second prototype of our vibro-tactile belt, which features wireless communication (through an ESP32 module) and removable vibration motors:

Photograph of the second iteration of the TactiBelt
Figure 1: Second prototype of the TactiBelt
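
To give an idea of how the platform can drive the belt, here is a minimal C# sketch of a client sending vibration commands to the ESP32 over UDP. The IP address, port, and 4-byte packet layout are illustrative assumptions, not the belt's actual firmware protocol.

```csharp
using System;
using System.Net.Sockets;

// Hypothetical sketch: sends a vibration command to the TactiBelt's ESP32
// over UDP. The address, port and 4-byte packet layout (command marker,
// motor index, intensity, duration) are assumptions for illustration only.
public class TactiBeltClient : IDisposable
{
    private readonly UdpClient _udp = new UdpClient();
    private readonly string _host;
    private readonly int _port;

    public TactiBeltClient(string host = "192.168.4.1", int port = 4210)
    {
        _host = host;
        _port = port;
    }

    // motorIndex: which vibrator to drive; intensity: PWM duty (0-255);
    // durationMs: pulse length, encoded in tens of milliseconds.
    public void Vibrate(byte motorIndex, byte intensity, int durationMs)
    {
        byte[] packet =
        {
            (byte)'V',                           // command marker (assumed)
            motorIndex,
            intensity,
            (byte)Math.Min(durationMs / 10, 255)
        };
        _udp.Send(packet, packet.Length, _host, _port);
    }

    public void Dispose() => _udp.Dispose();
}
```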

3) I led the design and development of the project’s experimental platform. The platform is built in Unity, connects to the various motion-tracking devices used by the consortium (Polhemus, VICON, Pozyx), uses PureData for sound-wave generation and Steam Audio for 3D audio modeling, and communicates wirelessly with the consortium’s non-visual interfaces.

Screenshot of the testing environment of the experimental platform of SAM-Guide
(a) Testing environment with a PureData audio beacon

 

Screenshot of the maze generator of the experimental platform of SAM-Guide
(b) Auto-generated maze with 3D audio beacons on waypoints
Figure 2: Screenshots from SAM-Guide’s experimental platform (in development)
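
To illustrate how the platform reframes spatial interactions as target-reaching affordances, here is a minimal Unity sketch that computes a target’s ego-centered bearing and distance from the tracked user each frame; the component and field names are hypothetical, and in the real platform such values feed the auditory (PureData/Steam Audio) and tactile encoders.

```csharp
using UnityEngine;

// Hypothetical sketch: computes the ego-centered polar coordinates of a
// target (bearing relative to the user's facing direction, plus distance),
// which the auditory/tactile encoding layers could then translate into
// 3D audio beacon parameters or vibration patterns. Names are illustrative,
// not the platform's actual API.
public class EgocentricBeacon : MonoBehaviour
{
    [Tooltip("Transform driven by the motion-capture rig (Polhemus, VICON, Pozyx...)")]
    public Transform trackedUser;

    [Tooltip("Target the beacon should point towards")]
    public Transform target;

    // Signed bearing in degrees: 0 = straight ahead, negative = to the left.
    public float BearingDeg { get; private set; }

    // Horizontal distance to the target, in metres.
    public float Distance { get; private set; }

    void Update()
    {
        Vector3 toTarget = target.position - trackedUser.position;
        toTarget.y = 0f;                                        // ignore height
        Vector3 forward = Vector3.ProjectOnPlane(trackedUser.forward, Vector3.up);

        BearingDeg = Vector3.SignedAngle(forward, toTarget, Vector3.up);
        Distance = toTarget.magnitude;
    }
}
```

Keeping the encoding in terms of signed bearing and planar distance makes it independent of which tracking device actually provides the user’s pose.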

This platform allows one to easily spin up experimental trials by specifying the desired characteristics in a JSON file (an approach based on the OpenMaze project). Unity will automatically generate each trial’s environment according to those specifications and populate it with the relevant items (e.g. a beacon emitting a tactile signal to mark a target to reach in a maze), handle the transitions between successive trials and blocks of trials, and log all the relevant user metrics into a data file.

Screenshot of the experimental protocol file specifying the avatar and the experimental blocks' characteristics
(a) Specifying the avatar and the experimental blocks’ characteristics

 

Screenshot of the experimental protocol file specifying experimental trials, which can be repeated and randomized within blocks
(b) Specifying experimental trials, which can be repeated and randomized within blocks
Figure 3: Examples of settings used to generate experimental trials on the fly.
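
To illustrate how such a protocol file can be consumed, below is a minimal sketch relying on Unity’s built-in JsonUtility; the field names are placeholders and do not reflect the platform’s actual JSON schema.

```csharp
using System;
using UnityEngine;

// Hypothetical sketch: deserializes a JSON experimental protocol into plain
// C# objects with Unity's built-in JsonUtility. Field names are placeholders,
// not the platform's real schema.
[Serializable]
public class TrialSpec
{
    public string beaconType;   // e.g. "audio" or "tactile"
    public int mazeSize;        // side length of the generated maze
    public int repetitions;     // how many times the trial is repeated
    public bool randomize;      // shuffle repetitions within the block
}

[Serializable]
public class BlockSpec
{
    public string name;
    public TrialSpec[] trials;
}

[Serializable]
public class ProtocolSpec
{
    public string avatar;       // avatar / locomotion settings
    public BlockSpec[] blocks;
}

public static class ProtocolLoader
{
    // Parse a protocol file's contents into the structures above.
    public static ProtocolSpec Load(string json) =>
        JsonUtility.FromJson<ProtocolSpec>(json);
}
```

The parsed blocks and trials can then be iterated to generate each environment, handle the transitions between trials, and name the corresponding log files.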

4) I handled the experimental design of the first wave of experiments, which use the TactiBelt for “blind” navigation.

5) I built the first version of SAM-Guide’s website, using Quarto and hosted on GitHub Pages (code).
