NAV-VIR

Virtual Map Exploration for Visually Impaired People

Developing a multi-modal interface for Visually Impaired People to virtually explore a map in order to prepare for a journey.
Research
Software Engineering
Human-Computer Interaction
Virtual Reality
Haptic Interface
Auditory Interface
Sensory Substitution
Consortium
Began Around

September 1, 2018

Abstract

The goal of the NAV-VIR project was to develop a multimodal interface that allows Visually Impaired People to virtually discover and explore unknown areas from the safety of their home. It relies on an interactive Force-Feedback Tablet, the F2T, and an immersive HRTF-based 3D audio simulation built on binaural recordings of real environments.
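
To give a flavour of the audio side, the sketch below shows the core idea behind HRTF-based spatialization: convolving a mono source with a left/right pair of head-related impulse responses (HRIRs) for the desired direction. The HRIR data and class names are assumed for illustration only; the actual NAV-VIR audio simulation also mixes in binaural recordings and is not reproduced here.

```java
// Minimal sketch: spatialize a mono signal by convolving it with the
// left/right head-related impulse responses (HRIRs) for one direction.
// The HRIR arrays are assumed to come from an external dataset; this
// illustrates the principle, not the NAV-VIR implementation.
public final class HrtfSpatializer {

    /** Convolves a mono signal with one HRIR (plain FIR convolution). */
    private static float[] convolve(float[] signal, float[] hrir) {
        float[] out = new float[signal.length + hrir.length - 1];
        for (int n = 0; n < out.length; n++) {
            float acc = 0f;
            for (int k = 0; k < hrir.length; k++) {
                int idx = n - k;
                if (idx >= 0 && idx < signal.length) {
                    acc += hrir[k] * signal[idx];
                }
            }
            out[n] = acc;
        }
        return out;
    }

    /** Returns {left, right} channels for a sound placed at the HRIRs' direction. */
    public static float[][] spatialize(float[] mono, float[] hrirLeft, float[] hrirRight) {
        return new float[][] { convolve(mono, hrirLeft), convolve(mono, hrirRight) };
    }
}
```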


Image illustrating the project

1 Introduction


To Be Filled

1.1 Our interface: F2T (v2)

During this project, we improved upon the first iteration of the Force-Feedback Tablet (F2T) from the TETMOST project to produce the finalized prototype of the interface:

Photo of the F2T interface

Schema explaining the F2T interface

2 Outcomes


1) We developed a Java application to create or convert images into simplified tactile representations, which can then be explored using the F2T (a sketch of such a conversion follows the screenshot below):

Screenshot of the F2T control interface
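
A minimal sketch of how an image-to-tactile conversion of this kind can work, assuming a simple luminance threshold over a coarse grid; the real application's simplification pipeline is more involved, and the names below are illustrative only.

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

// Sketch: reduce an input picture to a binary "tactile" grid, where true marks
// a cell the F2T should render as a raised/resistive region. A plain luminance
// threshold is assumed here for illustration.
public final class TactileImageConverter {

    public static boolean[][] toTactileGrid(File imageFile, int cols, int rows,
                                            double threshold) throws Exception {
        BufferedImage img = ImageIO.read(imageFile);
        boolean[][] grid = new boolean[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                // Sample the pixel at the centre of each grid cell.
                int x = (int) ((c + 0.5) * img.getWidth() / cols);
                int y = (int) ((r + 0.5) * img.getHeight() / rows);
                int rgb = img.getRGB(x, y);
                double lum = 0.299 * ((rgb >> 16) & 0xFF)
                           + 0.587 * ((rgb >> 8) & 0xFF)
                           + 0.114 * (rgb & 0xFF);
                // Dark pixels become "tactile" cells.
                grid[r][c] = lum < threshold;
            }
        }
        return grid;
    }
}
```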

2) We investigated and developed tools to automatically generate a navigation graph from a floor plan, which can then be converted into a tactile image and explored with the F2T (a sketch of the graph construction follows the illustrations below):

Illustrations of the navigation graph generation
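
A minimal sketch of the navigation-graph idea, assuming the floor plan has already been rasterized into walkable/blocked cells; the actual tools work on richer floor-plan data, so this only illustrates the principle.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: derive a navigation graph from a floor plan rasterized into
// walkable (true) / blocked (false) cells. Each walkable cell becomes a node;
// edges connect 4-neighbouring walkable cells.
public final class NavGraphBuilder {

    /** Maps each walkable cell "r,c" to the list of its walkable neighbours. */
    public static Map<String, List<String>> build(boolean[][] walkable) {
        Map<String, List<String>> graph = new HashMap<>();
        int rows = walkable.length, cols = walkable[0].length;
        int[][] dirs = { {1, 0}, {-1, 0}, {0, 1}, {0, -1} };
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                if (!walkable[r][c]) continue;
                List<String> neighbours = new ArrayList<>();
                for (int[] d : dirs) {
                    int nr = r + d[0], nc = c + d[1];
                    if (nr >= 0 && nr < rows && nc >= 0 && nc < cols && walkable[nr][nc]) {
                        neighbours.add(nr + "," + nc);
                    }
                }
                graph.put(r + "," + c, neighbours);
            }
        }
        return graph;
    }
}
```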

3 My role in this project


1) Participated in the development of a Java app to control the F2T and display tactile “images” (a sketch of the control-loop idea follows).
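
The sketch below shows the kind of haptic control loop such an app runs: resist the handle's motion whenever it enters a "raised" cell of the tactile grid. The device API (F2TDevice, readPosition, setForce) is hypothetical and only serves to show the loop structure, not the actual F2T driver code.

```java
// Sketch of the control idea: when the F2T handle enters a cell that is "raised"
// in the tactile grid, the app pushes back with a resisting force.
public final class F2TController {

    /** Hypothetical device API, assumed for illustration. */
    public interface F2TDevice {
        double[] readPosition();             // handle position, normalized to [0,1] x [0,1]
        void setForce(double fx, double fy); // force command sent to the actuators
    }

    private final boolean[][] tactileGrid;

    public F2TController(boolean[][] tactileGrid) {
        this.tactileGrid = tactileGrid;
    }

    /** One iteration of the haptic loop: resist motion inside "raised" cells. */
    public void update(F2TDevice device, double resistingForce) {
        double[] pos = device.readPosition();
        int row = Math.min(tactileGrid.length - 1, (int) (pos[1] * tactileGrid.length));
        int col = Math.min(tactileGrid[0].length - 1, (int) (pos[0] * tactileGrid[0].length));
        if (tactileGrid[row][col]) {
            device.setForce(resistingForce, resistingForce); // simplistic: constant push-back
        } else {
            device.setForce(0.0, 0.0); // free movement over empty cells
        }
    }
}
```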

2) Helped design the first round of experimental evaluations, in which participants were tasked with recognizing and re-drawing simple geometric shapes, as well as the layout of a simple mock apartment.

A drawing illustrating the experimental setup of NAV-VIR

3) Wrote a first-author conference article and a poster.
