NAV-VIR: an audio-tactile virtual environment to assist visually impaired people

International IEEE/EMBS Conference on Neural Engineering

We introduce NAV-VIR, a multimodal interface for the interactive exploration of maps by Visually Impaired People.
Keywords: Cognitive Psychology, Sensory Substitution, Accessibility, Engineering, Auditory Interface, Haptic Interface, Augmented Reality

Authors: Marc-Aurèle Rivière, Simon Gay, Katerine Romeo, Edwige Pissaloux, Michal Bujacz, Piotr Skulimowski, Pawel Strumillo

Published: May 20, 2019

DOI: 10.1109/NER.2019.8717086
Abstract

This paper introduces the NAV-VIR system, a multimodal virtual environment to assist visually impaired people in virtually discovering and exploring unknown areas from the safety of their home. The originality of NAV-VIR resides in (1) an optimized representation of the surrounding topography, the spatial gist, based on human spatial cognition models and the sensorimotor supplementation framework, and (2) a multimodal orientation-aware immersive virtual environment relying on two synergetic interfaces: an interactive force feedback tablet, the F2T, and an immersive HRTF-based 3D audio simulation relying on binaural recordings of real environments. This paper presents NAV-VIR functionalities and its preliminary evaluation through a simple shape and movement perception task.
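To make the audio component of the abstract more concrete, the sketch below illustrates the general principle behind HRTF-based binaural rendering: a mono source is convolved with the left- and right-ear head-related impulse responses (HRIRs) measured for a given direction, producing a stereo signal that, over headphones, is perceived as coming from that direction. This is a minimal, generic Python illustration, not the NAV-VIR implementation; the function name spatialize_mono and the dummy HRIR filters are placeholders.

import numpy as np

def spatialize_mono(source, hrir_left, hrir_right):
    # Convolve the mono signal with the left/right HRIRs measured for one
    # direction; the resulting interaural time and level differences cue
    # the listener to that direction when played back over headphones.
    left = np.convolve(source, hrir_left)
    right = np.convolve(source, hrir_right)
    return np.stack([left, right], axis=-1)  # shape: (samples, 2)

# Placeholder data: a 0.5 s, 440 Hz tone and dummy 128-tap "HRIRs".
# In a real system these filters would come from a measured HRTF dataset.
fs = 44100
t = np.arange(int(0.5 * fs)) / fs
tone = np.sin(2 * np.pi * 440 * t)
hrir_l = np.random.randn(128) * 0.01
hrir_r = np.random.randn(128) * 0.01
binaural = spatialize_mono(tone, hrir_l, hrir_r)

In practice, a moving source or a rotating listener (as in an orientation-aware simulation) is handled by switching or interpolating between HRIR pairs as the relative direction changes.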



Citation

BibTeX citation:
@inproceedings{rivière2019,
  author = {Rivière, Marc-Aurèle and Gay, Simon and Romeo, Katerine and
    Pissaloux, Edwige and Bujacz, Michal and Skulimowski, Piotr and
    Strumillo, Pawel},
  publisher = {IEEE},
  title = {NAV-VIR: An Audio-Tactile Virtual Environment to Assist
    Visually Impaired People},
  booktitle = {Proceedings of the International IEEE/EMBS Conference on
    Neural Engineering},
  pages = {1038-1041},
  date = {2019-05-20},
  url = {https://ieeexplore.ieee.org/document/8717086},
  doi = {10.1109/NER.2019.8717086},
  isbn = {978-1-5386-7921-0},
  langid = {en},
  abstract = {This paper introduces the NAV-VIR system, a multimodal
    virtual environment to assist visually impaired people in virtually
    discovering and exploring unknown areas from the safety of their
    home. The originality of NAV-VIR resides in (1) an optimized
    representation of the surrounding topography, the spatial gist,
    based on human spatial cognition models and the sensorimotor
    supplementation framework, and (2) a multimodal orientation-aware
    immersive virtual environment relying on two synergetic interfaces:
    an interactive force feedback tablet, the F2T, and an immersive
    HRTF-based 3D audio simulation relying on binaural recordings of
    real environments. This paper presents NAV-VIR functionalities and
    its preliminary evaluation through a simple shape and movement
    perception task.}
}
For attribution, please cite this work as:
Rivière, M.-A., Gay, S., Romeo, K., Pissaloux, E., Bujacz, M., Skulimowski, P., & Strumillo, P. (2019). NAV-VIR: an audio-tactile virtual environment to assist visually impaired people. Proceedings of the International IEEE/EMBS Conference on Neural Engineering, 1038–1041. https://doi.org/10.1109/NER.2019.8717086