TETMOST

Making Art more accessible to Visually Impaired People

Developing a haptic interface and studying ways to intuitively represent images and Art pieces haptically for Visually Impaired People.
Research
Software Engineering
Human-Computer Interaction
Virtual Reality
Sensory Substitution
Haptic Interface
Consortium
Began Around

September 1, 2017

Abstract

The goal of the TETMOST project was to make Art more accessible to Visually Impaired People. The project had two main axes: (1) researching and experimenting to find the most intuitive way to transcode an image into haptic sensations whilst retaining its original meaning and peculiarities, and (2) developing a novel haptic interface allowing VIP to explore images with their fingers.


Image illustrating the project

1 Introduction


To Be Filled

Schema illustrating the objectives of the TETMOST project

1.1 Exploring existing haptic interfaces

We researched and tried different categories of haptic interfaces in order to assess their strengths and weaknesses for our purposes:

Photo of the StimTACT interface
(a) Taxel mechanical interfaces
Photo of the Hap2U interface
(b) Electro-friction interfaces
Photo of the vibrotactile gloves
(c) Vibrational interfaces
Figure 1: The three main categories of haptic interfaces

Our experience with the existing categories of haptic interfaces allowed us to design one best adapted to our needs: the Force-Feedback Tablet (F2T).

2 Outcomes


2.1 Our interface: F2T (v1)

The first prototype of the F2T was assembled with LEGO bricks and a camera to better assess the position of the joystick within the frame of the device.

Photo of the first prototype of the F2T device

Schema explaining the F2T interface
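In the first prototype, relating the camera's view of the joystick to the displayed image comes down to a simple coordinate transformation. The Java sketch below illustrates that kind of mapping under assumed conditions; the class, method names, and calibration values are hypothetical and not taken from the actual F2T code. It assumes the camera reports the joystick marker in pixel coordinates and that the device frame's position in the camera image is known from a prior calibration.

```java
// Hypothetical sketch: map a joystick marker detected by the camera (in camera
// pixel coordinates) to normalized coordinates inside the device frame, and
// from there to a pixel of the tactile image being explored.
public final class JoystickMapper {

    // Device frame as seen by the camera, in pixels (assumed known from calibration).
    private final double frameLeft, frameTop, frameWidth, frameHeight;

    public JoystickMapper(double frameLeft, double frameTop,
                          double frameWidth, double frameHeight) {
        this.frameLeft = frameLeft;
        this.frameTop = frameTop;
        this.frameWidth = frameWidth;
        this.frameHeight = frameHeight;
    }

    /** Camera pixel position of the joystick -> normalized [0,1] x [0,1] frame coordinates. */
    public double[] toNormalized(double camX, double camY) {
        double nx = clamp((camX - frameLeft) / frameWidth);
        double ny = clamp((camY - frameTop) / frameHeight);
        return new double[] { nx, ny };
    }

    /** Normalized frame coordinates -> pixel in the tactile image being explored. */
    public int[] toImagePixel(double nx, double ny, int imageWidth, int imageHeight) {
        int px = (int) Math.round(nx * (imageWidth - 1));
        int py = (int) Math.round(ny * (imageHeight - 1));
        return new int[] { px, py };
    }

    private static double clamp(double v) {
        return Math.max(0.0, Math.min(1.0, v));
    }
}
```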

2.2 Software tools

1) We developed a Java application to create or convert images into simplified tactile representations, which can then be explored using the F2T:

Screenshot of the F2T control interface
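To give a concrete, purely illustrative sense of what such a conversion can look like, here is a minimal, self-contained Java sketch; it is not the project's actual code. It reduces an image to a coarse grid of binary "tactile" cells by averaging blocks of pixels and thresholding the mean brightness.

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

// Illustrative sketch only: reduce an image to a coarse boolean grid, each cell
// marking whether the underlying block of pixels is dark enough to be rendered
// as an "active" tactile point.
public final class TactileGrid {

    public static boolean[][] fromImage(BufferedImage img, int cols, int rows, double threshold) {
        boolean[][] grid = new boolean[rows][cols];
        int cellW = img.getWidth() / cols;
        int cellH = img.getHeight() / rows;
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                double sum = 0;
                for (int y = r * cellH; y < (r + 1) * cellH; y++) {
                    for (int x = c * cellW; x < (c + 1) * cellW; x++) {
                        int rgb = img.getRGB(x, y);
                        int red = (rgb >> 16) & 0xFF, grn = (rgb >> 8) & 0xFF, blu = rgb & 0xFF;
                        sum += (red + grn + blu) / (3.0 * 255.0);  // simple brightness in [0,1]
                    }
                }
                double mean = sum / (cellW * cellH);
                grid[r][c] = mean < threshold;  // dark block -> active tactile cell
            }
        }
        return grid;
    }

    public static void main(String[] args) throws Exception {
        BufferedImage img = ImageIO.read(new File(args[0]));
        boolean[][] grid = fromImage(img, 40, 30, 0.5);
        for (boolean[] row : grid) {
            StringBuilder line = new StringBuilder();
            for (boolean cell : row) line.append(cell ? '#' : '.');
            System.out.println(line);
        }
    }
}
```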

2) In order to display an image haptically, we first needed a way to simplify the image’s content without losing its meaning. To do so, we explored various Computer Vision techniques such as image segmentation and edge detection:

Antique drawing of a horse passed through an edge-detection algorithm to remove unneeded details

Antique painting of a woman carrying a milk jug, passed through another type of edge-detection algorithm to remove unneeded details
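As a reference point for the kind of processing involved, the following self-contained Java sketch implements a basic Sobel edge detector that keeps only strong contours. The project explored several segmentation and edge-detection algorithms, so this is an illustration of the technique rather than the actual pipeline used.

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

// Illustrative sketch: a basic Sobel edge detector that keeps only strong
// contours, producing a simplified line drawing similar to the figures above.
public final class SobelEdges {

    public static BufferedImage detect(BufferedImage src, int threshold) {
        int w = src.getWidth(), h = src.getHeight();
        int[][] gray = new int[h][w];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int rgb = src.getRGB(x, y);
                gray[y][x] = ((rgb >> 16 & 0xFF) + (rgb >> 8 & 0xFF) + (rgb & 0xFF)) / 3;
            }
        }
        // Border pixels are left black; only interior pixels are filtered.
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_BYTE_GRAY);
        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                // Horizontal and vertical Sobel kernels.
                int gx = -gray[y - 1][x - 1] - 2 * gray[y][x - 1] - gray[y + 1][x - 1]
                         + gray[y - 1][x + 1] + 2 * gray[y][x + 1] + gray[y + 1][x + 1];
                int gy = -gray[y - 1][x - 1] - 2 * gray[y - 1][x] - gray[y - 1][x + 1]
                         + gray[y + 1][x - 1] + 2 * gray[y + 1][x] + gray[y + 1][x + 1];
                int magnitude = (int) Math.min(255, Math.hypot(gx, gy));
                // Keep only strong edges: black lines on a white background.
                int value = magnitude >= threshold ? 0 : 255;
                out.setRGB(x, y, value << 16 | value << 8 | value);
            }
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        BufferedImage edges = detect(ImageIO.read(new File(args[0])), 100);
        ImageIO.write(edges, "png", new File("edges.png"));
    }
}
```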

3 My role in this project


1) Participated in the development of a Java app to control the F2T and display tactile “images”.

2) Investigated existing haptic interfaces, classified them according to our needs, and purchased or borrowed a prototype from each category.

3) Organized experimental evaluations with VIP to assess the strengths and weaknesses of each category of interfaces.

4) Participated in writing a poster and two conference articles: one on the F2T and one on image segmentation.
