
Digital haptics on a tablet improve performance on a central visual task, paving the way for safer drivers.

Every year there are over a million road traffic deaths. Road traffic injuries are the leading killer of people aged 5-29 years. Driving is visually demanding, and any additional in-vehicle task risks diverting attention from the road. Time spent looking inside the vehicle is not spent scanning the road for potential hazards.

The fact that dashboards in modern automobiles now include a touchscreen may exacerbate this situation. Drivers using such a screen likely shift their visual attention away from the road: they glance away to control the temperature, adjust the navigation system, or select music. Operating devices that require glances away from the road can impair driving performance. Glances of 2 seconds lead to 3.6 times more lane departures than glances of 1 second. Thus, as drivers increase their use of such in-vehicle devices, there is an associated increase in related crashes. Visual attention during driving is an important predictor of accidents.

In a study published today in Scientific Reports, Professor Micah Murray, Scientific and Academic Director of The Sense Innovation and Research Center in Lausanne and Sion, Switzerland, and his colleagues show that when a tablet provides haptic information – i.e. the impression of texture – performance improves on a central attention-demanding visual task similar to driving. This is the first quantitative support for introducing digital haptics into vehicle and similar interfaces.

To show this improvement, Dr. Ruxandra Tivadar, the lead author on the work, which formed part of her Ph.D. thesis at the University of Lausanne, had 25 adults perform 2 tasks simultaneously. The main task required participants to indicate when a target letter of a predefined color (e.g. a dark blue “T”) appeared on a central display. This task was made challenging by embedding the target within a set of 42 distractors (e.g. light blue “T”s as well as dark and light blue “L”s). Simultaneously, participants also had to control slider “buttons” on a tablet. These sliders could be purely visual, as on standard tablets today. Alternatively, the sliders could be presented haptically only, with no visual information, thanks to the introduction of innovative digital haptics technology. Lastly, the sliders could carry both visual and haptic information.

The team found that people’s ability to detect target letters was fastest when the tablet task was completed using haptic information alone. By contrast, when the tablet provided only visual or combined visual and haptic information, performance in detecting target letters was slower. Dr. Tivadar highlights the broader applications of digital haptic technology: “My collective thesis work shows how digital haptics can serve both the general public, for example by changing in-vehicle technologies, as well as the visually impaired, by providing a tool by which text or pictures can immediately and dynamically be rendered as a haptic display.”

This research is exemplary of the activity of The Sense Innovation and Research Center, a joint venture of the Lausanne University Hospital (CHUV), the University of Lausanne (UNIL), and the University of Applied Sciences of Western Switzerland – Valais (HES-SO Valais-Wallis). The study resulted from a collaboration with the Fondation Asile des aveugles, the CIBM Center for Biomedical Imaging, the Institute of Computer Science at the University of Bern, the University of Geneva, and Hap2u. Financial support was provided by the Swiss National Science Foundation and a grantor advised by Carigest SA.

Prof. Micah Murray, Associate Professor at the Radiology Department of the Lausanne University Hospital and University of Lausanne, describes the impact of this work: “Digital haptics technologies exist today. This work shows why such technologies are critically important to make driving safer by helping drivers react faster. Touch screens are universally present, but their full capacity for showcasing how we can best use our senses to interact with our world is only in its infancy.”

When asked what’s next, Prof. Murray and his colleagues point to changing online shopping to let customers “feel” fabrics, revolutionizing museum visits by enabling the “touching” of artworks by the visually impaired, providing indoor and outdoor navigation tools for the visually impaired, and creating novel educational devices for children with learning differences and disabilities.