Perception-based haptic rendering
Matthias Harders, Computer Vision Laboratory, ETH Zurich.
Anatole Lécuyer, IRISA, Rennes.
Marc Ernst, Max Planck Institute, Tübingen.
Dr. Matthias Harders
Computer Vision Laboratory
ETH Zurich, Sternwartstrasse 7
CH-8092 Zurich, Switzerland
Tel.: + 41 1 632 52 79
Fax: + 41 1 632 11 99
Marc Ernst, Max Planck Institute
Matthias Harders, ETH Zurich
Vincent Hayward, McGill University
Anatole Lécuyer, INRIA
Gunter Niemeyer, Stanford University
Alan Wing, University of Birmingham
Hong Z. Tan, Purdue University
Although haptic interaction with objects in our environment plays
a fundamental role in human perception, the sense of touch has been
investigated far less intensively than the other senses in the context
of virtual environments. For instance, the design of haptic interfaces
is more often driven by the availability of technology than by the
need to solve users' actual perceptual problems. A clear change of
perspective is needed today: it is time to investigate how to design
haptic environments that properly match human haptic perception.
Haptic hardware could be restricted to stimulating the part of the
haptic channel that contributes most to the final percept. Furthermore,
we could take advantage of interesting properties of human perception
such as haptic illusions, cross-modal transfer, and synesthesia. In
addition, a deeper understanding of the characteristics of the human
haptic system, as well as of human perceptual processes, would help
us develop more effective guidelines for designing and evaluating
haptic devices and applications.
The workshop aims at providing answers to the following questions:
- How can we design haptic interfaces that are in line with the
characteristics of human perception?
- How can we use properties of human perception to simplify the
different components of haptic displays?
- How can we take advantage of perceptual phenomena such as haptic
illusions and cross-modal influences?
- How can we design haptic rendering frameworks that match human
perceptual capabilities?
This workshop will present the audience with recent
physiological and psychological findings in the field of haptic and
multimodal perception and rendering. It will give methodological
guidelines for the design of haptic solutions that match the
characteristics of the human haptic sense. We will illustrate our
approach with successful applications and systems that benefited
from knowledge of human perception.
Alan Wing
Title: Neurophysiology and Neuropsychology of Active Touch
Active touch is based on a combination of cues from peripheral
receptors, including those in skin and muscle, which are integrated
in primary and secondary areas of sensory cortex. I will review the
neurophysiological bases of touch, including imaging studies,
and describe neuropsychological studies of the effects of
brain damage on perception.
Marc O. Ernst
Title: Interaction Within Touch: Combining Tactile and Kinesthetic Information
When actively exploring an object with the hand the resulting
percept is based on tactile as well as kinesthetic information.
The purpose of a series of psychophysical studies that we recently
conducted was to investigate the interaction between kinesthetic
and tactile information. In a first experiment we investigated
the interaction between different cues to haptic shape. We found
that position and force signals are both used by the haptic system
to determine the final shape percept. Furthermore, we found that
the individual reliability of the two signals determines the
relative contribution to the shape percept, indicating an optimal
integration strategy. In a second experiment we investigated the
tactile sensitivity for orientation discrimination during active
and passive exploration and found that discrimination thresholds
significantly increased during active touch. This indicates some
sort of tactile suppression during active touch. In a third
experiment we investigated the contribution of different tactile
sensations (produced by shear force, slip force, and normal force)
to the feeling of "natural" haptic shape. Within a certain range,
which we could determine quantitatively using psychophysical
methods, all three cues provide useful information about haptic
shape and contribute to a more natural haptic feeling. In
conclusion, these three experiments demonstrate the tight coupling
between kinesthetic and tactile perception, and show that this
coupling depends on whether objects are explored actively or passively.
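The reliability-weighted integration described in the first experiment is commonly modeled as a maximum-likelihood combination, where each cue's weight is proportional to its inverse variance. A minimal sketch of this standard model (the function name and example values are illustrative, not taken from the talk):

```python
def integrate_cues(estimates, variances):
    """Combine independent cue estimates by inverse-variance weighting
    (maximum-likelihood cue integration)."""
    inv_vars = [1.0 / v for v in variances]
    total = sum(inv_vars)
    weights = [w / total for w in inv_vars]
    fused = sum(w * e for w, e in zip(weights, estimates))
    # The fused estimate is more reliable than either single cue.
    fused_var = 1.0 / total
    return fused, fused_var

# Example: a position cue signals shape value 1.0 (variance 0.04),
# a force cue signals 0.6 (variance 0.16); the fused percept lies
# closer to the more reliable position cue.
shape, var = integrate_cues([1.0, 0.6], [0.04, 0.16])
```

In this example the position cue receives weight 0.8 and the force cue 0.2, so the fused estimate is 0.92 with variance 0.032, below either single-cue variance.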
Hong Z. Tan
Title: Assessing Perceived Quality of Virtual Haptic Surfaces
Two studies will be presented that illustrate a human-based
approach to studying haptic rendering of virtual surfaces. We
propose a new concept, called perceived instability, that captures
unrealistic sensations associated with virtual textured surfaces.
We then formulate a force-constancy hypothesis that predicts the
illusion of surface-height reversal when virtual surfaces vary
in both topography and stiffness.
Anatole Lécuyer
Title: Pseudo-Haptic Feedback
This presentation will review the main results obtained in
the field of "pseudo-haptic feedback". Pseudo-haptic feedback
is a technique meant to simulate haptic sensations in virtual
environments without using haptic devices, but using properties
of the human visuo-haptic perception. Pseudo-haptic feedback
uses visual feedback to distort the haptic perception and
generate "haptic illusions". To date, pseudo-haptic feedback
has been used to simulate several haptic properties, such as
friction, stiffness, and texture. It has also been implemented
in several VR applications, such as vocational training
and medical simulation.
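A common way to realize such a visuo-haptic illusion is to manipulate the control/display (C/D) ratio: scaling down on-screen cursor motion inside a region makes that region feel "sticky" or resistant, although no force is actually applied. A minimal sketch, assuming a simple two-gain setup (the gain values and function name are illustrative):

```python
def display_motion(input_delta, over_friction_zone,
                   gain_normal=1.0, gain_friction=0.4):
    """Map physical input motion to on-screen cursor motion.

    Reducing the control/display gain inside a zone visually
    suggests friction or stickiness without any force feedback.
    """
    gain = gain_friction if over_friction_zone else gain_normal
    return input_delta * gain

# The same 10-unit hand movement yields less cursor travel
# inside the simulated friction zone.
print(display_motion(10, False))  # 10.0
print(display_motion(10, True))   # 4.0
```

The mismatch between felt hand motion and seen cursor motion is what the visual system resolves into an apparent haptic property.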
Vincent Hayward
Title: How to Use Properties of Human Perception to Simplify the
Different Components of Haptic Displays
In the past two years, the McGill University Haptics Lab has
been working on several new devices: a handheld vibrotactile
device ("MicroTactus"), a contact location display ("Morpheotron"),
a planar direct-drive force-feedback device ("Pantograph"), and
a high-density distributed tactile display ("StreSS"), each
targeted at a specific approach to mediating haptic interaction.
In this presentation, I will discuss recent enhancements to each
of these devices, what drives performance, and what provides
effectiveness in each case.
Gunter Niemeyer
Title: Event-Based Haptics
Haptic systems aim to allow users to touch virtual objects. A large
part of our normal sense of touch consists of the vibrations and
high-frequency transients we detect at our fingertips. These inform
us about material properties, surface features, relative motion,
stick/slip, and more. But today's systems do not render this
information; instead they show only low-frequency and quasi-static
forces. They make the virtual world feel soft and spongy, with all
surfaces appearing identical. This limitation can be traced to the
simple rendering algorithms inherited from decades of robotics.
Guided by the human's perceptual requirements, we have developed
the concept of Event-Based Haptics. It takes full advantage of
typical DC motors to display high-frequency transient forces when
an event occurs, such as an impact or motion across small surface
features. These transients are pre-computed or pre-recorded from
real events, adjusted as necessary, and presented without feedback
from the user. Their short duration (up to 50 or 100 ms) ensures
that all information is transferred before the user can react,
avoiding the need for feedback. Results have shown that users of
Event-Based Haptics were unable to distinguish a virtual surface
from an equivalent real one.
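The scheme described above can be sketched as overlaying a short, open-loop transient on the usual quasi-static spring force at the moment of contact. In this sketch the transient is a decaying sinusoid standing in for a pre-recorded impact signal; all parameter values are illustrative, not measured:

```python
import math

def render_force(penetration, t_since_impact, stiffness=500.0,
                 amp=2.0, freq=300.0, decay=60.0, duration=0.1):
    """Quasi-static spring force plus an event-based transient.

    penetration    -- tool penetration into the surface (m)
    t_since_impact -- time since the contact event (s); None if
                      no contact event has occurred
    The transient parameters (amp, freq, decay, duration) are
    hypothetical stand-ins for values fit to recorded impacts.
    """
    # Classic penalty-based quasi-static force
    force = stiffness * max(penetration, 0.0)
    if t_since_impact is not None and t_since_impact < duration:
        # Short open-loop transient, played back without feedback
        # from the user, finished before the user can react.
        force += amp * math.exp(-decay * t_since_impact) * \
                 math.sin(2 * math.pi * freq * t_since_impact)
    return force
```

Before the impact event, and again once the transient has died out, the output reduces to the ordinary spring force; only during the brief event window is the high-frequency component superimposed.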
Matthias Harders
Title: Assessing the Fidelity of Haptically Rendered Deformable Objects
A still open question in the area of surgical simulation is
the necessary degree of realism. It should be noted that the
final goal is not to achieve the highest possible realism, but
to provide sufficient realism to enable efficient training of
a specific surgical skill. An important element in this respect
is the haptics module of a simulator. Several components are
necessary to generate haptic feedback, including tissue
deformation models, tissue material parameters, collision
detection algorithms, coupling to the haptic display, and
the haptic device. As a first approach to assessing the
influence of these different elements on the fidelity of
haptic rendering, we carried out a simplified comparison
study, which included real linear elastic silicone samples
as well as virtually generated counterparts. In this
presentation an overview of this work will be provided.
This workshop is open to a heterogeneous audience:
- Interested researchers in the field of applied perception
- Designers of haptic devices and haptic software solutions
- Builders of haptic rendering environments
- End-users of haptic devices
Level of expertise
No specific prior knowledge or high level of expertise is required.