Chapter 24. Human and Machine Haptics
Human and Machine Haptics
Academic and Research Staff
Dr. Mandayam A. Srinivasan, Dr. Orly Lahav, Dr. David W. Schloerb
Visiting Scientists and Research Affiliates
Dr. Jianjuen Hu
Graduate and Undergraduate Students
Siddarth Kumar, Manohar Srikanth
Sponsors
National Institutes of Health (NINDS) – Grant R01-NS33778
National Institutes of Health (NEI) – Grant 5R21EY16601-2
Abstract
The work in the Touch Lab (formal name: Laboratory for Human and Machine Haptics) is guided by a
broad vision of haptics which includes all aspects of information acquisition and object manipulation
through touch by humans, machines, or a combination of the two; and the environments can be real
or virtual. We conduct research in multiple disciplines such as skin biomechanics, tactile
neuroscience, human haptic perception, robot design and control, mathematical modeling and
simulation, and software engineering for real-time human-computer interactions. These scientific and
technological research areas converge in the context of specific application areas such as the
development of virtual reality-based simulators for training surgeons, haptic aids for people who are
blind, real-time haptic interactions between people across the Internet, and direct control of machines
from neural signals in the brain.
Key Words
Haptics, touch, skin biomechanics, tactile neuroscience, haptic psychophysics, human-computer
interactions, virtual reality, medical training, brain-machine interfaces
Introduction
Haptics refers to sensing and manipulation through touch. Although the term was initially used by
psychologists for studies on active touch by humans, we have broadened its meaning to include humans and/or machines in real, virtual, or teleoperated environments. The goals of research
conducted in the Touch Lab are to understand human haptics, develop machine haptics, and enhance
human-machine interactions in virtual environments and teleoperation. Human Haptics is the study of
how people sense and manipulate the world through touch. Machine Haptics is the complementary
study of machines, including the development of technology to mediate haptic communication
between humans and computers as illustrated in the following figure.
In the figure, a human (left) senses and controls the position of the hand, while a robot (right) exerts
forces on the hand to simulate contact with a virtual object. Both systems have sensors (nerve
receptors, encoders), processors (brain, computer), and actuators (muscles, motors). Applications of
this science and technology span a wide variety of human activities such as education, training, art,
commerce, and communication.
Our research into human haptics has involved work on biomechanics of skin, tactile neuroscience,
haptic and multimodal psychophysics, and computational theory of haptics. Our research into
machine haptics includes work on computer haptics -- which, like computer graphics, involves the
development of the algorithms and software needed to implement haptic virtual environments -- as
well as the development of haptic devices. Applications of haptics that we have investigated include
methods for improving human-computer interaction as well as novel tools for medical diagnosis and
virtual reality-based medical training. An exciting new area of research we have initiated is the development of direct brain-machine interfaces, with which we succeeded in controlling a robot in our lab using neural signals transmitted in real time over the Internet from a monkey at Duke University. Another of our research results, one that made world news headlines, was the first demonstration of transatlantic touch, in which a user in our lab and a user in London collaboratively manipulated a virtual cube while feeling each other's forces on it. The following sections present summaries of our
work in the various research areas including descriptions of progress over the past year in our current
projects.
1. Biomechanics of Touch
Mechanics of the skin and subcutaneous tissues is as central to the sense of touch as optics of the
eye is to vision and acoustics of the ear is to hearing. When we touch an object, the source of all
tactile information is the spatio-temporal distribution of mechanical loads on the skin at the contact
interface. The relationship between these loads and the resulting stresses and strains at the mechanoreceptive nerve terminals within the skin plays a fundamental role in the neural coding of
tactile information. Unfortunately, very little is known about these mechanisms.
In the Touch Lab, we develop apparatus and perform experiments to measure the mechanical
properties of the skin and subcutaneous tissues. In addition, we develop sophisticated mechanistic
models of the skin to gain a deeper understanding of the role of its biomechanics in tactile neural
response. A variety of techniques have been used in our experiments, including videomicroscopy,
Optical Coherence Tomography (OCT), Magnetic Resonance Imaging (MRI), high frequency
Ultrasound Backscatter Microscope (UBM) imaging, and computer-controlled mechanical stimulators.
We use the empirical data to develop finite element models that take into account inhomogeneity in
the skin structure and nonlinearities in its mechanical behavior. Analysis of these models in contact
with a variety of objects generates testable hypotheses about deformations of skin and subcutaneous
tissues, and about the associated peripheral neural responses. Verifications of the hypotheses are
then accomplished by comparing the calculated results from the models with biomechanical data on
the deformation of skin and subcutaneous tissues, and with neurophysiological data from recordings
of the responses of single neural fibers. We are currently engaged in several projects in this area.
1.1 Characterization of material properties of primate and C. elegans tissues
The biomechanics of tissues plays a crucial role in tactile sensation. Since any load on the surface is
transmitted to mechanoreceptors through the tissues, studying their response is critical in
understanding general principles of mechanosensation. Continuing our work from last year, in which we characterized the viscoelastic properties of the primate finger by single-point indentation, we rebuilt our 3D FEM model in ADINA and are in the process of simulating our indentation experiments to estimate mechanistic viscoelastic parameters for the underlying tissue (Figure 3-1).
Figure 3-1: (a) Cross section of the multilayer viscoelastic model built in ADINA. (b) The completed 3D model that will be used for the simulations.
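As an illustration of the parameter-estimation step described above, the sketch below fits a two-term Prony series relaxation function to an indentation force-relaxation record in Python. The functional form, the synthetic data, and all numerical values are illustrative placeholders, not the material definitions or measurements used in the ADINA model.

    import numpy as np
    from scipy.optimize import curve_fit

    # Two-term Prony series for the relaxation response (placeholder form;
    # the ADINA model uses its own layer-wise material definitions).
    def prony(t, g_inf, g1, tau1, g2, tau2):
        return g_inf + g1 * np.exp(-t / tau1) + g2 * np.exp(-t / tau2)

    # Synthetic force-relaxation record at constant indentation depth,
    # standing in for a measured ramp-and-hold data set.
    t = np.linspace(0.0, 10.0, 200)                     # time, s
    f_meas = prony(t, 0.20, 0.15, 0.3, 0.10, 3.0)       # force, N
    f_meas += np.random.normal(0.0, 0.002, t.size)      # measurement noise

    # Least-squares fit of the Prony parameters to the relaxation curve.
    p0 = [0.1, 0.1, 0.5, 0.1, 5.0]
    popt, _ = curve_fit(prony, t, f_meas, p0=p0, maxfev=10000)
    g_inf, g1, tau1, g2, tau2 = popt
    print(f"G_inf={g_inf:.3f}  G1={g1:.3f} tau1={tau1:.2f} s  "
          f"G2={g2:.3f} tau2={tau2:.2f} s")

In practice the same fitting step would be applied to the force histories produced by the finite element simulations, iterating on the material parameters until simulation and experiment agree.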
In addition to improving the computational model of the primate fingertip by incorporating viscoelasticity, we have also initiated the study of touch in a new model
organism, the nematode C. elegans. The use of this nematode presents many advantages. Apart
from our goal of finding mechanical triggers that cause mechanoreceptors to respond (similar to what
we have been studying in humans and primates), the nematode gives us an opportunity to study the
role of genetics in touch sensation as well. We are also focusing on understanding the protein
machinery behind tactile sensation by employing techniques and tools similar to what we have
developed for our primate experiments. We investigated the biomechanics of C. elegans tissue using the force spectroscopy capabilities of an atomic force microscope (AFM). A modified cantilever tip, with a 10 μm glass bead attached to its end, was used to indent the nematode and obtain force and stiffness curves. The response was found to be linear within our indentation range, with a stiffness of 9.4 N/m, in agreement with the literature (Park et al., 2007¹). In the past we developed mechanistic 2D homogeneous finite element models of the primate fingertip (Srinivasan and Dandekar, 1996²), which later evolved into 3D layered finite element models with realistic geometry (Dandekar et al., 2003³). We are following a similar approach with the goal of developing an accurate model to explain the biomechanics of the nematode.
1. Park, S.-J., Goodman, M. B., and Pruitt, B. L. (2007). "Analysis of nematode mechanics by piezoresistive displacement clamp." PNAS, Vol. 104, No. 44, pp. 17376-17381.
2. Srinivasan, M. A. and Dandekar, K. (1996). "An investigation of the mechanics of tactile sense using two-dimensional models of the primate fingertip." Journal of Biomechanical Engineering, Vol. 118, pp. 48-55.
3. Dandekar, K., Raju, B. I., and Srinivasan, M. A. (2003). "3-D finite-element models of human and monkey fingertips to investigate the mechanics of tactile sense." Journal of Biomechanical Engineering, Vol. 125, pp. 682-691, ASME Press.
A simple 2D numerical model was developed to perform finite element simulations of the indentation experiment and to study the effect of material and geometric parameters on the resulting force curve. The elastic properties of the nematode were estimated by matching the force curves obtained from the numerical simulations to those obtained experimentally; the Young's modulus was found to be 35.67 MPa assuming near-incompressibility (Poisson's ratio = 0.49).
Figure 3-2: (a) Deflection curve of the nematode. The y-axis shows the deflection of the cantilever (d) and the x-axis shows the motion of the piezoelectric actuator (z). (b) The indentation depth of the nematode obtained by comparison with a reference force curve (indentation on glass).
Figure 3-3: (a) Schematic of the finite element simulation of the indentation experiment. (b) The response of the 2D model to indentation depths up to 3 μm. (c) The response of the 2D model for small strains, where a linear assumption holds.
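To make the force-curve analysis above concrete, the sketch below converts an AFM approach curve (piezo motion z, cantilever deflection d) into indentation depth and contact force and extracts the sample stiffness from a linear fit. The cantilever spring constant, the way the contact point is supplied, and the synthetic data are assumptions for illustration, not the calibration values or measurements from our experiments.

    import numpy as np

    # Placeholder cantilever calibration; the real spring constant comes from
    # the instrument calibration, not from this value.
    K_CANTILEVER = 3.0   # N/m

    def sample_stiffness(z, d, z_contact):
        """Estimate sample stiffness from an AFM approach curve.

        z         : piezo extension (m)
        d         : cantilever deflection (m), zero before contact
        z_contact : piezo position at contact (here assumed known; in practice
                    obtained by comparison with a reference curve on glass)
        """
        in_contact = z > z_contact
        # Indentation is the piezo travel past contact minus cantilever bending.
        delta = (z[in_contact] - z_contact) - d[in_contact]
        force = K_CANTILEVER * d[in_contact]
        # Linear fit F = k_sample * delta over the observed linear range.
        k_sample = np.polyfit(delta, force, 1)[0]
        return k_sample, delta, force

    # Synthetic approach curve standing in for a measurement: a 9.4 N/m sample
    # in series with the cantilever.
    z = np.linspace(0.0, 3e-6, 300)
    z0, k_s = 1e-6, 9.4
    d = np.where(z > z0, (z - z0) * k_s / (k_s + K_CANTILEVER), 0.0)
    print("estimated stiffness (N/m):", sample_stiffness(z, d, z0)[0])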
2. Tactile Neuroscience
Tactile neuroscience is concerned with understanding the neural processes that underlie the sense of
touch originating from contact between the skin and an object. Traditional studies have focused on
characterizing the response of mechanoreceptors in the skin to various stimuli such as vibrating
probes or indenting sharp edges. In contrast, we have tried to determine how object properties such
as shape, microtexture, and softness, and contact conditions such as slip, are represented in the
peripheral neural response.
Most of our work in this area has been done in collaboration with Dr. Robert H. LaMotte of the Yale
University School of Medicine. In the experiments, microelectrodes monitor the discharge rate of
tactile receptors in the skin of anesthetized monkeys while the surface of the skin is mechanically
stimulated. Computer-controlled stimulators press and stroke carefully designed objects on the
fingerpads. Frequently in conjunction with these neurophysiological measurements, we have also
performed psychophysical experiments with human subjects using the same apparatus.
3. Sensorimotor Psychophysics
Psychophysics is the quantitative study of the relationship between physical stimuli and perception. It
is an essential part of the field of haptics, from the basic science of understanding human haptics to
setting the specifications for the performance of haptic machines. Because the haptic channel is inherently bidirectional, it is also natural to extend psychophysical methods to the study of motor control, in this case investigating the relationship between intention and physical effect.
We have conducted pioneering psychophysical studies on compliance identification and
discrimination of real and virtual objects, and determined the human resolution (i.e., Just Noticeable
Difference, JND) in discriminating thickness, torque, stiffness, viscosity, and mass under a variety of
conditions. Furthermore, using the virtual environment systems that we have developed, we have
conducted psychophysical experiments under multimodal conditions, such as the effect of visual or
auditory stimuli on haptic perception of compliance. We have also conducted a number of studies on
the human ability to apply controlled forces on active and passive objects. Psychophysical
experiments related to the detection of extremely fine textures (75 nm high) and the detection of slip have also been performed in conjunction with neurophysiological measurements. Currently we are engaged in various tactile threshold measurements.
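As an example of how a JND can be read off discrimination data, the sketch below fits a two-alternative forced-choice psychometric function to proportion-correct data and takes the 75%-correct point as the threshold. The data values and the choice of a cumulative Gaussian are illustrative assumptions, not results from our experiments.

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    # Illustrative 2AFC stiffness-discrimination data (not measured values):
    # fractional increment of the comparison stimulus vs. proportion correct.
    increment = np.array([0.05, 0.10, 0.15, 0.20, 0.30, 0.40])
    p_correct = np.array([0.55, 0.62, 0.71, 0.80, 0.92, 0.97])

    # Psychometric function for 2AFC: 0.5 + 0.5 * Phi((x - mu) / sigma),
    # so performance is 75% correct exactly at x = mu.
    def psychometric(x, mu, sigma):
        return 0.5 + 0.5 * norm.cdf((x - mu) / sigma)

    (mu, sigma), _ = curve_fit(psychometric, increment, p_correct, p0=[0.15, 0.1])

    jnd = mu   # increment giving 75% correct responses in this 2AFC form
    print(f"JND (fractional increment at 75% correct): {jnd:.3f}")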
4. Haptic Device Development
Haptic devices are used to investigate, augment, or replace human haptic interactions with the world.
For example, haptic devices like the Instrumented Screw Driver (Figure 4-1) have been developed
and used in the Touch Lab to investigate human performance. The Instrumented Screw Driver was
used in an experiment to study a person's ability to sense and control torque.4 In the experiment,
subjects held the handle of the computer-controlled device in a pinch grasp and overcame a
preprogrammed resistive torque to rotate the handle. Other devices, like the Epidural Injection Simulator (Figure 4-2), have been developed in the lab to augment medical training.5 Using this
device, the trainee manipulates a syringe and feels realistic forces as he or she attempts to position
the needle and inject a fluid. Another example of augmenting performance is the development of machines that can be directly controlled by neural signals in the brain.6,7
4. Jandura, L. and Srinivasan, M. A. (1994). "Experiments on human performance in torque discrimination and control." In Dynamic Systems and Control, Vol. 1, Ed: C. J. Radcliffe, DSC-Vol. 55-1, pp. 369-375, ASME.
5. Dang, T., Annaswamy, T. M., and Srinivasan, M. A. (2001). "Development and evaluation of an epidural injection simulator with force feedback for medical training." Medicine Meets Virtual Reality Conference 9, Newport Beach, CA, January 2001.
6. Wessberg, J., Stambaugh, C. R., Kralik, J. D., Beck, P., Laubach, M., Chapin, J. K., Kim, J., Biggs, S. J., Srinivasan, M. A., and Nicolelis, M. A. L. (2000). "Adaptive, real-time control of robot arm movements by simultaneously recorded populations of premotor, motor and parietal cortical neurons in behaving primates." Nature, Vol. 408, No. 6810, pp. 361-365.
7. Nicolelis, M. A. L. and Chapin, J. K. (2002). "Controlling robots with the mind." Scientific American, Vol. 287, No. 4, pp. 46-53.
Figure 4-1: Instrumented Screw Driver
Figure 4-2: Epidural Injection Simulator
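To illustrate the kind of control used in a torque experiment like the one above, the following sketch shows a minimal haptic servo loop that renders a preprogrammed resistive torque opposing handle rotation. The device API (read_angle, command_torque), the update rate, and the torque values are hypothetical stand-ins, not the Instrumented Screw Driver's actual interface or settings.

    import time

    RESISTIVE_TORQUE = 0.06   # N*m, preprogrammed resistance (illustrative value)
    VELOCITY_BAND = 0.5       # rad/s, linear region to avoid chatter at rest
    LOOP_DT = 0.001           # s, roughly a 1 kHz haptic update rate

    def resistive_torque(omega):
        """Torque opposing handle rotation; linear inside a small velocity band
        so the command does not chatter when the handle is at rest."""
        if abs(omega) < VELOCITY_BAND:
            return -RESISTIVE_TORQUE * (omega / VELOCITY_BAND)
        return -RESISTIVE_TORQUE if omega > 0 else RESISTIVE_TORQUE

    def servo_loop(device, duration_s=5.0):
        """Read the handle angle, estimate its velocity, and command the torque."""
        prev_angle = device.read_angle()
        prev_time = time.perf_counter()
        t_end = prev_time + duration_s
        while time.perf_counter() < t_end:
            angle = device.read_angle()
            now = time.perf_counter()
            omega = (angle - prev_angle) / max(now - prev_time, 1e-6)
            device.command_torque(resistive_torque(omega))
            prev_angle, prev_time = angle, now
            time.sleep(LOOP_DT)

    # servo_loop(ScrewDriverDevice())  # hypothetical driver object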
Primarily, the development of haptic devices in the Touch Lab is driven by our need for new types of
experimental apparatus to study haptics and its applications. Our work in this area includes the
design and construction of new devices as well as the modification/enhancement of existing
apparatus to meet specific needs. Our current work on haptic devices focuses on the development of
tactile sensors, displays, and stimulators in connection with our projects related to Biomechanics of
Touch, Sensorimotor Psychophysics, and Brain Machine Interfaces.
5. Human Computer Interactions
An important general application of our research is the use of haptics to improve communication with,
or mediated by, computers. Just as the graphical user interface (GUI) revolutionized human-computer interaction (HCI) compared to earlier text-based interfaces in the early 1980s, adding haptics has the potential to significantly expand the communication channel between humans and computers in a natural and intuitive way. Specific goals range from the development of a standard haptic user
interface (HUI) for a single user to improved virtual environment and teleoperation systems with users
who collaborate over large distances.
5.1 BlindAid: A Virtual Reality System that Supports Acquisition of Orientation and Mobility
Skills by People who are Blind
Over the past few years, the MIT Touch Lab has developed and tested the BlindAid system, which combines 3D audio and haptic feedback. Users interact with virtual environments (VEs) through a haptic interface that enables them to touch and feel a VE with a hand-held stylus. The goal of the project is to develop a user-friendly system that allows people who are blind to explore and build cognitive maps of unknown virtual spaces through haptic exploration supported by audio feedback.
A paper describing the technical development has been accepted for presentation and inclusion in the
proceedings of the IEEE Haptics Symposium, March 25-26, 2010, Waltham, MA (Schloerb et al.
2010).
The system consists of a software package that we developed to provide a VE for people who are blind and a hardware station comprising a haptic interface and audio feedback devices, in addition to a visual display for the experimenter. The haptic interface allows the user to interact with the VE and serves two functions: it moves the avatar through the VE and it provides force feedback that gives the user cues about the space, similar to those generated by a white cane (Figure 5-1). In the study we used the Phantom (SensAble Technologies) as the haptic interface, which provides high-fidelity force generation and position tracking for the user.
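As a rough illustration of how force-feedback cues about walls can be rendered, the sketch below computes a penalty (spring) force that pushes a point avatar out of a wall box it has penetrated. This is a generic rendering method with placeholder stiffness and geometry, not the algorithm used in BlindAid.

    import numpy as np

    WALL_STIFFNESS = 800.0   # N/m, placeholder value

    def box_penalty_force(p, box_min, box_max):
        """Spring force pushing the avatar out of an axis-aligned box (a wall)."""
        inside = np.all(p > box_min) and np.all(p < box_max)
        if not inside:
            return np.zeros(3)
        # Push out along the axis of shallowest penetration.
        d_min = p - box_min
        d_max = box_max - p
        axis = int(np.argmin(np.minimum(d_min, d_max)))
        depth = min(d_min[axis], d_max[axis])
        direction = np.zeros(3)
        direction[axis] = -1.0 if d_min[axis] < d_max[axis] else 1.0
        return WALL_STIFFNESS * depth * direction

    # Example: stylus 2 mm inside a wall slab along x.
    wall = (np.array([0.0, 0.0, 0.0]), np.array([0.01, 0.5, 0.5]))
    print(box_penalty_force(np.array([0.008, 0.2, 0.2]), *wall))

Inside a 1 kHz haptic loop, forces like this one would be summed over all nearby walls and sent to the Phantom each cycle.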
Figure 5-1: (Left) Photograph of the BlindAid system in use. The user hears spatialized sounds, as if physically
standing in the simulated space such as a room in a building, while touching a small scale virtual model of the
space (~10 cm across) using the Phantom. The user may input commands via help keys on the computer
keypad. A video display allows sighted O&M instructors and researchers to monitor the user’s progress. (Right)
An image of the BlindAid system’s visual display showing a replay of a subject’s movements in one of the virtual
environments (VEs) in the usability tests. The VE shown simulates a real space at MIT where the participants’
navigation abilities were later tested. The visual display, which presents an orthographic view of the VE from
above, helps the experimenter conduct the experiments.
Over the last year, we continued analysis of our Stage 1 experiments to determine the robustness of the system and to gauge its usability and effectiveness in helping blind users build cognitive maps and traverse unknown real spaces after training on the system. These usability experiments focused on the effectiveness of the haptic and auditory feedback as well as the user navigation tools. In addition, the user's ability to explore the VE, to construct a cognitive map, and to apply this new spatial knowledge to orientation tasks in the real space was also tested.
Four adult human subjects who were totally blind volunteered to take part in the tests in accordance
with an approved human subject protocol. The subjects learned how to operate the system as part of
two studies, exploring 17 different VEs of increasing complexity. The subjects were tested after each
exploration in terms of how well they had been able to learn the unknown space. Specifically, in Study
1--after some initial tests that focused on training (VEs #1 and #2) and user preference with regard to haptic feedback (VEs #3 to #8)--the subjects were asked to describe VEs #9 to #13 verbally and to
build physical models of them using a modeling kit designed for the tests. In Study 2, in which the
VEs (#14 to #17) represented real physical spaces, the subjects were tested in their ability to navigate
in the real spaces after training only in the VEs. The experimental protocol in both studies also
involved a standard set of verbal questions to elicit statements from the subjects about their
preferences. In addition, the experimenter observed which features of the system were used and how
they were used. The sessions were video-recorded and the subjects’ explorations were also recorded
by the BlindAid system to aid the experimenter in this task.
Two journal papers are currently in preparation, presenting the results of the Stage 1 tests. In general,
all of the subjects were able to learn how to operate the system and use it to learn about unknown
spaces. They also demonstrated the ability to transfer and apply spatial knowledge gained in VEs to
navigation in real spaces.
The second stage of the project, conducted in collaboration with the Carroll Center for the Blind, a private, non-profit rehabilitation center based in Newton, MA, focused on integrating the BlindAid system with the traditional Orientation and Mobility (O&M) rehabilitation program taught at the Center. During this phase, we designed an experimental protocol and built ten VEs representing three main buildings on the Carroll Center campus along with the outdoor and surrounding areas of the center. The ten virtual environments were built based on the blueprints of the buildings and the surrounding area, along with available online maps (maps.live.com) (Figure 5-2). Seventeen participants took part in this study.
Figure 5-2: (a) Blueprint of the Carroll Center for the Blind campus. (b) Map of the campus from maps.live.com. (c) A preliminary VE of the campus constructed from the two maps above.
During the final study, the participants at the Carroll Center explored the VEs using various tools provided by the BlindAid system, including haptic and 3D audio aids and multi-scale environments (which enable haptic zooming to help the user feel the size of objects at different scales), and by using different exploration techniques. After each exploration of a virtual environment, the subjects were asked to describe the environment and to perform five orientation tasks in the real space. The results of these experiments are being analyzed. Additional research and development efforts will enable the transformation of this promising technology into a useful diagnostic tool that will allow a researcher or an O&M teacher to track and observe participants during their exploration while gaining insight into improving training procedures. In addition, the BlindAid system can be used to train O&M teachers.
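As an illustration of the multi-scale (haptic zooming) idea mentioned above, the sketch below maps device-workspace coordinates to world coordinates at a chosen scale, so the same hand motion can span a whole building when zoomed out or a single room when zoomed in. The transform and the design choice of leaving forces unscaled are assumptions for illustration, not BlindAid's implementation.

    import numpy as np

    def make_mapping(scale, world_center, workspace_center=np.zeros(3)):
        """Return functions mapping device coordinates to world coordinates and
        world-space forces back to the device, at a given zoom scale."""
        def device_to_world(p_dev):
            return world_center + (p_dev - workspace_center) / scale
        def world_force_to_device(f_world):
            # Forces are passed through unscaled so walls feel equally stiff
            # at every zoom level (one possible design choice).
            return f_world
        return device_to_world, world_force_to_device

    # Zoomed out: 10 cm of hand motion spans about 20 m of building (scale 1/200).
    to_world, _ = make_mapping(scale=0.005, world_center=np.array([10., 10., 0.]))
    print(to_world(np.array([0.05, 0.0, 0.0])))   # 5 cm of stylus travel -> 10 m in the world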
6. Medical Applications
Touch Lab research has a wide range of medical applications. On a fundamental level, our
investigations of human haptics offer insights into the functioning of the human body that should
ultimately lead to improved medical care. Many of the experimental techniques and apparatus
developed in these studies also have specific clinical uses that are explored in collaboration with
various medical researchers. The lab's primary medical focus, however, has been to develop machine
haptics and other virtual environment technologies for specific medical needs. The major thrust to
date has been the development of virtual reality-based medical simulators to train medical personnel,
similar to the use of flight simulators to train pilots.
We have developed an epidural injection simulator and a laparoscopic surgical simulator with novel
real-time techniques for graphical and haptic rendering. The epidural injection simulator, developed in
collaboration with Dr. Thiru Annaswamy of UT Southwestern Medical Center, Dallas, TX, has been
tested by residents and experts at two hospitals. It has been exhibited at the Boston Museum of Science, where the general public was able to experience the feel of performing a needle procedure without any risk to a patient. Another project we have pursued is the development of haptic and graphical rendering techniques in the context of laparoscopic esophageal myotomy (Heller myotomy).
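To give a sense of how a simulator of this kind can render needle-insertion forces, the sketch below computes an axial resistive force from a piecewise layered tissue model with a sudden loss of resistance at the epidural space. The layer depths, stiffnesses, and friction values are illustrative placeholders, not the parameters tuned for our epidural injection simulator.

    # Illustrative layered resistance profile for an epidural needle pass.
    LAYERS = [
        # (end depth m, stiffness N/m, shaft friction N) per tissue layer
        (0.005, 600.0, 0.3),    # skin
        (0.020, 250.0, 0.5),    # subcutaneous fat
        (0.035, 400.0, 0.8),    # supraspinous/interspinous ligament
        (0.040, 1200.0, 1.2),   # ligamentum flavum
    ]
    EPIDURAL_FORCE = 0.1        # N, sudden "loss of resistance" past the flavum

    def needle_force(depth):
        """Axial resistive force (N) at a given insertion depth (m)."""
        force = 0.0
        start = 0.0
        for end, stiffness, friction in LAYERS:
            if depth <= start:
                return force
            if depth < end:
                # Tip inside this layer: elastic resistance of the layer plus
                # friction accumulated from the layers already pierced.
                return force + stiffness * (depth - start)
            # Tip is past this layer: it now contributes only shaft friction.
            force += friction
            start = end
        # Past the ligamentum flavum: loss of resistance in the epidural space.
        return EPIDURAL_FORCE

    for d_mm in (2, 10, 30, 38, 42):
        print(d_mm, "mm ->", round(needle_force(d_mm / 1000.0), 2), "N")

In a full simulator this force profile would be combined with lateral constraints on the needle shaft and rendered at the haptic update rate as the trainee advances the syringe.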
Publications
Book Chapters, Published
Zimmer, R., Jefferies, J., and Srinivasan, M. A. (2008). "Touch technologies and museum access." In Touch in Museums, Ed: Helen J. Chatterjee, Berg Publishers.
Srinivasan, M. A. and Zimmer, R. (2009). "Machine haptics." In New Encyclopedia of Neuroscience, Ed: Larry R. Squire, Vol. 5, pp. 589-595, Oxford: Academic Press.
Meeting Papers, Published
Lahav, O., Schloerb, D., Kumar, S., and Srinivasan, M. A. (2008). "BlindAid: A virtual exploration tool for people who are blind." 13th Annual CyberTherapy Conference, June 23-25, 2008, San Diego, CA.
Lahav, O., Schloerb, D., Kumar, S., and Srinivasan, M. A. (2008). "BlindAid: A learning environment for enabling people who are blind to explore and navigate through unknown real spaces." Presented at Virtual Rehabilitation 2008, Vancouver, Canada.
Kyung, K.-U., Lee, J.-Y., and Srinivasan, M. A. (2009). "Precise manipulation of GUI on a touch screen with haptic cues." Proceedings of the Third World Haptics Conference, Salt Lake City, UT, 2009.
Lahav, O., Schloerb, D., and Srinivasan, M. A. (2009). "Integrating the BlindAid system in a traditional Orientation and Mobility rehabilitation program." Presented at Virtual Rehabilitation 2009, Haifa, Israel.
Meeting Papers, In Press
Schloerb, D. W., Lahav, O., Desloge, J. G., and Srinivasan, M. A. (2010). "BlindAid: Virtual environment system for self-reliant trip planning and orientation and mobility training." Accepted for inclusion in the proceedings of the 2010 IEEE Haptics Symposium, March 25-26, Waltham, MA.