A Haptic Interaction Method for Volume Visualization
Ricardo S. Avila and Lisa M. Sobierajski
GE Corporate Research & Development
Schenectady, New York 12345
Abstract
Volume visualization techniques typically provide support for visual exploration of data; however, additional information can be conveyed by allowing a user to see as well as feel virtual objects. In this paper we present a haptic interaction method that is suitable for both volume visualization and modeling applications. Point contact forces are computed directly from the volume data and are consistent with the isosurface and volume rendering methods, providing a strong correspondence between visual and haptic feedback. Virtual tools are simulated by applying three-dimensional filters to some properties of the data within the extent of the tool, and interactive visual feedback rates are obtained by using an accelerated ray casting method. This haptic interaction method was implemented using a PHANTOM haptic interface.
1. Introduction
Traditional methods for visualizing volumetric data rely almost entirely on our powerful sense of vision to convey information. While this has proven quite effective for most visualization tasks, it remains worthwhile to investigate the benefit of augmenting these visualization methods with information obtained through other sensory channels. In particular, our sense of touch, in combination with our kinesthetic sense, is capable of supplying a large amount of information about the structure, location, and material properties of objects [6]. The study of the many issues related to interaction with an environment through the sense of touch is known as haptics.

Haptic rendering is the name given to the process of feeling virtual objects [13]. This involves tactile feedback for sensing properties such as surface texture, and kinesthetic feedback for sensing the shape and size of objects. Traditional methods for producing convincing haptic renderings have mainly utilized scenes comprised of geometric primitives such as polygons, spheres, and surface patches. These investigations have generally focused on simulating realistic interactions with static and dynamic collections of geometric objects given the capabilities and limitations of haptic devices. Although the benefits of haptic rendering of volume data have been recognized [8], this area of research has not yet been fully explored.

Haptic interaction has been successfully applied to simulate specific tasks in several application areas. In molecular docking studies, a robotic arm was used to supply molecular interaction forces [3]. In another application, a haptic device was used as a nanomanipulator for a scanning tunneling microscope [16], enabling scientists to manipulate individual atoms on a surface. A medical planning and training system [4] has also been developed which simulates knee palpation through the use of visual and haptic feedback.

One goal of the work presented in this paper is to develop a haptic interaction method for use in a volume visualization system. There are several reasons to pursue the addition of haptic cues to volume visualization. The use of a force feedback device during visualization is a natural output method for interactively conveying complex information to the user. This is particularly useful when the user attempts to precisely locate a feature within a volume, or to understand the spatial arrangement of complex three-dimensional structures.

A second goal of this haptic interaction method is to allow the haptic device to be used for input as well as output. The position and orientation of the haptic device could be used to simulate a virtual tool that could modify local properties of the volume dataset. For example, this modification ability can be employed during data exploration to alter the visibility of one structure to allow visual access to another. Data modification can also be used for three-dimensional annotation of scientific data. In addition, the field of volume graphics [9] can benefit from a haptic data modification method that allows for interactive, virtual volume modeling [17].

This paper is organized into eight sections. An overview of the haptic interaction method is given in Section 2. The volume data representation used for visual and haptic interaction is covered in Section 3. We discuss our haptic rendering method in Section 4, while the corresponding volume rendering method is described in Section 5. Techniques and tools for data modification are covered in Section 6. Our implementation and some results of this method are given in Section 7, while Section 8 concludes the paper with a discussion of future work.
2. System Overview
We identified several major requirements that must
be met by a haptic visualization method in order to provide
meaningful force feedback and data modification capabilities.
• Constant haptic refresh rate: Large variations in the rate at which forces are updated can produce distracting tactile artifacts.

• Fast force calculations: Complex force computations would reduce the haptic refresh rate and would therefore decrease the amount of processing time available for rendering and data modification.

• Fast, incremental rendering: Interactive render rates are necessary for visual feedback of the haptic pointer location and data modification, and the time cost of rendering must be amortized over a number of force feedback iterations to maintain a consistent haptic refresh rate.

• Fast data modification: Interactive data modification rates are required for both visual and force feedback.

• Consistent haptic and volume rendering: Volume rendering and haptic rendering should be consistent. A structure which appears amorphous should also feel amorphous.
To satisfy these five major requirements, we made some assumptions about the force feedback, rendering, and data modification computations that could occur during haptic interaction. First, the force feedback and data modification calculations are restricted to using only a local area of data values. In addition, viewing, lighting, and global material properties are fixed during haptic interaction, and a local data modification operation must affect only a small region of the displayed image.
Based on the major requirements of the system, and
the assumptions that were made, we developed the haptic
visualization method illustrated in Figure 1. The haptic
interaction loop begins after an initial image of the scene
has been computed. The first step in the interaction loop is
to obtain the current position and orientation of the haptic
pointer from the physical device. We will refer to the virtual counterpart of this physical pointer device as the
“tool”, since it will often be used, for example, as a virtual
scalpel, chisel, or paintbrush. If we determine that a data
modification operation is necessary at this time, the modification computation is performed, and the volume database is updated. In addition, the pixel status buffer is
updated to indicate which pixels of the image have been
affected by this modification operation. Once this optional
modification step is complete, the current force is computed and supplied to the haptic device. During the rendering phase, some small number of the pixels that require
updating are rendered using a ray casting method. Finally,
if it is time to refresh the physical display device, the current image is copied to the screen and the graphics hardware is used to render a geometric object that indicates the size, position, and orientation of the current tool.

Figure 1: An overview of the haptic visualization method. Solid lines indicate control flow while dashed lines show data flow.
The data modification operation does not occur during every iteration of the haptic interaction loop. Instead, a
timer indicating elapsed time since the previous modification is consulted during each iteration. If this elapsed time
exceeds some threshold, the modification operation is performed. The main reason for limiting the rate at which
data modification occurs is that we are maintaining a haptic refresh rate of 1 to 5 KHz. Therefore, there is only a
small amount of computational time left over after the
force calculation in each iteration. Increasing the rate of
data modification would decrease the amount of time
available to update the pixels of the image affected by the
modification.
The rate at which the physical display device is refreshed is also limited by an elapsed time threshold. In this case, a 30 Hz limit is imposed since a refresh rate much greater than this is unnecessary.
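To make the control flow of Figure 1 concrete, the following C++ sketch shows one plausible organization of the haptic interaction loop under the rate limits described above. All type and function names here (HapticDevice, PixelStatusBuffer, and so on) are illustrative stand-ins, not the interfaces of our implementation.

#include <chrono>

// Illustrative stand-in types; a real system would wrap the actual
// device and volume libraries.
struct Vec3 { double x, y, z; };
struct Pose { Vec3 position, direction; };
struct HapticDevice {
  bool isActive() const;
  Pose readPose() const;
  void applyForce(const Vec3& f);
};
struct Volume;
struct Image;
struct PixelStatusBuffer;

Vec3 computeForce(const Volume&, const Pose&);
bool modificationRequested(const Pose&);
void modifyVolume(Volume&, const Pose&);
void flagAffectedPixels(PixelStatusBuffer&, const Volume&, const Pose&);
void updateSomePixels(const Volume&, Image&, PixelStatusBuffer&);
void copyImageToScreen(const Image&);
void drawToolGeometry(const Pose&);

void hapticInteractionLoop(HapticDevice& device, Volume& volume,
                           Image& image, PixelStatusBuffer& status) {
  using clock = std::chrono::steady_clock;
  auto lastModify  = clock::now();
  auto lastRefresh = clock::now();

  while (device.isActive()) {
    Pose tool = device.readPose();        // 1. sample the physical device

    // 2. optional data modification, rate limited (e.g. 10 Hz) so that
    // most of each 1-5 kHz iteration remains available for force
    // computation and rendering
    if (modificationRequested(tool) &&
        clock::now() - lastModify > std::chrono::milliseconds(100)) {
      modifyVolume(volume, tool);                // update volume database
      flagAffectedPixels(status, volume, tool);  // mark pixels to recompute
      lastModify = clock::now();
    }

    // 3. force output occurs on every iteration to keep the refresh constant
    device.applyForce(computeForce(volume, tool));

    // 4. incremental rendering: re-cast rays for a few flagged pixels
    updateSomePixels(volume, image, status);

    // 5. display refresh capped near 30 Hz
    if (clock::now() - lastRefresh > std::chrono::milliseconds(33)) {
      copyImageToScreen(image);
      drawToolGeometry(tool);      // hardware projection of the tool
      lastRefresh = clock::now();
    }
  }
}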
3. Data Representation
A volume is represented as a 3D rectilinear array of volume elements, or voxels, each specifying a set of scalar properties at a discrete grid location. An interpolation function is used to produce a continuous scalar field for each property. This is critical for producing smooth volume and haptic rendering.
In order to meet the requirements of the system, the
contents of each voxel must contain a large number of
physical properties. These include a scalar value for density, values for material classification and shading properties, and values for mechanical properties such as stiffness and viscosity. In addition, it is often desirable to
precompute and store values that do not often change such
as density gradients and partial shading results for each
voxel. Given unlimited memory resources, each voxel
would contain high precision storage for each of these
parameters.
One possibility that we considered was to store the
volume in a space-efficient, hierarchical data structure
such as an octree, thereby reducing storage requirements
in empty areas of the volume. There are two problems
with this approach. First, maintaining the requirement of a
consistent haptic refresh rate is far easier when the time
required to read and modify voxel values is constant. Second, in many cases data modification operations will lead
to datasets that can no longer be stored efficiently in hierarchical data structures.
When defining the values stored in each voxel in the
rectilinear grid, we considered interactive rendering rates
to have highest priority, since that is a direct requirement
of the system. The ability to render large datasets was
given the next highest priority, and property modification
flexibility was considered last. From these priorities, we
obtained the definition of a voxel requiring 8 bytes as shown in Table 1. Three bytes are allocated for gradient magnitude and direction in order to save time during volume rendering. A 24-bit color, rather than a color LUT, is stored within each voxel to ensure that compositing and painting operations on volumes preserve fine details. A collection of material properties is indexed through a look-up table. An entry in the table specifies additional characteristics such as material classification and shading parameters.
Haptic properties such as stiffness and viscosity may also
be assigned to an index.
Material opacity is generally stored in a table
indexed by the LUT index, the density value, or both the
density and gradient magnitude values. Material classification, shading properties, and haptic properties are typically
obtained through a segmentation process applied to the
density values.
Table 1: Voxel Representation

Property              Size (bytes)   Type
Density               1              Scalar
Gradient Direction    2              Encoded Unit Vector
Gradient Magnitude    1              Scalar
Color                 3              RGB
Material Properties   1              LUT Index
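The 8-byte record of Table 1 maps naturally onto a packed struct, with the dependent material and haptic properties shared through a look-up table as described above. The following C++ sketch uses our own field names; the exact field ordering and encodings (for example, of the gradient direction) are assumptions.

#include <cstdint>

#pragma pack(push, 1)
struct Voxel {                       // 8 bytes total, as in Table 1
  std::uint8_t  density;             // scalar density value
  std::uint16_t gradientDirection;   // encoded unit vector
  std::uint8_t  gradientMagnitude;   // scalar
  std::uint8_t  color[3];            // 24-bit RGB stored directly per voxel
  std::uint8_t  materialIndex;       // index into the material property LUT
};
#pragma pack(pop)
static_assert(sizeof(Voxel) == 8, "voxel layout must remain 8 bytes");

// Dependent properties reached through the per-voxel index. Table entries
// are not modified during interaction, since that would change the
// appearance of the volume globally; tools change a voxel's index or
// density value instead (Section 6).
struct MaterialProperties {
  std::uint8_t classification;        // material classification
  float ambient, diffuse, specular;   // shading parameters
  float stiffness, viscosity;         // haptic parameters
};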
4. Haptic Rendering
The system allows for the exploration and modification of both isosurface and translucent volumes. The
forces generated can either be constructed to approximate
a realistic feel of a virtual object or to convey meaningful
structural information for data exploration purposes. In the
case where the user wishes to explore internal structures of
a rigid body, such as bone, it is desirable to produce contact forces and visual characteristics which are inconsistent with bone, but that allow the user to penetrate the bone
to feel and see internal structure.
The force equations are based on two principal
requirements. First, the interaction forces must be calculated fast enough to be used within an interactive system.
Typically, force update rates of 1-5 kHz are generated for
this system. Second, the forces imparted to the user should
be consistent with the rendering of the volumetric object.
In order to meet the speed requirement, and since the haptic device we used can only handle translation forces, the force calculation is simplified to a point contact. This has been shown to be a reasonable simplification for many tasks [12]. The general equation we used for feeling an object using a point contact model is:

$\vec{F} = \vec{A} + R(\vec{v}) + S(\vec{N})$

and is illustrated in Figure 2. The force $\vec{F}$ supplied to the user located at position $P$ and moving in direction $\vec{v}$ is equal to the vector sum of an ambient force $\vec{A}$, a motion retarding force $R(\vec{v})$, and a stiffness force normal to the object $S(\vec{N})$.

Figure 2: Forces acting on a haptic sensing point P which is moving at a velocity $\vec{v}$.
The ambient force is the sum of all forces acting on
the tool that are independent of the volumetric data itself.
Some forces such as gravitational or buoyant forces are
independent of the tool position while other forces such as
synthetic guide forces, which aid the user during interactive volume modification, are dependent on position. For
example, a virtual plane perpendicular to a surface can be
used as a guide when attempting to cut a straight line. The
ambient force would be used to guide the tool back to the
plane.
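As a concrete example of a position-dependent ambient force, the guide plane mentioned above can be implemented as a simple spring pulling the tool back toward the plane. The gain k and the vector helpers below are our own choices, shown as a sketch rather than our actual implementation.

struct Vec3 { double x, y, z; };

Vec3 operator*(const Vec3& v, double s) { return {v.x * s, v.y * s, v.z * s}; }
Vec3 operator+(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator-(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Spring-like ambient force that pulls the tool back onto a guide plane
// defined by a point on the plane and its unit normal; k is a gain.
Vec3 planeGuideForce(const Vec3& toolPos, const Vec3& planePoint,
                     const Vec3& planeNormal, double k) {
  double dist = dot(toolPos - planePoint, planeNormal);  // signed distance
  return planeNormal * (-k * dist);                      // restoring force
}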
The motion retarding force is proportional to velocity and can be used to represent a viscous force. The last term captures the stiffness of the object and is always in the direction of the local gradient. When simulating interaction with rigid surfaces, which are generally governed by Hooke's law, this term can be set to a linear force function in the direction of the surface normal, proportional to the penetration distance of point $P$.
This general equation for force feedback is the basis
for calculating forces which are consistent with different
types of rendering methods. The forces generated are not
intended to be a realistic simulation of interacting with
materials. Rather, the intent is to convey additional information to the user about the data being explored.
The display of volume data requires a segmentation step in order to determine the visual appearance of the projected volume. In a similar manner, we introduce a segmentation step which assigns tactile properties to the volume. In order to ensure consistency between visual and haptic rendering, the transfer functions used for the assignment of visual and tactile properties are similar.
4.1 Haptic Volume Rendering

When rendering a translucent volume we employ a gradient magnitude segmentation method [11] in order to assign opacities. The segmentation method specifies an opacity transfer function $\alpha = t_\alpha(d, |\nabla d|)$, where the opacity value $\alpha$ at a sample location is defined by both the material density and the magnitude of the density gradient at that location. The function $t_\alpha$ can be specified in a number of different ways. As was done in [10], we compute $\alpha$ values by multiplying a density transfer function by a gradient magnitude transfer function. In order to keep volume and haptic rendering consistent, force transfer functions $t_r$ and $t_s$ are constructed which are similar to $t_\alpha$, but produce force magnitudes rather than opacities. The retarding and stiffness force functions for haptic volume rendering become:

$R(\vec{v}) = -t_r(d, |\nabla d|)\,\vec{v}$
$S(\vec{N}) = t_s(d, |\nabla d|)\,\vec{N}$

The normal vector $\vec{N}$ is computed using central differences. We found that a linear correspondence between the visual transfer function and the haptic transfer functions produced an intuitive force response, as in:

$t_r(d, |\nabla d|) = C_1\,t_\alpha(d, |\nabla d|) + C_2$
$t_s(d, |\nabla d|) = C_3\,t_\alpha(d, |\nabla d|)$

Essentially, the more opaque a material, the greater its stiffness and motion retarding properties. The stiffness function has an implied zero additive constant to ensure that the initial contact with an object starts from a zero force.

Other mappings of the opacity transfer function may be suitable depending on the type of forces required. For instance, an exponentially increasing opacity transfer function may be translated into a linear force response through the use of a logarithmic function.

If our intent was to simulate a realistic haptic and visual rendering of a volume, then the segmentation of the volume into all relevant material characteristics would be necessary. The properties in the new representation would replace the approximations found in the visual and haptic transfer functions.
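Putting these pieces together, a per-iteration force evaluation for haptic volume rendering might look as follows. This is a sketch reusing the Vec3 type and operators defined earlier; the sampling helpers and the opacity transfer function are assumed interfaces, and C1-C3 are the linear mapping coefficients above.

// Haptic volume rendering force: F = A + R(v) + S(N), with t_r and t_s
// derived linearly from the opacity transfer function t_alpha.
struct VolumeField {
  double density(const Vec3& p) const;            // interpolated d
  double gradientMagnitude(const Vec3& p) const;  // interpolated |grad d|
  Vec3   normal(const Vec3& p) const;             // central-difference unit N
};

double opacityTransfer(double d, double gmag);    // t_alpha(d, |grad d|)

Vec3 volumeRenderingForce(const VolumeField& vol, const Vec3& P,
                          const Vec3& v, const Vec3& ambient,
                          double C1, double C2, double C3) {
  double alpha = opacityTransfer(vol.density(P), vol.gradientMagnitude(P));
  double tr = C1 * alpha + C2;  // retarding magnitude t_r
  double ts = C3 * alpha;       // stiffness magnitude t_s, zero at first contact
  Vec3 R = v * (-tr);           // motion retarding force R(v)
  Vec3 S = vol.normal(P) * ts;  // stiffness force S(N) along the gradient
  return ambient + R + S;       // F = A + R + S
}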
4.2 Haptic Isosurface Rendering

The fast and robust calculation of stiffness and motion retarding forces is essential when interacting with volumetric isosurfaces. Unfortunately, the stiffness computation requires that the penetration distance of the tool below the isosurface is available at every location in the volume. While it is possible to precompute the distance to an isosurface for every voxel in the volume, we decided to investigate techniques for approximating stiffness and retarding forces based only on the density field. There are two reasons for this. First, the system allows for the interactive modification of the volume. Creating a new distance map for the volume would be prohibitive. Second, for small penetration distances, the density field itself can give a reasonable approximation of the distance to an isosurface.

Similar to volume rendering, the retarding and stiffness force functions used to feel a surface are dependent on transfer functions:

$R(\vec{v}) = -f_r(d)\,\vec{v}$
$S(\vec{N}) = f_s(d)\,\vec{N}$
Here the density $d$ is used as an indicator of penetration distance in the thin shell between the isosurface density values $d_i$ and $d_j$, where $d_i < d_j$. The function $f_r(d)$ maps density values into retarding force magnitudes while $f_s(d)$ maps density values into stiffness force magnitudes. We set these functions to:

$f_r(d) = \begin{cases} C_4(d - d_i) + C_5 & d_i < d \le d_j \\ 0 & \text{otherwise} \end{cases}$

$f_s(d) = \begin{cases} C_6 \frac{d - d_i}{d_j - d_i} & d_i < d \le d_j \\ 0 & \text{otherwise} \end{cases}$

The retarding force is set to a linear function proportional to the difference in density above $d_i$. Similar to haptic volume rendering, the coefficients $C_4$, $C_5$, and $C_6$ specify a linear mapping from density values to force magnitudes. The stiffness force varies from zero to $C_6$ depending linearly on where the value $d$ lies between $d_i$ and $d_j$. This can be viewed as a penetrable shell model with viscous internal properties. A nice property of this model is that it allows the user to feel subsurface structure when the density and normal vector change below the surface.
5. Rendering
In order to provide fast rendering of isosurfaces and translucent volumes, we use a volumetric ray tracing method [15] to generate images of the volumetric data. This method is flexible since it allows for the rendering of multiple independent, possibly overlapping volumetric objects. If the viewing position is fixed and global effects are ignored during the haptic interaction loop, then data modification operations correspond to local image updates within the image-space extent of the modified region of the volume.

When modeling a solid object, an isosurface representation is a natural choice for both force feedback and rendering. To produce high quality images, the ray tracer computes the analytical intersection of the ray with the isosurface as defined by the interpolation function, and a central differences technique is employed to estimate surface normals. When modeling amorphous objects such as smoke and clouds, a volumetric feedback equation and rendering method are required. Images are generated by sampling material properties along a ray, and compositing them to produce a final pixel intensity value.

Using a standard ray tracing method, the computation required to update even a small region of the image (typically $30^2$ to $50^2$ pixels for a $512^2$ image) may be too slow for interactive object modification. Therefore an acceleration method must be employed to achieve interactive frame update rates. In this haptic interaction method, the Polygon Assisted Ray Casting (PARC) method [14] is employed to avoid casting rays through empty regions of the volume. This acceleration method relies on a projection of a geometric approximation of the volume to avoid segments of rays that pass through empty regions of the volume. We use the standard hardware projection method to render the full geometric approximation when the initial image is generated. The geometric approximation is updated whenever the scalar field of a volume is altered by a data modification operation. A software projection method is then used to update only the affected pixels with the new geometric approximation.

When data modification occurs, flags are set to indicate which pixels must be recomputed. During each iteration of the haptic interaction loop, some small number of pixels are updated (typically less than 10), and the flags for these pixels are cleared. When the display device is refreshed, the current image may contain some pixels which have not yet been updated, producing results similar to those obtained by frameless rendering techniques [2]. Typically these effects are not noticed since the tool obscures the region of the volume being modified for a few frames. By the time this region becomes visible in the image, these pixels have been updated.

The current image is stored in the image buffer, with a color, opacity, and depth value stored for each pixel location. The depth value indicates the distance from the image plane at which the ray cast through that pixel accumulated an opacity value greater than $V_t$.
When the display device requires refreshing during the haptic interaction loop, this image is copied into the color and depth components of the framebuffer. Fast projection of the tool is then obtained through the use of graphics hardware. For opaque surfaces, the opacity values stored in the pixels are binary, so $V_t \neq 0$ is equivalent to $V_t = 1$, and this method correctly combines the ray traced image of the volume with the projection of the tool. When the tool is within a translucent volume, the combination provides only an approximate image since the opacity value stored in a pixel represents the final opacity of all accumulated samples, and the distance value represents the location along the ray at which the accumulated opacity reached $V_t$. In our experience, adequate images are generally produced with $V_t = 0.25$. If more accurate combined images are required during the modification of internal features in a translucent volume, they can be generated at the cost of memory or computation time.
6. Data Modification
The data modification component of our haptic visualization method is an optional component that can be
used to modify any local property of the data stored in the
volume database. A local property is one that affects only
a small region of the volume, and therefore will cause only
a small region of the image to require updating. As
described in Section 3, three modifiable values are stored
for each data sample: material density, color, and an index
value. Density and color are independent data properties.
Stiffness, material classification, and material shading
parameters such as opacity and ambient, diffuse, and specular reflectivity are dependent properties stored in look-up
tables that are indexed by either the index value, or the
density value. The values in the look-up tables cannot be
modified since this would result in global changes in the
appearance of the volume. Instead, the index value or density value used to index the look-up table is modified.
Independent data properties can be modified by setting them to a constant value, or by applying a filter to the
current values. The index value defining dependent data
properties is generally only modified using the constant
value method unless there is a linear relationship between
this index value and all the properties represented by this
value. For the independent color values, we could use the
constant value method to “paint” an object red by setting
the color property at the tool location, or some small
region centered at the tool location, to red. In addition, we
could define the material properties of the paint by also
setting the index value, which would change all properties
dependent on the index value simultaneously.
With the filter method, we could "melt" an object by updating density values in a small region around the tool location according to $d_i = (1 - \alpha)d_{i-1}$, where $d_i$ is the new density value, $d_{i-1}$ is the current density value, and $\alpha$ is obtained by sampling a spherical filter with a Gaussian distribution centered at the tool location. In contrast to melting, we can "construct" an object using $d_i = \alpha D + (1 - \alpha)d_{i-1}$, where $D$ is the density of the material that we are adding to the volume. Note that melting is just a special case of constructing with $D = 0$. In fact, constructing will appear like melting whenever the opacity of $d_i$ is less than the opacity of the density $d_{i-1}$ that it is replacing.

The two modification methods described above can lead to a wide variety of tools by varying the constant value or filter type, and the data properties that are modified. The most difficult part of defining a new tool is selecting a tool name, since there are many possible virtual tools that do not have physical counterparts. In our system, a modification operation is defined by providing the modification method, the extent of modification, the properties affected, and any necessary filter and constant values. A tool is composed of one or more modification operations, with the condition that no property is affected by more than one modification operation in a tool. Figure 3 illustrates the use of some common tools on a volumetric wall. These tools are described briefly in Table 2, where the first column indicates a tool name, the second column lists the modification method used for each operation, the third column defines the properties that are modified for each operation, and the last column describes the modification process. Specific constants for the instance of the tools shown in Figure 3 are given in parentheses in the last column.

Figure 3: An example of various tools applied to a volumetric wall.

Table 2: Common Tools

Tool        Method   Properties   Description
Melt        Filter   Density      remove density
Construct   Filter   Density      add density (63% dense)
Stamp
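A sketch of the construct operation follows (melting is the D = 0 special case), using a spherical Gaussian filter. The filter radius, the sigma parameter, and the volume accessors are our assumptions, not the paper's actual tool definitions.

#include <cmath>
#include <cstdint>

struct Vec3 { double x, y, z; };

struct DensityVolume {
  std::uint8_t density(int x, int y, int z) const;
  void setDensity(int x, int y, int z, std::uint8_t d);
};

// Blend material of density D into a spherical region around the tool:
// d_new = a*D + (1 - a)*d_old, where a is a Gaussian falloff with distance.
void constructTool(DensityVolume& vol, const Vec3& tool,
                   double radius, double sigma, std::uint8_t D) {
  int x0 = (int)std::floor(tool.x - radius), x1 = (int)std::ceil(tool.x + radius);
  int y0 = (int)std::floor(tool.y - radius), y1 = (int)std::ceil(tool.y + radius);
  int z0 = (int)std::floor(tool.z - radius), z1 = (int)std::ceil(tool.z + radius);
  for (int z = z0; z <= z1; ++z)
    for (int y = y0; y <= y1; ++y)
      for (int x = x0; x <= x1; ++x) {
        double dx = x - tool.x, dy = y - tool.y, dz = z - tool.z;
        double r2 = dx * dx + dy * dy + dz * dz;
        if (r2 > radius * radius) continue;               // outside tool extent
        double a = std::exp(-r2 / (2.0 * sigma * sigma)); // Gaussian weight
        double dNew = a * D + (1.0 - a) * vol.density(x, y, z);
        vol.setDensity(x, y, z, (std::uint8_t)std::lround(dNew));
      }
}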
7. Implementation and Results
The methods discussed in this paper were implemented on a Silicon Graphics Indigo2 Extreme workstation with a 200 MHz R4400 processor and 96 MB of RAM. A 1.5 PHANTOM haptic interface [12] was used to provide force feedback. The PHANTOM was connected directly to the workstation through an EISA bus.
The haptic interaction loop described in Section 2 was implemented as a single process, and was responsible for computing forces, modifying the volume, and rendering. We typically visualized volumetric objects using a $512^2$ image, with data modification rates of 10 Hz, image update rates of 20 Hz, and force update rates of 5 kHz. Faster data modification rates are possible, but are not practical due to the precision limitations imposed by representing density and the components of color as 8-bit values. The force update rate is higher than the minimum required rate of 1 kHz, and could potentially be reduced to 3 kHz to provide more processing power for rendering a larger image.
Special care was taken to ensure that the system
would not produce unsafe forces. At the start of a haptic
session, the system computes forces but does not apply
them until a force within an acceptable range is calculated.
This prevents a user from starting haptic feedback in an
area of high force magnitude. In addition, all forces are
checked against a maximum force threshold. If this value
is exceeded, the system shuts down haptic interaction.
Another concern that we faced was the ergonomics
of the system. During haptic interaction, the tight link
between visual and tactile feedback can often trick a user
into believing that she is holding a real tool, and interacting with a physical object. This illusion is shattered as
soon as the user attempts to rest her hand against the
object for greater control of small movements. We have
found that providing places for the user to rest her elbow
and wrist can help to maintain this illusion of physical
reality. These resting places are also necessary to help
reduce fatigue and muscle strain that could be caused by
prolonged use of a haptic device.
Figure 4 illustrates the use of the system in understanding a complex set of dendrites emanating from a lateral geniculate nucleus (LGN) cell. This 256 x 256 x 195
voxel LGN cell was scanned with a confocal microscope,
and volume and haptic rendered as discussed in Section
4.1. The ability to feel as well as see the dendrites provides
a large amount of additional information when visualizing
this data. Haptic feedback proved useful for determining the paths of intertwined dendrites. However, a problem was encountered when following a dendrite, since it was easy to slip off and have to feel one's way back to resume exploration. One solution to this
problem was to invert the mapping of opacities to force
magnitudes such that the high opacities attracted the tool.
This made following winding dendrites a far easier task.
Another example of where haptic interaction can be
used to explore scientific data is shown in Figure 5. A
152 x 261 x 220 CT data set of the Visible Human's foot
was segmented and visualized as a skin surface and a bone
surface. Initially, the bone surface is completely contained
within the skin, and is therefore not visible in the image.
An “invisibility” tool is used to set the LUT indices on a
portion of the skin surface to a value that represents an
invisible material, revealing the inner bone structure for
visual inspection. Since the density values have not been
modified, force feedback is still provided for the transparent skin regions. With a strong input force, the user can
“poke” through the skin surface to explore internal features.
The ability to feel and modify a volume rendered
object was used to annotate and cut a 256 x 256 x 225
voxel CT head shown in Figures 6 and 7. Figure 6 shows
the tool cutting into the surface of the skull, revealing interior regions of bone. Prior to cutting, a black circle was
traced on the surface as a cutting guide. The forces
assigned to the skull were selected to be rigid, providing a
sensation similar to bone. Figure 7 shows the skull after
the removal of the cut out section. Additional punctures
and annotations were also placed on the surface.
Figure 8 shows a volumetric scene created from an empty volume with a resolution of $68^3$ voxels. Several construction and painting tools were used, and the image was generated using a volume rendering technique. This haptic interaction method can form the basis of a volume modeling [17,5] or painting [1,7] system that would allow
for the creation, modification, and rendering of both solid
and amorphous objects.
8. Future Work
We have found that the integration of haptic interaction into a scientific visualization process can lead to a better understanding of complex, three-dimensional data, and
can result in more natural, intuitive methods for modifying
this data. One limitation that we encountered during our
work was the lack of rotational forces, since the PHANTOM device provides only three degrees of translational
force feedback. An area of future research that we would
like to explore is the extension of our force feedback equations and data modification operations for haptic devices
that provide six degrees of freedom in both input and output.
We found it difficult to work with high frequency
data using the force equations presented in this paper.
When the density field changes from empty space to dense
object within one or two voxels, it is difficult to provide
force feedback without producing unwanted vibrations.
We would like to investigate methods for reducing these
vibrations on high frequency data. This may include limiting the speed at which the user can move through these
regions, providing spherical rather than point contact, or
maintaining auxiliary buffers that indicate the distance to a
surface.
Physically realistic haptic interaction is another area
that requires further investigation. This includes the ability
to obtain, store, and modify material properties that define
how an object reacts to an applied force. We would also
like to investigate methods for quickly computing forces
for more realistic tools. For example, an artist could feel
the bristles of the brush interact with the object as he
paints.
Acknowledgements
The volumetric wall data set shown in Figure 3 is
courtesy of Sidney Wang. The LGN cell shown in Figure 4
is courtesy of Barry Burbach. The CT data for Figure 5 is
courtesy of the National Library of Medicine’s Visible
Human Project. The CT data shown in Figures 6 and 7 is
courtesy of North Carolina Memorial Hospital. We would
like to thank Thomas Massie for providing valuable feedback on the use of the PHANTOM.
References

1. M. Agrawala, A. Beers, and M. Levoy, "3D Painting on Scanned Surfaces," Proceedings of the 1995 Symposium on Interactive 3D Graphics, pp. 145-150 (April 1995).
2. G. Bishop, H. Fuchs, L. McMillan, and E.J. Scher Zagier, "Frameless Rendering: Double Buffering Considered Harmful," Proceedings of SIGGRAPH '94, pp. 175-176 (July 1994).
3. F.P. Brooks, M. Ouh-Young, J.J. Batter, and P.J. Kilpatrick, "Project GROPE: Haptic Displays for Scientific Visualization," Proceedings of SIGGRAPH '90, pp. 177-186 (August 1990).
4. G. Burdea, N. Langrana, K. Lange, D. Gomez, and S. Deshpande, "Dynamic Force Feedback in a Virtual Knee Palpation," Journal of Artificial Intelligence in Medicine, 6, pp. 321-333 (1994).
5. T.A. Galyean and J.F. Hughes, "Sculpting: An Interactive Volumetric Modeling Technique," Computer Graphics, 25(4), pp. 267-274 (July 1991).
6. S.R. Geiger, I. Darian-Smith, J.M. Brookhart, and V.B. Mountcastle, "The Nervous System," Handbook of Physiology, Volume III, Section 1, pp. 739-788 (1984).
7. P. Hanrahan and P. Haeberli, "Direct WYSIWYG Painting and Texturing on 3D Shapes," Proceedings of SIGGRAPH '90, pp. 215-223 (August 1990).
8. H. Iwata and H. Noma, "Volume Haptization," IEEE 1993 Symposium on Research Frontiers in Virtual Reality, pp. 16-23 (October 1993).
9. A. Kaufman, R. Yagel, and D. Cohen, "Volume Graphics," IEEE Computer, 26(7), pp. 51-64 (July 1993).
10. P. Lacroute and M. Levoy, "Fast Volume Rendering Using a Shear-Warp Factorization of the Viewing Transformation," Computer Graphics (Proc. SIGGRAPH), pp. 451-457 (July 1994).
11. M. Levoy, "Display of Surfaces from Volume Data," IEEE Computer Graphics & Applications, 8(3), pp. 29-37 (May 1988).
12. T.H. Massie and J.K. Salisbury, "The PHANTOM Haptic Interface: A Device for Probing Virtual Objects," Proceedings of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, pp. 295-302 (November 1994).
13. K. Salisbury, D. Brock, T. Massie, N. Swarup, and C. Zilles, "Haptic Rendering: Programming Touch Interaction with Virtual Objects," Proceedings of the 1995 Symposium on Interactive 3D Graphics, pp. 123-130 (April 1995).
14. L.M. Sobierajski and R.S. Avila, "A Hardware Acceleration Method for Volumetric Ray Tracing," Visualization '95 Proceedings, pp. 27-34 (October 1995).
15. L.M. Sobierajski and A.E. Kaufman, "Volumetric Ray Tracing," 1994 Symposium on Volume Visualization, pp. 11-18 (October 1994).
16. R.M. Taylor, W. Robinett, V.L. Chi, F.P. Brooks, W.V. Wright, R.S. Williams, and E.J. Snyder, "The Nanomanipulator: A Virtual-Reality Interface for a Scanning Tunneling Microscope," Proceedings of SIGGRAPH '93, pp. 127-134 (August 1993).
17. S. Wang and A. Kaufman, "Volume Sculpting," Proceedings of the 1995 Symposium on Interactive 3D Graphics, pp. 151-156 (April 1995).