Computer-Generated Holograms of Three-Dimensional Real Objects
Abstract
We propose a method of synthesizing computer-generated holograms of real-life three-dimensional (3-D) objects. An
ordinary digital camera illuminated by incoherent white light records several projections of the 3-D object from different
points of view. The recorded data are numerically processed to yield a two-dimensional complex function, which is then
encoded as a computer-generated hologram. When this hologram is illuminated by a plane wave, a 3-D real image of the
object is reconstructed.
Keywords: 3-D image processing, 3-D holography, Computer-generated hologram
1. Introduction
Since the invention of the hologram more than 50 years ago,1 holographic recording of real objects has been performed
by wave interference. In general, interference between optical waves demands special stability of the optical system and
relatively intense light with a high degree of coherence between the involved beams. These requirements have prevented
hologram recorders from becoming as widely used for outdoor photography as conventional cameras. A partial solution
to these limitations is offered by the techniques of holographic stereograms2,3 (also known as multiplex holograms4,5). However, optical interference is still involved in the recording of holographic stereograms, although it is off-line interference. The meaning of "off-line" here is that a reference beam interferes with a beam diffracted from a motion picture film; the film, which contains many viewpoints of the object, is recorded beforehand by a
motion picture camera. However, unlike ordinary holograms,1,6 holographic stereograms do not reconstruct the true wave
front that is diffracted from an object when this object is coherently illuminated. The reconstructed wave front from a
holographic stereogram is composed of a set of discrete patches; each patch contains a different perspective projection of
the object. Because of the discontinuity between those patches, the imitation of the observed reality cannot be complete.
In this study we propose a process of recording a computer-generated hologram (CGH) of a real-world three-dimensional
(3-D) object under conditions of incoherent white illumination. Yet the true wave front diffracted from the object, when
it is coherently illuminated, can be reconstructed from the proposed hologram. In other words, after a process of
recording the scene under incoherent illumination and digital computing, we get a two-dimensional (2-D) complex
function. This function is equal to the complex amplitude of coherent light that is diffracted from the same object and propagates through a particular optical system, described below. Thus we apparently succeed in recording the complex
amplitude of some wave front without beam interference. It should immediately be said that we do not propose here a
general method of recording complex amplitude without interference. Our system cannot sense any phase modulations
that happen between the object and the recording system. However, let us look at a 3-D object illuminated by a coherent
plane wave. If the reflected beam from the object propagates in free space and then through the particular optical system,
the result at the output plane is some complex amplitude. We claim that this complex amplitude can be restored under
incoherent conditions. Once this complex function is in computer memory, we can encode it to a CGH. When this CGH
is illuminated by a plane wave, which then propagates through the same optical system mentioned above, the image of
the 3-D object is reconstructed in space as a common holographic image. As in stereogram photography, we
record several digital pictures of the object from different points of view. The pictures are recorded into a digital
computer, which computes a CGH from the input data. Illuminating this hologram by a plane wave reconstructs the
original objects and creates the volume effect in the observer's eyes. The hologram that we would like to produce is of the
type of a Fourier hologram. This means that the image is reconstructed in the vicinity of the back focal plane of a
spherical lens when the hologram is displayed on the front focal plane. However, a complete 2-D Fourier hologram can
be recorded if the camera's points of view are on a 2-D transverse grid of points. Because it is technically impractical, or
at least quite difficult, to shift the camera out of the horizontal plane along a 2-D transverse grid of points, the hologram
that we produce here is only a one-dimensional (1-D) Fourier hologram along the horizontal axis and an image hologram
along the vertical axis. Consequently, the coherent system that we emulate and the reconstructing system are both
composed of a cylindrical Fourier lens in the horizontal axis and a second cylindrical imaging lens in the vertical axis. In
Section 2 we describe the recording process in detail.
2. Description of the recording process

Fig. 1: The proposed system. Upper part: the recorded objects o1(x,y,z) are photographed by a camera from different points of view at a distance L from the scene; the set of projections o2(xi,yi,φi) is fed into the computer, which calculates the hologram h(u,v). Lower part: the reconstruction setup, in which the hologram, illuminated by a laser beam, is Fourier transformed along u by cylindrical lens Lu and imaged along v by cylindrical lens Lv onto the output space (xo,yo,zo).
The coordinates (xi, yi) of an object point (x, y, z), as seen in the projection o2(xi,yi,φi) taken at angle φi, are given by

$(x_i, y_i) = (x\cos\phi_i + z\sin\phi_i,\; y). \qquad (1)$
For simplicity, we assume that the magnification factor of the imaging lens is 1. Also, because distance L is much greater
than the depth of the object, all the object points are equally imaged with the same magnification factor of 1.
The maximum range of the angle φi is chosen to be small (no more than 16° on each side in the present example). Therefore we are allowed to use the small-angle approximations cos φi ≈ 1 and sin φi ≈ φi. Recalling our original goal to get a 2-D hologram containing the information of the objects' volume, we next reduce the 3-D function o2(xi,yi,φi)
to a 2-D function. We assume that the following digital process is a perfect imitation of an optical system, which will be
discussed below. Inside the computer, the hologram values are computed from the projected functions according to the
following equation,
$h(u,v) = \iint o_2(x_i, y_i, \phi_i = au)\,\delta(v - y_i)\exp\!\left(\frac{i 2\pi u x_i}{\lambda f}\right) dx_i\, dy_i, \qquad (2)$
where λ is the wavelength of the plane wave illuminating the system and f is the focal length of a cylindrical lens. The variable u is related to the angle φi via the equation φi = au, where a is some chosen parameter.
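As an illustration of this digital process, the following sketch (Python with NumPy; not part of the original work) computes h(u,v) from a stack of recorded projections according to Eq. (2): each projection, taken at the angle φm = a·um, contributes one column of the hologram, obtained by summing the projection along xi against a linear phase, while the δ(v − yi) term simply maps every row of the projection to the same row of the hologram. The function names, sample spacings, and the one-column-per-projection sampling are assumptions made here for clarity.

```python
import numpy as np

def synthesize_hologram(projections, wavelength, f, a, du, dx):
    """Compute h(u, v) from a stack of projections according to Eq. (2).

    projections : array of shape (M, Ny, Nx); projections[m] is the picture
                  grabbed at the viewing angle phi_m = a * u_m.
    wavelength  : wavelength (lambda) assumed for the emulated coherent system.
    f           : focal length of the cylindrical (Fourier) lens.
    a           : the chosen parameter relating u to the angle, phi = a * u.
    du, dx      : hypothetical sample spacings along the u and x_i axes.
    """
    M, Ny, Nx = projections.shape
    # Square-root compensation of the grabbed intensity pictures, so that
    # the emulated coherent object behaves as an amplitude (see the
    # discussion following Eq. (8)).
    amp = np.sqrt(projections.astype(float))

    u = (np.arange(M) - M // 2) * du        # hologram column coordinates
    x = (np.arange(Nx) - Nx // 2) * dx      # horizontal projection coordinate

    h = np.empty((Ny, M), dtype=complex)
    for m in range(M):
        # Linear phase exp(i 2 pi u_m x_i / (lambda f)); delta(v - y_i) in
        # Eq. (2) means each projection row maps directly to a hologram row.
        phase = np.exp(1j * 2 * np.pi * u[m] * x / (wavelength * f))
        h[:, m] = amp[m] @ phase * dx       # sum over x_i for every row
    return h, u
```

Note that this simple one-column-per-viewpoint sampling is only one possible discretization of Eq. (2).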
Fig. 2: The equivalent coherent optical system. A plane wave illuminates the object o1(x,y,z) through a beam splitter; cylindrical lens Lx performs a 1-D Fourier transform along the horizontal axis and cylindrical lens Ly images the object along the vertical axis, yielding the complex amplitude g(u,v) on the output plane.
Let us consider now the relation between h(u,v) and the object o1(x,y,z). For a single infinitesimal element of size (Δx,Δy,Δz), at point (x′,y′,z′), with the intensity o1(x′,y′,z′), taken from the entire 3-D object function, the distribution on the (u,v) plane for each φi value is

$h(u,v) \propto o_1(x',y',z')\exp\!\left(\frac{i 2\pi u x_i}{\lambda f}\right)\delta(v - y_i)\,\Delta x\,\Delta y\,\Delta z. \qquad (3)$

Relation (3) is obtained from Eq. (2) because, for each φi value, a single point at the input scene is imaged to a point on the (xi,yi) plane. The δ function in relation (3) is a mathematical idealization of the fact that the point at yi is imaged to the line v = yi on the (u,v) plane. Substituting Eq. (1) into relation (3) yields

$h(u,v) \propto o_1(x',y',z')\exp\!\left[\frac{i 2\pi (u x'\cos\phi_i + u z'\sin\phi_i)}{\lambda f}\right]\delta(v - y')\,\Delta x\,\Delta y\,\Delta z. \qquad (4)$
Next we examine the influence of all points of the object o1(x,y,z) on the distribution of h(u,v). The object is 3-D, and
therefore the summation of the object's points is performed along the (x,z) axes, whereas along the vertical axis the
picture is perfectly imaged. Therefore the overall distribution of h(u,v) is obtained by a 3-D integral of the expression in
relation (4) as follows:
$h(u,v) \propto \iiint o_1(x,y,z)\,\delta(v - y)\exp\!\left[\frac{i 2\pi (u x\cos\phi_i + u z\sin\phi_i)}{\lambda f}\right] dx\, dy\, dz. \qquad (5)$

Substituting the condition φi = au and the small-angle approximations into relation (5) yields the following 2-D function:

$h(u,v) \propto \iiint o_1(x,y,z)\,\delta(v - y)\exp\!\left[\frac{i 2\pi (u x + a u^2 z)}{\lambda f}\right] dx\, dy\, dz. \qquad (6)$
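The equivalence between the projection-based computation of Eq. (2) and relation (6) can be checked numerically for a single object point: its projection at angle φi = au appears at xi = x′cos φi + z′sin φi, so the hologram phase along u should follow exp[i2π(ux′ + au²z′)/λf]. The short sketch below performs this check with arbitrary, hypothetical parameter values (not taken from the paper).

```python
import numpy as np

# Hypothetical parameters (not taken from the paper).
wavelength, f = 0.5e-6, 0.3
a = -1.0 / (2 * f)                                  # a = -1/(2f)
xp, zp = 2e-3, 5e-3                                 # single point at (x', y', z')

u = np.linspace(-5e-3, 5e-3, 101)                   # hologram coordinate
phi = a * u                                         # small viewing angles

# Projection route: the point is seen at x_i = x' cos(phi) + z' sin(phi),
# and Eq. (2) assigns the phase exp(i 2 pi u x_i / (lambda f)).
x_i = xp * np.cos(phi) + zp * np.sin(phi)
h_proj = np.exp(1j * 2 * np.pi * u * x_i / (wavelength * f))

# Relation (6) route: exp[i 2 pi (u x' + a u**2 z') / (lambda f)].
h_eq6 = np.exp(1j * 2 * np.pi * (u * xp + a * u**2 * zp) / (wavelength * f))

# The two agree up to the small-angle approximation.
print(np.max(np.abs(h_proj - h_eq6)))
```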
Next we show that, if a is chosen to be a = −1/(2f), h(u,v) is equal to the complex amplitude on the output plane of the equivalent coherent system shown in Fig. 2. It should be emphasized that this coherent system is only the equivalent optical system for the expression in relation (6), and we depict it in Fig. 2 only to clarify the equivalent model. The complex amplitude is examined at the back focal plane of a convex cylindrical lens (horizontally focusing) when a plane wave is reflected from the 3-D object o1(x,y,z), located near the front focal plane, and is perfectly imaged along the vertical axis. For a single infinitesimal element of size (Δx,Δy,Δz) from the entire object, with an amplitude of o1(x′,y′,z′), the complex amplitude at the plane (u,v) is7

$g_1(u,v) = A\, o_1(x',y',z')\,\delta(v - y')\exp\!\left[i 2\pi\!\left(\frac{u x'}{\lambda f} - \frac{u^2 z'}{2\lambda f^2}\right)\right]\Delta x\,\Delta y\,\Delta z, \qquad (7)$
where A is a constant. Summation over the contributions from all the points of the 3-D object yields the following
complex amplitude:
$g(u,v) = A \iiint o_1(x,y,z)\,\delta(v - y)\exp\!\left[i 2\pi\!\left(\frac{u x}{\lambda f} - \frac{u^2 z}{2\lambda f^2}\right)\right] dx\, dy\, dz. \qquad (8)$
Comparing Eqs. (8) and (6), we see indeed that substituting a=-1/(2f ) into Eq. (6) yields an expression similar to the one
given in Eq. (8). The only difference is that o1(x,y,z) in Eq. (8) represents a complex amplitude, whereas in Eq. (6) it
represents an intensity. As the intensity of the reconstructed object from h(u,v) is proportional to |o1(x,y,z)|², its gray-tone
distribution is expected to be deformed compared with the gray-tone map of the original object. However, we can
compensate for this deformation by computing the square root of the grabbed pictures in the recording stage. In both
functions h(u,v) and g(u,v) the object's 3-D structure is preserved in a holographic manner. This means that the light
diffracted from the hologram is focused into various transverse planes along the propagation axis according to the
object's 3-D structure.
Parameter a can in fact take any arbitrary real value, not just the value −1/(2f). In that case, after the integration variable z is changed to z′ = −2faz, Eq. (6) becomes

$h(u,v) \propto \iiint o_1\!\left(x, y, \frac{-z'}{2af}\right)\delta(v - y)\exp\!\left[i 2\pi\!\left(\frac{u x}{\lambda f} - \frac{u^2 z'}{2\lambda f^2}\right)\right] dx\, dy\, dz'. \qquad (9)$
Relation (9) also has the form of Eq. (8), except that the resulting hologram describes the same object on a different scale along its longitudinal dimension z. We conclude that by our choice of parameter a we can control the
longitudinal magnification of the reconstructed image, as we show below. Equation (8) represents a complex wave front,
which usually should be interfered with a reference wave to be recorded. In the case of wave interference the intensity of
the resultant interference pattern keeps the original complex wave front in one of four separable terms.6 However, in our
case the complex wave-front distribution is recorded into computer memory in the form of Eq. (6) [or (9)] without any
interference experiment and actually without the need to illuminate the object with coherent laser light. Because the
expression in Eq. (6) describes the equivalent of a wave-front distribution, it contains 3-D holographic information on the
original objects, which can be retrieved as described in what follows.
As we mentioned above, the hologram values are stored in computer memory in the form of the complex function h(u,v).
To reconstruct the image from the hologram, the computer should modulate some transparency medium with the
hologram values. If the transparency cannot be modulated directly with complex values, one of many well-known coding
methods for CGHs8 might be used. The spatial light modulator (SLM) that we use in this study can modulate the intensity
of light with continuous gray tones. Therefore, complex function h(u,v) is coded into a positive real transparency as
follows,
$h_r(u,v) = 0.5\left\{1 + \mathrm{Re}\!\left[h(u,v)\exp\!\left(i 2\pi (d_x u + d_y v)\right)\right]\right\}, \qquad (10)$
where (dx,dy) is the new origin point of the reconstruction space and |h(u,v)| is normalized to the range [0, 1].
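A minimal sketch of the encoding of Eq. (10), assuming a NumPy implementation with hypothetical sample spacings and carrier parameters, is given below; the complex hologram is first normalized so that |h(u,v)| lies in [0, 1], and the resulting hr is a positive real transparency suitable for an intensity-only SLM.

```python
import numpy as np

def encode_cgh(h, du, dv, d_x, d_y):
    """Encode the complex hologram h(u, v) as a positive real CGH, Eq. (10).

    h        : complex 2-D array, indexed as h[v_index, u_index].
    du, dv   : sample spacings along u and v (hypothetical values).
    d_x, d_y : carrier parameters setting the new origin of the
               reconstruction space (off-axis shift of the image).
    """
    Nv, Nu = h.shape
    u = (np.arange(Nu) - Nu // 2) * du
    v = (np.arange(Nv) - Nv // 2) * dv
    U, V = np.meshgrid(u, v)

    h = h / np.max(np.abs(h))                  # normalize |h| to [0, 1]
    carrier = np.exp(1j * 2 * np.pi * (d_x * U + d_y * V))
    h_r = 0.5 * (1.0 + np.real(h * carrier))   # values in [0, 1]
    return h_r
```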
The holographic reconstruction setup is shown in the lower part of Fig. 1. To get the output image with the same
orientation as the object, we display the 180°-rotated hologram h(−u,−v) on the SLM. The SLM is then illuminated by a plane wave, which propagates through the SLM and through two cylindrical lenses with orthogonal axes. Through lens Lu, a 1-D Fourier transform of h(−u,−v) along u is obtained at the back focal plane, along xo. Lens Lv images the distribution along the v axis onto the yo axis. This optical setup is identical to the equivalent coherent system shown in Fig. 2, and therefore the real
image of the original 3-D object is reconstructed in the vicinity of the back focal plane of cylindrical lens Lu.
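Although in this work the reconstruction is performed optically with the SLM and lenses Lu and Lv, the same operation can be imitated numerically as a sanity check: a 1-D Fourier transform along u of every row of the 180°-rotated hologram, with the v axis passed through unchanged. The sketch below is such an illustrative simulation (it is not the authors' optical setup); the refocusing factor and all numerical values are assumptions.

```python
import numpy as np

def reconstruct(hologram, z0=0.0, wavelength=0.5e-6, f=0.3, du=10e-6):
    """Numerically imitate the reconstruction setup (lower part of Fig. 1).

    Lens Lu performs a 1-D Fourier transform along u (here an FFT of each
    row), while lens Lv only images the v axis, so rows are left unmixed.
    A quadratic phase in u shifts the plane of best focus by roughly z0
    (illustrative sign convention).  With the real-valued CGH of Eq. (10)
    as input, the image appears in an off-axis diffraction order, next to
    the zero order and the twin image.  All values here are hypothetical.
    """
    Nv, Nu = hologram.shape
    u = (np.arange(Nu) - Nu // 2) * du
    defocus = np.exp(1j * np.pi * z0 * u**2 / (wavelength * f**2))
    # Display the 180-degree-rotated hologram h(-u, -v), as in the text.
    field = hologram[::-1, ::-1] * defocus[np.newaxis, :]
    spectrum = np.fft.fftshift(
        np.fft.fft(np.fft.ifftshift(field, axes=1), axis=1), axes=1)
    return np.abs(spectrum) ** 2       # reconstructed intensity
```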
To calculate the magnification of the image along each axis, we consider the equivalent optical process on the object and
image planes. Based on Eq. (6) together with the operation of lens Lu, the effective system from plane (x,y) to the output
plane, along the horizontal axis, is similar to a 4-f system.9 Therefore the overall horizontal magnification is identical to
1. In the vertical axis the object is imaged twice from plane (x,y) to output plane (xo,yo), and therefore the magnification is
also equal to 1.

Fig. 4: (a) The magnitude (the maximum value is whitest) and (b) the phase angle (π is white and −π is black) of the hologram recorded and computed in the experiment. (c) The central part of the CGH, computed by relation (10) from the complex function shown in Figs. 4(a) and 4(b).

On the longitudinal axis the situation is a bit more complicated. Looking at Eq. (6) and the lens Lu, we see a telescopic system, but with two different lenses. From Eq. (6) the effective focal length of the first lens is $f_1 = \sqrt{f/(2a)}$. The focal length of the reconstructing lens Lu is f2 = f. Using the well-known result that the longitudinal magnification of a telescopic system9 is (f2/f1)², we find the longitudinal magnification in our case to be 2f|a|. Note that with parameter a we can control the longitudinal magnification independently of the transverse magnification.
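As a quick numerical illustration of this last result (not taken from the paper; the focal length f below is an assumed value, since it is not specified here), the longitudinal magnification 2f|a| can be evaluated for the value of a quoted in Section 3:

```python
import math

# Longitudinal magnification M_z = 2*f*|a| of the reconstructed image.
# f is a HYPOTHETICAL focal length for lens Lu (not given in the text);
# a is the value quoted in Section 3: a = sin(32 deg)/3.5 per cm.
f = 30.0                                  # focal length of Lu, in cm (assumed)
a = math.sin(math.radians(32)) / 3.5      # in 1/cm
print(2 * f * abs(a))                     # ~9.1: image stretched along z
# Choosing a = -1/(2*f) instead gives M_z = 1 (no longitudinal scaling).
print(2 * f * abs(-1 / (2 * f)))          # -> 1.0
```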
3. Experimental results
In our experiment the recording was carried out by the system shown in the upper part of Fig. 1 and the reconstruction
was demonstrated by an optical experiment shown in the lower part of Fig. 1. The scene observed contains three cubes of
size 5 cm × 5 cm × 5 cm located at different distances from the camera. In Fig. 3 we show 16 examples selected from the 65 viewpoints of the scene taken by the camera. Each projection contains 256 × 256 pixels. Figure 3 shows the scene as observed by the CCD from a distance of 77 cm. The angular range is ±16° between the CCD axis and the z axis, and the angular increment between every two successive projections is 0.5°.
The hologram was computed from the set of the 65 projections according to the procedure described above. The
magnitude and the phase angle of the computed 256 × 256-pixel complex function h(−u,−v) are shown in Figs. 4(a) and
4(b), respectively. The central part of the CGH computed according to Eq. (10) is depicted in Fig. 4(c). The total size of
the CGH is 800 pixels on the horizontal axis and 256 pixels on the vertical axis.
In the optical experiment the CGH [part of which is shown in Fig. 4(c)] was displayed on an SLM (Central Research Laboratories, Model XGA1). Parameter a in this example was chosen to be [sin(32°)/3.5] cm⁻¹, where 32° is the angular range of the capturing camera and 3.5 cm is the width of the SLM located in the (u,v) plane. The reconstruction results in
the region of the left-hand diffraction order are shown in Fig. 5 at three transverse planes along the optical axis.
The effect in which each letter comes into focus on a different transverse plane is evident in Fig. 5.
Fig. 5: The experimental results from the hologram shown in Fig. 4(c), in the vicinity of the back focal point of Lu, on three different transverse planes: (a) zo = −0.5 cm, (b) zo = 2.5 cm, (c) zo = 6 cm.
4. Conclusions
In conclusion, we have proposed and demonstrated a process of recording holograms of real-life 3-D objects without
wave interference. There are two main differences between our method and previous techniques10,11 for recording CGHs
of 3-D objects. First, as we have shown, our hologram is a single hologram with properties similar to those of a hologram
recorded optically by the interference of laser beams. Our hologram is neither a composite hologram nor a holographic
stereogram as previously suggested.10,11 Second, we deal with a real-life 3-D object recorded into computer memory,
whereas others compute CGHs of artificial computer-generated objects. The last-named difference also distinguishes our
method from the method of 3-D CGH suggested in Ref. 12. This method is also different from the techniques for
recording holographic stereograms and multiplex holograms2 5 in two aspects. First, there is no need to interfere coherent
beams in any stage of our process. The final CGH is obtained from the set of the object's projections purely by numerical
computation. Second, our process is a true imitation of a particular holographic coherent system. Therefore the
reconstructed image has features similar to those of an image coming from a coherently recorded hologram. This method
might lead to the development of a widely used holographic camera for outdoor photography.
Acknowledgement
This research was supported by the Israel Science Foundation.
References
1.
2.
3.
4.
5.
6.
7.
8.
9.
10.
11.
12.