Rigorous Image Formation From Airborne and Spaceborne Digital Array Scanners
H. J. Theiss
InnoVision Basic and Applied Research Office, Sensor Geopositioning Center, National Geospatial-Intelligence
Agency (contractor) PA Case 12-189, 7500 GEOINT Dr, Springfield, VA 22150, USA – [email protected]
Commission I, WG 3
KEY WORDS: geometric, modelling, optical, orientation, processing, rectification, restitution, scanner
ABSTRACT:
Sensor builders in the digital era face design limitations due to the constraint of maximum available digital array size. A straightforward solution exists, for example, when four cameras each simultaneously capture an image from essentially the same perspective centre; the images can be re-sampled to form a virtual large format image that can be exploited using a single instantiation of a frame model instead of four separate ones. The purpose of this paper is to address the less trivial time-dependent cases where the sensor scans the ground and the detector arrays obtain chips of imagery that need to be stitched together to form a single conveniently exploitable image. Many operational techniques warp the imagery to form a mosaic, or ortho-rectify it using an imperfect digital surface model (DSM), thus eliminating the possibility of accurate geolocation and uncertainty estimation. The algorithm presented here, however, forms a single virtual image with associated smooth metadata, which can be exploited using a simple physical sensor model. The algorithm consists of four main steps: 1) automated tie point matching; 2) camera calibration (once per sensor); 3) block adjustment; and 4) pixel re-sampling based on an "idealized" virtual model. The same geometry model used to form the image, or its true replacement, must be used to exploit it. This paper verifies the algorithm using real imagery acquired from the Global Hawk (GH) UAV. Registration of the virtual image to a WorldView-1 stereopair using four tie points yielded an RMS below 0.6 meters per horizontal axis.
4.1 Approach
Figure 4. An exaggerated view of footprints of three panoramic images (left side) and several columns of a whiskbroom image (right side)

Figure 5. Whiskbroom geometry applied to a GH UAV scene

6.1.2.3 Frame Geometry: The frame geometry is a simpler but less rigorous virtual model that can be applied to the entire scene. It was deemed inappropriate for the GH UAV real data set due to the relatively long distance travelled between the first and last images collected in a scene. The virtual frame model may be appropriate from a platform at longer slant ranges, for smaller scenes, or for faster collections. The virtual frame model is appropriate for systems such as wide area airborne surveillance (WAAS), whereby all frame images are acquired at the same time from the same platform.

6.2 Generating the Idealized Metadata

Section 6.1 presented three different options for the virtual image geometry, and the whiskbroom geometry was chosen for the GH UAV data. All pixels in a given column are modelled as though they were imaged at the same time instant; hence our desire is to have a polynomial function that yields each EO parameter as a function of column number. Figure 6 shows the values of the attitude parameters recovered from the bundle adjustment (aka triangulation), plotted as the blue curves. The red curve illustrates the result of fitting a second order polynomial to the average value for the scan, i.e. the midpoint between the value at the fifth and sixth frames. The same polynomial fitting strategy was applied to the perspective centre positions. Unlike the case for the attitude parameters, these position parameters remained smooth throughout the duration of each scan.

Figure 6. Smoothed attitude data
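The fitting step can be sketched as follows; the sample values, the ten-frame scan length, and the use of NumPy's polyfit are illustrative assumptions rather than the paper's actual implementation.

```python
import numpy as np

# Bundle-adjusted roll angles for the ten framelets of one scan
# (illustrative values in degrees, not the actual GH data).
frames = np.arange(10)
roll_adjusted = np.array([0.52, 0.49, 0.55, 0.47, 0.53, 0.50, 0.46, 0.54, 0.48, 0.51])

# Normalize the frame index so that 0 falls at the scan midpoint,
# i.e. halfway between the fifth and sixth frames, and the range is -0.5..+0.5.
t = (frames - frames.mean()) / (frames.max() - frames.min())

# Fit a second order polynomial to the adjusted roll values; evaluating the
# fit yields the smoothed roll used as the virtual image's metadata.
coeffs = np.polyfit(t, roll_adjusted, deg=2)
roll_smoothed = np.polyval(coeffs, t)

# The same strategy is applied to pitch, yaw, and the perspective centre
# positions (the latter were already smooth for the GH UAV scans).
```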
It is important to note that smoothing the EO parameter data can have significant advantages, as well as possible drawbacks, when employed during the image formation process. The following comments about smoothing apply equally to airborne framing, pushbroom/whiskbroom, or spaceborne linear array scanners. If the sensor had experienced uncommanded rolling (recorded by the IMU, or recovered during bundle adjustment) during the image collection, e.g. due to air turbulence, then straight lines would appear as wavy curves in an image formed using this erratically varying attitude; however, smoothing the metadata of the virtual (idealized) image would have the effect of straightening the lines in the formed image. Similarly, any imperfections in sensor IO, e.g. modelled lens distortions or chip misalignments, need not be included in the virtual (idealized) imaging geometry, since including them would introduce unnecessary complication into the virtual geometry model, which will ultimately need to be implemented by a downstream exploitation tool. Even substantial smoothing of the attitude data, or simplification of the IO parameters, will retain the geometric integrity of the output image, albeit with changes to how scale varies in the output image compared to the original real image chips. (The reason why geometric integrity is retained, as explained in Section 6.3, is that the re-sampling process requires the unsmoothed attitude and imperfect IO parameters when performing the ground-to-image step.) Caution must be exercised, however, when smoothing perspective centre locations, since this simplification will introduce errors into subsequent geolocation processes as a function of the amount of error in the DSM (or range image) that was used in the re-sampling process. Finally, note that while smoothing the attitude data of the virtual image geometry will have the desirable effect of making object straight lines appear as straight lines in the image (even though they appear as wavy curves in the original image), it will result in the artefact that the straight edges marking the extent of the original images will appear as wavy edges in the output image product.

6.3 Re-sampling the Pixels

Once the geometry of the virtual (idealized) image has been defined and the associated IO and EO parameters have been determined, the remaining step of re-sampling the pixels is relatively straightforward. First, it is recommended to add a buffer of black pixels around all four edges of the output image product. (It is a simple procedure to crop the image if a customer does not want to see the wavy or jagged edges in the final output image.) For each pixel (row, column) in the output
image, the following steps are performed:
1. Image-to-ground in the virtual image. Calculate time
as a function of row and column, obtain the values of
all EO parameters as a function of time (from the
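A minimal sketch of this per-pixel loop is given below, assuming hypothetical virtual and original sensor model objects that expose image_to_ground and ground_to_image methods; the lookup into the original chips and the bilinear interpolation are likewise illustrative assumptions. As noted in Section 6.2, the ground-to-image step uses the unsmoothed attitude and imperfect IO parameters.

```python
import numpy as np

def bilinear(img, row, col):
    """Bilinear interpolation at a fractional (row, col) inside the image."""
    r0, c0 = int(row), int(col)
    dr, dc = row - r0, col - c0
    top = (1 - dc) * img[r0, c0] + dc * img[r0, c0 + 1]
    bot = (1 - dc) * img[r0 + 1, c0] + dc * img[r0 + 1, c0 + 1]
    return (1 - dr) * top + dr * bot

def resample_virtual_image(out_shape, virtual_model, real_model, original_chips, dsm):
    """Fill each pixel of the virtual (idealized) image from the original chips.

    virtual_model / real_model are assumed to expose image_to_ground and
    ground_to_image methods; original_chips maps a chip id to its pixel array.
    These interfaces are illustrative, not the paper's software.
    """
    output = np.zeros(out_shape, dtype=np.float32)  # black buffer by default
    for r in range(out_shape[0]):
        for c in range(out_shape[1]):
            # 1) Image-to-ground in the virtual image: time follows from the
            #    column, the smoothed EO parameters follow from time, and the
            #    ray is intersected with the DSM to obtain a ground point.
            ground_pt = virtual_model.image_to_ground(r, c, dsm)
            if ground_pt is None:
                continue  # ray missed the DSM; leave the buffer pixel black
            # 2) Ground-to-image in the original geometry, using the unsmoothed
            #    attitude and the imperfect IO parameters (see Section 6.2).
            chip_id, chip_row, chip_col = real_model.ground_to_image(ground_pt)
            if chip_id is None:
                continue  # ground point was not seen by any original chip
            chip = original_chips[chip_id]
            if not (0 <= chip_row < chip.shape[0] - 1 and 0 <= chip_col < chip.shape[1] - 1):
                continue  # falls outside the chip; leave the pixel black
            # 3) Interpolate the intensity from the original chip.
            output[r, c] = bilinear(chip, chip_row, chip_col)
    return output
```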
7. RESULTS
performing a photogrammetric resection using points extracted from the stereopair as ground control points (GCPs).

The adjustable parameters consisted of one interior orientation parameter, i.e. focal length, and the coefficients of a polynomial function of time used to compute small corrections to the roll, pitch, and yaw angles about the x, y, and z sensor coordinate system axes, respectively, i.e. the green axes in Figure 5. The polynomial functions are:
\omega = a_0 + a_1 t + a_2 t^2
\phi   = b_0 + b_1 t + b_2 t^2          (2)
\kappa = c_0
in which:
\omega, \phi, \kappa are the roll, pitch, and yaw angles, respectively,
a_0, b_0, c_0 are the constant terms of the polynomials,
a_1, b_1 are the first order terms of the polynomials,
a_2, b_2 are the second order terms of the polynomials, and
t is the normalized time associated with the column number (sample) of the coordinate of interest in the formed image; thus t ranges smoothly from -0.5 to +0.5 from the first to the last column of the image.

thus indicating that good internal relative geometry has been preserved during the image formation process. The a posteriori reference variance values, provided in the last column of Table 1, are close to unity, therefore indicating that the one-sigma uncertainties of 1 pixel and 0.5 meters for image coordinate measurement and GCP uncertainty, respectively, were appropriate. They also show that the adjustable parameter of second power for pitch is significant, but not that for roll.

Table 1. GH UAV – resection results
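As a small worked illustration of Equation (2), the corrections can be evaluated as below; the coefficient values and the image width are hypothetical, since the actual adjusted values are those summarized in Table 1.

```python
# Hypothetical adjusted coefficients for Equation (2), in radians (not the
# values actually recovered in the GH UAV resection).
a0, a1, a2 = 1.0e-4, -2.0e-4, 5.0e-4   # roll correction terms
b0, b1, b2 = -3.0e-4, 1.0e-4, 8.0e-4   # pitch correction terms
c0 = 2.0e-4                            # constant yaw correction

num_columns = 20000                    # hypothetical width of the formed image

def normalized_time(column):
    """Map a column (sample) index to t ranging from -0.5 to +0.5."""
    return column / (num_columns - 1) - 0.5

def attitude_corrections(column):
    """Evaluate the roll, pitch, and yaw corrections of Equation (2)."""
    t = normalized_time(column)
    d_roll = a0 + a1 * t + a2 * t ** 2
    d_pitch = b0 + b1 * t + b2 * t ** 2
    d_yaw = c0
    return d_roll, d_pitch, d_yaw

# Corrections at the first, middle, and last columns of the formed image.
for col in (0, num_columns // 2, num_columns - 1):
    print(col, attitude_corrections(col))
```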