PSC Photogrammetry


Photogrammetry

Presented by:
Er. Prashant Ghimire
Chief Survey Officer
Ministry of Land Management, Cooperatives & Poverty Alleviation
M. Sc. Photogrammetry and Geoinformatics

1
Photogrammetry and Remote
Sensing
• Photogrammetry: The art and science of obtaining
reliable measurements by means of images.

• Photogrammetry and Remote Sensing: The art,
science and technique to obtain reliable
geometric and thematic information on the earth
and its physical environment by acquisition,
measurement and interpretation of images, which
are obtained by remote sensing sensor systems.
• Photogrammetry – Aerial Remote Sensing
• Remote Sensing – Space Photogrammetry

2
Development of Photogrammetry

3
Photo measurements and
Interpretation
 Measurement devices
◦ With analog/analytical instruments on original
film (diapositive/negative) in stereo with optical
enlargement
◦ With digital photogrammetric workstations
(filmless, stereo)
 Image Interpretation
◦ Visual, without extensive technical aid
◦ RS: automated classification tools
(image content -> landuse/objects)

4
Photo measurements and
Interpretation
 Instruments
◦ Analogue (mechanical, optical)
◦ Analytical plotters (AP)
◦ Rectification devices
◦ Digital photogrammetric workstations
(DPWS) with automation of many processes

5
Digital Photogrammetry
 Digital Image: Picture Elements (Pixels)
◦ No film development, geometrical stability, usually
8Bit (256 grey levels) per color channel
 Offers the possibility to automate the
photogrammetric process.
 Huge amounts of data
◦ 1 aerial image (23x23cm², color) digitised with
14µm pixel size and 8Bit/channel: about 800 MByte
(see the sketch below)
◦ In practical applications 1000 aerial images and
more (>800 GByte)
 571000 Floppy disks
 1231 CDs
 178 DVDs

6
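As a rough check of the data-volume figures on the previous slide, the raw size of one scanned colour frame can be estimated as follows (a minimal sketch using the slide's values; the result depends on how megabytes are rounded):

```python
# Rough storage estimate for one scanned aerial colour image (values from the slide).
frame_size_m = 0.23      # 23 cm x 23 cm film format
pixel_size_m = 14e-6     # 14 um scanning pixel size
channels = 3             # colour, 8 bit (1 byte) per channel

pixels_per_side = frame_size_m / pixel_size_m          # ~16 400 pixels
raw_bytes = pixels_per_side ** 2 * channels            # 1 byte per channel
print(f"about {raw_bytes / 1e6:.0f} MByte per image")  # ~810 MByte, i.e. roughly 800 MByte
```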
Photogrammetric Products

7
Photogrammetric Products
(Advanced)

8
When to use Photogrammetry?
 When the object is inaccessible, or
difficult to access
 When measurements ‘might’ be needed
 When large areas are being mapped
 When continuous features are required –
such as contours, roads, rivers
 When the object is very small

9
Application
 Mapping
◦ Measurement and updating
◦ Topographic maps 1:250,000 – 1:25,000
◦ Large scale maps or plans up to 1:250
◦ Image maps (orthophoto map)
◦ Cadastre
◦ Digital Terrain Models
◦ Topographic, cartographic DBs
◦ 3D City models

10
Special Applications
 Archeology
 Cultural heritage
 Close-Range Photogrammetry –
Architecture, Industry
 Police (accidents, forensic medicine)
 Medicine
 Biometry

11
Types of Sensors
 Conventional: photography, metric
 Today: analogue and digital cameras,
additional sensors
 Spectral features:
– Photographic wavelength = 380 nm – 780 nm
• Panchromatic
• RGB
– Near infrared (NIR, b/w): wavelength = 0.7 µm – 1.3 µm
– Colour infrared (CIR): wavelength = 0.4 µm – 1.3 µm

12
Types of Sensors
 Optical
◦ Frame cameras (perspective geometry)
 Film cameras
 Digital cameras
 Terrestrial cameras
◦ Whisk broom
 Landsat
◦ Push broom
 SPOT, IRS, IKONOS
 Lidar
 Microwave
 Airborne and satellite Synthetic Aperture radar
systems

13
Geometry of a Frame Camera

14
Camera Opening angle, film format
and focal length

15
Orientation
 Orientation refers to the process of
reconstructing the geometry of the images.
 Traditional Photogrammetry:
◦ Inner Orientation
◦ Relative Orientation
◦ Absolute Orientation
 Digital Photogrammetry:
◦ Interior Orientation
◦ Exterior Orientation
16
Interior Orientation

 Process of transforming photo coordinates
to the image coordinate system.
 Photo coordinate system:
◦ Origin – lower left corner (similar to that of
raster images)
 Image coordinate system:
◦ Origin at the Principal Point

17
Interior Orientation

18
Interior Orientation
 The 3 parameters of interior orientation are:
◦ Principal point H’ (x0, y0)
◦ Principal distance c (c ~ focal length)
 Additionally, the parameters of the lens
distortion are sometimes also considered
parameters of the interior orientation.
 H’ and c are calibrated in the laboratory and are
assumed to be known for image
measurements.
 Alternatively, in-situ calibration can be used
(requires GCPs).

19
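For scanned film, the transformation from photo (pixel) coordinates to image coordinates is commonly estimated as a 2D affine transformation using the fiducial marks. Below is a minimal sketch; the fiducial values are hypothetical and a plain affine (no lens-distortion correction) is assumed:

```python
import numpy as np

# Hypothetical fiducial positions: measured in the scanned photo (pixels, origin
# lower left) and their calibrated values from the camera report (mm).
pix = np.array([[150.0, 145.0], [16290.0, 150.0], [16285.0, 16295.0], [145.0, 16290.0]])
cal = np.array([[-113.0, -113.0], [113.0, -113.0], [113.0, 113.0], [-113.0, 113.0]])

# Fit the affine x' = a0 + a1*px + a2*py, y' = b0 + b1*px + b2*py by least squares.
A = np.hstack([np.ones((len(pix), 1)), pix])
ax, *_ = np.linalg.lstsq(A, cal[:, 0], rcond=None)
ay, *_ = np.linalg.lstsq(A, cal[:, 1], rcond=None)

def photo_to_image(px, py, x0=0.0, y0=0.0):
    """Pixel coordinates -> image coordinates (mm) with origin at the principal point."""
    x = ax[0] + ax[1] * px + ax[2] * py
    y = ay[0] + ay[1] * px + ay[2] * py
    return x - x0, y - y0

print(photo_to_image(8220.0, 8220.0))  # a point near the image centre
```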
Practical snapshot

20
Exterior Orientation
 Relationship between image and object
space.
 Accomplished by determining the camera
position (exposure station) in the object
coordinate system.
 Camera Position determined by:
◦ The location of its perspective centre, including its
altitude (X0, Y0, Z0)
◦ Its attitude: three independent angles kappa, phi
and omega

21
Exterior Orientation

22
Exterior Orientation

• Parameters of Exterior Orientation: X0, Y0, Z0, ω, φ, κ

23
Collinearity Equation / Model
 These six orientation parameters can be
conveniently solved with the collinearity model.
 Collinearity model:
◦ The perspective centre (S), the image point
(a), and the object point (A) must lie on a
straight line (as in the previous figure).

24
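For reference, one standard form of the collinearity equations sketched on the following slides is given below; the exact arrangement of the rotation-matrix elements depends on the rotation convention, so treat this as one common variant rather than the slides' exact notation:

$$x = x_0 - c\,\frac{r_{11}(X - X_0) + r_{21}(Y - Y_0) + r_{31}(Z - Z_0)}{r_{13}(X - X_0) + r_{23}(Y - Y_0) + r_{33}(Z - Z_0)}$$

$$y = y_0 - c\,\frac{r_{12}(X - X_0) + r_{22}(Y - Y_0) + r_{32}(Z - Z_0)}{r_{13}(X - X_0) + r_{23}(Y - Y_0) + r_{33}(Z - Z_0)}$$

where (x, y) are the image coordinates of point a, (x0, y0, c) the interior orientation, (X0, Y0, Z0) the projection centre S, (X, Y, Z) the object point A, and r_ij the elements of the rotation matrix R(ω, φ, κ).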
Collinearity Equation / Model

25
Collinearity Equation / Model

26
Exterior Orientation
 From the given equation, it can be seen that
a high number of control points is needed
to perform exterior orientation of a block.
 For a single model:
◦ the number of unknown parameters is 12 (6
parameters per photograph)
◦ the number of observation equations per point
is 4 (a point lies in 2 images and in each
image we measure 2 image coordinates)
◦ hence at least 3 full control points are required
to orient the model.

27
Exterior Orientation
 Question
◦ Can we orient a model with 2 planimetric
GCPs and 3 height GCPs?
◦ Use r = n − u: if r = 0 the problem can just be
solved, if r > 0 it can be solved with redundancy
(better accuracy), and if r < 0 it cannot be solved.

28
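One possible worked check for the question above, assuming every GCP is measured in both images of the model and the unknown coordinate components of the partial GCPs are carried as additional unknowns:
◦ Unknowns: u = 12 (exterior orientation) + 2·1 (Z of the planimetric GCPs) + 3·2 (X, Y of the height GCPs) = 20
◦ Observations: n = 5 points × 2 images × 2 image coordinates = 20
◦ r = n − u = 0, so under these assumptions the model can just be solved, with no redundancy.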
Exterior Orientation
Which one is better?

29
Exterior Orientation
 Von-Gruber Positions
◦ For ideal distribution of GCPs.

30
Tie Points
 Points used to tie one image to the other
or one strip to the other

31
Aerial Triangulation
 Concept of models, strips, and blocks
 Methods
◦ Analog
◦ Analytical
– Polynomial strip and block adjustment
– Independent model method
– Bundle block adjustment
◦ Digital
– Automatic tie point measurements

32
Aerial Triangulation
 The process of densifying and extending
ground control through computational means.
 The operation includes establishing ground control
points; performing interior orientation; measuring
and transferring all tie, check, and control points
appearing on all photographs manually; and
performing a least squares block adjustment.
 Results of Aerial Triangulation – parameters of the
exterior orientation and 3D coordinates of the tie
points in the object coordinate system.

33
Direct Orientation
 Modern airborne data acquisition systems
are equipped with precise navigation tools:
GPS and INS
 GPS gives the coordinates of the projection centre
to an accuracy of ~10–20 cm
 INS gives the attitude (3 angles) to an accuracy of
~ a few mgon

Isn’t this good enough for orientation?
(Is AT still necessary?)

34
Is AT still necessary?
 Intricate relationship between interior
and exterior orientation
◦ Errors of IO propagate to object space, e.g. an error
in the focal length Δc causes an elevation error
Δh = s · Δc
◦ Errors of IO are (almost) compensated by
indirect orientation
 Calibration of GPS, IMU and imaging
sensor is not simple!

35
Typical Workflow of AT Project
 Planning
 Establish control points
 Preparation
 Point transfer
 Measurements
 Block adjustment
 Analysis

36
Planning
 Optimal design of flight lines to
◦ Assure overlap (60% forward, 20% strip
overlap); see the sketch after this slide
◦ Minimize flying time
◦ Consider cross flight lines
◦ Take existing GCPs into account
 Flight time restrictions
 Weather

37
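A minimal planning sketch based on the overlap figures above; the camera focal length and flying height below are assumed example values, not taken from the slides:

```python
# Photo base and strip spacing from forward and side (strip) overlap.
image_side_m = 0.23       # 23 cm frame (as used earlier in the slides)
focal_length_m = 0.153    # assumed wide-angle camera, 15.3 cm
flying_height_m = 3000.0  # assumed flying height above ground
forward_overlap = 0.60    # 60 % forward overlap
strip_overlap = 0.20      # 20 % strip (side) overlap

scale_number = flying_height_m / focal_length_m     # image scale number m_i
ground_side = image_side_m * scale_number           # ground coverage of one photo
photo_base = ground_side * (1 - forward_overlap)    # distance between exposures
strip_spacing = ground_side * (1 - strip_overlap)   # distance between flight lines

print(f"scale 1:{scale_number:.0f}, footprint {ground_side:.0f} m, "
      f"base {photo_base:.0f} m, strip spacing {strip_spacing:.0f} m")
```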
Planning

38
Ground Control Points (GCP)
 Select GCPs based on
◦ Accuracy considerations (e.g. along block
boundaries)
◦ Minimum number of points (economical
reasons)
◦ Measuring techniques
◦ Marking GCPs semi-permanently
 Timing
◦ Before or after data acquisition

39
Establishing Control Points
 Distribution and quality of control
points greatly affect accuracy
 Planimetric control points along
periphery
 Elevation control points regularly
distributed within block
 GPS on airplane reduces number of
ground control points

40
GCPs

41
Preparations
 Check data acquisition (quality,
overlap, etc.)
 Prepare photo mosaic
 Identify GCPs
 Select suitable tie points
◦ Uniquely annotate points
 same point in object space appears on several
images - should have same ID
 Manual (Analog/Analytical)
 Automatic (Digital AT)

42
Point Transfer
 Selected tie points must be measured
on all images where the tie point occurs
◦ A tie point may not be identifiable on all images
◦ On stereo measuring systems the tie point must be
identified on one image only
 Analog/Analytical AT
◦ Most precise solution: identify natural points and
make sketches to aid identification
◦ Most economical solution: mark 3 points in the
center of image

43
Measurements
 Photo coordinates on
◦ comparators (mono or stereo)
◦ analytical plotters
 Model coordinates on
◦ (analog) stereo plotters
◦ analytical plotters
◦ computationally from photo coordinates by relative
orientation
 Automatic Tie point measurements
◦ Digital AT
◦ Image Matching

44
Automatic Tie Point Measurements

45
Adjustment
 Strip adjustment
◦ Strip coordinates measured on stereo plotter
(base in/base out capability) or on analytical
plotters
◦ Strip adjusted by polynomials
 Block adjustment
◦ Independent models
◦ Bundle block adjustment

46
Concept of Independent Models

47
Bundle Block Adjustment

48
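To make the idea of a bundle block adjustment concrete, here is a minimal, illustrative sketch: the six exterior-orientation parameters of every image and the 3D coordinates of the tie points are estimated together by least squares on the collinearity condition. The rotation convention, the parameter ordering and the helper names are assumptions for the sketch, not the slides' (or any specific software's) implementation.

```python
import numpy as np
from scipy.optimize import least_squares

C = 0.153  # assumed principal distance (m); principal point taken as (0, 0)

def rotation(omega, phi, kappa):
    """Rotation matrix R = Rz(kappa) @ Ry(phi) @ Rx(omega) (one common convention)."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def project(eo, point):
    """Collinearity projection of one object point into one image."""
    X0, Y0, Z0, om, ph, ka = eo
    d = rotation(om, ph, ka).T @ (point - np.array([X0, Y0, Z0]))
    return np.array([-C * d[0] / d[2], -C * d[1] / d[2]])

def residuals(params, n_images, observations):
    """observations: list of (image index, point index, measured x, measured y)."""
    eos = params[:6 * n_images].reshape(n_images, 6)
    pts = params[6 * n_images:].reshape(-1, 3)
    res = []
    for img, pt, x, y in observations:
        res.extend(project(eos[img], pts[pt]) - np.array([x, y]))
    return np.asarray(res)

# Usage idea: stack approximate EO parameters and tie-point coordinates into
# params0, then refine them with
#   result = least_squares(residuals, params0, args=(n_images, observations))
# Control points would be handled by fixing (or heavily weighting) their coordinates.
```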
Analysis
 Check block adjustment
◦ Blunders
◦ Connection of models or images
 Error analysis
◦ Variance component (σo)
 For a typical AT project, σo should be less than 1/3
pixel.
◦ Errors on control points
◦ Errors on check points

49
Graphical Analysis

50
Photogrammetric Acquisition
 Height accuracy
◦ Terrain points: σZ = 0.1 ‰ of the flying height h
◦ Limitations by grass, clods, etc.
◦ Even better for signalized points or artificial
surfaces
◦ Not really dependent on terrain attitude

51
Automatic DTM generation

52
Automatic DTM generation

53
Automatic DTM generation

54
Mapping
 Content of map dependent on
application
 Photogrammetric mapping for
cadastre:
◦ problem with roof overhang
 Result of photogrammetric mapping
as input to cartographic process

55
Mapping
 Relation map-scale to image scale
 Two contradictory elements:
◦ Accuracy (0.1mm in map)
◦ Perceptibility of small elements
◦ At large image scales, accuracy is decisive
◦ At small image scales, perceptibility is decisive
◦ Empirical relation: von Gruber rule

56
Von Gruber Rule
 mi = c * sqrt(mm)
with
mi = image scale factor
mm = map scale factor
c = 200-300 (usually 250)
 Not strictly to be followed
 Rule of thumb!

57
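A short worked example (illustrative values, not from the slides): for a 1:25,000 topographic map, mm = 25,000; with c = 250 the rule gives mi = 250 · √25,000 ≈ 39,500, i.e. an image scale of roughly 1:40,000.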
Von Gruber Rule

58
DTM editing
 Through stereoscopic vision
◦ Examine the position of the DTM points.
◦ Click on the point and scroll the mouse to
“just touch” the ground point.

59
Image Matching

60
Image matching
 Image – to – image matching
 Matching images with object models
 Task of image matching (general):
◦ Determine the corresponding elements of
two data sets.
 The elements in the data sets are
descriptions of the image and of the object
model

61
Matching correspondence

62
Image Matching scenarios and
approach

63
Classification of Image Matching
 Intensity based image matching
 Feature based image matching
 Relational image matching

64
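As an illustration of intensity-based matching (the cross correlation shown on the following slides), here is a minimal normalised cross-correlation sketch; the brute-force search and the function names are illustrative assumptions:

```python
import numpy as np

def ncc(template, patch):
    """Normalised cross-correlation coefficient of two equally sized patches."""
    t = template - template.mean()
    p = patch - patch.mean()
    denom = np.sqrt((t * t).sum() * (p * p).sum())
    return (t * p).sum() / denom if denom > 0 else 0.0

def match(template, search):
    """Slide the template over the search window; return the best offset and score."""
    th, tw = template.shape
    best_score, best_pos = -1.0, (0, 0)
    for r in range(search.shape[0] - th + 1):
        for c in range(search.shape[1] - tw + 1):
            score = ncc(template, search[r:r + th, c:c + tw])
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Example use: locate a 15x15 template from the left image inside a 61x61
# search window of the right image (arrays of float grey values).
```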
Cross correlation

65
Correlation

66
Correlation Applications

67
Least squares matching (LSM)

68
Least squares matching

69
Least Squares Estimation

70
Feature based matching
 Features instead of image intensities
 Features: points, lines, blobs, high-curvature
lines
 Reasons for using features rather than
intensities:
◦ Features are more robust – less sensitive to
model errors
◦ Features are often sufficient to describe the image
context
◦ Features are computationally less demanding

71
Feature based matching
 Image content to be converted to
features by using different interest
operators

72
Feature based matching

73
Relational matching
 Match features and relations among them

74
Orthophotos

75
Orthophoto
 Basics
◦ Aerial photos – Perspective Projection
◦ Maps – Orthographic Projection
 When the perspective projection of aerial
photos is converted to an orthographic
projection, the result is an orthophoto
 Process – Differential rectification
 Orthophotos have no tilt or relief
displacement.
76
Orthophoto
Inputs
 Aerial photo
 Interior and Exterior Orientation
Parameters of the image
 DTM (of the extent of the image)
Output
 Aerial photo with orthographic
projection (orthophoto)

77
Differential rectification
Basic Ideas
• Project image to surface model
• Project textured surface onto XY plane

Process
• For each point in the orthophoto:
◦ Determine the surface point.
◦ Map the surface point to the original camera image.
◦ Read out the intensity value for the orthophoto.
78
Differential rectification
 Define pixel positions in the orthophoto plane
 Interpolate surface points at those positions
 Project the 3D surface points to image points
 Interpolate the intensity value for the orthophoto
from the neighbourhood in the original image
79
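The steps on the previous slide can be sketched as follows; `project_to_image` (ground-to-image via the collinearity equations with known interior and exterior orientation) and `dtm_height` (interpolation in the DTM) are assumed helper functions, and nearest-neighbour resampling is used for brevity:

```python
import numpy as np

def make_orthophoto(image, project_to_image, dtm_height, x_min, y_max, gsd, n_rows, n_cols):
    """Differential rectification: fill a regular ground grid with resampled grey values."""
    ortho = np.zeros((n_rows, n_cols), dtype=image.dtype)
    for i in range(n_rows):
        for j in range(n_cols):
            # 1. Ground position of this orthophoto pixel (regular grid in XY).
            X = x_min + j * gsd
            Y = y_max - i * gsd
            # 2. Interpolate the surface height from the DTM.
            Z = dtm_height(X, Y)
            # 3. Project the 3D surface point into the original image.
            col, row = project_to_image(X, Y, Z)
            # 4. Resample the grey value (nearest neighbour here; bilinear or
            #    bicubic would interpolate from the neighbouring pixels instead).
            r, c = int(round(row)), int(round(col))
            if 0 <= r < image.shape[0] and 0 <= c < image.shape[1]:
                ortho[i, j] = image[r, c]
    return ortho
```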
DTM Interpolation

80
Transformation from Ground to
Image

81
Resampling
 Resampling of gray values of the aerial
image into the orthoimage.
 Resampling can be done by nearest
neighbour, bilinear or bicubic interpolation
(see the sketch after this slide)
 Usually a compromise between
computational cost and accuracy

82
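A minimal bilinear resampling sketch (the usual compromise mentioned above); `row` and `col` are assumed to be fractional image positions obtained from the ground-to-image transformation:

```python
import numpy as np

def bilinear(image, row, col):
    """Bilinear interpolation of a grey value at a fractional (row, col) position.
    Assumes 0 <= row < rows - 1 and 0 <= col < cols - 1."""
    r0, c0 = int(np.floor(row)), int(np.floor(col))
    dr, dc = row - r0, col - c0
    window = image[r0:r0 + 2, c0:c0 + 2].astype(float)  # the four neighbouring pixels
    return ((1 - dr) * (1 - dc) * window[0, 0] + (1 - dr) * dc * window[0, 1]
            + dr * (1 - dc) * window[1, 0] + dr * dc * window[1, 1])
```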
Orthophotomap
Combines the advantages of a map and an
orthophoto.
 Uniform scale
 Relation to object coordinate system
 High information density
 Marginal information

83
Orthophotomap

84
Orthomosaics
 Combining orthophotos into a seamless
mosaic.
 Input:
◦ Geocoded Orthophotos
◦ Definition of mosaic areas to be generated
◦ Setup parameters for radiometric adjustment
◦ Automatic Processing
 Output:
◦ Seamless, radiometrically adjusted
orthomosaic as a single file
85
Orthomosaics

86
Orthomosaics

87
Feature Extraction
Methods
 Manual - Stereo measurements in DPWS
 Semi automatic - inJECT
 Automatic - Building Generator

88
Digital Photogrammetric
Workstations (DPWs)
 Also called softcopy workstation
 Role: equivalent to that of analytical plotters in
analytical photogrammetry
 Characteristics:
◦ Able to store and process large amounts of data
◦ Provisions for stereoscopic vision
◦ Simplified Graphic User Interface
 Examples:
◦ ERDAS Imagine Photogrammetric Suite
◦ LPS
◦ SOCET SET
◦ Inpho

89
DPWs

90
Basic System Components of a
DPW

91
Stereoscopic viewing in a DPW
 The left and right images must be
separated.

92
Stereoscopic viewing in a DPW
 2 monitors + stereoscope

93
Stereoscopic viewing in a DPW
 1 monitor + stereoscope (split screen)

94
Stereoscopic viewing in a DPW
 Temporal shifting
◦ The most popular and widely used method.
◦ A single monitor is used
◦ Benefits of exploiting full resolution of the
display

95
Temporal shifting

96
Advantages of a DPW
 Image processing capabilities
 Point transfer devices and comparators are
no longer required
 The absence of moving mechanical-optical parts
makes DPWs more reliable and accurate
 Flexibility in viewing and measuring several
images simultaneously
 Several persons can stereoscopically view a
model

97
Photogrammetric products
 Three main products from photo/images
◦ Terrain elevation data (DSM/DTM generation)
◦ Orthophoto/orthoimage (Image mapping)
◦ Feature data (Topographic mapping)

Feature extraction
 To obtain meaningful descriptions of
selected parts of the world
 Mainly topographic features
 Natural as well as artificial features

Photography to topography
 What is in a photograph?
 What should be included in a topographic
map?

Topographic features
 Definite core of spatial reference data
forming a digital national framework for
other spatial data. [OS MasterMap]
 Nationally consistent basic spatial data that
locate and describe the Earth’s surface and
features. The National Map, a digital
database, will serve as a foundation for
integrating and sharing spatial data. [USGS
National Map]

Feature extraction
Steps/workflow
 Project definition
 Flight planning
 Ground control and signalization
 Aerial survey flight
 Film processing and scanning
 Orientation
 Aerial triangulation
 Feature extraction
◦ Image interpretation
◦ Data modelling
◦ specification
◦ Data extraction/digitizing
◦ Quality of extracted feature
 Check plots
 Field verification
 GIS processing
◦ Database
◦ Map

Image interpretation
 Spontaneous recognition
◦ Recognise the details without any difficulty
because they are familiar
 Logical inference
◦ Needs interpretation elements or cues and
logic for recognising the detail

Interpretation elements
 Shape
◦ Shape of the detail on the photo, e.g. Linear, rectangular
 Size
◦ Size of the object, e.g. Big, small
 Colour (hue, value)
◦ Colour, brightness, intensity values etc of the detail, e.g. Red, dark blue, light green
 Texture
◦ Surface roughness or smoothness of the detail
 Pattern
◦ Homogeneous structures or repeating patterns, e.g. Agricultural fields, regularly
spaced building structures
 Association
◦ Attachment or association of one detail with other, together help to identify the
detail, e.g. Big building and open ground with some recreation and sports structure
could be a school/college
 Shadow
◦ The height of a detail and the shadow cast by it can be helpful to recognise the detail

Data model
 Basic data type
◦ Point
◦ Line
◦ Polygon
 Proper combination of these data types to
represent the real topographic surface of the
earth and yield meaningful and logical
information

Data model

Data model

Data model

Object definition and Specification
