IP Segmentation L6 Compressed
Jayanta Mukhopadhyay
Dept. of CSE,
IIT Kharagpur
Segmentation
- Partitioning image pixels into meaningful non-overlapping sets.
- Let R be the entire spatial region occupied by the image.
- Segmentation is a process to partition R into n sub-regions R1, R2, ..., Rn so that:
  - each Ri is a connected set, i = 1, 2, ..., n,
  - ⋃_{i=1}^{n} Ri = R,
  - Ri ∩ Rj = ∅ for all i, j with i ≠ j,
  - Q(Ri) = TRUE for all i, where Q is a logical predicate defined over the pixels of Ri.
Edge detection: Gradient operations

A simple difference mask: g(x, y) = (1)·f(x, y+1) + (−1)·f(x, y), i.e. the pair of weights [−1, 1].

Prewitt operator:

    -1  0  1         1   1   1
    -1  0  1         0   0   0
    -1  0  1        -1  -1  -1

(responds with 6 times the gradient value in any direction)
Robust gradient computation: Sobel operator

    -1  0  1         1   2   1
    -2  0  2         0   0   0
    -1  0  1        -1  -2  -1

The central row/column of the simple difference masks is weighted by 2 (1×, 2×, 1× weighting), which adds smoothing along the edge direction.
(responds with 8 times the gradient value in any direction)
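The Sobel masks above can be applied with a plain 2-D sliding-window sum. The sketch below is illustrative only (the helper names `convolve2d` and `sobel_magnitude` are my own, not from the slides); it shows the operator responding exactly at a vertical step edge.

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]])
SOBEL_Y = np.array([[ 1,  2,  1],
                    [ 0,  0,  0],
                    [-1, -2, -1]])

def convolve2d(img, kernel):
    """Valid-mode sliding-window correlation with a 3x3 kernel."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i+3, j:j+3] * kernel)
    return out

def sobel_magnitude(img):
    gx = convolve2d(img, SOBEL_X)
    gy = convolve2d(img, SOBEL_Y)
    return np.hypot(gx, gy)

# A vertical step edge: the gradient responds only around the transition.
img = np.zeros((5, 6))
img[:, 3:] = 1.0
mag = sobel_magnitude(img)
```

For a unit step the magnitude is 4 at the two columns straddling the transition and 0 over the flat regions; since the mask is antisymmetric, correlation and convolution differ only in sign, which the magnitude discards.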
Results of gradient operations
Courtesy: R.C. Gonzalez and R.E Woods © 1992-2008
Example (Gradient Image)
Example (Post Smoothing)
Example (Sobel: Comparison)
Example (Sobel Diagonal)
Laplacian operator

∇²f(x, y) = ∂²f(x, y)/∂x² + ∂²f(x, y)/∂y²

Two common masks (4-neighbor and diagonal-neighbor):

     0  1  0          1  0  1
     1 -4  1    OR    0 -4  0
     0  1  0          1  0  1

Weighted combination: 1× the first mask plus 4× the second, scaled by 1/5:

           4   1   4
    1/5 ×  1 -20   1
           4   1   4
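As a minimal sketch (my own illustration, not from the slides), the 4-neighbor Laplacian mask can be applied directly to show its characteristic response: zero over constant regions, and a strong −4/+1 pattern at an isolated bright point.

```python
import numpy as np

# 4-neighbor Laplacian mask from the slide.
LAP4 = np.array([[0,  1, 0],
                 [1, -4, 1],
                 [0,  1, 0]])

def laplacian(img):
    """Valid-mode application of the 3x3 Laplacian mask."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i+3, j:j+3] * LAP4)
    return out

# An isolated bright point: response is -4 at the point, +1 at its
# 4-neighbours, and 0 everywhere the image is constant.
img = np.zeros((5, 5))
img[2, 2] = 1.0
lap = laplacian(img)
```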
Types of discontinuities in edges
- Isolated point
- Line (or roof)
- Flat segment, step, ramp

Behaviour of the derivatives across these profiles:
- First derivative: its magnitude is high during the transition (constant along a ramp, an impulse at a step).
- Second derivative: it changes sign at the point of transition, producing a zero crossing.
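These two behaviours can be checked on a hand-made 1-D profile (my own example values, chosen to contain a ramp followed by a step): the first difference is non-zero exactly during the transitions, and the second difference changes sign at the step.

```python
import numpy as np

# Profile: flat (2), ramp up to 5, flat, then a step down to 1.
f = np.array([2, 2, 2, 3, 4, 5, 5, 5, 1, 1], dtype=float)

d1 = np.diff(f)        # first derivative (finite difference)
d2 = np.diff(f, n=2)   # second derivative
```

`d1` stays at 1 along the ramp and spikes to −4 at the step; `d2` is non-zero only at the ends of the ramp and shows a −4/+4 sign change (a zero crossing) at the step.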
Edge detection from first and second order derivatives
- First order derivative
  - Generally produces thicker edges.
  - Use the magnitude of the derivative.
- Second order derivative
  - Stronger response to fine details such as thin lines, isolated points and noise.
  - More sensitive to noise.
  - 'Double edge response' at the point of discontinuity: produces two values for every edge.
  - Zero crossing (change of sign) gives better localization; the sign change can be used to qualify the transition.
Smoothing and differentiation (Marr-Hildreth)

g(x, y) = ∇²[f(x, y) ∗ G(x, y)] = f(x, y) ∗ ∇²G(x, y)

A 5×5 mask approximating the Laplacian of Gaussian (LoG):

     0   0  -1   0   0
     0  -1  -2  -1   0
    -1  -2  16  -2  -1
     0  -1  -2  -1   0
     0   0  -1   0   0

- Filter the image using an n×n LoG filter.
- To retain the effect of zero-crossings, n > 2√2·σ.
- Detect zero-crossings.
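The two steps above can be sketched directly with the 5×5 LoG mask (the helpers `filter5` and `zero_crossings` are my own names for this illustration): filter the image, then mark sign changes between neighbouring responses.

```python
import numpy as np

LOG5 = np.array([[ 0,  0, -1,  0,  0],
                 [ 0, -1, -2, -1,  0],
                 [-1, -2, 16, -2, -1],
                 [ 0, -1, -2, -1,  0],
                 [ 0,  0, -1,  0,  0]])

def filter5(img, kernel):
    """Valid-mode application of a 5x5 kernel."""
    h, w = img.shape
    out = np.zeros((h - 4, w - 4))
    for i in range(h - 4):
        for j in range(w - 4):
            out[i, j] = np.sum(img[i:i+5, j:j+5] * kernel)
    return out

def zero_crossings(resp):
    """Mark pixels where the response changes sign against the right
    or lower neighbour."""
    zc = np.zeros(resp.shape, dtype=bool)
    zc[:, :-1] |= (resp[:, :-1] * resp[:, 1:]) < 0
    zc[:-1, :] |= (resp[:-1, :] * resp[1:, :]) < 0
    return zc

img = np.zeros((9, 12))
img[:, 6:] = 10.0            # vertical step edge at column 6
resp = filter5(img, LOG5)
zc = zero_crossings(resp)
```

The LoG response swings negative then positive across the step, and the zero crossing lands in a single column right at the edge.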
Approximation of LoG by DoG
A kind of band-pass filter (BPF).

DoG(x, y) = (1/(2πσ1²)) e^(−(x²+y²)/(2σ1²)) − (1/(2πσ2²)) e^(−(x²+y²)/(2σ2²))
Example
1. Original image
2. Sobel gradient
3. LoG
4. Thresholded LoG
5. Zero crossings

LoG: Example
- SW: Original
- SE: LoG
- NW: Thresholded
- NE: Zero-Crossings

LoG: Effect of Sigma
The Canny edge detector (1986)
- Three basic objectives:
  - Low error rate
    - All edge pixels are to be detected; no spurious edges.
  - Good localization
    - Detected points are to be as close as possible to the true edge points.
  - Single edge point response
    - Only one point for each true edge point.
- A set of optimization criteria involving the above objectives.
- Difficult to get a closed-form optimal solution.
Optimal detector for noisy step edges
- In 1-D:
  - Derive responses for an ideal step edge in 1-D contaminated by white Gaussian noise.
  - The optimal numerical solution is close to the first derivative of a Gaussian.
- Extension to 2-D:
  - The analysis would have to be repeated along the edge direction, which is not known a priori.
  - Smooth the image using a Gaussian filter.
  - Compute the gradient vector (e.g. apply the Sobel operator).
  - Perform non-maximal suppression.
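The non-maximal suppression step can be sketched as follows (my own minimal version, assuming `mag` and `angle` come from the smoothed gradient; here they are tiny hand-made arrays): a pixel survives only if its magnitude is a maximum along the quantized gradient direction.

```python
import numpy as np

def non_max_suppression(mag, angle):
    """Keep a pixel only if it is a local maximum along the gradient
    direction, quantized to 0/45/90/135 degrees."""
    h, w = mag.shape
    out = np.zeros_like(mag)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            a = angle[i, j] % 180.0
            if a < 22.5 or a >= 157.5:     # gradient along x: compare left/right
                n1, n2 = mag[i, j - 1], mag[i, j + 1]
            elif a < 67.5:                  # 45 degrees
                n1, n2 = mag[i - 1, j + 1], mag[i + 1, j - 1]
            elif a < 112.5:                 # gradient along y: compare up/down
                n1, n2 = mag[i - 1, j], mag[i + 1, j]
            else:                           # 135 degrees
                n1, n2 = mag[i - 1, j - 1], mag[i + 1, j + 1]
            if mag[i, j] >= n1 and mag[i, j] >= n2:
                out[i, j] = mag[i, j]
    return out

# A blurred vertical edge: a magnitude ridge several pixels wide is
# thinned to a single-pixel response.
mag = np.array([[0, 2, 4, 2, 0]] * 5, dtype=float)
angle = np.zeros_like(mag)       # gradient points along x everywhere
thin = non_max_suppression(mag, angle)
```

This is only the thinning stage; the full Canny pipeline would follow it with hysteresis thresholding (the strong/weak edge classification shown in the next example).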
Example: Canny
(e) White: strong edges; Grey: weak edges; Black: no edges.
(f) White: valid edges; Blue: chosen weak edges; Red: rejected weak edges.
14-Sep-11 Image Segmentation
Courtesy: R.C. Gonzalez and R.E Woods © 1992-2008
Edge linking
- Local linking
  - Link an edge pixel with another pixel in its neighborhood having similar magnitude and direction.
  - Computationally expensive.
  - Another technique: set edge pixels of a specific direction (within a tolerance) and of sufficient magnitude to 1, else to 0; then fill small gaps along that direction and compute connected components.
- Global linking
  - Hough transform: accumulate votes of possible sets of parameters for lines or curves passing through the edge pixels.
  - Local peaks in the parameter space provide the geometric lines or curves linking the edge points.
Hough transform
- Discretize the parameter space into bins.
- For each feature point in the image, put a vote in every bin in the parameter space that could have generated this point.
- Find the bins that have the most votes.

Image space: y = mx + b    →    Hough parameter space: (m, b)

P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959
Parameter space representation
- A straight line y = m1·x + b1 in the image corresponds to a point (m1, b1) in Hough space.
- A point (x1, y1) in the image corresponds to the line b = −x1·m + y1 in Hough space.
- Two points in the image correspond to two lines in Hough space; those lines intersect at the parameters of the line through both points.

Adapted from slides by S. Seitz
Parameter space representation
- Problem with the (m, b) space: an unbounded parameter domain (vertical lines have infinite slope).
- Use the normal form instead:

  x cos θ + y sin θ = ρ

  - θ varies from 0 to 180°.
  - ρ varies from 0 to the length of the diagonal of the image grid.
- Each point adds a sinusoid in the (θ, ρ) parameter space.

Adapted from slides by S. Seitz
Algorithm
A: accumulator array over (θ, ρ)
- Initialize the accumulator A to all zeros.
- For each edge point (x, y) in the image:
    For θ = 0 to 180:
        ρ = x cos θ + y sin θ
        A(θ, ρ) = A(θ, ρ) + 1
- Find the value(s) of (θ, ρ) where A(θ, ρ) is a local maximum.
- The detected line in the image is given by ρ = x cos θ + y sin θ.

Adapted from slides by S. Seitz
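The accumulator loop above translates almost line for line into code. A minimal sketch (my own, assuming edge points are given as (x, y) pairs, θ quantized to 1° steps and ρ rounded to integer bins, with an offset since ρ can be negative):

```python
import numpy as np

def hough_lines(points, diag):
    """Accumulate votes in (theta, rho) space for a set of edge points.

    diag: maximum |rho| to represent (e.g. the image diagonal length).
    """
    thetas = np.deg2rad(np.arange(180))
    acc = np.zeros((180, 2 * diag + 1), dtype=int)   # rho offset by diag
    for x, y in points:
        for t_idx, t in enumerate(thetas):
            rho = int(round(x * np.cos(t) + y * np.sin(t)))
            acc[t_idx, rho + diag] += 1
    return acc

# Ten collinear points on the horizontal line y = 3,
# i.e. theta = 90 degrees, rho = 3.
pts = [(x, 3) for x in range(10)]
acc = hough_lines(pts, diag=20)
t_idx, r_idx = np.unravel_index(acc.argmax(), acc.shape)
```

The peak bin collects one vote per collinear point, and the recovered ρ (after removing the offset) matches the line's distance from the origin. Neighbouring θ bins near 90° also collect full counts here because the quantization is coarse relative to the point spread; a real detector would take local maxima over a neighbourhood.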
Basic illustration: features and votes
http://ostatic.com/files/images/ss_hough.jpg

A more complicated image
http://ostatic.com/files/images/ss_hough.jpg
Binarization (Thresholding)
- A simple algorithm: choose a threshold value T and classify each pixel by comparing its brightness x with T.
- Work on the normalized histogram p(x) (frequency vs. brightness x).

Can you automate this operation?
[Thresholding results with Th = 156 and Th = 192]
A simple iterative algorithm
- Select an initial estimate of the global threshold T.
- Partition the pixels into background (x > T) and foreground (x ≤ T).
- Compute the means m1 and m2 of background and foreground, respectively.
- Obtain a new estimate of T: T = (m1 + m2)/2.
- Iterate the above steps till convergence.
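The iteration above is a few lines of code. A sketch on a 1-D array of pixel values (the function name and tolerance are my own choices for this illustration):

```python
import numpy as np

def iterative_threshold(pixels, tol=0.5):
    """Iterate T = (mean(foreground) + mean(background)) / 2 until stable."""
    T = pixels.mean()                  # initial estimate: global mean
    while True:
        fg = pixels[pixels <= T]       # foreground: x <= T
        bg = pixels[pixels > T]        # background: x > T
        T_new = 0.5 * (fg.mean() + bg.mean())
        if abs(T_new - T) < tol:
            return T_new
        T = T_new

# Two well-separated brightness populations around 50 and 200.
pixels = np.array([48, 50, 52, 49, 51, 198, 200, 202, 199, 201], dtype=float)
T = iterative_threshold(pixels)
```

With class means 50 and 200 the threshold converges to their midpoint, 125.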
Bayesian classification of foreground and background pixels
Pixels belong to two classes:
- w1: foreground
- w2: background
Compute p(w1|x) and p(w2|x) from the class-conditional densities p(x|w1), p(x|w2) and the overall density p(x).

Bayes' theorem:

  p(wi|x) = p(wi) p(x|wi) / p(x)

Bayes' classification rule:
Assign x to w1 if p(w1|x) > p(w2|x), else to w2 — equivalently, check whether p(w1) p(x|w1) > p(w2) p(x|w2).
Expectation-Maximization algorithm
1. Compute p(w1) and p(w2) from the proportional areas of each region:

     p(w1) = Σ_{x=0}^{Th} p(x),    p(w2) = 1 − p(w1)

2. Compute the parameters of p(x|w1), assuming it Gaussian:

     μ1 = Σ_{x=0}^{Th} x·p(x),    σ1² = Σ_{x=0}^{Th} x²·p(x) − μ1²

3. Compute the parameters of p(x|w2), assuming it Gaussian:

     μ2 = Σ_{x=Th+1}^{255} x·p(x),    σ2² = Σ_{x=Th+1}^{255} x²·p(x) − μ2²

4. Compute the new threshold value so that for x < Th, p(w1|x) > p(w2|x), and vice versa.
5. Iterate till the value of Th converges.
Otsu thresholding
Choose Th to maximize the between-class variance:

  σB² = p(w1)·p(w2)·(μ2 − μ1)²
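A direct implementation of this criterion scans all candidate thresholds and keeps the one with the largest between-class variance (a sketch with my own function name; the class means are normalized by the priors):

```python
import numpy as np

def otsu_threshold(p):
    """Return Th maximizing sigma_B^2 = p(w1) p(w2) (mu2 - mu1)^2
    over the normalized histogram p."""
    x = np.arange(len(p), dtype=float)
    best_th, best_var = 0, -1.0
    for th in range(1, len(p) - 1):
        w1 = p[:th + 1].sum()
        w2 = 1.0 - w1
        if w1 == 0.0 or w2 == 0.0:
            continue                       # no between-class variance defined
        mu1 = (x[:th + 1] * p[:th + 1]).sum() / w1
        mu2 = (x[th + 1:] * p[th + 1:]).sum() / w2
        var_b = w1 * w2 * (mu2 - mu1) ** 2
        if var_b > best_var:
            best_th, best_var = th, var_b
    return best_th

# Bimodal histogram with modes around 50 and 200.
p = np.zeros(256)
p[40:61] = 1.0
p[190:211] = 1.0
p /= p.sum()
th = otsu_threshold(p)
```

For this symmetric histogram the criterion is flat across the empty gap between the two modes, so any threshold inside the gap is optimal.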
An example
[Histograms of the test image with the computed thresholds]
Otsu: 157; Bayesian: 157
Probabilistic modeling with mixture of densities

  p(x) = Σ_{i=1}^{K} P(Gi) p(x|Gi)

where p(x|Gi) is the component density, P(Gi) the component proportion, and K the number of components.
- Gi defines the ith segment or cluster.
- K is a hyper-parameter and should be known.
- For a multivariate Gaussian distribution, p(x|Gi) ~ N(μi, Σi):

  p(x) = (1 / ((2π)^{d/2} |Σ|^{1/2})) · exp(−½ (x − μ)ᵀ Σ⁻¹ (x − μ))

- Estimate μi, Σi, and P(Gi) for all i from the set of i.i.d. input samples X = {xᵗ}, t = 1, 2, ..., N.
Mixture of Gaussians: probabilistic inference
- The technique can be refined by computing probabilities of belongingness to segments, using the Expectation-Maximization (EM) algorithm.
- Parametric PDF with mixing coefficients πk, mean vectors μk and covariance matrices Σk; the normalizing factor for the posterior of a point xi is

  Zi = Σ_k πk N(xi | μk, Σk)
The Lloyd algorithm (1957) (Batch K-Means)

K-means: example (k = 2)
- Compute partitions: assign each point to its nearest center.
- Update centers: move each center to the mean of its partition.
- Repeat; stop at no change (or a very little change) in the cluster centers.
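The two alternating steps above can be sketched for 1-D feature values (my own minimal version; real pixel features would typically be intensity or color vectors):

```python
import numpy as np

def kmeans(data, centers, n_iter=50):
    """Batch K-means (Lloyd): alternate nearest-center assignment and
    center update until the centers stop moving."""
    centers = np.array(centers, dtype=float)
    for _ in range(n_iter):
        # Step 1: compute partitions (nearest-center assignment).
        labels = np.argmin(np.abs(data[:, None] - centers[None, :]), axis=1)
        # Step 2: update centers to the mean of each partition.
        new_centers = np.array([data[labels == k].mean()
                                for k in range(len(centers))])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

data = np.array([1.0, 2.0, 1.5, 10.0, 11.0, 10.5])
centers, labels = kmeans(data, centers=[0.0, 5.0])
```

The two tight groups pull the centers to their means after a single update, and the assignment then no longer changes.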
Histogram analysis using K-Means clustering
- Compute the histogram h(x), x = 0, 1, 2, ..., N.
- Choose initial k brightness levels for the set of means, say m1, m2, ..., mk, such that 0 < m1 < m2 < ... < mk < N.
- Update the ith mean, for all i:

  mi = [ Σ_{x=(m(i−1)+mi)/2}^{(mi+m(i+1))/2} x·h(x) ] / [ Σ_{x=(m(i−1)+mi)/2}^{(mi+m(i+1))/2} h(x) ]
Kernel density estimation

  P(x) = (1/n) Σ_{i=1}^{n} K(x − xi)

- Kernel properties:
  - Normalized: ∫_{R^d} K(x) dx = 1
  - Symmetric: ∫_{R^d} x K(x) dx = 0
- Examples:
  - Uniform: K_U(x) = c if ||x|| ≤ 1, 0 otherwise
  - Gaussian: K_G(x) = c·e^(−||x||²/2)
Mode selection
- Compute the gradient of the distribution:

  ∇P(x) = (1/n) Σ_{i=1}^{n} ∇K(x − xi),   with  K(x − xi) = c·k(‖(x − xi)/h‖²)

  where h is the size of the window. With g(x) = −k′(x):

  ∇P(x) = (c/n) Σ_{i=1}^{n} ∇ki = (c/n) [ Σ_{i=1}^{n} gi ] · [ (Σ_{i=1}^{n} xi gi) / (Σ_{i=1}^{n} gi)  −  x ]

- Setting ∇P(x) = 0 gives the mean-shift vector

  m(x) = [ Σ_{i=1}^{n} xi g(‖(x − xi)/h‖²) ] / [ Σ_{i=1}^{n} g(‖(x − xi)/h‖²) ]  −  x

  and the iterative update x⁽ⁱ⁺¹⁾ = x⁽ⁱ⁾ + m(x⁽ⁱ⁾).
Mean shift algorithm
- Searches a mode (local maximum of density) of a given distribution from a starting point:
  - Choose a search window (width and location).
  - Compute the mean of the data in the search window.
  - Center the search window at the new mean location.
  - Repeat until convergence; the converged point is a mode.
- Do the same from every point.
- The set of points arriving at the same mode forms a segment.
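The loop above, sketched in one dimension with a uniform (flat) kernel (my own choices of window width and tolerance): the mean-shift vector is simply the window mean minus the current position, so the update is "move to the window mean".

```python
import numpy as np

def mean_shift(samples, start, h=3.0, tol=1e-6, n_iter=100):
    """Climb to a density mode: repeatedly move to the mean of the
    samples inside the window of half-width h."""
    x = float(start)
    for _ in range(n_iter):
        window = samples[np.abs(samples - x) <= h]
        m = window.mean() - x          # mean-shift vector m(x)
        x += m
        if abs(m) < tol:
            break                      # converged: x is a mode
    return x

# Two clusters: one around 2, one around 10.5.
samples = np.array([1.0, 2.0, 3.0, 2.0, 2.5, 10.0, 11.0, 10.5])
mode_lo = mean_shift(samples, start=0.0)
mode_hi = mean_shift(samples, start=12.0)
```

Starting points near either cluster climb to that cluster's mode; grouping points by their converged mode yields the segments.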
Mean shift analysis of histogram
- Compute the histogram h(x), x = 0, 1, 2, ..., N.
- For each x determine the mode m(x) in the histogram.
- Ensure monotonicity in m(x) ∈ {0, 1, 2, ..., N}: m(x1) ≤ m(x2) for x1 < x2.
- Prune spurious modes, requiring:
  - sufficient support (brightness interval),
  - sufficient gap between each adjacent distinct pair, and
  - sufficient strength (number of pixels).
- Get the brightness interval for each mode.
Mean shift analysis of histogram and segmentation
Modes: 32, 67, 117, 130
Intervals: yellow: [1, 51]; red: [52, 82]; blue: [83, 128]; white: [129, 256]
Segmentation results from various approaches

Neighborhood definitions:

  4-neighbors:   .  o  .      8-neighbors:   o  o  o
                 o  x  o                     o  x  o
                 .  o  .                     o  o  o
Component Labeling
- Form a graph with edges between neighboring pixels having the same label.
- Compute connected components.
- Graph traversal algorithms.

   20  20  50  20
   20  20  50 100
   50  50  50 100
  100 100  20  20

Do you require an explicit graph representation? Can you compute the components using only the image array?
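The answer to the question above is no explicit graph is needed: a breadth-first flood fill over the image array visits exactly the edges the graph would contain. A sketch on the 4×4 example, using 4-neighbors (the function name is my own):

```python
from collections import deque

def label_components(img):
    """Label 4-connected components of equal-valued pixels, working
    directly on the image array (no explicit graph)."""
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for si in range(h):
        for sj in range(w):
            if labels[si][sj]:
                continue                       # already labeled
            next_label += 1
            labels[si][sj] = next_label
            q = deque([(si, sj)])
            while q:                           # BFS floods one component
                i, j = q.popleft()
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if (0 <= ni < h and 0 <= nj < w
                            and not labels[ni][nj]
                            and img[ni][nj] == img[i][j]):
                        labels[ni][nj] = next_label
                        q.append((ni, nj))
    return labels, next_label

img = [[20, 20, 50, 20],
       [20, 20, 50, 100],
       [50, 50, 50, 100],
       [100, 100, 20, 20]]
labels, n = label_components(img)
```

On this array the 20 at the top-right corner is a component of its own, separate from the 2×2 block of 20s, and the 50s form a single L-shaped component: six components in all.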
Examples of components
Flooding process and dam building
- T(i−1): number of connected components: 2.
- T(i): number of connected components: 1 → build a dam.

A decrease in the number of connected components between consecutive levels indicates the event of dam building.
Watershed segmentation
- T(i): set of pixels having brightness value < i.
- C(i−1): union of the catchment basins flooded up to level i−1.
- C(Xmin) = T(Xmin) and C(Xmax) = ⋃ C(Mi), the union of all catchment basins.
- Approach: incrementally compute C(i) from C(i−1).
  - Start from C(Xmin) and stop at C(Xmax).
  - Perform connected component analysis on T(i).
  - Check each component q of T(i) by computing q ∩ C(i−1).
Dam Construction
- M1, M2:
  - Sets of coordinates of points in the two regional minima.
- Cn−1(M1), Cn−1(M2):
  - Sets of coordinates of points in the catchment basins associated with M1 and M2 at stage n−1 of flooding (catchment basins up to the flooding level).
- C[n−1] = Cn−1(M1) ∪ Cn−1(M2)

Courtesy: Prof. P.P. Das, CSE, IIT Kharagpur
Dam Construction
- At flooding step n−1 there are two connected components; at flooding step n there is only one.
  - The water between the two catchment basins has merged at flooding step n.
- Use q to denote the single connected component.
- Steps:
  - Dilate Cn−1(M1) and Cn−1(M2) by the 3×3 SE, subject to:
    - the SE is constrained to q, and
    - the dilation cannot merge the sets.
Watershed segmentation: a few tips and issues
- Instead of working on an image itself, this technique is often applied on its gradient image.
  - Each object is then distinguished from the background by its up-lifted edges.
Watershed segmentation: typical results
- Use of a marker:
  - a connected region defining the pit or the dam.

a) Original image
b) Gradient image: 3×3 Sobel edge detector
c) Raw watershed
d) Controlled watershed with region marking
Summary
- Meaningful partitioning of image pixels
  - Edges, regions
- Edge operators: first order and second order derivatives (Laplacian)
  - Marr-Hildreth, Canny
- Analysis of the histogram
  - Binarization through classification: Bayes' classification rule.
  - Binarization through maximizing the inter-class variance: Otsu thresholding.
  - Finding peaks/valleys and declaring intervals of brightness values for each segment: Gaussian mixture model, mean shift algorithm.
Summary
- K-means clustering in the feature space, if the number of clusters is known.
- Component labeling for connected components.
- Watershed segmentation algorithm.
Thank You