Efficient Belief Propagation for Early Vision


1.Computing Messages

For most low-level vision applications, V(fp, fq) is based on the difference between the labels fp and fq. The form of equation (3) is commonly referred to as a min convolution.

Compute messages with the linear cost

Can use a two-pass algorithm to compute the message:

  initialize m(fq) with h(fq)
  forward pass, for fq from 1 to k - 1:
    m(fq) = min( m(fq), m(fq - 1) + s )
  backward pass, for fq from k - 2 down to 0:
    m(fq) = min( m(fq), m(fq + 1) + s )

[Figure: for s = 1, the message m(fq) = min over fp of ( s |fq - fp| + h(fp) ) is the lower envelope of cones rooted at the points (fp, h(fp)), here (0,3), (1,1), (2,4), (3,2), giving m = (2, 1, 2, 2).]

Second, with the Potts model the message computation is as fast as with the pure linear model: the time complexity also turns to O(k).

Truncated quadratic model:

  V(fp, fq) = min( c (fp - fq)^2, d )
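The two-pass update above can be written in a few lines (a minimal sketch, assuming h is a plain list of h(fq) values and s is the linear-cost slope):

```python
# Two-pass message update for the linear cost V(fp, fq) = s*|fp - fq|,
# a sketch of the min-convolution / distance-transform idea. The array
# h holds h(fp) = Dp(fp) + sum of incoming messages at p.
def linear_cost_message(h, s):
    m = list(h)                      # initialize m(fq) with h(fq)
    k = len(m)
    for fq in range(1, k):           # forward pass
        m[fq] = min(m[fq], m[fq - 1] + s)
    for fq in range(k - 2, -1, -1):  # backward pass
        m[fq] = min(m[fq], m[fq + 1] + s)
    return m

# The slides' example: k = 4, s = 1, h = (3, 1, 4, 2) gives m = (2, 1, 2, 2).
```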
Efficient Belief propagation for Early Vision
Pedro F. Felzenszwalb, University of Chicago; Daniel P. Huttenlocher, Cornell University. International Journal of Computer Vision, Oct. 2006. Presenter: 步文斌


MRF models

W(fp, fq): the cost of assigning labels fp and fq to two neighboring pixels (discontinuity cost); in low-level vision problems it is generally based on the difference between the labels.

Dp(fp): the cost of assigning label fp to pixel p (data cost).

Working with costs (min-sum) is less sensitive to numerical artifacts.

max-product BP works by passing messages around the graph defined by the four-connected image grid.
Message passing
[Figure: example node costs used to illustrate message passing.]
Message passing


m^t_pq: the message that node p sends to a neighboring node q at iteration t; it is a vector of n_labels dimensions, initialized to 0 for t = 0.

Step 1. Compute messages:

  m^t_pq(fq) = min over fp of [ V(fp, fq) + Dp(fp) + sum over s in N(p)\{q} of m^(t-1)_sp(fp) ]
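As a minimal sketch of this update (brute force, O(k^2) per message; the function and argument names are illustrative assumptions, not the paper's):

```python
# Brute-force min-sum message update: for each label fq, minimize over fp.
# V is the discontinuity cost, D_p the data-cost vector at p, and in_msgs
# the messages m^(t-1)_sp from p's neighbors other than q.
def compute_message(V, D_p, in_msgs, n_labels):
    h = [D_p[fp] + sum(m[fp] for m in in_msgs) for fp in range(n_labels)]
    return [min(V(fp, fq) + h[fp] for fp in range(n_labels))
            for fq in range(n_labels)]

# Example with a linear cost V = |fp - fq| and no incoming messages:
msg = compute_message(lambda a, b: abs(a - b), [3, 1, 4, 2], [], 4)
```

The two-pass trick later in the slides computes the same vector in O(k) for the linear cost.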
1.Computing Messages

Potts model: V(fp, fq) = 0 if fp = fq, and d otherwise.

Firstly compute min over fp of h(fp) once; then every message entry is

  m(fq) = min( h(fq), min_fp h(fp) + d )

so the time complexity turns to O(k).

The discontinuity costs used in practice, the Potts model, the truncated linear model (stereo), and the truncated quadratic model (image restoration), are all robust, truncated costs.
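A sketch of the O(k) Potts update, assuming h is a list of h(fp) values and d the Potts penalty (names are illustrative):

```python
# Potts-model message in O(k): V(fp, fq) = 0 if fp == fq, else d.
# The minimum of h is computed once and reused for every output label.
def potts_message(h, d):
    base = min(h) + d              # min over fp of h(fp), plus the penalty d
    return [min(hq, base) for hq in h]
```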
2.BP on the Grid Graph

BP can be performed more efficiently on a bipartite graph. The grid graph is bipartite: split the nodes into sets A and B so that every edge connects nodes in different sets. For a bipartite graph with node sets A and B, the messages mAB sent from nodes in A depend only on the messages mBA sent from nodes in B, and vice versa.
2.BP on the Grid Graph

Update: m^t_AB can be computed from m^(t-1)_BA, which in turn comes from m^(t-2)_AB. The messages m^(t-1)_AB are never used in this chain, so they need not be computed at all.

2.BP on the Grid Graph

So we alternately update the messages from A and from B.
Scheme: when t is odd, update m^t_AB; when t is even, update m^t_BA.
If t is odd (even), only the messages sent from A (B) change, so only half of the messages are updated at each iteration.
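The alternating schedule can be sketched as follows (a hypothetical update_fn stands in for the real message computation; which checkerboard parity is called A is an arbitrary choice here):

```python
# Checkerboard (bipartite) schedule sketch: at odd t update messages
# leaving "A" pixels, at even t messages leaving "B" pixels, so only
# half of the grid is touched per iteration.
def run_schedule(width, height, T, update_fn):
    for t in range(1, T + 1):
        parity = 1 if t % 2 == 1 else 0   # odd t -> A (parity 1), even t -> B
        for y in range(height):
            for x in range(width):
                if (x + y) % 2 == parity:
                    update_fn(x, y, t)
```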

Step 3. Select the label that minimizes bq(fq) individually at each node:

  fq* = argmin over fq of bq(fq)

Time complexity: O(n k^2 T)

  n: number of pixels
  k: number of possible labels (n_labels)
  T: number of iterations
Message passing

Finding a labeling with minimum energy corresponds to the MAP estimation problem; max-product BP is used to approximate the MAP solution. Taking negative logarithms turns max-product into min-sum.

Message passing

Step 2. Compute the belief vector bq for each node after T iterations:

  bq(fq) = Dq(fq) + sum over p in N(q) of m^T_pq(fq)

(one entry for each of the n_labels labels)
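Steps 2 and 3 together can be sketched as follows (assumed data layout: D_q is the data-cost vector at q, msgs the final messages m^T_pq from q's neighbors):

```python
# Belief vector and label selection: b_q(fq) = Dq(fq) + sum of final
# incoming messages, then pick the label minimizing the belief.
def select_label(D_q, msgs):
    b = [D_q[fq] + sum(m[fq] for m in msgs) for fq in range(len(D_q))]
    return min(range(len(b)), key=lambda fq: b[fq])   # fq* = argmin bq(fq)
```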
Outline

Markov random field (MRF) models
Message passing
3 techniques to reduce the O(n k^2 T) time:
  1. Computing messages: update in linear time
  2. BP on the bipartite graph: message passing schedule
  3. Multi-Grid BP
MRF models

P: the set of pixels in an image
L: a set of labels (e.g. intensities)
N: the edges in the four-connected image grid graph

A labeling f assigns a label fp in L to each pixel p in P.

Energy function:

  E(f) = sum over p in P of Dp(fp) + sum over (p, q) in N of W(fp, fq)
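The energy of a labeling can be evaluated directly (a sketch with assumed containers: f and D are dicts indexed by pixel, N a list of neighboring pixel pairs, W a callable discontinuity cost):

```python
# Energy of a labeling f on a grid: data costs Dp(fp) plus
# discontinuity costs W(fp, fq) over neighboring pairs.
def energy(f, D, N, W):
    data = sum(D[p][f[p]] for p in f)              # sum of Dp(fp)
    smooth = sum(W(f[p], f[q]) for (p, q) in N)    # sum of W(fp, fq)
    return data + smooth
```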
1.Computing Messages

First, consider the pure linear model:

  V(fp, fq) = s |fp - fq|

Message computation:

  m(fq) = min over fp of ( s |fp - fq| + h(fp) )

The minimization can be seen as the lower envelope of cones in the figure.
O(n k^2 T) takes too long, so 3 techniques are used:

1. Computing a message update in linear time ( O(k^2) to O(k) )
2. The bipartite graph message passing schedule ( O(n) to O(n/2) )
3. Multi-Grid BP

Message passing

In general, the idea of message passing can be illustrated by counting nodes: each node tells a neighbor how many nodes lie on its own side of the shared edge.

Ex 1. On a chain, counts flow in from both ends; any node recovers the same total from its incoming messages plus one for itself (the figure combines counts such as 3+2+1, 4+1, and 3+1).

Rules: a node that has received messages A, B, and C from its other neighbors sends A + B + C + 1. Or: with a single incoming message A, it sends A + 1.

Ex 2 (complex). At a branching node the same rules give, for incoming counts 2, 3, and 4, the outgoing messages 2+4+1 = 7, 2+3+1 = 6, and 3+4+1 = 8.
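The counting rules can be sketched as recursive message passing on a tree (an illustrative example of the idea, not the paper's algorithm; names are assumed):

```python
# Counting flavor of message passing: on a tree, the message p -> q is
# 1 (for p itself) plus the messages p received from its other neighbors,
# i.e. the number of nodes on p's side of the edge (p, q).
def count_messages(adj):
    """adj: {node: set(neighbors)}. Returns {(p, q): message}."""
    msg = {}
    def send(p, q):
        if (p, q) not in msg:
            msg[(p, q)] = 1 + sum(send(s, p) for s in adj[p] if s != q)
        return msg[(p, q)]
    for p in adj:
        for q in adj[p]:
            send(p, q)
    return msg

# 5-node chain 0-1-2-3-4: any node recovers the total node count from
# its incoming messages plus itself.
chain = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
m = count_messages(chain)
total_at_2 = m[(1, 2)] + m[(3, 2)] + 1   # 2 + 2 + 1 = 5
```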
Preface

Early vision problems: problems based on pixels, such as stereo and image restoration.
Early vision problems can be formulated as a maximum a posteriori MRF (MAP-MRF) problem.
Classical method to solve MAP-MRF: simulated annealing, which can often take an unacceptably long time, although global optimization is achieved in theory.
For the message from p to q, first collect

  h(fp) = Dp(fp) + sum over s in N(p)\{q} of m^(t-1)_sp(fp)

Then each entry of the message vector is a minimization over fp. Ex., for fq = fq_0:

  M(fq_0) = min over fp of ( V(fp, fq_0) + h(fp) )

and m^t_pq is the vector [ M(fq_0), M(fq_1), ..., M(fq_(n_labels-1)) ].
Two-pass algorithm example (k = 4, s = 1):

1. Initialize m = (3, 1, 4, 2).
2. Forward pass, fq = 1 to 3: m = (3, 1, 2, 2).
3. Backward pass, fq = 2 down to 0: m = (2, 1, 2, 2).

[Figure: the message values plotted as points (fq, m(fq)) after each pass.]