Random Walk Definition - Boston University
• A Markov Chain is a stochastic process defined on a set of states together with a matrix of transition probabilities.
• The process is discrete: it is in exactly one state at each time step (0, 1, 2, …).
Random Walks
Ben Hescott CS591a1
November 18, 2019
Random Walk Definition
• Given an undirected, connected graph G(V,E) with |V| = n and |E| = m, a random “step” in G is a move from some node u to a randomly selected neighbor v. A random walk is a sequence of these random steps starting from some initial node.
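As a quick illustration, here is a minimal Python sketch of this definition; the example graph, start node, and step count are arbitrary choices, not from the slides:

```python
import random

def random_walk(graph, start, steps):
    # Take `steps` random steps in an undirected graph given as an
    # adjacency dict {node: [neighbors]}; return the sequence of nodes.
    walk = [start]
    node = start
    for _ in range(steps):
        node = random.choice(graph[node])  # uniform random neighbor
        walk.append(node)
    return walk

# Example: a triangle on nodes 0, 1, 2
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(random_walk(triangle, start=0, steps=10))
```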
• If state i is persistent and $h_{ii}$ is finite then i is non-null-persistent
• It turns out that every state of a finite Markov Chain is either transient or non-null-persistent
• For any edge $(u,v) \in E$ the commute time is bounded: $c_{uv} \le 2m$
• We can express hitting time in terms of commute time as
$h_{ij} = \frac{1}{2}\Big[c_{ij} + \sum_{u} \pi_u \big(c_{uj} - c_{ui}\big)\Big]$
Lollipop
• The hitting time from i to j is not necessarily the same as the hitting time from j to i. Consider the kite or lollipop graph.
– The chain is ergodic
– There is a unique stationary distribution $\pi$ with $\pi_i > 0$ for $1 \le i \le n$
– For $1 \le i \le n$, $f_{ii} = 1$ and $h_{ii} = 1/\pi_i$
– Given $N(i,t)$, the number of times the chain visits state i in t steps, $\lim_{t \to \infty} N(i,t)/t = \pi_i$
• Since G is non-bipartite it has an odd-length cycle; the GCD of the cycle lengths is then 1, so $M_G$ is aperiodic
Fundamental Theorem Holds
• So we have a stationary distribution: $\pi P = \pi$
• But $(\pi P)_v = \pi_v = \sum_{u} \pi_u P_{uv}$
• Good news: $\pi_u = d(u)/2m$ works, since $\sum_{u:(u,v)\in E} \frac{d(u)}{2m}\cdot\frac{1}{d(u)} = \frac{d(v)}{2m} = \pi_v$
• Also get $h_{ii} = 1/\pi_i = 2m/d(i)$
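A numerical sanity check of these two facts; the small graph below is a made-up example:

```python
import numpy as np

# Hypothetical example graph on 4 nodes: triangle 0-1-2 plus a pendant node 3 on node 0
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
d = A.sum(axis=1)                 # degrees d(u)
P = A / d[:, None]                # random-walk transition matrix
m = A.sum() / 2                   # number of edges
pi = d / (2 * m)                  # claimed stationary distribution
print(np.allclose(pi @ P, pi))    # True: pi is stationary
print(2 * m / d)                  # expected return times h_ii = 2m/d(i)
```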
[Figure: lollipop graph, a clique on n/2 nodes attached to a tail (path) of n/2 nodes; u lies in the clique and v at the end of the tail]
• Here $h_{uv} = O(n^3)$ but $h_{vu} = O(n^2)$
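The asymmetry can be computed exactly by solving the first-step equations for hitting times. A sketch; the lollipop size and the choice of u and v are illustrative:

```python
import numpy as np

def hitting_times_to(P, target):
    # Solve h[i] = 1 + sum_u P[i,u] * h[u] for i != target, with h[target] = 0.
    n = P.shape[0]
    idx = [i for i in range(n) if i != target]
    Q = P[np.ix_(idx, idx)]
    h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    out = np.zeros(n)
    out[idx] = h
    return out

k = 8                                    # clique size; total n = 2k nodes
n = 2 * k
A = np.zeros((n, n))
A[:k, :k] = 1 - np.eye(k)                # clique on nodes 0..k-1
for i in range(k - 1, n - 1):            # tail: k-1 -> k -> ... -> n-1
    A[i, i + 1] = A[i + 1, i] = 1
P = A / A.sum(axis=1, keepdims=True)
u, v = 0, n - 1                          # u in the clique, v at the tail end
print(hitting_times_to(P, v)[u])         # h_uv, grows like n^3
print(hitting_times_to(P, u)[v])         # h_vu, grows like n^2
```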
Cover Time
• The cover time is the expected number of steps for the walk to visit every node of G
Questions
• How many steps to get from u to v?
• How many steps to get back to the initial node?
• How many steps to visit every node?
• These questions are easy to answer if we consider a random walk as a Markov Chain
Hitting Time
• Generally $h_{ij}$, the expected number of steps needed before reaching node j when starting from node i, is called the hitting time.
• The commute time $C_{uv}$ is the expected number of steps to reach node v when starting from u and then return back to u: $C_{uv} = h_{uv} + h_{vu}$
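These definitions give the standard first-step recurrence for hitting times, worth stating explicitly (a standard identity, not spelled out on the slide):

```latex
% Condition on the first step of the walk from u (with h_{vv} = 0):
h_{uv} = 1 + \sum_{w} P_{uw}\, h_{wv} \quad (u \neq v),
\qquad
C_{uv} = h_{uv} + h_{vu}.
```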
• The next move does not depend on previous moves; formally,
$\Pr[X_{t+1} = j \mid X_0 = i_0, X_1 = i_1, \ldots, X_t = i] = \Pr[X_{t+1} = j \mid X_t = i] = P_{ij}$
Markov Chain Definitions
$\lim_{t \to \infty} \frac{N(i,t)}{t} = \pi_i$ (the long-run fraction of time spent in state i)
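A quick empirical check of this limit on the triangle walk; the run length is an arbitrary choice:

```python
import random
from collections import Counter

graph = {0: [1, 2], 1: [0, 2], 2: [0, 1]}   # triangle
node, T = 0, 100_000
visits = Counter()
for _ in range(T):
    node = random.choice(graph[node])
    visits[node] += 1
print({v: visits[v] / T for v in sorted(graph)})  # each ratio near pi_v = 1/3
```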
Random Walk is a Markov Chain
• Consider G, a connected, non-bipartite, undirected graph with |V| = n, |E| = m. There is a corresponding Markov Chain $M_G$
• Define $r_{ij}^{(t)} = \Pr[X_t = j \text{ and } X_s \ne j \text{ for } 1 \le s < t \mid X_0 = i]$, the probability that the walk first reaches j at step t
• Consider the probability that we visit state j at some time t > 0 when started at state i:
$f_{ij} = \sum_{t > 0} r_{ij}^{(t)}$
• Consider how many steps it takes to get from state i to j: $h_{ij} = \sum_{t > 0} t \cdot r_{ij}^{(t)}$ given $f_{ij} = 1$, and $h_{ij} = \infty$ otherwise
• Consider the states of the chain to be the set of vertices
• Define the transition matrix to be
$P_{uv} = \begin{cases} 1/d(u) & (u,v) \in E \\ 0 & \text{otherwise} \end{cases}$
Interesting facts
• $M_G$ is irreducible since G is connected and undirected
• Consider the probabilities of being at a particular vertex at each step of the walk.
• Each of these can be considered a vector:
$V_0 = (1, 0, 0),\ V_1 = (0, 1/2, 1/2),\ V_2 = (1/2, 1/4, 1/4),\ \ldots$
• The periodicity of state i is the maximum integer T for which there exist an initial distribution $q_0$ and an integer $a > 0$ s.t. for all t, if $q_i(t) > 0$ then t is in the arithmetic progression $\{a + kT \mid k \ge 0\}$. A state is periodic if T > 1 and aperiodic otherwise. (For example, every state of a walk on a bipartite graph has period 2.)
• Define the matrix M = DA
• For the triangle, d(i) = 2, so
$M = \begin{pmatrix} 0 & 1/2 & 1/2 \\ 1/2 & 0 & 1/2 \\ 1/2 & 1/2 & 0 \end{pmatrix}$
• Note for the triangle Pr[a to b] = Pr[b to a]
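In numpy this construction is one line per matrix (a sketch; the variable names are arbitrary):

```python
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)   # adjacency matrix of the triangle
D = np.diag(1 / A.sum(axis=1))           # diagonal entries 1/d(i)
M = D @ A                                # M[u, v] = 1/d(u) when (u, v) is an edge
print(M)                                 # 1/2 everywhere off the diagonal
print(np.allclose(M, M.T))               # True: Pr[a to b] = Pr[b to a] here
```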
Markov Chains - Generalized Random Walks
More Definitions
• Consider the question: where am I after t steps? Define the t-step probability $P_{ij}^{(t)} = \Pr[X_t = j \mid X_0 = i]$
• Question: is this my first time at node j?
• An ergodic Markov Chain is one where all states are aperiodic and non-null-persistent
Fundamental Theorem of Markov Chains
• Given any finite, irreducible, aperiodic Markov Chain, all of the following hold:
• Notice the periodicity is the GCD of the lengths of all closed walks (cycles) in G
– The smallest closed walk has length 2: go one step and come back
– Since G is non-bipartite there is an odd-length cycle, so the GCD is 1 and $M_G$ is aperiodic
Almost there
• A strong component of a directed graph G is a subgraph C of G where for every pair of vertices i, j there is a directed path from i to j and from j to i.
• A Markov Chain is irreducible if underlying graph G consists of a single strong component.
• A stationary distribution for a Markov Chain with transition matrix P is a distribution $\pi$ s.t. $\pi P = \pi$
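One way to find such a $\pi$ numerically is as a left eigenvector of P with eigenvalue 1. A sketch, using the triangle's transition matrix as the example:

```python
import numpy as np

P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])       # triangle transition matrix
vals, vecs = np.linalg.eig(P.T)       # left eigenvectors of P
k = np.argmin(np.abs(vals - 1))       # eigenvalue closest to 1
pi = np.real(vecs[:, k])
pi = pi / pi.sum()                    # normalize to a probability distribution
print(pi)                             # [1/3, 1/3, 1/3]
print(np.allclose(pi @ P, pi))        # True: pi P = pi
```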
Transition Matrix
• We can use a matrix to represent transition probabilities: consider the adjacency matrix A and the diagonal matrix D with entries $D_{ii} = 1/d(i)$, where d(i) is the degree of node i. Then we can define the matrix M = DA.
• Define the vector $q_t = (q_1^t, q_2^t, \ldots, q_n^t)$, where the i-th entry is the probability that the chain is in state i at time t
• Note: $q_{t+1} = q_t P$
• Notice that we can then calculate everything given $q_0$ and P.
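A minimal Python sketch of this recurrence, again on the triangle (the 5-step horizon is an arbitrary choice):

```python
import numpy as np

P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])   # triangle transition matrix
q = np.array([1.0, 0.0, 0.0])     # q_0: start at vertex 1
for t in range(5):
    print(t, q)
    q = q @ P                     # q_{t+1} = q_t P
```

The printed vectors match the Triangle Example below and approach (1/3, 1/3, 1/3).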
[Figure: a simple example, an undirected graph G on six numbered nodes]
Points to note
• The process is discrete
• G is not necessarily planar
• G is not necessarily fully connected
• A walk can double back on itself
• Staying in the same place can be considered a move
Regular Graphs
• The expected number of steps to get from vertex u to v in the complete graph $K_n$ (a regular graph) is n - 1.
• The expected number of steps to get back to the starting point is n for any n-vertex regular graph, since $h_{ii} = 1/\pi_i = n$.
Even More Definitions
• Consider $f_{ii}$
• State i is called transient if $f_{ii} < 1$
• State i is called persistent if $f_{ii} = 1$
• If state i is persistent and $h_{ii}$ is infinite then i is null-persistent
• The expected number of steps to visit every node of the complete graph $K_n$ is
$(n-1)\sum_{i=1}^{n-1} \frac{1}{i} = (n-1)H_{n-1}$
by a coupon-collector argument.
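A simulation comparing this formula against empirical cover times on $K_n$; n and the trial count are arbitrary choices:

```python
import random

def cover_time(n):
    # Steps until a walk on the complete graph K_n has visited every node.
    node, seen, steps = 0, {0}, 0
    while len(seen) < n:
        node = random.choice([v for v in range(n) if v != node])
        seen.add(node)
        steps += 1
    return steps

n, trials = 20, 2000
empirical = sum(cover_time(n) for _ in range(trials)) / trials
formula = (n - 1) * sum(1 / i for i in range(1, n))   # (n-1) * H_{n-1}
print(empirical, formula)                             # should be close
```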
Triangle Example
$V_0 = (1, 0, 0)$
$V_1 = (0, 1/2, 1/2)$
$V_2 = (1/2, 1/4, 1/4)$
$V_3 = (1/4, 3/8, 3/8)$
…and the vectors converge to the stationary distribution $(1/3, 1/3, 1/3)$.