Model Checking by Random Walk
Patrik Haslum
Department of Computer Science, Linköping University
pahas@ida.liu.se

Abstract

While model checking algorithms are in theory efficient, they are in practice hampered by the explosive growth of system models. We show that for certain specifications the model checking problem reduces to a question of reachability in the system state-transition graph, and apply a simple, randomized algorithm to this problem.

Introduction

The model checking approach to automated verification of finite state systems appears the most successful to date. At the heart of the method are efficient algorithms for deciding if a system model, described as a transition system, satisfies a specification, expressed in temporal logic (Clarke, Emerson, & Sistla 1986). While the theoretical time and space complexities of the algorithms are low order polynomial, they are in practice often time consuming, because the system model grows exponentially with the number of system components.

For certain classes of specification formulas, the model checking problem reduces to a problem of reachability in the state-transition graph. For certain classes of graphs, random walks can be used to create a polynomial algorithm for deciding reachability with single-sided, controllable probability of error. Although the time complexity of this algorithm is no better than that of other model checking algorithms (in fact, it is worse), two properties of the random walk algorithm make it interesting for use in verification: First, it uses extremely little space. Since verification is mostly an "off-line" activity, space, which is definitely limited, can be considered a more critical resource than time. Second, it is highly parallelizable, the expected runtime decreasing linearly with the number of parallel processes.

Basics

This section briefly introduces the model checking approach to verification, the theory of Markov chains and the random walk algorithm. For more thorough introductions, see for instance (Clarke, Grumberg, & Peled 1999), (Hoel, Port, & Stone 1972) and (Motwani & Raghavan 1995), respectively.

Markov Chains

Markov chains model discrete state, discrete time, stochastic processes. A Markov chain consists of a finite or countably infinite set of states, X, and a sequence X0, X1, ... of stochastic variables over X. The distribution of Xi, for all i > 1, is such that it satisfies the Markov property:

  P(Xi = xi | X0 = x0, ..., Xi-1 = xi-1) = P(Xi = xi | Xi-1 = xi-1)    (1)

i.e. the probability of a certain state x materializing at time i depends only on the state of the process at time i - 1, and not on its previous history. The probabilities P(Xi = xi | Xi-1 = xi-1) are called the transition probabilities. If

  P(Xi = xi | Xi-1 = xi-1) = P(Xj = xj | Xj-1 = xj-1)    (2)

for all i, j > 1, i.e. the transition probabilities are independent of time, the chain is said to be stationary. We will in the sequel consider only stationary chains, and can therefore abbreviate P(Xi = y | Xi-1 = x) as Px,y.

A probability distribution over the state space X of a Markov chain is called a distribution of the chain. A stationary Markov chain is characterized by the transition probabilities and the initial distribution π0(X0 = x), x ∈ X. Denote by Eπ(·) expectation taken under the assumption that the initial distribution is π. We will also write Ex(·) for the case when

  π(x0) = 1 when x0 = x, 0 elsewhere

i.e. the initial distribution is concentrated to a single state x.

The hitting time of a state x ∈ X is defined as Tx = min{n > 0 | Xn = x}, i.e. Tx is a stochastic variable denoting the first time that the chain enters state x. The mean return time of x is defined as mx = Ex(Tx). Let rx,y = P(Ty < ∞ | X0 = x), i.e. rx,y is the probability of a chain starting in state x ever reaching state y. A set of states C ⊆ X is closed iff rx,y = 0 for all x ∈ C and y ∉ C. A closed set C is irreducible iff rx,y > 0 for all x, y ∈ C.

A distribution π over X is said to be stationary iff

  π(y) = Σ_{x ∈ X} π(x) Px,y    (3)

That is, if the state of a chain at time n, Xn, is drawn at random according to a stationary distribution π, the distribution of Xn+1 will also be π. It can be shown that if X is finite, closed and irreducible, the chain has a unique stationary distribution, and this is given by

  π(x) = 1 / mx    (4)

Random Walks on Graphs

A graph, G, consists of a (finite) set of vertices, V, and a binary edge relation E over V. For a vertex v ∈ V, the neighbourhood of v is N(v) = {v' ∈ V | (v, v') ∈ E}; the outdegree of v is |N(v)| and the indegree of v is |{v' ∈ V | (v', v) ∈ E}|. When the indegree and outdegree of v are equal, we write simply the degree, d(v). A graph in which every vertex has equal indegree and outdegree is called Eulerian. A special class of Eulerian graphs are graphs with a symmetric edge relation, i.e. for all v, v', if (v, v') ∈ E then also (v', v) ∈ E. Such graphs are called undirected.

A closed component of G is a graph consisting of a subset of vertices C ⊆ V and the edge relation restricted to C, such that there exists no path¹ from any vertex v ∈ C to any vertex v' ∉ C. A component is strongly connected iff there exists a path from v to v' for every v, v' ∈ C.

A walk on G (of length n) is a sequence of vertices v0, v1, ..., vn such that (vi, vi+1) ∈ E for i = 0, ..., n - 1. The walk is random iff each successive vertex vi+1 is drawn uniformly at random from N(vi).
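The randomized reachability test the abstract alludes to can be sketched as follows. This is a minimal illustration, not the paper's implementation: the adjacency-list representation, the walk length, and the restart count are assumptions chosen here for the example.

```python
import random


def random_walk_reaches(graph, start, target, walk_len, restarts, seed=0):
    """Monte Carlo reachability test by random walk.

    Performs `restarts` independent random walks of at most `walk_len`
    steps from `start`, reporting True as soon as `target` is visited.
    The error is one-sided: True answers are always correct, while a
    False answer may be wrong with a probability that shrinks as
    `walk_len` and `restarts` grow.
    """
    rng = random.Random(seed)
    for _ in range(restarts):
        v = start
        for _ in range(walk_len):
            if v == target:
                return True
            succs = graph.get(v, [])
            if not succs:          # dead end: abandon this walk, restart
                break
            v = rng.choice(succs)  # step to a uniformly chosen successor
        if v == target:            # walk ended exactly on the target
            return True
    return False


# A 4-cycle {0,1,2,3} plus a separate component {4,5}: vertex 2 is
# reachable from 0, vertex 4 is not.
g = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0], 4: [5], 5: [4]}
print(random_walk_reaches(g, 0, 2, walk_len=50, restarts=20))
print(random_walk_reaches(g, 0, 4, walk_len=50, restarts=20))
```

Note how the space usage matches the paper's motivation: the walk stores only the current vertex, never the set of visited states, and independent restarts can trivially be distributed over parallel processes.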