Stochastic Processes Lecture Slides 4
Chapter 4 Continuous-Time Markov Chains

4.1 Continuous-Time Markov Chains
• In this chapter, we focus on continuous-time, discrete-state, homogeneous Markov chains.
• Definition: Continuous-Time Markov Chain. We say that a continuous-time stochastic process {Xt, t ≥ 0} is a continuous-time Markov chain if for all s, t ≥ 0 and all states i, j, x(u),
  P {Xt+s = j | Xs = i, Xu = x(u), 0 ≤ u < s} = P {Xt+s = j | Xs = i}.
• Remarks:
  – We focus on homogeneous (or stationary) Markov chains, that is, P {Xt+s = j | Xs = i} = Pi,j(t) does not depend on s.

4.1 Continuous-Time Markov Chains
• Remarks:
  – Suppose that a continuous-time Markov chain enters state i at time 0, and let τi denote the amount of time that the process stays in state i before moving into a different state. Then
    P {τi > s + t | τi > s} = P {τi > s + t | Xs = i} = P {τi > t | X0 = i},
    and we can show that τi follows an exponential distribution.
  – The rate of τi may depend on state i, denoted by vi. That is, τi follows an exponential distribution with pdf
    fτi(t) = vi e^{−vi t} for 0 ≤ t < ∞, and fτi(t) = 0 otherwise,
    and the mean of τi is E(τi) = 1/vi.

4.1 Continuous-Time Markov Chains
• Remarks:
  – A continuous-time Markov chain is a stochastic process having the properties that each time it enters a state i,
    (1) the amount of time it spends in that state before making a transition into a different state is exponentially distributed with rate vi, and
    (2) when the process leaves state i, it next enters state j with probability Pi,j, where Σ_{j≠i} Pi,j = 1.
  – A state for which vi = ∞ is called an instantaneous state, since when entered it is instantaneously left. We will focus on the case that vi < ∞ for all i.
  – A state for which vi = 0 is called an absorbing state, since once entered it is never left.

4.1 Continuous-Time Markov Chains
• Remarks:
  – A Markov chain is said to be regular if, with probability 1, the number of transitions in any finite length of time is finite. We only consider regular Markov chains.
  – Let qi,j = vi Pi,j for all j ≠ i, where vi is the rate at which the process leaves state i, and Pi,j is the probability that it then goes to j. qi,j is called the transition rate from i to j.
  – Since Σ_{j≠i} Pi,j = 1, we have Σ_{j≠i} qi,j = vi (see the simulation sketch below).
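
The pair (vi, Pi,j) described above is all that is needed to simulate a path of a continuous-time Markov chain: draw an exponential holding time with rate vi, then jump according to Pi,j. The following is a minimal sketch, not part of the original notes; the three-state chain and the values in v and P are hypothetical illustration values.

```python
import random

# Hypothetical 3-state chain: v[i] is the rate of leaving state i,
# P[i][j] is the probability of jumping to j given that the chain leaves i.
v = {0: 1.0, 1: 2.0, 2: 0.5}
P = {0: {1: 0.7, 2: 0.3},
     1: {0: 0.4, 2: 0.6},
     2: {0: 1.0}}

def simulate_ctmc(x0, t_end):
    """Simulate one path up to time t_end; return a list of (jump time, new state)."""
    t, x = 0.0, x0
    path = [(t, x)]
    while True:
        holding = random.expovariate(v[x])            # exponential holding time, rate v[x]
        if t + holding > t_end:
            return path
        t += holding
        states, probs = zip(*P[x].items())
        x = random.choices(states, weights=probs)[0]  # next state drawn from P[x]
        path.append((t, x))

print(simulate_ctmc(x0=0, t_end=10.0))
```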

4.2 Birth and Death Processes
• Definition: A continuous-time Markov chain with states {0, 1, 2, · · · } for which qi,j = 0 whenever |i − j| > 1 is called a birth and death process.
• Remarks:
  – For a birth and death process, transitions from state i can only go to either state i − 1 or state i + 1, that is, Pi,i−1 + Pi,i+1 = 1.
  – Birth and death processes are often used to describe the size of some population.
  – qi,i+1 and qi,i−1 are called the birth rate and the death rate of state i, respectively.
  – We have vi = qi,i−1 + qi,i+1 and Pi,i+1 = qi,i+1/(qi,i−1 + qi,i+1) = 1 − Pi,i−1 (see the small helper below).
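
As a small illustration of the relations vi = qi,i−1 + qi,i+1 and Pi,i+1 = qi,i+1/(qi,i−1 + qi,i+1), the helper below recovers the holding rate and jump probabilities from given birth and death rates. It is a sketch only; the linear rates qi,i+1 = iλ and qi,i−1 = iμ used in the example are hypothetical and not taken from the notes.

```python
def jump_parameters(birth_rate, death_rate):
    """Given q_{i,i+1} and q_{i,i-1} for one state i, return (v_i, P_{i,i+1}, P_{i,i-1})."""
    v = birth_rate + death_rate
    p_up = birth_rate / v
    return v, p_up, 1.0 - p_up

# Hypothetical linear birth and death rates for state i = 3: q_{i,i+1} = i*lam, q_{i,i-1} = i*mu.
lam, mu, i = 1.0, 0.5, 3
print(jump_parameters(i * lam, i * mu))   # -> (4.5, 0.666..., 0.333...)
```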

4.2 Birth and Death Processes
• Definition: A birth and death process is said to be a pure birth process if qi,i−1 = 0 for all i.
• Remarks:
  – For a pure birth process, qi,i+1 = vi Pi,i+1 = vi.
  – The simplest example of a pure birth process is the Poisson process, which has a constant birth rate qi,i+1 = λ.
  – Another example arises from a population in which no one ever dies and each member independently gives birth at an exponential rate λ. Let Xt represent the population size at time t; such a pure birth process is called a Yule process.

4.2 Birth and Death Processes
• Example: Yule Process. Consider a Yule process starting with a single individual at time 0. What is the distribution of Xt for any given t?
• ANS:
  – We want to calculate P {Xt = j | X0 = 1}.
  – We first consider τi, the amount of time that the process stays in state i before moving into state i + 1.
  – When the process is in state i, there are i members in the population. Let Zk be the time needed for member k to give birth; then Z1, · · · , Zi are i.i.d., each following an exponential distribution with rate λ. We have
    P {τi > t} = P {Z1 > t, · · · , Zi > t} = P {Z1 > t} · · · P {Zi > t} = e^{−iλt},
    so τi follows an exponential distribution with rate vi = iλ (checked numerically below).
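
Since τi is the minimum of the i members' birth times, its mean should be 1/(iλ). The short sketch below, not part of the original notes, checks this empirically; the values of lam, i and n_sims are hypothetical.

```python
import random

lam = 1.0        # individual birth rate λ (hypothetical value)
i = 5            # current population size (hypothetical value)
n_sims = 100_000

# Holding time in state i = min of i i.i.d. Exp(λ) birth times.
samples = [min(random.expovariate(lam) for _ in range(i)) for _ in range(n_sims)]

mean_holding = sum(samples) / n_sims
print(f"empirical mean: {mean_holding:.4f}, theoretical 1/(i*lam): {1 / (i * lam):.4f}")
```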

4.2 Birth and Death Processes
• ANS (continued):
  – It is also easy to show that τ1, τ2, · · · are independent.
  – We consider the distribution of τ1 + · · · + τj.
    ∗ The cdf and the pdf of τ1 are P {τ1 ≤ t} = 1 − e^{−λt} and fτ1(t) = λ e^{−λt}, respectively.
    ∗ The cdf of τ1 + τ2 is
      P {τ1 + τ2 ≤ t} = ∫_0^t P {τ1 + τ2 ≤ t | τ1 = s} fτ1(s) ds
                      = ∫_0^t P {τ2 ≤ t − s | τ1 = s} λ e^{−λs} ds
                      = ∫_0^t (1 − e^{−2λ(t−s)}) λ e^{−λs} ds
                      = −e^{−λs} |_{s=0}^{s=t} − e^{−λt} e^{−λ(t−s)} |_{s=0}^{s=t}
                      = (1 − e^{−λt})^2,
      and the pdf is fτ1+τ2(t) = 2λ e^{−λt} (1 − e^{−λt}).

4.2 Birth and Death Processes
• ANS (continued): we argue by induction on j.
    ∗ Assume that the cdf of τ1 + · · · + τj−1 is P {τ1 + · · · + τj−1 ≤ t} = (1 − e^{−λt})^{j−1}, so that its pdf is fτ1+···+τj−1(s) = (j − 1)λ e^{−λs} (1 − e^{−λs})^{j−2}.
    ∗ The cdf of τ1 + · · · + τj is then
      P {τ1 + · · · + τj ≤ t} = ∫_0^t P {τj ≤ t − s | τ1 + · · · + τj−1 = s} fτ1+···+τj−1(s) ds
                              = ∫_0^t (1 − e^{−jλ(t−s)}) (j − 1)λ e^{−λs} (1 − e^{−λs})^{j−2} ds
                              = (1 − e^{−λt})^{j−1} − e^{−jλt} ∫_0^t (j − 1)λ e^{(j−1)λs} (1 − e^{−λs})^{j−2} ds
                              = (1 − e^{−λt})^{j−1} − e^{−jλt} ∫_0^t d(e^{λs} − 1)^{j−1}
                              = (1 − e^{−λt})^{j−1} − e^{−jλt} (e^{λt} − 1)^{j−1}
                              = (1 − e^{−λt})^{j−1} − e^{−λt} (1 − e^{−λt})^{j−1}
                              = (1 − e^{−λt})^j
      (a Monte Carlo check of this formula is sketched below).
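
The formula P {τ1 + · · · + τj ≤ t} = (1 − e^{−λt})^j can be checked by simulating τ1, · · · , τj as independent exponentials with rates λ, 2λ, · · · , jλ. This sketch is not part of the original notes; the values of lam, j and t are hypothetical.

```python
import math
import random

lam, j, t = 1.0, 4, 2.0      # hypothetical λ, number of stages j, and time t
n_sims = 100_000

def time_to_leave_state_j(j, lam):
    """τ1 + ··· + τj, where τi is exponential with rate i*λ."""
    return sum(random.expovariate(i * lam) for i in range(1, j + 1))

empirical = sum(time_to_leave_state_j(j, lam) <= t for _ in range(n_sims)) / n_sims
theoretical = (1 - math.exp(-lam * t)) ** j
print(f"empirical: {empirical:.4f}, theoretical (1 - e^(-lam*t))^j: {theoretical:.4f}")
```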

4.2 Birth and Death Processes
• ANS (continued):
  – We showed that P {τ1 + · · · + τj ≤ t} = (1 − e^{−λt})^j.
  – Then for j ≥ 1,
    P {Xt = j | X0 = 1} = P {Xt > j − 1 | X0 = 1} − P {Xt > j | X0 = 1}
                        = P {τ1 + · · · + τj−1 ≤ t} − P {τ1 + · · · + τj ≤ t}
                        = (1 − e^{−λt})^{j−1} − (1 − e^{−λt})^j
                        = e^{−λt} (1 − e^{−λt})^{j−1}.
  – That is, given X0 = 1, Xt follows a geometric distribution with success probability e^{−λt} (a numerical check is sketched below).
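
The geometric form of the answer can be verified by simulating the Yule process directly and comparing the empirical distribution of Xt with e^{−λt}(1 − e^{−λt})^{j−1}. The sketch below is illustrative only; the values of lam, t and n_sims are hypothetical.

```python
import math
import random
from collections import Counter

lam, t = 1.0, 1.5            # hypothetical birth rate λ and time horizon t
n_sims = 100_000

def yule_population(lam, t):
    """Simulate a Yule process from X0 = 1 and return the population size at time t."""
    x, clock = 1, 0.0
    while True:
        clock += random.expovariate(x * lam)   # holding time in state x has rate x*λ
        if clock > t:
            return x
        x += 1

counts = Counter(yule_population(lam, t) for _ in range(n_sims))
p = math.exp(-lam * t)
for j in range(1, 6):
    theoretical = p * (1 - p) ** (j - 1)       # geometric pmf e^(-λt) (1 - e^(-λt))^(j-1)
    print(f"j={j}: empirical {counts[j] / n_sims:.4f}, theoretical {theoretical:.4f}")
```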