
Lecture 17: Poisson Processes – Part I

STAT 205          Lecturer: Jim Pitman          Scribe: Matias Damian Cattaneo <cattaneo@>

17.1 The Poisson Distribution
Define S_n = X_1 + X_2 + ... + X_n. In a very simple setup, the X_i are independent indicators with

P[X_i = 1] = p
P[X_i = 0] = 1 − p

We know that the distribution of S_n is Binomial(n, p). So we have

P[S_n = k] = (n choose k) p^k (1 − p)^{n−k},   E[S_n] = np,   Var[S_n] = np(1 − p).

For fixed p, we have that

(S_n − np) / sqrt(np(1 − p))  →d  N(0, 1)   as n → ∞.
Now we let n → ∞ and choose p_n small so that n p_n = λ. We see that

P[S_n = 0] = (1 − p_n)^n = (1 − λ/n)^n → e^{−λ},

and in general

P[S_n = k] = (n choose k) p_n^k (1 − p_n)^{n−k} ≈ (λ^k / k!) e^{−λ},

which is the mass function of a Poisson distribution.
Definition 17.1 The Poisson distribution with parameter λ is given by the mass function

P_λ(k) = e^{−λ} λ^k / k!,   k = 0, 1, 2, ....
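The binomial-to-Poisson limit above is easy to check numerically. A minimal sketch (the choices λ = 2 and n = 10000 are illustrative, not from the lecture):

```python
from math import comb, exp, factorial

lam, n = 2.0, 10_000          # rate lambda and number of trials; n * p = lam
p = lam / n

def binom_pmf(k):
    """P[S_n = k] for S_n ~ Binomial(n, lam/n)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k):
    """The limiting Poisson(lam) mass function."""
    return exp(-lam) * lam**k / factorial(k)

# For large n, the Binomial(n, lam/n) probabilities approach Poisson(lam).
for k in range(5):
    print(k, binom_pmf(k), poisson_pmf(k))
```

Already at n = 10000 the two mass functions agree to about four decimal places.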
17.2 Basics of Poisson Processes

We discuss the basics of Poisson processes here; for details, see [1].

Consider the positive reals divided up into intervals of length 10^{−6}. We can think of this as time broken up into very short intervals. Consider a process (X_1, X_2, ...) where X_i is 1 if a certain event happens in the i-th such time interval and 0 otherwise. Suppose further that these X_i are independent and equal 1 with probability λ · 10^{−6}. The waiting time until the first occurrence is defined as
T_1 = min{i : X_i = 1}.

More generally, with intervals of length 1/n and P[X_i = 1] = λ/n, the waiting time measured in real time satisfies

P[T_1/n > t] = P[T_1 > nt] = (1 − λ/n)^{⌊nt⌋} → e^{−λt}

as n → ∞. Notice that this is an exponential distribution.
Letting n → ∞ (n = 10^6 above) we obtain a point process (T_i), an increasing sequence of random variables where

T_1 ∼ Exponential(λ)
T_2 − T_1 ∼ Exponential(λ), independent of T_1
⋮

so we get that T_1, (T_2 − T_1), (T_3 − T_2), ... are i.i.d. Exponential(λ).
Alternatively, the counting process is defined by

N_t = #{i : T_i ≤ t} = sum_{k=1}^∞ 1{T_k ≤ t}.

In general (N_t)_{t≥0} is called a Poisson process with rate λ on (0, ∞). It is not difficult to show that N_t ∼ Poisson(λt).
It is important to note that almost by construction this process (N_t, t ≥ 0) has stationary independent increments: that is, for 0 < t_1 < t_2 < ... < t_n, the increments N(t_1), N(t_2) − N(t_1), ..., N(t_n) − N(t_{n−1}) are independent Poisson variables with means λt_1, λ(t_2 − t_1), ..., λ(t_n − t_{n−1}).
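The construction by i.i.d. exponential gaps can be simulated directly to check that N_t ∼ Poisson(λt). A sketch with illustrative parameters (λ = 3, t = 10, not from the lecture):

```python
import random

random.seed(0)
lam, t = 3.0, 10.0

def poisson_count(lam, t, rng=random):
    """Number of arrivals of a rate-lam Poisson process in (0, t],
    built by summing i.i.d. Exponential(lam) inter-arrival times."""
    s, n = 0.0, 0
    while True:
        s += rng.expovariate(lam)   # next gap T_k - T_{k-1}
        if s > t:
            return n
        n += 1

# N_t should be Poisson(lam * t): mean and variance both near 30.
samples = [poisson_count(lam, t) for _ in range(20_000)]
print(sum(samples) / len(samples))
```

The empirical mean and variance of the samples both land close to λt = 30, consistent with a Poisson(λt) law.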
17.3Details of Poisson Processes
In this section we present details on Poisson processes.
17.3.1The Poisson Process and its counting process
Definition 17.2 A real-valued process (N_t, t ≥ 0) is a Poisson process with rate λ (or PP(λ)) if N_t has stationary independent increments, and for all s, t ≥ 0 the increment N_{t+s} − N_t is a Poisson random variable with parameter λs.

Interpretation: jumps of N_t are "arrivals" or "points".
Let T_k = inf{t : N_t = k}. We say that 0 < T_1 < T_2 < ... are the arrival times. Note that N_{T_k} = k and N_{T_k^−} = k − 1.

As a convention, we will always work with the right-continuous version of (N_t, t ≥ 0), which is increasing and hence has left limits a.s.
Theorem 17.3 A counting process (N_t, t ≥ 0) (increasing, right continuous, left limits exist, jumps of 1 only) is a PP(λ) if and only if the distribution of its jump times (T_1, T_2, T_3, ...) satisfies that (T_k − T_{k−1}, k = 1, 2, ...) is a sequence of i.i.d. Exponential(λ) random variables (with T_0 = 0).
Proof: For the "if" direction, see [1], Section 2.6 (c. Poisson process). The idea: given i.i.d. (W_k) such that P(W_k > t) = e^{−λt}, t ≥ 0, we construct (N_t, t ≥ 0) by the formulas

T_k = W_1 + ... + W_k
N_t = #{k : T_k ≤ t} = sum_{k=1}^∞ 1{T_k ≤ t}.

Informally, the {T_k} are the points of the PP(λ) denoted by N.¹
The "only if" direction is easier. Suppose (N_t) is a PP(λ). By the Strong Markov Property (see [1, Section 5.2]), we can deduce that the T_k − T_{k−1} are independent. We can also deduce that P(T_k > t) = P(N_t < k) = sum_{j=0}^{k−1} e^{−λt} (λt)^j / j!, and differentiating gives

P(T_k ∈ dt) = e^{−λt} (λt)^{k−1} / (k−1)! · λ dt.
¹ Friendly reference: "Probability", J. Pitman, Springer 1992.
This shows T_k ∼ Gamma(k, λ). Since T_k is the sum of k i.i.d. increments, φ_{T_k − T_{k−1}} = (φ_{T_k})^{1/k}, and since the characteristic function uniquely determines the distribution, this determines the distribution of T_k − T_{k−1}. Since T_k − T_{k−1} ∼ Exponential(λ) would ensure T_k ∼ Gamma(k, λ) and vice versa, T_k − T_{k−1} ∼ Exponential(λ).
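The Gamma(k, λ) law of T_k is easy to check by simulation, since E[T_k] = k/λ and Var[T_k] = k/λ². A sketch (k = 5 and λ = 2 are arbitrary illustrative choices):

```python
import random

random.seed(1)
lam, k = 2.0, 5

# T_k is the sum of k i.i.d. Exponential(lam) inter-arrival times,
# hence Gamma(k, lam) with mean k/lam and variance k/lam**2.
arrivals = [sum(random.expovariate(lam) for _ in range(k))
            for _ in range(50_000)]
mean = sum(arrivals) / len(arrivals)
var = sum((x - mean) ** 2 for x in arrivals) / len(arrivals)
print(mean, var)   # should be near k/lam = 2.5 and k/lam**2 = 1.25
```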
Definition 17.4 Let (S, 𝒮, µ) be a σ-finite measure space. A Poisson point process (P.P.P.) with intensity µ is a collection of random variables N(B, ω), B ∈ 𝒮, such that:

1. For each fixed ω, N(·, ω) is a counting measure on (S, 𝒮);
2. For each fixed B ∈ 𝒮, N(B, ·) is a random variable;
3. N(B) ∼ Poisson(µ(B)), i.e. P(N(B) = k) = e^{−µ(B)} µ(B)^k / k!, for all B ∈ 𝒮; and
4. If B_1, B_2, ... are disjoint sets then N(B_1, ·), N(B_2, ·), ... are independent random variables.
Example 17.5 Let S = R_+, 𝒮 = Borel(R_+), µ = λ · Lebesgue. Let N(B, ω) be the random measure whose "cumulative distribution function" is the counting process (N_t, t ≥ 0) of a PP(λ) as described before. That is,

N([0, t]) := N_t = sum_{k=1}^∞ 1{T_k ≤ t}.

So

N(B) = sum_{k=1}^∞ 1{T_k ∈ B} = the number of T_k which fall in B.
Example 17.6 S = R, 𝒮 = Borel(R), µ = λ · Lebesgue. Stick together independent PP(λ)'s on R_+ and R_−.
Theorem 17.7 Such a P.P.P. exists for any σ-finite measure space.

Proof: The only convincing argument is to give an explicit construction from sequences of independent random variables. We begin by considering the case µ(S) < ∞.

1. Take X_1, X_2, ... to be i.i.d. random variables with distribution µ(·)/µ(S), so that P(X_i ∈ B) = µ(B)/µ(S).
2. Independently take N ∼ Poisson(µ(S)), and set N(B, ω) = sum_{i=1}^{N(ω)} 1{X_i(ω) ∈ B}.

For σ-finite µ, partition S into countably many pieces of finite measure, construct independent processes on the pieces, and superpose them.
Exercise 17.8 Verify that this N(B, ω) is a P.P.P. with intensity µ. (Use the thinning property of Poisson distributions.)
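For µ(S) < ∞ the two-step construction in the proof can be coded directly. A sketch on S = [0,1]² with µ = λ · Lebesgue, where λ = 5 is an illustrative choice:

```python
import random

random.seed(2)
lam = 5.0   # mu = lam * Lebesgue on S = [0,1] x [0,1], so mu(S) = lam

def ppp_unit_square(lam, rng=random):
    """Steps 1-2 of the proof: draw N ~ Poisson(mu(S)), then N i.i.d.
    points with distribution mu(.)/mu(S) (uniform on the square here)."""
    # Poisson(lam) sampled as the number of unit-rate exponential
    # gaps that fit in [0, lam] (no external libraries needed).
    n, s = 0, 0.0
    while True:
        s += rng.expovariate(1.0)
        if s > lam:
            break
        n += 1
    return [(rng.random(), rng.random()) for _ in range(n)]

def count(points, b):
    """N(B) for a rectangle B = (x0, x1, y0, y1)."""
    x0, x1, y0, y1 = b
    return sum(1 for x, y in points if x0 <= x < x1 and y0 <= y < y1)

# E[N(B)] = mu(B) = lam * area(B); the left half-square has mean 2.5.
counts = [count(ppp_unit_square(lam), (0.0, 0.5, 0.0, 1.0))
          for _ in range(20_000)]
print(sum(counts) / len(counts))
```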
Example 17.9 (Poissonization of the multinomial) If B_1, B_2 are disjoint events with B_1 ∪ B_2 = S, then

P(N(B_1) = n_1, N(B_2) = n_2) = P(N = n_1 + n_2) P(N(B_1) = n_1, N(B_2) = n_2 | N = n_1 + n_2)
= e^{−λ} λ^{n_1+n_2} / (n_1 + n_2)! · (n_1 + n_2 choose n_1) p^{n_1} q^{n_2},

where N = N(S), λ = µ(S), p = µ(B_1)/µ(S), and q = µ(B_2)/µ(S). This product factors as [e^{−λp}(λp)^{n_1}/n_1!] · [e^{−λq}(λq)^{n_2}/n_2!], so N(B_1) ∼ Poisson(λp) and N(B_2) ∼ Poisson(λq) are independent.
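This Poissonization can be checked by simulation: split a Poisson(λ) total into two cells and verify that the cell counts have the predicted means and are uncorrelated. A sketch (λ = 4 and p = 0.3 are illustrative):

```python
import random

random.seed(3)
lam, p = 4.0, 0.3   # lam = mu(S), p = mu(B1)/mu(S), q = 1 - p

def split_counts(rng=random):
    """Draw N ~ Poisson(lam), then assign each of the N points to
    cell B1 independently with probability p (thinning)."""
    # Poisson(lam) via unit-rate exponential gaps.
    n, s = 0, 0.0
    while True:
        s += rng.expovariate(1.0)
        if s > lam:
            break
        n += 1
    n1 = sum(1 for _ in range(n) if rng.random() < p)
    return n1, n - n1

pairs = [split_counts() for _ in range(30_000)]
m1 = sum(a for a, _ in pairs) / len(pairs)
m2 = sum(b for _, b in pairs) / len(pairs)
cov = sum((a - m1) * (b - m2) for a, b in pairs) / len(pairs)
print(m1, m2, cov)   # means near lam*p and lam*q; covariance near 0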
17.4 General Theory Behind Poisson Processes

It is important to note that we have presented two different examples of convolution semigroups ("convolution ½-groups") of probability distributions on R. In general, we have a family of probability distributions (F_t, t ≥ 0) such that

F_s ∗ F_t = F_{s+t},

where ∗ is convolution. Examples are:

1. F_t = N(0, σ²t)
2. F_t = Poisson(λt)
3. F_t = δ_{ct} for some real c ∈ R

For any such family we can create a stochastic process (X_t, t ≥ 0) so that X has stationary independent increments, that is,

X_t ∼ F_t
X_t − X_s ∼ F_{t−s}, for 0 < s < t

and so on. Notice that we did this by explicit construction for F_t = Poisson(λt). For the case F_t = N(0, σ²t) we need to be clever, and in general we will need to appeal to Kolmogorov's Extension Theorem (see [1] for details). In particular,

1. the Poisson case gives a Poisson process; and
2. the Normal case gives a Brownian motion (also known as a Wiener process).

Brownian motion has the special feature that it is possible to define the process to have continuous paths; that is, for almost every ω, t ↦ X_t(ω) is continuous.
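For the Poisson family, the semigroup identity F_s ∗ F_t = F_{s+t} can be verified exactly by convolving mass functions, since each convolution value is a finite sum. A sketch (the rate and times are arbitrary):

```python
from math import exp, factorial

def poisson_pmf(mu, k):
    return exp(-mu) * mu**k / factorial(k)

lam, s, t = 1.5, 0.7, 1.3   # illustrative rate and two time points

def convolved(k):
    """(F_s * F_t)(k) = sum_j F_s(j) * F_t(k - j) for F_u = Poisson(lam*u)."""
    return sum(poisson_pmf(lam * s, j) * poisson_pmf(lam * t, k - j)
               for j in range(k + 1))

# The convolution of Poisson(lam*s) and Poisson(lam*t) is Poisson(lam*(s+t)).
for k in range(6):
    print(k, convolved(k), poisson_pmf(lam * (s + t), k))
```

Here the agreement is exact up to floating-point error, not merely asymptotic: this is the binomial theorem applied to (λs + λt)^k.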
Definition 17.10 A probability distribution F on R is called infinitely divisible if for every n there exist i.i.d. random variables X_{n,1}, X_{n,2}, ..., X_{n,n} such that

S_n = X_{n,1} + X_{n,2} + ... + X_{n,n} ∼ F;

that is, there exists a distribution F_{1/n} with

F_{1/n} ∗ ... ∗ F_{1/n} (n-fold) = F,

so F has a convolution "n-th root" for every n.
Definition 17.11 Let (X_t, t ≥ 0) be a process with stationary independent increments and distribution function F_t at time t. (F_t) is said to be weakly continuous as t ↓ 0 if

lim_{t↓0} P[|X_t| > ε] = 0 for all ε > 0.
Theorem 17.12 (Lévy–Khinchine) There is a 1–1 correspondence between infinitely divisible distributions F and weakly continuous convolution semigroups (F_t, t ≥ 0) with F_1 = F.

Corollary 17.13 Every infinitely divisible law is associated with a process with stationary independent increments which is continuous in probability as t varies.

For a proof see [2] or [3].
Now we present another example. Consider accidents that occur at the times of a Poisson process with rate λ. At the time of the k-th accident let there be some variable X_k, like the damage cost covered by insurance companies. We have

Y_t = sum_{k=1}^{N_t} X_k,

where Y_t is the total cost/magnitude up to time t. It is not difficult to show that (Y_t, t ≥ 0) has stationary independent increments.

The process (Y_t) is called a compound Poisson process. Its distribution is hard to describe explicitly, but we can compute its characteristic function.

Exercise 17.14 Find a formula for the characteristic function of Y_t in terms of the rate λ of N_t and the distribution F of the X_k, that is, F(B) = P[X_k ∈ B].
Observe that

E[exp{iθ Y_t}] = sum_{n=0}^∞ E[exp{iθ Y_t} · 1{N_t = n}]
= sum_{n=0}^∞ E[exp{iθ(X_1 + X_2 + ... + X_n)} · 1{N_t = n}]
= sum_{n=0}^∞ E[exp{iθ(X_1 + X_2 + ... + X_n)}] · E[1{N_t = n}]   (since (X_k) is independent of N_t)
= sum_{n=0}^∞ (E[exp{iθ X_1}])^n · e^{−λt} (λt)^n / n!.
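Summing the series (which answers Exercise 17.14) gives E[exp{iθY_t}] = exp{λt(E[e^{iθX_1}] − 1)}. This closed form can be checked by simulation; a sketch with Exponential(1) costs and illustrative parameters:

```python
import cmath
import random

random.seed(4)
lam, t, theta = 2.0, 1.0, 0.8   # illustrative rate, horizon, CF argument

def compound_poisson(rng=random):
    """Y_t = sum of N_t i.i.d. Exponential(1) costs, N_t ~ Poisson(lam*t)."""
    # Poisson(lam*t) via unit-rate exponential gaps.
    n, s = 0, 0.0
    while True:
        s += rng.expovariate(1.0)
        if s > lam * t:
            break
        n += 1
    return sum(rng.expovariate(1.0) for _ in range(n))

samples = [compound_poisson() for _ in range(50_000)]
empirical = sum(cmath.exp(1j * theta * y) for y in samples) / len(samples)

phi_x = 1 / (1 - 1j * theta)                  # CF of Exponential(1)
predicted = cmath.exp(lam * t * (phi_x - 1))  # exp{lam*t*(phi_X(theta)-1)}
print(abs(empirical - predicted))             # small Monte Carlo error
```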
17.4.1 Computation of an Instance of the Lévy–Khinchine Formula
Given a P.P.P. N, say N(B) = sum_i 1{X_i ∈ B} for some sequence of random variables X_i. For positive measurable f, the integral

∫ f dN = sum_i f(X_i)   (17.1)

has a clear intuitive meaning and many applications.
Example 17.15X i could be the arrival time,location,and magnitude of an earth-quake:
X i =(T i ,M i ,Y i ).
f (t,m,y )represents the cost to the insurance company incurred by an earthquake at time t with magnitude m in place y.
How do we describe its distribution? Consider first the case where f is a simple function, say f = sum_{i=1}^m x_i 1{B_i}, where the B_i are disjoint events that cover the space. Then

∫ f dN = sum_i x_i N(B_i)   (17.2)

where the N(B_i) are independent random variables with Poisson(µ(B_i)) distributions. This is some new infinitely divisible distribution.
Now we need a transform. Because f ≥ 0, it is natural to look first at the Laplace transform. Take θ > 0:

E[e^{−θ ∫ f dN}] = E[e^{−θ sum_i x_i N(B_i)}]   (17.3)
= prod_i E[e^{−θ x_i N(B_i)}]   (17.4)
= prod_i exp{−µ(B_i)(1 − e^{−θ x_i})}   (17.5)
= exp{−sum_i µ(B_i)(1 − e^{−θ x_i})}   (17.6)
= exp{−∫ (1 − e^{−θ f(s)}) µ(ds)}   (17.7)

(17.4) implies (17.5) because N(B_i) ∼ Poisson(µ(B_i)), and if N ∼ Poisson(λ) then

E[e^{−θN}] = sum_{n=0}^∞ (e^{−θ})^n λ^n e^{−λ} / n! = exp{−λ(1 − e^{−θ})}.
Theorem 17.16 For every non-negative measurable function f we have, writing e^{−∞} := 0,

E[e^{−θ ∫ f dN}] = exp{∫ (e^{−θ f(s)} − 1) µ(ds)}.   (17.8)

This formula is an instance of the Lévy–Khinchine equation.

Proof: We have shown that the result holds when f is a simple function. In general, there exists a sequence of simple functions f_n with f_n ↑ f; then apply the Monotone Convergence Theorem and the Dominated Convergence Theorem.
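Formula (17.8) can be checked by simulation for a concrete choice of f. A sketch with N the point process of a rate-λ Poisson process on (0, T] (so µ = λ · Lebesgue there) and f(s) = s; all parameter choices are illustrative:

```python
import math
import random

random.seed(5)
lam, T, theta = 2.0, 3.0, 0.5   # illustrative rate, horizon, Laplace argument

def arrivals(rng=random):
    """Arrival times of a rate-lam Poisson process on (0, T]."""
    pts, s = [], 0.0
    while True:
        s += rng.expovariate(lam)
        if s > T:
            return pts
        pts.append(s)

f = lambda s: s   # a concrete non-negative measurable f

# Left side of (17.8): E[exp(-theta * integral f dN)] by Monte Carlo.
samples = [math.exp(-theta * sum(f(x) for x in arrivals()))
           for _ in range(50_000)]
empirical = sum(samples) / len(samples)

# Right side: exp{ integral_0^T (e^{-theta*s} - 1) lam ds }, done in closed form.
predicted = math.exp(lam * ((1 - math.exp(-theta * T)) / theta - T))
print(empirical, predicted)
```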
