Stochastic Processes Lecture Notes (in English) - 3
Total Probability Theorem:
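For reference, since the formula itself does not appear in the extracted text: for a partition A_1, …, A_n of the sample space and any event B, the standard statement is
P[B] = \sum_{i=1}^{n} P[B | A_i] P[A_i].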
Thomas Bayes (born c. 1701, London, England; died 7 April 1761, aged 59, Tunbridge Wells, Kent, England; nationality English).
[Portrait: no earlier portrait or claimed portrait of Bayes survives.]
Conditioning, Marginals and Total Probability
Combining conditioning, marginals, and total probability, we get a variety of useful identities:
Conditional probability density function
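The defining formula is not shown in the extracted text; the usual definition, for f_Y(y) > 0, is
f_{X|Y}(x | y) = f_{X,Y}(x, y) / f_Y(y).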
Solution (Problem 2):
P[chip defective] = P[def. | A] P[A] + P[def. | B] P[B] + P[def. | C] P[C] = 10^{-3} p_A + 5⋅10^{-3} p_B + 10⋅10^{-3} p_C
Continue…
P[A | chip defective] = P[def. | A] P[A] / P[def.] = 10^{-3} p_A / (10^{-3} p_A + 5⋅10^{-3} p_B + 10⋅10^{-3} p_C) = p_A / (p_A + 5 p_B + 10 p_C)
σ_X^2 = Var(X) = E[(X − μ_X)^2] = \int_{-∞}^{∞} (x − μ_X)^2 f_X(x) dx
1.5 Moment Generating Function, Characteristic Function and Laplace Transform
The moment generating function of X is defined by
ψ(t) = E[e^{tX}] = \int e^{tx} dF(x).
When a moment generating function exists, it uniquely determines the distribution. This is quite important because it enables us to characterize the probability distribution of a random variable by its generating function. Example: the normal distribution with mean μ and variance σ^2, whose probability density function is
f(x) = \frac{1}{\sqrt{2πσ^2}} e^{−(x−μ)^2 / (2σ^2)}.
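As a worked version of this example (the derivation is standard, not taken from the slide), completing the square in the exponent gives the moment generating function of a normal random variable with mean μ and variance σ^2:
ψ(t) = \int_{-∞}^{∞} e^{tx} \frac{1}{\sqrt{2πσ^2}} e^{−(x−μ)^2 / (2σ^2)} dx = e^{μt + σ^2 t^2 / 2}.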
Solution (Problem 3):
P[H] = P[H | coin 1] P[coin 1] + P[H | coin 2] P[coin 2] = p1⋅(1/2) + p2⋅(1/2) = (1/2)(p1 + p2)
Now using Bayes' rule,
P[coin 2 | H] = P[H | coin 2] P[coin 2] / P[H] = ((1/2) p2) / ((1/2)(p1 + p2)) = p2 / (p1 + p2)
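As a quick numerical check (not part of the original notes), the Python sketch below simulates the experiment for assumed values p1 = 0.3 and p2 = 0.8 and compares the empirical conditional frequency with the formula p2/(p1 + p2).

import random

# Assumed example values for the two coins (the problem keeps them symbolic).
p1, p2 = 0.3, 0.8
trials = 200_000

heads = 0          # number of tosses that came up heads
heads_coin2 = 0    # number of those tosses that used coin 2

for _ in range(trials):
    coin = random.choice((1, 2))      # pick a coin uniformly at random
    p = p1 if coin == 1 else p2
    if random.random() < p:           # the toss comes up heads
        heads += 1
        if coin == 2:
            heads_coin2 += 1

print("simulated P[coin 2 | H]:", heads_coin2 / heads)
print("formula   p2/(p1+p2)  :", p2 / (p1 + p2))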
Bayes' Rule
Suppose we are given:
Prior probabilities: P[A_i], for a partition A_1, …, A_n of the sample space.
Transition probabilities: P[B | A_i].
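The rule combines these two ingredients; the formula itself is not preserved in the extracted text, but its standard form is
P[A_j | B] = P[B | A_j] P[A_j] / \sum_i P[B | A_i] P[A_i].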
Moment Generating Function
Example:
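The body of this example is not preserved; as an illustration of the same idea (not necessarily the original example), the moment generating function of an exponential random variable with rate λ is
ψ(t) = \int_0^{∞} e^{tx} λ e^{−λx} dx = λ / (λ − t), for t < λ,
so that E[X] = ψ′(0) = 1/λ and E[X^2] = ψ″(0) = 2/λ^2.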
Characteristic Functions
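The defining formula is not preserved in the extracted text; the standard definition is
φ(t) = E[e^{itX}] = \int e^{itx} dF(x),
which, unlike the moment generating function, exists for every random variable because |e^{itx}| = 1.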
Example:
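As an illustration (not necessarily the original example), the characteristic function of a normal random variable with mean μ and variance σ^2 is
φ(t) = e^{iμt − σ^2 t^2 / 2},
and, formally, φ(t) = ψ(it) whenever the moment generating function ψ exists.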
Laplace Transform
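The slide body is not preserved; for a nonnegative random variable X, the usual setting for this transform, the Laplace transform of its distribution is
\tilde{F}(s) = E[e^{−sX}] = \int_0^{∞} e^{−sx} dF(x), s ≥ 0.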
1.6 Conditional Expectation
Conditional Probability Mass Function
How can we compute it?
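The formulas are not preserved in the extracted text; the standard definitions for discrete X and Y are
p_{X|Y}(x | y) = p_{X,Y}(x, y) / p_Y(y), and E[X | Y = y] = \sum_x x ⋅ p_{X|Y}(x | y).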
Problem 1: A nonsymmetric binary communication channel is shown below. Assume the inputs are equiprobable. Find (a) the probability that the output is 0; (b) the probability that the input was 0 given that the output is 1; (c) the probability that the input was 1 given that the output is 1.
Channel diagram: input 0 goes to output 0 with probability 1 − ε1 and to output 1 with probability ε1; input 1 goes to output 1 with probability 1 − ε2 and to output 0 with probability ε2.
Solution Continues…
Solution (Problem 1): Let X denote the input and Y the output.
a) P[Y = 0] = P[Y = 0 | X = 0] P[X = 0] + P[Y = 0 | X = 1] P[X = 1]
(Problem 2 solution, continued) Similarly,
P[C | chip defective] = 10 p_C / (p_A + 5 p_B + 10 p_C)
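For concreteness (the slides keep the source proportions p_A, p_B, p_C symbolic; the values below are assumed for illustration), a short Python sketch that evaluates the total defect probability and the posteriors:

# Assumed source proportions; the slides keep these symbolic as pA, pB, pC.
pA, pB, pC = 0.5, 0.3, 0.2
# Defect probabilities per source, from the problem statement.
d = {"A": 0.001, "B": 0.005, "C": 0.01}
prior = {"A": pA, "B": pB, "C": pC}

# Total probability of a defective chip.
p_def = sum(d[s] * prior[s] for s in d)

# Posterior probability of each source given a defective chip (Bayes' rule).
posterior = {s: d[s] * prior[s] / p_def for s in d}

print("P[defective]     =", p_def)
print("P[A | defective] =", posterior["A"])   # equals pA / (pA + 5*pB + 10*pC)
print("P[C | defective] =", posterior["C"])   # equals 10*pC / (pA + 5*pB + 10*pC)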
Problem 3: One of two coins is selected at random and tossed. The first coin comes up heads with probability p1 and the second coin with probability p2. What is the probability that coin 2 was used, given that heads occurred? Solution: see the worked computation above.
Problem 2: A computer manufacturer uses chips from three sources. Chips from sources A, B, and C are defective with probabilities 0.001, 0.005, and 0.01, respectively. If a randomly selected chip is found to be defective, find the probability that the source was A, and the probability that the source was C.
1.4 Moments and Central Moments
Definition: The kth moment of X is E[X^k]. The first moment is the average (expectation):
μ_X = E[X] = \int_{-∞}^{∞} x f_X(x) dx
Definition: The kth central moment of X is E[(X − μ_X)^k]. The second central moment is the variance, σ_X^2 = Var(X) = E[(X − μ_X)^2].
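A small worked example (not from the original slides): for X uniform on (0, 1), so that f_X(x) = 1 on (0, 1),
μ_X = \int_0^1 x dx = 1/2, E[X^2] = \int_0^1 x^2 dx = 1/3, σ_X^2 = E[X^2] − μ_X^2 = 1/3 − 1/4 = 1/12.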
Visualizing Conditioning
View the joint PMF along the slice Y = y and renormalize.
• Probability of A, given that B occurred
• B is now our new universe
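In symbols (the formula is not shown in the extracted text), for P[B] > 0,
P[A | B] = P[A ∩ B] / P[B].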
Total Probability Formula
Definition: A partition of the sample space Ω is a collection of disjoint events A_1, A_2, …, A_n such that A_1 ∪ A_2 ∪ … ∪ A_n = Ω.
f_X(x) = \int_{-∞}^{∞} f_{X|Y}(x | y) f_Y(y) dy
f_{X,Y}(x, y) = \int_{-∞}^{∞} f_{X,Y|Z}(x, y | z) f_Z(z) dz
Prior distribution: f_X(x)
Transition PDFs: f_{Y|X}(y | x)
Question: given an observation {Y = y}, what information do we have about X?
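The standard answer, the continuous form of Bayes' rule (the formula itself is not preserved in the text), is
f_{X|Y}(x | y) = f_{Y|X}(y | x) f_X(x) / \int_{-∞}^{∞} f_{Y|X}(y | x′) f_X(x′) dx′.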
(Problem 1 solution, continued)
a) P[Y = 0] = (1/2)(1 − ε1) + (1/2)ε2
b) P[X = 0 | Y = 1] = P[Y = 1 | X = 0] P[X = 0] / P[Y = 1] = ((1/2)ε1) / (1 − (1/2)(1 − ε1) − (1/2)ε2) = ε1 / ((1 − ε2) + ε1)
c) P[X = 1 | Y = 1] = 1 − P[X = 0 | Y = 1] = (1 − ε2) / ((1 − ε2) + ε1)
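A short numerical check (the crossover probabilities are kept symbolic in the problem; ε1 = 0.1 and ε2 = 0.2 below are assumed for illustration) comparing direct enumeration with the closed-form answers:

# Assumed crossover probabilities; the problem leaves eps1, eps2 symbolic.
eps1, eps2 = 0.1, 0.2
px = {0: 0.5, 1: 0.5}                      # equiprobable inputs
# Channel transition probabilities P[Y=y | X=x], keyed by (y, x).
pyx = {(0, 0): 1 - eps1, (1, 0): eps1,
       (0, 1): eps2,     (1, 1): 1 - eps2}

# a) Total probability that the output is 0.
p_y0 = sum(pyx[(0, x)] * px[x] for x in px)

# b), c) Posteriors of the input given that the output is 1 (Bayes' rule).
p_y1 = 1 - p_y0
post_x0 = pyx[(1, 0)] * px[0] / p_y1
post_x1 = pyx[(1, 1)] * px[1] / p_y1

print("P[Y=0]       :", p_y0,    "formula:", 0.5 * (1 - eps1) + 0.5 * eps2)
print("P[X=0 | Y=1] :", post_x0, "formula:", eps1 / ((1 - eps2) + eps1))
print("P[X=1 | Y=1] :", post_x1, "formula:", (1 - eps2) / ((1 - eps2) + eps1))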