Decision-Level Identity Fusion

P(Oj | D1, ..., Dn), j = 1, 2, ..., M
• Inputs to the Bayesian formulation: P(Di|Oj) for each sensor and each entity or hypothesis Oj, and the a priori probabilities P(Oj)
Bayesian inference
Decision-Level Fusion – Identity Fusion
[Figure: each sensor produces its own identity declaration; the declarations are associated and then combined into a fused identity declaration]
Introduction
• Decision-Level Fusion Techniques
  - Classical inference
  - Bayesian inference
  - Dempster-Shafer's method
  - Generalized evidence processing theory
  - Heuristic methods
• These declarations are then combined via a generalization of the Bayesian formulation described before. This provides an updated joint probability for each possible entity Oj.
Decision-Level Identity Fusion
Tan Xin
Lab 5, System Engineering Dept.
Contents
1. Introduction
2. Classical inference
3. Bayesian inference
4. Dempster-Shafer's method*
5. Generalized Evidence Processing (GEP) Theory
6. Heuristic methods for identity fusion
7. Implementation and trade-offs
Classical inference
• Main technique – hypothesis testing
• Two assumptions are required:
  1. An exhaustive and mutually exclusive set of hypotheses can be defined;
Bayesian inference
• Bayesian formulation
P(Hi | E) = P(E, Hi) / P(E) = P(E | Hi) P(Hi) / Σi P(E | Hi) P(Hi),  with Σi P(Hi) = 1
Suppose H1, H2, ..., Hn represent mutually exclusive and exhaustive hypotheses.
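A minimal numeric sketch of the formula above, with made-up priors and likelihoods for three mutually exclusive hypotheses (none of the numbers come from the slides):

```python
# Bayes rule over mutually exclusive, exhaustive hypotheses H1..H3.
# Priors and likelihoods are illustrative values only.

priors = {"H1": 0.5, "H2": 0.3, "H3": 0.2}       # P(Hi), sums to 1
likelihoods = {"H1": 0.9, "H2": 0.4, "H3": 0.1}  # P(E | Hi)

# P(E) = sum_i P(E | Hi) P(Hi)  (total probability)
p_e = sum(likelihoods[h] * priors[h] for h in priors)

# P(Hi | E) = P(E | Hi) P(Hi) / P(E)
posteriors = {h: likelihoods[h] * priors[h] / p_e for h in priors}

print(posteriors)  # posteriors again sum to 1
```

Observing evidence that is most likely under H1 shifts probability mass toward H1 while the posteriors stay normalized.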
• 2. Allows incorporation of a priori knowledge about the likelihood of a hypothesis being true at all.
• 3. Permits the use of subjective probabilities for the a priori probabilities of hypotheses, and for the probability of evidence given a hypothesis.
[Figure continues: Sensor #n → classifier → declaration Dn, with likelihood P(Dn|Oj)]
P(Oj | D1 ∧ D2 ∧ ... ∧ Dn), j = 1, 2, ..., M
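A sketch of computing this joint posterior: assuming the sensor declarations are conditionally independent given the true object type, P(Oj | D1 ∧ ... ∧ Dn) ∝ P(Oj) Πi P(Di | Oj). The priors and likelihood tables below are hypothetical illustration values, not from the slides:

```python
# Decision-level Bayesian fusion of sensor declarations D1..Dn over
# object types O1..OM, assuming declarations are conditionally
# independent given the true type. All numbers are hypothetical.

priors = [0.6, 0.3, 0.1]  # P(Oj) for O1, O2, O3

# P(Di | Oj): row = declared type, column = true type, one table per sensor.
sensor1 = [[0.8, 0.1, 0.1],
           [0.1, 0.8, 0.2],
           [0.1, 0.1, 0.7]]
sensor2 = [[0.7, 0.2, 0.1],
           [0.2, 0.7, 0.2],
           [0.1, 0.1, 0.7]]

def fuse(declarations, tables, priors):
    """Return P(Oj | D1 ^ ... ^ Dn) for all j, and the MAP type index."""
    unnorm = []
    for j, p in enumerate(priors):
        like = p
        for d, table in zip(declarations, tables):
            like *= table[d][j]          # multiply in P(Di | Oj)
        unnorm.append(like)
    total = sum(unnorm)
    post = [u / total for u in unnorm]
    return post, max(range(len(post)), key=post.__getitem__)

# Both sensors declare type O2 (index 1):
post, map_type = fuse([1, 1], [sensor1, sensor2], priors)
print(post, map_type)
```

Two agreeing declarations overwhelm the prior preference for O1, and the MAP decision logic picks O2.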
Bayesian inference
• Disadvantages
  1. Difficulty in defining the a priori functions P(Oj);
  2. Complexity when there are multiple potential hypotheses and
Bayesian inference
• Features
  1. Provides a determination of the probability of a hypothesis being true, given the evidence. Classical inference gives us the probability that an observation could be ascribed to an object or event, given an assumed hypothesis.
• Requires a priori knowledge and computation of multidimensional probability density functions (a serious disadvantage).
Classical inference
• Additional disadvantages
Bayesian inference
• Multisensor fusion
  For each sensor, a priori data provide an estimate of the probability that the sensor would declare the object to be of type i given that the object is of type j, denoted P(Di|Oj).
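In practice these likelihoods P(Di|Oj) can be estimated from a sensor's historical confusion counts; a minimal sketch with made-up counts:

```python
# Building P(Di | Oj) from hypothetical confusion counts.
# counts[i][j] = times the sensor declared type i when the true type was j.

counts = [[80, 12,  5],
          [15, 70, 10],
          [ 5, 18, 85]]

n_types = len(counts)
# Normalize each column j so that sum_i P(Di | Oj) = 1.
col_totals = [sum(counts[i][j] for i in range(n_types)) for j in range(n_types)]
p_d_given_o = [[counts[i][j] / col_totals[j] for j in range(n_types)]
               for i in range(n_types)]

print(p_d_given_o[0][0])  # P(D1 | O1)
```

Each column of the normalized table is a conditional distribution over the sensor's possible declarations for one true object type.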
  2. We can compute the probability of an observation, given an assumed hypothesis.
Classical inference
• Generalizes to include multidimensional data from multiple sensors.
Feature-level fusion (feature extraction, identity declaration)
Data-level fusion (data fused)
Introduction
[Figure: Sensor A, Sensor B, ..., Sensor N → feature extraction → identity declaration]
Classical inference
Theoretical base
Statistical inference techniques seek to draw conclusions about an underlying mechanism or distribution, based on an observed sample of data.
Introduction
• Decision-level fusion
Seeks to process identity declarations from multiple sensors to achieve a joint declaration of identity.
Decision-level fusion (joint identity declaration)
• The technique may be based on either classical probabilities or subjective probabilities.
Subjective probabilities lack mathematical rigor and physical interpretation. Nevertheless, if used with care, they can be useful in a data fusion inference processor.
P(A) = lim (n→∞) fn(A) = lim (n→∞) k/n,  where event A occurs k times in n trials
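This limit can be illustrated by simulation: the relative frequency fn(A) = k/n of a simulated event approaches its probability as n grows (the event probability 0.3 below is an arbitrary choice):

```python
import random

random.seed(0)
p_true = 0.3  # arbitrary event probability chosen for this simulation

for n in (100, 10_000, 100_000):
    k = sum(1 for _ in range(n) if random.random() < p_true)  # occurrences of A
    freq = k / n                                              # fn(A) = k / n
    print(n, round(freq, 4))
```

The printed frequencies scatter widely at n = 100 and settle near 0.3 as n grows, which is the empirical-probability interpretation the slide describes.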
Classical inference
• One disadvantage
Strictly speaking, empirical probabilities are only defined for repeatable events. Classical inference methods utilize empirical probability and hence are not strictly applicable to nonrepeatable events, unless some model can be developed to compute the requisite probabilities.
  1. Only two hypotheses can be assessed at a time;
  2. Complexities arise for multivariate data;
  3. Does not take advantage of a priori likelihood assessments.
Classical inference
• Main technique – hypothesis testing
Define two hypotheses:
  1. A null hypothesis, H0
  2. An alternative hypothesis, H1
Test logic:
  1. Assume that the null hypothesis (H0) is true;
  2. Examine the consequences of H0 being true in the sampling distribution for the statistic;
  3. Perform a hypothesis test: if the observations have a high probability of being observed when H0 is true, declare that the data do not contradict H0;
  4. Otherwise, declare that the data tend to contradict H0.
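The steps above can be sketched with a simple binomial test; H0 and the observed counts below are hypothetical illustration values:

```python
from math import comb

# H0: a detector's false-alarm rate is p0 = 0.1.
# Observed: k = 24 false alarms in n = 100 trials (hypothetical data).
p0, n, k = 0.1, 100, 24

# One-sided p-value: P(K >= k | H0) for K ~ Binomial(n, p0).
p_value = sum(comb(n, i) * p0**i * (1 - p0)**(n - i) for i in range(k, n + 1))

alpha = 0.05
if p_value < alpha:
    print(f"p = {p_value:.4g}: data tend to contradict H0")
else:
    print(f"p = {p_value:.4g}: data do not contradict H0")
```

With 24 false alarms where H0 predicts about 10, the observation has very low probability under H0, so the test declares that the data tend to contradict H0.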
  multiple conditionally dependent events;
  3. Requirement that competing hypotheses be mutually exclusive.
[Figure: Bayesian fusion architecture — Sensor #1 observables → classifier → declaration D1 with P(D1|Oj); Sensor #2 → D2 with P(D2|Oj); further sensors likewise; all declarations feed the Bayesian combination formula, and decision logic (MAP, thresholded MAP, etc.) yields the fused identity declaration]
Usage: identification of defective parts in manufacturing, and analysis of faults in system diagnosis and maintenance.

Bayesian inference updates the likelihood of a hypothesis given a previous likelihood estimate and additional evidence (observations).
Classical inference typically assumes an empirical probability model. Empirical probability assumes that the observed frequency distribution approximates the true probability as the number of trials becomes large.