Mutual Information and Minimum Mean-square Error in Gaussian Channels

Dongning Guo, Shlomo Shamai (Shitz), and Sergio Verdú

arXiv:cs/0412108v1 [cs.IT] 23 Dec 2004

Dongning Guo was with the Department of Electrical Engineering at Princeton University. He is now with the Department of Electrical and Computer Engineering at Northwestern University, Evanston, IL 60208, USA. Email: dGuo@. Shlomo Shamai (Shitz) is with the Department of Electrical Engineering, Technion-Israel Institute of Technology, 32000 Haifa, Israel. Email: sshlomo@ee.technion.ac.il. Sergio Verdú is with the Department of Electrical Engineering, Princeton University, Princeton, NJ 08544, USA. Email: Verdu@.

Abstract— This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It shows a new formula that connects the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output. That is, the derivative of the mutual information (nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input statistics. This relationship holds for both scalar and vector signals, as well as for discrete-time and continuous-time noncausal MMSE estimation. This fundamental information-theoretic result has an unexpected consequence in continuous-time nonlinear estimation: For any input signal with finite power, the causal filtering MMSE achieved at SNR is equal to the average value of the noncausal smoothing MMSE achieved with a channel whose signal-to-noise ratio is chosen uniformly distributed between 0 and SNR.

Index Terms— Gaussian channel, minimum mean-square error (MMSE), mutual information, nonlinear filtering, optimal estimation, smoothing, Wiener process.
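The continuous-time consequence stated in the abstract can be written as a worked equation. The restatement below is a sketch in notation introduced here for illustration (the paper's own symbols may differ): cmmse(snr) denotes the causal (filtering) MMSE at signal-to-noise ratio snr, and mmse(γ) the noncausal (smoothing) MMSE at signal-to-noise ratio γ.

    \mathrm{cmmse}(\mathrm{snr}) \;=\; \mathbb{E}_{\Gamma \sim \mathrm{Uniform}(0,\,\mathrm{snr})}\big[\mathrm{mmse}(\Gamma)\big] \;=\; \frac{1}{\mathrm{snr}} \int_0^{\mathrm{snr}} \mathrm{mmse}(\gamma)\, \mathrm{d}\gamma .

That is, averaging the smoothing MMSE over an SNR drawn uniformly from (0, snr) is the same as taking the normalized integral on the right.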

I. INTRODUCTION

This paper is centered around two basic quantities in information theory and estimation theory, namely, the mutual information between the input and the output of a channel, and the minimum mean-square error (MMSE) in estimating the input given the output. The key discovery is a relationship between the mutual information and MMSE that holds regardless of the input distribution, as long as the input-output pair is related through additive Gaussian noise.

Take for example the simplest scalar real-valued Gaussian channel with an arbitrary and fixed input distribution. Let the signal-to-noise ratio (SNR) of the channel be denoted by snr. Both the input-output mutual information and the MMSE are monotone functions of the SNR, denoted by I(snr) and mmse(snr) respectively. This paper finds that the mutual information in nats and the MMSE satisfy the following relationship regardless of the input statistics:

    \frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I(\mathrm{snr}) \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}).    (1)

Simple as it is, the identity (1) was unknown before this work. It is trivial that one can compute the value of one monotone function given the value of another (e.g., by simply composing the inverse of the latter function with the former); what is quite surprising here is that the overall transformation (1) not only is strikingly simple but is also independent of the input distribution. In fact, this relationship and its variations hold under arbitrary input signaling and the broadest settings of Gaussian channels, including discrete-time and continuous-time channels, either in scalar or vector versions.
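The identity (1) lends itself to a quick numerical check for a specific non-Gaussian input. The sketch below is not from the paper; the equiprobable binary input, the function names, and the NumPy dependency are assumptions made here for illustration. It compares a finite-difference derivative of I(snr) with half of mmse(snr).

    import numpy as np

    def gaussian_pdf(y):
        # Standard normal density.
        return np.exp(-0.5 * y**2) / np.sqrt(2.0 * np.pi)

    def info_and_mmse(snr, y=np.linspace(-12.0, 12.0, 20001)):
        # Channel: Y = sqrt(snr) * X + N, with X equiprobable on {-1, +1}, N ~ N(0, 1).
        a = np.sqrt(snr)
        dy = y[1] - y[0]
        p_pos = gaussian_pdf(y - a)            # p(y | X = +1)
        p_neg = gaussian_pdf(y + a)            # p(y | X = -1)
        p_y = 0.5 * (p_pos + p_neg)            # output density p(y)
        # Mutual information in nats: E[ log p(Y|X) / p(Y) ], averaged over both inputs.
        i_nats = 0.5 * np.sum(p_pos * np.log(p_pos / p_y)) * dy \
               + 0.5 * np.sum(p_neg * np.log(p_neg / p_y)) * dy
        # Conditional-mean estimate E[X | Y = y] and the resulting MMSE.
        x_hat = 0.5 * (p_pos - p_neg) / p_y
        mmse = 1.0 - np.sum(p_y * x_hat**2) * dy   # E[X^2] = 1 for this input
        return i_nats, mmse

    snr, h = 1.5, 1e-3
    i_plus, _ = info_and_mmse(snr + h)
    i_minus, _ = info_and_mmse(snr - h)
    _, mmse = info_and_mmse(snr)
    print("dI/dsnr (finite difference):", (i_plus - i_minus) / (2.0 * h))
    print("mmse/2                     :", 0.5 * mmse)

The two printed values should agree to several decimal places. For a standard Gaussian input the check is immediate in closed form: I(snr) = (1/2) ln(1 + snr) and mmse(snr) = 1/(1 + snr), so differentiating the former recovers half the latter; at snr = 0 the derivative equals one half of the (unit) input variance.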
In a wider context, the mutual information and mean-square error are at the core of information theory and estimation theory respectively. The input-output mutual information is an indicator of how much coded information can be pumped through a channel reliably given a certain input signaling, whereas the MMSE measures how accurately each individual input sample can be recovered using the channel output. Interestingly, (1) shows the strong relevance of mutual information to estimation and filtering and provides a non-coding operational characterization for mutual information. Thus not only is the significance of an identity like (1) self-evident, but the relationship is intriguing and deserves thorough exposition.

At zero SNR, the right-hand side of (1) is equal to one half of the input variance. In that special case the formula, and in particular the fact that at low SNR the mutual information is insensitive to the input distribution, has been remarked before [1], [2], [3]. Relationships between the local behavior of mutual information at vanishing SNR and the MMSE of the estimation of the output given the input are given in [4].

Formula (1) can be proved using the new “incremental channel” approach, which gauges the decrease in mutual information due to an infinitesimally small additional Gaussian noise. The change in mutual information can be obtained as the input-output mutual information of a derived Gaussian channel whose SNR is infinitesimally small, a channel for which the mutual information is essentially linear in the estimation error; this relates the rate of mutual information increase to the MMSE.

Another rationale for the relationship (1) traces to the geometry of Gaussian channels, or, more tangibly, the geometric properties of the likelihood ratio associated with signal detection in Gaussian noise. Basic information-theoretic notions are firmly associated with the likelihood ratio; foremost, the mutual information is expressed as the expectation of the log-likelihood ratio of conditional and unconditional measures. The likelihood ratio also plays a fundamental role in detection and estimation, e.g., in hypothesis testing it is compared to a threshold to decide which hypothesis to take. Moreover, the likelihood ratio is central to the connection between detection and estimation, in either the continuous-time [5], [6], [7] or discrete-time [8] setting. In fact, Esposito [9] and Hatsell