Applications of Matrix Theory in Communications
Emmanuel Candès and Terence Tao, "Near-optimal signal recovery from random projections: Universal encoding strategies?", IEEE Trans. on Information Theory, 52(12), pp. 5406-5425, December 2006.
[Figure: an original image and its sparse representations: the U, S, V factors of the SVD; a 2-layer discrete cosine transform with the upper-left corner zoomed in; and the Haar wavelet basis]
3.1 Sparsity: Applications and Development
Sometimes on Weibo, an interesting piece of news originates from certain users and is forwarded many times by other users. We know who forwards the messages and when they are forwarded. Now we want to reconstruct the relationship network (who follows whose Weibo) from this information. This can be abstracted as a topological graph. Sparsity: each node is linked to only a small number of neighbors.
3.1 Sparsity: Applications and Development
The Netflix prize:
About a million users and 25,000 movies. Known rankings are sparsely distributed. Predict unknown ratings.
For A of size m × n, only an x that is at most m/2-sparse can be guaranteed to be recovered.
In particular, a Gaussian random matrix (each element generated independently from a normal distribution) satisfies the condition of the lemma with overwhelming probability. In many situations noise is present, so it is of more interest to solve y = Ax + e.
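This claim can be spot-checked numerically. The following is a minimal sketch (assuming numpy is available; all sizes and names are illustrative): it draws a Gaussian measurement matrix, samples random subsets of 2S columns to confirm they are linearly independent, and forms the noisy measurement model y = Ax + e.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, S = 10, 30, 3                  # measurement matrix size, sparsity level

# Gaussian measurement matrix: entries i.i.d. N(0, 1)
A = rng.standard_normal((m, n))

# Spot-check the lemma's condition: every set of 2S columns should be
# linearly independent (rank 2S).  We sample random subsets rather than
# enumerating all of them.
for _ in range(200):
    cols = rng.choice(n, size=2 * S, replace=False)
    assert np.linalg.matrix_rank(A[:, cols]) == 2 * S

# Noisy measurement model y = Ax + e
x = np.zeros(n)
x[[2, 11, 25]] = [1.5, -2.0, 0.7]    # an S-sparse signal
e = 0.01 * rng.standard_normal(m)    # small noise
y = A @ x + e
```

Sampling subsets is only evidence, not a proof; for Gaussian matrices the full-rank property holds with probability 1 for each subset.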
3.2 Sparsity Rendering Algorithms
In the last chapter we introduced the ℓ1 sparsity regularizer. One way to obtain a sparse recovery: x̂ = argmin_x ||y - Ax||_2^2 + λ||x||_1
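One standard way to solve this ℓ1-regularized least-squares problem is proximal gradient descent (ISTA), which alternates a gradient step on the quadratic term with elementwise soft-thresholding. A minimal numpy sketch (the slide does not prescribe an algorithm; ISTA, the step-size choice, and all names here are our own illustration):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """Minimize ||y - A x||_2^2 + lam * ||x||_1 by proximal gradient."""
    # Step size 1/L, with L the Lipschitz constant of the gradient 2 A^T A
    L = 2.0 * np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(1)
m, n = 20, 50
A = rng.standard_normal((m, n))
x_true = np.zeros(n); x_true[[4, 17, 33]] = [2.0, -1.5, 1.0]
y = A @ x_true + 0.01 * rng.standard_normal(m)

x_hat = ista(A, y, lam=0.1)
obj = lambda x: np.sum((y - A @ x) ** 2) + 0.1 * np.sum(np.abs(x))
```

With this step size ISTA is guaranteed to decrease the objective at every iteration, which matches the convexity remark below.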
3.1 Sparsity: Applications and Development
In 2006, monumental papers of compressive sensing were published:
Emmanuel Candès, Justin Romberg, and Terence Tao, "Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information", IEEE Trans. on Information Theory, 52(2), pp. 489-509, February 2006.
David Donoho, "Compressed sensing", IEEE Trans. on Information Theory, 52(4), pp. 1289-1306, April 2006.
3.2 Sparsity Rendering Algorithms
Note that this is a convex optimization problem, so a global optimum is guaranteed. The bad news is that we have to resort to trial and error to tune λ.
3.2 Sparsity Rendering Algorithms
From the above answers: in all cases 2S ≤ rank(A) ≤ m, so for a given A, only vectors x that are at most m/2-sparse can be guaranteed to be recovered.
3.2 Sparsity Rendering Algorithms
Basis Pursuit (BP) is such an ℓ1-regularized optimization algorithm for the sparse signal recovery problem. It assumes that no noise is present, so the optimization problem is: min_x ||x||_1 subject to y = Ax. With some rearranging, this problem can be recast as a linear program.
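The rearranging works by splitting x = u - v with u, v ≥ 0, so that ||x||_1 = Σ(u + v) and the equality constraint becomes linear in (u, v). A minimal sketch using scipy's `linprog` (scipy and the problem sizes are our assumptions, not part of the slides):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
m, n = 12, 30
A = rng.standard_normal((m, n))
x_true = np.zeros(n); x_true[[3, 20]] = [1.0, -2.0]
y = A @ x_true                       # noiseless measurements, as BP assumes

# Basis Pursuit: min ||x||_1  s.t.  A x = y.
# Split x = u - v with u, v >= 0; then ||x||_1 = sum(u + v) and the
# problem is a standard-form linear program in the 2n variables (u, v).
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]

assert np.allclose(A @ x_hat, y, atol=1e-6)          # feasible
assert np.abs(x_hat).sum() <= np.abs(x_true).sum() + 1e-6  # minimal l1 norm
```

Since x_true is itself feasible, the LP optimum can never have larger ℓ1 norm than x_true, which is what the final assertion checks.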
3.1 Sparsity: Applications and Development
As seen in the linear regression of the last chapter, we are actually solving this problem: y(x, w) = Σ_j w_j φ_j(x) + ε, where ε is noise. We have learned that if the number of basis functions M is close to or exceeds the number of samples N, there will be serious over-fitting. To suppress over-fitting, we can add a regularizer. In particular, a sparse regularizer (LASSO) renders the weight vector sparse, selecting a small number of basis functions.
Chapter 3
Sparse Signal Recovery
3.1 Sparsity: Applications and Development
What is sparsity? 1. Many data mining tasks can be represented using a vector or a matrix. 2. Sparsity means that the vector or matrix contains many zeros.
David Donoho was awarded the Shaw Prize. [Photos: David Donoho, Emmanuel Candès, Terence Tao]
3.2 Sparsity Rendering Algorithms
The central problem in compressive sensing is the following: given a sparse x, perform the compression y = Ax, where A is an underdetermined m × n matrix (m < n). The target is to recover x from y. The bad news: A is underdetermined, and we know that y = Ax normally has infinitely many solutions. The good news: we have prior information that x is sparse.
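A small numpy sketch of why the prior is needed (sizes and names are illustrative): the minimum ℓ2-norm solution given by the pseudoinverse also satisfies y = Ax, but it is dense, while the signal we want is sparse.

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 8, 20                          # underdetermined: m < n
A = rng.standard_normal((m, n))
x_sparse = np.zeros(n); x_sparse[[1, 7]] = [3.0, -1.0]
y = A @ x_sparse

# The minimum l2-norm solution also satisfies y = A x exactly ...
x_min_norm = np.linalg.pinv(A) @ y
assert np.allclose(A @ x_min_norm, y)

# ... but it is dense, while the solution we want has only 2 nonzeros.
# Without the sparsity prior there is no way to prefer x_sparse among
# the infinitely many solutions of y = A x.
n_nonzero_dense = np.sum(np.abs(x_min_norm) > 1e-8)
n_nonzero_sparse = np.sum(np.abs(x_sparse) > 1e-8)
```

Both vectors are perfectly consistent with the measurements; only the sparsity prior singles out the one we are after.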
2. Discrete Cosine Transform
3. Wavelet Transform. Note that the black pixels indicate matrix values close to zero, which makes the matrix easy to compress.
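The sparsifying effect of such a transform can be seen in one dimension with scipy's DCT (scipy and the test signal are our assumptions): a smooth signal is dense in the "pixel" domain, but after the cosine transform most of its energy sits in a handful of coefficients.

```python
import numpy as np
from scipy.fft import dct

# A smooth 1-D signal (think of one row of a natural image):
# every sample is nonzero in the original domain
t = np.linspace(0, 1, 64)
x = t ** 2

# The DCT concentrates most of the energy in a few low-frequency
# coefficients, so small coefficients can be discarded (compression)
c = dct(x, norm='ortho')
energy = np.sort(c ** 2)[::-1]
ratio = energy[:5].sum() / energy.sum()
print(f"top-5 DCT coefficients hold {ratio:.1%} of the energy")
```

Keeping only the largest coefficients and inverting the transform is exactly the compression scheme the slide describes.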
3.1 Sparsity: Applications and Development
Collaborative filtering:
Customers are asked to rank items. Not all customers ranked all items. Predict the missing rankings.
3.2 Sparsity Rendering Algorithms
Here are two concerns. 1: How sparse should x be so that it can be accurately recovered? 2: Is there any condition on A? For question 1, we know that y = Ax has infinitely many solutions; thus we have to attach some condition to x to make the solution unique. As x is sparse, we should seek the sparsest solution of y = Ax. For question 2, we have the following lemma. Suppose an m × n matrix A is such that every set of 2S columns of A is linearly independent. Then an S-sparse vector x (a vector with at most S nonzero elements) can be reconstructed uniquely from y = Ax.
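The uniqueness claim of the lemma can be verified by brute force on a tiny instance (a numpy sketch with illustrative sizes; a generic Gaussian matrix has every 2S columns independent with probability 1): enumerate every S-sized support and check which ones admit an exact solution of y = Ax. The lemma predicts exactly one.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(5)
m, n, S = 5, 8, 2
A = rng.standard_normal((m, n))   # generic: every 2S = 4 columns independent

x_true = np.zeros(n); x_true[[2, 6]] = [1.0, -2.0]
y = A @ x_true

# Search all C(n, S) supports for exact solutions of y = A x.
# By the lemma, only the true support can fit y exactly.
exact_supports = []
for cols in combinations(range(n), S):
    sub = A[:, list(cols)]
    coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
    if np.linalg.norm(sub @ coef - y) < 1e-8:
        exact_supports.append(cols)
```

This exhaustive search is exponential in S, which is precisely why the tractable ℓ1-based algorithms of Section 3.2 are needed.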
3.1 Sparsity: Applications and Development
In image processing, to compress an image, we first apply a transformation to the pixel matrix to render it sparse. Such transformations include: 1. Singular Value Decomposition