Hello, and welcome to Paper of the Day (Po’D): Precise Undersampling Theorems Edition. Today’s paper happens to have the same title! D. L. Donoho and J. Tanner, “Precise undersampling theorems,” Proc. IEEE, vol. 98, no. 6, pp. 913-924, June 2010.

This paper paints an extremely comprehensible picture of the current state of our understanding of when we can expect to perfectly recover a sparse signal from compressive measurements. It is also the paper everyone should read to understand the phase transition and its significance. In short, a phase transition is a continuous boundary in a space of problems that separates the region where the majority of problems are solvable from the region where they are not.

In this paper, Donoho and Tanner focus on three particular problems to which convex optimization provides surprising solutions *most of the time* given the right sparsity \(k\):

- C: \(k\)-sparse \(N\)-dimensional signal distributed Gaussian, sensed by \(m\) sinusoids with randomly selected frequencies, recovery by $$\min ||\vx||_1 \; \textrm{subject to} \; \MA\vx = \vb.$$
- T: \(k\)-sparse \(N\)-dimensional non-negative signal, sensed by the first \(m\) frequencies of the DFT matrix, recovery by $$\min \mathbf{1}^T\vx \; \textrm{subject to} \; \MA\vx = \vb, \vx \succeq 0.$$
- I: \(N\)-dimensional signal with entries mostly in \(\{0,1\}\), but with \(k\) elements distributed in \((0,1)\) (I think), sensed by the first \(m\) frequencies of the DFT matrix, recovery by the feasibility problem $$\textrm{find} \; \vx \; \textrm{subject to} \; \MA\vx = \vb, \; \mathbf{0} \preceq \vx \preceq \mathbf{1}.$$

(I don’t understand the objective of that last one. Maybe it is asking whether there exists an \(\vx\) satisfying those constraints.)
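For concreteness, here is a small numerical sketch of problems C and T using `scipy.optimize.linprog`. Note my assumptions: I use a Gaussian sensing matrix in place of the random partial-DFT matrices of the paper, and the dimensions \(N, m, k\) are arbitrary choices that place us well below the phase transition, so recovery should succeed.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
N, m, k = 60, 30, 3  # deep in the recovery region: delta = 0.5, rho = 0.1

# Problem C: k-sparse signed signal. (Gaussian A stands in for the
# random partial-Fourier sensing of the paper.)
x0 = np.zeros(N)
x0[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, N))
b = A @ x0

# min ||x||_1 s.t. Ax = b, via the standard split x = u - v with u, v >= 0:
# the LP objective 1^T(u + v) equals ||x||_1 at the optimum.
res = linprog(np.ones(2 * N), A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=(0, None), method="highs")
x_hat = res.x[:N] - res.x[N:]
print("C recovered:", np.allclose(x_hat, x0, atol=1e-4))

# Problem T: k-sparse non-negative signal; min 1^T x s.t. Ax = b, x >= 0
# is already a linear program, no splitting needed.
x0t = np.zeros(N)
x0t[rng.choice(N, k, replace=False)] = rng.random(k) + 0.1
rest = linprog(np.ones(N), A_eq=A, b_eq=A @ x0t,
               bounds=(0, None), method="highs")
print("T recovered:", np.allclose(rest.x, x0t, atol=1e-4))
```

Both prints should report success at this \((\delta,\rho)\); move \(k\) up toward \(m\) and they start to fail.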

The phase transition for these problems arises quite naturally from counting polytope faces in the original and projected spaces. The only difference is that the type of polytope changes for each of these problems.

Consider an \(m\times N\) random matrix \(\MA\) with elements \(\sim \mathcal{N}(0,1)\). Let \(m = \delta N\), where \(\delta \in [0,1]\) describes the undersampling (indeterminacy), and \(k = \rho m\), where \(\rho \in [0,1]\) describes the sparsity.

Every point in the phase space \((\delta,\rho)\) thus describes a problem with a given undersampling and sparsity.

We want to know where exact recovery is guaranteed most of the time in this space.
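One can probe this empirically. The sketch below (my own construction, not code from the paper) estimates the success probability of \(\ell_1\) recovery at two points in phase space at fixed \(\delta = 0.5\), one well below and one well above the transition, again using a Gaussian sensing matrix and modest dimensions for speed.

```python
import numpy as np
from scipy.optimize import linprog

def l1_recovers(A, x0, atol=1e-4):
    """Solve min ||x||_1 s.t. Ax = A x0 (via x = u - v, u, v >= 0) and
    report whether the original sparse x0 was recovered exactly."""
    m, N = A.shape
    res = linprog(np.ones(2 * N), A_eq=np.hstack([A, -A]), b_eq=A @ x0,
                  bounds=(0, None), method="highs")
    return np.allclose(res.x[:N] - res.x[N:], x0, atol=atol)

def success_rate(delta, rho, N=40, trials=20, seed=0):
    """Empirical probability of exact recovery at one point (delta, rho)."""
    rng = np.random.default_rng(seed)
    m = max(1, round(delta * N))
    k = max(1, round(rho * m))
    wins = 0
    for _ in range(trials):
        A = rng.standard_normal((m, N))
        x0 = np.zeros(N)
        x0[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
        wins += l1_recovers(A, x0)
    return wins / trials

# At delta = 0.5, the weak transition sits near rho ~ 0.39:
# below it recovery almost always works, above it almost always fails.
print(success_rate(0.5, 0.1), success_rate(0.5, 0.9))
```

Sweeping \(\rho\) over a grid and plotting the success fraction is exactly how the empirical phase-transition figures in this literature are produced.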

The work of Donoho and Tanner has resulted in the following landmark theorem.

For all of the problems above, as \(N\to \infty\) the probability of finding the solution is either 0 or 1:

$$
\lim_{N \to \infty} P\{\text{exact recovery}\} = \lim_{N \to \infty} \frac{f_k(\MA\MQ)}{f_k(\MQ)} = \begin{cases}
1, & k/m < \rho(\delta, \MQ) \\
0, & k/m > \rho(\delta, \MQ)
\end{cases}
$$

where \(f_k(\MQ)\) counts the \(k\)-dimensional faces of the \(N\)-dimensional polytope \(\MQ\). (The inequalities in this expression are switched in the paper.)

In other words, to find the probability of exact recovery for any of the three problems above, count the \(k\)-dimensional faces of the associated \(N\)-dimensional polytope, project the polytope into the \(m\)-dimensional space and count the \(k\)-dimensional faces of the projection, and take the ratio of the two counts.

For the three problems above, the polytopes are: T – the simplex; I – the hypercube; and C – the cross-polytope. I think that finding is remarkable.
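The denominator \(f_k(\MQ)\) has a closed form for each of these polytopes, so half of the ratio is trivial to compute. A small sketch using the standard combinatorial face-count formulas (these are textbook polytope facts, not taken from the paper):

```python
from math import comb

def faces_simplex(N, k):
    """k-faces of the standard simplex with N vertices: C(N, k+1)."""
    return comb(N, k + 1)

def faces_cross_polytope(N, k):
    """k-faces of the N-dimensional cross-polytope (the l1 ball):
    2^(k+1) * C(N, k+1)."""
    return 2 ** (k + 1) * comb(N, k + 1)

def faces_hypercube(N, k):
    """k-faces of the N-dimensional hypercube [0,1]^N: 2^(N-k) * C(N, k)."""
    return 2 ** (N - k) * comb(N, k)

# Sanity checks in low dimension:
print(faces_hypercube(3, 0), faces_hypercube(3, 1), faces_hypercube(3, 2))
# the ordinary cube: 8 vertices, 12 edges, 6 facets
print(faces_cross_polytope(2, 0), faces_cross_polytope(2, 1))
# the 2-D cross-polytope is a square: 4 vertices, 4 edges
```

The hard part of the theorem is the numerator, \(f_k(\MA\MQ)\): counting how many of these faces survive a random projection is where all the work lies.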

Donoho and Tanner also show that for finite \(N\), the phase transition is bounded from above by \(\rho(\delta, \MQ)\). (So the empirical results from my experiments should only improve in higher dimensions.) They also show that the transition \(\rho_T(\delta, \MQ)\) for the non-negative signal lies well above the others over much of the undersampling range, and that the phase transitions for different sensing matrices are nearly the same.

Bob,

The key insight for me was when Jared showed us back in 2008 that the sensing matrices designed by Indyk and Berinde also fit this transition. At that point, one could not escape the fact that the transition was indeed universal, hence my naming it after its authors: the Donoho-Tanner phase transition ( https://sites.google.com/site/igorcarron2/donohotannerphasetransition ).

Igor.
