Signal recovery with pure greed

I am now busy analyzing the results of our many simulations of cyclic greedy approaches to signal recovery from compressive measurements. Next month I will present a portion of this work at the 2011 Asilomar Conference on Signals, Systems and Computers. We are investigating a basic question: can we make the pure greedy algorithm (matching pursuit, MP) competitive with more complex solvers of these inverse problems while maintaining its simplicity and low computational overhead? The answer is yes, with a few additions.

  1. We must perform a debiasing step at the end of the pursuit to trim the support set. This adds minimal complexity since it is done only once, regardless of the expected sparsity: an orthogonal projection onto the selected features, followed by a hard thresholding of the weights (see the sketch after this list). Note that this step is also necessary with other, more complex approaches, such as \(\ell_1\) minimization (basis pursuit) and gradient projection for sparse reconstruction (GPSR).
  2. We take a cyclic approach, wherein after each model augmentation we again employ the pure greedy algorithm to refine the atoms already in the model. This does add to the computational cost.
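For concreteness, here is a minimal sketch of the debiasing step in item 1, assuming a NumPy setting with a dictionary `Phi` whose columns are unit-norm atoms, a measurement vector `u`, the index set `support` selected by the pursuit, and a hypothetical threshold `tau`; the names are mine, not the code behind the experiments.

```python
import numpy as np

def debias(Phi, u, support, tau):
    """Debias a pursuit solution: orthogonally project u onto the selected
    atoms, then hard-threshold the resulting weights to trim the support.

    Phi: (m, N) dictionary with unit-norm columns
    u: (m,) measurement vector
    support: indices of the atoms chosen by the pursuit
    tau: hard threshold on the weight magnitudes (an assumption here)
    """
    support = np.asarray(support)
    # Orthogonal projection: least-squares fit of u on the selected atoms
    w, *_ = np.linalg.lstsq(Phi[:, support], u, rcond=None)
    # Hard thresholding: keep only weights with magnitude above tau
    keep = np.abs(w) > tau
    x = np.zeros(Phi.shape[1])
    x[support[keep]] = w[keep]
    return x
```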


Below we see the empirical phase transitions of several recovery algorithms on a problem suite of sparse signals with Normally distributed nonzero entries, sensed by matrices sampled from the uniform spherical ensemble in a space with ambient dimension 400. The line marked DT is the Donoho-Tanner phase transition of the \(\ell_1\) minimization approach (BP) for the same type of sensing matrix. It marks the boundary between majority recovery and majority failure as the ambient dimension goes to infinity, and so it is a “real” phase transition. All other lines I have empirically estimated; as the ambient dimension grows, they should all move higher. My criterion for exact recovery is exact recovery of the support. I run each pursuit for at most twice the expected sparsity in iterations, stopping early if the modeling error falls below \(10^{-10}\|\mathbf{u}\|_2^2\), where \(\mathbf{u}\) is the measurement vector.
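To make the problem suite concrete, here is how one instance might be generated and scored under the setup just described: sensing matrices from the uniform spherical ensemble (i.i.d. Gaussian columns normalized to unit length), Normally distributed nonzeros, and success declared only on exact support recovery. The function names and the sparsity parameter `k` are illustrative assumptions, not the code behind these plots.

```python
import numpy as np

def make_instance(m, N=400, k=10, rng=None):
    """One problem instance: Phi from the uniform spherical ensemble,
    x sparse with Normally distributed nonzeros, measurements u = Phi @ x."""
    rng = np.random.default_rng() if rng is None else rng
    Phi = rng.standard_normal((m, N))
    Phi /= np.linalg.norm(Phi, axis=0)          # unit-norm columns
    x = np.zeros(N)
    support = rng.choice(N, size=k, replace=False)
    x[support] = rng.standard_normal(k)         # Normal nonzeros
    return Phi, x, Phi @ x

def exact_support_recovery(x_true, x_hat):
    """Success criterion: the recovered support equals the true support."""
    return set(np.flatnonzero(x_true)) == set(np.flatnonzero(x_hat))

# Stopping rule for every pursuit: at most 2*k iterations, or stop early
# once the residual energy drops below 1e-10 * ||u||_2^2.
```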

[Figure: normal_phasevsdistSupp.png — empirical phase transitions for the Normally distributed sparse signals]
The line marked MP+ is the empirical phase transition of the pure greedy algorithm (MP) with debiasing at the end of the pursuit. It is nowhere competitive with the other algorithms, except with BP at low problem indeterminacies. The blue line shows the empirical phase transition of OMP, which orthogonalizes the residual against the selected atoms after every model augmentation; and the magenta line shows the empirical phase transition of OLS, which selects each feature/atom by minimizing the residual energy after orthogonalization. OLS is the most computationally heavy of the greedy recovery algorithms because it must project the entire dictionary (less the atoms already selected) onto a subspace at each augmentation.
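To spell out the difference in update rules, here is a rough sketch of one iteration of MP versus OMP, assuming unit-norm atoms, a length-N NumPy array `coeffs`, and a Python set `support`; OLS is not shown, since its re-scoring of every candidate atom after orthogonalization is exactly what makes it so heavy. These are illustrative sketches, not the simulation code.

```python
import numpy as np

def mp_step(Phi, residual, support, coeffs):
    """Pure greedy (MP): pick the atom most correlated with the residual,
    subtract its contribution, and leave all previous weights untouched."""
    corr = Phi.T @ residual
    n = int(np.argmax(np.abs(corr)))
    coeffs[n] += corr[n]
    support.add(n)
    return residual - corr[n] * Phi[:, n]

def omp_step(Phi, u, residual, support, coeffs):
    """OMP: select the atom the same way, but then re-fit ALL selected
    weights by least squares, so the new residual is orthogonal to the
    span of the selected atoms."""
    n = int(np.argmax(np.abs(Phi.T @ residual)))
    support.add(n)
    idx = sorted(support)
    w, *_ = np.linalg.lstsq(Phi[:, idx], u, rcond=None)
    coeffs[idx] = w
    return u - Phi[:, idx] @ w
```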
The red to black solid lines show the empirical phase transition of the cyclic MP (CMP) approach (with debiasing at the very end), performing 1, 2, 5, or 10 refinement cycles at each model augmentation. As I mention here, CMP is nothing more than the Inception of sparse representation: it is a pure greedy algorithm within a pure greedy algorithm. We see that with only a single refinement cycle at each model augmentation, CMP performs almost as well as OMP; and with 2 refinement cycles, it performs as well as OMP. Increasing the number of refinement cycles increases its performance, but the returns certainly diminish.
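Here is a sketch of my reading of one such refinement cycle, in the same NumPy notation as above: each atom already in the model is handed back to the residual and then replaced by one fresh pure greedy step over the whole dictionary, which may re-select it or swap it for a better one. This is an assumption about the mechanism, not the exact implementation behind these plots.

```python
import numpy as np

def refine_cycle(Phi, residual, support, coeffs):
    """One CMP refinement cycle: revisit each selected atom, return its
    contribution to the residual, and run one pure greedy (MP) step over
    the full dictionary in its place."""
    for j in list(support):
        residual = residual + coeffs[j] * Phi[:, j]   # give atom j back
        support.discard(j)
        coeffs[j] = 0.0
        corr = Phi.T @ residual
        n = int(np.argmax(np.abs(corr)))              # greedy re-selection
        coeffs[n] += corr[n]
        support.add(n)
        residual = residual - corr[n] * Phi[:, n]
    return residual
```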

Below are the results with the same algorithms for sparse signals whose nonzero entries are distributed Bernoulli over \(\{-1,1\}\). Clearly \(\ell_1\) minimization wins this one, but once again we see CMP take a large performance leap over MP+.
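The only change to the problem suite here is the distribution of the nonzeros; a minimal sketch of that swap, with the same illustrative names as before:

```python
import numpy as np

def make_bernoulli_signal(N=400, k=10, rng=None):
    """Sparse signal with nonzeros drawn uniformly from {-1, +1}."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.zeros(N)
    support = rng.choice(N, size=k, replace=False)
    x[support] = rng.choice([-1.0, 1.0], size=k)
    return x
```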

[Figure: bernoulli_phasevsdistSupp.png — empirical phase transitions for the Bernoulli \(\{-1,1\}\) sparse signals]
Overall, it is incredible to see such a huge leap in recovery performance with only a single refinement cycle!
