Hello, and welcome to Paper of the Day (Po’D): Cyclic Pure Greedy Algorithms Edition. We just finished and submitted today’s paper for the proceedings of the 2011 Asilomar Conference: B. L. Sturm, M. G. Christensen and R. Gribonval, “Cyclic pure greedy algorithms for recovering compressively sampled sparse signals,” Proc. IEEE Asilomar Conf. Signals, Systems and Comp., Pacific Grove, CA, Nov. 2011. I previously posted the slides here.
We explore the use of the cyclic pure greedy algorithms, cyclic matching pursuit (CMP) and cyclic complementary matching pursuit (both of which have been discussed on this blog many times before), for solving the inverse problems posed by compressed sensing (CS) in the noiseless regime. Previously, we had only considered applying the cyclic minimization principle to MP, and not to CS. Motivated by the work of Dymarski et al., we realized that complementary matching pursuit (CompMP) is also a pure greedy algorithm (previously MP was the only one), with a computational complexity just as small as that of MP (aside from a little extra overhead at the start), and that it too can be encased within a cyclic minimization framework. So how do these two algorithms perform in full support recovery?
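The cyclic minimization idea is simple to sketch: after each greedy atom selection, revisit the atoms already in the model and re-fit each coefficient against the current residual. Below is a minimal Python illustration of this refinement principle applied to plain MP; it is my own toy sketch, not the paper's implementation, and the function name `cyclic_mp` and its parameters are hypothetical.

```python
import numpy as np

def cyclic_mp(D, x, k, n_cycles=3):
    """Toy sketch of cyclic matching pursuit.

    D: (m, N) dictionary with unit-norm columns; x: (m,) signal;
    k: number of greedy augmentations; n_cycles: refinement sweeps
    performed after each augmentation.
    """
    support, coeffs = [], []
    r = x.copy()
    for _ in range(k):
        # MP augmentation: pick the atom most correlated with the residual
        c = D.T @ r
        j = int(np.argmax(np.abs(c)))
        support.append(j)
        coeffs.append(c[j])
        r = r - c[j] * D[:, j]
        # cyclic refinement: revisit each selected atom and re-fit
        # its coefficient against the current residual
        for _ in range(n_cycles):
            for i, ji in enumerate(support):
                r = r + coeffs[i] * D[:, ji]  # put atom's part back
                coeffs[i] = D[:, ji] @ r      # re-fit (unit-norm atom)
                r = r - coeffs[i] * D[:, ji]  # remove it again
    return np.array(support), np.array(coeffs), r
```

Refining after every augmentation in this way adds only inner products and vector updates, which is why the cyclic variants keep a complexity comparable to MP itself.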
The performance of cyclic CompMP (CompCMP) easily beats orthogonal least squares (OLS) for sparse signals with Gaussian-distributed amplitudes, and even for the least favorable Rademacher-distributed ones, as long as the refinement cycles are performed after each model augmentation. We also see that CompMP without any refinement performs as well as OLS for Rademacher signals. Compared with state-of-the-art methods, when CompCMP refines after every augmentation it is competitive in support recovery while maintaining an extremely simple and low computational complexity. For Gaussian-distributed signals, CompCMP performs nearly as well as the best-performing SL0 and PrOMP. Curiously, when it waits to refine the model, CompCMP takes a heavier hit to performance than CMP.