My simulations for normally distributed sparse signals have finally finished (after three weeks, sheesh!). This builds on the work here. The question is: how well does cyclic OLS (COLS) recover compressively sensed sparse vectors?
Below we see the probability of exact recovery as a function of problem sparsity for cyclic MP (CMP, solid) and for COLS (dashed) at five problem indeterminacies \(m/N\), where the sensing matrix is \(m \times N\). Here \(N=400\). The \(s\) nonzero entries of each sparse vector are distributed Normal. Clearly, COLS recovers the original vectors better than CMP. (In my paper, we see CMP and OMP perform nearly the same.)
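As a sketch of how one such trial might be set up (the function names, the unit-norm column normalization, and the recovery tolerance are my assumptions here, not necessarily the exact settings used in these simulations):

```python
import numpy as np

def make_problem(m, N, s, rng):
    """One compressed sensing trial: an m x N Gaussian sensing matrix
    with unit-norm columns, and an s-sparse vector whose nonzero
    entries are drawn from a standard Normal distribution."""
    Phi = rng.standard_normal((m, N))
    Phi /= np.linalg.norm(Phi, axis=0)   # normalize columns
    x = np.zeros(N)
    support = rng.choice(N, size=s, replace=False)
    x[support] = rng.standard_normal(s)  # Normal-distributed nonzeros
    y = Phi @ x                          # measurements
    return Phi, y, x

def exactly_recovered(x_true, x_hat, tol=1e-2):
    """Declare exact recovery when the relative error is below tol.
    (The tolerance is an assumed value, not the paper's criterion.)"""
    return np.linalg.norm(x_true - x_hat) <= tol * np.linalg.norm(x_true)
```

Sweeping \(s\) over many independent trials of this kind, and recording the fraction satisfying the recovery criterion, gives the probability curves plotted above.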
The figure below shows, for these same sparse vectors, the phase transitions for six recovery methods, and the union of them all (Ensemble). Here we see COLS is on top for most of the indeterminacies, except for a brief tangle with PrOMP. Choosing the best from all six vectors recovered by these algorithms produces the topmost phase transition.
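One simple way to realize such an ensemble, assuming no oracle access to the true vector, is to keep the candidate with the smallest residual (the function name and selection rule are my sketch; in simulation one could instead compare each candidate against the known true vector):

```python
import numpy as np

def ensemble_select(Phi, y, candidates):
    """From several recovered vectors, keep the one with the smallest
    residual ||y - Phi x||_2."""
    residuals = [np.linalg.norm(y - Phi @ x_hat) for x_hat in candidates]
    return candidates[int(np.argmin(residuals))]
```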
So, COLS may do the best of them all for this type of sparse signal. (I should try SL0.) However, it brings with it such an enormous computational cost that I don’t think it is worth it. I make some mention of reducing its cost with the matrix inversion lemma, but I don’t think that will make any sort of dent. I will explore this in greater detail during my scientific mission in Rennes (details to come).
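For the record, the matrix inversion lemma idea amounts to updating the inverse Gram matrix by a block rank-one correction each time OLS appends an atom, instead of refactorizing the least-squares system from scratch. A sketch under my own naming (this saves work per accepted atom, but does not reduce the number of candidate atoms tested, which is where most of the cost lives):

```python
import numpy as np

def gram_inverse_update(G_inv, Phi_S, phi_new):
    """Given G_inv = inv(Phi_S^T Phi_S), return the inverse of the
    Gram matrix after column phi_new is appended to Phi_S, via the
    block matrix inversion (Schur complement) formula."""
    b = Phi_S.T @ phi_new          # cross-correlations with new atom
    d = phi_new @ phi_new          # new diagonal entry
    u = G_inv @ b
    schur = d - b @ u              # scalar Schur complement
    k = len(b)
    G_new = np.empty((k + 1, k + 1))
    G_new[:k, :k] = G_inv + np.outer(u, u) / schur
    G_new[:k, k] = -u / schur
    G_new[k, :k] = -u / schur
    G_new[k, k] = 1.0 / schur
    return G_new
```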