Today I reconciled my code with Blumensath’s, and I am now confident I have an honest implementation of iterative hard thresholding (IHT). My initial surprise at how poorly IHT performs for all but the embarrassingly sparsest signals appears to be warranted. For guidance, I have revisited the optimal tuning work of Maleki and Donoho. They rightly observe in that paper that we are awash in recovery algorithms that are not ready-to-run … well, maybe not *awash*, but I sure feel like it. With IHT and its two parameters, and little guidance on how to set them, I was beginning to question its utility.
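For concreteness, here is a minimal sketch (in Python/NumPy; my rendering, not the actual code) of the thresholded IHT iteration I have in mind, assuming the two parameters are a step size \(\kappa\) and a fixed hard-threshold level \(\lambda\):

```python
import numpy as np

def iht(Phi, y, kappa=1.0, lam=0.4, n_iter=300):
    # Sketch of thresholded IHT, assuming the two parameters are a step
    # size (kappa) and a fixed hard-threshold level (lam). Blumensath and
    # Davies also analyze a variant that instead keeps the K largest
    # entries at each iteration.
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        r = y - Phi @ x               # current residual
        x = x + kappa * (Phi.T @ r)   # gradient step toward y = Phi x
        x[np.abs(x) < lam] = 0.0      # zero out entries below the threshold
    return x
```

Note that with a fixed \(\lambda\), any true coefficients of magnitude near or below the threshold are unrecoverable, which is part of why the fixed-threshold version is so sensitive to tuning.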

For IHT, Maleki and Donoho recommend setting the step-size parameter \(\kappa = 0.6\). Furthermore, they recommend an adaptive threshold based on the standard deviation \(\sigma\) of the entries of \(\mathbf{\Phi}^T\mathbf{r}(n)\). In my experiments, I thus set the threshold \(t = 2\sigma\). (I now see that I should have computed the standard deviation over only those elements of \(\mathbf{\Phi}^T\mathbf{r}(n)\) that I assume are zero, based on the previous threshold.)
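A sketch of the dynamically thresholded variant, under the simplest reading of the rule above (\(\sigma\) computed over all entries of \(\mathbf{\Phi}^T\mathbf{r}(n)\), not only the assumed-zero ones):

```python
import numpy as np

def iht_dt(Phi, y, kappa=0.6, n_iter=300):
    # Sketch of IHT with a data-driven threshold: at each iteration,
    # t = 2*sigma, where sigma is the standard deviation of the entries
    # of Phi^T r. (The refinement mentioned above would compute sigma
    # over only the coordinates currently believed to be zero.)
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        c = Phi.T @ (y - Phi @ x)        # correlations of residual with columns
        sigma = np.std(c)                # simplest reading: std over all entries
        x = x + kappa * c                # gradient step with kappa = 0.6
        x[np.abs(x) < 2.0 * sigma] = 0.0 # dynamic threshold t = 2*sigma
    return x
```

The appeal of this rule is that the threshold shrinks with the residual, so it does not permanently suppress coefficients the way a fixed \(\lambda\) does.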

As in my experiments here, I look at the probability of perfect recovery (error < 1e-2) over several problem sparsities and sparse-vector distributions, for both IHT as presented in T. Blumensath and M. E. Davies, "Iterative Hard Thresholding for Compressed Sensing," Applied and Computational Harmonic Analysis, vol. 27, no. 3, pp. 265-274, 2009 (with \(\kappa = 1\) and \(\lambda = 0.4\); labeled IHT), and the variant with the dynamic threshold (\(\kappa = 0.6\), \(t = 2\sigma\); labeled IHTDT).
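The experiment loop can be sketched as follows. The sensing-matrix ensemble (here, Gaussian with unit-norm columns), the trial count, and the use of absolute rather than relative error are my assumptions, not details fixed above:

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_signal(N, K, dist, rng):
    # K-sparse length-N vector; dist is "cars" (constant amplitude,
    # random sign) or "normal" (normally distributed nonzero values).
    x = np.zeros(N)
    support = rng.choice(N, size=K, replace=False)
    if dist == "cars":
        x[support] = rng.choice([-1.0, 1.0], size=K)
    else:
        x[support] = rng.standard_normal(K)
    return x

def recovery_probability(solver, M, N, K, dist, n_trials=50, tol=1e-2, rng=rng):
    # Fraction of trials in which the solver's estimate lands within tol
    # of the true sparse vector. The Gaussian unit-norm-column ensemble
    # is an assumption; the post does not state the sensing matrix.
    hits = 0
    for _ in range(n_trials):
        Phi = rng.standard_normal((M, N))
        Phi /= np.linalg.norm(Phi, axis=0)   # unit-norm columns
        x = sparse_signal(N, K, dist, rng)
        x_hat = solver(Phi, Phi @ x)
        if np.linalg.norm(x_hat - x) < tol:
            hits += 1
    return hits / n_trials
```

Sweeping `K` and calling this once per (algorithm, distribution) pair gives the phase-transition-style curves discussed below.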

Below we see the recovery results for sparse signals that are constant amplitude random sign (CARS).

And below we see the recovery results for sparse signals whose nonzero values are normally distributed.

I am quite happy to see such a significant difference between the regular IHT and the tuned one: it says at least that I did something right in reading Maleki and Donoho. It is striking, however, that even the tuned IHT with the dynamic threshold breaks away from perfect recovery for signals that are only 2-sparse. (The problem size is the same as here.)
