Algorithm Power Hour: Subspace Pursuit (SP)

This algorithm power hour is devoted to Subspace Pursuit, presented in the paper: W. Dai and O. Milenkovic, “Subspace pursuit for compressive sensing signal reconstruction,” IEEE Trans. Inf. Theory, vol. 55, pp. 2230-2249, May 2009.


Given an \(s\)-sparse vector \(\vx \in \mathbb{R}^N\),
we sense it with a matrix \(\mathbf{\Phi} \in \mathbb{R}^{m\times N}\) with \(N > m\),
which produces the measurement vector \(\vu = \mathbf{\Phi}\vx\).
Define the index set of the columns of \(\mathbf{\Phi}\) as \(\Omega = \{1, 2, \ldots, N\}\).
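
To fix ideas, here is a tiny MATLAB sketch of this setup (my own example; the dimensions and the Gaussian \(\mathbf{\Phi}\) are arbitrary choices, not from the paper):

    % Hypothetical problem setup (dimensions chosen arbitrarily)
    N = 400; m = 100; s = 10;        % ambient dimension, measurements, sparsity
    Phi = randn(m, N)/sqrt(m);       % Gaussian sensing matrix
    x = zeros(N, 1);
    p = randperm(N);
    x(p(1:s)) = randn(s, 1);         % s-sparse signal on a random support
    u = Phi*x;                       % measurement vector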
SP proceeds in the following way.
Given the index set \(\Omega_k \subset \Omega\) at iteration \(k\),
and defining \(\mathbf{\Phi}_{\Omega_k}\) as the columns from \(\mathbf{\Phi}\) indexed by \(\Omega_k\),
SP redefines this set \(\Omega_{k+1} = J_s^*\)
where \(J_s^* \subset \Omega\) is a set of at most \(s\) indices found in the following way.
First, SP finds
$$
J := \textrm{Supp}\{T_1 \left ( |\mathbf{\Phi}^*\vr_k| ; \tau_{k,1} \right ) \}
$$
where \(T_1(\vy; \tau_{k,1})\) is a thresholding function applied to each element of \(\vy\):
wherever an element of \(\vy\) is below the threshold \(\tau_{k,1} \ge 0\), it is set to zero.
Here \(\vr_k = \vu - \mathbf{\Phi}_{\Omega_k}\mathbf{\Phi}_{\Omega_k}^\dagger\vu\) is the residual of \(\vu\)
after orthogonal projection onto the range of \(\mathbf{\Phi}_{\Omega_k}\).
SP sets \(\tau_{k,1}\) such that \(|J| \le s\).
Then, with the merged set \(\Omega_{k}\cup J\),
SP computes the weights \(\vb\) such that \(||\vu - \mathbf{\Phi}\MI_{\Omega_{k}\cup J}\vb||_2\) is minimized,
where \(\MI_{\Omega_{k}\cup J}\) is an \(N\times N\) diagonal matrix
with ones at \((j,j)\), \(j \in \Omega_{k}\cup J\), and zeros elsewhere.
SP then thresholds \(\vb\) to find the new optimal support
$$
J_s^* = \textrm{Supp}\{T_2\left ( |\vb|; \tau_{k,2} \right )\}
$$
where \(T_2(\vy; \tau_{k,2})\) is another thresholding function,
and \(\tau_{k,2} \ge 0\) is set such that at most \(s\) elements of \(\vb\) are retained.
SP repeats the procedure above until some stopping condition is met,
for instance, \(||\vr_{k+1}|| > ||\vr_{k}||\), in which case SP reverts \(\Omega_{k+1} \leftarrow \Omega_{k}\) and terminates.
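
To make the steps above concrete, here is a minimal MATLAB sketch of the iteration (my own illustration, not the subspacepursuit.m linked below; I implement the thresholds by simply keeping the \(s\) largest magnitudes, and use backslash for the least-squares solves):

    function xhat = sp_sketch(u, Phi, s, maxiter)
    % Minimal Subspace Pursuit sketch following the description above.
    [~, N] = size(Phi);
    [~, idx] = sort(abs(Phi'*u), 'descend');
    Omega = idx(1:s);                         % initialize support from correlations
    r = u - Phi(:,Omega)*(Phi(:,Omega)\u);    % residual after projection onto range
    for k = 1:maxiter
        [~, idx] = sort(abs(Phi'*r), 'descend');
        J = union(Omega, idx(1:s));           % merge s new candidates (first threshold)
        b = zeros(N, 1);
        b(J) = Phi(:,J)\u;                    % least squares on the merged support
        [~, idx] = sort(abs(b), 'descend');
        Omega_new = idx(1:s);                 % keep the s largest weights (second threshold)
        r_new = u - Phi(:,Omega_new)*(Phi(:,Omega_new)\u);
        if norm(r_new) >= norm(r)             % residual no longer decreases:
            break;                            % keep the previous support
        end
        Omega = Omega_new;
        r = r_new;
    end
    xhat = zeros(N, 1);
    xhat(Omega) = Phi(:,Omega)\u;             % final estimate on the kept support
    end

With the setup above, this would be called as xhat = sp_sketch(u, Phi, s, 100).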

The similarity of SP to CoSaMP is unmistakable. In fact, it is easy to take correct code for CoSaMP and turn it into SP: subspacepursuit.m. (Note: this code has been corrected here.)
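
Concretely, the key change is in the identification step (reusing the variable names from my sketch above; this is hypothetical, and the linked file may differ):

    % inside the iteration, after computing the residual r:
    [~, idx] = sort(abs(Phi'*r), 'descend');
    J_cosamp = union(Omega, idx(1:2*s));   % CoSaMP merges 2s candidates
    J_sp     = union(Omega, idx(1:s));     % SP merges only s candidates
    % CoSaMP then updates the residual from the pruned signal estimate,
    % whereas SP re-solves least squares on the pruned support first.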

Below we see a comparison of SP and CoSaMP.
The red lines are CoSaMP.
I am surprised that such a large difference between the two comes from selecting \(s\) elements instead of \(2s\) in the first thresholding stage.
(By the way, that nasty hump in CoSaMP probably comes from the fact that I am averaging only 50 realizations. :)
Anyhow, these results agree with those of Maleki and Donoho,
who find that SP outperforms CoSaMP.

[Figure: CoSaMPSP.png — comparison of SP and CoSaMP (red lines)]
