SPARS 2011, day 2

Though the SPARS 2011 twitter feed looks miserable, this workshop is jam-packed with excellent presentations and discussions. I think too many people are having too much good discussion to have much time left for twittering.

Today at SPARS 2011: heavy hitters Francis Bach and Rémi Gribonval delivered the two plenary talks. This morning, Bach spoke on a subject new to me: submodular functions.
In particular, he is exploring these ideas for creating sparsity-inducing norms.
A motivation for this work is that while the l1 norm promotes sparsity within groups, it does not promote sparsity among groups… or vice versa (this is all new territory for me). I liked how he described his formalization as “the norm-design business.”
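To make that concrete (my own gloss, not Bach’s slide): the familiar group-lasso norm sums the l2 norms of coefficients over predefined groups,
$$
\Omega(\vx) = \sum_{g \in \mathcal{G}} \| \vx_g \|_2,
$$
which zeroes out entire groups at once but leaves the coefficients within a surviving group dense. Designing norms with other structured sparsity patterns is where the submodular machinery comes in.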
Someone asked him a question about analyzing greedy methods vs. convex optimization.
Bach’s answer made me realize why we can understand the behavior of convex optimization methods more completely than that of greedy methods: the convex formulation is decoupled from the dictionary, whereas greedy methods involve the dictionary from the get-go.

This afternoon, Gribonval talked on “cosparsity”, or when a signal is sparsely represented by the dual of a frame instead of the frame itself. His talk revolved around examining the common assumption that an atomic decomposition and a transform are somehow equivalent. When we say a signal is sparse, we usually mean it is sparse in some dictionary; but we can also mean that its projection onto a frame is sparse. The latter is “cosparsity”, which brings with it l1-analysis.
To be a little more formal, we can consider solving the “synthesis” problem
$$
\min_\vz || \vz ||_1 \; \textrm{subject to} \; \vy = \MA \MD \vz
$$
where we assume \(\vz\) is sparse; or the “analysis” problem
$$
\min_\vx || \MG \vx ||_1 \; \textrm{subject to} \; \vy = \MA \vx
$$
where we assume the analysis (or transformation) of \(\vx\) by \(\MG\), i.e., \(\MG\vx\), is sparse.
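To make the contrast concrete, here is a minimal sketch of both problems in CVXPY; the dimensions and the random A, D, and Omega (standing in for \(\MA\), \(\MD\), and \(\MG\)) are purely my own toy assumptions, not anything from the talk.

```python
import cvxpy as cp
import numpy as np

# toy dimensions, assumed for illustration only
m, n, p = 20, 50, 60
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))       # measurement matrix
D = rng.standard_normal((n, p))       # synthesis dictionary
Omega = rng.standard_normal((p, n))   # analysis operator
y = rng.standard_normal(m)            # observations

# "synthesis" problem: min ||z||_1  s.t.  y = A D z
z = cp.Variable(p)
synthesis = cp.Problem(cp.Minimize(cp.norm1(z)), [y == A @ D @ z])
synthesis.solve()
x_synth = D @ z.value

# "analysis" problem: min ||Omega x||_1  s.t.  y = A x
x = cp.Variable(n)
analysis = cp.Problem(cp.Minimize(cp.norm1(Omega @ x)), [y == A @ x])
analysis.solve()
x_anal = x.value
```

Both are plain linear programs under the hood, which is partly why these convex formulations are so much easier to analyze than their greedy counterparts.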
Gribonval et al. have done an excellent job interpreting what is really going on with l1-analysis. Instead of wanting to minimize the number of non-zeros in the representation, l1-analysis wants to maximize the number of zeros in the transform domain.
Later on, his student Sangnam Nam presented extraordinary results of this work with their Greedy Analysis Pursuit (GAP), which greedily detects and eliminates the non-zeros of the analysis representation in order to recover its zeros.
This reminded me a bit of the complementary matching pursuit, though the two are quantitatively different. Gribonval joked that “sparsity” may now be “over”; the new hot topic is “cosparsity.”
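For my own notes, here is a rough sketch of that idea (a hypothetical implementation under my own assumptions, not Nam et al.’s code): maintain a candidate cosupport and, at each pass, eject the row of the analysis operator with the largest coefficient, i.e., the least plausible zero.

```python
import numpy as np

def gap_sketch(y, A, Omega, n_iter):
    """A GAP-style loop (hypothetical sketch): keep a candidate cosupport
    Lambda (rows of Omega assumed to yield zeros) and repeatedly eliminate
    the row whose analysis coefficient is largest."""
    m, n = A.shape
    Lambda = list(range(Omega.shape[0]))  # start assuming every row is a zero
    for _ in range(n_iter):
        Om = Omega[Lambda]
        # min ||Om x||^2 s.t. A x = y, solved through its KKT system
        K = np.block([[Om.T @ Om, A.T],
                      [A, np.zeros((m, m))]])
        rhs = np.concatenate([np.zeros(n), y])
        x = np.linalg.lstsq(K, rhs, rcond=None)[0][:n]
        # drop the detected non-zero from the candidate cosupport
        Lambda.pop(int(np.argmax(np.abs(Om @ x))))
    return x

# e.g., with the toy A, Omega, y from above:
# x_hat = gap_sketch(y, A, Omega, n_iter=10)
```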

There were many other exciting talks showing extraordinary results too;
but now I must go and work on some interesting ideas that may or may not require my computer to run through the night.
