As I am leaving my office in Denmark for the holidays, I have planned a short reading list on the topics nearest to my heart.
To address my frustration with explaining the strange dependence of sparse signal recovery by OMP (and its ilk) on priors:
- A. Maleki and D. L. Donoho, “Optimally tuned iterative reconstruction algorithms for compressed sensing,” IEEE J. Selected Topics in Signal Process., vol. 4, pp. 330-341, Apr. 2010.
- M. A. Davenport and M. B. Wakin, “Analysis of orthogonal matching pursuit using the restricted isometry property,” IEEE Trans. Info. Theory, vol. 56, pp. 4395-4401, Sep. 2010.
- V. Cevher, “Learning with compressible priors,” in Proc. Neural Info. Process. Syst., (Vancouver, BC, Canada), Dec. 2009.
- Y. Jin and B. D. Rao, “Performance limits of matching pursuit algorithms,” in Proc. Int. Symp. Info. Theory, (Toronto, ON, Canada), pp. 2444-2448, July 2008.
- A. Barron, A. Cohen, W. Dahmen, and R. A. DeVore, “Approximation and learning by greedy algorithms,” Annals of Statistics, vol. 36, no. 1, pp. 64-94, 2008.
- J. D. Blanchard, C. Cartis, J. Tanner, and A. Thompson, “Phase transitions for greedy sparse approximation algorithms,” (submitted somewhere), Apr. 2010.
- E. J. Candès, Y. C. Eldar, and D. Needell, “Compressed sensing with coherent and redundant dictionaries,” arXiv:1005.2613v1, May 2010.
- E. J. Candès and T. Tao, “Near-optimal signal recovery from random projections: Universal encoding strategies?,” IEEE Trans. Info. Theory, vol. 52, pp. 5406-5425, Dec. 2006.
- J. Tropp, A. C. Gilbert, and M. J. Strauss, “Algorithms for simultaneous sparse approximation. Part I: Greedy pursuit,” Signal Process., vol. 86, pp. 572-588, Mar. 2006.
- J. Tropp, “Greed is good: Algorithmic results for sparse approximation,” IEEE Trans. Info. Theory, vol. 50, pp. 2231-2242, Oct. 2004.
On other approaches for sparse signal recovery (for implementation and testing in MATLAB):
- J. A. Tropp and S. J. Wright, “Computational methods for sparse solution of linear inverse problems,” Proc. IEEE, vol. 98, pp. 948-958, June 2010.
- E. Liu and V. Temlyakov, “Orthogonal super greedy algorithm and application in compressed sensing,” Tech. Rep. 01, University of South Carolina, South Carolina, USA, 2010.
- K. Labusch, E. Barth, and T. Martinetz, “Bag of pursuits and neural gas for improved sparse coding,” in Proc. Int. Conf. Computational Statistics, pp. 327-336, 2010.
- G. Peyré, “Best basis compressed sensing,” IEEE Trans. Signal Process., vol. 58, pp. 2613-2622, May 2010.
- C. Herzet and A. Drémeau, “Bayesian pursuit algorithms,” in Proc. European Signal Process. Conf., (Aalborg, Denmark), pp. 1474-1478, Aug. 2010.
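Since several of these are slated for implementation and testing, a minimal sketch of plain OMP makes a handy baseline to compare against (in Python/NumPy rather than MATLAB; the problem dimensions, sparsity level, and random seed below are arbitrary choices for illustration, not from any of the papers):

```python
import numpy as np

def omp(A, y, k, tol=1e-10):
    """Orthogonal Matching Pursuit: greedily build a k-sparse x with A @ x ≈ y."""
    m, n = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    coef = np.zeros(0)
    for _ in range(k):
        # Select the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit all active coefficients on the support by least squares.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x[support] = coef
    return x

# Quick sanity check: recover a 3-sparse vector from Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
A /= np.linalg.norm(A, axis=0)  # unit-norm columns, as usual in CS experiments
x_true = np.zeros(100)
x_true[[5, 20, 77]] = [1.0, -2.0, 0.5]
y = A @ x_true
x_hat = omp(A, y, 3)
```

With 50 Gaussian measurements of a 3-sparse, 100-dimensional signal, this regime is comfortably within OMP's recovery range, which is exactly what makes it a useful control before trying the fancier pursuits above.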
Let’s not forget about audio!
- M. Christensen and A. Jakobsson, Multi-pitch estimation, Morgan & Claypool Publishers, 2009.
- R. Bardeli and F. Kurth, “Robust identification of time-scaled audio,” in Proc. AES Int. Conf., (London, U.K.), June 2004.
Fun bedtime reading and general relaxation:
- G. Isely, C. J. Hillar, and F. T. Sommer, “Deciphering subsampled data: adaptive compressive sampling as a principle of brain communication,” (submitted somewhere), Nov. 2010.
- T. Hromádka, M. R. DeWeese, and A. M. Zador, “Sparse representation of sounds in the unanesthetized auditory cortex,” PLOS Biology, vol. 6, pp. 0124-0137, Jan. 2008.
- S. Kay, “A new approach to Fourier synthesis with application to neural encoding and speech classification,” IEEE Signal Process. Lett., vol. 17, pp. 855-858, Oct. 2010.
- A. M. Bruckstein, D. L. Donoho, and M. Elad, “From sparse solutions of systems of equations to sparse modeling of signals and images,” SIAM Review, vol. 51, pp. 34-81, Feb. 2009.
Then I have a course to prepare! That fun reading includes:
- I. Millington, Artificial Intelligence for Games, Morgan Kaufmann, 2006.
- S. Russell and P. Norvig, Artificial Intelligence: A Modern Approach, 3rd ed., Prentice Hall, 2009.
- C. Reas and B. Fry, Getting Started with Processing, O’Reilly Media, Inc., 2008.
Will a holiday so busy be a holiday at all?