Sparse representation and compressed sensing are giving folks in signal processing and other fields like machine learning a fresh perspective. For a really good hub for all the activity, take a look at Nuit Blanche. It's this activity that initially attracted me to this topic.
To get a little history and context on the topic, I read Dr. Iain Johnstone's paper "Wavelets and Function Estimation," published in '99 and re-edited in 2002 in his book titled "Functional Estimation and Gaussian Sequence Models".
Dr. Johnstone does a great job of motivating sparse representations for the goal of signal estimation. He first contrasts the l_1 norm with the l_2 norm using a sparse and a dense representation of the same signal: wavelet and Fourier coefficients, respectively. He shows, with a concrete example, that the two representations have virtually equal l_2 norms, while the sparse one has much smaller l_1 and l_0 norms. He then generalizes the concept with the l_p ball. The statistical game concept of minimax is introduced to show that a sparse representation admits a thresholding approach to estimation that is optimal.
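Johnstone's concrete example isn't reproduced here, but the effect is easy to demonstrate with a toy pair of vectors (my own numbers, not the paper's): one spreads its energy evenly, the other concentrates it in a single coefficient, and both have the same l_2 norm.

```python
import numpy as np

# "Dense" representation: energy spread evenly over 100 coefficients.
dense = np.full(100, 0.1)

# "Sparse" representation with the same l_2 norm: all energy in one coefficient.
sparse = np.zeros(100)
sparse[0] = 1.0

l2_dense = np.linalg.norm(dense, 2)    # 1.0 -- identical l_2 energy
l2_sparse = np.linalg.norm(sparse, 2)  # 1.0

l1_dense = np.linalg.norm(dense, 1)    # 10.0 -- l_1 sees a big difference
l1_sparse = np.linalg.norm(sparse, 1)  # 1.0

l0_dense = np.count_nonzero(dense)     # 100 -- l_0 even more so
l0_sparse = np.count_nonzero(sparse)   # 1
```

The l_2 norm can't tell the two apart, which is exactly why the l_1 and l_0 norms are the natural yardsticks for sparsity.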
To realize the optimal estimator you need to find a basis (A) that maps your signal (x) to a sparse representation (y): y = Ax. If you want to reconstruct your sparse signal, you solve the l_1 minimization problem min ||x||_1 subject to y = Ax. Nice.
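That l_1 minimization (basis pursuit) can be cast as a linear program by splitting x into positive and negative parts. Here's a minimal sketch using scipy; the dimensions, sparsity level, and random Gaussian A are my own illustration, not from the paper:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Toy setup: n-dimensional signal with k nonzeros, observed via m < n measurements.
n, m, k = 40, 20, 3
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)

A = rng.normal(size=(m, n))
y = A @ x_true

# min ||x||_1 s.t. Ax = y, as an LP: write x = u - v with u, v >= 0
# and minimize sum(u) + sum(v) subject to A(u - v) = y.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]

# x_hat satisfies the constraint and has l_1 norm no larger than x_true's.
```

With enough random measurements relative to the sparsity k, the l_1 solution typically coincides with the sparse signal itself, which is the compressed-sensing punchline.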
These concepts and terms come up time and time again in the latest research papers. It's nice to see a historical survey of the topic that takes time to introduce the core concepts.
