Optimal Vector Compressed Sensing
The modern trend in science and technology is to measure vector signals instead of scalar signals, ruthlessly and inexorably scaling up to higher and higher dimensional vectors. Measuring ever higher dimensional vectors creates new opportunities, which manifest mathematically. Perhaps surprisingly, squeezing the most benefit out of this trend is achieved using sophisticated tools from statistical decision theory.
In this talk, we describe a new Vector Compressed Sensing procedure based on adapting James-Stein shrinkage to the Compressed Sensing world. We show that this is the optimal way to squeeze out the benefits of high dimensional measurement vectors, and we show further that these results extend to structured sensing matrices, which are in demand in practice.
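The abstract does not spell out the shrinkage rule itself, but the classical starting point it adapts is the James-Stein estimator of a multivariate normal mean. As an illustration only, here is the standard positive-part variant; the function name and the toy experiment are ours, not the talk's procedure:

```python
import numpy as np

def james_stein_shrink(y, sigma2=1.0):
    """Positive-part James-Stein estimator of a d-dimensional normal mean:
    shrink the raw observation y toward the origin. Dominates the MLE
    (y itself) in total squared error whenever d >= 3."""
    d = y.shape[0]
    factor = max(0.0, 1.0 - (d - 2) * sigma2 / np.sum(y ** 2))
    return factor * y

# Toy illustration: one noisy observation of a 10-dimensional mean vector.
rng = np.random.default_rng(0)
theta = np.zeros(10)                 # true mean (a case favorable to shrinkage)
y = theta + rng.normal(size=10)      # observation y ~ N(theta, I)
est = james_stein_shrink(y)          # shrunk estimate, never longer than y
```

The multiplicative factor always lies in [0, 1], so the estimate is a pulled-in version of the raw observation; the gain over the MLE grows with the dimension, which is exactly the high-dimensional benefit the talk aims to exploit.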
All of this is made possible by so-called Approximate Message Passing (AMP)-type algorithms. We will explain how to generalize and extend the family of AMP algorithms to this new structured setting. Our results have been validated through massive computational experiments.
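To fix ideas, a minimal sketch of a textbook AMP iteration for sparse recovery follows, using a soft-threshold denoiser and the Onsager correction term. This is the generic scalar-signal template, not the talk's vector or structured-matrix generalization, and the function name and threshold schedule are our assumptions:

```python
import numpy as np

def amp_sparse_recovery(A, y, alpha=2.0, n_iter=30):
    """Generic AMP iteration for sparse recovery: alternate a matched-filter
    step, a soft-threshold denoiser, and a residual update that carries the
    Onsager reaction term. (Illustrative sketch, not the talk's algorithm.)"""
    n, p = A.shape
    x = np.zeros(p)
    z = y.copy()
    for _ in range(n_iter):
        tau = np.linalg.norm(z) / np.sqrt(n)          # running noise-level estimate
        pseudo = x + A.T @ z                          # effective (denoising) observation
        x = np.sign(pseudo) * np.maximum(np.abs(pseudo) - alpha * tau, 0.0)
        z = y - A @ x + z * np.count_nonzero(x) / n   # Onsager correction term
    return x

# Synthetic experiment: n = 300 Gaussian measurements of a 15-sparse
# signal in dimension p = 600, noiseless for simplicity.
rng = np.random.default_rng(0)
n, p, k = 300, 600, 15
A = rng.normal(size=(n, p)) / np.sqrt(n)
x0 = np.zeros(p)
x0[rng.choice(p, size=k, replace=False)] = rng.choice([-1.0, 1.0], size=k)
y = A @ x0
x_hat = amp_sparse_recovery(A, y)
```

The Onsager term in the residual update is what distinguishes AMP from plain iterative thresholding: it keeps the effective noise at each iteration approximately Gaussian, which is what makes the decision-theoretic analysis of the denoiser tractable.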
Time permitting, we will describe our recent advances in Binary Compressed Sensing, as well as our work on the popular topic of Model Collapse in AI.
Apratim Dey is a Ph.D. student in the Department of Statistics at Stanford University, where he is fortunate to be advised by David Donoho. He obtained his undergraduate degree from the Indian Statistical Institute, Kolkata, India. He cares about computation-enabled scientific discovery and about understanding the phenomena arising broadly in today's empirical data science, and he tries to figure out where statistical intuition and knowledge can make a difference.