Problem of Accelerating Data Deluge

Why should we even consider this problem? Below are some reasons.

From a storage point of view, we are generating a huge amount of data and running out of places to keep it. In 2007, for the first time, the world generated more data than the total storage available. Since then, we have had no option but to simply throw some data away.

From a sampling point of view, the sampling rate dictated by the Shannon-Nyquist theorem (at least twice the signal bandwidth) can be prohibitively high in many cases. It may be very expensive, or simply impossible, to build a sampling device that operates at such rates.
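As a concrete (hypothetical) illustration of why the Nyquist rate can be impractical, consider a wideband RF signal; the 5 GHz bandwidth below is an assumed figure, not from the original text:

```python
# Nyquist rate: a bandlimited signal must be sampled at >= 2x its bandwidth.
bandwidth_hz = 5e9  # assume a hypothetical 5 GHz wideband RF signal
nyquist_rate = 2 * bandwidth_hz

# A converter running at 10 billion samples per second is at the edge of
# what high-end ADC hardware can do, and is very expensive.
print(f"Required sampling rate: {nyquist_rate / 1e9:.0f} GS/s")
```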

From a communication point of view, data generation rates are growing much faster than transmission rates.

Is there a solution?

Yes: compressive sensing! The main idea is to acquire far fewer samples of a signal, in a signal-independent fashion, and then reconstruct the signal from these incomplete measurements.

What is Compressive Sensing?

As stated in an article by Emmanuel J. Candes and Michael B. Wakin, compressive sensing (CS) is a sensing/sampling paradigm that goes against the common wisdom in data acquisition. Conventional approaches to sampling follow the Shannon-Nyquist theorem; using CS theory, certain signals can be recovered from far fewer samples than traditional methods require.
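A minimal numerical sketch of this idea, under assumptions not stated in the text: a length-256 signal with only 5 nonzero entries, sensed through a random Gaussian matrix (signal-independent acquisition), and reconstructed here with orthogonal matching pursuit, which is one of several possible recovery algorithms:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: signal length n, measurements m << n, sparsity k.
n, m, k = 256, 80, 5
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)   # a k-sparse signal

# Signal-independent acquisition: random Gaussian measurement matrix.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x                              # only m = 80 measurements, not 256

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily estimate the sparse support."""
    residual = y.copy()
    idx = []
    coef = np.zeros(0)
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in idx:
            idx.append(j)
        # Least-squares fit of y on the columns chosen so far.
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(A, y, k)
rel_err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
print(f"relative reconstruction error: {rel_err:.2e}")
```

With these sizes the sparse signal is recovered from fewer than a third of the Nyquist-rate samples; with too few measurements or too little sparsity, recovery would fail, which is exactly the trade-off CS theory quantifies.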