Information theoretic bounds for compressed sensing

Information theoretic bounds on the performance of compressed sensing touch on signal processing, compressed sensing, information theory, and polar codes. CS is considered a new signal acquisition paradigm with which sample taking can be faster than conventional Nyquist-rate sampling. Arguments for establishing information theoretic lower bounds are not algorithm specific. Similarly, in [10], the authors consider the matrix completion problem and again use information theoretic techniques to obtain bounds.

In this paper we derive information theoretic performance bounds for sensing and reconstruction of sparse phenomena from noisy projections. Related work includes information theoretic limits for linear prediction with graph-structured data, and Saligrama, "Information theoretic bounds to sensing capacity of sensor networks under fixed SNR," presented at the Information Theory Workshop, Sep. We emphasize that although the derivation assumes the measurement matrix to be Gaussian, it can be extended to any subgaussian case, at the cost of a small constant. Information theoretic results in compressed sensing: written by leading experts in a clear, tutorial style, and using consistent notation and definitions throughout, this line of work shows how information theoretic methods are being used in data acquisition and data analysis. Introduction to compressed sensing with a coding theoretic perspective is a course note developed for a graduate level course in spring 2011 at GIST, Korea.
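As a concrete illustration of the measurement model these bounds refer to, the following minimal Python sketch draws a k-sparse signal, a Gaussian measurement matrix, and noisy projections y = Ax + w. All sizes, the noise level, and the 1/sqrt(m) column scaling are illustrative assumptions rather than values taken from the cited papers.

    import numpy as np

    rng = np.random.default_rng(0)

    n, m, k = 512, 128, 10      # ambient dimension, measurements, sparsity (illustrative)
    sigma = 0.01                # noise standard deviation (illustrative)

    # k-sparse signal: k nonzero entries at random positions
    x = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)
    x[support] = rng.standard_normal(k)

    # Gaussian measurement matrix, scaled so columns have roughly unit norm
    A = rng.standard_normal((m, n)) / np.sqrt(m)

    # noisy projections y = Ax + w
    y = A @ x + sigma * rng.standard_normal(m)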

Zhang Jingxiong, Yang Ke, Guo Jianzhong (School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, China). Spectral compressed sensing via structured matrix completion covers 1D line spectral estimation as a special case, and indicates how to address multidimensional models. Related content: quantum tomography via compressed sensing. Compressed sensing (CS) is a new framework for integrated sensing and compression. We develop information theoretic performance bounds on target recognition based on statistical models for sensors and data, and examine conditions under which these bounds are tight.

Aug 17, 2000: detection and recognition problems are modeled as composite hypothesis testing problems involving nuisance parameters. An optimal scaling of the number of observations required is characterized. Index terms: relaxation, compressed sensing, Fano's method, high-dimensional statistical inference, information theoretic bounds, sparse approximation, sparse random matrices. Compressed sensing is an emerging field based on the revelation that a small group of linear projections of a sparse signal contains enough information for reconstruction. These techniques are based on one of the following categories: computer science (information theory), computer science (machine learning), and electrical engineering and systems science (signal processing). Algorithms and bounds for sensing capacity and compressed sensing with applications to learning graphical models, S. Aeron, M. Zhao, V. Saligrama, 2008 Information Theory and Applications Workshop. Sparsity pattern recovery in compressed sensing, by Galen Reeves: a dissertation submitted in partial satisfaction of the requirements for the degree of Doctor of Philosophy in Engineering (Electrical Engineering and Computer Sciences) in the Graduate Division of the University of California, Berkeley. On the other hand, an information theoretic analysis can reveal where there currently exists a gap between the performance of computationally tractable methods and the fundamental limits. We also propose and prove several interesting statistical properties of the square root of the Jensen-Shannon divergence, a well-known information-theoretic metric, and exploit other known ones. Detection and information theoretic measures for quantifying the distinguishability between multimedia operator chains. On the other hand, fundamental information theoretic bounds that are algorithm independent have been presented in [2], [1]. Information theoretic bounds for compressed sensing in SAR imaging.
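To make the Fano-style counting argument behind such bounds concrete, the sketch below computes a rough lower bound on the number of Gaussian measurements needed to distinguish all C(p, k) candidate supports. The per-measurement information term 0.5*log(1 + k*SNR), the function name, and all constants are illustrative simplifications, not the exact expressions from the cited works.

    import numpy as np
    from scipy.special import gammaln

    def log_binom(p, k):
        # natural log of the binomial coefficient C(p, k), computed stably
        return gammaln(p + 1) - gammaln(k + 1) - gammaln(p - k + 1)

    def fano_measurement_lower_bound(p, k, snr):
        # Rough Fano-style bound: the decoder must distinguish C(p, k) candidate
        # supports, while each Gaussian measurement carries at most about
        # 0.5*log(1 + k*snr) nats of information (illustrative constants only).
        nats_needed = log_binom(p, k)
        nats_per_measurement = 0.5 * np.log1p(k * snr)
        return nats_needed / nats_per_measurement

    # Example: p = 10_000, k = 50, per-entry SNR = 1
    print(fano_measurement_lower_bound(10_000, 50, 1.0))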

Compressive sensing provides a new approach to data acquisition and storage. It has recently been shown that for compressive sensing, significantly fewer measurements may be required if the sparsity assumption is replaced by the assumption that the unknown vector lies near the range of a suitably chosen generative model. Index terms: compressed sensing, relaxation, Fano's method, high-dimensional statistical inference, information theoretic bounds, lasso, model selection, signal denoising, sparsity pattern, sparsity recovery, subset selection, support recovery. In the CS literature, several information theoretic bounds on the scaling law of the required number of measurements for exact support recovery have been derived. Introduction: sparse vectors are widely used tools in signal processing and statistics. Apr 22, 2008: in this paper we derive information theoretic performance bounds for sensing and reconstruction of sparse phenomena from noisy projections. Compressed regression (Neural Information Processing Systems). Information-theoretic lower bounds for compressive sensing with generative models: the goal of standard compressive sensing is to estimate an unknown vector from a limited number of noisy linear measurements. Information theoretic performance bounds for noisy compressive sensing: compressive sensing provides a new approach to data acquisition and storage. The obtained bounds establish the relation between the complexity of the autoregressive process and the attainable estimation accuracy through the use of a novel measure of complexity.
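The following toy sketch illustrates the generative-model measurement setup referred to above: the unknown vector is produced by a generator G and one observes noisy random projections of it. The two-layer random ReLU "generator", the network sizes, and the noise level are made-up illustrative values, not the models analyzed in the cited lower bounds.

    import numpy as np

    rng = np.random.default_rng(1)

    latent_dim, n, m = 20, 512, 60          # illustrative sizes
    W1 = rng.standard_normal((256, latent_dim))
    W2 = rng.standard_normal((n, 256))

    def generator(z):
        # toy two-layer ReLU network standing in for a trained generative model G
        return W2 @ np.maximum(W1 @ z, 0.0)

    z_true = rng.standard_normal(latent_dim)
    x_true = generator(z_true)              # the unknown vector lies in the range of G

    A = rng.standard_normal((m, n)) / np.sqrt(m)
    y = A @ x_true + 0.01 * rng.standard_normal(m)
    # Recovery would search over z to minimize ||A G(z) - y||; the cited lower
    # bounds constrain how small m can be for any such decoder.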

In the remaining part of this chapter we derive a few information theoretic bounds pertaining to the problem at hand. Here the authors propose a quantity, named sensing capacity, to incorporate the effects of distortion. Abstract: compressed sensing (CS) deals with the reconstruction of sparse signals from a small number of linear measurements. Information-theoretic bounds on target recognition performance. Nowadays, after only six years, an abundance of theoretical aspects of compressed sensing has been explored in a large number of articles. Using an information theoretic metric for compressive recovery under Poisson noise, by Sukanya Patil, Karthik S. Gurumoorthy, and Ajit Rajwade. Information theoretic lower bounds for compressive sensing with generative models. Information-theoretic limits on sparsity recovery in the high-dimensional and noisy setting. Index terms: relaxation, compressed sensing, Fano's method, high-dimensional statistical inference, information theoretic bounds. Compressed sensing (CS) is a new framework for sampling and reconstructing sparse signals.
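For the Poisson-noise setting mentioned above, a minimal sketch of the measurement model looks as follows. The nonnegative sensing matrix, the intensity range, and all sizes are illustrative assumptions, not the construction used in the cited paper.

    import numpy as np

    rng = np.random.default_rng(4)

    n, m, k = 256, 64, 5                    # illustrative sizes
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.uniform(5.0, 10.0, k)   # nonnegative intensities

    # Nonnegative sensing matrix with entries in [0, 1]: Poisson rates must be nonnegative
    Phi = rng.uniform(0.0, 1.0, (m, n))

    # Each measurement is a Poisson count with rate (Phi x)_i
    y = rng.poisson(Phi @ x)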

Here I_m denotes an identity matrix of size m. The fundamental revelation is that, if an n-sample signal x is sparse and has a good k-term approximation in some basis, then it can be reconstructed using m = O(k log(n/k)) << n measurements. Furthermore, we show an information theoretic lower bound for tomography of rank-r states using adaptive sequences of single-copy Pauli measurements. Information theoretic lower bounds for compressive sensing with generative models. On the one hand are rigorous bounds based on information theoretic arguments or the analysis of specific algorithms. Information-theoretic limits on sparse signal recovery. Compressed sensing bounds with prior information and weighted minimization. Spectral compressed sensing via structured matrix completion. Moreover, this methodology is to date extensively utilized by applied researchers.
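As a quick numerical illustration of the m = O(k log(n/k)) scaling (the constant 2 and the problem sizes are purely illustrative assumptions):

    import numpy as np

    n, k, C = 1_000_000, 1_000, 2.0          # ambient dimension, sparsity, assumed constant
    m = int(np.ceil(C * k * np.log(n / k)))
    print(m)                                  # roughly 1.4e4 measurements versus 1e6 samples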

These problems concern continuous natural phenomena. The improved performance of these methods over their standard counterparts is demonstrated using simulations. Information theoretic bounds for compressed sensing, IEEE Transactions on Information Theory 56(10). One buzzword you can look up and read more about is the "single-pixel camera". A novel technique using polar codes, Signal Processing and Communications Applications Conference (SIU), 2010 IEEE 18th; keywords: compressed sensing, coding and information theory, polar codes, signal processing. This is based on the principle that, through optimization, the sparsity of a signal can be exploited to recover it from far fewer samples than required by the Nyquist-Shannon sampling theorem.
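The optimization referred to here is, in its simplest noiseless form, basis pursuit: minimize the l1 norm subject to the measurement constraints. Below is a minimal sketch using scipy's linear programming solver; the problem sizes in the demo are arbitrary, and this is only one of many possible recovery procedures, not the specific method of any cited paper.

    import numpy as np
    from scipy.optimize import linprog

    def basis_pursuit(A, y):
        # Noiseless basis pursuit, min ||x||_1 s.t. A x = y, as a linear program
        # in the split variables x = u - v with u, v >= 0.
        m, n = A.shape
        c = np.ones(2 * n)                    # objective: sum(u) + sum(v) = ||x||_1
        A_eq = np.hstack([A, -A])             # A u - A v = y
        res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
        u, v = res.x[:n], res.x[n:]
        return u - v

    # Tiny demo: recover a 3-sparse signal in dimension 60 from 25 random projections
    rng = np.random.default_rng(2)
    n, m, k = 60, 25, 3
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_hat = basis_pursuit(A, A @ x)
    print(np.max(np.abs(x_hat - x)))          # should be close to zero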

Tight measurement bounds for exact recovery of structured sparse signals. The theory of compressed sensing, where one is interested in recovering a high-dimensional signal from a small number of measurements, has grown into a rich field of investigation and found many applications [24]. The principal observation here is that most natural phenomena of interest are compressible, i.e., well approximated by sparse representations. An interesting question arises in this context. Bounds for optimal compressed sensing matrices and practical reconstruction schemes. Information-theoretic methods in data science (edited volume). Index terms: compressive sensing, linear prediction, classification. The fundamental revelation is that, if an n-sample signal x is sparse and has a good k-term approximation in some basis, then it can be reconstructed using m = O(k log(n/k)) << n linear projections of x onto another basis.

Compressed sensing (also known as compressive sensing, compressive sampling, or sparse sampling) is a signal processing technique for efficiently acquiring and reconstructing a signal by finding solutions to underdetermined linear systems. Reference [5] investigated the information contained in noisy measurements by viewing the measurement system as an information theoretic channel. Information-theoretic bounds of resampling forensics. Using an information theoretic metric for compressive recovery. On the one hand are rigorous bounds based on information-theoretic arguments or the analysis of specific algorithms. A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation. Bounds for optimal compressed sensing matrices and practical reconstruction schemes, Shriram Sarvotham; abstract: compressed sensing (CS) is an emerging field. IEEE Transactions on Information Forensics and Security 11(4), 2016, 774-788. We propose a reconstruction algorithm with multiple side information. Author affiliations: Sukanya Patil, Department of Electrical Engineering, IIT Bombay; Karthik S. Gurumoorthy, International Center for Theoretical Sciences, TIFR (ICTS-TIFR), Bangalore; Ajit Rajwade, Department of Computer Science and Engineering, IIT Bombay. In this section, we put our work in the context of existing work on Poisson compressed sensing with theoretical performance bounds. Compressed sensing (CS) deals with the reconstruction of sparse signals from a small number of linear measurements. The problem has received significant interest in the compressed sensing and sensor networks (SNETs) literature.
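Viewing each noisy projection as one use of an additive Gaussian noise channel, as in the channel-based analysis mentioned above, gives a crude budget for how much information m measurements can convey. The sketch below simply evaluates the standard AWGN capacity formula with illustrative numbers; it is not the precise argument of the cited reference.

    import numpy as np

    def gaussian_channel_capacity(snr):
        # Capacity in bits per use of an AWGN channel, C = 0.5 * log2(1 + SNR).
        # One noisy projection is treated as one channel use.
        return 0.5 * np.log2(1.0 + snr)

    m, snr = 128, 10.0                        # illustrative numbers
    print(m * gaussian_channel_capacity(snr), "bits across all measurements")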

In the CS literature, several information theoretic bounds on the scaling law of the required number of measurements for exact support recovery have been derived. The focus of our technique is on the replacement of the generalized Kullback-Leibler divergence with an information theoretic metric, namely the square root of the Jensen-Shannon divergence, which is related to an approximate, symmetrized version of the Poisson log likelihood function. The goal of compressed sensing is to learn a structured signal x from a limited number of noisy linear measurements y. Dror Baron, Information theoretic results in compressed sensing. In this paper, we derive some information theoretic bounds on the performance of noisy compressive sensing. Towards an algorithmic theory of compressed sensing, Rutgers University. Index terms: basis pursuit, compressed sensing, compressive sampling, information theoretic bounds, lasso, orthogonal matching pursuit, prior information, sparsity pattern recovery. In this paper we introduce a new theory for distributed compressed sensing (DCS) that enables new distributed coding algorithms for multi-signal ensembles. Compressed sensing (CS) is a new framework for sampling and reconstruction. Information theoretic bounds for compressed sensing in SAR imaging. For comparison, we will use the results by Hegde and others [2] in a linear regression setup.
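A small self-contained implementation of the metric mentioned above, the square root of the Jensen-Shannon divergence between two nonnegative vectors, might look as follows; the internal normalization step and the example intensity vectors are illustrative choices, not part of the cited method.

    import numpy as np

    def sqrt_jensen_shannon(p, q, eps=1e-12):
        # Square root of the Jensen-Shannon divergence between two discrete
        # distributions (a metric); inputs are normalized internally.
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        p = p / p.sum()
        q = q / q.sum()
        m = 0.5 * (p + q)
        def kl(a, b):
            mask = a > 0
            return np.sum(a[mask] * np.log((a[mask] + eps) / (b[mask] + eps)))
        js = 0.5 * kl(p, m) + 0.5 * kl(q, m)
        return np.sqrt(js)

    # Example: distance between two (normalized) nonnegative intensity vectors
    lam1 = np.array([1.0, 2.0, 3.0, 4.0])
    lam2 = np.array([1.5, 1.5, 3.5, 3.5])
    print(sqrt_jensen_shannon(lam1, lam2))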

Information-theoretic lower bounds for compressive sensing: lower bounds for compressed sensing with generative models. In this paper, we derive information theoretic performance bounds for sensing and reconstruction of sparse phenomena from noisy projections. For example, reference [4] studied the minimum number of noisy measurements required to recover a sparse signal by using Shannon information theory bounds. Information theoretic limits for linear prediction with graph-structured data. The standard approach to taking pictures is to first take a high-resolution picture in the "standard basis" (e.g., pixel by pixel). Consider a population consisting of n individuals, each of whom has one of d types. Another goal of this paper is to develop information theoretic bounds for the emerging framework. The problem of sparse estimation via linear measurements, commonly referred to as compressive sensing, is particularly well understood, with theoretical developments including sharp performance bounds for both practical algorithms [4, 7, 8, 6] and potentially intractable, information theoretically optimal algorithms [9, 10, 11, 12]. The smaller matrix SA in R^(m x d) is a compressed version of the original data A in R^(n x d); we start with an overview of different constructions of the sketching matrix S. In the CS literature, several information theoretic bounds on the scaling law of the required number of measurements for exact support recovery have been derived. We consider two types of distortion for reconstruction. Numerical experiments are performed showing the practical use of the technique in signal and image reconstruction from compressed measurements under Poisson noise. Information theoretic performance bounds for noisy compressive sensing.
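A minimal sketch of the sketching operation described above, with a Gaussian sketching matrix S; the matrix sizes and the 1/sqrt(m) scaling are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(3)

    n, d, m = 10_000, 50, 200                      # rows, columns, sketch size (illustrative)
    A = rng.standard_normal((n, d))                # original data matrix, A in R^(n x d)
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sketching matrix, S in R^(m x n)
    SA = S @ A                                     # compressed data, SA in R^(m x d)

    # Column norms (and inner products) are approximately preserved by the sketch
    print(np.linalg.norm(A[:, 0]), np.linalg.norm(SA[:, 0]))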

Finally, we characterize the privacy properties of the compression procedure in information-theoretic terms, establishing upper bounds on the rate of information communicated between the compressed and uncompressed data. Universal measurement bounds for structured sparse signal recovery: we analyze the group-structured sparse recovery problem using a random Gaussian measurement model. In this paper we introduce a new theory for distributed compressed sensing (DCS) that enables new distributed coding algorithms. Detection and information theoretic measures for quantifying the distinguishability between multimedia operator chains. One of the main challenges in CS is to find the support of a sparse signal from a set of noisy observations. On the other hand are exact but heuristic predictions made using the replica method from statistical physics. Furthermore, x can be reconstructed using linear programming, which has polynomial complexity in n. Information theoretic bounds for compressed sensing (IEEE). Learn about the state of the art at the interface between information theory and data science with this first unified treatment of the subject. An information-theoretic approach to distributed compressed sensing. Algorithms and bounds for sensing capacity and compressed sensing with applications to learning graphical models, S. Aeron, M. Zhao, V. Saligrama, 2008 Information Theory and Applications Workshop, 303-309, 2008. Indeed, information-theoretically constrained quadratic programming has also been considered.
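For the support-recovery problem highlighted above, one standard practical baseline is orthogonal matching pursuit. The sketch below is a plain-numpy version with arbitrary demo sizes; it illustrates the greedy idea only and makes no claim of matching the bounds discussed in the cited papers.

    import numpy as np

    def omp(A, y, k):
        # Orthogonal matching pursuit: greedily add the column most correlated
        # with the residual, then refit by least squares on the selected columns.
        n = A.shape[1]
        residual = y.astype(float).copy()
        support = []
        coeffs = np.zeros(0)
        for _ in range(k):
            j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated column
            if j not in support:
                support.append(j)
            coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coeffs
        x_hat = np.zeros(n)
        x_hat[support] = coeffs
        return x_hat, sorted(support)

    # Example: estimate the support of a 4-sparse signal from 40 noisy projections
    rng = np.random.default_rng(5)
    n, m, k = 100, 40, 4
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.standard_normal(k) + 2.0
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    y = A @ x + 0.01 * rng.standard_normal(m)
    x_hat, est_support = omp(A, y, k)
    print(est_support, np.flatnonzero(x).tolist())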
