Compressed sensing (CS) is a new strategy to sample complicated data such as audio signals or natural images. Instead of performing pointwise evaluations with localized sensors, signals are projected onto a small number of delocalized random vectors. This talk gives an overview of this emerging technology, covering both theoretical guarantees and practical applications in image processing and numerical analysis.

The initial theory of CS was developed jointly by Donoho [1] and Candès, Romberg and Tao [2]. It exploits the sparsity of signals to minimize the number of random measurements: natural images, for instance, are well approximated by a small number of wavelets, and this sparsity is at the heart of the non-linear reconstruction process. I will discuss the extent to which the current theory captures the practical success of CS. I will pay particular attention to the worst-case analysis of the recovery, and perform a non-asymptotic evaluation of the performance [3]. To obtain better recovery guarantees, I propose a probabilistic analysis of the recovery of the sparsity support of the signal, which leads to constants that are explicit and small [4].

CS ideas have the potential to revolutionize other fields beyond signal processing. In particular, the resolution of large-scale problems in numerical analysis could benefit from random projections, which perform a dimensionality reduction while simplifying the structure of the problem if the projection is well designed. As a proof of concept, I will present a new compressive wave equation solver that uses projections on random Laplacian eigenvectors [5].

[1] D. Donoho, Compressed sensing, IEEE Trans. Info. Theory, vol. 52, no. 4, pp. 1289-1306, 2006.
[2] E. Candès, J. Romberg, and T. Tao, Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information, IEEE Trans. Info. Theory, vol. 52, no. 2, pp. 489-509, 2006.
[3] C. Dossal, G. Peyré, and J. Fadili, A numerical exploration of compressed sampling recovery, Linear Algebra and its Applications, vol. 432, no. 7, pp. 1663-1679, 2010.
[4] C. Dossal, M.L. Chabanol, G. Peyré, and J. Fadili, Sparse Support Identi
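
To make the sensing-and-reconstruction pipeline sketched above concrete, here is a minimal illustration, not taken from the talk or the references: a sparse signal is measured with a random Gaussian matrix (delocalized random vectors), and a sparse estimate is recovered by iterative soft-thresholding applied to an l1-regularized least-squares problem. All sizes and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: n-dimensional signal, m random measurements, k nonzeros.
n, m, k = 256, 80, 8

# A k-sparse signal with random support and random amplitudes.
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)

# Delocalized random measurements: i.i.d. Gaussian sensing matrix.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true  # m compressed measurements instead of n samples

# Non-linear sparse recovery: iterative soft-thresholding (ISTA) for
# min_x 0.5 * ||A x - y||^2 + lam * ||x||_1.
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / (largest singular value)^2
x = np.zeros(n)
for _ in range(2000):
    x = x - step * (A.T @ (A @ x - y))                     # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0)  # soft threshold

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

With far fewer measurements than signal samples (m = 80 versus n = 256 in this sketch), the l1-based reconstruction recovers the sparse signal accurately, which is the phenomenon whose worst-case and probabilistic analyses are discussed in [3] and [4].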