Friday, October 09, 2015

Sketching for Simultaneously Sparse and Low-Rank Covariance Matrices

Laurent Jacques tweeted about these two sketching preprints recently. The second one is the new version of a preprint we mentioned earlier:

Sketching for Simultaneously Sparse and Low-Rank Covariance Matrices
Sohail Bahmani, Justin Romberg

We introduce a technique for estimating a structured covariance matrix from observations of a random vector which have been sketched. Each observed random vector x_t is reduced to a single number by taking its inner product against one of a number of pre-selected vectors a_ℓ. These observations are used to form estimates of linear observations of the covariance matrix Σ, which is assumed to be simultaneously sparse and low-rank. We show that if the sketching vectors a_ℓ have a special structure, then we can use a straightforward two-stage algorithm that exploits this structure. We show that the estimate is accurate when the number of sketches is proportional to the maximum of the rank times the number of significant rows/columns of Σ. Moreover, our algorithm takes direct advantage of the low-rank structure of Σ by only manipulating matrices that are far smaller than the original covariance matrix.
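As a rough illustration of the measurement model in this abstract (a minimal NumPy demo, not the authors' two-stage algorithm: the dimensions, the Gaussian choice of the a_ℓ, and all variable names are my own assumptions, whereas the paper relies on specially structured a_ℓ), each sample x_t is compressed to the single number ⟨a_ℓ, x_t⟩, and the squared sketch is an unbiased estimate of the linear functional ⟨a_ℓ a_ℓᵀ, Σ⟩:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, T = 64, 100, 20000   # ambient dimension, number of sketch vectors, samples

# Ground truth: rank-3 covariance supported on only k = 8 rows/columns,
# i.e. simultaneously sparse and low-rank
r, k = 3, 8
U = np.zeros((n, r))
U[:k, :] = rng.standard_normal((k, r))
Sigma = U @ U.T

A = rng.standard_normal((m, n))            # sketching vectors a_1, ..., a_m
X = rng.standard_normal((T, r)) @ U.T      # samples x_t ~ N(0, Sigma)

labels = rng.integers(0, m, size=T)        # which a_l each sample is paired with
s = np.einsum('ij,ij->i', A[labels], X)    # scalar sketches <a_l, x_t>

# Average the squared sketches per vector: y_l estimates a_l^T Sigma a_l,
# a linear observation of Sigma
y = np.array([np.mean(s[labels == l] ** 2) for l in range(m)])

exact = np.einsum('ij,jk,ik->i', A, Sigma, A)   # a_l^T Sigma a_l exactly
print('relative error:', np.linalg.norm(y - exact) / np.linalg.norm(exact))
```

Recovering Σ from the y_ℓ is then a linear inverse problem with rank-one measurement matrices a_ℓ a_ℓᵀ; the paper's contribution is a structured choice of the a_ℓ that makes this recovery cheap and provably accurate.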

Isometric sketching of any set via the Restricted Isometry Property
Samet Oymak, Benjamin Recht, Mahdi Soltanolkotabi

In this paper we show that for the purposes of dimensionality reduction a certain class of structured random matrices behaves similarly to random Gaussian matrices. This class includes several matrices for which the matrix-vector multiply can be computed in log-linear time, providing efficient dimensionality reduction of general sets. In particular, we show that, using such matrices, any set in high dimensions can be embedded into lower dimensions with near-optimal distortion. We obtain our results by connecting dimensionality reduction of any set to dimensionality reduction of sparse vectors via a chaining argument.
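A classic member of this class of fast-multiply matrices is the subsampled randomized Hadamard transform (random sign flips, a fast Walsh-Hadamard transform, then row subsampling). The sketch below is only a hedged illustration of the log-linear embedding idea, with made-up dimensions and my own helper names, not code from the paper:

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform in O(n log n); len(x) must be a power of 2."""
    x = x.astype(float).copy()
    h, n = 1, len(x)
    while h < n:
        for i in range(0, n, 2 * h):
            a, b = x[i:i + h].copy(), x[i + h:i + 2 * h].copy()
            x[i:i + h], x[i + h:i + 2 * h] = a + b, a - b
        h *= 2
    return x

def srht(x, rows, signs):
    """Subsampled randomized Hadamard transform: sign-flip, transform, subsample.
    The 1/sqrt(m) scaling makes E||srht(x)||^2 = ||x||^2."""
    return fwht(signs * x)[rows] / np.sqrt(len(rows))

rng = np.random.default_rng(1)
n, m = 1024, 64                                # ambient and sketch dimensions
signs = rng.choice([-1.0, 1.0], size=n)        # random diagonal sign matrix D
rows = rng.choice(n, size=m, replace=False)    # random row subsample

# Distortion check on the difference of two sparse vectors
u = np.zeros(n); u[rng.choice(n, 10, replace=False)] = rng.standard_normal(10)
v = np.zeros(n); v[rng.choice(n, 10, replace=False)] = rng.standard_normal(10)
print('true distance:    ', np.linalg.norm(u - v))
print('sketched distance:', np.linalg.norm(srht(u, rows, signs) - srht(v, rows, signs)))
```

The chaining argument in the abstract is what lifts a guarantee like this from sparse vectors (the RIP) to arbitrary sets.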


Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
