Thursday, January 12, 2012

OSTP RFI Last day for comments, Around the blogs in 80 hours and Extension of SBL Algorithms for the Recovery of Block Sparse Signals with Intra-Block Correlation

Various items on the blog today:

First, today is the last day for getting your thoughts to two Requests for Information issued by the Office of Science and Technology Policy:


I expressed my view on the former in Toward Robust Science: Why Open Access of Government Funded Peer Review Work is Important, but it could rightly apply to the latter as well. I listed the questions here.

"...How To Submit a Response All comments must be submitted electronically to: publicaccess@ostp.gov. Responses to this RFI will be accepted through January 12, 2012. You will receive an electronic confirmation acknowledging receipt of your response,..."


Please note that you do not have to be a US person to submit a response. Also note that any information you provide (including your identity) will become a matter of public record, as is normally expected in this type of generic inquiry. You can still make your submission anonymous if this bothers you.

I will not be watching the presentations at MIA 2012, but I may or may not drop by the cafeteria next door between presentations during the coffee breaks. I removed myself from the list of participants early on so that younger participants could have a chance to attend and learn. Gabriel, one of the organizers, told me that more than 300 people applied but that the rooms at IHP could only safely host 200 or so folks. Congratulations to the organizers; it looks like it will be an impressive series of talks with a large audience.



Rich mentioned the upcoming Connexions conference. Bob talks about Strange behavior in sparse representation classification? Zhilin provides some information on his new paper (see his email below for more information). Danny provides us with a Vowpal Wabbit Tutorial. Terry reviews Random matrices and specifically The Four Moment Theorem for Wigner ensembles. Here is a review of Persi Diaconis' latest book in the WSJ; I am going to get it on the Kindle app for the iPhone/iPad.


At UBC there are two courses related to compressed sensing: MATH 555 taught by Ozgur Yilmaz and EOSC 513 taught by Felix Herrmann. At the University of Michigan, Anna Gilbert has a blog where she writes down some of her lectures there. At Iowa State, Namrata Vaswani teaches EE 527: Detection and Estimation Theory, with parts relevant to compressive sensing.

Justin Romberg's lectures at ENS Lyon last week just showed up on the interwebs:

Finally, Zhilin Zhang sent me the following: 

".....Hi, Igor,

....We just submitted a paper on the block sparse model, which exploits intra-block correlation: Zhilin Zhang, Bhaskar D. Rao, Extension of SBL Algorithms for the Recovery of Block Sparse Signals with Intra-Block Correlation, submitted to IEEE Transactions on Signal Processing, January 2012. The preprint can be downloaded here: http://arxiv.org/abs/1201.0862. Here is the abstract:
We examine the recovery of block sparse signals and extend the framework in two important directions; one by exploiting intra-block correlation and the other by generalizing the block structure. We propose two families of algorithms based on the framework of block sparse Bayesian learning (bSBL). One family, directly derived from the bSBL framework, requires knowledge of the block partition. Another family, derived from an expanded bSBL framework, is based on a weaker assumption about the a priori information of the block structure, and can be used in the cases when block partition, block size, block sparsity are all unknown. Using these algorithms we show that exploiting intra-block correlation is very helpful to improve recovery performance. These algorithms also shed light on how to modify existing algorithms or design new ones to exploit such correlation for improved performance.
Please note that:
  1. Our proposed algorithms have the best recovery performance among ALL the existing algorithms (I've spent more than one month carrying out experiments to compare algorithms, but didn't find any algorithm with performance similar to ours). 
  2. These algorithms are the first algorithms that adaptively exploit intra-block correlation.
  3. We revealed that intra-block correlation, if exploited, can significantly improve recovery performance. I think you may not be surprised by this observation, since we made a similar observation in our previous MMV work (i.e. temporal correlation, if exploited, can significantly improve the recovery performance of MMV algorithms).
  4. But interestingly, we found that intra-block correlation has little effect on the performance of existing algorithms. This observation is entirely different from our previous finding on the MMV model, where we found that temporal correlation has an obvious negative effect on the performance of existing algorithms. For example, group Lasso keeps almost the same recovery performance regardless of the intra-block correlation value. I guess this may be why intra-block correlation has not drawn attention from the people working on the block sparse model. But as you can see from our paper, exploiting intra-block correlation can be very helpful to improve recovery performance (or to reduce the number of measurements needed for the same recovery performance).
The codes will be posted on the website: http://dsp.ucsd.edu/~zhilin/BSBL.html (probably at the end of this month). But anyone who is interested can send me an email to get the codes...."


Thanks, Zhilin, for the heads-up.
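For readers who want a feel for the signal model the paper studies, here is a minimal NumPy sketch of a block sparse signal with intra-block correlation. This is only an illustration of the model, not an implementation of the bSBL algorithms; the AR(1) correlation coefficient, block size, and dimensions are illustrative choices of mine, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N, block_size = 100, 4          # signal length, size of each block
n_blocks = N // block_size      # 25 blocks total
K = 5                           # number of active (nonzero) blocks
M = 40                          # number of measurements

# Intra-block correlation modeled as AR(1): cov[i, j] = r^|i - j|
# (r = 0.9 is an illustrative value, not one from the paper)
r = 0.9
idx = np.arange(block_size)
B = r ** np.abs(idx[:, None] - idx[None, :])
L = np.linalg.cholesky(B)       # L @ L.T == B

# Build the block sparse signal: K randomly chosen blocks are active,
# each filled with correlated Gaussian entries (covariance B)
x = np.zeros(N)
active = rng.choice(n_blocks, size=K, replace=False)
for b in active:
    x[b * block_size:(b + 1) * block_size] = L @ rng.standard_normal(block_size)

# Standard Gaussian sensing matrix and noiseless measurements y = A x
A = rng.standard_normal((M, N)) / np.sqrt(M)
y = A @ x

print(np.count_nonzero(x), y.shape)
```

The point the paper makes is that the recovery algorithm can model the within-block covariance (the matrix B above) instead of treating entries inside a block as independent, which is where the performance gain comes from.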


Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
