Saturday, May 02, 2015

Saturday Morning Videos: IMA Workshop on Convexity and Optimization: Theory and Applications February 23-27, 2015

Presentation of the workshop:
The workshop will consist of two parts. Day one will cover supply chain optimization, with the objective of bringing together leading researchers and developers from industry and academia to discuss challenges, opportunities, and new trends in logistics, material handling, optimization, machine learning, and related algorithms.
The second part of the workshop (lasting four days) will focus on discrete and continuous optimization, with a foray into machine learning. Submodular functions are discrete analogs of convex functions (and, in some contexts, of concave functions), arising in various fields of computer science and operations research. Since the seminal work of Jack Edmonds (1970), submodularity has been recognized as a common structure of many efficiently solvable combinatorial optimization problems. Algorithmic developments of the past decade include a combinatorial strongly polynomial algorithm for minimization, constant-factor approximation algorithms for maximization, and efficient methods for learning submodular functions. In addition, submodular functions find novel applications in combinatorial auctions, machine learning, and social networks. This workshop aims to provide a forum for researchers from a variety of backgrounds to exchange results, ideas, and problems on submodular optimization and its applications. Application domains are numerous, ranging from sensor placement for water management to navigation of mobile robots, as evidenced by the work of Carlos Guestrin, Andreas Krause, and several others.
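As an aside, the constant-factor maximization mentioned above is typically achieved by greedy-type methods. Here is a minimal Python sketch (not taken from any of the talks; the sensor/coverage data are made up) of the classic greedy algorithm of Nemhauser, Wolsey, and Fisher (1978), which guarantees a (1 - 1/e)-approximation for maximizing a monotone submodular function, such as the coverage function below, under a cardinality constraint:

```python
def greedy_max(ground_set, f, k):
    """Greedily pick k elements, each time adding the one with largest marginal gain."""
    selected = set()
    for _ in range(k):
        best = max((e for e in ground_set if e not in selected),
                   key=lambda e: f(selected | {e}) - f(selected))
        selected.add(best)
    return selected

# Hypothetical sensor-placement instance: each candidate sensor covers some targets.
coverage = {
    "s1": {1, 2, 3},
    "s2": {3, 4},
    "s3": {4, 5, 6},
    "s4": {1, 6},
}

def num_covered(S):
    """Coverage function f(S): number of distinct targets covered -- submodular."""
    return len(set().union(*(coverage[s] for s in S))) if S else 0

print(greedy_max(coverage.keys(), num_covered, k=2))  # {'s1', 's3'}: all 6 targets
```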
On the more general optimization front, the concept of robust optimization, as developed by Aharon Ben-Tal, Arkadi Nemirovsky, and collaborators, offers the promise of providing sets of near-optimal solutions (rather than a potentially unique optimal solution) to problems arising from families of input instances. This somewhat classic topic deserves renewed attention because of its obvious appeal: robustness to uncertainty, or noise, in the input, which typically arises with large data sets. The development of robust linear programs (LPs) and robust semidefinite programs (SDPs) is very much in its infancy from a theoretical computer science standpoint, and we expect a fruitful dialog between the various groups involved.
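To make the robust-counterpart idea concrete, here is a minimal sketch (assuming the cvxpy package and made-up data, neither of which appears in the post): under box uncertainty |delta_i| <= rho on a constraint row a, the worst case of (a + delta)^T x <= b over the uncertainty set is a^T x + rho * ||x||_1 <= b, so the robust LP remains a tractable convex program.

```python
import cvxpy as cp
import numpy as np

n = 3
c = np.array([-1.0, -2.0, -1.5])   # illustrative objective
a = np.array([1.0, 1.0, 2.0])      # nominal constraint row
b, rho = 4.0, 0.1                  # right-hand side and uncertainty radius

x = cp.Variable(n, nonneg=True)
# Robust counterpart of (a + delta)^T x <= b for all |delta_i| <= rho:
constraints = [a @ x + rho * cp.norm1(x) <= b]
prob = cp.Problem(cp.Minimize(c @ x), constraints)
prob.solve()
print(x.value)   # a solution that stays feasible for every admissible delta
```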
Optimization formulations and methods have been at the heart of many modern machine learning algorithms, which are used extensively across science and engineering to automatically extract essential knowledge from huge volumes of data. The increasing complexity, size, and variety of these applications have led to interesting interactions between optimization and machine learning. The workshop will highlight recent connections between the two fields.
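As a token of that connection, here is a minimal numpy sketch (synthetic data; plain gradient descent, not a method from any of the talks) of fitting a linear model by minimizing the least-squares loss f(w) = ||Xw - y||^2 / (2m), whose gradient is X^T (Xw - y) / m:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 200, 5
X = rng.normal(size=(m, n))
w_true = rng.normal(size=n)
y = X @ w_true + 0.01 * rng.normal(size=m)   # noisy labels

w = np.zeros(n)
lr = 0.1
for _ in range(500):
    grad = X.T @ (X @ w - y) / m             # gradient of the least-squares loss
    w -= lr * grad

print(np.linalg.norm(w - w_true))            # small: w_true recovered up to noise
```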

Abstracts and Talk Materials are here.

Here are the videos:

Continuous Time Integer Programming: Towards Large-Scale Optimal Scheduling in Logistics, 2015-02-23, Natashia Boland  (Georgia Institute of Technology)
Probabilistic Inference with Submodular Functions, 2015-02-23, Andreas Krause  (ETH)
Optimizing Decomposable Submodular Functions, 2015-02-24, Stefanie Jegelka  (Massachusetts Institute of Technology)
DC Programming in Discrete Convex Analysis, 2015-02-24, Kazuo Murota  (University of Tokyo)
Analytic Centers, Reciprocal Linear Spaces, and Planes that Intersect Them, 2015-02-24, Cynthia Vinzant  (North Carolina State University)
Fast Algorithms for Optimization of Submodular Functions, 2015-02-24, Jan Vondrak  (IBM Research Division)
Some Recent Developments in Large-Scale Convex Optimization, 2015-02-24, Niao He  (Georgia Institute of Technology)
The Entropic Barrier: A Simple and Optimal Universal Self-Concordant Barrier, 2015-02-25, Sébastien Bubeck  (Microsoft)
Two Distributed Optimization Algorithms for Machine Learning, 2015-02-25, Yingyu Liang  (Princeton University)
Inapproximability of Combinatorial Problems via Small LPs and SDPs, 2015-02-25, Sebastian Pokutta  (Georgia Institute of Technology)
Tightness of Convex Relaxations to Sparsity and Rank, 2015-02-25, Nati Srebro  (Technion-Israel Institute of Technology)
Efficient Algorithms for Structured Sparsity, and Applications, 2015-02-25, Eric Xing  (Carnegie-Mellon University)
Thrifty Approximations of Convex Bodies by Polytopes, 2015-02-26, Alexander Barvinok  (University of Michigan)
Relaxations: Deriving Algorithms for Learning and Optimization, 2015-02-26, Karthik Sridharan  (Cornell University)
Nonconvex Quadratic Optimization with One or Two Constraints, 2015-02-26, Akiko Takeda  (University of Tokyo)
On Connections Between Submodularity and Concavity, 2015-02-27, Jeff A Bilmes  (University of Washington)
The Power of Localization for Efficiently Learning with Noise, 2015-02-27, Nina Balcan  (Carnegie-Mellon University)


