Learning Compact Models from High Dimensional Large Datasets

Monday, April 13, 2015 - 7:00am to 8:00am
United States

Dr. Yoram Singer reviews the design, analysis, and implementation of stochastic optimization techniques, online algorithms, and modeling approaches for learning in high-dimensional spaces from large amounts of data. The focus is on algorithms and models that are efficient and accurate and that yield compact models. Concretely, he describes the forward-backward splitting algorithm (FOBOS), mirror descent for composite objectives (COMID), and the adaptive gradient (AdaGrad) algorithm. Time permitting, he will also discuss simple yet effective modeling approaches based on locality for matrix approximation from high-dimensional data.
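For readers unfamiliar with these methods, below is a minimal sketch of a FOBOS-style update for an L1-regularized loss, following the standard formulation: a gradient step on the loss, then the closed-form soft-thresholding prox of the L1 penalty, which zeroes small coordinates and hence yields the compact, sparse models the abstract refers to. The function names and the toy least-squares problem are illustrative assumptions, not material from the talk.

    import numpy as np

    def soft_threshold(v, tau):
        # Prox operator of tau * ||.||_1: shrink each coordinate toward zero,
        # zeroing any coordinate whose magnitude falls below tau.
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def fobos_step(w, grad, lr, lam):
        # One FOBOS iteration: an unconstrained gradient step on the loss,
        # followed by the shrinkage step induced by the L1 regularizer.
        return soft_threshold(w - lr * grad, lr * lam)

    # Toy usage (illustrative): recover a sparse weight vector under the
    # least-squares loss 0.5 * ||X w - y||^2, whose gradient is X^T (X w - y).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 20))
    w_true = np.zeros(20)
    w_true[:3] = [2.0, -1.5, 1.0]
    y = X @ w_true
    w = np.zeros(20)
    for _ in range(500):
        w = fobos_step(w, X.T @ (X @ w - y), lr=1e-3, lam=0.5)
    print(np.nonzero(w)[0])  # largely the three truly nonzero coordinates

AdaGrad, also named in the abstract, replaces the fixed step size lr above with a per-coordinate rate inversely proportional to the square root of the accumulated squared gradients.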


Speaker(s): 

Yoram Singer

Google
Senior Research Scientist

Dr. Yoram Singer is a senior research scientist at Google. From 1999 through 2007, he was an Associate Professor at the Hebrew University of Jerusalem. From 1995 through 1999, he was a member of the technical staff at AT&T Research. He was co-chair of the Conference on Computational Learning Theory (COLT) in 2004 and of Neural Information Processing Systems (NIPS) in 2007. He served as an editor of the Machine Learning Journal (MLJ), the Journal of Machine Learning Research (JMLR), IEEE Signal Processing Magazine (SPM), and IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI). His collaborative work with colleagues won several awards, including a best ten-year retrospective machine learning paper award and three best student paper awards at NIPS. He has been an AAAI fellow since 2011.
