Learning from Big but Finite Data: Algorithms and Insights for Neural Networks
University of California at Riverside, Department of Electrical and Computer Engineering
Thursday, September 13, 2018
4:00pm - 5:00pm
About the Event
Fueled by data, modern machine learning algorithms achieve state-of-the-art performance in computer vision and natural language processing tasks. These algorithms can efficiently utilize big data and boost performance by using large models that capture the latent structure in the data. In this talk, we discuss data efficiency in training practical machine learning models such as convolutional networks and recurrent neural networks. We provide insights and principled algorithms for rigorously learning these models from a near-optimal amount of data. We also highlight how theoretical insights can shed light on common heuristics such as batch normalization.
Samet Oymak is an assistant professor in the Department of Electrical and Computer Engineering at the University of California, Riverside. He was previously a fellow at the Simons Institute and a postdoctoral scholar at the AMPLab at UC Berkeley. He received his BS from Bilkent University in 2009 and his MS and PhD from Caltech in 2014, all in electrical engineering. At Caltech, he received the Wilts Prize for the best thesis in electrical engineering. His research interests include mathematical optimization, distributed algorithms, and statistical machine learning.
Contact: Judi Jones
Faculty Sponsor: Vijay Subramanian
Open to: Public