About the Event
Exploiting structure in the data distribution is critical to high-dimensional statistical estimation. Probabilistic graphical models provide an efficient framework for representing complex joint distributions of random variables through their conditional dependency graphs, and can be adapted to many high-dimensional machine learning applications.
This dissertation develops probabilistic graphical modeling techniques for three statistical estimation problems that arise in real-world applications: distributed and parallel learning in Markov networks, missing-value prediction in recommender systems using latent variable graphical models, and emerging topic detection in text corpora using topic models. The common theme behind all of the proposed methods is the combination of a parsimonious representation of uncertainty in the data, optimization surrogates that lead to computationally efficient algorithms, and fundamental limits on estimation performance in high dimensions.