Provable Learning from Data with Priors: from Low-rank to Diffusion Models
Speaker: Yuejie Chi – Pittsburgh, PA, United States
Topic(s): Artificial Intelligence, Machine Learning, Computer Vision, Natural Language Processing, Computational Theory, Algorithms and Mathematics
Abstract
Generative priors are effective tools for combating the curse of dimensionality in data science, enabling efficient learning of problems that would otherwise be ill-posed. This talk starts with the classical low-rank prior, discussing how the trick of preconditioning boosts the learning speed of gradient descent without compromising generalization in overparameterized low-rank models, unveiling the phenomenon of implicit regularization. The talk then discusses non-asymptotic theory toward understanding the data generation process of diffusion models in discrete time, assuming access to reasonable estimates of the score functions.
About this Lecture
Number of Slides: 40
Duration: 45 minutes
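To make the preconditioning idea from the abstract concrete, here is a minimal, hypothetical sketch of a scaled-gradient-descent update for low-rank matrix factorization. The dimensions, step size, and random data below are illustrative assumptions, not details taken from the lecture itself.

```python
import numpy as np

# Hypothetical sketch: preconditioned (scaled) gradient descent for
# fitting a rank-r factorization M ≈ X @ Y.T. Each factor's gradient is
# rescaled by the inverse Gram matrix of the other factor, so a constant
# step size works regardless of how ill-conditioned M is.
rng = np.random.default_rng(0)
n, r = 50, 3
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r target

X = rng.standard_normal((n, r))  # random initialization (assumption)
Y = rng.standard_normal((n, r))
eta = 0.5                        # constant step size (assumption)

for _ in range(200):
    R = X @ Y.T - M                                    # residual
    X = X - eta * (R @ Y) @ np.linalg.inv(Y.T @ Y)     # scaled step in X
    R = X @ Y.T - M                                    # refreshed residual
    Y = Y - eta * (R.T @ X) @ np.linalg.inv(X.T @ X)   # scaled step in Y

rel_err = np.linalg.norm(X @ Y.T - M) / np.linalg.norm(M)
print(f"relative error: {rel_err:.2e}")
```

Compared with plain gradient descent, whose step size must shrink with the condition number of M, the Gram-matrix preconditioners here let a fixed step size drive the relative error down geometrically.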
Languages Available: English
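The discrete-time generation process of diffusion models mentioned in the abstract can likewise be sketched on a one-dimensional Gaussian toy example, where the score function of each noised marginal is available in closed form. The noise schedule, horizon, and sample count are illustrative assumptions, not choices from the talk.

```python
import numpy as np

# Hypothetical sketch: DDPM-style discrete-time reverse sampling for a
# 1-D Gaussian data distribution N(0, sigma0^2). Because the data are
# Gaussian, the exact score of each noised marginal is known, standing
# in for the "reasonable estimates of the score functions".
rng = np.random.default_rng(0)
sigma0 = 0.5                              # std of the data distribution
T = 1000
betas = np.linspace(1e-4, 0.02, T)        # forward noise schedule (assumption)
alphas = 1.0 - betas
abar = np.cumprod(alphas)                 # cumulative products \bar{alpha}_t

# Marginal variance of x_t is abar_t * sigma0^2 + (1 - abar_t),
# so the exact score is s(x, t) = -x / var_t.
var = abar * sigma0**2 + (1.0 - abar)

x = rng.standard_normal(5000)             # start from x_T ~ N(0, 1)
for t in range(T - 1, -1, -1):
    score = -x / var[t]
    z = rng.standard_normal(x.shape) if t > 0 else 0.0  # no noise at t = 0
    x = (x + betas[t] * score) / np.sqrt(alphas[t]) + np.sqrt(betas[t]) * z

print(f"sample std: {x.std():.3f} (target {sigma0})")
```

Running the reverse chain with the exact scores recovers samples whose spread matches the data distribution, which is the setting the non-asymptotic theory quantifies when the scores are only estimated.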
Request this Lecture
To request this particular lecture, please complete this online form.
Request a Tour
To request a tour with this speaker, please complete this online form.
All requests will be sent to ACM headquarters for review.