| Field | Value |
|---|---|
| Funder | National Science Foundation (US) |
| Recipient Organization | Yale University |
| Country | United States |
| Start Date | Jun 01, 2025 |
| End Date | May 31, 2030 |
| Duration | 1,825 days |
| Number of Grantees | 1 |
| Roles | Principal Investigator |
| Data Source | National Science Foundation (US) |
| Grant ID | 2443097 |
Algorithms are the building blocks of modern computation, which underlies much of the technology on which our society depends. In particular, the successes of modern machine learning systems rely on our ability to solve large-scale computational tasks via efficient algorithms, such as solving optimization problems to train neural networks and drawing random samples from probability distributions.
Algorithms in practice necessarily operate via discrete iterations, executing one operation after another. Many algorithms are inspired by the mathematics of dynamical systems, where time flows continuously, but such derivations have so far been carried out manually, often with sophisticated and difficult specialized analysis. A general theory of how to derive algorithms from dynamics while preserving the desired properties is missing.
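As an illustrative sketch of the dynamics-to-algorithm translation discussed above (this example is not taken from the grant text): discretizing the gradient flow dx/dt = -grad f(x) with an explicit Euler step of size h yields the familiar gradient descent iteration x_{k+1} = x_k - h grad f(x_k), and convergence of the flow carries over to the iteration for suitable step sizes.

```python
import numpy as np

def gradient_descent(grad_f, x0, step=0.1, iters=100):
    """Explicit Euler discretization of the gradient flow dx/dt = -grad f(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad_f(x)  # one Euler step of the continuous flow
    return x

# Example: f(x) = ||x||^2 / 2, whose flow converges to the minimizer 0.
grad = lambda x: x
x_final = gradient_descent(grad, x0=[3.0, -4.0], step=0.1, iters=200)
print(np.linalg.norm(x_final))  # near 0
```

Here the step size plays the role of the discretization parameter: too large a step can break the convergence guarantee that holds for the continuous flow, which is precisely the kind of gap a systematic discretization theory aims to control.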
This project aims to close this gap by developing a rigorous way to systematically translate continuous-time dynamics into discrete-time algorithms that automatically preserve convergence guarantees, focusing on algorithms for fundamental problems in optimization and sampling motivated by machine learning applications. The outcome of this project will benefit many fields in which computational tasks such as optimization and sampling are used, by providing researchers and practitioners with a higher-level language, based on dynamical systems, for designing and reasoning about algorithms.
This project also provides training opportunities for students, outreach events to popularize the dynamical perspective of algorithm design, and a collaborative effort to compile a catalog of dynamics and algorithms.
This project will strengthen the parallel structures between dynamics and algorithms for optimization and sampling by studying two complementary directions. In the first direction, this project studies sampling problems as optimization problems on the space of probability distributions. Many recent results in sampling have been derived from this perspective, in particular for basic greedy methods analogous to gradient descent.
This project aims to complete the translation from optimization to sampling by deriving algorithmic results for various sampling problems as inspired by the optimization results, including for constrained sampling, accelerated sampling, and sampling with stronger mixing time guarantees. This project will combine ideas and tools from dynamical systems, geometry, probability, and information theory to extract concrete algorithmic principles to design new algorithms in practice.
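A minimal sketch of the optimization-to-sampling analogy described above (an assumed example, not the grant's specific method): discretizing the overdamped Langevin dynamics dX = -grad f(X) dt + sqrt(2) dB by Euler-Maruyama gives the unadjusted Langevin algorithm (ULA), which is gradient descent plus Gaussian noise and samples approximately from the density proportional to exp(-f).

```python
import numpy as np

def ula_samples(grad_f, x0, step=0.1, iters=20000, burn=2000, seed=0):
    """Unadjusted Langevin algorithm: Euler-Maruyama discretization of
    dX = -grad f(X) dt + sqrt(2) dB, targeting density ~ exp(-f)."""
    rng = np.random.default_rng(seed)
    x, out = float(x0), []
    for k in range(iters):
        # gradient descent step plus injected Gaussian noise
        x = x - step * grad_f(x) + np.sqrt(2 * step) * rng.standard_normal()
        if k >= burn:
            out.append(x)
    return np.array(out)

# Target: standard Gaussian, f(x) = x^2 / 2, so grad f(x) = x.
samples = ula_samples(lambda x: x, x0=0.0)
print(samples.mean(), samples.var())  # near 0 and near 1, up to O(step) bias
```

The O(step) bias in the stationary distribution is exactly the kind of discretization error that a systematic translation theory seeks to quantify and control.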
In the second direction, this project aims to formalize the robust parallel structures between dynamics and algorithms with matching convergence guarantees. This project will study the mechanics of translating dynamics into algorithms and investigate why certain ideas succeed in the translation, including the idea of speeding up time to achieve faster convergence rates, the optimality of accelerated methods for optimization and sampling, and the geometric structure of the space of algorithms and dynamics that allows such robust parallel structures to emerge.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.