Active Standard Grant

Collaborative Research: CIF: Small: Approximate Coded Computing - Fundamental Limits of Precision, Fault-Tolerance, and Privacy

$2.87M USD

Funder National Science Foundation (US)
Recipient Organization Georgia Tech Research Corporation
Country United States
Start Date Oct 01, 2024
End Date May 31, 2026
Duration 607 days
Number of Grantees 1
Roles Principal Investigator
Data Source National Science Foundation (US)
Grant ID 2506573
Grant Description

Distributed computing plays a central role in enabling modern machine learning and artificial intelligence applications. This project develops new theoretical frameworks, analyses, and techniques for distributed computing and machine learning that aim to (i) accelerate computation by overcoming system bottlenecks, (ii) ensure accurate computation in the presence of hardware errors and faults, and (iii) enable data-processing approaches that adhere to data-privacy constraints.

These attributes will be enabled by inducing a controlled amount of redundancy in computations based on coding theory, a field that underpins modern data communication and storage technologies. To obtain a realistic understanding of what is possible in the long run, the developed techniques will be accompanied by fundamental bounds on the tradeoffs between computation accuracy, data privacy, error tolerance, and redundancy overhead.

The project will disseminate outcomes and enable awareness of developed research to the broader scientific community through publications, tutorials, and curricular integration.

Coded computing is a sub-area of information and coding theory that induces redundancy into distributed computing. Coded computing has emerged as a promising paradigm to relieve straggler, communication, and data-privacy bottlenecks in large-scale distributed machine learning. Yet, state-of-the-art coded-computing techniques, mostly devised to enable exact reconstruction of the computation output, have fundamental efficiency limitations, particularly for nonlinear computation tasks.
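To make the idea concrete, here is a toy sketch (not one of the project's schemes) of how coded redundancy mitigates stragglers in distributed matrix-vector multiplication: the coefficient matrix is split into two row blocks plus one parity block, so the full product can be recovered from any two of the three worker results. The block sizes, the simple sum parity, and all variable names are illustrative assumptions.

```python
import numpy as np

# Illustrative straggler-tolerant coded matrix-vector multiplication.
# Split A into two row blocks and add one parity block (a simple MDS-style
# code): the product A @ x is recoverable from ANY two of three workers.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
x = rng.standard_normal(3)

A1, A2 = A[:2], A[2:]   # systematic blocks, one per worker
A3 = A1 + A2            # parity block for the third worker

# Each worker computes one block-vector product.
y1, y2, y3 = A1 @ x, A2 @ x, A3 @ x

# Suppose worker 2 straggles: its share is decoded from the parity result
# instead of waiting, since A3 @ x - A1 @ x = A2 @ x.
y2_decoded = y3 - y1
y = np.concatenate([y1, y2_decoded])

assert np.allclose(y, A @ x)  # exact recovery without worker 2
```

This exact-recovery style is what state-of-the-art schemes generalize; the project's point is that relaxing exactness to a distortion budget can reduce the redundancy needed, especially for nonlinear computations.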

This project develops techniques for approximate coded computing, wherein the decoder aims to obtain the function output within a prescribed distortion limit, and data-privacy constraints are posed as limits on differential-privacy parameters. The research will be conducted in three closely connected thrusts: (i) coding schemes for fault-tolerant approximate matrix multiplication, (ii) coding schemes for fault-tolerant approximate nonlinear computations beyond matrix multiplications, and (iii) coding schemes for differentially private computations.
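For thrust (iii), the differential-privacy constraint can be pictured with a minimal sketch of the standard Laplace mechanism (a generic textbook mechanism, not the coding schemes this project proposes): noise calibrated to the query's sensitivity and a privacy parameter epsilon is added before a result is released. The function name, the bounded-data assumption, and the parameter values are all illustrative.

```python
import numpy as np

# Minimal sketch of the Laplace mechanism for epsilon-differential privacy:
# releasing sum(data) with Laplace noise of scale sensitivity / epsilon.
def laplace_release(data, sensitivity, epsilon, rng):
    """Return a noisy sum; smaller epsilon means stronger privacy."""
    scale = sensitivity / epsilon
    return float(np.sum(data)) + float(rng.laplace(loc=0.0, scale=scale))

rng = np.random.default_rng(1)
data = np.array([0.2, 0.7, 0.4])  # each value assumed bounded in [0, 1],
                                  # so the sum query has sensitivity 1
released = laplace_release(data, sensitivity=1.0, epsilon=0.5, rng=rng)
```

In the project's setting, the differential-privacy parameters act as constraints alongside the distortion limit, and the question is how much coded redundancy is needed to meet both.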

The techniques developed will combine ideas from information and coding theories, mathematical approximation theory, and differential privacy.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

All Grantees

Georgia Tech Research Corporation
