Active STANDARD GRANT National Science Foundation (US)

CSR: Small: Efficient DGNN Training on Large-Scale and Time-Varying Graphs

$6M USD

Funder National Science Foundation (US)
Recipient Organization University of Virginia Main Campus
Country United States
Start Date Oct 01, 2024
End Date Sep 30, 2027
Duration 1,094 days
Number of Grantees 1
Roles Principal Investigator
Data Source National Science Foundation (US)
Grant ID 2350425
Grant Description

Graphs play a pivotal role across diverse domains, including social networks and natural language processing. While Graph Neural Network (GNN) training systems have made training on large static graphs feasible, real-world applications (e.g., social media) often involve large-scale, time-varying dynamic graphs (LTGs), and Dynamic GNN (DGNN) models have emerged to tackle them.

There is therefore a need to reduce the computational resources and training time of such models, especially since DGNN training on LTGs is more formidable than GNN training due to the graphs' vast scale and distinctive time-varying characteristics. This project's novelty lies in new, effective training methods tailored for DGNNs on LTGs. The project has several areas of broader significance and importance.

The project generates critical insights into the challenges of achieving highly scalable and efficient DGNN training on LTGs across different applications, along with advanced approaches for tackling those challenges. Moreover, the project provides thorough training and collaborative research opportunities for participating graduate, undergraduate, and K-12 students and faculty, with research results disseminated and integrated into courses.

To achieve highly efficient DGNN training on LTGs, this project endeavors to create innovative approaches for graph partitioning, sampling, caching, and training. The project comprises four distinct tasks: 1) a comprehensive analysis of the characteristics of DGNN training on various types of LTGs; 2) LTG partitioning and caching before training; 3) LTG sampling and caching during training; and 4) efficient DGNN training on LTGs.
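To illustrate the kind of operation involved in task 3, the sketch below shows a generic time-respecting neighbor sampler of the sort commonly used in DGNN training: when computing an embedding for a node at a given timestamp, only edges that occurred before that timestamp are eligible for sampling. This is a minimal, hypothetical illustration of the general technique, not the project's actual method; the class and parameter names are invented for this example.

```python
import bisect
import random
from collections import defaultdict

class TemporalNeighborSampler:
    """Sample time-respecting neighbors: only edges with a timestamp
    strictly earlier than the query time are eligible."""

    def __init__(self, edges):
        # edges: iterable of (src, dst, timestamp) tuples.
        # Store neighbors per node sorted by time so a binary
        # search can find the temporal cutoff quickly.
        self.adj = defaultdict(list)
        for src, dst, t in edges:
            self.adj[src].append((t, dst))
        for nbrs in self.adj.values():
            nbrs.sort()

    def sample(self, node, query_time, k, rng=random):
        nbrs = self.adj.get(node, [])
        # Keep only interactions that happened before query_time.
        cut = bisect.bisect_left(nbrs, (query_time,))
        eligible = nbrs[:cut]
        if len(eligible) <= k:
            return [dst for _, dst in eligible]
        # Uniformly subsample k of the eligible past neighbors.
        return [dst for _, dst in rng.sample(eligible, k)]
```

For example, querying node 0 at time 2.5 over edges timestamped 1.0, 2.0, and 3.0 returns only the first two neighbors; real systems layer caching and partition-aware lookups on top of this basic primitive.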

This project impacts various applications across science and engineering by improving the performance of many of their required tasks. It serves the systems community as a vehicle for further research and experimentation, advancing the state of the art. Finally, this project has the potential to yield socio-economic benefits for organizations requiring efficient DGNN execution on LTGs.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

All Grantees

University of Virginia Main Campus
