| Field | Value |
|---|---|
| Funder | National Science Foundation (US) |
| Recipient Organization | Stanford University |
| Country | United States |
| Start Date | Oct 01, 2021 |
| End Date | Sep 30, 2025 |
| Duration | 1,460 days |
| Number of Grantees | 6 |
| Roles | Former Principal Investigator; Principal Investigator; Co-Principal Investigator; Former Co-Principal Investigator |
| Data Source | National Science Foundation (US) |
| Grant ID | 2120095 |
Humans effortlessly engage in a diverse set of complex daily activities: cooking a meal, packing lunches, cleaning a bedroom. Designing artificial agents and robots that can perform such activities in dynamic environments would be immensely useful, with applications in areas such as healthcare (e.g., assisting the elderly). Yet it remains a long-standing, challenging problem.
While recent research in artificial intelligence (AI) and robotics has shown promising progress, it has focused on learning and executing simple tasks such as pushing an object in a table-top environment. This project will build infrastructure that enables the development of intelligent agents that perceive and interact with humans in real-world household environments.
In turn, this facilitates robotic systems that help humans accomplish household activities, with significant potential societal and economic impact. Researchers will also partner with educational and non-profit organizations to teach AI and robotics to underrepresented students and to communicate the proposed infrastructure to families with diverse backgrounds, particularly from underserved communities.
In this project, investigators will develop a virtual environment for embodied agents to perform interactive activities, with a formal description of everyday activities inspired by cognitive science findings, facilitating the development of embodied AI in complex, ecological scenes. The developed environment will enable simulation of activities in large-scale, realistic, interactive, home-sized scenes, populated with rigid and articulated objects with state changes (e.g., temperature).
This project will build upon existing infrastructure from the team of researchers, notably the iGibson environment, but significantly extend it to include key features to facilitate embodied AI study. These features include (i) high-quality multisensory signals (visual, tactile, auditory), (ii) realistic and diverse geometry, material, and physics of objects, (iii) an intuitive human interface with a virtual reality setup, and (iv) a multi-user collaborative virtual reality system.
In addition, the team of researchers will develop a taxonomy of daily activities, including their pre-conditions, effects, and hierarchy, using formal descriptions from robotics, accompanied by human demonstrations collected in virtual reality for evaluating the performance of autonomous agents on such activities.
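A formal description of activities with pre-conditions, effects, and hierarchy resembles STRIPS-style symbolic planning. The sketch below illustrates the idea only; the class name, predicate strings, and the `heat_soup` example are hypothetical and are not the project's actual formalism:

```python
from dataclasses import dataclass, field
from typing import FrozenSet, List

@dataclass
class Activity:
    """A daily activity with symbolic pre-conditions and effects (illustrative)."""
    name: str
    preconditions: FrozenSet[str]   # predicates that must hold before execution
    add_effects: FrozenSet[str]     # predicates made true by the activity
    del_effects: FrozenSet[str]     # predicates made false by the activity
    children: List["Activity"] = field(default_factory=list)  # sub-activities (hierarchy)

    def applicable(self, state: FrozenSet[str]) -> bool:
        # The activity can run only if all pre-conditions are in the state.
        return self.preconditions <= state

    def apply(self, state: FrozenSet[str]) -> FrozenSet[str]:
        # Transition the world state: remove deleted predicates, add new ones.
        if not self.applicable(state):
            raise ValueError(f"preconditions unmet for {self.name}")
        return (state - self.del_effects) | self.add_effects

# Hypothetical example: heating soup flips the object's temperature state.
heat_soup = Activity(
    name="heat_soup",
    preconditions=frozenset({"soup_in_pot", "pot_on_stove", "stove_off"}),
    add_effects=frozenset({"stove_on", "soup_hot"}),
    del_effects=frozenset({"stove_off", "soup_cold"}),
)

state = frozenset({"soup_in_pot", "pot_on_stove", "stove_off", "soup_cold"})
new_state = heat_soup.apply(state)  # soup is now hot, stove is on
```

Representing states as sets of predicates makes checking pre-conditions a subset test, and lets an evaluator verify whether an agent's demonstration reaches the goal predicates of an activity.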
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.