| Funder | National Science Foundation (US) |
|---|---|
| Recipient Organization | Washington University |
| Country | United States |
| Start Date | May 15, 2024 |
| End Date | Apr 30, 2029 |
| Duration | 1,811 days |
| Number of Grantees | 1 |
| Roles | Principal Investigator |
| Data Source | National Science Foundation (US) |
| Grant ID | 2340417 |
Recent advances in artificial perception and decision-making theory have enabled the design of autonomous robots that can achieve diverse tasks, such as delivery, surveillance, exploration, and mapping, in unknown environments. In most existing autonomy architectures, the ambient environment remains beyond the direct control of the robots. Thus, if the structure of the environment prevents the robots from accomplishing their tasks, existing autonomy algorithms will simply return a mission-failure message.
This situation naturally arises in unstructured environments, where pathways toward regions of interest may be blocked by unexpected objects or severed by unexpected physical gaps. The autonomy methods to be developed under this Faculty Early Career Development (CAREER) project will enable robots to reason about when and how to physically interact with the environment, reconfiguring its structure so that they can accomplish their tasks.
The developed algorithms are anticipated to enhance the efficiency, effectiveness, and adaptability of collaborative robots in diverse humanitarian and scientific applications in unstructured environments, such as search-and-rescue, disaster response, and surveillance. In addition, the research will inspire several educational activities, including the development of a suite of open-source educational materials (e.g., lectures, software, and robot demonstrations), curriculum development, and outreach and research activities for K-12, undergraduate, and graduate students.
The research agenda will be achieved by designing foundational interactive perception-based planning capabilities for robot teams operating in unknown and unstructured environments. These capabilities will enable robots to collaboratively engage with the perceived environment while accounting for imperfect perception, and to respond to unanticipated events that could otherwise impede task completion.
In particular, this project will develop multi-robot planning algorithms that, given a high-level task, jointly design (i) cooperative manipulation plans orchestrating interaction with the perceived environment to facilitate task completion, and (ii) motion plans to complete the assigned tasks. To ensure safe robot-environment interaction, the developed planners will also be integrated with active perception techniques that mitigate the geometric and semantic environmental uncertainty arising from imperfect perception.
These research results will be grounded through extensive evaluations on both physical robotic testbeds and photorealistic simulators, with particular emphasis on autonomous delivery and search-and-rescue applications.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.