| Funder | National Science Foundation (US) |
|---|---|
| Recipient Organization | Duke University |
| Country | United States |
| Start Date | Oct 01, 2021 |
| End Date | Sep 30, 2024 |
| Duration | 1,095 days |
| Number of Grantees | 1 |
| Roles | Principal Investigator |
| Data Source | National Science Foundation (US) |
| Grant ID | 2120333 |
This project will create a research infrastructure for computer vision and real-time control of autonomous mobile robots (both aerial and ground). The infrastructure includes four integrated components: (1) A Purdue laboratory decorated as a miniature city. (2) Simulators that mirror the physical laboratory. (3) Programmable aerial robots that share the same interface as the simulators. (4) Sample solutions for research on artificial intelligence, computer vision, and robot control, for evaluation and comparison.
This infrastructure will be available to the research community in multiple ways: (1) Users can evaluate their solutions with the simulators in a safe virtual environment. (2) Users can upload their control programs, and the project team will launch the robots inside Purdue's laboratory; users can observe the robots remotely using the high-speed cameras already deployed there. (3) Users can bring their own robots to the laboratory and conduct experiments. (4) This project will create competitions for researchers to demonstrate their solutions using autonomous mobile robots in simulated emergency and rescue scenarios.
The competitions will use miniature buildings and people; the robots must recognize and count objects (such as people, vehicles, and houses) and assess situations (such as the number of collapsed bridges), all while avoiding obstacles.
This infrastructure will support investigation of a wide range of research topics: (1) Real-time computer vision and control. The decorated laboratory will allow researchers to evaluate real-time vision and control methods using active computer vision, navigation, and semantic segmentation in a three-dimensional environment. (2) Simulation of robot fleets. Users can evaluate and improve their methods in a safe virtual environment before deployment. (3) Integration of virtual and physical environments. Solutions running in the simulators can be ported directly to the physical robots for experiments. (4) Collision avoidance, multi-robot coordination, emergency response, computer security, and efficient machine learning on embedded systems. (5) Applications in agriculture, city planning, emergency response, and inspection of civil structures. This project will also help build STEM talent, because autonomous robots and visual data naturally appeal to the general public.
With the simulators, students at all levels can participate without the cost of purchasing physical robots. This research infrastructure will reduce the barriers to innovation. It will also encourage energy-efficient machine learning methods that can be ported to resource-constrained embedded systems such as aerial robots.
The project will also engage a broader audience, including K-12 students, through the many applications described above.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.