| Field | Value |
|---|---|
| Funder | National Science Foundation (US) |
| Recipient Organization | Duke University |
| Country | United States |
| Start Date | Nov 01, 2023 |
| End Date | Oct 31, 2026 |
| Duration | 1,095 days |
| Number of Grantees | 1 |
| Roles | Principal Investigator |
| Data Source | National Science Foundation (US) |
| Grant ID | 2231975 |
Mobile augmented reality (AR), which integrates virtual objects with real environments, has shown outstanding potential in many areas, including retail, education, and healthcare. Recent progress in AR and machine learning has created opportunities to generate AR experiences well-matched to specific contexts by leveraging the outputs of machine-learning-based object-detection algorithms, which identify objects and their locations in the field of view of the AR device.
However, existing object-detection methods are brittle, often making mistakes due to variations in lighting, object positions, device capabilities, and users’ actions. This project's goal, then, is to develop more robust object-detection methods and AR techniques that use them, grounded in real-world use cases. The motivating scenario is settings where a facility administrator would like users to benefit from object-detection-integrated AR experiences over the course of months or years: for example, teachers using AR-enhanced learning in a classroom, museum curators using AR to enrich visitors’ experience, or managers of a construction site or a factory deploying AR-based safety guidance for workers.
The project team’s goal is to help administrators develop a variety of AR experiences with minimal workload, without restricting the facility’s appearance or contents. This project will enable a wide range of context-aware AR applications, such as AR-based safety guidance, accessibility assistance, and support for health and well-being.
The research will engage multiple diverse cohorts of undergraduate and high school students, both throughout the school year and in intensive, integrated summer research experiences. Interactive demonstrations developed as part of this research will be showcased at K-12-oriented events.
This project will enhance the reliability of AR object detectors on multiple dimensions, via the development of new AR-specific object-detection training approaches, performance monitoring techniques, input and output sanity-checking methods, and application interfaces. The work is divided into three thrusts. The first thrust will design and develop a robust AR object detection framework that will enhance the reliability of AR object detectors by adapting them to the conditions in a given location and by validating the correctness of AR object detectors’ inputs and outputs.
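To make the idea of validating a detector’s outputs concrete, here is a minimal sketch of what such a sanity check could look like. The `Detection` type, thresholds, and check rules are illustrative assumptions, not the project’s actual framework: it simply discards detections whose confidence is too low or whose bounding box is degenerate or falls outside the frame.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float        # detector score in [0, 1]
    box: tuple               # (x_min, y_min, x_max, y_max) in pixels

def sane_detections(detections, frame_w, frame_h, min_conf=0.5):
    """Keep only detections that pass basic output sanity checks."""
    kept = []
    for d in detections:
        x0, y0, x1, y1 = d.box
        if d.confidence < min_conf:
            continue  # reject low-confidence outputs
        if not (0 <= x0 < x1 <= frame_w and 0 <= y0 < y1 <= frame_h):
            continue  # reject degenerate or out-of-frame boxes
        kept.append(d)
    return kept
```

A real framework would add richer checks (e.g., temporal consistency across frames, or plausibility of object size for the scene), but the filtering pattern is the same.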
The second thrust will examine the performance of AR object-detection algorithms across large and diverse groups of users and across a set of diverse AR devices; the team will design and develop mechanisms for adapting object detectors to specific users and devices with limited labeled data. The third thrust will examine the performance of AR object detectors in naturally changing environments across long-term deployments.
This work will involve capturing a set of environments over a 12-month period, designing strategies for performance monitoring of AR object detectors, and developing approaches to maintain AR object detectors’ performance over time by future-proofing and retraining them.
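One simple proxy for the kind of long-term performance monitoring described above is to watch for drift in the detector’s own confidence scores relative to a calibration baseline. The sketch below is a hypothetical illustration under that assumption (class name, window size, and tolerance are invented); real monitoring would also use labeled spot checks.

```python
from collections import deque

class DriftMonitor:
    """Flag possible detector degradation when the mean confidence over a
    recent window drops well below a calibration-time baseline."""

    def __init__(self, baseline, window=100, tolerance=0.15):
        self.baseline = baseline        # mean confidence at deployment time
        self.tolerance = tolerance      # allowed drop before flagging
        self.recent = deque(maxlen=window)

    def observe(self, confidence):
        """Record the confidence of one detection."""
        self.recent.append(confidence)

    def drifted(self):
        """True once a full window of evidence shows a sustained drop."""
        if len(self.recent) < self.recent.maxlen:
            return False  # not enough observations yet
        mean = sum(self.recent) / len(self.recent)
        return mean < self.baseline - self.tolerance
```

When the monitor flags drift, the system could trigger the retraining or future-proofing steps the abstract mentions.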
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.