CAREER: Understanding Human Movement and Haptic Interaction toward Cognitive Augmentation Aids

Status: Active (Continuing Grant)
Award Amount: $3.26M USD
Funder: National Science Foundation (US)
Recipient Organization: Arizona State University
Country: United States
Start Date: Jul 01, 2022
End Date: Jun 30, 2027
Duration: 1,825 days
Number of Grantees: 1
Roles: Principal Investigator
Data Source: National Science Foundation (US)
Grant ID: 2142774

Grant Description

Humans interact with their immediate surroundings largely using their hands; from feeding oneself, to taking medication, to opening doors, humans manipulate many objects each day as part of the daily activities they wish to perform. Understanding what the hands are doing may provide insight into what activities are being performed and how they are being carried out.

The most direct, and potentially clearest, view of the hands may be obtained through cameras that provide a hand-centric view, that is, wrist-worn cameras pointed toward the hands and fingers. To date, such camera placements have seen little exploration, yet such a view has been demonstrated to capture the details of haptic interactions. Combined with advancements in computing, particularly machine learning and computer vision, such technologies may have the potential to become cognitive augmentation aids: assistive technologies to support attention, memory, and decision-making across a wide range of applications including assistive aids for seniors, which is the focus of this award.

The outcomes of this project will impact principal disciplines of computer science and engineering as well as provide transformative innovations in haptics, machine learning, computer vision, wearable computing, and gerontechnology, particularly assistive technologies for seniors who desire to age in place safely and independently for longer. Investigations into the hardware, software, algorithms, and utility of intelligent, hand-centric wrist wearables will produce impactful results, not only for gerontechnology, but for a wide variety of fields, such as neurorehabilitation, virtual/mixed reality technologies, and smart industrial applications.

The education activities undertaken as part of this project will include the development of a textbook on haptics and of a project-oriented haptics course.

The goal of this project is to develop methods and technologies for understanding haptic interactions and human activities during senior activities of daily living (ADLs) using hand-centric wrist wearables, fused with sensors such as video cameras and inertial measurement units. The research hypothesis is that a wrist wearable using a hand-centric camera placement will simplify automatic analysis, recognition, recall, and prediction of hand-object interactions and human activities during senior ADLs.
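The camera-plus-IMU fusion described above begins with time alignment of the two sensor streams. As an illustration only (the names, sample rates, and nearest-timestamp strategy here are assumptions, not details from the award), one common first step is matching each video frame to the IMU sample closest in time:

```python
# Hedged sketch: aligning IMU samples to wrist-camera frames by nearest
# timestamp. All names and rates are hypothetical illustrations of the
# kind of camera+IMU fusion the project proposes.
import bisect

def align_imu_to_frames(frame_ts, imu_ts, imu_samples):
    """For each video frame timestamp, return the IMU sample whose
    timestamp is closest. Both timestamp lists must be sorted."""
    aligned = []
    for t in frame_ts:
        i = bisect.bisect_left(imu_ts, t)
        # Consider the neighbors on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(imu_ts)]
        best = min(candidates, key=lambda j: abs(imu_ts[j] - t))
        aligned.append(imu_samples[best])
    return aligned

frames = [0.00, 0.033, 0.066]  # ~30 fps camera timestamps (seconds)
imu_t = [0.00, 0.01, 0.02, 0.03, 0.04, 0.05, 0.06, 0.07]  # 100 Hz IMU
imu_x = [0, 1, 2, 3, 4, 5, 6, 7]  # stand-in accelerometer readings
print(align_imu_to_frames(frames, imu_t, imu_x))  # → [0, 3, 7]
```

In practice such alignment would precede any learned fusion model; interpolation or windowed aggregation are common alternatives to nearest-sample matching.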

The activities of this project include determining the hardware design trade-offs for hand-centric wrist wearables and developing an annotated multi-camera hand-centric video dataset; determining algorithms and software for understanding haptic interactions and senior ADLs; and determining hardware and software integration toward the deployment of a prototype to investigate the impact and utility of intelligent, hand-centric wrist wearables for cognitive augmentation during senior ADLs. Hand-centric views for vision-based applications have seen little exploration, creating gaps in the knowledge related to the hardware, software, algorithms, and utility of this viewpoint to support cognitive tasks such as attention, memory, and decision-making.

While this project focuses on gerontechnology for seniors, findings may propagate to many other fields including occupational and physical rehabilitation and compliance monitoring for smart industrial applications. Specific contributions include: (i) a better understanding of the hardware design trade-offs that provide the most informative view of what the hands are doing; (ii) identification of the challenges of hand-centric views, and how to address them; (iii) development of a multi-view hand-centric video dataset; (iv) algorithms and software for robust and automated analysis, recognition, recall, and prediction of haptic interactions and senior ADLs, in which 3D CNNs (3-Dimensional Convolutional Neural Networks) will be developed to model and predict both hand-centric actions and the objects being manipulated, i.e., the haptic interactions of the hands; and (v) a better understanding of the impact and utility of intelligent, hand-centric wrist wearables on the ADLs of seniors living independently, and more broadly, on their quality of life.
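The core operation behind the 3D CNNs named in contribution (iv) is a convolution over both time and space. A minimal NumPy-only sketch (not the project's actual model, whose architecture is not specified in the award) shows how a single spatio-temporal kernel slides over a video clip:

```python
# Minimal sketch, assuming NumPy only: the 3D (spatio-temporal)
# convolution at the heart of a 3D CNN for video. This is an
# illustration, not the award's actual architecture.
import numpy as np

def conv3d(video, kernel):
    """Valid-mode 3D cross-correlation of a single-channel clip of
    shape (T, H, W) with a kernel of shape (kt, kh, kw)."""
    T, H, W = video.shape
    kt, kh, kw = kernel.shape
    out = np.zeros((T - kt + 1, H - kh + 1, W - kw + 1))
    for t in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[t, i, j] = np.sum(video[t:t+kt, i:i+kh, j:j+kw] * kernel)
    return out

rng = np.random.default_rng(0)
clip = rng.random((8, 16, 16))              # 8 frames of 16x16 "video"
feat = conv3d(clip, rng.random((3, 3, 3)))  # feature map, shape (6, 14, 14)
pooled = feat.mean()                        # global average pooling
print(feat.shape)  # → (6, 14, 14)
```

In a full model, many such kernels would be stacked in layers, and the pooled features could feed two output heads, one classifying the hand action and one classifying the manipulated object, matching the joint action/object prediction the description calls for.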

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

All Grantees

Arizona State University
