
Active · Non-SBIR/STTR RPGs · NIH (US)

EAT: A Reliable Eating Assessment Technology for Free-living Individuals.

$7.02M USD

Funder NATIONAL INSTITUTE OF DIABETES AND DIGESTIVE AND KIDNEY DISEASES
Recipient Organization Northwestern University At Chicago
Country United States
Start Date Aug 01, 2021
End Date Jul 31, 2026
Duration 1,825 days
Number of Grantees 1
Roles Principal Investigator
Data Source NIH (US)
Grant ID 10280789
Grant Description

Project Summary/Abstract

Overeating and unhealthy eating are associated with a range of health risks, including obesity, high blood pressure, and several chronic diseases.

To better understand overeating and unhealthy eating, researchers often rely on self-reports provided by individuals, and lifestyle-change suggestions are often based on observations from these self-reports. However, it is well known that self-reports can be erroneous and subject to reporting biases.

Thus, an objective way to measure eating activity and validate self-reports is necessary.

Recently, there has been growing interest in moving beyond self-reports and monitoring eating activity automatically.

To monitor eating automatically, and in real time, researchers have used sensor data from wrist-worn, neck-worn, or ear-worn devices to detect eating. These devices can capture when eating occurs.

However, these devices seldom capture images, limiting the possibility of visually confirming the foods consumed and their quantities.

With the increasing popularity of wearable cameras, it is gradually becoming possible to capture eating activities and their context automatically, without any user intervention. Advances in machine learning enable the automatic extraction of eating-related information from these captured images.

However, wearable cameras often capture more information than necessary, such as images of bystanders. Capturing this unnecessary information reduces participants' willingness to wear the camera.

Currently, no camera exists that can capture eating activity while limiting the capture of unnecessary information. Obfuscating the unnecessary information might increase participants' willingness to wear the camera.

However, it is unclear whether, and which, obfuscation techniques will increase participants' willingness to wear the camera while still permitting automatic context determination.

In this project, we will determine whether machine learning can detect eating in videos, and we will identify an obfuscation technique that allows eating to be detected without collecting unnecessary information.

To this end, we will first develop an activity-detection algorithm that detects eating using data from an infrared (IR) sensor array and RGB images.
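The grant does not specify how the IR and RGB modalities would be fused. As a minimal illustrative sketch only, the pattern might resemble the rule-based fusion below; the thermal threshold, the RGB-derived "hand-to-mouth" score, and all parameter values are assumptions for illustration, not details from the proposal.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    """One synchronized sample: IR sensor-array temperatures (degrees C)
    and a hypothetical RGB-derived hand-to-mouth proximity score in [0, 1]."""
    ir_temps: List[float]
    hand_to_mouth: float

def detect_eating(frames: List[Frame],
                  warm_threshold: float = 30.0,
                  proximity_threshold: float = 0.6,
                  min_consecutive: int = 3) -> List[bool]:
    """Label each frame as eating/non-eating by fusing both modalities.

    A frame is a candidate when the warmest IR pixel (e.g., a hand or warm
    food near the face) exceeds `warm_threshold` AND the RGB proximity score
    exceeds `proximity_threshold`. A run of at least `min_consecutive`
    candidate frames is kept as a detected eating episode; shorter runs are
    discarded as noise.
    """
    candidates = [max(f.ir_temps) >= warm_threshold and
                  f.hand_to_mouth >= proximity_threshold
                  for f in frames]
    labels = [False] * len(frames)
    run_start = None
    # Append a sentinel False so the final run is flushed.
    for i, is_candidate in enumerate(candidates + [False]):
        if is_candidate and run_start is None:
            run_start = i
        elif not is_candidate and run_start is not None:
            if i - run_start >= min_consecutive:
                for j in range(run_start, i):
                    labels[j] = True
            run_start = None
    return labels
```

In practice a learned model would replace the hand-tuned thresholds, but the fuse-then-smooth structure is a common shape for this kind of multimodal detection.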

Next, we will test various obfuscation methods in a cross-over trial and select the method with the greatest participant acceptability.
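The proposal does not name the candidate obfuscation methods. One commonly studied technique is pixelation, sketched below as an assumed example: coarse scene structure (a plate, a hand) survives, while fine detail such as faces or text does not.

```python
from typing import List

def pixelate(image: List[List[int]], block: int = 4) -> List[List[int]]:
    """Obfuscate a grayscale image (rows of 0-255 pixel values) by replacing
    each `block` x `block` tile with its mean intensity."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            # Collect the tile's pixels (tiles at the edges may be smaller).
            tile = [image[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            mean = sum(tile) // len(tile)
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = mean
    return out
```

Larger `block` values give stronger obfuscation at the cost of less recoverable context, which is exactly the acceptability-versus-utility trade-off a cross-over trial can measure.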

We will then deploy the eating-detection algorithm, with the best obfuscation approach, on a novel wearable camera equipped with an infrared sensor array. We will use this camera to test whether eating can be detected in a real-world setting. To validate the algorithm, we will ask participants to confirm or refute predicted eating and non-eating moments.

We will evaluate the algorithm objectively by comparing its output against both real-time user responses and 24-hour dietary recalls.

Our proposed system will improve current research practice for evaluating dietary intake and pave the way for personalized interventions in behavioral medicine.

All Grantees

Northwestern University At Chicago
