| Funder | National Science Foundation (US) |
|---|---|
| Recipient Organization | Georgia Tech Research Corporation |
| Country | United States |
| Start Date | Jun 15, 2022 |
| End Date | Apr 30, 2023 |
| Duration | 319 days |
| Number of Grantees | 1 |
| Roles | Principal Investigator |
| Data Source | National Science Foundation (US) |
| Grant ID | 2142959 |
This award is funded in part under the American Rescue Plan Act of 2021 (Public Law 117-2).
Entrainment is a process in which people’s natural brain and body rhythms synchronize, through stimuli such as music, which may create feelings of connection and well-being. This project addresses entrainment by building multimodal signal mapping interfaces that mediate interpersonal connections by deriving music from brain and body rhythms. The investigator will integrate sensor hardware and signal processing software to stream live brain and body data, perform calculations to extract signal characteristics, and use this to drive sound synthesis.
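The pipeline described above (stream live physiological data, extract signal characteristics, drive sound synthesis) can be illustrated with a minimal sketch. The mapping below, from heartbeat intervals to musical pitch, is an assumption for illustration only, not the project's actual implementation; the function names and parameter ranges are invented for this example.

```python
# Minimal sonification sketch (illustrative only): map a stream of
# heartbeat (RR) intervals, in seconds, to MIDI note numbers.
# The linear bpm-to-pitch mapping is an assumed example mapping.

def rr_to_bpm(rr_seconds):
    """Convert an RR interval to instantaneous heart rate (beats/min)."""
    return 60.0 / rr_seconds

def bpm_to_midi_note(bpm, low_bpm=50, high_bpm=120, low_note=48, high_note=72):
    """Linearly map heart rate onto a MIDI note range (C3 to C5)."""
    t = max(0.0, min(1.0, (bpm - low_bpm) / (high_bpm - low_bpm)))
    return round(low_note + t * (high_note - low_note))

# Example: a slowing heartbeat drifts the generated melody downward.
rr_stream = [0.60, 0.70, 0.85, 1.00]   # seconds between beats
notes = [bpm_to_midi_note(rr_to_bpm(rr)) for rr in rr_stream]
print(notes)  # [65, 60, 55, 51]
```

In a real system these note numbers would feed a synthesizer in real time; here they simply show how a physiological rhythm can parameterize music.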
A series of music cognition and listening experiments will study the physiological, behavioral, and affective entrainment phenomena expected to result from a series of multimodal brain music interfaces. A use-case study, developed in consultation with doctors, connects mothers and infants who are physically separated by distance, using the multimodal entrainment interface.
Mother and infant hear music derived from each other’s heartbeats and breathing. This study investigates the entrainment created in their body rhythms, and maps health and well-being effects of the virtual connection environment. For researchers, doctors, and caretakers, multimodal brain music interfaces have the potential to expand our scientific understanding of music’s beneficial effects on the brain and body, which may lead to new health and well-being interventions for adults, children, and infants.
This project will result in an open-source tool kit of accessible technologies and STEM learning modules to inspire educators and students to develop projects that further our understanding of brain and body signals. These learning modules will be integrated into a summer research experience--involving high school students and their teachers--in which authentic learning encourages students’ training in the scientific method through their natural interest in music.
This project develops and evaluates an interface with new multimodal signal mapping technologies that translate neurophysiological signals (e.g., EEG, ECG, EDA, respiration) into musical sound to promote biological, behavioral, and affective synchrony between individuals and computers by: (1) engineering sonification techniques that perform real-time signal processing and algorithmic music generation for transforming physiological signals into music; (2) investigating the neuropsychological mechanisms that govern auditory neurostimulation and physiological entrainment by designing new rhythmic auditory neurophysiological sonification stimuli and measuring how the human body responds; and (3) designing and evaluating a use case that involves co-generating music for infants and their mothers with each other’s physiological data. Quantitative data will address synchronies in physiology, protocol analysis of video will address behavioral synchronies, and qualitative data will address experiences.
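The abstract notes that quantitative data will address synchronies in physiology. One simple way such synchrony could be quantified, shown purely as an assumed example and not the study's actual analysis, is the Pearson correlation between two aligned heart-rate series:

```python
# Illustrative physiological-synchrony measure (an assumption, not the
# project's stated method): Pearson correlation between two heart-rate
# series sampled at the same instants, e.g., mother and infant.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

mother = [72, 74, 76, 75, 73]     # hypothetical heart rates (bpm)
infant = [120, 123, 126, 125, 121]
print(round(pearson(mother, infant), 3))  # 0.992
```

Values near 1 would indicate that the two rhythms rise and fall together; real analyses would likely use windowed or lagged measures to capture time-varying entrainment.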
These research activities will contribute to an overarching goal of discovering how using computing to pair music and physiology can function as a significant information channel in human-centered computing. One expected use of this channel is to promote human connection and well-being through entrainment.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.