
Status Completed · Other Research-Related · NIH (US)

Neural and Computational Architecture for Complex Navigation and Subjective Self-Location

$1.27M USD

Funder NATIONAL INSTITUTE OF NEUROLOGICAL DISORDERS AND STROKE
Recipient Organization New York University
Country United States
Start Date May 01, 2023
End Date Apr 30, 2025
Duration 730 days
Number of Grantees 1
Roles Principal Investigator
Data Source NIH (US)
Grant ID 10664227
Grant Description

Project Summary

Candidate and Career Goals: I intend to become an independent scientist at a research-first academic institution, bridging across levels of description (i.e., from computations to neurons) and furthering our understanding of how the brain infers the world around us, as well as ourselves within it. I trained as a cognitive scientist (winning the Glushko Prize for the best dissertation in Cognitive Science worldwide), with a focus on understanding our sense of self-location: where am “I” located in space? I am now training in systems neuroscience, developing expertise in large-scale rodent neurophysiology with a focus on the dynamic aspect of self-location: spatial navigation. These experiences complement each other, spanning from behavioral computations to single units, and from static to dynamic self-location.

Environment and Career Development Plan: I am mentored by Dr. Dora Angelaki (NYU, expertise in navigation) and co-mentored by Drs. David Schneider (NYU, rodent self-generated actions) and Cristina Savin (NYU, data science). Further, I am a scientist member of the International Brain Lab, giving me the opportunity to leverage world-class expertise (22 labs) in rodent neurophysiology. My training during the K99 will focus on model-based analyses of behavior and neurons during continuous, complex naturalistic tasks, as well as on lab management skills (i.e., personnel, grant-writing, communication).

Research Plan: Spatial navigation is central to adaptive behavior, underlying our ability to trade off exploiting our current location against exploring novel ones. Beautiful work has detailed a number of spatial codes (e.g., place and grid cells) in the hippocampal formation, yet we (1) lack a normative framework accounting for the complexities of natural navigation, (2) do not understand how spatial codes from the hippocampal formation interact with cortex, and (3) have focused on how we build internal models of the world around us while neglecting their starting point – ourselves. During the K99 phase of the award, I will develop a naturalistic navigation task in virtual reality in which rodents must disambiguate complex signals. The animals will be trained to integrate velocity signals derived from motion across their retina (i.e., optic flow) into a position estimate, in order to path-integrate to the location of a latent target. They will then be tested in a novel situation, one in which optic flow may be caused by self- and/or target-motion. I expect the animals to behave in line with Bayesian Causal Inference (BCI) – a canonical computation wherein estimation biases emerge during small, but not large, signal disparities (i.e., when observers operate under an incorrect internal model). Further, I will broadly map neural activity throughout the rodent brain during BCI by leveraging novel large-scale neurophysiology techniques. During the R00 phase of the award, I will directly manipulate the subjective sense of self-location – the initial condition for navigation – and measure this phenomenology in rodents via the task developed during the K99. Beyond establishing BCI as a fundamental computation guiding naturalistic navigation, I expect the proposed project to inform next-generation therapeutics for disorders of inference, such as autism and schizophrenia.
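The BCI computation invoked in the research plan can be illustrated with a minimal sketch of the standard model-averaging formulation (after Körding et al., 2007): an observer receives two noisy cues, computes the posterior probability that they share a common cause, and blends the fused and segregated estimates accordingly. This is a generic illustration, not the grant's actual model; all parameter values (`sigma1`, `sigma2`, `sigma_p`, `p_common`) are illustrative assumptions.

```python
import math

def bci_estimate(x1, x2, sigma1=1.0, sigma2=1.0, sigma_p=20.0, p_common=0.5):
    """Bayesian causal inference for two noisy cues (model-averaging variant).

    Returns (estimate of source 1, posterior probability of a common cause).
    Cues might be, e.g., optic-flow velocity attributed to self- vs. target-motion.
    Parameter values are illustrative, not fitted to any data.
    """
    v1, v2, vp = sigma1 ** 2, sigma2 ** 2, sigma_p ** 2  # variances

    # Likelihood of the pair (x1, x2) under a single common cause (C = 1)
    var_c1 = v1 * v2 + v1 * vp + v2 * vp
    like_c1 = math.exp(-0.5 * ((x1 - x2) ** 2 * vp + x1 ** 2 * v2 + x2 ** 2 * v1)
                       / var_c1) / (2 * math.pi * math.sqrt(var_c1))

    # Likelihood under two independent causes (C = 2)
    like_c2 = math.exp(-0.5 * (x1 ** 2 / (v1 + vp) + x2 ** 2 / (v2 + vp))) \
        / (2 * math.pi * math.sqrt((v1 + vp) * (v2 + vp)))

    # Posterior probability that both cues share a common cause
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Optimal estimates under each causal structure (precision-weighted, prior mean 0)
    s_fused = (x1 / v1 + x2 / v2) / (1 / v1 + 1 / v2 + 1 / vp)
    s_alone = (x1 / v1) / (1 / v1 + 1 / vp)

    # Model averaging: blend the two estimates by the posterior over causes
    return post_c1 * s_fused + (1 - post_c1) * s_alone, post_c1
```

Running the sketch reproduces the signature behavior the summary describes: at a small cue disparity the common-cause posterior stays high and the estimate is biased toward the other cue, while at a large disparity the posterior collapses and the bias (per unit disparity) largely disappears.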

All Grantees

New York University
