Active · Standard Grant · National Science Foundation (US)

CRCNS US-German Research Proposal: Inception loops for interpretable tuning in macaque area V4

$3.35M USD

Funder: National Science Foundation (US)
Recipient Organization: Stanford University
Country: United States
Start Date: Oct 01, 2024
End Date: May 31, 2026
Duration: 607 days
Number of Grantees: 1
Roles: Principal Investigator
Data Source: National Science Foundation (US)
Grant ID: 2510328
Grant Description

A long-standing hypothesis posits that increasingly complex, humanly interpretable features are represented along the hierarchy of visual areas in the brain. Moreover, vision in primates is an active process in which information about a scene is acquired through sequences of short eye fixations. Despite decades of study, the characterization of the response properties of neurons along the visual cortical hierarchy, and of the tuning dynamics associated with free viewing, is still far from complete.

Two major reasons are the non-linear nature of information processing in the brain and the high dimensionality of the visual input itself. To address these inherent challenges, the investigators will apply an innovative method called Inception Loops, which combines big data and artificial intelligence (AI) to study the dynamics of how information is represented along the cortex during visual perception.

This work will shed light on one of the greatest mysteries of life: the biological basis of perception and cognition. An algorithmic understanding of visual perception will have impact beyond neuroscience, informing the development of smarter AI with more humanlike capabilities. The team is committed to broadening participation in science through educational outreach focused on neuroscience and AI.

Outreach activities include public events, apprenticeships, research internships, and courses. The team will also engage in public discussions about brain research, AI, society and ethics.

Inception Loops, an innovative method developed by the investigators, combines multi-neuronal recordings with deep learning (DL) predictive models. This enables a systematic in silico characterization of neural tuning that can be verified in vivo. In Aim 1, the investigators will record neural responses from macaque area V4 to rendered naturalistic stimuli during fixation.

The responses of neurons will be modeled with DL models, using the images together with geometric features of the visual scene extracted from the rendering process. This will enable the investigators to systematically characterize the nonlinear tuning functions of the neurons, in particular single-cell invariances, in terms of interpretable geometric scene features such as slant, surface curvature, and object identity.
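The modeling step described here follows a general fit-then-interrogate pattern: a predictive model is fit to map stimulus features to recorded responses, and the fitted model is then characterized in silico. The sketch below is an illustration of that pattern only, not the project's actual method: it substitutes ridge regression and synthetic data for the deep models and macaque recordings, and every variable name is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 500 stimuli, 64 scene features (e.g. slant or
# curvature descriptors), 10 neurons. The real project fits deep
# networks to images and rendered scene features.
n_stim, n_feat, n_neuron = 500, 64, 10
X = rng.standard_normal((n_stim, n_feat))           # stimulus features
W_true = rng.standard_normal((n_feat, n_neuron))    # hidden "tuning"
Y = X @ W_true + 0.1 * rng.standard_normal((n_stim, n_neuron))  # responses

# Fit a linear tuning function by ridge regression (closed form).
lam = 1.0
W_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ Y)

# In silico characterization: which feature drives each model neuron most?
preferred_feature = np.argmax(np.abs(W_hat), axis=0)

# Check predictive performance on held-out synthetic stimuli.
X_test = rng.standard_normal((200, n_feat))
Y_test = X_test @ W_true
r = np.corrcoef(Y_test[:, 0], (X_test @ W_hat)[:, 0])[0, 1]
print(f"held-out correlation, neuron 0: {r:.3f}")
```

The point of the sketch is the loop itself: once a model predicts responses well on held-out stimuli, its tuning can be probed exhaustively in silico, and the resulting predictions verified back in vivo, which is the core idea behind Inception Loops.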

In Aim 2, the investigators will use DL models together with an experimental paradigm of interleaved fixation and free-viewing trials to study how the spatial receptive fields and tuning functions of V4 neurons are modulated by saccades and by salient features in natural scenes under natural viewing conditions. Saliency will be determined from eye movements in free-viewing experiments on the same images and modeled using a state-of-the-art DL saliency model.

They hypothesize that receptive fields are attracted towards salient features in an image even when saccades are not executed. Together, these two aims will yield a more comprehensive and interpretable understanding of the representation of latent geometric scene features in V4 and how they are modulated by salient features and saccades during free viewing.

A companion project is being funded by the Federal Ministry of Education and Research, Germany (BMBF).

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

All Grantees

Stanford University
