
NACS - Yaoda Xu / Understanding visual representations in human occipito-temporal and posterior parietal cortices and convolutional neural networks


Friday, November 3, 2023, 10:10 am – 11:15 am, Bioscience Research Building, Room 1103

On Friday, November 3, Yale's Yaoda Xu visits the NACS colloquium to discuss her work on the neuroscience of visual representation.


Understanding visual representations in human occipito-temporal and posterior parietal cortices and convolutional neural networks

In recent studies, we have shown that non-spatial visual information, including the contents of visual working memory, may be directly represented in the human posterior parietal cortex (PPC). In this talk, I will compare visual representations in the human PPC with those in the human occipito-temporal cortex and describe how, together, they provide a stable representation of our visual environment while allowing us to interact with the external world flexibly and efficiently. Using methods developed in neuroscience, I will also examine the nature of visual representations in convolutional neural networks trained to perform object recognition.
