NACS - Yaoda Xu / Understanding visual representations in human occipito-temporal and posterior parietal cortices and convolutional neural networks
![Close portrait of a woman, smiling.](/sites/default/files/2023-10/yaodaxu.jpeg)
On Friday, November 3, Yale's Yaoda Xu will present at the NACS colloquium, discussing her work on the neuroscience of visual representation.
Understanding visual representations in human occipito-temporal and posterior parietal cortices and convolutional neural networks
In recent studies, we have shown that non-spatial visual information, including the content of visual working memory, may be directly represented in the human posterior parietal cortex (PPC). In this talk, I will compare visual representations in the human PPC with those in the human occipito-temporal cortex and describe how, together, they both provide us with a stable representation of our visual environment and allow us to interact with the external world flexibly and efficiently. Using methods developed in neuroscience, I will also examine the nature of visual representations in convolutional neural networks trained to perform object recognition.