When the ventral visual stream is not enough: A deep learning account of medial temporal lobe involvement in perception
Animals seamlessly integrate incoming sensory information with previously encountered, behaviorally relevant experience. Neuroanatomical structures within the medial temporal lobe, such as perirhinal cortex (PRC), are known to enable these memory-related behaviors. Yet there is an enduring debate over PRC involvement in perception, beset by decades of seemingly inconsistent experimental outcomes. Here we formalize two competing theories of PRC involvement in visual object perception, situating behavior in relation to the performance supported by the primate ventral visual stream (VVS). In lieu of direct neural recordings, we use a computational proxy for the visual system to estimate VVS-supported performance directly from experimental stimuli. This enables us to situate decades of lesion, electrophysiological, and behavioral results within a shared computational framework. Perhaps surprisingly, this approach offers a coherent account of PRC involvement in concurrent visual discrimination tasks. In this talk, I'll present this work not only as an opportunity to better understand PRC involvement in perception, but as a case study in how we evaluate competing theories of neural and cognitive function.