Learned associations between stimuli in different sensory modalities can shape the way we perceive these stimuli. However, it is not well understood how these interactions are mediated or at what level of the processing hierarchy they occur. Here we describe a neural mechanism by which an auditory input can shape visual representations of behaviorally relevant stimuli through direct interactions between auditory and visual cortices in mice. We show that the association of an auditory stimulus with a visual stimulus in a behaviorally relevant context leads to experience-dependent suppression of visual responses in primary visual cortex (V1). Auditory cortex axons carry a mixture of auditory and retinotopically matched visual input to V1, and optogenetic stimulation of these axons selectively suppresses V1 neurons that are responsive to the associated visual stimulus after, but not before, learning. Our results suggest that cross-modal associations can be communicated by long-range cortical connections and that, with learning, these cross-modal connections function to suppress responses to predictable input.

Although experience-dependent, cross-modal phenomena between auditory and visual perception, such as the McGurk effect1, have long been recognized, the neural circuit mechanisms responsible for such interactions have remained elusive. Here we probe the function of the direct interactions between auditory and visual cortices on processing of visual stimuli. During audio-visual associative learning, auditory cortex (AuC) is thought to underlie multi-modal plasticity in visual cortex2,3,4. Auditory input is known to influence neural activity in V1 (refs. 5,6,7,8), and some of these cross-modal responses are thought to be driven by direct projections from AuC that target local inhibitory circuits in V1 (ref. ). Although the computational role of these interactions remains unclear, we hypothesized that long-range cortical connections are shaped by experience and function to communicate memories of stimulus associations. Specifically, we investigated whether the utility of such cross-modal interactions could be to compute a comparison between expected, or predictable, and actual sensory experience. To do this, we used an audio-visual associative conditioning paradigm and quantified how cross-modal interactions shape neural responses in V1 over the course of learning.

Mice explored a virtual environment in which they were exposed to sequentially paired presentations of auditory and visual stimuli. A virtual environment was used to enable simultaneous head-fixed optical physiology and experimental control of both visual and auditory input. Over the course of five conditioning sessions (approximately 45 min each on five consecutive days), mice were presented with pairings of a 1-s auditory cue (A) followed by a 1-s visual stimulus (V) (Fig. ). For each mouse, two pairs of an auditory cue and a visual stimulus were presented throughout conditioning (AaVa and AbVb). The specific identities of the stimuli used were counterbalanced across mice. To quantify the responses to the visual stimuli without a preceding auditory cue, we occasionally presented the normally cued visual stimuli alone (Va and Vb) and also presented a control visual stimulus (Vc) that was never paired with an auditory cue. On day 5 of the conditioning paradigm, on a subset of trials, we additionally probed responses to an auditory cue and visual stimulus pairing that the mouse had previously not experienced (AbVa).

a, Schematic representation of the VR setup. Over the course of five conditioning days, mice were exposed to auditory-cued visual stimuli (AaVa and AbVb) that were reinforced, to the visual stimuli alone (Va and Vb) with no reinforcement, and to a control visual stimulus (Vc) that was never paired with an auditory stimulus or reinforced. On day 5, mice were additionally exposed to a previously unexperienced audio-visual stimulus pair (AbVa). All presentations were randomized with an inter-stimulus interval of between 4 s and 12 s (Methods). c, Average population responses of L2/3 V1 neurons for cued (AaVa, blue) and un-cued (Va, gray) visual stimulus presentations on day 1 (top) and day 4 (bottom) of conditioning. Traces and shading indicate mean ± s.e.m.
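The conditioning schedule described above can be sketched as a simple trial generator. This is an illustrative sketch only, not the authors' code: the function name, the fraction of un-cued probe trials, and the day-5 probe rate are assumptions; the stimulus identities, the 1-s cue and stimulus durations, and the randomized 4–12 s inter-stimulus interval come from the text.

```python
import random

# Illustrative sketch of the audio-visual conditioning schedule (assumed
# structure; probe-trial probabilities are placeholders, not from the paper).
CUED = ["AaVa", "AbVb"]     # reinforced auditory-cue/visual-stimulus pairings
PROBE = ["Va", "Vb", "Vc"]  # visual stimuli presented without an auditory cue

def make_session(n_trials, day, rng=random):
    """Generate one conditioning session as a list of trial dictionaries."""
    trials = []
    for _ in range(n_trials):
        # Most trials are cued pairings; un-cued probes appear occasionally.
        if rng.random() < 0.8:
            stim = rng.choice(CUED)
        else:
            stim = rng.choice(PROBE)
        # On day 5 only, a previously unexperienced pairing (AbVa) is
        # probed on a subset of trials.
        if day == 5 and rng.random() < 0.05:
            stim = "AbVa"
        isi = rng.uniform(4.0, 12.0)  # randomized inter-stimulus interval, 4-12 s
        trials.append({"stimulus": stim, "isi_s": isi,
                       "cue_s": 1.0, "visual_s": 1.0})
    return trials
```

Randomizing the presentation order and the inter-stimulus interval, as in this sketch, prevents the animal from timing the stimulus onset rather than learning the auditory-visual association.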