
Seeing what you hear: Compression of rat visual perceptual space by task-irrelevant sounds / Zanzi, Mattia; Rinaldi, Francesco G.; Fornasaro, Silene; Piasini, Eugenio; Zoccolan, Davide. - In: PLOS COMPUTATIONAL BIOLOGY. - ISSN 1553-7358. - 21:10(2025). [10.1371/journal.pcbi.1013608]

Seeing what you hear: Compression of rat visual perceptual space by task-irrelevant sounds

Zanzi, Mattia
Investigation
;
Rinaldi, Francesco G.
Formal Analysis
;
Fornasaro, Silene
Collaboration group member
;
Piasini, Eugenio
Supervision
;
Zoccolan, Davide
Project Administration
2025-01-01

Abstract

The brain combines information from multiple sensory modalities to build a consistent representation of the world. The principles by which multimodal stimuli are integrated in cortical hierarchies are well studied, but it is less clear whether and how unimodal inputs shape the processing of signals carried by a different modality. In rodents, for instance, direct connections from primary auditory cortex reach visual cortex, but studies disagree on the impact of these projections on visual cortical processing. Both enhancement and suppression of visually evoked responses by auditory inputs have been reported, as well as sharpening of orientation tuning and improvement in the coding of visual information. Little is known, however, about the functional impact of auditory signals on rodent visual perception. Here we trained a group of rats in a visual temporal frequency (TF) classification task, where the visual stimuli to categorize were paired with simultaneous but task-irrelevant auditory stimuli, to prevent high-level multisensory integration and investigate instead the spontaneous, direct impact of auditory signals on the perception of visual stimuli. Rat classification of visual TF was strongly and systematically altered by the presence of sounds, in a way that was determined by sound intensity but not by its temporal modulation. To investigate the mechanisms underlying this phenomenon, we developed a Bayesian ideal observer model, combined with a neural coding scheme where neurons linearly encode visual TF but are inhibited by concomitant sounds by an amount that depends on sound intensity. This model precisely captured the full spectrum of rat perceptual choices we observed, supporting the hypothesis that auditory inputs induce an effective compression of the visual perceptual space. This suggests an important role for inhibition as the key mediator of auditory-visual interactions and provides clear, mechanistic hypotheses to be tested by future work on visual cortical codes.
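The coding scheme described in the abstract can be illustrated with a minimal simulation. This is only a sketch of the general idea (linear TF encoding, subtractive sound-driven inhibition, a sound-blind Bayesian decoder), not the paper's actual model; all function names and parameter values here are hypothetical.

```python
import numpy as np

# Illustrative sketch, not the published model: a single scalar "neuron"
# encodes visual temporal frequency (TF) linearly, and a concurrent sound
# subtracts an intensity-dependent amount of drive. All values are made up.
rng = np.random.default_rng(0)

def population_response(tf, sound_level, gain=1.0, inhibition=0.4,
                        noise_sd=1.0, n_trials=1000):
    """Noisy linear code for TF, suppressed in proportion to sound level."""
    mean = gain * tf - inhibition * sound_level
    return mean + noise_sd * rng.normal(size=n_trials)

def classify_high_tf(responses, boundary_tf=6.0, gain=1.0):
    """Sound-blind ideal observer: inverts the sound-free encoding model
    (flat prior + Gaussian noise gives TF estimate = response / gain) and
    compares the estimate with a fixed category boundary."""
    tf_estimate = responses / gain
    return tf_estimate > boundary_tf

# A loud sound lowers the neural drive, so the same physical TF is decoded
# as a lower TF -- the perceptual space is effectively compressed downward.
quiet = classify_high_tf(population_response(10.0, sound_level=0.0)).mean()
loud = classify_high_tf(population_response(10.0, sound_level=5.0)).mean()
```

Because the observer does not know about the sound-driven suppression, it misattributes the reduced drive to a lower stimulus TF, which is one simple way to produce the compression of the visual perceptual space the abstract describes.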
2025
21
10
e1013608
10.1371/journal.pcbi.1013608
Zanzi, Mattia; Rinaldi, Francesco G.; Fornasaro, Silene; Piasini, Eugenio; Zoccolan, Davide
Files in this item:
File  Size  Format
journal.pcbi.1013608 (1).pdf

open access

Description: publisher's PDF
Type: Publisher's Version (PDF)
License: Creative Commons
Size: 3.12 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11767/149790
Citations
  • PMC: 1
  • Scopus: 0
  • Web of Science: 0