Characterizing and manipulating multisensory perceptual networks
Abstract: Sensory brain areas traditionally thought to be dedicated to a single modality can exhibit multimodal responses. These responses may be evidence for crossmodal recruitment (i.e., sensory processing for inputs in a non-primary modality); however, the direct contribution of this activity to perception is unclear. Moreover, our understanding of how sensory information routes through distributed multisensory perceptual networks is incomplete.
My talk will focus on temporal frequency processing and the functional relationship between audition and touch. I will summarize results from human psychophysical experiments that reveal highly specific perceptual interactions between touch and audition. Reciprocal audio-tactile interactions in the frequency domain suggest convergent and shared pitch representations. I will then present results from non-invasive brain stimulation experiments in humans that reveal clear crossmodal contributions of auditory cortex to tactile frequency perception.
Together, these results provide evidence for a supramodal brain organization scheme in which sensory areas process multiple modalities and collaborate in supramodal perceptual networks. I will finish my talk by describing novel methods for dissecting functional networks in humans using combined brain stimulation and neuroimaging.
About the Speaker
Jeff Yau received his B.S. in Psychology from the University of North Carolina at Chapel Hill. He received his Ph.D. in Neuroscience from Johns Hopkins University and completed a postdoctoral fellowship in the Neurology Department of the Johns Hopkins Medical Institutions. He has been an Assistant Professor in the Neuroscience Department at Baylor College of Medicine since March 2014.