RESEARCH.
Perceptual inference, Learning & Attention
Adaptive behaviour in a complex, dynamic and multisensory world poses some of the most fundamental computational challenges for the brain, notably inference, probabilistic/statistical computations, decision making, learning, binding and attention. To define the underlying computations and neural mechanisms in typical and atypical (e.g. neuropsychiatric disorder) populations, my lab combines behavioural methods, computational modelling (Bayesian, neural network) and neuroimaging (fMRI, MEG, EEG, TMS).
Current research revolves around three research themes:
Making Sense of the Senses in a complex dynamic Multisensory World
Our research investigates how the brain processes information from multiple senses in complex, real-world environments. Consider the challenges of crossing a busy street: the roar of traffic, flashing traffic lights, the smell of fumes and the sight of your friend. When deciding whether to cross the street, your brain must seamlessly integrate information across the senses, like judging a vehicle's speed from its engine noise and visual motion, while avoiding illusory associations, such as perceiving the truck as talking and your friend as flashing and smelly.
Multisensory perception thus relies on solving the binding or causal inference problem: determining which signals arise from common causes and should be integrated, and which should be processed independently. In the past we have unravelled how the brain solves this binding problem in simple lab scenes by accumulating evidence across the cortical hierarchy near-optimally (i.e. consistent with Bayesian principles). However, optimal perceptual inference rapidly becomes computationally intractable for all but the simplest laboratory scenes.
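To illustrate the core of the causal inference computation, the sketch below implements the standard Bayesian causal inference model for two spatial cues (in the style of Körding et al. 2007, which the hierarchical models above build on): the observer compares the likelihood of the two noisy sensory estimates under a common-cause versus an independent-causes hypothesis. This is a minimal textbook sketch for illustration, not the lab's actual model code; all parameter names are illustrative.

```python
import numpy as np

def causal_inference_posterior(x_a, x_v, sigma_a, sigma_v, sigma_p, p_common):
    """Posterior probability of a common cause C=1, given noisy auditory
    and visual location estimates x_a, x_v (Gaussian noise, zero-centred
    Gaussian spatial prior with s.d. sigma_p)."""
    var_a, var_v, var_p = sigma_a**2, sigma_v**2, sigma_p**2

    # Likelihood under a common cause: integrate out the shared source location
    var_c1 = var_a * var_v + var_a * var_p + var_v * var_p
    like_c1 = np.exp(-((x_a - x_v)**2 * var_p + x_a**2 * var_v + x_v**2 * var_a)
                     / (2 * var_c1)) / (2 * np.pi * np.sqrt(var_c1))

    # Likelihood under independent causes: each signal has its own source
    like_c2 = (np.exp(-x_a**2 / (2 * (var_a + var_p)))
               / np.sqrt(2 * np.pi * (var_a + var_p))
               * np.exp(-x_v**2 / (2 * (var_v + var_p)))
               / np.sqrt(2 * np.pi * (var_v + var_p)))

    # Bayes' rule over the causal structure
    post_c1 = like_c1 * p_common
    post_c2 = like_c2 * (1 - p_common)
    return post_c1 / (post_c1 + post_c2)
```

With coincident signals the posterior favours a common cause; with a large audiovisual disparity it favours independent causes, so integration gives way to segregation.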
Currently, we ask how the brain accomplishes this feat in complex naturalistic scenarios, for instance at a cocktail party with multiple speakers. In the face of the brain's limited computational resources, we hypothesize that attentional mechanisms are critical to approximate these optimal solutions in progressively more complex scenarios.
We combine statistical, computational (Bayesian, neural network), behavioural and neuroimaging (3T and 7T fMRI, MEG/EEG, TMS) methods to determine how, and how well, the brain solves the causal inference problem in progressively richer multisensory environments.
By bringing lab research closer to the real world, we will shift the focus from near-optimal passive perception in simple scenes to active information gathering in more realistic scenes. This approach has the potential to drive transformative insights into the perceptual difficulties that older and clinical populations (e.g. autism-spectrum disorder, schizophrenia) face in the real world.
Aller M, Noppeney U (2019) To integrate or not to integrate: Temporal dynamics of hierarchical Bayesian Causal Inference. PLOS Biology 17(4):e3000210.
Jones SA, Noppeney U (2024) Audiovisual integration is preserved in older adults across the cortical hierarchy. PLOS Biology 22(2):e3002494. doi: 10.1371/journal.pbio.3002494.
Mihalik A, Noppeney U (2020) Causal inference in multisensory perception. Journal of Neuroscience 40(34):6600-6612.
Rohe T, Ehlis AC, Noppeney U (2019) The neural dynamics of hierarchical Bayesian causal inference in multisensory perception. Nature Communications 10(1):1907.
Rohe T, Hesse K, Ehlis AC, Noppeney U (2024) Computations and neural dynamics of audiovisual causal and perceptual inference in schizophrenia. PLOS Biology https://doi.org/10.1101/2023.08.06.550662
Spatiotemporal model of Multisensory Integration in cortical and subcortical circuitries
Accumulating evidence has shown that multisensory influences are nearly ubiquitous, occurring as early as primary sensory areas in the human neocortex as well as in subcortical circuits. This pervasiveness of multisensory interactions requires us to revise our current theories of sensory cortical organization. It raises important questions about the computational roles, functional contributions and behavioural relevance of multisensory interactions across the cortical hierarchy and in subcortical circuits.
We characterize the spatiotemporal evolution of multisensory interactions across the cortical hierarchy with MEG/EEG and fMRI. Resolving multisensory and attentional influences across cortical depth with submillimetre fMRI at 7T provides insights into the underlying circuitries and input-output computations.
Gau R, Bazin PL, Trampel R, Turner R, Noppeney U (2020) Resolving multisensory and attentional influences across cortical depth in sensory cortices. eLife e46856.
Adaptive behaviour in a dynamic multisensory environment
Adapting dynamically to changes in the world around us and in the sensorium of the body is a fundamental challenge facing the brain throughout the lifespan. Critically, changes can evolve across multiple timescales. Adaptive behaviour in the face of rapid contextual changes requires the brain to flexibly adjust its expectations, as formalized by Bayesian priors. Other changes in our sensorium are more subtle and evolve slowly, such as the inter-aural or inter-ocular separation as a result of physical growth during neurodevelopment. These changes continuously alter the sensory cues that guide the construction of neural representations, for instance of the space around us. Finally, the brain faces situations where it needs to learn a novel mapping between sensory cues, such as with neural prostheses, sensory substitution devices or human-machine interfaces.
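The idea of flexibly adjusting expectations can be made concrete with a toy example of how a Gaussian prior over, say, stimulus location could be updated from recent sensory history. This is a simplified conjugate-Bayes sketch for illustration only, not the lab's actual model; the function and parameter names are hypothetical.

```python
import numpy as np

def update_gaussian_prior(mu_prior, var_prior, samples, var_noise):
    """Conjugate update of a Gaussian prior from a batch of noisy
    observations with known observation noise variance."""
    n = len(samples)
    # Posterior precision is the sum of prior and data precisions
    var_post = 1.0 / (1.0 / var_prior + n / var_noise)
    # Posterior mean is the precision-weighted combination of prior
    # mean and the observed evidence
    mu_post = var_post * (mu_prior / var_prior + np.sum(samples) / var_noise)
    return mu_post, var_post

# Example: a prior centred at 0 shifts towards recently observed
# locations and sharpens as evidence accumulates
mu, var = update_gaussian_prior(0.0, 4.0, [2.0, 2.5, 1.5], 1.0)
```

After the update the prior mean lies between its old value and the recent observations, and its variance shrinks; repeated over time, this captures how expectations can track contextual change.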
We combine computational modelling and neuroimaging to characterize how the brain adapts dynamically to a changing multisensory world across multiple timescales.
Aller M, Mihalik A, Noppeney U (2022) Audiovisual adaptation is expressed in spatial and decisional codes. Nature Communications 13(1):3924. doi: 10.1038/s41467-022-31549-0.
Beierholm U, Rohe T, Ferrari A, Stegle O, Noppeney U (2020) Using the past to estimate sensory uncertainty. eLife e54172. doi: 10.7554/eLife.54172.