Research within the Predictive Brain Lab focuses on how our brain constructs perception and makes decisions. Perception is not a passive process of registering information (like a camera), but rather an active process of "hypothesis testing". As a consequence, cognitive processes constantly colour the information provided by our senses. We are particularly interested in how perception is shaped by what we expect to see (prediction) and by what we deem relevant (attention). Below you can find some of the questions we are currently working on.

How do expectation and attention change perception?

Unexpected events (a sudden flash of light) capture our attention, and we often pay attention to locations where we expect something relevant to happen. In other words, attention and prediction are closely intertwined; are they perhaps even the same thing? Our working hypothesis is that although prediction and attention both influence sensory representations, they are conceptually and neurally distinct: prediction increases the confidence in a particular perceptual hypothesis (or 'prior', e.g. "I will see a face") and silences competing hypotheses (e.g. "I will see a vase"), whereas attention boosts the sensory evidence (or 'likelihood'). As such, these factors can have opposite effects on neural activity.
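
This distinction can be illustrated with a toy Bayesian calculation (the numbers, and the implementation of attention as a multiplicative gain on the evidence, are illustrative assumptions, not a fitted model from our papers):

```python
import numpy as np

# Two perceptual hypotheses competing for the same input: ["face", "vase"].
# Likelihood of the sensory input under each hypothesis
# (hypothetical numbers; the input is slightly more face-like).
likelihood = np.array([0.6, 0.4])

def posterior(prior, gain):
    """Bayes' rule, with attention modelled as a gain on the sensory evidence."""
    evidence = likelihood ** gain        # attention boosts the likelihood
    unnorm = prior * evidence
    return unnorm / unnorm.sum()

flat_prior = np.array([0.5, 0.5])        # no prediction
face_prior = np.array([0.8, 0.2])        # prediction: "I will see a face"

p_unattended = posterior(face_prior, gain=1.0)  # prediction alone favours "face"
p_attended = posterior(face_prior, gain=3.0)    # attention amplifies the evidence
```

Prediction shifts the prior before any input arrives; attention changes how strongly the input itself weighs in, so the two factors enter the inference at different points.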

Read more:
  • Kok P, Jehee JFM, de Lange FP (2012). Less is more: expectation sharpens representations in the primary visual cortex. Neuron 75, 265-270.
  • Kok P, Rahnev D, Jehee J, Lau HC, de Lange FP (2011). Attention reverses the effect of prediction in silencing sensory signals. Cerebral Cortex, in press.
  • Todorovic A, van Ede F, Maris E, de Lange FP (2011). Prior expectation mediates neural adaptation to repeated sounds in the auditory cortex: an MEG study. Journal of Neuroscience 31, 9118-23.

How does expectation change decision-making?

Perception can be cast as a Bayesian inference process, in which sensory evidence (e.g. what our eyes "tell us") is combined with our prior beliefs (e.g. what we think we should be seeing). The outcome of this inference process is what constitutes the contents of our awareness. This scheme predicts that changing our predictions should change our awareness; an extreme example of this is hallucination, where perception seems to be based purely on top-down expectations.
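
For the simple case of Gaussian beliefs, this inference has a closed form: the percept (posterior mean) is a precision-weighted average of prior and sensory evidence. The sketch below uses purely illustrative numbers:

```python
def combine(mu_prior, var_prior, mu_sense, var_sense):
    """Precision-weighted fusion of a Gaussian prior and Gaussian evidence."""
    w_prior = (1 / var_prior) / (1 / var_prior + 1 / var_sense)
    mu_post = w_prior * mu_prior + (1 - w_prior) * mu_sense
    var_post = 1 / (1 / var_prior + 1 / var_sense)
    return mu_post, var_post

# Weak expectation: the percept stays close to what the senses report.
mu_weak, _ = combine(mu_prior=0.0, var_prior=4.0, mu_sense=10.0, var_sense=1.0)

# Strong (precise) expectation: the same input is pulled towards the prediction;
# in the limit, the percept is driven by the prior alone, as in a hallucination.
mu_strong, _ = combine(mu_prior=0.0, var_prior=0.1, mu_sense=10.0, var_sense=1.0)
```

The same sensory input thus yields different percepts depending on the precision of the prior, which is exactly why changing predictions should change awareness.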

Read more:
  • de Lange FP, Rahnev D, Donner TH, Lau HC (2013). Prestimulus oscillatory activity over motor cortex reflects perceptual expectations. Journal of Neuroscience 33, 1400-10.
  • de Lange FP, van Gaal S, Lamme VA, Dehaene S (2011). How awareness changes the relative weights of evidence during human decision making. PLoS Biology 9(11): e1001203.
  • Pajani A, Kok P, Kouider S, de Lange FP (2015). Spontaneous activity patterns in primary visual cortex predispose to visual hallucinations. Journal of Neuroscience, in press.
  • Rahnev D, Lau HC, de Lange FP (2011). Prior expectation modulates the interaction between sensory and prefrontal regions in the human brain. Journal of Neuroscience 31, 10741-8.

How are action and perception intertwined?

Every time we act, there is a corresponding change in sensory information. Therefore, we can almost perfectly predict the incoming somatosensory and visual information that is due to our own movements. Likewise, we can use our own motor programs to predict the kinematics of movements of other agents around us.
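
This predictive cancellation is often formalised as a forward model: an efference copy of the motor command predicts the sensory consequences of the movement, and only the prediction error is passed on. A minimal sketch (the linear "plant" mapping commands to sensations is a placeholder assumption):

```python
import numpy as np

def plant(motor_command):
    """The true sensory consequence of a movement (here: a toy linear mapping)."""
    return 2.0 * motor_command

def forward_model(efference_copy):
    """Internal prediction of the sensory consequence of our own command."""
    return 2.0 * efference_copy  # assumed well calibrated to the plant

command = np.array([0.5, -1.0, 0.25])
sensed = plant(command)                       # what the senses report
predicted = forward_model(command)            # efference-copy prediction
prediction_error = sensed - predicted         # ~0 for self-generated input

# An externally caused perturbation is not predicted, so it survives subtraction.
perturbed = sensed + np.array([0.0, 0.3, 0.0])
error_external = perturbed - predicted
```

Self-generated sensory input is explained away, while externally caused changes stand out; the same machinery can be reused to predict the movement kinematics of other agents.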

Read more:
  • Zimmermann M, Toni I, de Lange FP (2013). Body posture modulates action perception. Journal of Neuroscience 33, 5930-8.
  • Zimmermann M, Meulenbroek R, de Lange FP (2011). Motor planning is facilitated by adopting an action's goal posture: an fMRI study. Cerebral Cortex, in press.
  • Ondobaka S, de Lange FP, Newman-Norlund RD, Wiemers M, Bekkering H (2011). Interplay between action and movement intentions during social interaction. Psychological Science, in press.

What is in the mind's eye?

We can effortlessly imagine seeing a horse, driving a car, or walking down the street. It is still debated whether visual and motor mental imagery relies on the same sensory and motor representations as perception and action. Using well-controlled experimental paradigms and neuroimaging techniques, we investigate the involvement and timing of sensory and motor activity in the brain during mental simulation of perception and action.

Read more:
  • Albers AM, Kok P, Toni I, Dijkerman HC, de Lange FP (2013). Shared representations for working memory and mental imagery in early visual cortex. Current Biology 23, 1-5.
  • Seurinck R, de Lange FP, Vingerhoets G, Achten E (2011). Mental rotation meets the motion after-effect: the role of hV5/MT+ in visual mental imagery. Journal of Cognitive Neuroscience 23, 1395-404.
  • de Lange FP, Roelofs K, Toni I (2008). Motor imagery: a window into the mechanisms and alterations of the motor system. Cortex 44, 494-506.

Research Tools

Population receptive field mapping. We use fMRI to delineate the different early visual areas in the cortex (V1, V2, etc.). Each of these visual areas contains a "map" of the visual world in eye-centered (retinotopic) coordinates, which can be extracted using population receptive field (pRF) mapping. Successive areas specialize in increasingly complex visual properties. This tool is important for creating a "roadmap" with which to navigate the visual system of the human brain.
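
In its simplest form, a pRF is modelled as a 2D isotropic Gaussian over visual space, and the predicted (pre-hemodynamic) response of a voxel is the overlap between the stimulus and that Gaussian. A minimal sketch with hypothetical parameter values (real pRF mapping fits these parameters to measured fMRI time courses):

```python
import numpy as np

# Visual-field grid in degrees of visual angle.
xs, ys = np.meshgrid(np.linspace(-10, 10, 101), np.linspace(-10, 10, 101))

def prf(x0, y0, sigma):
    """2D Gaussian population receptive field centred at (x0, y0)."""
    g = np.exp(-((xs - x0) ** 2 + (ys - y0) ** 2) / (2 * sigma ** 2))
    return g / g.sum()

def predicted_response(stimulus_mask, x0, y0, sigma):
    """Predicted response: overlap between the stimulus and the pRF."""
    return float((stimulus_mask * prf(x0, y0, sigma)).sum())

# A stimulus covering the left half of the visual field.
left_stim = (xs < 0).astype(float)

r_left_voxel = predicted_response(left_stim, x0=-5, y0=0, sigma=2)   # pRF in left field
r_right_voxel = predicted_response(left_stim, x0=5, y0=0, sigma=2)   # pRF in right field
```

A voxel whose pRF sits in the stimulated part of the visual field responds strongly; one whose pRF sits elsewhere barely responds, which is what lets us reconstruct the retinotopic map.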

Multivariate Pattern Analysis (MVPA). Also referred to as "brain reading" or "neural decoding", MVPA is a powerful tool for investigating whether a brain region contains information about a particular stimulus. In conventional fMRI analyses, we ask which regions are reliably activated during e.g. viewing of a particular stimulus. Using MVPA, we instead try to predict e.g. which stimulus the subject is looking at on the basis of the pattern of activity in a given area. We are interested in how decoding accuracy is affected by top-down factors like expectation and attention.
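
The logic of MVPA can be sketched in a few lines: train a classifier on activity patterns from labelled trials, then predict the stimulus on held-out trials. Below, a simple nearest-centroid "decoder" on simulated voxel patterns (real analyses use cross-validated classifiers on actual fMRI data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 50

# Simulated mean activity patterns evoked by two stimuli in one brain region.
pattern_face = rng.normal(0, 1, n_voxels)
pattern_house = rng.normal(0, 1, n_voxels)

def simulate_trials(pattern, n=20, noise=1.0):
    """Noisy single-trial patterns around a stimulus-specific template."""
    return pattern + rng.normal(0, noise, (n, n_voxels))

train_face = simulate_trials(pattern_face)
train_house = simulate_trials(pattern_house)
test_trials = np.vstack([simulate_trials(pattern_face, 10),
                         simulate_trials(pattern_house, 10)])
test_labels = np.array([0] * 10 + [1] * 10)

# Decode: assign each held-out trial to the closer class centroid.
centroids = np.stack([train_face.mean(0), train_house.mean(0)])
dists = np.linalg.norm(test_trials[:, None, :] - centroids[None], axis=2)
predictions = dists.argmin(axis=1)
accuracy = (predictions == test_labels).mean()
```

Above-chance accuracy indicates that the region's activity pattern carries stimulus information, even when the mean activation level does not differ between conditions.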

Magneto-encephalography (MEG). While fMRI is necessary to answer questions about how expectation changes activity at a fine spatial scale (e.g. to map interactions between V1 and V2), it lacks the temporal resolution to investigate the timing and neurophysiological mechanisms by which expectation and attention affect perceptual processing. Using MEG, we look at evoked and induced oscillatory activity during perceptual decision-making.
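
As a toy illustration of the kind of measure involved, the snippet below estimates band-limited power from a simulated sensor signal with an FFT (actual MEG analyses use dedicated toolboxes such as FieldTrip, and time-resolved rather than whole-trial spectra):

```python
import numpy as np

fs = 600                               # sampling rate in Hz, typical for MEG
t = np.arange(0, 2, 1 / fs)            # 2 s of data
rng = np.random.default_rng(1)

# Simulated sensor: a 10 Hz (alpha-band) oscillation buried in noise.
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)

freqs = np.fft.rfftfreq(t.size, 1 / fs)
power = np.abs(np.fft.rfft(signal)) ** 2

alpha_power = power[(freqs >= 8) & (freqs <= 12)].mean()
beta_power = power[(freqs >= 15) & (freqs <= 30)].mean()
```

Comparing such band-limited power between conditions (e.g. expected vs. unexpected stimuli) is one way to track the oscillatory signatures of prediction with millisecond precision.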

Eye movements. This classic example of early research by Yarbus shows how our eyes move when looking at a face. It shows that perception is not a passive scanning of the scene, but rather active "hypothesis testing": the eyes move back and forth between areas that are expected to contain relevant information.