In my neuroscience research, I try to understand how the nervous system processes somatosensory information and controls movement.
Think about how you use your hands to interact with things in your daily life—typing, building with Lego, picking your nose, or prepping food. All of these require you to process incoming sensory information from your hand and to carefully control individual fingers. And yet, these movements often feel effortless. Why? How does the nervous system achieve such elegant control of the hands?
To study this, I use functional MRI to measure distributed patterns of brain activity while participants perform simple finger movements or experience tactile stimulation delivered to the hand. I then interrogate these brain activity patterns using a Bayesian modelling approach to explore how the brain represents and integrates these sensory and motor signals.
Although much of my work involves neuroimaging data, the analyses I use can be applied to almost any high-dimensional dataset. For example, I've applied them to neural spiking, muscle activity, and behavioural data.
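To give a flavour of what "interrogating high-dimensional activity patterns" means in practice, here is a toy sketch of one common building block of such analyses: a representational dissimilarity matrix that compares condition-wise patterns. The data here are random numbers purely for illustration, and the specific distance measure is an assumption on my part, not necessarily the one used in my work—the same logic applies whether the channels are voxels, neurons, or muscles.

```python
import numpy as np

# Toy illustration (random data): rows are experimental conditions
# (e.g. which finger was stimulated), columns are measurement channels
# (voxels, neurons, or muscle recordings).
rng = np.random.default_rng(0)
n_conditions, n_channels = 5, 100
patterns = rng.standard_normal((n_conditions, n_channels))

# Representational dissimilarity matrix: pairwise Euclidean distances
# between condition patterns. Conditions with similar activity patterns
# end up with small distances.
diffs = patterns[:, None, :] - patterns[None, :, :]
rdm = np.sqrt((diffs ** 2).sum(axis=-1))

print(rdm.shape)  # one entry per pair of conditions
```

The resulting matrix is symmetric with a zero diagonal, and its structure (which conditions cluster together) is what downstream model comparison operates on.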
Most recently, we examined how tactile sensory inputs from multiple fingers are integrated in the brain.
We used a custom-built device to deliver tactile stimulation to each fingertip independently. Using a family of computational models, we found that brain activity patterns in the primary somatosensory cortex became increasingly complex as more fingers were stimulated simultaneously. This suggests that sensory inputs from multiple fingers are integrated in unique ways depending on the pattern of stimulation.
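The core intuition behind this kind of model comparison can be sketched simply: if integration were purely linear, the pattern evoked by stimulating two fingers together would just be the sum of the single-finger patterns, and any systematic deviation from that sum points to nonlinear integration. The snippet below illustrates only that core idea with simulated data—the actual study fitted and compared a whole family of models, and the variance-explained measure here is my own simplified stand-in.

```python
import numpy as np

# Simulated voxel patterns (illustration only, not real data).
rng = np.random.default_rng(1)
n_voxels = 200
p1 = rng.standard_normal(n_voxels)        # finger 1 stimulated alone
p2 = rng.standard_normal(n_voxels)        # finger 2 stimulated alone
nonlinear_part = 0.5 * rng.standard_normal(n_voxels)
p12 = p1 + p2 + nonlinear_part            # fingers 1 and 2 together

# How well does the linear (summation) model explain the observed
# multi-finger pattern? The residual is what linearity fails to capture.
residual = p12 - (p1 + p2)
r2 = 1.0 - residual.var() / p12.var()
print(f"variance explained by linear model: {r2:.2f}")
```

In this simulation the linear model explains most, but not all, of the variance; in real cortex, the size of that unexplained component as finger combinations grow is what indexes increasingly complex integration.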
We think that this complex integration serves as the foundation for dexterous object manipulation, because it allows for a highly flexible mapping between somatosensory inputs and motor responses of the hand.