How the Brain Senses Body Position and Movement

Proprioception is the term for how we unconsciously sense where all of our body parts are without having to look.

We know that it involves a complex network of sensors all over the body that gather information and send it to the brain for processing. What we don’t know is how the brain takes all of that information and turns it into a sense of where the body is and how it is moving.

To find out, a team led by Alexander Mathis at EPFL explored this question in a study published in Cell.

PhD students Alessandro Marin Vargas, Axel Bisi, and Alberto Chiappa were part of the team that analysed experimental data provided by Chris Versteeg and Lee Miller at Northwestern University.

Mathis stated, “It is widely believed that sensory systems should exploit the statistics of the world and this theory could explain many properties of the visual and auditory system.” In other words, sensory systems appear to be tuned to the statistics of the signals they naturally receive, an idea that has already explained much about vision and hearing; the team wanted to apply the same principle to proprioception.

To apply this theory to proprioception, they “used musculoskeletal simulators to compute the statistics of the distributed sensors.”
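As a rough illustration of what “computing the statistics of the distributed sensors” can mean, the sketch below simulates a toy two-joint arm, derives crude spindle-like signals (muscle length plus its rate of change) from the joint angles, and summarises their statistics. The moment-arm model, muscle count, and numbers are illustrative assumptions, not the detailed musculoskeletal models of the upper limb used in the actual study.

```python
# Toy sketch: spindle-like signals from a simulated two-joint arm (not the study's pipeline).
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01                                   # 10 ms time step
t = np.arange(0, 2.0, dt)                   # 2-second movement

# Smooth, pseudo-random joint-angle trajectories (shoulder, elbow)
shoulder = 0.5 * np.sin(2 * np.pi * 0.7 * t) + 0.1 * rng.standard_normal(t.size).cumsum() * dt
elbow = 0.8 * np.sin(2 * np.pi * 1.1 * t + 0.5)

# Crude linear moment-arm model: muscle length = rest length - moment arms @ joint angles
angles = np.stack([shoulder, elbow])                    # shape (2 joints, T)
moment_arms = np.array([[0.03, 0.00],                   # shoulder flexor
                        [0.00, 0.02],                    # elbow flexor
                        [-0.03, -0.02]])                 # bi-articular extensor
lengths = 0.30 - moment_arms @ angles                    # shape (3 muscles, T)
velocities = np.gradient(lengths, dt, axis=1)

# Spindle-like signal: muscle length plus a velocity component
spindle = lengths + 0.1 * velocities

# "Statistics of the distributed sensors": means, variances, correlations
print("mean:", spindle.mean(axis=1))
print("variance:", spindle.var(axis=1))
print("correlations:\n", np.corrcoef(spindle))
```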

This involved using musculoskeletal modelling to generate muscle spindle signals for the upper limb across a “large-scale, naturalistic movement repertoire.” Thousands of “task-driven” neural network models were then trained on sixteen computational tasks based on this repertoire, each task reflecting a hypothesis about the computations carried out by the proprioceptive pathway (which includes the brainstem and the somatosensory cortex).
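One such task amounts to, roughly, “predict where the hand is and how fast it is moving from spindle input alone.” The sketch below trains a small network on that kind of task using random stand-in data; the architecture, sizes, and data are assumptions made for illustration, not the models used in the paper.

```python
# Hedged sketch of one "task-driven" model: spindle-like inputs -> hand position and velocity.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_muscles, n_timesteps, n_trials = 25, 100, 512           # hypothetical sizes
X = torch.randn(n_trials, n_muscles * n_timesteps)        # stand-in spindle inputs, flattened over time
y = torch.randn(n_trials, 4)                               # stand-in targets: hand x/y position and velocity

model = nn.Sequential(
    nn.Linear(n_muscles * n_timesteps, 256), nn.ReLU(),
    nn.Linear(256, 64), nn.ReLU(),
    nn.Linear(64, 4),                                      # predict position (2) + velocity (2)
)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimiser.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimiser.step()
print("final training loss:", loss.item())
```

In the study itself, many architectures were trained on each of the sixteen tasks, so that the tasks, not any single network, could be compared.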

They took this approach because, while other methods focus on predicting neural activity directly, task-driven models can reveal the underlying computational principles of sensory processing rather than simply fitting the data.

By comparing different neural network architectures and computational tasks, the team could ask which combinations produced the most “brain-like” representations of the proprioceptive information travelling from the limb to the brain.

As a result, they found that neural network models trained on tasks that predict limb position and velocity best matched the recorded neural activity, making them the strongest account of how the brain keeps track of where each limb is.
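A common way to score how “brain-like” a model is, and plausibly the kind of comparison behind this ranking, is to fit a simple linear map from the model’s internal activations to recorded firing rates and measure how well it predicts held-out activity. The sketch below does this with made-up data; the regression choice, metric, and sizes are illustrative assumptions rather than the paper’s exact analysis.

```python
# Illustrative "brain-likeness" test: linearly map model activations to recorded firing rates.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
activations = rng.standard_normal((2000, 64))                      # model hidden-layer activity (stand-in)
firing_rates = activations @ rng.standard_normal((64, 40)) \
               + 0.5 * rng.standard_normal((2000, 40))             # "recorded" neural activity (stand-in)

A_train, A_test, F_train, F_test = train_test_split(
    activations, firing_rates, test_size=0.2, random_state=0)
mapping = Ridge(alpha=1.0).fit(A_train, F_train)
print("held-out R^2:", mapping.score(A_test, F_test))              # higher = more "brain-like" under this metric
```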

This suggests that the proprioceptive pathway prioritises integrating the distributed muscle spindle input into estimates of body position and movement.

Now that we better understand proprioceptive processing, research in this field can focus on making significant advances in neuroprosthetics, which could help create more natural and intuitive control of artificial limbs.

 

Grow your career with London School of Psychology and Counselling

At London School of Psychology and Counselling we offer over 70 online courses and qualifications in psychology and psychology-related disciplines.

Our courses are particularly suitable for psychology students and working practitioners who value their time and opt for self-study to prepare for exams or undertake continuing professional development without interruptions in their careers.

The courses start twice a year – in November and April. Register here.

View A to Z of LSPC courses
