Two research groups at EPFL (École polytechnique fédérale de Lausanne), supervised by Prof. Aude G. Billard and Prof. José del R. Millán, have developed a machine learning (ML) algorithm that learns from a patient's thoughts and enables the patient to control a robot's movements using electrical signals from the brain.
This research could help tetraplegic patients who are unable to speak or move to independently perform activities of daily living (ADL).
- Patients move the robot with their thoughts; no auditory or tactile feedback is needed.
- The patient's brain activity is recorded by electroencephalography (EEG), using electrodes mounted on a head cap.
- Inverse reinforcement learning is deployed so the robot learns to correct its errors from the patient's brain signals.
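The idea behind the last bullet can be illustrated with a toy sketch (not the authors' actual algorithm): assume the EEG produces an "error signal" whenever the robot's chosen move contradicts the patient's hidden intent, and the robot uses that feedback to reshape its reward over actions. All names and numbers here are illustrative assumptions.

```python
import random

# Illustrative sketch only: the robot chooses among discrete motion
# primitives. A simulated "EEG error signal" fires when the chosen move
# deviates from the patient's hidden intended direction, and the reward
# weights are updated so disapproved moves become less likely -- an
# inverse-reinforcement-learning-flavored loop, not the EPFL method itself.

ACTIONS = ["left", "right", "forward"]
INTENDED = "forward"                 # hidden patient intent (assumed for the demo)

reward = {a: 0.0 for a in ACTIONS}   # learned reward per action

def error_signal(action):
    """Stand-in for an EEG error-related potential: fires on wrong moves."""
    return action != INTENDED

def pick_action():
    """Greedy choice over current reward estimates (ties broken randomly)."""
    best = max(reward.values())
    return random.choice([a for a in ACTIONS if reward[a] == best])

random.seed(0)
for step in range(50):
    a = pick_action()
    # Brain feedback: penalize actions that triggered an error potential,
    # reinforce the ones that did not.
    reward[a] += -1.0 if error_signal(a) else 1.0

print(pick_action())  # prints "forward": the robot converges to the intent
```

After a handful of corrections, the intended action strictly dominates the reward table, so the greedy policy settles on it; this mirrors, in miniature, how implicit brain feedback can steer a robot without any explicit command.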
Watch a short video of this below:
Thanks for reading this robotics news piece. You can access more categorized news on Robotics and Mechatronics at the link below:
If you enjoyed this post, please consider contributing to help us with our mission to make robotics and mechatronics available for everyone. We deeply thank you for your generous contribution!
Do not forget to contact us:
Be sure to let us know your thoughts and questions about this post, as well as the other posts on the website. You can either contact us through the “Contact” tab on the website or email us at support[at]mecharithm.com.
Send us your work/research on Robotics and Mechatronics for a chance to get featured in Mecharithm's Robotics News/Learning.
Follow Mecharithm on social media too: