An algorithm reveals what the brain is looking at

The new algorithm is called Cebra, and it paves the way for futuristic brain-machine interfaces capable of reconstructing in detail what a person is looking at, based solely on the electrical signals detected in their brain. In the first laboratory experiments, carried out at the Lausanne Polytechnic, it made it possible to reconstruct which sequences of a black-and-white film a mouse was watching, reproducing them almost identically to the original. The results are published in the journal Nature.

Using signals produced by the visual cortex of mice, the Cebra algorithm made it possible to train a deep learning model that decodes what the animal is looking at. Whereas ten years ago only very simple shapes could be decoded, it is now possible to decode entire film sequences from the signals fired by neurons in response to specific properties of the video, such as the objects shown, the colors, and even the emotions it arouses.
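Cebra's actual pipeline is far more sophisticated than this, but the core idea of decoding a visual stimulus from population activity can be sketched with a toy example. Everything below is invented for illustration (simulated neurons, a nearest-neighbor decoder) and is not the study's method: each film frame is assumed to evoke a characteristic noisy response across a population of neurons, and the decoder picks the frame whose template response best matches a new recording.

```python
import numpy as np

# Toy illustration (not Cebra's actual code): decoding which video
# frame an animal is viewing from simulated neural activity.
rng = np.random.default_rng(0)

n_frames = 50    # frames of the "film"
n_neurons = 30   # recorded neurons

# Each frame evokes a characteristic population response (its template).
templates = rng.normal(size=(n_frames, n_neurons))

def record(frame_idx):
    """Simulate a noisy neural response to a given frame."""
    return templates[frame_idx] + 0.3 * rng.normal(size=n_neurons)

def decode(response):
    """Nearest-neighbor decoder: return the frame whose template
    is closest (Euclidean distance) to the observed response."""
    dists = np.linalg.norm(templates - response, axis=1)
    return int(np.argmin(dists))

# Decode a held-out noisy response to frame 17.
decoded = decode(record(17))
print(decoded)
```

Real decoders replace the hand-built templates with a learned latent embedding of the neural data, which is what lets them scale from simple shapes to full film sequences.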

The study also shows that Cebra can be used to predict limb movements in primates and to reconstruct the position of a freely moving rat in a small arena.

“Cebra’s goal is to discover structure in complex systems, and since the brain is the most complex structure in our universe, it’s the best testing space,” says neuroscientist Mackenzie Mathis who coordinated the study. “It can also give us insight into how the brain processes information and could become a platform for discovering new principles in neuroscience by combining data across different animals and even species. This algorithm is not limited to neuroscience research, as it can be applied to many datasets involving temporal or combined information, including animal behavior and gene expression data. Therefore, the potential clinical applications are exciting.”

Source: Ansa
