Neuroscientists Develop AI Tool to Unlock Cerebellum's Secrets
Understanding and treating brain disorders such as tremor, imbalance, and speech impairments requires deep knowledge of the cerebellum, a part of the brain that's crucial for making accurate movements. Scientists have long been able to eavesdrop on and record the electrical signals transmitted by neurons (brain cells) in the cerebellum, allowing them to observe the signals entering and exiting this region. But the computations that the brain performs between the input and output have been largely a mystery.
However, that is now changing. A team of researchers, including scientists from Baylor College of Medicine, has created an artificial intelligence tool that can identify the type of neuron producing electrical signals recorded from the cerebellum during behavior, opening the way to a new understanding of how the cerebellum works.
The study, published in the current edition of , describes the tool, a semi-supervised deep learning classifier that allows researchers to investigate the cerebellum's role across many behaviors.
"When we record the activity of neurons with extracellular electrodes, it's like overhearing a crowded conversation between groups of people, each speaking a different language: some in Spanish, others in English or German, all talking at once. Our new AI tool allows us to determine which group each recorded neuron belongs to by identifying the 'language' it's using, based on its electrical signature," said Dr. Javier Medina, Brown Foundation Professor and Director of the Center for Neuroscience and AI at Baylor College of Medicine and the senior corresponding author on the study.
"This is a revolutionary advance because it solves the first step toward decoding the content of neural conversations: understanding who is speaking. With that in place, the door is now open to uncover what the different neurons are saying to one another."
Scientists have long known that the cerebellum's neurons are interconnected, but they have only been able to record from the input and output neurons.
"We couldn't figure out how the signals that came into the structure got transformed into the output signals. We couldn't say how the brain did it," said Dr. Stephen Lisberger of Duke University, one of seven co-senior authors of the study, thinking back to when he began his career. "The advanced techniques used to record electrical signals don't reveal which neuron type generated them. If you can answer how the circuit works, then you can say how the brain generates behavior. This discovery marks a pivotal moment, promising to help answer these questions."
This new development in AI technology is the result of a team of 23 researchers from Duke, Baylor College of Medicine, University College London, the University of Granada in Spain, the University of Amsterdam, Bar-Ilan University in Israel, and King's College London, who have worked together since 2018 to create the classifier tool and validate its accuracy.
To build the classifier, the scientists first had to measure the unique electrical signatures of the different types of neurons within the cerebellum. Using optogenetic experiments, in which genes for light-sensitive proteins are introduced into specific types of neurons, the authors "tagged" the electrical activity for each cerebellar neuron type. Using these electrical signatures, they trained their deep learning classifier to sort the activity recorded from the cerebellum by neuron type.
Dr. David Herzfeld, a senior research associate at Duke, is one of seven co-first authors of the paper. He, along with colleagues from other institutions, including co-first authors Maxime Beau and Federico D'Agostino, designed and trained the classifier.
"This tool is a major advance in our ability to investigate how the cerebellum processes information," Herzfeld said. "I hope our techniques inspire researchers studying other brain regions to build tools that match neural activity to neuron identity, helping to uncover how different circuits function and ultimately paving the way for new approaches to treating neurological disorders."
For a full list of collaborators who contributed to this work or to find funding information, click .