AI-Powered Interface Lets Man Control Robotic Arm With Thought

Medscape
An AI-powered brain-computer interface allowed a paralyzed man to control a robotic arm with his thoughts alone, with no adjustments required for 7 months.

A new brain-computer interface powered by artificial intelligence allowed a paralyzed man, who could not speak or move, to control a robotic arm to grasp and move objects simply by imagining himself performing these movements.

Notably, the BCI worked for 7 months without needing to be adjusted, compared with just a day or two for other devices.

Neurologist Karunesh Ganguly, MD, PhD, of the University of California San Francisco Weill Institute for Neurosciences, explained that older BCI systems use spike recordings from tiny electrodes implanted in brain tissue to capture signals from single neurons or small groups of neurons near the electrode. However, these signals are unstable because the brain moves. "This approach is more on the surface of the brain itself. It's still invasive, requires surgery, but it's able to record signals more stably," Ganguly said.

In addition, the AI component of the BCI tracks how learned movement representations "drift" over time. Initially, the system needs about 8 to 9 days to stabilize; after that, it maintains stability for up to 7 months before requiring a brief recalibration, Ganguly noted.

To test the AI-powered BCI, the investigators worked with a man who had been paralyzed. Electrodes implanted on the surface of his brain picked up activity when he imagined moving different parts of his body, and these data were used to train the AI. With practice, the man could make the robotic arm pick up blocks, turn them, and move them to new locations. He was also able to open a cabinet, take out a cup, and hold it up to a water dispenser.

Seven months later, the participant could still control the robotic arm after a 15-minute "tune-up" to adjust for how his movement representations had drifted since he began using the device. Other AI-driven BCIs may require daily retraining because the brain signals shift; this new system reduces that burden.
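To illustrate the idea of representational drift and a brief "tune-up," here is a toy sketch (not the study's actual model, whose details are not public in this summary): a linear decoder maps simulated neural features to intended movement, the feature statistics slowly shift over time, and refitting on a small recalibration block restores accuracy. All function names and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
true_map = rng.normal(size=(8, 2))  # hypothetical neural-to-movement mapping

def record_session(drift):
    """Simulate one session: features X and intended movements Y.
    `drift` is an additive offset standing in for representational drift."""
    X = rng.normal(size=(200, 8)) + drift
    Y = (X - drift) @ true_map          # intent depends on the drift-free signal
    return X, Y

def fit_decoder(X, Y):
    """Least-squares linear decoder with an intercept (the intercept can
    absorb a constant drift offset)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)
    return W

def decode(W, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return Xb @ W

# Initial training sessions, before any drift.
X0, Y0 = record_session(drift=0.0)
W = fit_decoder(X0, Y0)

# Months later the representations have drifted, so decoding error grows.
Xd, Yd = record_session(drift=1.5)
err_before = np.mean((decode(W, Xd) - Yd) ** 2)

# Brief "tune-up": refit on a small recalibration block, test on the rest.
W = fit_decoder(Xd[:50], Yd[:50])
err_after = np.mean((decode(W, Xd[50:]) - Yd[50:]) ** 2)

print(err_before > err_after)  # the recalibrated decoder is more accurate
```

The point of the sketch is the workflow, not the model: a decoder trained once degrades as signals shift, while a short recalibration on fresh data realigns it, which is the burden the UCSF system reduces from daily retraining to an occasional 15-minute session.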
"It's the difference between, I would say, like trying to re-learn a bike daily, versus having a bike that you learned over a week or two, and then you're ready to go from then on," said Ganguly.

The researchers are now refining the AI models to make the robotic arm move faster and more smoothly. The goal is for the system to be usable in daily life, in the community, without supervision. "This blending of learning between humans and AI is the next phase for these brain-computer interfaces. It's what we need to achieve sophisticated, lifelike function," Ganguly said in a news release. This research was funded by the National Institutes of Health and the UCSF Weill Institute for Neurosciences.

