Benefits Of “Deepfaking” The Mind In Creating Brain-Computer Interfaces
When we think of deepfakes, we usually think of their myriad negative applications, from pornography to blackmail to politics. Deepfakes are a product of machine learning: they create a fabrication so realistic that it is hard to believe it is not the real thing. In a society plagued by fake news, deepfakes have the potential to do a substantial amount of harm.
But a team of researchers recently found another use for the technique: deepfaking the mind. Using machine learning to simulate artificial neural data in this way may make a world of difference for people with disabilities.
Brain-Computer Interfaces For People With Disabilities
For people with full-body paralysis, the body can seemingly become a prison. Communicating and performing even the simplest tasks can seem like insurmountable challenges. But even if the body is frozen, the mind may be very active. Brain-computer interfaces (BCIs) offer a way for these patients to interact with the world.
But BCIs are challenging to develop. The brain generates an enormous amount of data, and a BCI needs to be retrained for every unique user.
Simulating the Mind With AI
This is where deepfakes can come in.
The research, described in a paper recently published in Nature Biomedical Engineering, relied on synthetic neural data. The researchers recorded brain activity from just one session in which a monkey reached for an object. The team then "deepfaked the mind," using machine learning to generate similar simulated neural data, which they combined with a small amount of new real data. Training on this large pool of "fake" data sped up the training of the BCI by up to 20 times.
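To make the general workflow concrete, here is a minimal Python sketch of the underlying idea: fit a model to a small set of real recordings, use it to generate a much larger pool of synthetic neural activity, and train the BCI decoder on real and synthetic data combined. This is not the authors' published model; the simple linear encoding model, the Ridge decoder, the Poisson spike sampling, and all dataset sizes here are assumptions chosen purely for illustration.

```python
# Minimal sketch (assumed setup, not the paper's actual method) of augmenting
# scarce BCI calibration data with synthetic neural activity.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_neurons, n_real = 50, 200          # hypothetical: 50 channels, 200 real samples

# Pretend "real" calibration data: spike counts paired with 2-D reach velocity.
true_tuning = rng.normal(size=(2, n_neurons))      # hidden neuron preferences
real_vel = rng.normal(size=(n_real, 2))            # recorded hand velocity
real_spikes = rng.poisson(np.clip(real_vel @ true_tuning + 3.0, 0, None))

# Step 1: fit a simple encoding model (velocity -> firing rate) to the real data.
encoder = Ridge().fit(real_vel, real_spikes)

# Step 2: "deepfake the mind" -- sample many new reaches and generate the
# synthetic spike counts the encoding model predicts for them.
n_synth = 10 * n_real
synth_vel = rng.normal(size=(n_synth, 2))
synth_spikes = rng.poisson(np.clip(encoder.predict(synth_vel), 0, None))

# Step 3: train the decoder (spikes -> velocity) on real + synthetic data.
X = np.vstack([real_spikes, synth_spikes])
y = np.vstack([real_vel, synth_vel])
decoder = Ridge().fit(X, y)
print("decoder trained on", len(X), "samples, of which", n_real, "are real")
```

The point of the sketch is the data flow, not the models: a small amount of real recording seeds a generator, and the decoder then learns from far more examples than were ever recorded.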
“It is the first time we’ve seen AI generate the recipe for thought or movement via the creation of synthetic spike trains,” says lead author Shixian Wen. And it’s only the beginning. “This research is a critical step towards making BCIs more suitable for real-world use.”