Next-level neural interfaces: AI decodes human brain activity using virtual reality mazes

In case you can’t find your home after your neighborhood gets destroyed by a catastrophic event, don’t worry — artificial intelligence is there to help. Using a hypothetical apocalyptic scenario, researchers from Kyoto University were able to decode human brain activity using partial-observation mazes in virtual reality. The findings could lead to a new wave of neural interface technology to benefit humans in the future.

From the subjects’ brain activity, researchers were able to decipher their predictions of positions and scenes within the maze. They were also able to measure how confident the subjects were in those predictions.

“An AI model based on human brain activity shows that the decoding accuracy of the scene prediction is dependent on the confidence level of the subject’s ability to predict,” says study lead author Risa Katayama, of Kyoto University, in a media release.

For the study, researchers examined whether AI could decode the neuronal representations of each virtual reality scene experienced by the subjects. They also analyzed how self-confidence levels affected how reliably those predictions could be reproduced. In the theoretical cataclysmic scenario, the subject moved through a sequence of scenes, comparing each predicted scene with the scene actually observed, and then either confirmed or updated the prediction before moving on.

While subjects played the virtual reality maze game, researchers measured their brain activity using functional magnetic resonance imaging (fMRI). Even though the subjects had no knowledge of the final destination, they used their predictions and memory of the map to estimate their positions in the maze and choose the correct way to proceed.

“Our results suggest that when prediction confidence is high, subjects are able to imagine the scene clearly and predict quickly,” says Katayama.

Researchers believe their findings could lead to the development of brain-machine interfaces that serve as communication tools across a range of environments.

“Scene prediction can be generalized and lead to new applications such as control methods connecting human brains and AI for aerial and land vehicles,” explains Katayama. “We believe the intersection of the human mind and AI has interdisciplinary significance for further elucidation of the source of our self-consciousness.”

The study is published in the journal Communications Biology.
