A team of researchers at the University of Technology Sydney (UTS) has developed an AI-powered helmet that can translate human thoughts into written text. The pioneering brain-computer interface, named DeWave, offers a non-invasive, portable, and cost-effective way to translate brain waves into text.
The research, spearheaded by UTS’s GrapheneX Center, involved 29 volunteers who silently read text excerpts while wearing a special cap. This cap, equipped with sensors, recorded their brain activity through an electroencephalogram (EEG). The AI system, DeWave, then decoded these signals into text. Chin-Teng Lin, the project leader and director of the GrapheneX Center, highlighted that the AI helmet’s accuracy has improved from 40% to over 60%.
Unlike other brain-computer interfaces such as Elon Musk’s Neuralink, which requires surgical implantation of electrodes, this AI helmet operates non-invasively. It doesn’t rely on cumbersome and expensive equipment like MRI machines, making it a more practical and accessible technology. This feature significantly enhances its potential for widespread use and application.
The AI helmet’s technology holds immense promise for aiding individuals who are unable to speak due to illnesses or injuries, such as stroke or paralysis. It also paves the way for seamless human-machine communication. The AI, DeWave, was trained by observing examples where brain signals corresponded with specific sentences. Charles Zhou, a team member, explained that DeWave learns the relationship between thought patterns and words, further refined by connecting it to a large language model (LLM) similar to ChatGPT.
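The training approach described above — learning the relationship between recorded brain signals and the words being silently read — can be illustrated with a toy sketch. This is not the actual DeWave architecture (which uses a neural decoder coupled to an LLM); it is a minimal stand-in, with simulated EEG windows and a nearest-prototype decoder, and every name and parameter below (vocabulary, channel count, noise level) is an assumption for illustration only:

```python
import numpy as np

# Hypothetical sketch, NOT the real DeWave model: learn a prototype
# signal pattern per word from paired (EEG window, word) examples,
# then decode new windows by nearest prototype.

rng = np.random.default_rng(0)
VOCAB = ["hello", "world", "yes", "no"]          # toy vocabulary (assumed)
N_CHANNELS, WINDOW = 8, 32                       # assumed channels / samples per word

# Ground-truth per-word patterns used only to simulate recordings.
true_patterns = {w: rng.normal(size=(N_CHANNELS, WINDOW)) for w in VOCAB}

def simulate_eeg(word, noise=0.3):
    """Simulate a noisy EEG window for a silently read word."""
    return true_patterns[word] + noise * rng.normal(size=(N_CHANNELS, WINDOW))

def train(pairs):
    """Average the observed windows for each word into a prototype."""
    return {w: np.mean([x for x, lab in pairs if lab == w], axis=0)
            for w in VOCAB}

def decode(protos, eeg):
    """Pick the word whose prototype is closest to the recorded window."""
    return min(VOCAB, key=lambda w: np.linalg.norm(eeg - protos[w]))

# Build a paired corpus, analogous to the study's silent-reading sessions.
pairs = [(simulate_eeg(w), w) for w in VOCAB for _ in range(20)]
protos = train(pairs)

sentence = ["hello", "world", "yes"]
decoded = [decode(protos, simulate_eeg(w)) for w in sentence]
print(decoded)
```

In the real system, the nearest-prototype step is replaced by a learned neural decoder, and a large language model then turns the decoded signal codes into fluent sentences — which is also how the LLM helps compensate for the noise inherent in non-invasive EEG.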
The use of EEG signals, as opposed to implanted electrodes, introduces more noise into the signal. However, the researchers are confident of reaching accuracy levels comparable to traditional language translation or voice recognition systems, which sit around 90%. This advancement could revolutionize how we interact with machines, making communication more intuitive and natural.
This study builds upon previous brain-computer interface technology developed by UTS in collaboration with the Australian Defence Forces. In an earlier project, brain waves were used to command a robotic dog, demonstrating the potential of integrating large language models in neuroscience and AI.
AI has greatly influenced brain-computer interfaces: earlier, neurobiologists at MIT developed a computational model capable of predicting human emotions, and other researchers have produced “high-quality” videos by using AI to process signals from human brain activity.