Meta recently unveiled a white paper on its research project, “Brain2Qwerty,” a non-invasive technology designed to interpret brain activity and translate it into text. The project aims to restore communication for individuals who have lost the ability to speak due to neurological impairment.
According to the report, “Brain2Qwerty” uses electroencephalography (EEG) and magnetoencephalography (MEG) to decode the neural activity involved in producing language and convert it into text, with a reported accuracy of up to 80%.
Meta emphasizes that this non-invasive approach could empower individuals who suffer from communication disabilities, enabling them to reconnect with the world without requiring intrusive medical procedures.
Furthermore, the company envisions this technology as a stepping stone toward a more advanced capability: converting thoughts into words at an accelerated pace, and ultimately translating mental intent directly into actionable commands.
The Brain2Qwerty system currently works by capturing 1,000 snapshots of brain activity per second and learning the neural signatures associated with words, syllables, and individual letters. These learned signatures, in turn, support the development of artificial intelligence models capable of interpreting language-related neural patterns.
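To make the idea concrete, the sketch below decodes characters from windowed brain-activity recordings sampled at 1,000 snapshots per second. Everything here is a simplified assumption for illustration: the sensor count, window length, and the nearest-centroid "decoder" are toy stand-ins, not Meta's actual architecture, and the per-character "neural signatures" are fabricated random patterns.

```python
import numpy as np

SAMPLE_RATE = 1_000   # brain-activity snapshots per second, as reported
WINDOW_MS = 500       # assumed window length per keystroke (hypothetical)
N_SENSORS = 8         # toy sensor count; real MEG systems have hundreds

rng = np.random.default_rng(0)
chars = ["a", "b", "c"]

# Fabricated "neural signatures": a fixed mean sensor pattern per character.
signatures = {c: rng.normal(size=N_SENSORS) for c in chars}

def simulate_window(char):
    """A noisy recording window whose per-sensor mean carries the signature."""
    n_samples = SAMPLE_RATE * WINDOW_MS // 1000
    noise = rng.normal(scale=0.1, size=(n_samples, N_SENSORS))
    return signatures[char] + noise

def decode(window):
    """Nearest-centroid decoding: average over time, pick the closest signature."""
    mean_pattern = window.mean(axis=0)
    return min(chars, key=lambda c: np.linalg.norm(mean_pattern - signatures[c]))

decoded = "".join(decode(simulate_window(c)) for c in "abcabc")
print(decoded)  # → abcabc
```

Averaging over the window suppresses the simulated noise, which is why this toy decoder recovers the sequence; real decoders replace the nearest-centroid step with a trained deep network.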
However, significant challenges remain. The system currently exhibits an average character error rate of 32%, meaning roughly one in three characters is misinterpreted. Additionally, the research relies primarily on MEG-based data analysis, which requires an electromagnetically stable environment free from external interference. Finally, test subjects must remain as still as possible for optimal results.
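The 32% figure is a character error rate (CER), the standard metric for typing and speech decoders: the edit distance (insertions, deletions, and substitutions) between the decoded text and the intended text, divided by the length of the intended text. A minimal implementation:

```python
def character_error_rate(reference, hypothesis):
    """CER = Levenshtein edit distance / length of the reference text."""
    m, n = len(reference), len(hypothesis)
    # dp[i][j]: edit distance between reference[:i] and hypothesis[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if reference[i - 1] == hypothesis[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # substitution or match
    return dp[m][n] / m

# One substitution (e→x) and one deletion (missing o) over 11 characters:
print(character_error_rate("hello world", "hxllo wrld"))  # → 0.1818...
```

At a 32% CER, a decoded sentence of 100 intended characters would contain about 32 such errors, which gives a sense of how far the output is from readable text.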