MarkTechPost@AI, March 7
Meta AI Introduces Brain2Qwerty: Advancing Non-Invasive Sentence Decoding with MEG and Deep Learning

Researchers at Meta AI have developed Brain2Qwerty, a novel method that uses deep learning to decode text production from non-invasive recordings of brain activity. The study recorded the brain activity of 35 participants with electroencephalography (EEG) and magnetoencephalography (MEG) while they typed. Brain2Qwerty achieved a character error rate (CER) of 32% with MEG, significantly outperforming EEG (67%). The work narrows the gap between invasive and non-invasive brain-computer interfaces and points to potential applications for patients who cannot communicate. The model combines convolutional and Transformer modules, refined by a character-level language model, laying groundwork for future brain-computer interface research.

🧠 Brain2Qwerty is a deep learning model designed to decode text production from non-invasive brain recordings (EEG and MEG), offering hope for patients with language impairments.

⌨️ The study recruited 35 participants and recorded their brain activity while they typed memorized sentences on a purpose-built keyboard. Analyzing the differences in neural activity between left- and right-hand key presses verified that the typing protocol elicited the expected brain responses.

📊 The experiments show that MEG outperforms EEG in classifying hand movements and decoding characters, with peak accuracies of 74% and 22%, respectively. Through the combined action of its convolutional, Transformer, and language-model components, Brain2Qwerty substantially improves decoding performance.

💡 Although Brain2Qwerty marks notable progress for non-invasive brain-computer interfaces, challenges remain, including real-time operation, adaptation to patients with locked-in syndrome, and the non-portability of MEG equipment. Future work will aim to improve real-time processing, explore imagination-based tasks, and integrate more advanced MEG sensors.

Neuroprosthetic devices have significantly advanced brain-computer interfaces (BCIs), enabling communication for individuals with speech or motor impairments due to conditions like anarthria, ALS, or severe paralysis. These devices decode neural activity patterns by implanting electrodes in motor regions, allowing users to form complete sentences. Early BCIs were limited to recognizing basic linguistic elements, but recent developments in AI-driven decoding have achieved near-natural speech production speeds. Despite these advancements, invasive neuroprostheses require neurosurgical implantation, posing risks such as brain hemorrhage, infection, and long-term maintenance challenges. Consequently, their scalability for widespread use remains limited, particularly for non-responsive patient populations.

Non-invasive BCIs, primarily using scalp EEG, offer a safer alternative but suffer from poor signal quality, requiring users to perform cognitively demanding tasks for effective decoding. Even with optimized methods, EEG-based BCIs struggle with accuracy, limiting their practical usability. A potential solution lies in magnetoencephalography (MEG), which provides a superior signal-to-noise ratio compared to EEG. Recent AI models trained on MEG signals in language comprehension tasks have shown notable improvements in decoding accuracy. These findings suggest that integrating high-resolution MEG recordings with advanced AI models could enable reliable, non-invasive language production BCIs.

Researchers from Meta AI, École Normale Supérieure (Université PSL, CNRS), Hospital Foundation Adolphe de Rothschild, Basque Center on Cognition, Brain and Language, and Ikerbasque (Basque Foundation for Science) have developed Brain2Qwerty. This deep learning model decodes text production from non-invasive brain activity recordings. The study involved 35 participants who typed memorized sentences while their neural activity was recorded using EEG or MEG. Brain2Qwerty, trained on these signals, achieved a character-error rate (CER) of 32% with MEG, significantly outperforming EEG (67%). The findings bridge the gap between invasive and non-invasive BCIs, enabling potential applications for non-communicating patients.
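The character-error rate (CER) reported here is the edit (Levenshtein) distance between the decoded text and the reference text, divided by the reference length. A minimal sketch of that metric (the function names are illustrative, not taken from the paper):

```python
def edit_distance(ref: str, hyp: str) -> int:
    """Levenshtein distance via the classic single-row dynamic program."""
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # prev holds dp[i-1][j-1]; dp[j] still holds dp[i-1][j]
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (r != h))  # substitution
    return dp[-1]

def character_error_rate(ref: str, hyp: str) -> float:
    """CER = edit distance / number of reference characters."""
    return edit_distance(ref, hyp) / len(ref)
```

A CER of 32% thus means roughly one character in three must be inserted, deleted, or substituted to recover the intended sentence.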

The study explores decoding language production using non-invasive brain recordings via EEG and MEG while participants typed sentences. Thirty-five right-handed, native Spanish speakers typed words they heard, with brain activity recorded for nearly 18 and 22 hours for EEG and MEG, respectively. A custom, artifact-free keyboard was used. The Brain2Qwerty model, comprising convolutional and transformer modules, predicted keystrokes from neural signals, further refined by a character-level language model. Data preprocessing included filtering, segmentation, and scaling, while model training utilized cross-entropy loss and AdamW optimization. Performance was assessed using Hand Error Rate (HER) to compare with traditional BCI benchmarks.
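The pipeline described above — a convolutional front end over each keystroke window, a transformer providing sentence-level context, per-keystroke character logits, and training with cross-entropy loss and AdamW — can be sketched in PyTorch. All hyperparameters below (sensor count, model width, depths, vocabulary size) are illustrative assumptions rather than the paper's actual configuration, and the character-level language-model refinement stage is omitted:

```python
import torch
import torch.nn as nn

class Brain2QwertySketch(nn.Module):
    """Hypothetical conv + transformer keystroke decoder (not the paper's exact model)."""

    def __init__(self, n_sensors=208, d_model=128, n_chars=29):
        super().__init__()
        # Convolutional module: local temporal features within each keystroke window.
        self.conv = nn.Sequential(
            nn.Conv1d(n_sensors, d_model, kernel_size=5, padding=2),
            nn.GELU(),
            nn.Conv1d(d_model, d_model, kernel_size=5, padding=2),
            nn.GELU(),
        )
        # Transformer module: context across the keystrokes of a sentence.
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_chars)  # per-keystroke character logits

    def forward(self, x):
        # x: (sentences, keystrokes, sensors, time) -- one M/EEG window per keystroke
        s, k, c, t = x.shape
        h = self.conv(x.reshape(s * k, c, t)).mean(dim=-1)  # pool over time
        h = self.transformer(h.reshape(s, k, -1))           # sentence-level context
        return self.head(h)                                 # (sentences, keystrokes, n_chars)

# One training step with cross-entropy loss and AdamW, on random stand-in data.
model = Brain2QwertySketch()
x = torch.randn(2, 10, 208, 50)    # 2 sentences, 10 keystrokes, 208 sensors, 50 samples
y = torch.randint(0, 29, (2, 10))  # stand-in keystroke labels
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss = nn.functional.cross_entropy(model(x).flatten(0, 1), y.flatten())
loss.backward()
opt.step()
```

The split of labor mirrors the description: the convolution captures short-range dynamics around each key press, while the transformer lets the prediction for one keystroke draw on the rest of the sentence.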

To assess whether the typing protocol produces expected brain responses, researchers analyzed differences in neural activity for left- and right-handed key presses. MEG outperformed EEG in classifying hand movements and character decoding, with peak accuracies of 74% and 22%, respectively. The Brain2Qwerty deep learning model significantly improved decoding performance compared to baseline methods. Ablation studies confirmed the impact of its convolutional, transformer, and language model components. Further analysis showed that frequent words and characters were better decoded, and errors correlated with keyboard layout. These findings highlight Brain2Qwerty’s effectiveness in character decoding from neural signals.

In conclusion, the study introduces Brain2Qwerty, a method for decoding sentence production from non-invasive MEG recordings. Its average CER of 32% significantly outperforms EEG-based approaches. Unlike prior studies on language perception, this model focuses on production, incorporating a deep learning framework and a pretrained character-level language model. While it advances non-invasive BCIs, challenges remain, including real-time operation, adaptability for locked-in individuals, and the non-portability of MEG. Future work should enhance real-time processing, explore imagination-based tasks, and integrate advanced MEG sensors, paving the way for improved brain-computer interfaces for individuals with communication impairments.

