NIX Solutions: Meta’s Brain Signal Typing Tech

Scientists at Meta have developed a system capable of interpreting brain signals to determine which keys a person is pressing. In an experiment with 35 volunteers, a deep neural network-based algorithm achieved up to 80% accuracy in recognizing letters. However, the technology remains strictly experimental. Despite its limitations, Meta views the project as a strategic initiative that could provide insight into human cognition and contribute to AI development.

Back in 2017, Mark Zuckerberg announced Facebook’s work on a technology for “typing directly from the brain.” The goal was to create a compact, non-invasive device, such as a hat or band, that could read brain signals and convert them into text. Technical challenges, however, led the company to abandon the commercial project by 2021, though Meta continued to fund fundamental neuroscience research.

Understanding the Brain Through AI

Meta’s latest study, detailed in two preprints and a company blog post, utilized magnetoencephalography (MEG)—a technique that records weak magnetic fields generated by neural activity. The signals were processed by a deep neural network, enabling the system to analyze brain activity and correlate it with specific keystrokes. Jean-Rémi King, head of Meta’s Brain & AI research group, emphasizes that the goal is not product development but a deeper understanding of intelligence. According to King, understanding the brain’s architecture could advance AI research.
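
As a rough illustration only (Meta’s actual Brain2Qwerty architecture is not described in detail here, and every shape and hyperparameter below is an assumption), a minimal PyTorch-style sketch of the general idea, mapping a window of multi-channel MEG samples to a score over possible keystrokes, could look like this:

```python
# Hypothetical sketch: decoding keystrokes from MEG windows with a small
# convolutional classifier. Sensor count, window length, and layer sizes
# are illustrative assumptions, not values from Meta's study.
import torch
import torch.nn as nn

N_SENSORS = 306      # typical whole-head MEG sensor count (assumption)
WINDOW = 500         # samples in a keystroke-aligned window (assumption)
N_CLASSES = 27       # letters plus space, for illustration

class MEGKeystrokeDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_SENSORS, 128, kernel_size=7, stride=2, padding=3),
            nn.GELU(),
            nn.Conv1d(128, 128, kernel_size=7, stride=2, padding=3),
            nn.GELU(),
            nn.AdaptiveAvgPool1d(1),   # pool over the time axis
        )
        self.classifier = nn.Linear(128, N_CLASSES)

    def forward(self, x):              # x: (batch, sensors, time)
        h = self.features(x).squeeze(-1)
        return self.classifier(h)      # logits over candidate keystrokes

# Example: a batch of 8 signal windows -> per-character logits
model = MEGKeystrokeDecoder()
logits = model(torch.randn(8, N_SENSORS, WINDOW))
print(logits.shape)                    # torch.Size([8, 27])
```

In practice, a classifier of this kind would be trained with a standard cross-entropy loss on signal windows labeled with the key that was actually pressed.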

The experiment, conducted at Spain’s Basque Center on Cognition, Brain and Language (BCBL), involved 35 participants, each spending approximately 20 hours in the scanner typing in Spanish. The system, Brain2Qwerty, analyzed their brain signals and matched them with the corresponding keystrokes. The algorithm first had to be trained on thousands of samples before it could predict letters from neural signals. The average character error rate was 32%, but Meta claims this is the highest accuracy achieved by any non-invasive typing method.
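
For context, a character error rate of this kind is conventionally computed as the edit (Levenshtein) distance between the predicted and reference text divided by the reference length. The snippet below is a generic illustration of the metric, not code from Meta’s papers:

```python
# Minimal character error rate (CER) sketch: Levenshtein edit distance
# between prediction and reference, divided by the reference length.
def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,               # deletion
                            curr[j - 1] + 1,           # insertion
                            prev[j - 1] + (ca != cb))) # substitution
        prev = curr
    return prev[-1]

def cer(predicted: str, reference: str) -> float:
    return levenshtein(predicted, reference) / max(len(reference), 1)

print(cer("hola mando", "hola mundo"))  # 0.1 -> one wrong character out of ten
```

By this measure, a 32% error rate means roughly one in three typed characters is decoded incorrectly.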

Challenges and Future Prospects

Despite promising results, the technology is far from practical application. The experiment required a $2 million MEG scanner housed in a magnetically shielded room. The magnetic fields produced by the brain are vastly weaker than the Earth’s, so measurements demand a tightly controlled environment. Additionally, the system is highly sensitive to movement: even slight head movements cause signal loss. These constraints make commercialization infeasible for now.

While Meta explores non-invasive approaches, invasive neural interfaces are advancing. In 2023, a patient with amyotrophic lateral sclerosis (ALS) regained the ability to communicate through a brain implant linked to a speech synthesizer. Neuralink, founded by Elon Musk, develops implantable devices that allow paralyzed patients to control a computer cursor. Although invasive methods offer more accurate signal readings, they require surgery and carry associated risks, notes NIX Solutions.

Meta remains focused on fundamental science rather than medical devices. Unlike implants, MEG does not capture individual neuron activity but enables researchers to study large-scale brain processes. This is crucial for understanding cognitive functions and linguistic processing. In a related study, Meta scientists analyzed how the brain structures language, confirming a hierarchical process: from general thought formation to word activation, then syllables, and finally individual letters.

Though the system is not yet practical, its findings could influence neural interface development and AI research. Modern language models already mirror some aspects of how the human brain processes information, but a deeper understanding of cognitive functions may be key to creating truly intelligent AI systems. We’ll keep you updated as research progresses.