Welcome to a special week-long series where we take a deep dive into some of the most ground-breaking digital innovation research happening quietly in university labs around the world. These are projects that have not yet hit the headlines or your social feeds, but if they succeed, their impact could be truly transformative. Each day this week, I will bring you an in-depth look at a different area of digital research that promises to reshape how we live, work and connect.
Today, we begin with a fascinating topic: the emerging field of direct brain-to-brain communication. While it might sound like science fiction, a number of academic teams are already developing the early building blocks for mind-to-mind digital links. This work goes beyond simple brain-computer interfaces and opens up new questions about language, empathy and even identity.
Brainwaves Without Borders: The Surprising Academic Push for Mind-to-Mind Communication
In 2013, something remarkable happened at the University of Washington. For the first time in human history, one person controlled another’s hand using only thought. No wires, no speech, no body language. Just a brain signal sent across the internet. This foundational experiment opened the door to a new scientific frontier. The method behind it was almost disarmingly simple. The "sender" wore an EEG cap, imagining a hand movement while their brain activity was recorded. That data was sent across the internet and translated into a pulse of transcranial magnetic stimulation delivered to the motor cortex of a second person. The result? The receiver’s hand twitched and pressed a key on a keyboard, entirely without conscious intention. It wasn’t just a stunt. It worked. Repeatedly.
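The pipeline is conceptually simple: detect an imagined movement in the sender's EEG, send a single bit over the network, and trigger stimulation on the receiver's side. Here is a minimal sketch of the sender half, assuming one EEG channel and the classic signature of motor imagery, a drop in mu-band (8–12 Hz) power over the motor cortex. The thresholds, sample rate and function names are illustrative assumptions, not the Washington team's actual code.

```python
import numpy as np

FS = 250            # assumed EEG sample rate in Hz
MU_BAND = (8, 12)   # mu rhythm over motor cortex weakens during imagined movement

def mu_band_power(eeg_window, fs=FS, band=MU_BAND):
    """Mean spectral power in the mu band of a single-channel EEG window."""
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg_window)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def detect_imagined_movement(eeg_window, rest_baseline, drop_ratio=0.5):
    """Trigger when mu power falls well below the resting baseline."""
    return mu_band_power(eeg_window) < drop_ratio * rest_baseline

def encode_trigger(trigger):
    """The entire 'message': one byte, which the receiver's rig would
    translate into a transcranial magnetic stimulation pulse."""
    return b"\x01" if trigger else b"\x00"

# Synthetic demo: a strong 10 Hz rhythm at rest, attenuated during imagery.
t = np.arange(0, 2, 1.0 / FS)
rest = np.sin(2 * np.pi * 10 * t)
imagery = 0.2 * np.sin(2 * np.pi * 10 * t)
baseline = mu_band_power(rest)
print(encode_trigger(detect_imagined_movement(imagery, baseline)))  # b'\x01'
```

The striking part is how little information crosses the wire: a single bit per decision, which is why early demonstrations were limited to simple, binary commands like "press the key."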
| “One person controlled another’s hand using only thought. It wasn’t just a stunt. It worked.”
This experiment, published in PLoS ONE the following year (Rao et al., 2014), became a cornerstone of what is now called brain-to-brain interfacing, or BBI. The same team later expanded on their work in spectacular fashion. In 2019, they developed what they dubbed BrainNet, a three-person neural network that allowed two participants to collaborate in helping a third person complete a simplified game of Tetris using only brain signals (Jiang et al., 2019). In other words, they had begun to lay the groundwork for multi-person neural collaboration. The Tetris study didn’t just prove that one brain could send data to another—it hinted at a world where multiple minds could cooperate without speaking a word.
| “Multiple minds could cooperate without speaking a word.”
While those early experiments were striking, the developments of the past two years have shown just how much deeper this rabbit hole goes. In 2024, a team of researchers placed two individuals in a virtual reality maze and used hyperscanning EEG to track how their brains aligned as they worked together to navigate it. They found a direct relationship between the level of brainwave synchrony and how quickly and effectively the pair completed the task (Zhao et al., 2024). It’s a tantalising glimpse of something more fundamental: the brain doesn’t just operate in isolation but seeks resonance with others when collaborating, and technology can now detect and even enhance that process.
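The "brainwave synchrony" such studies track is typically quantified with a metric like the phase-locking value: extract the instantaneous phase of each person's EEG and measure how stable the phase difference stays over time. Below is a small sketch of that standard metric; applying it to raw traces as shown is a simplification, since real pipelines band-pass filter each signal into a frequency band of interest first.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase-locking value between two equal-length signals:
    1.0 means perfectly locked phases, values near 0 mean no relationship."""
    phase_x = np.angle(hilbert(x))  # instantaneous phase via analytic signal
    phase_y = np.angle(hilbert(y))
    # Mean resultant length of the phase-difference distribution.
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

# Two brains locked to the same 10 Hz rhythm with a constant lag score
# near 1; a rhythm against unrelated noise scores near 0.
t = np.linspace(0, 2, 500, endpoint=False)
locked_a = np.sin(2 * np.pi * 10 * t)
locked_b = np.sin(2 * np.pi * 10 * t + 0.7)
noise = np.random.default_rng(0).standard_normal(500)
```

In a hyperscanning experiment, a value like this is computed over sliding windows for each electrode pair, yielding a moment-by-moment synchrony signal that can be correlated with task performance.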
A separate study explored this in an altogether different setting: live music. Musicians were hooked up to EEG and asked to improvise together. As their brainwaves began to synchronise, the system altered the music in real time to reflect the harmony between their neural states (Dai et al., 2023). The more in tune their brains became, the more melodious the room sounded. This wasn't just biofeedback for its own sake—it was a demonstration that shared mental states can be both measured and encouraged, potentially opening up new ways to enhance group creativity and emotional connection.
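In a closed loop like this, the synchrony score becomes a control signal for the sound engine. The paper's actual mapping isn't described here, so the following is a deliberately toy illustration of the idea: bucket a synchrony score into increasingly consonant musical intervals.

```python
def synchrony_to_interval(plv):
    """Map a synchrony score in [0, 1] to semitones above the root note.
    Low synchrony yields a dissonant minor second; high synchrony a
    consonant perfect fifth. (Hypothetical mapping, for illustration.)"""
    intervals = [1, 3, 4, 7]  # minor 2nd, minor 3rd, major 3rd, perfect 5th
    idx = min(int(plv * len(intervals)), len(intervals) - 1)
    return intervals[idx]

print(synchrony_to_interval(0.1), synchrony_to_interval(0.9))  # 1 7
```

The loop then closes itself: more consonant sound encourages the players toward shared timing, which in turn raises the measured synchrony.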
| “Shared mental states can be measured, encouraged and even turned into music.”
Then came the twist. What happens when two people actively train together on a brain-computer interface? A study published in early 2024 tested exactly this, having pairs of participants complete motor imagery tasks together. What surprised researchers wasn’t just the collaborative performance: each participant’s individual cognitive abilities also improved through the joint experience (Khalaf et al., 2024). It was as if their brains had learned more effectively by leaning into each other. In a subtle but profound way, it suggested that shared mental activity might enhance personal cognition in ways we don’t yet fully understand.
| “Shared mental activity might enhance personal cognition in ways we don’t yet fully understand.”
Parallel to this work, researchers have been fusing neural data with visual context to create more robust models of intention and perception. One of the most important milestones came in 2025, with the release of the EgoBrain dataset. This enormous collection of EEG readings paired with over 60 hours of first-person video is now helping AI systems learn to recognise not just what someone is seeing or doing, but potentially what they’re trying to do (Lee et al., 2025). This marriage of brain data and visual context brings us closer to machines that don’t merely react, but understand. It’s an essential building block for interpreting intention.
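The engineering crux of a dataset like this is temporal alignment: every video frame must be paired with the stretch of EEG that led up to it. A minimal sketch of that pairing follows; the sample rate, frame rate and window length are assumptions for illustration, not the dataset's published parameters.

```python
import numpy as np

EEG_FS = 250    # assumed EEG sample rate (Hz)
FPS = 30        # assumed egocentric video frame rate
WINDOW_S = 1.0  # seconds of EEG context attached to each frame

def eeg_window_for_frame(frame_idx, eeg, eeg_fs=EEG_FS, fps=FPS, window_s=WINDOW_S):
    """Slice of EEG samples ending at a video frame's timestamp, forming
    one (EEG window, frame) training pair for an intent-decoding model."""
    end = int(round(frame_idx / fps * eeg_fs))
    start = max(0, end - int(window_s * eeg_fs))
    return eeg[start:end]

eeg = np.arange(EEG_FS * 10)               # 10 s of fake single-channel EEG
window = eeg_window_for_frame(60, eeg)     # frame 60 sits at t = 2.0 s
print(window[0], window[-1], len(window))  # 250 499 250
```

A model trained on such pairs sees what the wearer saw alongside what their brain was doing in the preceding second, which is what lets it move from recognising actions toward anticipating intentions.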
And now, the next frontier looms. Earlier this year, researchers at Northern Illinois University secured a substantial grant from the National Science Foundation to build the first system for true two-way brain-to-brain communication. Not just send. Receive. Converse. It’s a significant shift, one that could move us past the current model of decoding isolated commands into something more like a dialogue between minds. Their system uses a mix of EEG and transcranial focused ultrasound—a method that allows sound waves to stimulate precise regions of the brain with surprising accuracy (NIU Newsroom, 2025). The idea is not only to detect and transmit signals, but to create a loop where two brains can influence each other directly.
Meanwhile, the original team at the University of Washington hasn’t been idle. Their most recent experiments aim to expand what kinds of thoughts can be shared—beyond basic motor commands, toward more complex cognitive and emotional states. They’re now working to manipulate brain waves associated with alertness and abstract reasoning, possibly setting the stage for transmitting more intricate pieces of information (Lab Manager, 2025). Imagine sharing not just an action, but an idea. A concept. A feeling.
| “We’re on the verge of sharing not just thoughts, but states of mind.”
It’s worth remembering that while full-fledged mind-to-mind conversation is still out of reach, many of the foundational technologies are already transforming lives. UC Davis Health has recently developed a brain-computer interface that allows people with severe speech impairment to communicate simply by thinking. The system has reached up to 97 percent accuracy in translating thought into text or speech, giving voice to those with conditions like ALS who otherwise might not be able to express themselves at all (UC Davis Health, 2024). While this isn’t brain-to-brain communication in the strictest sense, it’s part of the same continuum—one that brings the inner world of the mind closer to shared experience.
And so we find ourselves in the early stages of something extraordinary. We’re not speaking telepathically across rooms. Not yet. But we are sharing brain states. We are collaborating neurally. We are building tools that align minds and amplify understanding. The implications stretch far beyond science fiction. Imagine students sharing cognitive load in real time to master complex topics more quickly. Therapists and clients entering states of deeper empathy through synchronised neural rhythms. Artists and musicians producing shared states of flow that shape the creative process. Policy makers quite literally understanding one another better—not just intellectually, but cognitively.
| “The next evolution of human communication may not be spoken, written or typed. It may be felt.”
But with such power comes risk. If our thoughts can be detected, influenced or even shared, who controls that channel? How do we protect neural privacy? What safeguards are needed when intention itself becomes transmissible? These aren’t merely technical questions; they are ethical ones, and we’ll need to answer them sooner than we think.
Still, one thing is certain. This is no longer fiction. Minds are connecting. And the next evolution of human communication may not be spoken, written or typed. It may be felt. Shared. Understood at the speed of thought.
Tomorrow, we will explore a lesser-known area of research that could radically change how computers understand and generate human creativity. Prepare to discover how algorithms are learning to think more like us, not just process data, and what that might mean for the future of art, design and beyond.
Citations
Rao, R. P. N., Stocco, A., Bryan, M., Sarma, D., Youngquist, T. M., Wu, J., & Prat, C. S. (2014). A Direct Brain-to-Brain Interface in Humans. PLoS ONE, 9(11), e111332.
Jiang, L., Stocco, A., Losey, D. M., Abernethy, J. A., Prat, C. S., & Rao, R. P. N. (2019). BrainNet: A Multi-Person Brain-to-Brain Interface for Direct Collaboration Between Brains. Scientific Reports, 9, 6115.
Zhao, Y., et al. (2024). Inter-Brain Synchrony in Collaborative Virtual Navigation Tasks Using Hyperscanning EEG. Frontiers in Human Neuroscience.
Dai, Y., et al. (2023). Brain-to-Brain Musical Harmony: EEG-Driven Interactive Feedback in Collaborative Performance. Journal of Neural Engineering.
Khalaf, A., et al. (2024). Synergistic Learning through Cooperative Brain-Computer Interfaces: Evidence from Motor Imagery Tasks. IEEE Transactions on Neural Systems and Rehabilitation Engineering.
Lee, J., et al. (2025). EgoBrain: A Large-Scale EEG and Egocentric Video Dataset for Decoding Intent. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition.
NIU Newsroom. (2025). NIU Awarded NSF Grant to Build First Bi-Directional Brain-to-Brain Interface.
Lab Manager. (2025). University of Washington Explores Transfer of Abstract Thought via Brain-to-Brain Communication.
UC Davis Health. (2024). Breakthrough BCI Restores Communication for People with ALS.