The Experiment
Cortical Labs has taught living human neurons grown on a microelectrode array to play Doom. The cells learned basic navigation and interaction within approximately one week, responding to visual stimuli translated into electrical signals and generating outputs that controlled in-game movement.
Performance remains far below human capability: the neurons developed nothing like human-level strategy or spatial memory. What they did demonstrate was adaptive behavior. Repeated exposure to game states produced measurable changes in firing patterns, and those changes corresponded to improved outcomes within the constrained environment of the game.
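The closed loop described above, in which game state is encoded as electrical stimulation and recorded activity is decoded back into movement, can be sketched as a toy control loop. Every class, function, and encoding scheme below is an illustrative assumption, not Cortical Labs' actual interface:

```python
import random

# Hypothetical closed-loop trainer: game frames become stimulation
# patterns on a microelectrode array, and recorded spike counts become
# movement commands. All names and encodings here are assumptions.

class DishInterface:
    """Stand-in for a microelectrode array: stimulate, then read spikes."""
    def __init__(self, n_electrodes=8):
        self.n_electrodes = n_electrodes
        # Each electrode responds to stimulation with its own gain (toy model).
        self.gain = [random.uniform(0.5, 1.5) for _ in range(n_electrodes)]

    def stimulate_and_record(self, pattern):
        # Spike counts loosely follow the stimulation pattern, plus noise.
        return [g * p + random.random() * 0.1
                for g, p in zip(self.gain, pattern)]

def encode_state(player_x, target_x, n):
    """Map a 1-D game state onto a per-electrode stimulation pattern."""
    pattern = [0.0] * n
    pattern[0 if target_x < player_x else n - 1] = 1.0
    return pattern

def decode_action(spikes):
    """Left/right vote: compare activity on the two halves of the array."""
    half = len(spikes) // 2
    return -1 if sum(spikes[:half]) > sum(spikes[half:]) else 1

# One pass of the loop: state -> stimulation -> spikes -> movement.
dish = DishInterface()
player_x, target_x = 5, 2
pattern = encode_state(player_x, target_x, dish.n_electrodes)
spikes = dish.stimulate_and_record(pattern)
player_x += decode_action(spikes)
```

In the real system this loop runs continuously, and the feedback signal itself (structured versus unstructured stimulation after good versus bad outcomes) is what drives learning; this sketch only shows the plumbing of one cycle.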
Why Neurons, Not Silicon
Biological neurons operate at energy efficiencies silicon cannot match. A human brain runs on roughly 20 watts. Modern AI systems training on comparable tasks consume thousands of times more power. The neurons in this experiment formed connections, pruned ineffective pathways, and adjusted synaptic weights without explicit programming. They learned because learning is what neurons do when subjected to patterned input and feedback.
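The plasticity described here, where used pathways strengthen and idle ones are pruned, can be illustrated with a toy Hebbian update rule. This is a textbook mechanism chosen for illustration, not a model of the actual cultures:

```python
# Toy Hebbian plasticity: a synapse strengthens when its pre- and
# post-synaptic neurons are active together, decays otherwise, and is
# pruned once its weight falls below a threshold. Illustrative only.

LEARN_RATE = 0.2
DECAY = 0.05
PRUNE_BELOW = 0.01

def hebbian_step(weights, pre, post):
    """One update: w += lr * pre_i * post_j - decay * w, then prune."""
    updated = {}
    for (i, j), w in weights.items():
        w += LEARN_RATE * pre[i] * post[j] - DECAY * w
        if w >= PRUNE_BELOW:          # pathways that fall silent are removed
            updated[(i, j)] = w
    return updated

# Two input and two output neurons, fully connected at the start.
weights = {(i, j): 0.1 for i in range(2) for j in range(2)}
pre, post = [1.0, 0.0], [1.0, 0.0]   # only neuron 0 fires on each side

for _ in range(60):                   # repeated patterned input
    weights = hebbian_step(weights, pre, post)

# After enough repetitions, only the co-active pair (0, 0) remains,
# with a much larger weight than it started with.
```

No weight is ever set by hand: the same patterned input, applied repeatedly, reshapes the network, which is the sense in which the cultures learn "without explicit programming."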
Cortical Labs has been developing what it calls “synthetic biological intelligence” since 2021, when its DishBrain system learned to play Pong. Doom represents a significant leap in complexity: more variables, more spatial dimensions, more ambiguous feedback loops. That the system achieved any coherent behavior suggests the approach scales beyond simple reflex tasks.
Implications for BCI and Beyond
This matters for the brain-computer interface industry because it inverts the usual engineering problem. Most BCIs decode signals from existing neural tissue or stimulate it according to predetermined algorithms. Cortical Labs is building computational substrates from neurons themselves. If biological processors can be grown, trained, and integrated into devices, they could handle pattern recognition, sensory processing, or adaptive control with far less power and heat than conventional chips.
The path from Doom to medical devices or prosthetics remains long. Culturing neurons at scale, maintaining viability, and interfacing reliably with electronic systems all present unsolved challenges. But the demonstration establishes a proof of concept: living cells can be directed toward functional tasks outside a body, trained through feedback, and measured with precision. The boundary between wetware and hardware is becoming negotiable.