Link to the code: brain-emulation GitHub repository

Allen Institute's 9-Million-Neuron Mouse Cortex Simulation: A Milestone Toward Whole Brain Emulation


The numbers alone signal a threshold has been crossed. Nine million neurons. Twenty-six billion synapses. Eighty-six interconnected brain regions. In November 2025, researchers at the Allen Institute presented what is arguably the most detailed biophysically realistic brain simulation ever built: a digital reconstruction of the mouse cortex that behaves like the real thing at the level of individual electrical signals.

For researchers working on whole brain emulation (WBE), this is a landmark. Not because the mouse cortex is the endpoint, but because it is proof that the approach scales.

What Was Actually Simulated

The simulation models the mouse cortex using data drawn from two existing Allen Institute resources: the Allen Cell Types Database, which catalogs the electrical properties of individual neuron types, and the Allen Connectivity Atlas, which maps how those neurons connect to one another across brain regions.

From this biological blueprint, the team built a simulation using the Neulite neuron simulator and the Allen Institute’s own Brain Modeling ToolKit. The result captures not just the presence of neurons but their firing dynamics, synaptic activation patterns, and the propagation of electrical signals across a network that mirrors the structural organization of a real mouse cortex.
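The pipeline, from cell-type catalog to connectivity table to instantiated network, can be sketched in miniature. Everything below is an illustrative toy: the region names, cell-type parameters, and connection probabilities are placeholders rather than values from the Allen datasets, and the code mimics the workflow in plain Python rather than using the actual BMTK or Neulite APIs.

```python
import random

# Toy stand-ins for the two data sources: per-type electrical parameters
# (Allen Cell Types Database) and region-to-region connection probabilities
# (Allen Connectivity Atlas). All values are invented for illustration.
CELL_TYPES = {
    "exc": {"v_rest": -70.0, "tau_m": 20.0},   # mV, ms (illustrative)
    "inh": {"v_rest": -65.0, "tau_m": 10.0},
}

CONNECTIVITY = {  # (source region, target region) -> connection probability
    ("VISp", "VISl"): 0.05,
    ("VISl", "VISp"): 0.03,
}

def build_network(n_per_region, seed=0):
    """Instantiate neurons per region, then draw synapses from the atlas table."""
    rng = random.Random(seed)
    regions = sorted({r for pair in CONNECTIVITY for r in pair})
    neurons = []
    for region in regions:
        for i in range(n_per_region):
            ctype = "exc" if i % 5 else "inh"  # crude 80/20 excitatory split
            neurons.append({"region": region, "type": ctype, **CELL_TYPES[ctype]})
    synapses = []
    for (src, tgt), p in CONNECTIVITY.items():
        pre = [i for i, n in enumerate(neurons) if n["region"] == src]
        post = [j for j, n in enumerate(neurons) if n["region"] == tgt]
        for a in pre:
            for b in post:
                if rng.random() < p:          # Bernoulli draw per cell pair
                    synapses.append((a, b))
    return neurons, synapses

neurons, synapses = build_network(n_per_region=50)
print(len(neurons), "neurons,", len(synapses), "synapses")
```

The real system replaces each piece with measured data and a biophysical simulator, but the shape of the pipeline, parameters in, wiring rules in, network out, is the same.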

The simulation was run on Fugaku, Japan’s flagship supercomputer at the RIKEN Center for Computational Science, capable of over 400 quadrillion calculations per second. The project was led by Anton Arkhipov at the Allen Institute and Tadashi Yamazaki at Japan’s University of Electro-Communications, and was presented at the SC25 supercomputing conference in November 2025.

The scale matters in context. Eon Systems’ full fruit fly brain emulation, published in March 2026, captured approximately 125,000 neurons. This mouse cortex simulation is 72 times larger. The human brain contains roughly 86 billion neurons. The gap remains enormous, but the trajectory is no longer theoretical.

Biophysical Realism: Why It Matters

Not all brain simulations are created equal. Many computational neuroscience models simplify neurons down to abstract rate-coded units or point processes, which are tractable but sacrifice biological fidelity. What the Allen Institute team built is biophysically realistic, meaning each simulated neuron models the ionic channel dynamics, membrane potentials, and synaptic time constants derived from actual cell recordings.

This distinction carries practical weight for whole brain emulation. A simulation built on abstract neuron models can reproduce certain emergent behaviors but does not claim to replicate the substrate-level processes that generate cognition. Biophysically realistic models are closer to the level of description that WBE researchers argue is necessary to capture what a brain actually does.
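The difference is easiest to see in code. The sketch below contrasts an abstract rate unit with a leaky integrate-and-fire (LIF) neuron; note that LIF is itself still a simplification, and the biophysically realistic models described here go much further by adding ionic channel conductances fit to real recordings. All parameters are illustrative.

```python
# Abstract rate unit vs. a membrane-potential model. The rate unit maps
# input straight to output; the LIF neuron integrates voltage over time
# and produces spike timing -- dynamics the rate unit throws away.
# Parameters are illustrative, not fit to any dataset.

def rate_unit(input_current, gain=10.0):
    """Rate-coded abstraction: output is a static function of input."""
    return max(0.0, gain * input_current)

def lif_spikes(input_current, t_stop=200.0, dt=0.1,
               tau_m=20.0, v_rest=-70.0, v_thresh=-50.0, v_reset=-70.0,
               r_m=10.0):
    """Leaky integrate-and-fire: Euler integration of membrane voltage,
    returning spike times in ms."""
    v, t, spikes = v_rest, 0.0, []
    while t < t_stop:
        dv = (-(v - v_rest) + r_m * input_current) / tau_m
        v += dv * dt
        if v >= v_thresh:
            spikes.append(round(t, 1))
            v = v_reset
        t += dt
    return spikes

print(rate_unit(2.5))          # a single number
print(lif_spikes(2.5))         # a spike train with timing structure
```

A biophysical model replaces the single leak term with Hodgkin-Huxley-style channel equations per cell type, which is precisely the fidelity the Allen Cell Types Database recordings make possible.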

The Sandberg/Bostrom WBE roadmap identifies three broad requirements for whole brain emulation: sufficient scanning resolution, accurate computational models of neural components, and the hardware to run them. The Allen Institute simulation advances the third leg directly, while drawing on real data for the first and second.

Research Applications: Disease Modeling Before Emulation

The immediate application of the simulation is not consciousness transfer. It is disease research.

The team used the virtual cortex to model how pathological signals spread through neural networks in conditions like Alzheimer’s disease and epilepsy. In a physical laboratory, observing how a seizure or amyloid cascade propagates across 86 cortical regions in real time is either impossible or prohibitively destructive. In a simulation, researchers can introduce a perturbation, track its spread with full observability, and test candidate interventions before any biological tissue is touched.
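The "perturb and observe" workflow can be illustrated with a toy propagation model. The wiring and the spread rule below are stand-ins invented for illustration, not the simulation's actual regions or dynamics:

```python
from collections import deque

# Toy directed region-to-region projections. Real cortical wiring comes
# from the Connectivity Atlas; these five regions are placeholders.
CORTEX = {
    "VISp": ["VISl", "VISal"],
    "VISal": ["VISam", "VISpm"],
    "VISl": ["VISam"],
    "VISam": [],
    "VISpm": [],
}

def trace_spread(seed_region):
    """Breadth-first propagation with full observability: every region's
    arrival step is recorded -- something no in-vivo probe can deliver."""
    arrival = {seed_region: 0}
    queue = deque([seed_region])
    while queue:
        region = queue.popleft()
        for target in CORTEX[region]:
            if target not in arrival:
                arrival[target] = arrival[region] + 1
                queue.append(target)
    return arrival

# Seed a pathological signal in primary visual cortex and watch it spread.
print(trace_spread("VISp"))
```

In the actual simulation the "spread" is biophysical signal propagation rather than graph traversal, but the experimental logic, seed a perturbation, record everything, compare against an intervention, is the same.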

Anton Arkhipov described the significance directly: “This shows the door is open. Giving us confidence that much larger models are not only possible, but achievable.”

The phrase “much larger models” is doing real work there. The goal is not to simulate mouse cortex indefinitely. The goal is to demonstrate that the technical pipeline, from biological data through computational model to supercomputer execution, works at this scale. The mouse cortex is a validation run for a methodology that, if it holds, can be extended.

Where This Sits on the WBE Roadmap

The state of brain emulation in 2025 established that the field was progressing along multiple fronts simultaneously: connectomics resolution improving, simulation hardware scaling, and neuron models becoming more biologically detailed. The Allen Institute simulation is a concrete data point confirming that trajectory.

TRL (Technology Readiness Level) assessment: this simulation sits at approximately TRL 3, experimental proof of concept. The simulation runs, produces biologically plausible activity patterns, and enables meaningful disease modeling. It does not claim to reproduce cognition or conscious experience, and it is not running in real time. Fugaku required significant computational resources to execute what a real mouse cortex does continuously at low metabolic cost.

The gap between simulating a cortex on a supercomputer and running a mind is not merely hardware. The Virtual Brain Twins approach developed in parallel uses personalized neural models for clinical applications, taking a different path toward the same eventual goal: simulations that are detailed enough to matter.

The Hardware Constraint Is Being Solved

One underappreciated aspect of the Allen Institute result is what it implies about the hardware trajectory. Fugaku is one of the most powerful supercomputers on Earth. Running 9 million biophysically realistic neurons on it in 2025 is a proof of feasibility, not a deployment model. But compute costs fall along Moore's Law-adjacent curves: the same simulation that requires Fugaku today will run on smaller, cheaper hardware within a decade.

This is not speculation. The pattern has repeated across computational biology. Protein structure prediction that required the equivalent of decades of compute in 2010 runs on consumer hardware in 2025 via AlphaFold derivatives. Brain simulation will not escape this curve.

The implication for WBE research is that the current supercomputer requirement does not represent a permanent ceiling. It represents today’s cost. The question researchers are working to answer is whether the models are correct, because getting the biology right now means that when the hardware catches up, there will be something worth running on it.
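To make the cost scaling concrete, here is a back-of-envelope estimate. Every constant below is an assumption chosen for illustration, not a measured figure from the project; only the synapse counts come from the article:

```python
# Rough compute estimate: synaptic updates dominate cost in large network
# simulations. Both constants are illustrative assumptions.
FLOPS_PER_SYNAPSE_PER_STEP = 50     # assumed cost of one synaptic update
STEPS_PER_SECOND = 10_000           # assumed 0.1 ms integration timestep

def flops_per_biological_second(n_synapses):
    """FLOPs needed to advance the network one second of biological time."""
    return n_synapses * FLOPS_PER_SYNAPSE_PER_STEP * STEPS_PER_SECOND

mouse_cortex = flops_per_biological_second(26e9)    # this simulation
human = flops_per_biological_second(100e12)         # full human brain

print(f"mouse cortex: {mouse_cortex:.1e} FLOPs per simulated second")
print(f"human scale:  {human:.1e} FLOPs per simulated second")
print(f"ratio: {round(human / mouse_cortex)}x")
```

Whatever the true per-synapse cost turns out to be, the ratio between the two scales is fixed by the synapse counts, which is why the hardware question is one of cost curves rather than feasibility.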

Scale: From Fly to Mouse to Human

It is worth mapping where the Allen Institute simulation sits in the emulation hierarchy explicitly:

  • C. elegans (roundworm): 302 neurons. First full connectome mapped in 1986, first simulation in 2014.
  • Drosophila melanogaster (fruit fly): ~125,000 neurons. First full brain emulation by Eon Systems, 2026.
  • Mouse cortex (this simulation): 9 million neurons, 26 billion synapses. Biophysically realistic. 2025.
  • Full mouse brain: approximately 71 million neurons.
  • Human brain: approximately 86 billion neurons, ~100 trillion synapses.

Each step up this hierarchy has required fundamental advances in both scanning resolution (to extract the wiring data) and computational methods (to model the dynamics). The Allen Institute result demonstrates that the 9-million-neuron scale is tractable. The question of whether the same approach holds at 71 million, or 86 billion, remains open, but the methodology is no longer hypothetical.
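The jumps in that hierarchy can be made explicit using the neuron counts from the list above:

```python
# Neuron counts from the hierarchy above; each ratio is the scale jump
# the scanning and simulation methodology must survive.
NEURONS = {
    "C. elegans": 302,
    "fruit fly": 125_000,
    "mouse cortex (this simulation)": 9_000_000,
    "full mouse brain": 71_000_000,
    "human brain": 86_000_000_000,
}

names = list(NEURONS)
for prev, nxt in zip(names, names[1:]):
    ratio = NEURONS[nxt] / NEURONS[prev]
    print(f"{prev} -> {nxt}: {ratio:,.0f}x")
```

The fly-to-mouse-cortex jump (72x) has now been demonstrated; the remaining jumps to full mouse and human scale are the open question the article's closing sections address.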

Implications for Whole Brain Emulation Research

Three things follow from this simulation for WBE research specifically.

First, the data pipeline works. The Allen Cell Types Database and Connectivity Atlas have been used to build something that runs and produces biologically plausible output. This validates the data collection investment made over the past decade.

Second, the computational abstraction level is sufficient for disease modeling. Whether it is sufficient for cognitive emulation requires further research, but the tools exist to ask the question rigorously.

Third, the international collaboration model is viable. Allen Institute (Seattle) + University of Electro-Communications (Tokyo) + Fugaku (Kobe) produced this result. WBE at human scale will require similar or larger international infrastructure, and this demonstrates that such collaborations can function.

What Arkhipov’s quote implies is that the field has passed from “we believe it is theoretically possible” to “we know it can be done, at least at this scale.” That shift in epistemic status matters for funding, for policy, and for the research programs that depend on both.

Path Forward

The Allen Institute team has not announced a timeline for scaling to a full mouse brain or beyond. The immediate next steps involve refining the disease modeling applications and expanding the simulation’s scope within the mouse cortex. The longer-term implications depend on whether the biological data quality can keep pace with the computational capacity.

Connectomics is the bottleneck, not compute. Getting the wiring diagram of a full mouse brain at synaptic resolution is a harder problem than running the simulation once you have it. Projects like SmartEM are addressing the scanning side of that bottleneck. When the data quality and simulation capability converge at the same scale, the milestones will come faster.


Official Sources