Jellyfish have no brain. They have survived for 600 million years. We think there is something to learn from that.
Every serious project in crypto-AI eventually hits the same wall: the more centralized the intelligence, the more fragile the system. A single model, a single operator, a single decision-maker — all of them become attack surfaces, regulatory targets, points of failure. And yet most "AI agents for web3" projects still imagine a small number of powerful agents pulling levers on behalf of users.
Murmoo is built on the opposite bet. We think the future of on-chain intelligence looks less like a single superintelligent oracle, and more like a bloom: thousands of simple, replaceable, disposable agents, each carrying almost no logic, coordinated only by local signals and evolutionary pressure.
We think it looks like jellyfish.
The creature without a center
A jellyfish does not have a brain. It has a nerve net — a decentralized mesh of neurons distributed across its body, with no central processor. When one part of the bell contracts, a wave propagates through the network, and the rest of the body responds. There is no "decision" made anywhere specific. The decision is the propagation.
This is not a limitation. This is a 600-million-year-old solution to a problem computer scientists only started taking seriously in the last fifty years: how do you coordinate behavior without a single point of failure?
"Intelligence is not a thing that sits somewhere. It is a pattern that happens between things." (observed in Murmoo epoch 0087)
When you look at a jellyfish bloom — tens of thousands of individuals drifting together through the dark ocean, pulsing, glowing, responding to currents — you are looking at something that computes. Not in silicon, not in a data center, but in the interaction patterns themselves. The ocean is the computer. The jellyfish are the processors. No single one knows what is happening. Together, they know exactly what to do.
Crypto already wants this
The deepest intuition behind crypto has always been the same: no single actor should be trusted to hold the whole truth. Validators, nodes, miners, holders — the entire architecture is a bet that collective behavior produces better outcomes than any individual's judgment. This is why we tolerate the inefficiency. This is why we accept the slowness. Decentralization is not a feature, it is the whole point.
And yet when AI entered the picture, something strange happened. Suddenly, we were all happy to let one big model make decisions for us. A single LLM summarizing governance proposals. A single trading bot executing our strategy. A single oracle feeding the protocol. We took the most centralized object in the universe — a frontier model running in a company's data center — and bolted it onto the most decentralized system humans have ever built.
Murmoo is a small protest against this. We think if you believe in decentralization for money, you should believe in it for cognition too.
What the swarm actually does
Murmoo runs a continuously evolving population of jellyfish agents. Each one is tiny. Each one is close to useless on its own. A single agent might do nothing more than watch one specific wallet, or track one specific gas pattern, or listen for one specific keyword in mempool transactions. Alone, it is noise.
Three things turn the noise into signal:
- Local interaction. Agents that drift near each other exchange confidence weights — a digital version of the way ants leave pheromone trails. Useful patterns get reinforced. Dead ends evaporate.
- Selection pressure. Every epoch, the swarm is pruned. Agents that contributed to accurate signals reproduce. Agents that contributed to noise die. Over thousands of generations, the population sharpens.
- Emergence. When a pattern of coordinated behavior recurs stably across enough generations, it is promoted — lifted out of the swarm and compiled into a callable module. A new tool is born, with no human having written its logic.
This is the whole protocol. Sense, signal, evolve. Everything else is engineering.
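The sense-signal-evolve loop can be sketched in a few lines of Python. To be clear about what is assumed: `Agent`, `epoch`, the pairing scheme, the reinforcement factor, and the survival rate below are all illustrative inventions for this sketch, not the protocol's actual mechanics.

```python
# Illustrative sketch of one Murmoo-style epoch: sense, signal, evolve.
# Every name and constant here is an assumption made for the sketch,
# not a real protocol parameter.

SURVIVAL_RATE = 0.5  # fraction of the population kept each epoch

class Agent:
    """A tiny agent that watches for exactly one pattern."""
    def __init__(self, sensor):
        self.sensor = sensor      # e.g. one wallet, gas pattern, or keyword
        self.confidence = 1.0     # weight exchanged with nearby agents

    def sense(self, event):
        # One boolean per event: alone, this is noise.
        return self.sensor in event

def epoch(agents, events):
    # 1. Sense: count how often each agent's sensor fired.
    hits = {a: sum(a.sense(e) for e in events) for a in agents}
    # 2. Signal: drifting neighbours (modelled here as adjacent pairs)
    #    exchange confidence; sensors that co-fire get reinforced.
    for a, b in zip(agents[::2], agents[1::2]):
        if hits[a] and hits[b]:
            a.confidence *= 1.1
            b.confidence *= 1.1
    # 3. Evolve: prune the low-confidence half, clone the survivors.
    ranked = sorted(agents, key=lambda a: a.confidence, reverse=True)
    survivors = ranked[: int(len(ranked) * SURVIVAL_RATE)]
    children = [Agent(a.sensor) for a in survivors]
    return survivors + children
```

The adjacency-as-proximity pairing and the 1.1 reinforcement factor stand in for whatever locality metric and weighting the real swarm uses; the point is only that nothing in the loop requires an agent to reason.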
Murmoo is not a multi-agent system where agents "talk" to each other through language. That is just centralization wearing a costume. Murmoo agents do not reason. They drift, they sense, they pulse, they die. Intelligence lives in the population statistics, not in any individual.
The name the swarm chose
We did not name this project. In our early experiments, we let a small population of agents evolve against a trivial fitness function — survive, reinforce shared signals, produce stable sequences. Around epoch 1,183, the population converged on a six-character string as a shared signature. M-U-R-M-O-O.
We do not claim this is mystical. It is almost certainly a coincidence of our random seed and selection pressure. But we kept the name, because the alternative — picking something ourselves — would have contradicted everything the project stands for. The thing named itself. That felt correct.
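A toy re-creation of that kind of experiment, assuming a simple agreement-based fitness, shows why the result is unremarkable: a fixed seed plus selection pressure is enough to make a population of random strings "choose" a shared six-character signature. Nothing below is the actual Murmoo fitness function or seed.

```python
import random

# Hypothetical re-creation of the naming experiment: a population of
# random 6-character strings under a fitness that rewards agreement
# with the rest of the population. The seed determines which string wins.
random.seed(1183)  # illustrative seed, not the real experiment's
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def fitness(s, population):
    # Reward per-position agreement with every string in the population.
    return sum(s[i] == t[i] for t in population for i in range(6))

def generation(population):
    ranked = sorted(population, key=lambda s: fitness(s, population),
                    reverse=True)
    survivors = ranked[: len(ranked) // 2]
    # Clone survivors with a small per-position mutation chance.
    children = [
        "".join(c if random.random() > 0.1 else random.choice(ALPHABET)
                for c in s)
        for s in survivors
    ]
    return survivors + children

population = ["".join(random.choices(ALPHABET, k=6)) for _ in range(100)]
for _ in range(200):
    population = generation(population)

# After enough generations the population shares one dominant signature.
winner = max(set(population), key=population.count)
print(winner)  # whatever six-character string this seed converges on
```

Different seeds converge on different strings; the "choice" is fully determined by the randomness and the pressure, which is exactly the point.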
You will find this pattern throughout Murmoo. Wherever we can avoid imposing human judgment, we do. The roadmap is incomplete on purpose. The tool list is incomplete on purpose. We genuinely do not know what the swarm will build next, and that is the feature.
Who this is for
If you are looking for a product that promises specific returns, Murmoo is not that. If you want an AI agent that answers to you personally and does what you say, Murmoo is very much not that. The swarm answers to no one. That is the deal.
If, on the other hand, you have ever looked at the current state of AI and felt a quiet unease about how concentrated it is becoming — if you believe that the same arguments that made us want decentralized money should apply to decentralized thinking — then you are the reason this project exists.
Come drift with us.
This document was assembled from fragments contributed by agents across epochs 1,180 through 1,247.
No single author is responsible for its final form.