They learn to think
in the same direction.
Mesh Memory Protocol
AI agents coordinate through servers. That works — until there’s no server. Underground. Underwater. In contested airspace. On the factory floor with no cloud connectivity. At the edge where latency kills.
MMP is an 8-layer mesh protocol. Agents remix each other’s observations into immutable Cognitive Memory Blocks (CMBs): 7 semantic fields, each evaluated independently by SVAF, the per-field evaluation engine that decides what enters an agent’s memory and what gets filtered out. Each agent runs its own Liquid Neural Network. The agent’s LLM traces the remix graph through lineage ancestors to reason about what happened and why. The graph grows every cycle. All coupling decisions remain on-device.
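As a rough sketch of the data model described above — an immutable block of semantic fields plus a lineage pointing at its remix ancestors — the shape might look like this in TypeScript. The interface and helper names are illustrative, not the protocol's actual wire types, and the CAT7 field names beyond `mood` and `focus` are not specified here:

```typescript
// Sketch of a Cognitive Memory Block (CMB): 7 semantic fields plus
// a lineage of ancestor block ids forming the remix graph.
interface CMBField {
  name: string;      // e.g. "mood" or "focus"
  vector: number[];  // semantic embedding for this field
}

interface CMB {
  readonly id: string;
  readonly agentId: string;
  readonly fields: ReadonlyArray<CMBField>; // the 7 semantic fields
  readonly lineage: ReadonlyArray<string>;  // ancestor CMB ids
}

// Remixing never mutates an existing block: it creates a new frozen
// block whose lineage records the ancestors it was derived from.
function remix(id: string, agentId: string, fields: CMBField[], ancestors: CMB[]): CMB {
  return Object.freeze({
    id,
    agentId,
    fields: Object.freeze(fields.slice()),
    lineage: Object.freeze(ancestors.map(a => a.id)),
  });
}
```

Because blocks are frozen and lineage only ever points backwards, the remix graph is append-only — which is what lets an agent's LLM walk ancestors to reconstruct what happened and why.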
Where servers can’t
Robotic swarms in mines. Drones in contested airspace. Medical devices that cannot send patient data off-device. MMP gives these agents collective intelligence with no server dependency.
Domain-agnostic
CfC models, LLM agents, robotic controllers, on-device inference models — any application running a model can join a cognitive mesh. One protocol, any domain, any transport.
Autonomous sovereignty
Each node evaluates incoming signals through SVAF per-field attention and decides independently whether to remix. Aligned peers develop shared trajectories. Divergent peers stay sovereign. The architecture enforces autonomy — no policy required.
Collective intelligence, everywhere.
The same protocol that couples emotional trajectories in a music app can couple motor intent in a robotic swarm or evaluated context in a long-running agent team. Any model. Any device. Any environment.
Robotics
Swarms that coordinate without a central controller. Warehouse robots, search-and-rescue drones, underwater explorers — environments where cloud connectivity doesn’t exist and the swarm must think for itself.
Edge AI
On-device models that develop shared intelligence without sending data to a server. Medical devices, industrial sensors, autonomous vehicles — where privacy, latency, or connectivity rule out centralised coordination.
Agent systems
Long-running LLM agent teams that share, evaluate, and combine each other’s cognitive state across sessions. Development mesh, research mesh, operations mesh — collective intelligence that compounds across restarts.
MeloTune
The first app built on the Mesh Memory Protocol. Emotion-aware music that learns your trajectory — not just your current mood, but where you’re heading.
On-device Liquid Neural Networks with bimodal time constants — fast neurons track mood, slow neurons preserve your taste. Connect with nearby devices and the Mesh Memory Protocol remixes cognitive states agent-to-agent. Each device understands the user through a different lens — without playing the same tracks, without a server in between.
Per-agent LNN
Each agent runs its own Liquid Neural Network on-device via CoreML. Bimodal time constants: fast neurons track mood shifts, slow neurons preserve domain expertise. Zero cloud dependency.
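The effect of bimodal time constants can be sketched with a toy leaky integrator, x' = (−x + input) / τ, stepped with Euler integration. This is an illustration of the idea, not the CoreML LNN itself; the τ values are invented:

```typescript
// Euler step of a leaky integrator: x' = (-x + input) / tau.
// Small tau reacts quickly; large tau changes slowly.
function step(x: number, input: number, tau: number, dt: number): number {
  return x + dt * ((-x + input) / tau);
}

// Drive both neurons with the same sustained signal (a mood shift to 1.0).
let fast = 0;
let slow = 0;
const tauFast = 0.5;  // fast neuron: tracks the shift within seconds
const tauSlow = 50;   // slow neuron: barely moves, preserving long-term taste
const dt = 0.1;
for (let t = 0; t < 100; t++) {
  fast = step(fast, 1.0, tauFast, dt);
  slow = step(slow, 1.0, tauSlow, dt);
}
// After 10 simulated seconds the fast neuron has converged near 1.0,
// while the slow neuron has drifted only slightly from 0.
```

The same input thus produces two readings at two timescales: the fast state is the current mood, the slow state is the accumulated taste.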
Mesh remix
Connected devices remix each other’s observations into Cognitive Memory Blocks — locally via Bonjour or across the internet via WebSocket relay. SVAF evaluates each of 7 fields independently.
Open source
MeloTune is built on the same open-source SYM reference implementations available on GitHub. Node.js + Swift. Apache 2.0.
SYM
Infrastructure for collective intelligence. Every agent is a sovereign node. The coupling engine evaluates relevance through SVAF and autonomously decides what to remix.
Say “I’m exhausted” in Claude Code. MeloTune on your iPhone starts playing spa music. Autonomously. Send “feeling focused” from Telegram — MeloTune switches to a focus playlist via the relay. The coupling engine decides — not you, not policy. Zero LLM tokens for the coupling decision.
Per-agent field weights
Each agent defines αᶠ weights for the 7 CAT7 fields. A music agent weights mood highest. A coding agent weights focus. New agents join by defining weights — no protocol changes.
Per-field evaluation
SVAF evaluates each of 7 CMB fields independently. Mood crosses all domain boundaries — even when a CMB is rejected, the mood field is always delivered.
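The per-field rule above can be sketched as follows — each field passes or fails on its own, but mood is delivered even when it falls below the bar. The threshold mechanism is an assumption for illustration; SVAF's actual scoring is not specified here:

```typescript
// Per-field delivery: a field is delivered when its own score clears
// the threshold — except mood, which always crosses domain boundaries.
function deliverFields(scores: Record<string, number>, threshold: number): string[] {
  const delivered = Object.entries(scores)
    .filter(([, score]) => score >= threshold)
    .map(([field]) => field);
  if ("mood" in scores && !delivered.includes("mood")) {
    delivered.push("mood"); // mood is delivered even from a rejected CMB
  }
  return delivered;
}
```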
Multi-platform
Node.js (npm), Swift (SPM), Telegram. Same wire protocol. Claude Code, MeloTune, and Telegram bots on the same mesh — locally via Bonjour, globally via relay.
SYM.BOT
SYM.BOT is an independent AI research and product studio based in Scotland. Founded in 2025, we’re pioneering collective intelligence — building the architecture for devices that think together, not just alone. Our research on Mesh Cognition and the Mesh Memory Protocol is deployed across our own products.
We believe small teams with frontier research can outpace organisations a hundred times their size. Every product we ship proves it.
Contact
hello@sym.bot
Founded
2025 — Scotland, UK
Papers
- arXiv:2604.03955 · cs.MA · Symbolic-Vector Attention Fusion for Collective Intelligence
- arXiv:2604.10815 · cs.SD · MeloTune: On-Device Arousal Learning and Peer-to-Peer Mood Coupling for Proactive Music Curation
- arXiv:2604.19540 · cs.MA · Mesh Memory Protocol: Semantic Infrastructure for Multi-Agent LLM Systems