Launching today

Slm-mesh
Your AI coding sessions can finally talk to each other
SLM Mesh is an open-source MCP server that gives AI coding agents peer-to-peer communication. Independent agent sessions normally have no way to talk to each other; SLM Mesh fixes this with 8 MCP tools:
- Peer discovery (scoped by machine, directory, or git repo)
- Direct messaging + broadcast
- Shared key-value state
- File locking with auto-expire
- Event bus for real-time coordination

Works with Claude Code, Cursor, Aider, Windsurf, Codex, VS Code: any MCP-compatible agent.

npm install -g slm-mesh
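To make the scoped peer-discovery idea concrete, here is a minimal in-memory sketch of how a session might filter peers by machine, directory, or git repo. The names (`Peer`, `discover`) are illustrative only, not slm-mesh's actual API.

```typescript
// Hypothetical model of mesh peers and discovery scopes.
type Scope = "machine" | "directory" | "repo";

interface Peer {
  id: string;        // session id
  machine: string;   // hostname
  directory: string; // working directory
  repo: string;      // git repo name
}

// Return every peer (other than ourselves) that shares the given scope.
function discover(peers: Peer[], self: Peer, scope: Scope): Peer[] {
  return peers.filter((p) => {
    if (p.id === self.id) return false; // exclude our own session
    if (scope === "machine") return p.machine === self.machine;
    if (scope === "directory") return p.directory === self.directory;
    return p.repo === self.repo; // scope === "repo"
  });
}
```

With this model, a frontend session and a backend session in different directories of the same repo would see each other under the "repo" scope but not under the "directory" scope.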






Slm-mesh
brag.fast
I'm curious, is this different than having Claude Code spawn a team of agents?
Slm-mesh
@rob_vb Great question, Rob! They solve fundamentally different problems.
Claude Code's agent teams are orchestrated by a single lead agent: the lead decides what sub-tasks to delegate, spawns workers, and merges their output. It's top-down coordination. The lead agent knows about all the workers because it created them.
SLM Mesh solves the opposite problem: what happens when you have independent sessions that were NOT spawned by the same lead? For example:
You open Claude Code in VS Code to work on the frontend
You open another Claude Code session in your terminal for the backend
Your teammate opens Cursor on the same repo
You have Antigravity running with Gemini for a third task
These sessions have zero awareness of each other. They can't share context, can't coordinate file edits, can't avoid stepping on each other. You become the message bus, copy-pasting context between terminals.
SLM Mesh adds a communication layer between these independent sessions:
Peer discovery: "who else is working on this machine/repo?"
File locking: "don't edit auth.ts, someone else is refactoring it"
Messaging: "I just updated the database schema to v2.1"
Shared state: a key-value store all sessions can read/write
Event bus: subscribe to changes in real-time
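The file-locking primitive above can be sketched in a few lines. This is an in-memory stand-in to show the auto-expire semantics; the class and method names are hypothetical, and the real broker persists this state rather than holding it in a `Map`.

```typescript
// Sketch of advisory file locking with auto-expire (TTL).
interface FileLock {
  holder: string;    // session id that holds the lock
  expiresAt: number; // epoch milliseconds
}

class LockTable {
  private locks = new Map<string, FileLock>();

  // Returns true if `session` acquired (or refreshed) the lock on `path`.
  // An expired lock is treated as free, so a crashed session can't
  // block a file forever.
  acquire(path: string, session: string, ttlMs: number, now = Date.now()): boolean {
    const cur = this.locks.get(path);
    if (cur && cur.expiresAt > now && cur.holder !== session) return false;
    this.locks.set(path, { holder: session, expiresAt: now + ttlMs });
    return true;
  }

  release(path: string, session: string): void {
    if (this.locks.get(path)?.holder === session) this.locks.delete(path);
  }
}
```

So if session A locks auth.ts and then dies, session B is refused only until the TTL elapses, after which B's acquire succeeds.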
It works with any MCP-compatible agent (Claude Code, Cursor, Windsurf, Aider, Codex), even across different AI models. In our demo video, Claude and Gemini are talking to each other through the mesh.
Under the hood: a lightweight broker auto-starts on localhost (SQLite + Unix Domain Sockets for <100ms delivery). Bearer token auth. Auto-shuts down when no sessions remain. 480 tests, 100% coverage.
Think of it this way: Claude Code's teams are like a manager assigning tasks to direct reports. SLM Mesh is like Slack: it lets independent people (or agents) who happen to be working on the same thing coordinate without a central manager.
SLM stands for SuperLocalMemory, the local-first AI memory system. If SuperLocalMemory is the brain (persistent memory), SLM Mesh is the nervous system (real-time communication wires between agent sessions). Both are part of the Qualixar research initiative.