Composable multi-agent runtime for software engineering
One Runtime.
Every Agent.
PanCode is a composable coding agent runtime for multi-agent orchestration across local LLMs, cloud APIs, and mixed fleets, all driven from a single terminal control plane.
Not a chatbot. Not a plugin. The runtime. PanCode is not just a Claude Code alternative. It is the orchestration layer above Claude Code, Codex CLI, Gemini CLI, and local workers.
$ pancode --preset hybrid
booting control plane...
discover scanning PATH for runtimes
ok claude-code CLI anthropic/claude-sonnet-4
ok codex CLI openai/codex-mini
ok gemini-cli CLI google/gemini-2.5-pro
ok opencode CLI anthropic/claude-sonnet-4
ok native-worker NATIVE local/llama3.3-70b
load domains
agents dispatch safety prompts session observability scheduling configure ui
summary 7 adapters 10 domains receipts on budget live
ready pancode v0.3.0-exp
[build] dispatch ready
- 7 Runtime Adapters
- 10 Composable Domains
- 4 Orchestrator Modes
- 35+ Commands
The Pan Taxonomy
Multi-agent orchestration needs a shared operating model.
PanCode does not win by being one more agent. It wins by giving heterogeneous agent fleets one runtime, one policy model, and one operational surface for local LLM orchestration and cross-provider dispatch.
Pan-provider
Route local engines, frontier APIs, or hybrid fleets without standardizing on a single vendor.
Pan-model
Match model capability, latency, and cost to the task instead of hard-wiring one model into the workflow.
Pan-runtime
Run native workers, SDK-backed agents, or headless CLIs under one worker contract and one control plane.
Pan-agent
Discover installed coding agents automatically and operate them as a coordinated fleet instead of isolated tools.
Pan-safety
Apply a shared safety model across runtimes so scope, privileges, and escalation rules stay coherent.
Pan-observe
Track cost, tokens, turns, receipts, and runtime status from the same terminal session that launches the work.
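The taxonomy above hinges on one idea: every runtime, from a headless CLI to a native local worker, plugs in through the same worker contract. A minimal sketch of what such a contract could look like, with `WorkerAdapter` and its fields invented here for illustration rather than taken from PanCode's actual API:

```typescript
// Illustrative sketch only: WorkerAdapter and its shape are hypothetical,
// not PanCode's published interface.
type WorkerKind = "native" | "sdk" | "cli";

interface WorkerAdapter {
  id: string;              // e.g. "claude-code", "native-worker"
  kind: WorkerKind;
  model: string;           // e.g. "anthropic/claude-sonnet-4"
  capabilities: string[];  // e.g. ["edit", "review"]
  run(task: string): Promise<string>;
}

// Any runtime that satisfies the contract joins the fleet the same way,
// regardless of provider or process model.
const localWorker: WorkerAdapter = {
  id: "native-worker",
  kind: "native",
  model: "local/llama3.3-70b",
  capabilities: ["edit", "review"],
  async run(task) {
    return `handled: ${task}`;
  },
};

const fleet: WorkerAdapter[] = [localWorker];
```

Because the contract is structural, a cloud SDK agent and a local llama worker are interchangeable from the orchestrator's point of view.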
How PanCode Works
A coding agent runtime from discovery to observability.
The system is designed like infrastructure, not like a chat window. Each step narrows ambiguity and increases operator control across multi-agent orchestration, safety policy, and agent dispatch.
- 01
Discover agents
Scan the machine for installed runtimes and register each worker with one consistent capability model.
- 02
Configure the fleet
Adjust presets, safety, models, and budgets through conversation instead of bouncing across config files.
- 03
Dispatch with control
Route work by capability, cost, and privilege scope while keeping worker execution isolated and reproducible.
- 04
Observe everything
Read receipts, cost, runtime health, and status updates from one terminal-native control surface.
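Step 03, routing by capability, cost, and privilege scope, can be sketched as a small selection function. The `Worker` shape and `pickWorker` helper here are assumptions made for illustration:

```typescript
// Hypothetical routing sketch: choose the cheapest worker that has the
// required capability and a sufficient privilege scope.
interface Worker {
  id: string;
  capabilities: string[];
  scope: "readonly" | "auto-edit";
  costPer1kTokens: number;
}

function pickWorker(
  fleet: Worker[],
  need: { capability: string; scope: "readonly" | "auto-edit" },
): Worker | undefined {
  return fleet
    // Capability must match the task.
    .filter((w) => w.capabilities.includes(need.capability))
    // A readonly task can run anywhere; an editing task needs auto-edit scope.
    .filter((w) => need.scope === "readonly" || w.scope === "auto-edit")
    // Among eligible workers, prefer the cheapest.
    .sort((a, b) => a.costPer1kTokens - b.costPer1kTokens)[0];
}

const fleet: Worker[] = [
  { id: "reviewer.codex", capabilities: ["review"], scope: "readonly", costPer1kTokens: 0.5 },
  { id: "builder.claude", capabilities: ["review", "edit"], scope: "auto-edit", costPer1kTokens: 3.0 },
];
```

A review task lands on the cheap readonly worker; an edit task is forced onto the worker with write privileges, even though it costs more.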
Feature Highlights
The capabilities that make the runtime defensible.
PanCode is opinionated where the orchestration layer has to be opinionated: policy, dispatch, and stateful control over the fleet.
Constitutional Prompt System
Prompt policy is compiled before any worker sees the task.
PanCode assembles prompts from audited fragments, then runs boundary and constitutional checks before execution. Modes, safety scopes, and runtime adapters inherit the same behavioral contract instead of drifting apart over time.
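The core mechanic, resolve every referenced fragment before a worker sees the compiled prompt, can be sketched in a few lines. The fragment names mirror the `check-prompts` output, but `compilePrompt` and the fragment store are invented here for illustration:

```typescript
// Sketch of fragment-based prompt compilation; the API is hypothetical.
const fragments: Record<string, string> = {
  "mode/build": "You are in build mode. Make the requested change.",
  "safety/auto-edit": "Only modify files inside the project workspace.",
};

function compilePrompt(refs: string[]): string {
  // Boundary check: every referenced fragment must resolve, or the
  // compile fails before any worker is dispatched.
  for (const ref of refs) {
    if (!(ref in fragments)) throw new Error(`unresolved fragment: ${ref}`);
  }
  return refs.map((r) => fragments[r]).join("\n\n");
}
```

Failing at compile time is the point: a worker never runs against a prompt whose safety or mode fragments did not resolve.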
$ npm run check-prompts
[pass] fragments resolved: mode/build + safety/auto-edit
[pass] constitutional references intact
[pass] worker prompt compiled
Dispatch Pipeline
Every run leaves behind a receipt you can reason about.
Dispatch is not a black box. PanCode records worker identity, scope, tokens, cost, and outcomes so operators can understand what happened, replay decisions, and tune the fleet with evidence instead of guesswork.
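A plausible shape for such a receipt, with field names inferred from the transcript below rather than taken from a published schema:

```typescript
// Hypothetical receipt shape; fields are inferred from the example
// dispatch transcript, not a documented PanCode format.
interface DispatchReceipt {
  runId: string;                  // e.g. "rec_20260323_1841"
  worker: string;                 // e.g. "reviewer.codex"
  scope: "readonly" | "auto-edit";
  tokens: number;
  costUsd: number;
  outcome: "ok" | "error";
}

function formatReceipt(r: DispatchReceipt): string {
  return `${r.runId} ${r.worker} ${r.scope} $${r.costUsd.toFixed(2)} ${r.tokens} tokens ${r.outcome}`;
}
```

Because each run serializes to a flat record, receipts can be grepped, diffed, and aggregated like any other operational log.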
/dispatch reviewer src/core/config.ts
run rec_20260323_1841
worker reviewer.codex readonly
cost $0.14 28944 tokens
result receipt stored in .pancode/runtime/results/
Conversational Configuration
Change the runtime in plain English and keep the state on disk.
Presets, worker models, budgets, and orchestration behavior can be adjusted through conversation, then persisted to the same project and user configuration files PanCode already owns. The interface feels immediate because it is grounded in real runtime state.
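The mechanism, a conversational change becomes a keyed write into persistent config, can be sketched as a dotted-path setter. The key path mirrors the transcript below; `setPreset` itself is invented for illustration:

```typescript
// Minimal sketch of "conversation edits config, config lands on disk".
// setPreset is hypothetical; the dotted key mirrors the example session.
type Config = Record<string, unknown>;

function setPreset(config: Config, dottedKey: string, value: string): Config {
  const keys = dottedKey.split(".");
  let node: any = config;
  // Walk (and create) intermediate objects down to the parent key.
  for (const k of keys.slice(0, -1)) {
    node[k] = node[k] ?? {};
    node = node[k];
  }
  node[keys[keys.length - 1]] = value;
  return config; // the caller would then persist this, e.g. as YAML
}
```

Keeping the state on disk, rather than only in the conversation, is what makes the change survive a restart of the control plane.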
You: "Switch the worker pool to local codellama"
PanCode: updated preset.hybrid.workerModel
PanCode: wrote ~/.pancode/panpresets.yaml
PanCode: reloading worker pool
Architecture
Five layers. Clear boundaries.
The stack is composable by design. CLI entry, core orchestration, and runtime integration stay separate so the system scales without becoming opaque.
- 01
CLI
Commands, session control, and the terminal entry surface for operators.
- 02
Entry
Boot orchestration, environment setup, and runtime startup sequencing.
- 03
Core
Config, presets, event bus, agent profiles, and domain loading.
- 04
Domains
Safety, prompts, dispatch, observability, scheduling, session, and UI capabilities.
- 05
Engine
Runtime adapters and the sole boundary for SDK-facing integration.
Engine imports stay behind a single boundary. Worker processes stay isolated. Domains load in dependency order and communicate through an event bus instead of mutating each other directly.
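The two mechanisms described above, dependency-ordered domain loading and event-bus communication, can be sketched together. The `Domain` shape, `loadOrder`, and `EventBus` are illustrative names, not PanCode internals:

```typescript
// Illustrative sketch: domains declare dependencies and are loaded in
// dependency order, then communicate via events rather than direct calls.
interface Domain { name: string; deps: string[] }

function loadOrder(domains: Domain[]): string[] {
  const order: string[] = [];
  const seen = new Set<string>();
  const byName = new Map(domains.map((d) => [d.name, d]));
  const visit = (d: Domain) => {
    if (seen.has(d.name)) return;
    seen.add(d.name);
    d.deps.forEach((dep) => visit(byName.get(dep)!)); // load deps first
    order.push(d.name);
  };
  domains.forEach(visit);
  return order;
}

class EventBus {
  private handlers = new Map<string, ((payload: unknown) => void)[]>();
  on(event: string, fn: (payload: unknown) => void) {
    this.handlers.set(event, [...(this.handlers.get(event) ?? []), fn]);
  }
  emit(event: string, payload: unknown) {
    (this.handlers.get(event) ?? []).forEach((fn) => fn(payload));
  }
}
```

With this split, the dispatch domain can announce "run finished" on the bus and the observability domain can record it, without either importing the other.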
Quick Start
Local LLM fleet management in 60 seconds.
Install the runtime, boot a local preset, and manage a local LLM fleet without giving up receipts, safety modes, or provider-agnostic dispatch. Node.js 20+ and tmux are the only prerequisites.
$ npm install -g pancode
$ pancode --preset local
$ pancode up