OpenClaw Compatibility

LocalGPT started as a spiritual successor to OpenClaw, built from scratch in Rust. While the two projects have since diverged into fundamentally different tools, LocalGPT uses the same workspace file formats and conventions, so existing OpenClaw data can be reused.

Config

OpenClaw uses ~/.openclaw/config.json5. LocalGPT uses ~/.config/localgpt/config.toml.

Create your LocalGPT config manually. Here is a mapping of the most common settings:

OpenClaw (config.json5)          LocalGPT (config.toml)
agents.defaults.model            agent.default_model
agents.defaults.workspace        memory.workspace
agents.defaults.contextWindow    agent.context_window
models.openai.apiKey             providers.openai.api_key
models.anthropic.apiKey          providers.anthropic.api_key
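The table above can be sketched as a small translation script. This is an illustrative sketch only, not a supported migration tool: it covers just the keys listed in the table, and it assumes your OpenClaw config happens to be plain JSON (JSON5 extras such as comments and trailing commas would need a third-party JSON5 parser).

```python
import json

# Key mapping from the table above:
# OpenClaw dotted path -> LocalGPT dotted path.
KEY_MAP = {
    "agents.defaults.model": "agent.default_model",
    "agents.defaults.workspace": "memory.workspace",
    "agents.defaults.contextWindow": "agent.context_window",
    "models.openai.apiKey": "providers.openai.api_key",
    "models.anthropic.apiKey": "providers.anthropic.api_key",
}

def dig(data, dotted):
    """Follow a dotted path into nested dicts; return None if absent."""
    for part in dotted.split("."):
        if not isinstance(data, dict) or part not in data:
            return None
        data = data[part]
    return data

def migrate(openclaw_cfg: dict) -> dict:
    """Return {localgpt_dotted_key: value} for every mapped key present."""
    out = {}
    for src, dst in KEY_MAP.items():
        value = dig(openclaw_cfg, src)
        if value is not None:
            out[dst] = value
    return out

# Hypothetical input, shaped like the paths in the table:
example = json.loads('{"agents": {"defaults": {"model": "claude-cli/opus"}}}')
print(migrate(example))  # -> {'agent.default_model': 'claude-cli/opus'}
```

The output is a flat dotted-key dict; you would still write the corresponding TOML by hand, as shown in the example config below.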

Example LocalGPT config:

[agent]
default_model = "claude-cli/opus"
context_window = 128000

[providers.anthropic]
api_key = "${ANTHROPIC_API_KEY}"

[providers.claude_cli]
command = "claude"

Run localgpt config show to verify your configuration after creating the file.

Workspace files

OpenClaw workspace files are plain Markdown and fully compatible. Copy them directly:

cp -r ~/.openclaw/workspace/* ~/.local/share/localgpt/workspace/
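Note that the bare cp above fails if the LocalGPT workspace directory does not exist yet, and the * glob skips dotfiles. A slightly safer sketch (using the default paths from this page; adjust if your installs differ):

```shell
# Sketch: create the destination first, copy hidden files too, and
# skip cleanly when there is no OpenClaw workspace to migrate.
migrate_workspace() {
  src="$1"
  dst="$2"
  [ -d "$src" ] || { echo "no workspace at $src; nothing to do"; return 0; }
  mkdir -p "$dst"
  cp -R "$src"/. "$dst"/
}

migrate_workspace "$HOME/.openclaw/workspace" \
                  "$HOME/.local/share/localgpt/workspace"
```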

This includes:

File                 Purpose
MEMORY.md            Long-term curated knowledge
HEARTBEAT.md         Pending autonomous tasks
SOUL.md              Persona and tone guidance
USER.md              User profile
IDENTITY.md          Agent identity
TOOLS.md             Tool notes
AGENTS.md            Operating instructions
memory/*.md          Daily logs
knowledge/**/*.md    Knowledge repository
skills/*/SKILL.md    Custom skills

LocalGPT will rebuild the memory index automatically on first run.

Session data

Session transcripts and metadata can be copied as-is:

cp -r ~/.openclaw/agents ~/.local/share/localgpt/agents

This preserves your conversation history, session IDs, and CLI session mappings.

Key differences

LocalGPT takes a different approach from OpenClaw in several areas:

  • Bridge-based integrations — Telegram, Discord, and WhatsApp are supported via standalone bridge binaries in the bridges/ directory that connect to the daemon over secure IPC, rather than via built-in channels
  • No plugin/extension system — LocalGPT uses a simpler skills-based approach
  • No gateway routing — single-agent, local-first design with bridge daemons instead of a multi-channel gateway
  • Embedded web UI — browser-based chat interface served directly from the binary, plus an optional desktop GUI (egui)
  • No subagent spawning — single "main" agent

Everything else — memory, heartbeat, skills, session management — works the same way.

📝 These docs are AI-generated on a best-effort basis and may not be 100% accurate. Found an issue? Please open a GitHub issue or edit this page directly to help improve the project.