Works with any tool that supports custom base URLs

Your Coding Tools, Same Brain

Code in Cursor during the day, chat in OpenClaw at night — unified memory across every tool. No plugins. No patches. Just a base URL.

How It Works

MemoryRouter runs as a proxy between your coding tool and the LLM provider (Anthropic, OpenAI, etc.).

Your Tool ──▶ MemoryRouter ──▶ LLM Provider
  1. Your tool sends a request to MemoryRouter instead of directly to the provider
  2. MemoryRouter searches your memory vault for relevant context
  3. Memories are injected into the system prompt — seamlessly, in milliseconds
  4. The request is forwarded to the real provider with your memories included
  5. The response streams back to your tool as normal
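The injection step above can be sketched in a few lines. This is an illustrative sketch, not MemoryRouter's actual code: it assumes an OpenAI-style `messages` payload and takes the retrieved memories as a plain list of strings.

```python
def inject_memories(request: dict, memories: list[str]) -> dict:
    """Prepend retrieved memories to an OpenAI-style chat payload
    as an extra system message, leaving the original request untouched."""
    context = "\n".join(f"- {m}" for m in memories)
    preamble = {"role": "system", "content": f"Relevant memories:\n{context}"}
    return {**request, "messages": [preamble, *request.get("messages", [])]}
```

After injection, the enriched payload is forwarded to the real provider unchanged, so the tool on the other end never needs to know the proxy exists.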

Read-Only Keys

Coding sessions generate tons of noise — file diffs, tool calls, error traces. You don't want that in your memory vault. That's why we recommend read-only keys for coding tools.

Same key, different prefix

# Your regular key (read + write)

mk_40a267f...

# Read-only version (just add ro_ after mk_)

mk_ro_40a267f...

Read-only keys inject memories into every request but never store coding sessions back. Your vault stays clean. No new database entry, no dashboard toggle — it's just a naming convention.
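Since the read-only variant is purely a naming convention, deriving one is trivial. A minimal sketch (the key value below is a made-up placeholder):

```python
def to_read_only(key: str) -> str:
    """Derive the read-only variant by inserting 'ro_' after the 'mk_' prefix."""
    if not key.startswith("mk_"):
        raise ValueError("not a MemoryRouter key")
    if key.startswith("mk_ro_"):
        return key  # already read-only
    return "mk_ro_" + key[len("mk_"):]

def is_read_only(key: str) -> bool:
    return key.startswith("mk_ro_")
```

Hand the derived key to any coding tool and the vault stays write-protected from that tool's traffic.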

Setup Guides

OpenAI Codex

$ export OPENAI_BASE_URL=https://api.memoryrouter.ai/v1
$ export OPENAI_API_KEY=mk_ro_40a267f...

That's it. Codex now has access to your entire memory vault.

Claude Code

$ export ANTHROPIC_BASE_URL=https://api.memoryrouter.ai/v1
$ export ANTHROPIC_API_KEY=mk_ro_40a267f...

Claude Code picks up the base URL automatically. Your memories flow into every conversation.

Cursor

In Cursor Settings → Models → OpenAI API Key:

Base URL: https://api.memoryrouter.ai/v1

API Key: mk_ro_40a267f...

Works with both the OpenAI and Anthropic provider options in Cursor.

Windsurf

In Windsurf settings, configure a custom provider:

Base URL: https://api.memoryrouter.ai/v1

API Key: mk_ro_40a267f...

Aider, Cody, Continue & Others

Any tool that lets you set a custom base URL works the same way:

Base URL: https://api.memoryrouter.ai/v1

API Key: mk_ro_40a267f...

If it talks to OpenAI or Anthropic, it works with MemoryRouter.

The Big Picture

Every tool feeds into the same memory vault. Chat with your AI assistant at night, code with Cursor during the day — your AI always knows the full picture.

OpenClaw ──── mk_40a267f... ──── read + write

Codex ───── mk_ro_40a267f... ── read only

Claude Code ─ mk_ro_40a267f... ── read only

Cursor ───── mk_ro_40a267f... ── read only

↕ All pointing to the same vault ↕

FAQ

Do I need a separate MemoryRouter account for each tool?

No. One account, one vault. Use your regular key (mk_40a267f...) for tools where you want to store conversations, and the read-only version (mk_ro_40a267f...) for coding tools.

Does this add latency?

Negligible. Memory search runs in milliseconds. The proxy adds a single hop — you won't notice it.

What if I want coding sessions stored too?

Use your regular key (mk_40a267f...) instead of the read-only version. Everything gets stored. We just recommend read-only for coding tools since file diffs and tool calls tend to add noise.

Does this work with my own API keys?

Yes. MemoryRouter forwards requests to the model provider using your credentials. Your API key goes in the x-provider-key header, or you can configure it in your dashboard.
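A request with your own credentials attached can be sketched with the standard library. The `x-provider-key` header name comes from this doc; the `/chat/completions` path and payload shape are assumed from the standard OpenAI API, and both key values below are made-up placeholders.

```python
import json
import urllib.request

def build_request(mr_key: str, provider_key: str, payload: dict) -> urllib.request.Request:
    """Build (but do not send) a chat request routed through MemoryRouter.
    Authorization carries the MemoryRouter key; x-provider-key carries
    your own credential for the upstream model provider."""
    return urllib.request.Request(
        "https://api.memoryrouter.ai/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {mr_key}",
            "x-provider-key": provider_key,
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

MemoryRouter strips the `x-provider-key` header and reuses it as the credential when forwarding upstream, so the provider bills your account directly.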