Upload notes, docs, journals: anything you want your AI to know. Persistent memory across every session, every model, every tool.
$ curl -fsSL https://memoryrouter.ai/install.sh | sh
One binary. Zero dependencies. No Node, no Python, no package manager.
Or install via npm: npm install -g @memoryrouter/cli
# 1. Authenticate with your free memory key
$ memoryrouter auth mk_your_key_here
# 2. Upload your brain: docs, notes, code, anything
$ memoryrouter upload ./my-docs/
# 3. That's it. Your AI now knows everything you uploaded.
# Every AI call through MemoryRouter now has access to your vault.
Without uploading, your AI starts with an empty vault and only remembers new conversations going forward.
With uploading, your AI wakes up already knowing everything: your personal notes, your journal, reference docs, anything you want it to know. It's the difference between amnesia with a notebook and waking up as yourself.
# Upload your personal notes
memoryrouter upload ~/notes/
# Upload a journal, reference docs, anything
memoryrouter upload ~/journal/ ~/reference-docs/
# Supports: .md .txt .py .ts .js .json .yaml .csv .html .css and more
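The CLI ingests files by extension, and the list above is the documented set. As a rough sketch (not the CLI's actual implementation), here is how you might preview which files an upload would pick up before spending quota; the extensions come from the list above, though the docs note it is not exhaustive:

```python
from pathlib import Path

# Extensions listed as supported above (not exhaustive -- the docs say "and more")
SUPPORTED = {".md", ".txt", ".py", ".ts", ".js", ".json", ".yaml", ".csv", ".html", ".css"}

def uploadable_files(root):
    """Walk a directory and return the files MemoryRouter would likely ingest."""
    return sorted(
        p for p in Path(root).rglob("*")
        if p.is_file() and p.suffix.lower() in SUPPORTED
    )
```

Running this over a directory before `memoryrouter upload` gives a quick idea of what will count toward your token quota.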
memoryrouter auth <key>
Authenticate with your MemoryRouter API key. Validates the key against the API and saves it locally.
$ memoryrouter auth mk_abc123
✓ Authenticated! Key saved to ~/.memoryrouter/config.json
memoryrouter upload <path> [--session <id>]
Upload files or entire directories to your memory vault. Chunks, indexes, and stores everything for instant retrieval during AI conversations.
# Upload a directory
$ memoryrouter upload ./docs/
✓ Uploaded 47 files (2.3 MB) to vault
# Upload to a specific session (isolates memory)
$ memoryrouter upload ./project-a/ --session project-a
# Upload a single file
$ memoryrouter upload architecture.md
memoryrouter status [--json]
Check your vault stats: items stored, tokens used, sessions active.
$ memoryrouter status
Vault: 142 items · 1.2M tokens · 3 sessions
Plan: Free (48.8M tokens remaining)
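The `--json` flag makes `status` scriptable, e.g. for quota alerts in CI. The exact JSON schema isn't shown on this page, so the field names below (`items`, `tokens`, `sessions`) are a guess that simply mirrors the human-readable output; check the real shape with `memoryrouter status --json` before relying on it:

```python
import json

def vault_stats(raw: str) -> dict:
    """Parse `memoryrouter status --json` output.

    Field names are assumed to mirror the human-readable status line;
    verify against the actual CLI output.
    """
    data = json.loads(raw)
    return {
        "items": data.get("items", 0),
        "tokens": data.get("tokens", 0),
        "sessions": data.get("sessions", 0),
    }

# Made-up payload matching the stats shown above, for illustration only:
sample = '{"items": 142, "tokens": 1200000, "sessions": 3}'
stats = vault_stats(sample)
```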
memoryrouter delete [--session <id>] [-y]
Clear your entire vault or a specific session. Use -y to skip confirmation.
# Delete a specific session
$ memoryrouter delete --session project-a
# Clear entire vault
$ memoryrouter delete -y
memoryrouter whoami
Show your current auth info: which key is active, which endpoint you're connected to.
Sessions let you create isolated memory spaces within a single API key. Perfect for keeping project knowledge separate, or giving different contexts to different conversations.
# Upload project docs to an isolated session
$ memoryrouter upload ./frontend/ -s frontend
$ memoryrouter upload ./backend/ -s backend
# Each session is completely isolated
# AI using "frontend" session won't see backend docs and vice versa
# Delete just one session
$ memoryrouter delete -s frontend
Running OpenClaw? The MemoryRouter plugin gives you openclaw mr upload, which automatically finds and uploads your conversation history and workspace, no paths needed.
But what about knowledge that lives outside your OpenClaw workspace? A personal journal, old project docs, reference material: things your AI should know but that mr upload won't find. That's where the CLI comes in:
# openclaw mr upload handles your workspace + sessions automatically
$ openclaw mr upload
# Use the CLI for everything else: notes, journals, reference docs
$ memoryrouter upload ~/journal/
$ memoryrouter upload ~/old-projects/design-specs/
$ memoryrouter upload ~/notes/ideas.md
Together, your AI knows your workspace AND your world. Full OpenClaw setup guide →
Native plugin. Automatic memory on every conversation.
Set base URL to api.memoryrouter.ai/v1
Add as OpenAI-compatible connection.
Configure as custom provider with memory.
Works with any OpenAI-compatible SDK.
If it supports a custom base URL, it works with MemoryRouter.
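Since the page says any OpenAI-compatible SDK works once its base URL points at api.memoryrouter.ai/v1, the request shape is the standard chat-completions format. A minimal stdlib-only sketch of that shape (the model name is a placeholder and the key is the docs' example key; this builds the request but you would need a real key to actually send it):

```python
import json
import urllib.request

BASE_URL = "https://api.memoryrouter.ai/v1"  # from the setup instructions above
API_KEY = "mk_your_key_here"                 # placeholder key from the quick start
MODEL = "gpt-4o"                             # placeholder; use whatever model your provider exposes

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat-completions request against MemoryRouter."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_chat_request("What did my journal say about the Q3 launch?")
# urllib.request.urlopen(req) would send it; your vault's memory is injected server-side.
```

The same shape works with any OpenAI SDK that accepts a custom base URL; only the URL and key change.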
macOS (Apple Silicon)
macOS (Intel)
Linux (x64)
Linux (ARM64)
Standalone binary compiled with Bun. No runtime dependencies.
curl -fsSL https://memoryrouter.ai/install.sh | sh
50M tokens free. No credit card required.