v0.1.0 · macOS & Linux

Give Any AI Memory
from Your Terminal

Upload notes, docs, journals: anything you want your AI to know. Persistent memory across every session, every model, every tool.

Install in 5 seconds
$ curl -fsSL https://memoryrouter.ai/install.sh | sh

One binary. Zero dependencies. No Node, no Python, no package manager.
Or install via npm: npm install -g @memoryrouter/cli

Quick Start

# 1. Authenticate with your free memory key

$ memoryrouter auth mk_your_key_here

# 2. Upload your brain: docs, notes, code, anything

$ memoryrouter upload ./my-docs/

# 3. That's it. Your AI now knows everything you uploaded.

# Every AI call through MemoryRouter now has access to your vault.
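To confirm the setup took, the commands documented below double as a quick smoke test (whoami shows the active key, status shows what landed in the vault; no extra flags are assumed):

```shell
# Verify auth, then confirm the upload is actually in your vault.
memoryrouter whoami
memoryrouter status
```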

Why Uploading Changes Everything

Without uploading, your AI starts with an empty vault and only remembers new conversations going forward.

With uploading, your AI wakes up already knowing everything: your personal notes, your journal, reference docs, anything you want it to know. It's the difference between amnesia with a notebook and waking up as yourself.

# Upload your personal notes

memoryrouter upload ~/notes/

# Upload a journal, reference docs, anything

memoryrouter upload ~/journal/ ~/reference-docs/

# Supports: .md .txt .py .ts .js .json .yaml .csv .html .css and more
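To preview what an upload would pick up before running it, a plain find over the supported extensions works as a dry run. Nothing here talks to MemoryRouter; the extension list simply mirrors the one above:

```shell
# Dry run: list local files matching the supported extensions,
# without uploading anything.
find . -type f \( -name '*.md' -o -name '*.txt' -o -name '*.py' \
  -o -name '*.ts' -o -name '*.js' -o -name '*.json' -o -name '*.yaml' \
  -o -name '*.csv' -o -name '*.html' -o -name '*.css' \) | sort
```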

Commands

memoryrouter auth <key>

Authenticate with your MemoryRouter API key. Validates the key against the API and saves it locally.

$ memoryrouter auth mk_abc123

✓ Authenticated! Key saved to ~/.memoryrouter/config.json

memoryrouter upload <path> [--session <id>]

Upload files or entire directories to your memory vault. Chunks, indexes, and stores everything for instant retrieval during AI conversations.

# Upload a directory

$ memoryrouter upload ./docs/

✓ Uploaded 47 files (2.3 MB) to vault

# Upload to a specific session (isolates memory)

$ memoryrouter upload ./project-a/ --session project-a

# Upload a single file

$ memoryrouter upload architecture.md
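Uploads also script cleanly. A hedged sketch of an incremental refresh that re-uploads only recently changed notes (the find predicates are standard POSIX; only the memoryrouter upload command itself comes from this doc, and ~/notes is just an example path):

```shell
# Incremental refresh: re-upload only the Markdown notes modified
# in the last 7 days, one upload call per file.
find ~/notes -name '*.md' -mtime -7 -exec memoryrouter upload {} \;
```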

memoryrouter status [--json]

Check your vault stats: items stored, tokens used, active sessions.

$ memoryrouter status

Vault: 142 items · 1.2M tokens · 3 sessions

Plan: Free (48.8M tokens remaining)
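With --json, status output can feed scripts, e.g. a quota check before a big upload. The JSON field names aren't documented here, so this sketch makes no assumptions about them and just pretty-prints whatever comes back:

```shell
# Pipe machine-readable stats through Python's stdlib JSON tool.
# No field names assumed; this only pretty-prints the payload.
memoryrouter status --json | python3 -m json.tool
```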

memoryrouter delete [--session <id>] [-y]

Clear your entire vault or a specific session. Use -y to skip confirmation.

# Delete a specific session

$ memoryrouter delete --session project-a

# Clear entire vault

$ memoryrouter delete -y

memoryrouter whoami

Show your current auth info: which key is active, which endpoint you're connected to.

Sessions: Isolated Memory Vaults

Sessions let you create isolated memory spaces within a single API key. Perfect for keeping project knowledge separate, or giving different contexts to different conversations.

# Upload project docs to an isolated session (-s is short for --session)

$ memoryrouter upload ./frontend/ -s frontend

$ memoryrouter upload ./backend/ -s backend

# Each session is completely isolated

# AI using "frontend" session won't see backend docs and vice versa

# Delete just one session

$ memoryrouter delete -s frontend
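Session setup scripts cleanly too. A sketch that gives every project directory its own isolated session, using the directory name as the session id (only the --session flag documented above is assumed; ~/projects is an example path):

```shell
# One isolated session per project: each directory under ~/projects
# is uploaded into a session named after it.
for dir in ~/projects/*/; do
  memoryrouter upload "$dir" --session "$(basename "$dir")"
done
```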

OpenClaw Users 🦞

Running OpenClaw? The MemoryRouter plugin gives you openclaw mr upload, which automatically finds and uploads your conversation history and workspace. No paths needed.

But what about knowledge that lives outside your OpenClaw workspace? A personal journal, old project docs, reference material: things your AI should know but that mr upload won't find. That's where the CLI comes in:

# openclaw mr upload handles your workspace + sessions automatically

$ openclaw mr upload

# Use the CLI for everything else: notes, journals, reference docs

$ memoryrouter upload ~/journal/

$ memoryrouter upload ~/old-projects/design-specs/

$ memoryrouter upload ~/notes/ideas.md

Together, your AI knows your workspace AND your world. Full OpenClaw setup guide →

Works with Everything

🤖 OpenClaw

Native plugin. Automatic memory on every conversation.

⚡ Cline

Set base URL to https://api.memoryrouter.ai/v1

🌐 Open WebUI

Add as OpenAI-compatible connection.

📝 Continue.dev

Configure as custom provider with memory.

🐍 Python / LangChain

Works with any OpenAI-compatible SDK.

🔧 Any Tool

If it supports a custom base URL, it works with MemoryRouter.
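"Custom base URL" works because the endpoint speaks the OpenAI wire format. A hedged curl sketch of what every OpenAI-compatible client sends under the hood (the /chat/completions path follows the OpenAI convention, and the model name is illustrative, not a documented MemoryRouter value):

```shell
# Any OpenAI-compatible client boils down to a request like this.
# Replace mk_your_key_here with your memory key; model is illustrative.
curl https://api.memoryrouter.ai/v1/chat/completions \
  -H "Authorization: Bearer mk_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o", "messages": [{"role": "user", "content": "What do my notes say about launch plans?"}]}'
```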

Platforms

🍎

macOS

Apple Silicon

🍎

macOS

Intel

🐧

Linux

x64

🐧

Linux

ARM64

Standalone binary compiled with Bun. No runtime dependencies.

curl -fsSL https://memoryrouter.ai/install.sh | sh

Get your free memory key →

50M tokens free. No credit card required.

GitHub · npm · Releases