
by msensintaffar

chromadb-memory – OpenClaw Skill

chromadb-memory is an OpenClaw Skills integration for coding workflows. Long-term memory via ChromaDB with local Ollama embeddings. Auto-recall injects relevant context every turn. No cloud APIs required — fully self-hosted.

7.1k stars · 854 forks · Security L1
Updated Feb 7, 2026 · Created Feb 7, 2026 · Topic: coding

Skill Snapshot

| Field | Value |
|-------|-------|
| name | chromadb-memory |
| description | Long-term memory via ChromaDB with local Ollama embeddings. Auto-recall injects relevant context every turn. No cloud APIs required — fully self-hosted. OpenClaw Skills integration. |
| owner | msensintaffar |
| repository | msensintaffar/chromadb-memory |
| language | Markdown |
| license | MIT |
| topics | coding |
| security | L1 |
| install | `openclaw add @msensintaffar/chromadb-memory` |
| last updated | Feb 7, 2026 |

Maintainer

msensintaffar

Maintains chromadb-memory in the OpenClaw Skills directory.
File Explorer (6 files)

scripts/
  index.ts (8.3 KB)
  openclaw.plugin.json (1.5 KB)
_meta.json (289 B)
manifest.json (462 B)
SKILL.md (3.7 KB)
SKILL.md

```yaml
---
name: chromadb-memory
description: Long-term memory via ChromaDB with local Ollama embeddings. Auto-recall injects relevant context every turn. No cloud APIs required — fully self-hosted.
version: 1.0.0
author: matts
homepage: https://github.com/openclaw/openclaw
metadata:
  openclaw:
    emoji: "🧠"
    requires:
      bins: ["curl"]
    category: "memory"
    tags:
      - memory
      - chromadb
      - ollama
      - vector-search
      - local
      - self-hosted
      - auto-recall
---
```

ChromaDB Memory

Long-term semantic memory backed by ChromaDB and local Ollama embeddings. Zero cloud dependencies.

What It Does

  • Auto-recall: Before every agent turn, queries ChromaDB with the user's message and injects relevant context automatically
  • chromadb_search tool: Manual semantic search over your ChromaDB collection
  • 100% local: Ollama (nomic-embed-text) for embeddings, ChromaDB for vector storage

Prerequisites

  1. ChromaDB running (Docker recommended):

    docker run -d --name chromadb -p 8100:8000 chromadb/chroma:latest
    
  2. Ollama with an embedding model:

    ollama pull nomic-embed-text
    
  3. Indexed documents in ChromaDB. Use any ChromaDB-compatible indexer to populate your collection.

Install

# 1. Copy the plugin extension
mkdir -p ~/.openclaw/extensions/chromadb-memory
cp {baseDir}/scripts/index.ts ~/.openclaw/extensions/chromadb-memory/
cp {baseDir}/scripts/openclaw.plugin.json ~/.openclaw/extensions/chromadb-memory/

# 2. Get your collection ID
curl -s http://localhost:8100/api/v2/tenants/default_tenant/databases/default_database/collections | python3 -c "import json,sys; [print(f'{c[\"id\"]}  {c[\"name\"]}') for c in json.load(sys.stdin)]"

# 3. Add to your OpenClaw config (~/.openclaw/openclaw.json):
{
  "plugins": {
    "entries": {
      "chromadb-memory": {
        "enabled": true,
        "config": {
          "chromaUrl": "http://localhost:8100",
          "collectionId": "YOUR_COLLECTION_ID",
          "ollamaUrl": "http://localhost:11434",
          "embeddingModel": "nomic-embed-text",
          "autoRecall": true,
          "autoRecallResults": 3,
          "minScore": 0.5
        }
      }
    }
  }
}
# 4. Restart the gateway
openclaw gateway restart

Config Options

| Option | Default | Description |
|--------|---------|-------------|
| `chromaUrl` | `http://localhost:8100` | ChromaDB server URL |
| `collectionId` | *required* | ChromaDB collection UUID |
| `ollamaUrl` | `http://localhost:11434` | Ollama API URL |
| `embeddingModel` | `nomic-embed-text` | Ollama embedding model |
| `autoRecall` | `true` | Auto-inject relevant memories each turn |
| `autoRecallResults` | `3` | Max auto-recall results per turn |
| `minScore` | `0.5` | Minimum similarity score (0-1) |
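As a sketch of how these options might be resolved at load time, the snippet below merges a user-supplied config over the defaults from the table and validates the two constraints the docs imply (`collectionId` is required, `minScore` is 0-1). The `resolve_config` helper is illustrative, not the plugin's actual code.

```python
# Hypothetical config resolution mirroring the options table above.
# DEFAULTS matches the documented default values; the merge/validation
# logic is an assumption about how the plugin treats its config.

DEFAULTS = {
    "chromaUrl": "http://localhost:8100",
    "ollamaUrl": "http://localhost:11434",
    "embeddingModel": "nomic-embed-text",
    "autoRecall": True,
    "autoRecallResults": 3,
    "minScore": 0.5,
}

def resolve_config(user: dict) -> dict:
    """Merge user config over defaults and validate required fields."""
    cfg = {**DEFAULTS, **user}
    if "collectionId" not in cfg:
        raise ValueError("collectionId is required")
    if not 0.0 <= cfg["minScore"] <= 1.0:
        raise ValueError("minScore must be between 0 and 1")
    return cfg

cfg = resolve_config({"collectionId": "abc-123", "minScore": 0.6})
print(cfg["minScore"], cfg["autoRecallResults"])  # 0.6 3
```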

How It Works

  1. You send a message
  2. Plugin embeds your message via Ollama (nomic-embed-text, 768 dimensions)
  3. Queries ChromaDB for nearest neighbors
  4. Results above minScore are injected into the agent's context as <chromadb-memories>
  5. Agent responds with relevant long-term context available
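Steps 3-4 can be sketched as a small filter-and-wrap function. One assumption to flag: ChromaDB's query API returns cosine *distances*, so converting them to a 0-1 similarity as `1 - distance` is a guess at how the plugin derives its score; the `build_memory_block` helper below is illustrative, not the plugin's actual code.

```python
# Illustrative sketch of filtering ChromaDB neighbors by minScore and
# wrapping the survivors in a <chromadb-memories> block.
# Assumption: similarity = 1 - cosine distance.

def build_memory_block(documents, distances, min_score=0.5, max_results=3):
    scored = [(1.0 - d, doc) for doc, d in zip(documents, distances)]
    kept = [doc for score, doc in sorted(scored, reverse=True)
            if score >= min_score][:max_results]
    if not kept:
        return ""  # nothing relevant enough: inject no context this turn
    body = "\n".join(f"- {doc}" for doc in kept)
    return f"<chromadb-memories>\n{body}\n</chromadb-memories>"

block = build_memory_block(
    ["prefers tabs", "uses Postgres 16", "deploys on Fridays"],
    [0.21, 0.35, 0.62],  # distances; third result falls below minScore
)
print(block)
```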

Token Cost

Auto-recall adds ~275 tokens per turn worst case (3 results × ~300 chars + wrapper). Against a 200K+ context window, this is negligible.
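The arithmetic behind that estimate, assuming the common ~4 characters per token heuristic and a small fixed overhead for the wrapper tags (the 50-token wrapper figure is an assumption):

```python
# Back-of-envelope reproduction of the ~275-token worst case.
results = 3
chars_per_result = 300
chars_per_token = 4           # rough heuristic for English text
wrapper_tokens = 50           # assumed overhead for the <chromadb-memories> tags

content_tokens = results * chars_per_result / chars_per_token  # 225.0
total = content_tokens + wrapper_tokens
print(total)  # 275.0
```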

Tuning

  • Too noisy? Raise minScore to 0.6 or 0.7
  • Missing context? Lower minScore to 0.4, increase autoRecallResults to 5
  • Want manual only? Set autoRecall: false, use chromadb_search tool
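For the manual-only setup, the plugin entry in `~/.openclaw/openclaw.json` would look something like the fragment below (other fields as in the Install section; this is a sketch, not a tested config):

```json
"chromadb-memory": {
  "enabled": true,
  "config": {
    "chromaUrl": "http://localhost:8100",
    "collectionId": "YOUR_COLLECTION_ID",
    "autoRecall": false
  }
}
```

With `autoRecall` off, memories are only retrieved when the agent explicitly calls the `chromadb_search` tool.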

Architecture

User Message → Ollama (embed) → ChromaDB (query) → Context Injection
                                                  ↓
                                          Agent Response

No OpenAI. No cloud. Your memories stay on your hardware.

README.md

No README available.

Permissions & Security

Security level L1: Low-risk skills with minimal permissions. Review inputs and outputs before running in production.


FAQ

How do I install chromadb-memory?

Run `openclaw add @msensintaffar/chromadb-memory` in your terminal. This installs chromadb-memory into your OpenClaw Skills catalog.

Does this skill run locally or in the cloud?

OpenClaw Skills execute locally by default. Review the SKILL.md and permissions before running any skill.

Where can I verify the source code?

The source repository is available at https://github.com/openclaw/skills/tree/main/skills/msensintaffar/chromadb-memory. Review commits and README documentation before installing.