@48Nauts-Operator
Created January 29, 2026 18:23
moltbot-memory-local: Privacy-first unified memory plugin (SQLite + local embeddings)

Your AI's Memory Shouldn't Phone Home

A Privacy-First Local Memory Plugin for Moltbot

Andre Wolke · January 29, 2026


The Problem

Most AI memory plugins leak your data to cloud APIs. Even setups marketed as "local" vector databases often send your text to OpenAI's embedding API before storing the resulting vectors locally.

Your private thought → OpenAI's servers → Vector → Local storage
                            ↑
                     Privacy leak here

I fixed it.

The Solution: moltbot-memory-local

One plugin. Two search modes. Zero cloud calls.

npm install moltbot-memory-local

Combines:

  • SQLite for structured storage, timestamps, full-text search
  • LanceDB + local embeddings for semantic similarity search
  • Smart routing that picks the right backend automatically

Everything runs locally on your machine.

How It Works

Automatic Query Routing

The plugin detects what you're asking and routes to the right backend:

Query                                    Detected As    Backend
"What did you do Thursday at 14:04?"     Temporal       SQLite
"Find conversations about dark mode"     Semantic       Vector search
"What is my email address?"              Exact lookup   SQLite
"Similar ideas to project X"             Similarity     Vector search
"What happened last week?"               Temporal       SQLite

You don't have to think about it. Just ask.
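To make the routing concrete, here is a minimal sketch of how such a classifier could work. This is an illustration using simple keyword heuristics, not the plugin's actual implementation; `classifyQuery` and its rules are assumptions for the example.

```javascript
// Hypothetical query classifier illustrating keyword-based routing.
// The real plugin's heuristics may differ.
function classifyQuery(query) {
  const q = query.toLowerCase();
  // Temporal cues: weekdays, relative dates, clock times
  if (/\b(yesterday|today|last week|last month|monday|tuesday|wednesday|thursday|friday|saturday|sunday|\d{1,2}:\d{2})\b/.test(q)) {
    return "temporal";   // → SQLite (timestamps)
  }
  // Exact-lookup cues: identity facts with a single right answer
  if (/\b(email|phone|address|name|birthday)\b/.test(q)) {
    return "structured"; // → SQLite (exact match)
  }
  // Everything else: semantic similarity
  return "semantic";     // → vector search
}

console.log(classifyQuery("What did you do Thursday at 14:04?")); // temporal
console.log(classifyQuery("What is my email address?"));          // structured
console.log(classifyQuery("Similar ideas to project X"));         // semantic
```

The point of a heuristic first pass is that it costs nothing: no embedding has to be computed just to decide which backend should answer.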

Manual Override

When you need control:

// Force semantic search
await memory_recall({ query: "...", mode: "semantic" });

// Force structured search  
await memory_recall({ query: "...", mode: "structured" });

// Let plugin decide (default)
await memory_recall({ query: "...", mode: "auto" });

Configuration

{
  "plugins": {
    "slots": {
      "memory": "moltbot-memory-local"
    },
    "entries": {
      "moltbot-memory-local": {
        "enabled": true,
        "config": {
          "dataDir": "~/.moltbot/memory",
          "maxMemories": 10000,
          "embeddingModel": "Xenova/all-MiniLM-L6-v2",
          "enableEmbeddings": true
        }
      }
    }
  }
}

Options

Option             Default                   Description
dataDir            ~/.moltbot/memory         Where data is stored
maxMemories        10000                     Max before auto-pruning
embeddingModel     Xenova/all-MiniLM-L6-v2   Local embedding model
enableEmbeddings   true                      Enable semantic search
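The maxMemories cap implies some pruning policy once the store grows past the limit. The plugin's actual strategy isn't documented here; one plausible policy, sketched purely as an assumption, is to keep the most important memories and break ties by recency:

```javascript
// Hypothetical pruning policy for the maxMemories cap. Illustrative only;
// the plugin's real auto-pruning rules may differ.
function prune(memories, maxMemories) {
  if (memories.length <= maxMemories) return memories;
  return [...memories]
    .sort((a, b) =>
      b.importance - a.importance ||  // higher importance first
      b.timestamp - a.timestamp)      // then newer first
    .slice(0, maxMemories);
}

const kept = prune(
  [
    { id: 1, importance: 0.9, timestamp: 100 },
    { id: 2, importance: 0.2, timestamp: 300 },
    { id: 3, importance: 0.9, timestamp: 200 },
  ],
  2
);
console.log(kept.map((m) => m.id)); // [ 3, 1 ]
```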

Usage Examples

Store a Memory

await memory_store({
  text: "User prefers dark mode in all applications",
  category: "preference",
  importance: 0.9
});

Stored in both SQLite (structured) and LanceDB (vector). One call.
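The dual write can be pictured like this, with in-memory arrays standing in for the two backends. Everything here (`embed`, the array stores) is a stand-in for illustration, not the plugin's real internals:

```javascript
// In-memory stand-ins for the two backends, for illustration only.
const sqliteRows = [];   // structured store: text, category, timestamps
const lanceVectors = []; // vector store: one embedding per memory

// Placeholder embed() — the plugin computes real embeddings locally.
const embed = (text) => [text.length];

function memoryStore({ text, category, importance }) {
  const id = sqliteRows.length + 1;
  // 1. Always write the structured record.
  sqliteRows.push({ id, text, category, importance, ts: Date.now() });
  // 2. Also write the vector, so semantic search can find it later.
  lanceVectors.push({ id, vector: embed(text) });
  return id;
}

memoryStore({ text: "User prefers dark mode", category: "preference", importance: 0.9 });
console.log(sqliteRows.length, lanceVectors.length); // 1 1
```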

Recall by Time

// "What happened last Thursday?"
const memories = await memory_recall({
  query: "what happened last Thursday",
  limit: 10
});
// → Routed to SQLite, searches by timestamp

Recall by Similarity

// "Find things similar to dark mode preferences"
const memories = await memory_recall({
  query: "display and theme preferences",
  limit: 5
});
// → Routed to vector search, finds semantically similar

Recall with Filters

const decisions = await memory_recall({
  query: "project architecture",
  category: "decision",
  dateFrom: "2025-01-01",
  dateTo: "2025-01-31"
});

Forget (GDPR-Compliant)

// By ID
await memory_forget({ memoryId: "uuid-here" });

// By query (deletes from BOTH backends)
await memory_forget({ query: "sensitive information" });
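For a forget-by-query to be GDPR-clean, the deletion has to fan out to both stores so no orphaned copy survives. A sketch with in-memory stand-ins (not the plugin's internals — the matching here is plain substring search for illustration):

```javascript
// In-memory stand-ins for the two backends, for illustration only.
let rows = [
  { id: "a1", text: "sensitive information about the project" },
  { id: "a2", text: "favorite color is green" },
];
let vectors = [{ id: "a1" }, { id: "a2" }];

// Hypothetical forget-by-query: find matches, delete from BOTH stores.
function memoryForget({ query }) {
  const doomed = new Set(
    rows.filter((r) => r.text.includes(query)).map((r) => r.id)
  );
  rows = rows.filter((r) => !doomed.has(r.id));       // structured side
  vectors = vectors.filter((v) => !doomed.has(v.id)); // vector side
  return doomed.size;
}

console.log(memoryForget({ query: "sensitive information" })); // 1
console.log(rows.length, vectors.length); // 1 1
```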

Architecture

┌─────────────────────────────────────────────────────────────┐
│                    moltbot-memory-local                      │
├─────────────────────────────────────────────────────────────┤
│                                                              │
│   ┌──────────────────┐      ┌──────────────────┐           │
│   │     SQLite       │      │     LanceDB      │           │
│   │  ──────────────  │      │  ──────────────  │           │
│   │  Full text       │      │  Vector store    │           │
│   │  Timestamps      │      │  Local embeddings│           │
│   │  Metadata        │      │  Semantic search │           │
│   │  Categories      │      │                  │           │
│   └────────┬─────────┘      └────────┬─────────┘           │
│            │                         │                      │
│            └──────────┬──────────────┘                      │
│                       │                                     │
│              ┌────────▼────────┐                           │
│              │  Query Router   │                           │
│              │  ────────────── │                           │
│              │  Temporal? →    │ → SQLite                  │
│              │  Semantic? →    │ → Vectors                 │
│              │  Both?     →    │ → Merge results           │
│              └─────────────────┘                           │
│                                                              │
└─────────────────────────────────────────────────────────────┘
         ❌ No cloud     ✅ 100% Local     ✅ Your data

Data Storage

~/.moltbot/memory/
├── memories.db      # SQLite (structured data, timestamps, full text)
└── vectors/         # LanceDB (embeddings for semantic search)

Both are local files. Back them up, move them, delete them — you're in control.

Why Not Just Vectors? Why Not Just SQLite?

Question Type                  SQLite                 Vectors
"Thursday at 14:04?"           ✅ Perfect             ❌ Can't do time
"Similar to X?"                ❌ Exact match only    ✅ Perfect
"My email address?"            ✅ Perfect             🟡 Works but overkill
"What did I decide about Y?"   🟡 If keywords match   ✅ Understands context

You need both. This plugin gives you both.

Fallback Behavior

  • If LanceDB fails → falls back to SQLite-only search
  • If embeddings disabled → SQLite full-text search only
  • If embedding fails for a memory → stored in SQLite, skipped in vectors

The system degrades gracefully. You never lose data.
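The degradation path above can be sketched as control flow. The backend functions here are stand-in parameters, assuming only that the real search calls can throw:

```javascript
// Illustrative fallback chain; vectorSearch/sqliteSearch are stand-ins.
async function recall(query, { vectorSearch, sqliteSearch, embeddingsEnabled }) {
  if (embeddingsEnabled) {
    try {
      return await vectorSearch(query); // preferred: semantic results
    } catch {
      // vector store unavailable → degrade, don't fail
    }
  }
  return sqliteSearch(query);           // always-available full-text search
}

// Simulate a vector-store failure:
recall("dark mode", {
  embeddingsEnabled: true,
  vectorSearch: async () => { throw new Error("LanceDB offline"); },
  sqliteSearch: (q) => [`fts hit for "${q}"`],
}).then(console.log); // [ 'fts hit for "dark mode"' ]
```

Because the structured write always happens first, a failed embedding only narrows search quality; it never loses the memory itself.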

The Privacy Stack

Your memory stays yours:

  ✅ Text stored locally (SQLite)
  ✅ Embeddings computed locally (Transformers.js)
  ✅ Vectors stored locally (LanceDB)
  ✅ Search runs locally
  ✅ No telemetry, no cloud calls, no exceptions

  ❌ Nothing leaves your machine

Links


Your AI's memory shouldn't phone home. Now it doesn't.
