ADR-001-tools-over-pipeline.md
+1 −1 (1 addition, 1 deletion)
@@ -6,7 +6,7 @@
## Decision
-Palinode's primary value is its **17 MCP tools + file-based storage + git versioning**, not its 4-phase injection pipeline. The pipeline is scaffolding for current models. The tools survive any model upgrade.
+Palinode's primary value is its **18 MCP tools + file-based storage + git versioning**, not its 4-phase injection pipeline. The pipeline is scaffolding for current models. The tools survive any model upgrade.
docs/LAUNCH-POSTS-FINAL.md
+15 −19 (15 additions, 19 deletions)
@@ -13,7 +13,7 @@ Reply to: https://x.com/karpathy/status/2039805659525644595
I've been building exactly this.
-Palinode: git-versioned markdown memory for AI agents. Hybrid BM25+vector search. 17 MCP tools. Deterministic compaction.
+Palinode: git-versioned markdown memory for AI agents. Hybrid BM25+vector search. 18 MCP tools. Deterministic compaction.
The part I added? git blame on every fact your agent knows.
@@ -40,39 +40,35 @@ Same philosophy. With provenance.
**Tweet 4 (with status screenshot):**
-227 files. 2,230 chunks indexed. 56 tests. 17 MCP tools. No cloud. No external DB. SQLite-vec + FTS5 + BGE-M3.
+227 files. 2,230 chunks indexed. 92 tests. 18 MCP tools. No cloud. No external DB. SQLite-vec + FTS5 + BGE-M3.
Runs on a single box. MIT license.
github.com/Paul-Kyle/palinode
---
-## Show HN (Monday ~8am ET / 5am PT)
+## Show HN (Wednesday ~9am PT)
**Title:**
-Show HN: Palinode – Persistent agent memory as plain markdown with git provenance
+Show HN: Palinode – Git-versioned markdown memory for AI agents
-**Body:**
-Palinode is persistent memory for AI agents. Your agent's memory is a folder of markdown files — typed (people, projects, decisions, insights), git-versioned, and searchable with hybrid BM25 + vector.
+**Body (post as first comment):**
+I was using Mem0 for agent memory and found myself SSHing into a box to grep Qdrant vectors trying to figure out when my agent learned something wrong. I wanted `git blame` for agent memory. So I built it.
-The architecture is simple: markdown files are truth, SQLite (vec + FTS5) is a derived index, and every interface (MCP, REST API, CLI, agent plugin) hits the same backend. Set up on a server, connect from any IDE on any machine.
+Palinode stores your agent's memory as typed markdown files (people, projects, decisions, insights) with YAML frontmatter. A file watcher indexes them into SQLite-vec + FTS5 for hybrid search. Weekly, an LLM proposes structured compaction ops (KEEP/UPDATE/MERGE/SUPERSEDE/ARCHIVE) and a deterministic executor applies them — the LLM never writes files directly. Every compaction is a git commit.
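The propose/execute split above can be sketched in a few lines. This is a hypothetical minimal version for reviewers, not Palinode's actual code: the op names come from the post, but the `Op` shape, the in-memory store, and the function names are all invented here.

```python
from dataclasses import dataclass

# Sketch of the split described above: the LLM only emits structured ops,
# and this executor is the sole code path that mutates memory. The caller
# would commit the result to git. Op shape and store layout are assumed.
@dataclass
class Op:
    kind: str       # KEEP | UPDATE | MERGE | SUPERSEDE | ARCHIVE
    path: str
    body: str = ""

def apply_ops(store: dict[str, str], ops: list[Op]) -> dict[str, str]:
    """Apply proposed ops deterministically; anything unexpected is rejected."""
    out = dict(store)
    for op in ops:
        if op.kind == "KEEP":
            continue                              # leave the file untouched
        elif op.kind in ("UPDATE", "SUPERSEDE"):
            out[op.path] = op.body                # replace the body wholesale
        elif op.kind == "ARCHIVE":
            out["archive/" + op.path] = out.pop(op.path)
        else:
            # MERGE needs two source files; omitted from this sketch
            raise ValueError(f"op not handled in this sketch: {op.kind}")
    return out

store = {"decisions/adr-001.md": "17 tools"}
store = apply_ops(store, [Op("UPDATE", "decisions/adr-001.md", "18 tools")])
print(store["decisions/adr-001.md"])
```

The point of the shape is that a malformed or hallucinated proposal fails loudly in the executor instead of silently corrupting a file.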
-What made me build this:
+The part I think is actually new: diff, blame, rollback, and push are MCP tools your agent can call. Not just git-compatible files — the agent can trace any fact back to the session that recorded it, or revert a bad compaction, without you touching a terminal.
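Exposing blame as a callable tool can be as thin as a subprocess wrapper. A hedged sketch, assuming a plain git checkout of the memory directory; the function name and argument shape are invented, not Palinode's API:

```python
import subprocess

# Hypothetical sketch of "git blame on every fact": given a memory file
# and a line number, return the SHA of the commit that last touched that
# line. An MCP tool handler could wrap exactly this call.
def blame_fact(repo: str, path: str, line: int) -> str:
    out = subprocess.run(
        ["git", "-C", repo, "blame", "--porcelain",
         "-L", f"{line},{line}", path],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.split()[0]   # porcelain output leads with the SHA
```

From there, mapping the SHA back to a session is just reading the commit message of the commit that recorded the fact.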
-I was using Mem0, then found myself grepping the Qdrant vectors trying to figure out when my agent learned something wrong. I wanted `git blame` for agent memory. So I built it.
+Architecture is dumb on purpose. Markdown files are truth. SQLite is a derived index. If everything crashes, `cat` still works. One API server, one .db file, one directory.
-What's different:
+It runs on a homelab box. I connect from two laptops over Tailscale. The MCP server is a pure HTTP client with no state — works with Claude Code, Cursor, Zed, anything that speaks MCP. Same 18 tools are also available as a REST API and CLI.
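The "pure HTTP client" shape is worth pinning down for reviewers, since it is the claim that makes the multi-machine setup work. A minimal sketch, assuming a `/tools/<name>` JSON endpoint; the path and payload shape are illustrative assumptions, not Palinode's documented API:

```python
import json
import urllib.request

# Sketch of a stateless tool proxy: the MCP side keeps nothing in memory
# and simply forwards each tool call to the API server, so any number of
# clients on any machine see the same backend.
def call_tool(base_url: str, tool: str, args: dict) -> dict:
    req = urllib.request.Request(
        f"{base_url}/tools/{tool}",
        data=json.dumps(args).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because the proxy holds no state, killing or restarting it loses nothing; all durable state lives in the markdown directory and the derived index on the server.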
-**Files are truth** — if every service crashes, `cat` still works. Rebuild the index anytime.
-**Git operations as agent tools** — diff, blame, rollback, push are callable MCP tools, not just CLI conveniences.
-**Deterministic compaction** — an LLM proposes structured ops (KEEP/UPDATE/MERGE/SUPERSEDE/ARCHIVE), a deterministic executor applies them and commits. The LLM never writes files directly.
-**One backend, every interface** — MCP server (Streamable HTTP or stdio), REST API, CLI, OpenClaw plugin. Same 18 tools everywhere. Connect Claude Code, Zed, Cursor, Claude Desktop, or your own scripts.
-**No infrastructure** — SQLite-vec + FTS5 + Ollama. No Postgres, no Redis, no cloud. One directory, one .db file, one API server.
+Karpathy's knowledge-base gist (https://gist.github.com/karpathy/442a6bf555914893e9891c11519de94f) articulated a lot of what I was already building toward — particularly the raw/compiled split, which maps directly to Palinode's ingest/consolidate cycle. This is one working implementation of those ideas.
-Stack: Python 3.11+, BGE-M3 embeddings via Ollama, any chat model for consolidation. 56 tests. MIT.
+What it doesn't do: no auto-injection into arbitrary LLM calls (you need an MCP client or to call the API). No multi-user. No cloud-hosted version. It's a personal tool for one human and their agents.
-I run it on a homelab box and connect from two laptops over Tailscale. The MCP server is a pure HTTP client — it holds no state, just proxies to the API.
+Python 3.11+, BGE-M3 via Ollama, any chat model for consolidation. 92 tests. MIT.
https://github.com/Paul-Kyle/palinode
@@ -100,7 +96,7 @@ It works from any IDE — the MCP server runs over Streamable HTTP, so Claude Co
Browse your agent's brain in Obsidian. If Ollama dies, `cat` and `grep` still work.
@@ -116,6 +112,6 @@ The architecture: markdown files → SQLite-vec + FTS5 hybrid index → 4-phase
The design bet: files are the source of truth, everything else is a derived index. One backend, multiple interfaces (MCP server over Streamable HTTP, REST API, CLI, OpenClaw plugin). 18 tools work identically across Claude Code, Zed, Cursor, or shell scripts.
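The hybrid index at the heart of the design can be sketched with stdlib SQLite alone. A toy version, assuming two-dimensional stand-in embeddings and an equal-weight blend; a real deployment would use sqlite-vec and BGE-M3 vectors as the post describes:

```python
import math
import sqlite3

# Toy hybrid search: FTS5's bm25() supplies the lexical score (lower is
# better, so it is negated) and cosine similarity over stand-in vectors
# supplies the semantic score. The blend and embeddings are assumptions.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE chunks USING fts5(body)")
docs = ["git blame for agent memory", "weekly compaction run",
        "markdown memory files on disk"]
vecs = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]      # pretend embeddings
db.executemany("INSERT INTO chunks(body) VALUES (?)", [(d,) for d in docs])

query_vec = [0.9, 0.1]                            # pretend query embedding
hits = db.execute(
    "SELECT rowid, bm25(chunks) FROM chunks WHERE chunks MATCH ?",
    ("memory",),
).fetchall()
ranked = sorted(hits, key=lambda r: -(-r[1] + cosine(query_vec, vecs[r[0] - 1])))
print(ranked[0][0])   # rowid of the best blended match
```

The same "derived index" property holds here: dropping the table and re-running the inserts from the markdown files rebuilds everything.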