Watches. Indexes. Surfaces.
A quiet memory for every AI on your Mac.
Beside captures what you do on your computer, indexes it into a self-organising knowledge base, quietly surfaces what matters, and remembers it as long-term context for every AI agent you use — 100% on your machine, every line of code open source.
Your day, inked into a wiki you can actually read.
Beside watches, then quietly writes your day into a Markdown wiki on your disk — topics, tags, and decisions, all under ~/beside/wiki. No filing. No formatting. No forgetting.
- Pages re-organise themselves as your work evolves
- Plain Markdown — grep it, edit it, version it
- Tags emerge from your actual signals, not a fixed schema
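As a sketch of what "plain Markdown" buys you — the page layout, tag front matter, and file names below are illustrative assumptions, not Beside's actual schema:

```shell
# Hypothetical wiki page in the style Beside writes (layout is an assumption)
mkdir -p /tmp/wiki-demo/topics
cat > /tmp/wiki-demo/topics/acme-pricing.md <<'EOF'
---
tags: [acme, pricing, decision]
---
# Acme pricing
2024-05-02: decided on usage-based tiers.
EOF

# Plain Markdown means plain tools just work
grep -rl "pricing" /tmp/wiki-demo
```

Because the pages are ordinary files, the same holds for `git`, `ripgrep`, or any editor — no export step required.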
It captures, indexes, surfaces — and remembers.
Four quiet loops, running in the background of your machine. Together they turn every signal on your computer into structured memory your AI agents can actually use.
Watches every app, quietly.
Screenshots, active window, URLs, idle state — appended locally with negligible overhead.
Shapes raw signals into knowledge.
A local model extracts entities and topics, then continuously refactors the wiki.
Pins the moments that matter.
Patterns, follow-ups, half-finished threads — Beside surfaces them when you'll need them.
Remembers it — for every AI you use.
Claude, Cursor, ChatGPT — any MCP agent — gets persistent context, on demand.
Your memory, instantly available to every AI you use.
Beside speaks MCP. Plug it into Claude, Cursor, ChatGPT — or any MCP-compatible agent — and ask the questions you'd normally have to dig through six apps to answer.
The agent runs the query, Beside pulls the matching context from your local knowledge base, and the answer comes back grounded in what you actually did this week. Your raw data never leaves your machine.
- “What are my open items?”
- “Summarise this week with Acme.”
- “What did we decide on pricing?”
- “Draft a follow-up from yesterday's call.”
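Wiring this up is a one-time config change in your agent. A Claude Desktop `mcpServers` entry might look like the sketch below; the `beside` command name and its arguments are assumptions, so check the project's documentation for the real invocation:

```json
{
  "mcpServers": {
    "beside": {
      "command": "beside",
      "args": ["mcp", "serve"]
    }
  }
}
```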
An always-on context layer, tuned for the AI age.
LLMs forget. Agents start from zero. Beside is the quiet layer in between — continuously turning what you actually do on your computer into recallable memory that any tool can use.
100% local-first
Captures, embeddings, and indexes live on your disk as JSONL + SQLite. Bring your own model — Ollama, llama.cpp, OpenAI, Anthropic — or run fully offline.
Open source · MIT
Every capture path, every prompt, every byte we touch is auditable on GitHub. Fork it, extend it, self-host it. No black boxes.
Silent capture
Screenshots, active window, URLs, idle state — captured locally with negligible overhead. Nothing leaves your machine unless you say so.
Self-organising knowledge
A local model turns captures into structured notes, topics, and timelines. The wiki re-organises itself as your work evolves.
Proactive surfacing
Beside watches for the moments that matter — patterns, follow-ups, half-finished threads — and quietly pins them where you'll see them.
Memory for any agent
Ship rich context to Claude, ChatGPT, Cursor, and any MCP-compatible agent — so they remember yesterday, last week, last quarter.
From captured pixels to living memory.
The same four loops, in technical detail. Each stage is a swappable plugin — capture, storage, model, index, export.
1. Capture: The capture layer records screenshots, focused windows, URLs, and idle events — running silently in the background with negligible overhead.
2. Store: Raw events are appended to an immutable JSONL + SQLite store on your disk. Nothing is destructive; everything is replayable.
3. Index & surface: A local LLM extracts entities, topics, and intents, continuously refactors the wiki, and surfaces patterns worth your attention.
4. Recall: Expose your memory to any AI agent over MCP, Markdown, or a simple API. Context engineering, finally automated.
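The append-only store behind the Store stage can be sketched with plain shell. The field names below are illustrative assumptions, not Beside's actual event schema:

```shell
log=/tmp/beside-demo/events.jsonl
mkdir -p /tmp/beside-demo

# Each capture event is one JSON object per line, appended, never rewritten
printf '%s\n' '{"ts":"2024-05-02T10:14:03Z","type":"window_focus","app":"Safari"}' >  "$log"
printf '%s\n' '{"ts":"2024-05-02T10:16:09Z","type":"idle","seconds":120}'          >> "$log"

# Immutable log + plain text: filtering and replay need no special tools
grep -c '"type":"window_focus"' "$log"
```

An append-only log is what makes "everything is replayable" true: the indexing stage can always be re-run from scratch against the raw events.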
Give your AI a memory worth keeping.
Install Beside once and every AI tool you use gets quietly smarter about you.
Windows & Linux — coming soon.