Faster context. Fewer tokens.

Your AI writes docs directly to NodeSpace. You browse and edit them in the desktop app. Both query the same semantic search index - locally, instantly.

Runs entirely on your machine. No cloud accounts, no API calls, no data leaving localhost. Open source. Inspect the code yourself.

80% Fewer Roundtrips
50% Token Savings

*Benchmarked against grep/ripgrep on local markdown docs

NodeSpace desktop app showing structured documentation with semantic search

The context problem

Manual copy/paste

You re-explain your architecture every session. Context compacts, knowledge disappears.

Grep / ripgrep

Keyword-only. Multiple roundtrips. Burns tokens searching for what might not even match.

External MCP tools

Notion, Linear, Jira - API calls for every lookup. No semantic search, just keyword matching.

NodeSpace: local semantic search. One config line. Your AI finds the right context in milliseconds.

How It Works

1

Install the desktop app

Embeddings and database built-in. No cloud account, no API keys, no setup.

2

Add to your AI config

Add NodeSpace to Claude Code, Cursor, or any MCP-compatible assistant:

"nodespace": {
  "type": "http",
  "url": "http://localhost:3100/mcp"
}
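
For reference, here is roughly what that entry looks like in a complete config file - a sketch based on Claude Code's project-scoped .mcp.json, where servers nest under a top-level mcpServers key (the key name and file location vary by tool, so check your assistant's MCP docs):

{
  "mcpServers": {
    "nodespace": {
      "type": "http",
      "url": "http://localhost:3100/mcp"
    }
  }
}
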
3

Two-way knowledge flow

Import existing docs, or let your AI write directly to NodeSpace. Your knowledge base grows as you work.

Desktop app for humans to browse and edit. MCP server for your AI to read and write. Both work together.

What You Get

Instant queries, no API costs

NodeSpace runs locally. No per-request API calls, no token overhead for context lookup. Works offline, on planes, behind VPNs.

Find context by meaning

Ask "Where do we handle authentication?" and get the right files - even if they don't mention "auth" in the name. Semantic search, not keyword matching.

Works with your AI tools

Drop-in integration via MCP protocol. Claude Code, Cursor, Continue - no switching editors or changing workflows.

Bring your own AI

Compatible with any AI provider - OpenAI, Anthropic, local models. NodeSpace provides the context, you choose the model.

What's Coming

Team Sync

Share your knowledge base across your product team. Everyone's AI stays informed, no manual syncing.

Playbooks

Define reusable workflows that execute automatically. Describe what you want, and NodeSpace runs it on triggers you set.

Early access users help shape what we build next.

See It In Action

Same question, two approaches - watch the difference

Without NodeSpace: grep/ripgrep · 6 calls · 41.6K tokens
With NodeSpace: semantic search · 1 call · 31.2K tokens

Get Early Access