The Universal Code Context Engine
Local-first MCP sidecar that gives AI coding assistants intelligent, structured code context, cutting token usage by a measured 92% while sharply reducing hallucinations.
No credit card required · Pro from $9.90/mo
The Context Bottleneck
AI coding assistants are only as good as the context they receive. Too little context causes hallucinations; too much wastes tokens and money.
The Problem
AI makes up APIs, invents non-existent functions, and suggests incorrect patterns when it lacks project context.
Dumping entire codebases into context windows costs $0.50+ per query and hits token limits on complex projects.
Grep-based tools find text matches but miss semantic relationships, call chains, and dependency context.
The Solution
Build an in-memory graph of imports, calls, and type relationships. Navigate code by structure, not just text.
LanceDB-powered embeddings find files by meaning, not just name. "Where is authentication handled?" → instant results.
tree-sitter strips function bodies, leaving only signatures and types. 200-line files → 15 lines of pure interface.
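The real engine parses with tree-sitter; as a hedged illustration only, the brace-matching sketch below shows the idea of keeping top-level signatures while dropping bodies. It handles simple single-line-signature TypeScript sources, not the full grammar tree-sitter covers.

```typescript
// Toy skeletonizer: keep top-level lines, truncate any body opener to
// its signature. Stands in for the tree-sitter AST pass; illustrative only.
function skeletonize(source: string): string {
  const out: string[] = [];
  let depth = 0; // current brace nesting level
  for (const line of source.split("\n")) {
    const opens = (line.match(/{/g) ?? []).length;
    const closes = (line.match(/}/g) ?? []).length;
    if (depth === 0) {
      if (opens > closes) {
        // A body starts here: keep the signature, elide the implementation.
        out.push(line.slice(0, line.indexOf("{") + 1) + " … }");
      } else {
        out.push(line); // imports, type aliases, etc. pass through
      }
    }
    depth += opens - closes;
  }
  return out.join("\n");
}
```

Applied to a 200-line file, only the import lines and signature lines survive, which is what produces the "15 lines of pure interface" view described above.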
Everything You Need for Intelligent Context
Nine core capabilities that transform how AI assistants understand your codebase.
Hybrid Search
Vector embeddings (semantic) + dependency graph (structural) combined. Find files by meaning OR by relationship.
LanceDB-powered semantic search finds code by concept — "authentication logic" returns the right files. Graph-based search traverses imports and calls to find related code.
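One common way to fuse the two rankings is a weighted score merge. The sketch below is an assumption about how such a fusion could look, not ctxloom's actual scoring code; the weights are illustrative.

```typescript
interface Scored {
  file: string;
  score: number; // normalized relevance in [0, 1]
}

// Hypothetical hybrid ranker: sum vector-similarity and graph-proximity
// scores per file under illustrative weights, then sort descending.
function hybridRank(
  semantic: Scored[],   // hits from embedding search
  structural: Scored[], // hits from import/call-graph expansion
  wSem = 0.6,
  wGraph = 0.4,
): Scored[] {
  const merged = new Map<string, number>();
  for (const { file, score } of semantic)
    merged.set(file, (merged.get(file) ?? 0) + wSem * score);
  for (const { file, score } of structural)
    merged.set(file, (merged.get(file) ?? 0) + wGraph * score);
  return [...merged]
    .map(([file, score]) => ({ file, score }))
    .sort((a, b) => b.score - a.score);
}
```

A file that scores on both axes (semantically relevant *and* structurally connected) naturally rises above files that match on only one.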
Code Skeletonization
Compress files to signature-only views. 200-line file → 15 lines. Class bodies, function implementations stripped.
Uses tree-sitter AST parsing to strip function bodies, keeping only signatures, types, imports, and class declarations. Preserves all the information an AI needs without the noise.
Call Graph Traversal
Bidirectional: find who calls a function (callers) or what it depends on (callees). Configurable depth for transitive analysis.
Navigate the dependency graph in both directions. Find all callers of a function to understand impact, or trace callees to understand behavior. Configurable depth from 1 to N hops.
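The traversal described above is a breadth-first walk with a hop limit; callers are just the callee edges inverted. A minimal sketch (my own simplification, not ctxloom's internals):

```typescript
type Graph = Map<string, string[]>; // function name -> direct callees

// Invert callee edges to get caller edges.
function invert(g: Graph): Graph {
  const inv: Graph = new Map();
  for (const [from, tos] of g)
    for (const to of tos)
      inv.set(to, [...(inv.get(to) ?? []), from]);
  return inv;
}

// BFS from `start` in either direction, up to `depth` hops.
function traverse(
  g: Graph,
  start: string,
  dir: "callees" | "callers",
  depth: number,
): Set<string> {
  const edges = dir === "callees" ? g : invert(g);
  const seen = new Set<string>();
  let frontier = [start];
  for (let d = 0; d < depth && frontier.length > 0; d++) {
    const next: string[] = [];
    for (const fn of frontier)
      for (const neighbor of edges.get(fn) ?? [])
        if (!seen.has(neighbor)) {
          seen.add(neighbor);
          next.push(neighbor);
        }
    frontier = next;
  }
  return seen;
}
```

Depth 1 answers "who touches this directly?"; larger depths reveal the transitive closure for impact analysis.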
Auto Setup
One command configures all your AI tools. Detects Claude Code, Cursor, VS Code, Windsurf, and 9 more clients automatically.
Run `ctxloom setup` and the interactive wizard scans your system for installed MCP clients, shows what was found, and configures them with a single confirmation. Supports 13 AI coding tools across config files, CLI binaries, and app bundles.
Slash Commands
Direct tool invocation with /ctx_search, /ctx_get_file, etc. Skip the AI middleman for instant, deterministic results.
Type /ctx_search auth code to search directly, or /ctx_get_context_packet src/auth.ts for instant context. Works in Claude Desktop, custom MCP clients, and CLI. No AI interpretation step — pure speed.
Live File Watching
Chokidar-based watcher with 200ms debounce. Index updates in <2 seconds when files change. No manual re-indexing.
Automatically detects file changes, creations, and deletions. Debounced at 200ms to batch rapid saves. Incremental re-indexing completes in under 2 seconds.
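The debounce-and-batch pattern described above can be sketched as follows. This is an illustrative stand-in, not ctxloom's watcher code; the chokidar wiring is omitted and the 200 ms window mirrors the description.

```typescript
// Coalesces rapid file events into one batch, emitted after a quiet period.
class ChangeBatcher {
  private pending = new Set<string>();
  private timer?: ReturnType<typeof setTimeout>;

  constructor(
    private onBatch: (paths: string[]) => void,
    private delayMs = 200, // matches the debounce window described above
  ) {}

  record(path: string): void {
    this.pending.add(path); // dedupe rapid saves of the same file
    if (this.timer) clearTimeout(this.timer);
    this.timer = setTimeout(() => this.flush(), this.delayMs);
  }

  // Exposed so shutdown (or a test) can drain immediately.
  flush(): void {
    if (this.timer) clearTimeout(this.timer);
    this.timer = undefined;
    if (this.pending.size === 0) return;
    const batch = [...this.pending];
    this.pending.clear();
    this.onBatch(batch);
  }
}
```

In a real watcher, `record` would be attached to chokidar's `add`, `change`, and `unlink` events, and `onBatch` would trigger the incremental re-index.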
Rule Injection
Automatically loads .cursorrules, CLAUDE.md, CONTEXT.md, .ctxloomrc. Project conventions injected into AI context automatically.
Scans for project convention files and injects them into AI context on every request. Ensures AI assistants always follow your team's coding standards and architectural decisions.
Path Security
PathValidator prevents CWE-22 path traversal attacks. All file access validated against project root. Symlink escape prevention.
Every file path is resolved and validated against the project root before access. Prevents directory traversal attacks (../ escapes), symlink escapes, and unauthorized file reads.
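The core of that check is small: resolve the requested path against the project root, then refuse anything that escapes it. A sketch using Node's `path` module (symlink resolution via `fs.realpathSync`, which the real validator would also need, is omitted here):

```typescript
import path from "node:path";

// Returns true only if `requested` resolves to a location inside `root`.
// Rejects "../" escapes and absolute paths outside the root.
function isInsideRoot(root: string, requested: string): boolean {
  const resolvedRoot = path.resolve(root);
  const resolved = path.resolve(resolvedRoot, requested);
  const rel = path.relative(resolvedRoot, resolved);
  // rel === "" means the root itself; a leading ".." or an absolute
  // remainder means the path escaped the root.
  return rel === "" || (!rel.startsWith("..") && !path.isAbsolute(rel));
}
```

Checking the *relative* path rather than a string prefix avoids the classic bug where `/project-evil` passes a `startsWith("/project")` test.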
Architecture Rules
Enforce import boundaries as CI lint. Define no-import rules in .ctxloom/rules.yml and catch architectural violations before they merge.
Rules are checked against the live dependency graph via picomatch globs. CLI exits 1 on error violations, 0 on clean. MCP tool ctx_rules_check brings the same engine to your AI assistant — config reloaded per call, no restart needed.
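The check itself reduces to matching each edge of the dependency graph against forbidden patterns. The sketch below is a simplification: plain prefix matching stands in for picomatch globs, and the rule shape is hypothetical, not the documented `.ctxloom/rules.yml` schema.

```typescript
// Hypothetical rule shape: forbid imports from one subtree into another.
interface NoImportRule {
  from: string; // importer path prefix (picomatch glob in the real engine)
  to: string;   // forbidden import-target path prefix
}

// Check every dependency edge against every rule; return violation messages.
function checkRules(
  deps: Array<[importer: string, imported: string]>,
  rules: NoImportRule[],
): string[] {
  const violations: string[] = [];
  for (const [importer, imported] of deps)
    for (const r of rules)
      if (importer.startsWith(r.from) && imported.startsWith(r.to))
        violations.push(
          `${importer} imports ${imported} (no-import: ${r.from} -> ${r.to})`,
        );
  return violations;
}
```

A CLI wrapper would then `process.exit(violations.length > 0 ? 1 : 0)`, matching the exit-code behavior described above.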
Your codebase, structured
Three lenses on your code — structural dependencies, git risk scores, and blast radius — all from a single index.
31 tools. Five categories. One install.
The most complete code context engine for AI assistants — search, graph analysis, navigation, review, and automation in a single MCP server.
ctx_search
Hybrid semantic + graph search. Vector similarity + import graph expansion combined for intelligent file discovery.
Parameters
| Name | Type | Req. | Description |
|---|---|---|---|
| query | string | yes | Natural language or code symbol query |
| limit | number | opt | Max results (default: 10) |
Input
Output
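An illustrative request/response pair for `ctx_search` (the shapes below are my assumption for illustration, not the documented schema):

```json
{
  "input": {
    "query": "where is authentication handled?",
    "limit": 5
  },
  "output": {
    "results": [
      { "file": "src/auth/controller.ts", "score": 0.91, "source": "hybrid" },
      { "file": "src/auth/middleware.ts", "score": 0.84, "source": "graph" }
    ]
  }
}
```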
Direct Slash Commands
Skip the AI middleman. Invoke any tool directly — instant, deterministic, zero latency.
/ctx_search auth logic
/ctx_blast_radius src/auth/controller.ts
/ctx_execution_flow handlePayment
/ctx_detect_changes
/ctx_get_workflow review
The risk your static graph can't see
Two new tools fuse git history onto the structural graph — surfacing coupling, churn, and ownership signals invisible to AST analysis.
Risk-scored reviews on every PR
ctxloom-bot posts automated structural analysis the moment a PR opens — blast radius, risk scores, and reviewer suggestions. No manual invocation.
## ctxloom analysis · risk: high (0.81)
📊 Blast radius: 14 files · 3 communities
🔥 Top risk: src/auth/controller.ts (churn: high, bus factor: 1)
👥 Suggested reviewers: @alice, @bob (ownership match)
- Posts on every PR — idempotent, updates on new commits
- Inline annotations at the highest-risk lines
- Suggests reviewers from git ownership data
- Slash commands: /ctxloom explain, /ctxloom ignore
- Optional Check Run — gate merges on risk threshold
Historical coupling the static graph misses
Mines your git log to surface files that change together even when they have no import relationship — the hidden couplings that break refactors.
ctx_git_coupling
Co-change analysis — files that historically move together (Jaccard + recency decay)
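A co-change score along those lines can be sketched as Jaccard similarity over the sets of commits touching each file, scaled by an exponential recency decay. The half-life and formula details here are illustrative assumptions, not ctxloom's actual parameters.

```typescript
// Jaccard over commit sets, decayed by how long ago the files last
// changed together. Illustrative sketch only.
function couplingScore(
  commitsA: Set<string>,        // commit hashes touching file A
  commitsB: Set<string>,        // commit hashes touching file B
  daysSinceLastCoChange: number,
  halfLifeDays = 90,            // assumed half-life, not a documented default
): number {
  const both = [...commitsA].filter((c) => commitsB.has(c)).length;
  const either = new Set([...commitsA, ...commitsB]).size;
  const jaccard = either === 0 ? 0 : both / either;
  const decay = Math.pow(0.5, daysSinceLastCoChange / halfLifeDays);
  return jaccard * decay;
}
```

Two files with no import edge but a high score here are exactly the hidden couplings that break refactors.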
node: "src/auth/controller.ts"
ctx_risk_overlay
Composite risk score: churn 35% + bug density 30% + bus factor 20% + coupling 15%
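Using the documented weights, the composite reduces to a weighted sum of normalized signals. The interface below is a hypothetical shape for illustration; only the 35/30/20/15 split comes from the description above.

```typescript
// Each signal normalized to [0, 1], where higher means riskier
// (e.g. a bus factor of 1 maps to a high busFactor risk signal).
interface RiskSignals {
  churn: number;
  bugDensity: number;
  busFactor: number;
  coupling: number;
}

// Composite risk using the documented weights: 35/30/20/15.
function riskScore(s: RiskSignals): number {
  return 0.35 * s.churn + 0.30 * s.bugDensity + 0.20 * s.busFactor + 0.15 * s.coupling;
}
```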
92% token reduction. Measured, not estimated.
Run on 5 real open-source repos. Reproduce it yourself: npm run bench:repos
| Repository | Reduction |
|---|---|
| expressjs/express | 92% |
| sindresorhus/got | 93% |
| SergioBenitez/Rocket | 93% |
| fastify/fastify | 91% |
| Average | 92% |
How ctxloom compares
| Feature | ctxloom | Others |
|---|---|---|
| Zero Python | ✅ Pure JS/TS | ❌ Python required |
| Local-first (no cloud) | ✅ | varies |
| Blast radius analysis | ✅ ctx_blast_radius | ❌ |
| Community / cluster detection | ✅ Louvain (pure JS) | ❌ |
| Execution flow tracing | ✅ ctx_execution_flow | ❌ |
| Refactor rename preview | ✅ ctx_refactor_preview | ❌ |
| Wiki generation (no LLM) | ✅ ctx_wiki_generate | ❌ |
| PR-native GitHub App | ✅ ctxloom-bot | ❌ |
| 13 languages (AST) | ✅ | varies |
| 92% token reduction | ✅ measured | ❌ estimated |
npx tsx benchmarks/benchmark.ts
Built for Speed & Intelligence
Five-layer architecture designed for sub-second context retrieval with live indexing.
Context Engine
The core intelligence layer combining three specialized engines for comprehensive code understanding.
Components
Up and Running in 30 Seconds
Install, index, and connect. No API keys, no cloud accounts, no configuration required.
Install ctxloom
One-time global install. Node.js 20+ required.
npm install -g ctxloom-pro
Activate your license
Start a free 7-day trial — no credit card required. You'll receive a license key by email; paste it into the command below.
ctxloom activate CTXL_PRO_-<your-key>
Connect your AI tools
Run the setup wizard once — it detects every MCP-compatible tool on your machine and writes the config automatically.
ctxloom setup
+ 5 more: Codex CLI, Kimi, Qwen Code, JetBrains AI, Claude Desktop
Index your project
Run once per project. Builds the dependency graph, indexes symbols, and generates vector embeddings. Takes 5–15 seconds on a typical mid-size repo.
cd /path/to/your/project
ctxloom index
Open your AI tool — you're done
ctxloom starts automatically when your AI tool connects. No terminal to keep open. All 31 tools are instantly available.
No API keys. Everything runs locally on your machine.
No cloud. Code never leaves your environment.
Auto-updates. The file watcher keeps the graph in sync as you code.
MCP Client Configuration
Recommended: Run ctxloom setup to auto-configure all detected clients. The manual configs below are for reference or if you prefer to configure by hand.
{
"mcpServers": {
"ctxloom": {
"command": "npx",
"args": ["-y", "ctxloom"],
"env": {
"CTXLOOM_ROOT": "/path/to/project"
}
}
}
}
Add to ~/.claude/claude_desktop_config.json
{
"mcpServers": {
"ctxloom": {
"command": "npx",
"args": ["-y", "ctxloom"],
"env": {
"CTXLOOM_ROOT": "/path/to/project"
}
}
}
}
Add to .cursor/mcp.json
Environment Variables
| Variable | Description | Default |
|---|---|---|
| CTXLOOM_ROOT | Project root directory to index (auto-detected if not set) | process.cwd() |
| LOG_LEVEL | Logging verbosity: debug, info, warn, error | info |
Need more? The full reference covers all 31 tools, advanced configs, and integration guides.
Read the Full Docs