Local-First MCP Sidecar
31 MCP Tools
13 Languages
✦ ctxloom-bot NEW

The Universal Code Context Engine

Local-first MCP sidecar that gives AI coding assistants intelligent, structured code context — reducing tokens by a measured 92% while eliminating hallucinations.

No credit card required · Pro from $9.90/mo

auth-controller.ts
150 lines → 15 lines
Original
import { UserService } from './services/user';
import * as bcrypt from 'bcrypt';
import * as jwt from 'jsonwebtoken';

export class AuthController {
  private userService: UserService;
  private tokenSecret: string;
  private refreshInterval: number;

  constructor(config: AuthConfig) {
    this.userService = new UserService(config.db);
    this.tokenSecret = config.tokenSecret;
    this.refreshInterval = config.refreshInterval;
  }

  async login(credentials: LoginDTO): Promise<AuthResponse> {
    const user = await this.userService.findByEmail(credentials.email);
    if (!user) throw new NotFoundError('User not found');
    const valid = await bcrypt.compare(credentials.password, user.passwordHash);
    if (!valid) throw new UnauthorizedError('Invalid credentials');
    const token = this.generateToken(user);
    const refreshToken = this.generateRefreshToken(user);
    return { token, refreshToken, user: user.toJSON() };
  }

  async register(dto: RegisterDTO): Promise<AuthResponse> {
    const existing = await this.userService.findByEmail(dto.email);
    if (existing) throw new ConflictError('Email already registered');
    const hash = await bcrypt.hash(dto.password, 12);
    const user = await this.userService.create({ ...dto, passwordHash: hash });
    const token = this.generateToken(user);
    return { token, user: user.toJSON() };
  }

  private generateToken(user: User): string {
    return jwt.sign({ sub: user.id, email: user.email }, this.tokenSecret, { expiresIn: '1h' });
  }

  private generateRefreshToken(user: User): string {
    return jwt.sign({ sub: user.id, type: 'refresh' }, this.tokenSecret, { expiresIn: '7d' });
  }
}
Skeletonized
import { UserService } from './services/user';
import * as bcrypt from 'bcrypt';
import * as jwt from 'jsonwebtoken';

export class AuthController {
  private userService: UserService;
  private tokenSecret: string;
  private refreshInterval: number;
  constructor(config: AuthConfig);
  async login(credentials: LoginDTO): Promise<AuthResponse>;
  async register(dto: RegisterDTO): Promise<AuthResponse>;
  private generateToken(user: User): string;
  private generateRefreshToken(user: User): string;
}
92% token reduction, measured · ctx_get_context_packet

The Context Bottleneck

AI coding assistants are only as good as the context they receive. Too little context causes hallucinations; too much wastes tokens and money.

The Problem

Too little context → Hallucinations

AI makes up APIs, invents non-existent functions, and suggests incorrect patterns when it lacks project context.

Too much context → Token waste

Dumping entire codebases into context windows costs $0.50+ per query and hits token limits on complex projects.

No structure → Irrelevant results

Grep-based tools find text matches but miss semantic relationships, call chains, and dependency context.

// AI without context:
const user = await db.findUser(); // ❌ doesn't exist
const result = userService.getProfile(); // ❌ wrong method
import { Auth } from './auth'; // ❌ wrong import

The Solution

Structured Dependency Graph

Build an in-memory graph of imports, calls, and type relationships. Navigate code by structure, not just text.

Semantic Vector Index

LanceDB-powered embeddings find files by meaning, not just name. "Where is authentication handled?" → instant results.

AST Skeletonization

tree-sitter strips function bodies, leaving only signatures and types. 200-line files → 15 lines of pure interface.

// AI with ctxloom:
const user = await userService.findByEmail(email); // ✓ correct
const profile = user.toProfile(); // ✓ real method
import { UserService } from '@/services'; // ✓ verified
92% Token Reduction · <500ms Retrieval Speed · Zero API Keys
Features

Everything You Need for Intelligent Context

Nine core capabilities that transform how AI assistants understand your codebase.

Hybrid Search

Vector embeddings (semantic) + dependency graph (structural) combined. Find files by meaning OR by relationship.

LanceDB-powered semantic search finds code by concept — "authentication logic" returns the right files. Graph-based search traverses imports and calls to find related code.
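One way to picture the combination is a weighted fusion of the two signals. The sketch below is illustrative only: the `hybridScore` function, its weights, and the candidate fields are assumptions, not ctxloom's actual scoring internals.

```typescript
// Hypothetical score fusion: semantic similarity + import-graph proximity.
interface Candidate { file: string; cosine: number; graphDistance: number }

function hybridScore(c: Candidate, wSemantic = 0.7, wGraph = 0.3): number {
  // Closer in the import graph => higher structural score (1.0 at distance 0).
  const structural = 1 / (1 + c.graphDistance);
  return wSemantic * c.cosine + wGraph * structural;
}

const ranked = [
  { file: "src/auth/controller.ts", cosine: 0.82, graphDistance: 0 },
  { file: "src/utils/logger.ts", cosine: 0.85, graphDistance: 4 },
].sort((a, b) => hybridScore(b) - hybridScore(a));
```

A file that is both semantically close and structurally adjacent outranks one that merely matches on wording, even when its raw cosine similarity is slightly higher.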

Code Skeletonization

Compress files to signature-only views. 200-line file → 15 lines. Class bodies, function implementations stripped.

Uses tree-sitter AST parsing to strip function bodies, keeping only signatures, types, imports, and class declarations. Preserves all the information an AI needs without the noise.
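The idea can be sketched without tree-sitter at all. The depth-counting skeletonizer below is a simplified illustration: it assumes class-based files and ignores braces inside strings and comments, whereas the real implementation walks a proper AST.

```typescript
// Illustrative skeletonizer: replace method bodies (brace depth >= 2) with ";".
// Simplified sketch; not robust to braces in strings/comments or to
// top-level function bodies, which a real AST parser handles correctly.
function skeletonize(source: string): string {
  let depth = 0;
  let skipping = false;
  let out = "";
  for (const ch of source) {
    if (ch === "{") {
      depth++;
      if (depth === 1) out += ch;                                  // keep class body brace
      else if (depth === 2 && !skipping) { skipping = true; out += ";"; }
    } else if (ch === "}") {
      depth--;
      if (depth === 0) out += ch;                                  // keep class closing brace
      else if (depth === 1) skipping = false;                      // method body ended
    } else if (!skipping) {
      out += ch;
    }
  }
  return out;
}

const demo = skeletonize([
  "export class AuthController {",
  "  async login(dto: LoginDTO): Promise<AuthResponse> {",
  "    const user = await this.find(dto.email);",
  "    return { token: this.sign(user) };",
  "  }",
  "}",
].join("\n"));
// demo keeps the signature line and drops the implementation
```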

Call Graph Traversal

Bidirectional: find who calls a function (callers) or what it depends on (callees). Configurable depth for transitive analysis.

Navigate the dependency graph in both directions. Find all callers of a function to understand impact, or trace callees to understand behavior. Configurable depth from 1 to N hops.
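A minimal sketch of that traversal, assuming a plain adjacency map (ctxloom's internal graph API is not shown here):

```typescript
// Bidirectional call-graph traversal with configurable depth.
type Graph = Map<string, string[]>; // fn -> direct callees

// Build the callers map by reversing every callee edge.
function reverse(g: Graph): Graph {
  const r: Graph = new Map();
  for (const [from, tos] of g) for (const to of tos) {
    if (!r.has(to)) r.set(to, []);
    r.get(to)!.push(from);
  }
  return r;
}

// Breadth-first walk, stopping after maxDepth hops.
function traverse(g: Graph, start: string, maxDepth: number): Set<string> {
  const seen = new Set<string>();
  let frontier = [start];
  for (let d = 0; d < maxDepth && frontier.length; d++) {
    const next: string[] = [];
    for (const node of frontier) for (const nb of g.get(node) ?? []) {
      if (!seen.has(nb)) { seen.add(nb); next.push(nb); }
    }
    frontier = next;
  }
  return seen;
}

const calls: Graph = new Map([
  ["login", ["findByEmail", "generateToken"]],
  ["generateToken", ["sign"]],
]);
const callees = traverse(calls, "login", 2);         // what login depends on
const callers = traverse(reverse(calls), "sign", 2); // who (transitively) calls sign
```

The same walk serves both directions: run it on the forward graph for callees, on the reversed graph for callers.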

Auto Setup

One command configures all your AI tools. Detects Claude Code, Cursor, VS Code, Windsurf, and 9 more clients automatically.

Run `ctxloom setup` and the interactive wizard scans your system for installed MCP clients, shows what was found, and configures them with a single confirmation. Supports 13 AI coding tools across config files, CLI binaries, and app bundles.

Slash Commands

Direct tool invocation with /ctx_search, /ctx_get_file, etc. Skip the AI middleman for instant, deterministic results.

Type /ctx_search auth code to search directly, or /ctx_get_context_packet src/auth.ts for instant context. Works in Claude Desktop, custom MCP clients, and CLI. No AI interpretation step — pure speed.

Live File Watching

Chokidar-based watcher with 200ms debounce. Index updates in <2 seconds when files change. No manual re-indexing.

Automatically detects file changes, creations, and deletions. Debounced at 200ms to batch rapid saves. Incremental re-indexing completes in under 2 seconds.
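The batching behaviour is the classic debounce pattern. This standalone sketch omits chokidar entirely, and the `flush` helper is added purely to make the example testable; in the real watcher the 200 ms timer firing plays that role.

```typescript
// Classic debounce: collect events, (re)arm a timer, hand the whole batch
// to the callback once the burst of saves goes quiet.
function debounce<T>(fn: (batch: T[]) => void, waitMs: number) {
  let pending: T[] = [];
  let timer: ReturnType<typeof setTimeout> | undefined;
  const flush = () => {
    if (timer) clearTimeout(timer);
    if (pending.length > 0) {
      const batch = pending;
      pending = [];
      fn(batch); // one incremental re-index per burst of saves
    }
  };
  const push = (event: T) => {
    pending.push(event);
    if (timer) clearTimeout(timer);
    timer = setTimeout(flush, waitMs); // quiet for waitMs, then re-index
  };
  return { push, flush };
}

const batches: string[][] = [];
const watcher = debounce<string>((batch) => batches.push(batch), 200);
watcher.push("src/a.ts");
watcher.push("src/b.ts"); // rapid saves: both land in one batch
watcher.flush();          // in practice the 200 ms timer fires instead
```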

Rule Injection

Automatically loads .cursorrules, CLAUDE.md, CONTEXT.md, .ctxloomrc. Project conventions injected into AI context automatically.

Scans for project convention files and injects them into AI context on every request. Ensures AI assistants always follow your team's coding standards and architectural decisions.

Path Security

PathValidator prevents CWE-22 path traversal attacks. All file access validated against project root. Symlink escape prevention.

Every file path is resolved and validated against the project root before access. Prevents directory traversal attacks (../ escapes), symlink escapes, and unauthorized file reads.
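The core check can be sketched with Node's `path` module. This version omits the symlink-resolution step (a real validator would call `fs.realpathSync` before comparing), so treat it as the shape of the idea rather than the full PathValidator.

```typescript
import * as path from "node:path";

// Root-confinement check in the spirit of the PathValidator described above.
function isInsideRoot(root: string, requested: string): boolean {
  const resolved = path.resolve(root, requested);        // normalizes ../ segments
  const rel = path.relative(path.resolve(root), resolved);
  // Inside the root iff the relative path neither climbs out nor is absolute.
  // (Comparing via path.relative avoids the "/repo" vs "/repository" prefix bug.)
  return !rel.startsWith("..") && !path.isAbsolute(rel);
}

isInsideRoot("/repo", "src/auth.ts");   // true: stays under the project root
isInsideRoot("/repo", "../etc/passwd"); // false: CWE-22 style traversal
isInsideRoot("/repo", "/etc/passwd");   // false: absolute path outside root
```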

Architecture Rules

Enforce import boundaries as CI lint. Define no-import rules in .ctxloom/rules.yml and catch architectural violations before they merge.

Rules are checked against the live dependency graph via picomatch globs. CLI exits 1 on error violations, 0 on clean. MCP tool ctx_rules_check brings the same engine to your AI assistant — config reloaded per call, no restart needed.
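The rule check itself reduces to glob matching over graph edges. The sketch below substitutes a tiny hand-rolled glob-to-RegExp converter for picomatch, and the `Rule` shape is an assumption rather than the documented `.ctxloom/rules.yml` schema.

```typescript
// Hypothetical rule shape: "files matching `from` must not import `disallow`".
interface Rule { from: string; disallow: string; level: "error" | "warn" }

// Convert a glob like "src/ui/**" into a RegExp (supports only "**" and "*").
const toRegex = (glob: string) =>
  new RegExp("^" + glob.replace(/[.+^${}()|[\]\\]/g, "\\$&")
    .replace(/\*\*/g, "\u0000").replace(/\*/g, "[^/]*")
    .replace(/\u0000/g, ".*") + "$");

// Each import edge is a [from, to] pair taken from the dependency graph.
function violations(imports: Array<[string, string]>, rules: Rule[]) {
  return imports.filter(([from, to]) =>
    rules.some((r) =>
      r.level === "error" && toRegex(r.from).test(from) && toRegex(r.disallow).test(to)));
}

const bad = violations(
  [["src/ui/Button.ts", "src/db/client.ts"], ["src/api/users.ts", "src/db/client.ts"]],
  [{ from: "src/ui/**", disallow: "src/db/**", level: "error" }],
);
// only the UI -> DB import violates the boundary
```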

Live Graph

Your codebase, structured

Three lenses on your code — structural dependencies, git risk scores, and blast radius — all from a single index.

[Interactive dependency graph: 18 nodes · 23 edges visible]
Files: server.ts, DependencyGraph.ts, VectorStore.ts, ASTParser.ts, Skeletonizer.ts, FileWatcher.ts, indexerWorker.ts, findCallers.ts
Classes: DependencyGraph, VectorStore, ASTParser · Functions: skeletonize(), embed()
MCP Tools: ctx_get_context_packet, ctx_get_call_graph · Tests: depgraph.test.ts, vectorstore.test.ts, astparser.test.ts
MCP Tools

31 tools. Five categories. One install.

The most complete code context engine for AI assistants — search, graph analysis, navigation, review, and automation in a single MCP server.

ctx_search

Hybrid semantic + graph search. Vector similarity + import graph expansion combined for intelligent file discovery.

Parameters

Name   Type    Req.  Description
query  string  yes   Natural language or code symbol query
limit  number  opt   Max results (default: 10)


Direct Slash Commands

Skip the AI middleman. Invoke any tool directly — instant, deterministic, zero latency.

/ctx_search auth logic · /ctx_blast_radius src/auth/controller.ts · /ctx_execution_flow handlePayment · /ctx_detect_changes · /ctx_get_workflow review
Git Intelligence

The risk your static graph can't see

Two new tools fuse git history onto the structural graph — surfacing coupling, churn, and ownership signals invisible to AST analysis.

GitHub App · Beta

Risk-scored reviews on every PR

ctxloom-bot posts automated structural analysis the moment a PR opens — blast radius, risk scores, and reviewer suggestions. No manual invocation.

ctxloom-bot (Bot) · just now

## ctxloom analysis · risk: high (0.81)

📊 Blast radius: 14 files · 3 communities

🔥 Top risk: src/auth/controller.ts (churn: high, bus factor: 1)

👥 Suggested reviewers: @alice, @bob (ownership match)

/ctxloom explain · /ctxloom ignore · /ctxloom refresh
  • Posts on every PR — idempotent, updates on new commits
  • Inline annotations at the highest-risk lines
  • Suggests reviewers from git ownership data
  • Slash commands: /ctxloom explain, /ctxloom ignore
  • Optional Check Run — gate merges on risk threshold
Git History · v0.7

Historical coupling the static graph misses

Mines your git log to surface files that change together even when they have no import relationship — the hidden couplings that break refactors.

ctx_git_coupling

Co-change analysis — files that historically move together (Jaccard + recency decay)

node: "src/auth/controller.ts"

schema/migrations/001_users.sql  0.91  ████████████
src/auth/middleware.ts           0.84  ███████████
tests/auth.test.ts               0.76  ██████████
config/jwt.config.ts             0.62  ████████
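As a sketch, the Jaccard-with-decay scoring might look like the following; the 90-day half-life and the commit-set representation are assumptions, not ctxloom's published constants.

```typescript
// Co-change overlap: Jaccard similarity of the commit sets of two files.
function jaccard(a: Set<number>, b: Set<number>): number {
  const inter = [...a].filter((c) => b.has(c)).length;
  return inter / (a.size + b.size - inter); // |A ∩ B| / |A ∪ B|
}

// Recency decay: a commit `days` old contributes 0.5^(days / halfLife),
// so recent co-changes count more than ancient ones. Half-life is assumed.
const recencyWeight = (days: number, halfLife = 90) => 0.5 ** (days / halfLife);

// Commit ids touching each file (hypothetical history):
const controller = new Set([1, 2, 3, 4]);
const migration = new Set([2, 3, 4, 5]);
const overlap = jaccard(controller, migration); // 3 shared / 5 total = 0.6
```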
ctx_risk_overlay

Composite risk score: churn 35% + bug density 30% + bus factor 20% + coupling 15%

src/auth/controller.ts  high    0.81
src/api/payments.ts     high    0.74
src/db/migrations.ts    medium  0.55
src/utils/logger.ts     low     0.18
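The weighted blend is straightforward to sketch. The weights come from the description above, while the [0, 1] normalization of each signal and the high/medium/low band cut-offs are assumptions.

```typescript
// Each signal is assumed pre-normalized to [0, 1].
interface Signals { churn: number; bugDensity: number; busFactor: number; coupling: number }

// Stated weights: churn 35% + bug density 30% + bus factor 20% + coupling 15%.
function riskScore(s: Signals): number {
  return 0.35 * s.churn + 0.3 * s.bugDensity + 0.2 * s.busFactor + 0.15 * s.coupling;
}

// Band thresholds are illustrative, not documented values.
const band = (r: number) => (r >= 0.7 ? "high" : r >= 0.4 ? "medium" : "low");

const r = riskScore({ churn: 0.9, bugDensity: 0.8, busFactor: 1.0, coupling: 0.6 });
// 0.315 + 0.24 + 0.2 + 0.09 ≈ 0.845 -> "high"
```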
Benchmarks

92% token reduction. Measured, not estimated.

Run on 4 real open-source repos. Reproduce it yourself: npm run bench:repos

92%

average token reduction

Raw tokens       100%
Skeleton tokens    8%

Repository            Reduction
expressjs/express     92%
sindresorhus/got      93%
SergioBenitez/Rocket  93%
fastify/fastify       91%
Average               92%

How ctxloom compares

Feature                        ctxloom                  Others
Zero Python                    ✅ Pure JS/TS            ❌ Python required
Local-first (no cloud)         ✅                       varies
Blast radius analysis          ✅ ctx_blast_radius      ❌
Community / cluster detection  ✅ Louvain (pure JS)     ❌
Execution flow tracing         ✅ ctx_execution_flow    ❌
Refactor rename preview        ✅ ctx_refactor_preview  ❌
Wiki generation (no LLM)       ✅ ctx_wiki_generate     ❌
PR-native GitHub App           ✅ ctxloom-bot           ❌
13 languages (AST)             ✅                       varies
92% token reduction            ✅ measured              ❌ estimated
Run it yourself: npx tsx benchmarks/benchmark.ts
Architecture

Built for Speed & Intelligence

Five-layer architecture designed for sub-second context retrieval with live indexing.

Data Flow

Context Engine

The core intelligence layer combining three specialized engines for comprehensive code understanding.

Components

In-Memory Graph
Dependency graph with bidirectional traversal. Tracks imports, calls, and type relationships in real-time.
VectorDB (LanceDB)
Semantic search via vector embeddings. Finds code by meaning, not just text matching.
Skeletonizer (tree-sitter)
AST-based code compression. Strips implementations, preserves signatures and types.
Receives data from: MCP Interface Layer
Sends data to: Auto Setup & Integration
Getting Started

Up and Running in 30 Seconds

Install, index, and connect. No API keys, no cloud accounts, no configuration required.

1

Install ctxloom

One-time global install. Node.js 20+ required.

bash
npm install -g ctxloom-pro
2

Activate your license

Start a free 7-day trial — no credit card required. You'll receive a license key by email; paste it into the command below.

bash
ctxloom activate CTXL_PRO_-<your-key>
3

Connect your AI tools

Run the setup wizard once — it detects every MCP-compatible tool on your machine and writes the config automatically.

bash
ctxloom setup
Claude Code
Cursor
VS Code
Windsurf
Augment Code
Kilo Code
Continue.dev
Aider

+ 5 more: Codex CLI, Kimi, Qwen Code, JetBrains AI, Claude Desktop

4

Index your project

Run once per project. Builds the dependency graph, indexes symbols, and generates vector embeddings. Takes 5–15 seconds on a typical mid-size repo.

bash
cd /path/to/your/project
ctxloom index

Open your AI tool — you're done

ctxloom starts automatically when your AI tool connects. No terminal to keep open. All 31 tools are instantly available.

No API keys. Everything runs locally on your machine.

No cloud. Code never leaves your environment.

Auto-updates. The file watcher keeps the graph in sync as you code.

MCP Client Configuration

Recommended: Run ctxloom setup to auto-configure all detected clients. The manual configs below are for reference or if you prefer to configure by hand.

C
Claude Code
json
{
  "mcpServers": {
    "ctxloom": {
      "command": "npx",
      "args": ["-y", "ctxloom"],
      "env": {
        "CTXLOOM_ROOT": "/path/to/project"
      }
    }
  }
}

Add to ~/.claude/claude_desktop_config.json

Cu
Cursor
json
{
  "mcpServers": {
    "ctxloom": {
      "command": "npx",
      "args": ["-y", "ctxloom"],
      "env": {
        "CTXLOOM_ROOT": "/path/to/project"
      }
    }
  }
}

Add to .cursor/mcp.json

Environment Variables

Variable      Description                                                 Default
CTXLOOM_ROOT  Project root directory to index (auto-detected if not set)  process.cwd()
LOG_LEVEL     Logging verbosity: debug, info, warn, error                 info

Need more? The full reference covers all 31 tools, advanced configs, and integration guides.

Read the Full Docs