How it works
cix has three moving parts. None of them are clever in isolation. The combination is what makes the assistant behave better.
1. The index
When you run cix on a project, it walks the codebase and builds a structured database of everything in it:
- Functions and classes with their signatures, line ranges, and source.
- Routes with method, path, handler, framework, and the file/line they're registered at.
- Database tables parsed from your migrations — Laravel, Django, Alembic, SQLAlchemy, Prisma, raw SQL.
- Imports and exports — the actual edges between files.
- Convention rules for where files belong and how they should be named.
This index lives on disk and updates automatically as the project changes. It is small (a single database file per project, typically a few megabytes), fast to query (single-digit milliseconds for most lookups), and transparent (you can inspect it directly).
The index is not a vector store. It is not a fuzzy embedding. It is a structured, factual record of the code on disk — the kind of thing a compiler or IDE builds, exposed as something an AI assistant can ask questions of.
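To make "structured, factual record" concrete, here is a minimal sketch of what one symbol entry in such an index could look like, using SQLite. The table layout, column names, and sample data are illustrative assumptions, not cix's actual on-disk schema:

```python
import sqlite3

# Illustrative schema -- NOT cix's real on-disk format, just the shape of
# the idea: one row per symbol, with exact locations, no embeddings.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE symbols (
        name       TEXT,
        kind       TEXT,
        file       TEXT,
        line_start INTEGER,
        line_end   INTEGER,
        signature  TEXT
    )
""")
con.execute(
    "INSERT INTO symbols VALUES (?, ?, ?, ?, ?, ?)",
    ("create_user", "function", "app/services/users.py",
     42, 67, "def create_user(email: str, name: str) -> User"),
)

# A lookup is a plain indexed query: structured in, structured out.
row = con.execute(
    "SELECT file, line_start FROM symbols WHERE name = ?", ("create_user",)
).fetchone()
print(row)  # ('app/services/users.py', 42)
```

A query against a table like this returns an exact file and line, which is why lookups can be single-digit milliseconds rather than a fuzzy similarity search.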
2. The query layer
The index is useless on its own. What makes it powerful is a small set of focused queries the AI assistant can run:
- "Does this function or component already exist?"
- "Show me the source of one specific function without opening the whole file."
- "What columns does this table have?"
- "What HTTP routes does this project expose?"
- "Who calls this function?"
- "What would break if I changed this?"
- "Is this file path allowed for this kind of file?"
Each query returns a structured answer with source locations and a confidence signal. The assistant uses these queries the way a thoughtful human uses an IDE's "Go to definition" and "Find all references" — except it does so automatically, every turn, without being told.
The result is far less reading. A typical "find existing code" task that took five to ten file reads with grep takes one query with cix. A typical "trace this through the codebase" task that took twenty file reads takes three or four.
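The "does this already exist?" query above can be sketched as a function returning a structured answer with a source location and a confidence signal. The function name, field names, and index shape here are hypothetical, not cix's real API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class QueryResult:
    answer: str        # the structured answer itself
    source: str        # file:line where the fact was found
    confidence: float  # confidence signal attached to the answer

# Hypothetical shape of a "does this function already exist?" lookup.
def find_symbol(index: dict, name: str) -> Optional[QueryResult]:
    entry = index.get(name)
    if entry is None:
        return None  # nothing found: safe to create the symbol
    return QueryResult(
        answer=f"{name} already exists as a {entry['kind']}",
        source=f"{entry['file']}:{entry['line']}",
        confidence=entry.get("confidence", 1.0),
    )

index = {"create_user": {"kind": "function", "file": "app/users.py", "line": 42}}
result = find_symbol(index, "create_user")
print(result.source)                      # app/users.py:42
print(find_symbol(index, "delete_user"))  # None
```

The point of the structured return value is that the assistant gets one definitive answer per question, instead of reading several files and inferring.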
3. The convention layer
The index handles "what exists." The convention layer handles "how things are done here."
You commit a small file to your repo that encodes structural rules — folder layout, file naming, the kinds of things that belong in different parts of the project. cix ships with sensible default rules for ten common stacks (React, Vue, Next.js, Django, Flask, FastAPI, Laravel, Rails, Express, plus a generic profile), so you start with something reasonable and tune from there.
When the assistant tries to create a file, the convention layer checks the proposed path and name against your rules. On Claude Code — where write hooks are most mature today — a violation blocks the write at the file system level; the file is never created. On Codex and Gemini, the validation still runs and the violation surfaces as a warning the assistant has to address before continuing. Enforcement strength varies by client; the rule itself is the same everywhere.
This is the difference between describing your conventions in a markdown file and checking them at the moment they would be broken. Description fails over time. Verified enforcement holds — most strongly on supported clients, partially on others.
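The path-and-name check described above amounts to matching a proposed file path against a committed rule set. A minimal sketch of that idea, with an invented rule format (cix's real rule file is its own):

```python
import fnmatch

# Illustrative rules -- the actual rule file format belongs to cix.
# Each rule says: files of this kind must live at paths matching this pattern.
RULES = [
    {"kind": "react_component", "pattern": "src/components/*.tsx"},
    {"kind": "migration",       "pattern": "db/migrations/*.sql"},
]

def check_path(kind: str, path: str) -> bool:
    """Return True if the proposed path is allowed for this kind of file."""
    for rule in RULES:
        if rule["kind"] == kind:
            return fnmatch.fnmatch(path, rule["pattern"])
    return True  # no rule for this kind: nothing to enforce

print(check_path("react_component", "src/components/Button.tsx"))  # True
print(check_path("react_component", "src/utils/Button.tsx"))       # False
```

The second call is the interesting one: the check fires at the moment the bad path is proposed, before the file exists, which is what "checking conventions at the moment they would be broken" means in practice.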
How it stays current
The index updates automatically:
- When the assistant edits a file — the change is reflected immediately.
- When you pull or switch branches — git hooks refresh the affected entries.
- When you delete or move files — stale entries are pruned.
- On demand — if you change something outside the assistant, one command catches the index up.
You do not maintain the index. It maintains itself.
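One common way to keep an index current (and an assumption about how cix might do it internally, not a description of its actual mechanism) is to compare each file's on-disk modification time with the time the index last recorded it, and re-index only what changed:

```python
import pathlib
import tempfile
import time

# Illustrative staleness check: re-index only files changed since last run.
def stale_files(indexed_at: dict, root: str) -> list:
    """Return source files whose mtime is newer than the index's record."""
    stale = []
    for path in sorted(pathlib.Path(root).rglob("*.py")):
        recorded = indexed_at.get(str(path), 0.0)  # 0.0 = never indexed
        if path.stat().st_mtime > recorded:
            stale.append(str(path))
    return stale

root = tempfile.mkdtemp()
f = pathlib.Path(root, "users.py")
f.write_text("def create_user(): ...\n")

print(stale_files({}, root))                          # never indexed -> stale
print(stale_files({str(f): time.time() + 60}, root))  # freshly indexed -> []
```

The same comparison works whether the change came from the assistant, a git pull, or an edit made outside the assistant entirely, which is why the index can stay current without manual maintenance.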
How it talks to the assistant
cix exposes its query layer over the Model Context Protocol (MCP), the open standard several AI coding tools have adopted for tool use. That means:
- One install covers Claude Code, Codex, and Gemini CLI. Same index, same conventions, all three clients.
- No vendor lock-in. If you switch assistants, the index and conventions move with you.
- Source code stays on your machine. cix parses locally; only structural metadata (symbols, signatures, call graphs) flows to cix-cloud. Each user gets a private index — even within Team accounts, no two users share an index.
What you give up by using it
Honest answer: a few minutes of setup per machine, a few minutes per project, and a small amount of disk space. That is the cost.
You do not give up control over your code, your tooling choices, or your workflow. The assistant continues to work the way it always has — it just has access to better answers when it asks the right questions.
Where to go next
- Features — the full surface, broken down by capability.
- Find before you build — the most-used capability.
- Convention enforcement — the differentiator most teams care about.
- For evaluators — adoption considerations.