Codag
See how your AI code actually works.
Codag analyzes your code for LLM API calls and AI frameworks, then generates interactive workflow graphs — directly inside VS Code.
If you find Codag useful, please consider giving it a ⭐ — it helps others discover the project!
Why Codag?
You're building an AI agent that chains 3 LLM calls across 5 files. A prompt change breaks something downstream. Which call? Which branch? You grep for openai.chat, open 8 tabs, and mentally trace the flow. Again.
Or you're onboarding onto someone's LangChain project — 20 files, tool calls inside tool calls, retry logic wrapping everything. The README says "it's straightforward." It's not.
Codag maps it out for you:
- Extracts the workflow — finds every LLM call, decision branch, and processing step across your entire codebase
- Visualizes it as a graph — interactive DAG with clickable nodes that link back to source code
- Updates in real time — edit a file and watch the graph change instantly, no re-analysis needed
Built for AI engineers, agent builders, and anyone maintaining code that talks to LLMs — whether it's a single OpenAI call or a multi-agent LangGraph pipeline.
Features
Automatic Workflow Detection
Point Codag at your files and it maps out the entire AI pipeline — LLM calls, branching logic, data transformations — without any configuration.
Live Graph Updates
Edit your code and the graph updates instantly using tree-sitter parsing. Changed functions get a green highlight so you can see exactly what moved.
Click-to-Source Navigation
Every node links back to the exact function and line number. Click a node to open the side panel, click the source link to jump straight to the code.
Export to PNG
Export your workflow graphs as high-resolution PNG images — the entire graph or individual workflows.
Native Theme Support
Graphs automatically match your VS Code theme — light or dark. No configuration needed.
Supported Providers & Frameworks
LLM Providers: OpenAI, Anthropic, Google Gemini, Azure OpenAI, Vertex AI, AWS Bedrock, Mistral, xAI (Grok), Cohere, Ollama, Together AI, Replicate, Fireworks AI, AI21, DeepSeek, OpenRouter, Groq, Hugging Face
Frameworks: LangChain, LangGraph, Mastra, CrewAI, LlamaIndex, AutoGen, Haystack, Semantic Kernel, Pydantic AI, Instructor
AI Services: ElevenLabs, RunwayML, Stability AI, D-ID, HeyGen, and more
IDE APIs: VS Code Language Model API
Languages: Python, TypeScript, JavaScript (JSX/TSX), Go, Rust, Java, C, C++, Swift, Lua
Don't see yours? Adding a provider takes 5 lines of code.
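As a hedged illustration of what "5 lines" might mean in practice (the names and structure below are hypothetical, not Codag's actual registry):

```python
# Hypothetical sketch of a provider registry entry. The real pattern
# definitions live in the Codag source; names here are illustrative only.
PROVIDER_PATTERNS = {
    "openai": {"modules": ["openai"], "calls": ["chat.completions.create"]},
}

def register_provider(name: str, modules: list[str], calls: list[str]) -> None:
    """Register a provider so its API calls are detected as graph nodes."""
    PROVIDER_PATTERNS[name] = {"modules": modules, "calls": calls}

# Adding a new provider is one call with its module and call patterns:
register_provider("my_llm", ["my_llm_sdk"], ["my_llm_sdk.generate"])
```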
Quick Start
1. Clone & Setup
git clone https://github.com/michaelzixizhou/codag.git
cd codag
cp backend/.env.example backend/.env
# Add your Gemini API key to backend/.env (free tier: https://aistudio.google.com/apikey)
make setup

2. Start the Backend
With Docker (recommended):
docker compose up -d

Without Docker:
make run

Verify: curl http://localhost:52104/health
3. Install the Extension
VS Code: Search "Codag" in Extensions, or install from the Marketplace.
Cursor: Build and install the .vsix manually:
cd frontend && npx @vscode/vsce package
cursor --install-extension codag-*.vsix

4. Use It
- Cmd+Shift+P / Ctrl+Shift+P → "Codag: Open"
- Select files containing LLM/AI code
- Explore the graph
MCP Server (for Cursor Agent, Claude Code, etc.)
The extension automatically registers a bundled MCP server when activated. This gives coding agents access to your workflow graph — no extra setup required.
The config is written to .cursor/mcp.json (Cursor) or .mcp.json (Claude Code) in your workspace.
How It Works
Analysis Pipeline:
- Tree-sitter parses your code into ASTs across 10+ languages
- Pattern matching detects LLM API calls and framework usage
- Call graph extraction maps function relationships
- Backend (Gemini 2.5 Flash) identifies workflow semantics — nodes, edges, decision points
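To give a feel for the pattern-matching step, here is a toy text-based detector (Codag's real detection works on tree-sitter ASTs across many languages, not on raw lines like this):

```python
import re

# Toy illustration of the pattern-matching step: flag lines that look
# like LLM API calls. Patterns below are simplified examples.
LLM_CALL_PATTERNS = [
    r"chat\.completions\.create",   # OpenAI-style
    r"messages\.create",            # Anthropic-style
    r"GenerativeModel",             # Gemini-style
]

def find_llm_calls(source: str) -> list[tuple[int, str]]:
    """Return (line_number, line) for every line matching a known pattern."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if any(re.search(p, line) for p in LLM_CALL_PATTERNS):
            hits.append((lineno, line.strip()))
    return hits

sample = 'resp = client.chat.completions.create(model="gpt-4o", messages=msgs)'
```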
Live Updates:
- File changes trigger incremental tree-sitter re-parsing
- AST diffs determine which functions changed
- Graph updates instantly without LLM round-trip
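The change-detection idea can be sketched like this: hash each function's parsed body and diff two snapshots (a simplified stand-in using Python's `ast` module; Codag itself diffs tree-sitter ASTs):

```python
import ast
import hashlib

def function_hashes(source: str) -> dict[str, str]:
    """Map each function name to a hash of its parsed body."""
    hashes = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            digest = hashlib.sha256(ast.dump(node).encode()).hexdigest()
            hashes[node.name] = digest
    return hashes

def changed_functions(old: str, new: str) -> set[str]:
    """Names of functions whose bodies differ between two file versions."""
    a, b = function_hashes(old), function_hashes(new)
    return {name for name in b if a.get(name) != b[name]}

v1 = "def f():\n    return 1\n\ndef g():\n    return 2\n"
v2 = "def f():\n    return 1\n\ndef g():\n    return 3\n"
```

Only the changed functions (here, `g`) would need their graph nodes refreshed.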
Rendering:
- ELK (Eclipse Layout Kernel) for orthogonal graph layout
- D3.js for interactive SVG rendering
- Per-file caching with content hashing — only changed files reanalyze
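The per-file caching idea amounts to keying analysis results on a content hash, so unchanged files are never re-analyzed. A minimal sketch (class and field names are illustrative, not Codag's internals):

```python
import hashlib

class AnalysisCache:
    """Skip re-analysis for files whose contents haven't changed."""

    def __init__(self):
        self._cache = {}  # path -> (content_hash, analysis_result)

    def analyze(self, path: str, content: str, analyzer):
        digest = hashlib.sha256(content.encode()).hexdigest()
        cached = self._cache.get(path)
        if cached and cached[0] == digest:
            return cached[1]            # unchanged: reuse prior result
        result = analyzer(content)      # changed or new: re-analyze
        self._cache[path] = (digest, result)
        return result

calls = {"count": 0}

def analyzer(content: str) -> int:
    calls["count"] += 1
    return len(content)

cache = AnalysisCache()
first = cache.analyze("a.py", "print('hi')", analyzer)
second = cache.analyze("a.py", "print('hi')", analyzer)  # cache hit
```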
Roadmap
- Hosted backend (no self-hosting required)
- Diff view: compare workflows across git commits
- Support for more languages and frameworks
Have a feature request? Open an issue.
Development
For contributors working on the extension itself:
cd frontend
npm install
npm run compile

Then press F5 in VS Code to launch the Extension Development Host.
Contributing
Contributions welcome! See CONTRIBUTING.md for full development setup.
Contact
Questions or feedback? Reach out at michael@codag.ai



