Claude Code blinked at my repo. Then slurped 12 files. Zero code written.
That’s $0.60 down the drain, on orientation alone.
One command—npx stacklit init—and your codebase spills its secrets into a crisp stacklit.json. No more agents playing tourist, reading src/auth/service.ts, then src/db/connection.ts, rinse, repeat. Every session.
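For flavor, here’s a hypothetical slice of what a stacklit.json might contain. The field names below are invented for this sketch (the real schema lives in the repo); the “add_feature” hint text is quoted from Stacklit’s own output later in this post:

```json
{
  "modules": {
    "src/auth": { "exports": ["login", "verifyToken"], "deps": ["src/db"] },
    "src/db":   { "exports": ["connect"], "deps": [] }
  },
  "hints": {
    "add_feature": "Create handler in src/api/, add route in src/index.ts"
  }
}
```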
Here’s the thing. AI agents have amnesia. Second session? Same blind tour. Burn another 400k tokens on a medium repo. Stacklit commits the map to git. Every agent—Claude, Cursor, whatever—loads it instantly.
> “On a medium repo, that costs about 400,000 tokens. The agent burns through its context window before writing a single line.”
Brutal truth from the creator. And yeah, it’s spot on.
## Why AI Agents Suck at Codebases (Without This)
Picture this: Your agent’s context window is a greedy maw. It devours files for ‘context.’ But repos aren’t flat. They’re sprawling cities—auth here, db there, api routes lost in the weeds. Agents? They read everything. Twice.
FastAPI. 108k lines of Python. Stacklit indexes it in 4,142 tokens. Less than one fat file. Express.js? 21k lines, 3,765 tokens. Gin in Go: 3,361. Even chunky Axum in Rust squeezes into 14k.
Agents without maps? Doomed to token bankruptcy before hello world.
I love the dry humor in the original post. “.gitignore tells git what to skip. package.json tells npm what to install. But nothing tells AI agents how your code is organized.” Bingo. We’ve got tools for humans. Agents get squat.
Stacklit changes that. Tree-sitter powers it—the parser behind VS Code’s glow, Neovim’s smarts, GitHub’s highlights. Full AST parsing. No regex hacks.
Go? Grabs imports, exports, structs. TypeScript? ESM, CJS, dynamic imports, interfaces. Python? Classes, decorators, hints. Rust, Java, C#—the lot. Even C++ structs.
## Is Stacklit Actually Better Than Repomix or Aider?
Short answer: Yes. Here’s the smackdown.
Stacklit spits out three files: stacklit.json (committed gold), DEPENDENCIES.md (a Mermaid graph, clickable on GitHub), and stacklit.html (interactive views: graph, tree, table, flow. Local browser only, gitignored).
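As a taste, the Mermaid graph inside DEPENDENCIES.md might look something like this (module names invented for the sketch, not Stacklit’s actual output):

```mermaid
graph LR
  api[src/api] --> auth[src/auth]
  api --> db[src/db]
  auth --> db
```

GitHub renders mermaid fences natively, which is what makes the committed graph clickable in the repo view.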
Compare:
| Feature | Stacklit | Repomix | Aider repo-map | Codebase Memory MCP |
|---|---|---|---|---|
| Index size | ~4k tokens | 500k+ tokens | Ephemeral | SQLite DB |
| Committed to git | Yes | Too big | No | No |
| Dependency graph | Yes | No | Yes | Yes |
| Visual HTML | Yes | No | No | No |
| MCP server | Yes | No | No | No |
Repomix bloats. Aider forgets. MCP needs runtime. Stacklit? Zero deps. 100ms. MIT open source.
npx stacklit init. Or go install. Done.
## Why Does Stacklit Matter for Your AI Workflow?
Plug it into Claude Desktop or Cursor. Three lines in config:
```json
{ "mcpServers": { "stacklit": { "command": "stacklit", "args": ["serve"] } } }
```
Now? Agent calls get_overview(). Boom—full map. No file feast.
Tools: get_module, find_module, list_modules, get_dependencies, get_hot_files, get_hints. stacklit serve spins ‘em up.
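To make the payoff concrete, here’s a toy Python sketch of the kind of lookup an agent can do once the map is loaded. The schema and field names are invented for illustration, not Stacklit’s actual format:

```python
# Toy illustration: querying a pre-built map instead of crawling the repo.
# The schema below is invented for this sketch; see Stacklit's docs for
# the real stacklit.json layout.
overview = {
    "modules": {
        "src/auth": {"exports": ["login", "verifyToken"], "deps": ["src/db"]},
        "src/api": {"exports": ["router"], "deps": ["src/auth", "src/db"]},
        "src/db": {"exports": ["connect"], "deps": []},
    }
}

def dependents_of(module: str) -> list[str]:
    """Who imports `module`? One dict scan instead of reading every file."""
    return sorted(
        name for name, info in overview["modules"].items()
        if module in info["deps"]
    )

print(dependents_of("src/db"))  # ['src/api', 'src/auth']
```

One dictionary lookup answers a question that would otherwise cost the agent a dozen file reads.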
Unique insight time: This echoes the ’80s IDE revolution. Remember Turbo Pascal? Or early Eclipse? They parsed your code once, built symbol tables. Kept ‘em in memory. Devs flew. Stacklit’s that for AI agents—pre-parsed repo brains. But committed. Shared. No PR spin; this slays real pain. Prediction: In six months, every AI dev tool mandates a stacklit.json. Or dies.
Tested on four OSS beasts. Worked. Hints like “add_feature”: “Create handler in src/api/, add route in src/index.ts”. Smart.
Activity tracking—git heatmaps. Exports listed. Dependencies mapped. It’s a repo’s love letter to agents.
But let’s poke. A single binary’s cute, but what about monorepos? Massive Rust crates? The creator claims a fallback for unsupported languages. Fine. Still, gitignoring the HTML means regenerating it locally—minor drag.
Corporate hype? None. Indie dev drops truth bomb. Stars needed? Sure. But use it.
Burn rate drops. Sessions speed up. Agents write code, not read tours.
## The $0.60 Hack That Pays Itself Back
Medium repo. 400k tokens/session. At $5/million input (Claude rough), that’s real cash. Stacklit: 4k tokens loaded once. Pocket change.
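The arithmetic, using the article’s rough numbers. The $5-per-million input price is a ballpark, not a quoted rate, so treat the dollar figures as illustrative:

```python
# Back-of-envelope cost of agent orientation, per session.
PRICE_PER_MILLION = 5.00   # USD per million input tokens, rough figure

def session_cost(tokens: int) -> float:
    return tokens / 1_000_000 * PRICE_PER_MILLION

blind_tour = session_cost(400_000)  # agent re-reads the repo every session
with_map = session_cost(4_000)      # stacklit.json loaded instead

print(f"${blind_tour:.2f} vs ${with_map:.2f}")  # $2.00 vs $0.02
```

A hundredfold drop, every single session, on orientation alone.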
Four projects benchmarked. Python giant fits tiny. JS frameworks? Laughable savings.
Dry humor alert: Agents ‘orienting’ like lost puppies. Stacklit’s the leash.
Open source. GitHub: github.com/glincker/stacklit. Fork it. Star it. Run it.
## Frequently Asked Questions
### What is Stacklit and how do I install it?
Stacklit indexes your codebase into a tiny JSON map for AI agents. Install: npx stacklit init or go install github.com/glincker/stacklit/cmd/stacklit@latest.
### Does Stacklit work with Claude or Cursor?
Yes—add it as an MCP server in config. Agents query get_overview() instead of reading files blindly.
### Will Stacklit save tokens on large repos?
Absolutely. FastAPI’s 108k lines? 4k tokens indexed. Beats reading files every session.