BLOG / MCP
Model Context Protocol, AI agent integration, and codebase context for LLMs

A practical MCP tool design rule: return raw evidence when the model needs to inspect files, symbols, or decisions. See how Repowise draws that boundary.
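
As a minimal sketch of what "raw evidence" can mean at the tool boundary, here is a tool that returns exact file lines rather than a paraphrase. It uses the MCP Python SDK's FastMCP interface; the tool name `read_file_slice` and the return shape are illustrative assumptions, not Repowise's actual API.

```python
# Sketch: an MCP tool that returns raw evidence (exact file lines)
# instead of a summary. Tool name and return shape are illustrative only.
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("repo-evidence")


@mcp.tool()
def read_file_slice(path: str, start_line: int, end_line: int) -> dict:
    """Return the exact lines of a file so the model can inspect them verbatim."""
    lines = Path(path).read_text(encoding="utf-8").splitlines()
    selected = lines[start_line - 1 : end_line]
    return {
        "path": path,
        "start_line": start_line,
        "end_line": end_line,
        # Raw evidence: the model sees the real source, not a description of it.
        "content": "\n".join(selected),
    }


if __name__ == "__main__":
    mcp.run()
```

The point of the rule is that the model, not the tool, decides what matters in the evidence; the tool's job is to hand back something it can quote and reason over directly.
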

Graph-aware MCP cuts grep loops: resolve callers, dependents, and review impact in one hop. See the token and latency math.
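
A minimal sketch of what "one hop" looks like over a prebuilt code graph: given a symbol and its module, return callers and dependent modules from an index lookup instead of iterating grep runs. The graph shape, edge data, and the `review_impact` helper are assumptions for illustration, not the post's implementation.

```python
# Sketch: one-hop impact lookup over a prebuilt code graph.
# Edge lists and names are hypothetical examples.
from collections import defaultdict

# Edges point from caller -> callee and from dependent module -> dependency.
CALL_EDGES = [
    ("billing.charge", "payments.capture"),
    ("checkout.submit", "payments.capture"),
]
DEP_EDGES = [
    ("services/billing", "libs/payments"),
    ("services/checkout", "libs/payments"),
]


def invert(edges):
    """Index edges by target so each lookup is a single dictionary hit."""
    index = defaultdict(list)
    for src, dst in edges:
        index[dst].append(src)
    return index


CALLERS = invert(CALL_EDGES)
DEPENDENTS = invert(DEP_EDGES)


def review_impact(symbol: str, module: str) -> dict:
    """One-hop answer: who calls this symbol, and which modules depend on it."""
    return {
        "symbol": symbol,
        "callers": CALLERS.get(symbol, []),
        "dependent_modules": DEPENDENTS.get(module, []),
    }


print(review_impact("payments.capture", "libs/payments"))
# {'symbol': 'payments.capture',
#  'callers': ['billing.charge', 'checkout.submit'],
#  'dependent_modules': ['services/billing', 'services/checkout']}
```

Each grep loop the agent skips is a round trip of tool output it no longer has to read and re-read, which is where the token and latency savings come from.
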

See MCP tool design on a real regression: task-shaped tools cut re-reading and context loss, with 49% fewer calls and 27× fewer tokens.

The current state of AI-assisted engineering feels like a paradox. We have Large Language Models (LLMs) with massive context windows—some spanning millions o...

The engineering community is currently caught in a 'context arms race.' As LLM providers announce increasingly massive context windows—moving from 32k to 200...

The fundamental bottleneck in AI-assisted engineering isn't the model's reasoning capability; it’s the context window. Even with million-token windows, stuff...

We’ve all been there: staring at a 200,000-line repository, trying to explain the intricate dance of a distributed state machine to an LLM. You copy-paste th...