Show Notes
Context is king: this daily video dives into practical ways to improve AI coding agents by optimizing context with Augment, Cursor, Mermaid, and Eraser.io. It covers a concrete workflow for just-in-time context, knowledge management, and hands-on tooling.
Key ideas on context and how it drives outputs
- Context windows matter: model accuracy drops noticeably once you push past half the window.
- Goal: keep only what's needed for the current task in scope; avoid overloading the model with everything at once.
- Build around just-in-time context rather than a single, massive master doc.
A pragmatic product-management mindset for AI work
- Three vertices in play: business (is this viable?), design (how will it look/feel?), engineering (how will it work?).
- PM focus: what and why; development/engineering focus: how.
- Use spikes (research or fast prototypes) to validate direction before heavy investment.
- For AI work, pass in only the information the feature needs; watch for cross-cutting work that touches many subsystems.
How to structure PRDs and docs for AI features
- If a feature is meaningful, give it its own PRD.
- Include just-in-time context: a simple one-pager for tech stack, data fetching, libraries, and architecture.
- Use multiple forms:
  - Rules and Augment guidelines for shaping AI behavior.
  - A file tree and architecture notes (current vs. proposed).
  - A central “AI refs” bank with variations of architecture docs for experimentation.
- Don’t dump the entire master doc at once; tailor the context to the task.
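The “simple one-pager” described above might look something like this sketch (the stack, libraries, and file names are invented for illustration, not from the video):

```markdown
# Context One-Pager: Checkout Feature (hypothetical example)

## Tech stack
- Next.js 14 (App Router), TypeScript, Bun

## Data fetching
- Server components fetch by default; client fetches only for live updates

## Libraries
- drizzle-orm (SQLite), zod for validation

## Architecture notes
- Current: monolithic route handlers
- Proposed: thin handlers + a service layer (variant docs live in the AI refs folder)
```

Keeping each one-pager this short is what makes it cheap to hand to an agent per task instead of shipping the whole master doc.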
Just-in-time context and knowledge management in practice
- Build and maintain an AI refs folder with architecture variants to learn what context is actually needed.
- Use a tiered approach: master docs for reference, task-specific docs for day-to-day work, and concise PRD-context for agents.
- Keep it lightweight: avoid sending the full 21,000-token master doc to the AI every time.
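The tiered approach can be sketched as a tiny context assembler: map each task to the handful of docs it actually needs and stop before a token budget is exceeded. The task names, file names, and doc contents below are hypothetical, and the token count is a rough word-count stand-in for a real tokenizer.

```python
# Hypothetical "just-in-time" context assembly: send only the docs a task
# needs instead of the full master doc.

TASK_DOCS = {
    "add-caching": ["architecture-one-pager.md", "caching-strategy.md"],
    "new-endpoint": ["architecture-one-pager.md", "data-fetching.md"],
}

DOC_STORE = {  # stand-in for files in an "AI refs" folder
    "architecture-one-pager.md": "Next.js app, server components by default.",
    "caching-strategy.md": "Cache reads at the route level; revalidate on write.",
    "data-fetching.md": "Fetch on the server; client fetches only for live data.",
}

def rough_tokens(text: str) -> int:
    """Very rough token estimate (word count); real agents use a tokenizer."""
    return len(text.split())

def build_context(task: str, budget: int = 4000) -> str:
    """Concatenate only the docs mapped to this task, within a token budget."""
    parts, used = [], 0
    for name in TASK_DOCS.get(task, []):
        doc = DOC_STORE[name]
        cost = rough_tokens(doc)
        if used + cost > budget:
            break  # stay well under the window instead of overfilling it
        parts.append(f"## {name}\n{doc}")
        used += cost
    return "\n\n".join(parts)

print(build_context("add-caching"))
```

The budget check mirrors the "half the window" guidance above: it is cheaper to drop a low-priority doc than to degrade the whole response.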
Tools and workflows showcased
Augment
- Use rules and Augment guidelines to codify code standards, data fetching (server vs. client), caching strategies, and core architecture.
- Link rules to the relevant documents (one-pagers, architecture notes) to keep context tight and relevant.
Cursor (auto rules)
- Leverages auto rules to inject relevant context automatically (example: switch from Node to Bun with SQLite already in mind).
- Great for parallel work: you can run two projects at once while Augment does the heavy lifting.
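Cursor’s project rules live as `.mdc` files under `.cursor/rules/`; a rule along these lines could inject the Bun-over-Node preference automatically (the exact frontmatter fields may differ by Cursor version, so treat this as a sketch):

```markdown
---
description: Runtime and database conventions
globs: ["**/*.ts"]
---
- Use Bun as the runtime, not Node.
- Use Bun's built-in SQLite driver for persistence.
```

Because the rule is attached via globs, the context arrives only when matching files are in play, which is the same just-in-time principle as the docs above.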
XGPT CLI
- A lightweight CLI tool to gather domain knowledge on demand.
- Workflow: choose content type (tweets/replies), decide on embeddings, fetch a timeline, create embeddings (via a tool like Transcribe Mini), then query against those embeddings.
- Useful when you need fast, focused context without a full server setup.
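The fetch-embed-query loop above can be sketched end to end. A real setup would call an embedding model; here a toy bag-of-words vector stands in so the sketch runs with no API, and the “fetched timeline” snippets are invented examples.

```python
# Toy version of the XGPT-style flow: embed fetched snippets, then answer
# questions by nearest-neighbour lookup over those embeddings.
from collections import Counter
import math

def embed(text: str) -> Counter:
    """Toy embedding: lower-cased word counts (stand-in for a real model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

tweets = [  # hypothetical fetched timeline content
    "Bun ships with a built-in SQLite driver",
    "Server components cut client bundle size",
    "Mermaid renders sequence diagrams from text",
]
index = [(t, embed(t)) for t in tweets]

def query(question: str) -> str:
    """Return the snippet whose embedding is closest to the question."""
    q = embed(question)
    return max(index, key=lambda pair: cosine(q, pair[1]))[0]

print(query("which runtime has sqlite"))
```

Swapping `embed` for a real embedding call is the only change needed to turn this into the CLI workflow described above.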
Eraser.io
- Connects your repo to automatically generate diagrams (cloud diagrams, ER diagrams, flowcharts, sequence diagrams).
- Diagrams auto-update on push, and you can reference them in PRDs and docs.
- Helps keep diagrams in sync with code and reduces manual diagram maintenance.
Diagramming and visualization
- Diagrams are used to keep complex architecture and flows visible.
- Eraser supports multiple diagram types and can auto-sync with your repo for up-to-date visuals.
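Since the episode also mentions Mermaid, the same diagrams-as-code idea works in plain markdown. A minimal flowchart of the workflow described in these notes might look like this (node names are illustrative):

```mermaid
flowchart LR
    PRD[Feature PRD] --> CTX[Just-in-time context]
    Refs[AI refs folder] --> CTX
    Rules[Rules / guidelines] --> CTX
    CTX --> Agent[Coding agent]
    Agent --> Diagrams[Eraser.io diagrams]
    Diagrams -->|auto-update on push| Refs
```

Text-based diagrams can live in the repo next to the PRDs they illustrate, which is what keeps them reviewable and in sync.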
Why all this matters
- Context, when managed well, unlocks reliable orchestration of AI agents across tasks and layers.
- Knowledge management and documentation aren’t fluff—they’re the backbone of scalable, multi-agent AI workflows.
- The approach aims to reduce wasted cycles, improve accuracy, and enable faster iteration.
Concrete takeaways you can apply now
- Start an AI refs folder with modular architecture docs and a lightweight PRD for each feature.
- Implement just-in-time context: pass only the relevant docs, not the entire repository state.
- Add an AI guidelines doc (rules) and an Augment guidelines doc to standardize how agents reason.
- Use Cursor auto rules to inject environment specifics (e.g., Bun vs Node, database choices) into context automatically.
- Build a small CLI (like XGPT) to gather niche knowledge and create embeddings for quick Q&A.
- Try Eraser.io to auto-generate and keep diagrams in sync with your repo; reference these diagrams in PRDs.
Community Q&A (highlights)
- Viewers ask about big-context LLM orchestration (e.g., Gemini) and task execution; the emphasis remains on practical context management rather than brute-force context expansion.
- Questions around infrastructure choices (cloud, local, home labs) and keeping momentum without burnout are addressed by focusing on repeatable, modular workflows.