Show Notes
In this quick-hit daily, I share the hard-won tactics I wish I had known when I started coding: workflows, pattern learning, and how to leverage AI tools without burning through time.
Core tactics to learn faster
- Don’t negotiate on the fundamentals: nail the workflow, then optimize.
- Learn by pattern: pattern-match existing codebases to understand how things are put together.
- Ground the model in your code: use a real PRD/spec and reference concrete files/folders to anchor AI work.
Tooling and workflows that scale
- Multi-tool setup for speed:
  - Cursor for indexing and grounding the codebase (left).
  - Augment (VS Code) for fast code-context access (right).
  - Taskmaster to keep you on track and generate a workable task list.
- For newbies vs. advanced coders:
  - Newbies: start with one toolset and gradually add more as you grow.
  - Advanced coders: orchestrate multiple toolsets in parallel to run several features at once.
- Grounding tips:
  - Turn on "Include project structure" in Cursor, and add the most relevant files/folders.
  - Copy relative paths (not just file names) to improve accuracy when referencing code.
Pattern-learning with real codebases
- Start with existing projects to learn patterns quickly.
- Look for clear structure that maps to what you’re trying to build (e.g., the Google Calendar API reference shows how methods are organized and named; see the sketch below).
- Recognize patterns in documentation as well as in code (e.g., TypeScript docs are a cue that the backend logic is written in TypeScript).
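To make that concrete, here is a minimal sketch of calling the Google Calendar API from TypeScript with the googleapis Node client. It assumes credentials are already set up via GoogleAuth (not shown in the video); the point is how the resource.method naming (events.list, events.insert) reveals the pattern you can mirror in your own code.

```ts
// calendar-example.ts — minimal sketch, assuming googleapis is installed and credentials are configured.
import { google } from "googleapis";

async function listUpcomingEvents(): Promise<void> {
  const auth = new google.auth.GoogleAuth({
    scopes: ["https://www.googleapis.com/auth/calendar.readonly"],
  });
  const calendar = google.calendar({ version: "v3", auth });

  // The resource.method shape (calendar.events.list) is the naming pattern worth noticing.
  const res = await calendar.events.list({
    calendarId: "primary",
    timeMin: new Date().toISOString(),
    maxResults: 10,
    singleEvents: true,
    orderBy: "startTime",
  });

  for (const event of res.data.items ?? []) {
    console.log(event.summary, event.start?.dateTime ?? event.start?.date);
  }
}

listUpcomingEvents().catch(console.error);
```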
Optimizing LLM usage: context windows and tokens
- Use task-level context to preserve quality:
  - Create a new chat per task to maximize the model’s memory for that task.
  - If you try to cram too much into one chat, quality degrades.
- Token awareness:
  - Use a token-count plugin to see how many tokens each file uses (a scripted version is sketched below).
  - Be mindful of how much you feed into the model to avoid waste.
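If you don’t have a token-count plugin handy, a small script gives the same visibility. This is a minimal sketch assuming Node 18+ and the tiktoken npm package; the gpt-4 encoding, file-extension filter, and skipped folders are illustrative choices, not from the video.

```ts
// token-count.ts — rough per-file token counts for files you plan to feed into a chat.
import { readdirSync, readFileSync, statSync } from "node:fs";
import { join, relative } from "node:path";
import { encoding_for_model } from "tiktoken";

const enc = encoding_for_model("gpt-4");
const SKIP = new Set(["node_modules", ".git", "dist"]); // excluded from counting, just like indexing

function walk(dir: string, files: string[] = []): string[] {
  for (const name of readdirSync(dir)) {
    if (SKIP.has(name)) continue;
    const full = join(dir, name);
    if (statSync(full).isDirectory()) walk(full, files);
    else if (/\.(ts|tsx|md)$/.test(name)) files.push(full);
  }
  return files;
}

for (const file of walk(process.cwd())) {
  const tokens = enc.encode(readFileSync(file, "utf8")).length;
  // Repo-relative paths double as a ready-made list to paste when grounding the model.
  console.log(`${tokens}\t${relative(process.cwd(), file)}`);
}
enc.free();
```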
How to find and learn from open-source code
- Learn via open-source codebases:
  - Use GitHub search operators to find relevant repos (e.g., repo:, path:, language:).
  - Clone and explore to see how real projects are structured (git clone <repo>, then use your preferred editor).
- Practical example: exploring a Shorts/creator project
  - Look for how components are organized (e.g., short creation flow, queue, scenes).
  - Identify dependencies like FFmpeg and Whisper and how they’re wired into the flow (sketched below).
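As a concrete illustration of that wiring, here is a minimal extract-then-transcribe sketch, assuming the ffmpeg and whisper CLIs are installed and on PATH. The file names, flags, and the transcribeClip helper are hypothetical examples, not code from the project shown in the video.

```ts
// pipeline.ts — illustrative sketch of wiring FFmpeg and Whisper into a shorts flow.
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const run = promisify(execFile);

async function transcribeClip(videoPath: string): Promise<void> {
  // 1. Extract mono 16 kHz audio, the format Whisper expects.
  await run("ffmpeg", ["-y", "-i", videoPath, "-vn", "-ar", "16000", "-ac", "1", "audio.wav"]);
  // 2. Transcribe to subtitles the shorts flow can overlay on scenes.
  await run("whisper", ["audio.wav", "--model", "base", "--output_format", "srt"]);
}

transcribeClip("clip.mp4").catch(console.error);
```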
Plan before you code: PRD-driven AI work
- Write the PRD (what, why, paths, success criteria) before handing work to AI.
- Taskmaster helps convert the PRD into concrete tasks; you still do the thinking, AI does the heavy lifting.
- Create new chats for new tasks to keep memory, context, and quality high.
Quick practical patterns (what to copy from this video)
- Enable Cursor’s "Include project structure" option and keep it aligned with your repo.
- Use relative paths for precise AI grounding.
- Use TypeScript to avoid data-type gaps (see the sketch below); keep node_modules excluded from indexing to save tokens.
- Ground the AI with a focused set of files/folders to reduce noise.
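A quick sketch of what closing those data-type gaps looks like in practice; the Short interface and its fields are hypothetical, not taken from the video’s project.

```ts
// types.ts — explicit types close the gaps an AI assistant would otherwise guess at.
interface Short {
  id: string;
  title: string;
  durationSeconds: number;
  status: "queued" | "rendering" | "published";
}

// The compiler (and any AI grounded on this file) now knows exactly what a valid update looks like.
function markPublished(short: Short): Short {
  return { ...short, status: "published" };
}
```

With the shape pinned down, both the compiler and an assistant grounded on this file stop guessing which fields exist.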
Links
- Cursor, Augment, Task Master (workflow trio)
- OpenAI/Claude-style assistants and task-based planning
- TypeScript docs
- Google Calendar API reference
- GitHub search operators (repo:, path:, language:)
- FFmpeg and Whisper (for media workflows)