Show Notes
OpenAI’s new cross-app extension aims to give an AI assistant access to context from the apps you’re working in, with a VS Code extension as the entry point. Parker walks through how it works, what you can do with it, and a hands-on demo across an Electron app, iTerm, and Xcode.
What’s new with OpenAI tools
- Extension-based approach to connect multiple apps (VS Code, iTerm, Terminal, Xcode) for AI context.
- AI can see the active window and the files in it, but full system-wide context isn’t available yet.
- New beta-access UI: a prominent icon in the host app to access the “Work with Apps” beta features.
- Availability notes: you may need an OpenAI plan (Plus/Team) to access certain capabilities.
- Requires updating the host apps and installing a VSIX extension in VS Code.
How the integration works
- An extension you install provides the AI with window-level context and the ability to reference files you’re working on.
- Context is limited to what's visible in the current window or screen region; it’s not a full-system context dump.
- After updating, you install the extension in VS Code by loading the VSIX package linked from the official blog post.
- The extension supports multiple windows/files (e.g., loading up to several files so the AI can reason over them together).
Demos and workflows
- Quick demo setup:
  - Use VS Code to load multiple files and have the AI operate on them.
  - Prompt the AI with a system prompt like: “You are an expert Electron developer...” to drive debugging or feature work.
  - The AI can be asked to add logging around actions (e.g., making sure an “eject all” command logs what it does safely); a sketch of that kind of change follows this list.
- Other app contexts tested:
  - iTerm: the AI can fetch recent lines (e.g., the last 200 lines) and manipulate the session.
  - Xcode: the extension can also surface context from Xcode projects.
- The flow includes using the Mac menu bar for quick access to the extension’s context, and you can add more files to the current session as needed.
- Dictation tip: macOS dictation can be handy for speeding up prompts and notes mid-work.
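To make the Electron demo concrete, here is a minimal, hypothetical sketch of the kind of logging change you might ask the AI to propose for an “eject all” handler. The IPC channel name, the diskManager stub, and the electron-log calls are illustrative assumptions, not code from the actual demo app.

```typescript
// Hypothetical Electron main-process handler with the kind of logging the
// prompt above asks for. Channel name, messages, and diskManager are placeholders.
import { ipcMain } from "electron";
import log from "electron-log";

// Stand-in for the app's real disk layer; replace with the actual helper.
const diskManager = {
  async ejectAll(): Promise<string[]> {
    return []; // no-op stub that "ejects" nothing
  },
};

ipcMain.handle("eject-all", async () => {
  log.info("eject-all requested");
  try {
    const ejected = await diskManager.ejectAll();
    log.info(`eject-all finished; ejected volumes: ${ejected.join(", ") || "none"}`);
    return { ok: true, ejected };
  } catch (err) {
    log.error("eject-all failed", err);
    return { ok: false, error: String(err) };
  }
});
```

Whatever the AI actually proposes, check the file names and diff against the real project before applying it.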
Pros and cons (as observed)
- Pros
  - Adds cross-app AI context, useful for multi-file, multi-app workflows.
  - Quick way to drive tasks in code and terminal contexts with natural-language prompts.
  - The ChatGPT desktop app feels notably faster and more responsive than before.
- Cons / caveats
  - Context is not system-wide; you still need to be mindful of what the AI actually sees.
  - Output can be copy-paste heavy and requires careful reading and validation; don’t assume it’s perfect.
  - You may need to adjust prompts to get precise file names and diffs; the AI sometimes summarizes or glosses over filenames.
  - Some setup steps are a bit finicky (download the VSIX, install from VSIX, etc.), which could be streamlined in future updates.
Getting started (practical steps)
- Update your host app (look for updates in the app’s menu).
- Download the VSIX extension package from the official blog post.
- In VS Code:
  - Open the Command Palette (Cmd/Ctrl + Shift + P).
  - Run Extensions: Install from VSIX and select the downloaded file (or use the VS Code CLI; see the sketch after this list).
- Once installed, the extension shares the current window’s context with the AI; you can add more files as needed.
- Quick test prompts:
  - Use a system prompt like: “You are an expert Electron developer. You are tasked with debugging our disk utility app.”
  - Define the role, task, and goal, and observe how the AI sequences steps (e.g., identifying where to add logs and what to change).
- Optional: enable macOS dictation to speed up note-taking and prompts.
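If you would rather script the “Install from VSIX” step, the VS Code CLI supports it via code --install-extension. A minimal Node/TypeScript sketch, with a placeholder .vsix filename:

```typescript
// Install the downloaded VSIX via the VS Code CLI ("code --install-extension").
// The filename below is a placeholder; point it at the file you downloaded.
import { execFileSync } from "node:child_process";

const vsixPath = "./openai-chatgpt-extension.vsix"; // placeholder name

// Equivalent to running "Extensions: Install from VSIX" in the Command Palette.
execFileSync("code", ["--install-extension", vsixPath], { stdio: "inherit" });
```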
Prompts and best practices
- Define a clear three-part prompt (see the sketch after this list):
  - Role (system): the AI’s specialization.
  - Task (one-line): the specific action you want.
  - Goal (end-state): what a successful outcome looks like.
- When reviewing outputs:
  - Check for actual file names and diffs when code changes are suggested.
  - Validate critical actions (e.g., safe logouts, destructive commands) before applying changes.
  - Treat it as an assistant, not a replacement for careful review. Read outputs and verify in context.
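The demo drives everything through the desktop app, but the same role/task/goal structure carries over if you ever script a prompt yourself. A minimal sketch using the official openai Node SDK; the model name and prompt text are placeholder assumptions, not from the episode:

```typescript
// Sketch of composing a role/task/goal prompt with the openai Node SDK.
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

const role = "You are an expert Electron developer.";
const task = "Debug our disk utility app's 'eject all' command.";
const goal = "Propose a diff that adds logging, with exact file names for each change.";

async function main() {
  const response = await client.chat.completions.create({
    model: "gpt-4o", // placeholder model name
    messages: [
      { role: "system", content: role },
      { role: "user", content: `${task}\n\nGoal: ${goal}` },
    ],
  });
  console.log(response.choices[0].message.content);
}

main();
```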
Next steps and what to watch for
- This is a meaningful step forward for cross-app AI copilots, but it’s not perfect yet. Expect iterative improvements in:
  - Context depth and accuracy
  - Ease of installation and onboarding
  - More robust handling across more apps and file types
- Parker hints that more coverage and updates are coming in follow-up clips, especially around deeper flows and additional app support.
Links
- OpenAI Platform — blog post with the OpenAI cross-app extension and download instructions (VSIX)
- VS Code Extensions — extension installation guide (Install from VSIX)