This document lists the goals, planned steps, feature list, and current status for the two-part WebAiBridge prototype (VSCode extension + Chrome extension bridge).
Current Version: 0.6.0
- Build a VSCode extension that can extract code/context and send it to a browser-based AI chat site.
- Build a Chrome extension that receives context from VSCode, displays/manages context chips, previews token usage, and inserts context into AI chat inputs.
- Provide a local bridge (WebSocket) to connect the VSCode extension and the web extension without a remote server.
- Send selected text or full files from VSCode to the web extension.
- Attach full files, folders, snippets, file trees, diagnostics, browser tabs, GitHub repos, and contextual docs.
- Token counting and token-aware previews/truncation.
- Rules and ignore patterns at User/Workspace/Folder level.
- Optimized prompt formatting and chunking for LLM consumption.
- Authentication and account sync (optional, e.g., for Context7/GitHub features).
- Cross-site compatibility: ChatGPT, Claude, Google AI Studio, etc.
- Browser tab screenshots and attachments.
- Context chips UI with token chips, previews, and history.
- Local WebSocket Bridge: VS Code runs a WebSocket server on ports 64923-64932 with auto-discovery
- Multi-instance Support: PING/PONG discovery protocol, instance picker in popup
- Keep-alive Mechanism: Auto-reconnection without opening popup
- Commands: sendSelection, sendFile, addSelectionToContext, addFileToContext, addFolderToContext, viewContext, sendContext, clearContext
- Context Menus: Editor and Explorer right-click menus with WebAiBridge submenu
- Ignore Patterns: Configurable exclude patterns, .gitignore parsing, file size limits
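The ignore-pattern matching above can be sketched with a minimal glob-to-RegExp helper. This is an illustrative sketch only; `globToRegExp` and `isIgnored` are hypothetical names, and the real extension likely handles `.gitignore` features this skips (negation with `!`, directory-only patterns, precedence).

```javascript
// Hypothetical sketch of gitignore-style matching; not the extension's
// actual matcher.
function globToRegExp(pattern) {
  // Escape regex metacharacters except * and /
  let re = pattern.replace(/[.+^${}()|[\]\\]/g, "\\$&");
  // `**` matches across directory separators, `*` within one path segment
  re = re
    .replace(/\*\*/g, "\u0000")
    .replace(/\*/g, "[^/]*")
    .replace(/\u0000/g, ".*");
  return new RegExp("^" + re + "$");
}

function isIgnored(path, patterns) {
  return patterns.some((p) => globToRegExp(p).test(path));
}
```

A real implementation would also respect pattern order and per-folder `.gitignore` files.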
- @ Mention System: Type the trigger character (default `@`, customizable) in the AI chat to pull context from VS Code
  - `@focused-file` → inserts as `@filename.ext` (uses the actual filename from VS Code)
  - `@selection` → inserts as `@selection-1`, `@selection-2`, etc.
  - Also: `@visible-editors`, `@open-tabs`, `@problems`, `@file-tree`, `@git-diff`, `@terminal`
- Customizable Trigger: Set your preferred trigger character in the popup settings
- Context Chip Bar: Floating bar above input showing all added contexts
- Total token count display
- Hide/show toggle
- Click to preview content
- × to remove individual chips
- "Clear All" button
- Placeholder Expansion: Readable placeholders like `@content.js` in the input expand to full content on submit
- Per-Message Limits: User-configurable limit with Warn/Chunk/Truncate modes
- Smart Chunking: Split large content at natural boundaries (paragraphs, sentences)
- AI Response Capture: "Send to VS Code" and "Code to VS Code" buttons on responses
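Splitting at natural boundaries can be sketched roughly as below. `chunkText` and the character-based limit are assumptions for illustration; the actual implementation presumably works on token counts rather than characters.

```javascript
// Illustrative sketch: prefer a paragraph break near the limit, fall back
// to a sentence end, then to a hard split. Names and limits are assumed.
function chunkText(text, maxChars) {
  const chunks = [];
  let rest = text;
  while (rest.length > maxChars) {
    const window = rest.slice(0, maxChars);
    // Prefer the last paragraph break in the window...
    let cut = window.lastIndexOf("\n\n");
    // ...but only if it is not too early; otherwise use the last sentence end.
    if (cut < maxChars / 2) cut = window.lastIndexOf(". ") + 1;
    if (cut <= 0) cut = maxChars; // hard split as a last resort
    chunks.push(rest.slice(0, cut).trimEnd());
    rest = rest.slice(cut).trimStart();
  }
  if (rest) chunks.push(rest);
  return chunks;
}
```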
- BPE-style Estimation: ~95% accuracy for English text and code
- Model-specific Limits: GPT-4 (8K-128K), Claude 3 (200K), Gemini 1.5 (1M+)
- Gemini Optimization: 15% token reduction for SentencePiece efficiency, relaxed warning threshold (90%)
- Color-coded Warnings: Green/yellow/red based on usage percentage
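The estimation and warning logic described above can be sketched as follows. `estimateTokens`, `warningColor`, and the constants are illustrative assumptions, not the extension's actual implementation; only the ~4 chars/token heuristic, the 15% Gemini reduction, and the relaxed 90% Gemini threshold come from this document.

```javascript
// Sketch of a BPE-style heuristic with the Gemini adjustment noted above.
function estimateTokens(text, model = "gpt") {
  // Rough heuristic: ~4 characters per token for English text and code.
  const base = Math.ceil(text.length / 4);
  // SentencePiece (Gemini) tokenizes more efficiently; apply the 15% reduction.
  return model === "gemini" ? Math.ceil(base * 0.85) : base;
}

// Color-coded usage warning; the 80% default threshold is an assumption.
function warningColor(used, limit, model = "gpt") {
  const warnAt = model === "gemini" ? 0.9 : 0.8; // relaxed threshold for Gemini
  const ratio = used / limit;
  if (ratio >= 1) return "red";
  if (ratio >= warnAt) return "yellow";
  return "green";
}
```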
- ChatGPT / OpenAI
- Claude / Anthropic (ProseMirror compatibility)
- Gemini / Google AI Studio
- Microsoft Copilot (M365)
- MutationObserver: Replaced setInterval with MutationObserver for input clear detection
- ResizeObserver: Dynamic chip bar repositioning on input element resize
- textInput Events: Proper event dispatch for ProseMirror/Quill editors
- Modern Input APIs: Replaced execCommand with beforeinput events and Range API
- Generation Complete Detection: Watch for "Stop generating" button removal
- Shadow DOM Traversal: Deep queries through Fluent UI shadow roots
- Sanitizer-Proof Placeholders: `[[WAB::id::label]]` format survives React sanitization
- Modern Text Insertion: Uses beforeinput events with a text-replacement fallback
- ARIA-based Button Detection: Finds send button via role attributes in shadow DOM
- Text Sync: Chips auto-removed when placeholders deleted from input
- ID Display: Shows placeholder ID in chip bar for reference
- Dynamic Repositioning: ResizeObserver + MutationObserver + polling fallback
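The `[[WAB::id::label]]` placeholder format above lends itself to simple string-level helpers, sketched below. `expandPlaceholders` and `liveChipIds` are hypothetical names; the real content script operates on DOM nodes rather than plain strings.

```javascript
// Matches placeholders of the form [[WAB::id::label]]
const WAB_RE = /\[\[WAB::([\w-]+)::([^\]]+)\]\]/g;

// On submit: replace each placeholder with its stored content; drop
// placeholders whose chip content is no longer available.
function expandPlaceholders(input, contexts) {
  return input.replace(WAB_RE, (match, id) =>
    id in contexts ? contexts[id] : ""
  );
}

// Text sync: chips whose placeholder no longer appears in the input
// should be auto-removed. This returns the ids still present.
function liveChipIds(input) {
  return [...input.matchAll(WAB_RE)].map((m) => m[1]);
}
```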
- GitHub Actions workflow builds both VS Code (.vsix) and Chrome (.zip) extensions
- Extension works in any workspace after installation
- ✅ Improve token counting accuracy — Implemented BPE-style tokenizer
- ✅ Add token limit warnings and truncation rules — Implemented with color-coded warnings
- ✅ Implement ignore patterns — Implemented with .gitignore support
- ✅ Context chips UI — Implemented with chip bar, preview, remove
- ✅ Per-message limits and chunking — Implemented with three modes
- ✅ @ mention system — Implemented with customizable trigger
- ✅ Multi-instance support — Implemented with port scanning and instance picker
- ✅ Microsoft Copilot support — Implemented with Shadow DOM and sanitizer-proof placeholders
- Add authentication flow and settings sync
- Expand file extraction: PDFs, DOCX, images (OCR)
- File Picker @ Mentions: `@package.json`, `@src/utils.ts` to reference specific files
- Diff Mode / Sync Back: Send AI responses back to VS Code with file+diff instructions
- GitHub repository search & attach (including private repos via OAuth)
- Context7 integration (semantic docs search)
- Browser tab screenshots and batch-attach functionality
- Optimize prompt formatting for different LLM providers
- Implement a WebView-based chat panel inside VS Code
- Features:
- Send messages to the connected AI chat site without leaving VS Code
- See AI responses rendered with syntax highlighting
- Apply code blocks directly to files with one click
- Keep conversation history per workspace
- Support for multiple concurrent conversations
- Bridge approach: Route messages through existing WebSocket to browser
- Alternative: Direct API integration with AI providers
- Sync preferences between VS Code and Chrome extension
- Cloud backup of context history and favorites
- PDF/DOCX/PPTX extraction
- Image OCR
- Workspace file tree browsing with selective chunking
- Browser tab content capture
- Open the repo folder in VSCode: `code "C:\Users\Andrew\Documents\GitHub\WebAiBridge"`
- Load the web extension in Chrome (Developer mode → Load unpacked) pointing to `web-extension`.
- Option A: Install packaged extension (recommended)
  - Download the latest `.vsix` from GitHub Releases or run the packaging workflow
  - Install: `code --install-extension webaibridge-0.6.0.vsix`
  - Reload VS Code
- Option B: Development mode. In the `vscode-extension` folder:

  ```
  cd vscode-extension
  npm install
  npm run compile
  ```

  Then press F5 in VSCode to launch the Extension Development Host.
- Use the VSCode commands from the command palette (Ctrl+Shift+P):
  - `WebAiBridge: Send Selected Text` — sends the selection to the web extension.
  - `WebAiBridge: Send Current File` — sends the whole file.
  - `WebAiBridge: Add Selection to Context` — adds the selection as a chip.
  - `WebAiBridge: Add File to Context` — adds the current file as a chip.
  - `WebAiBridge: Add Folder to Context` — recursively adds folder contents.
  - `WebAiBridge: View Context Chips` — views all context chips.
  - `WebAiBridge: Send All Context to Browser` — sends all chips to the browser.
  - `WebAiBridge: Clear Context` — clears all chips.
- Or use right-click context menus:
- In the editor: Right-click → WebAiBridge submenu
- In the file explorer: Right-click files/folders for quick add
- Or use @ mentions in AI chat:
  - Type your trigger (default `@`) in any supported AI site
  - Customize the trigger character in the popup settings
  - Select a context type from the popover
  - A placeholder like `@content.js` appears in your message
  - On submit, the placeholder expands to full content
- In the Chrome popup, verify bridge status and toggle settings.
- The prototype uses an unsecured local WebSocket on `ws://localhost:64923-64932`. For distribution, consider a native messaging host or a secure channel.
- Token estimation uses a BPE-style heuristic (~95% accuracy). For exact counts, integrate model-specific tokenizers.
- The current VSCode extension runs locally and will not work across remote development sessions (SSH/Containers) without an alternate bridge.
- Gemini token estimates are reduced by 15% to account for SentencePiece efficiency.
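The port-scanning discovery over the `ws://localhost:64923-64932` range noted above can be sketched as follows. `WebSocket` here stands in for the browser API, and the PING/PONG payload shapes (`type`, `workspace`) are assumptions, not the bridge's actual protocol.

```javascript
// The ten candidate ports the bridge scans, per the range in the notes.
const BRIDGE_PORTS = Array.from({ length: 10 }, (_, i) => 64923 + i);

// Hypothetical sketch: try each port, send PING, collect PONG responders
// so the popup can offer an instance picker.
function discoverInstances(onFound) {
  for (const port of BRIDGE_PORTS) {
    const ws = new WebSocket(`ws://localhost:${port}`);
    ws.onopen = () => ws.send(JSON.stringify({ type: "PING" }));
    ws.onmessage = (e) => {
      const msg = JSON.parse(e.data);
      if (msg.type === "PONG") onFound({ port, workspace: msg.workspace });
      ws.close();
    };
    ws.onerror = () => {
      /* no server on this port; skip */
    };
  }
}
```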
- VSCode extension source: `webaibridge-vscode/src/extension.ts`
- VSCode extension config: `webaibridge-vscode/package.json`
- Web extension popup: `web-extension/src/popup.html`, `web-extension/src/popup.js`
- Web extension background: `web-extension/src/background.js`
- Content script: `web-extension/src/content.js`
- Tokenizer: `web-extension/src/tokenizer.js`
- GitHub Actions: `.github/workflows/package-extension.yml`
- Landing page: `landing-page.md`
Last updated: 2026-01-16