A lightweight Chrome extension that shows exact token counts, a live context bar, output tokens, and model info — right inside gemini.google.com.
- Exact token counts — pulled directly from Gemini's own `usageMetadata`, not a guess
- Live context bar — visual progress bar with tiered warning states (75% → orange, 90%+ → red)
- Output tokens — shows `N out` alongside the prompt token count
- Threshold markers — subtle tick marks at 75% and 90% on the context bar
- Dynamic tooltip — live % context usage on hover
- Conversation history — token counts persisted per-conversation via `chrome.storage.local`
- Model info — detects and displays the active Gemini model
- Fallback tokenizer — uses a local `o200k_base` estimate before the first real response arrives
The counter widget appears in Gemini's input bar, showing estimated input tokens, a progress bar, and the active model — all without leaving the page.
Works on Chrome, Edge, and Arc (any Chromium-based browser).
- Download this repo — click Code → Download ZIP or `git clone`
- Open `chrome://extensions` in your browser
- Enable Developer mode (toggle in the top-right)
- Click Load unpacked and select the `gemini_counter_v2` folder
- Navigate to gemini.google.com — the counter appears automatically ✅
Gemini's web app uses internal, obfuscated endpoints (unlike Claude's clean REST API). This extension works around that by:
- Injecting a bridge script into the page context to intercept `fetch` and `XMLHttpRequest`
- Parsing JSON response bodies to find `usageMetadata` — the same data Google's official API returns
- Extracting `promptTokenCount`, `candidatesTokenCount`, and `thoughtsTokenCount` directly
- Falling back to a local tokenizer estimate until the first real API response arrives
Token counts come straight from Gemini — so they're exact, not estimated.
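The interception approach above can be sketched roughly as follows. This is a simplified illustration, not the actual `bridge.js`: the real script also hooks `XMLHttpRequest` and handles streamed bodies, and the helper name `extractUsage` is hypothetical.

```javascript
// Hypothetical helper: pull the first flat usageMetadata object
// out of a raw response body.
function extractUsage(body) {
  const match = body.match(/"usageMetadata"\s*:\s*({[^{}]*})/);
  return match ? JSON.parse(match[1]) : null;
}

if (typeof window !== 'undefined') {
  const originalFetch = window.fetch;
  window.fetch = async function (...args) {
    const response = await originalFetch.apply(this, args);
    // Clone so the page's own reader of the body is unaffected.
    response.clone().text().then((body) => {
      const usage = extractUsage(body);
      if (usage) {
        // Relay to the content script; target our own origin, not '*'.
        window.postMessage({ kind: 'usage', usage }, window.location.origin);
      }
    }).catch(() => { /* opaque or non-text bodies: ignore */ });
    return response;
  };
}
```

Because the hook clones the response before reading it, Gemini's own stream consumer is never disturbed.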
- All data stays local — no external servers, no analytics, no tracking
- Only reads Gemini's own response data from your browser session
- Makes zero additional network requests
- Token history stored only in `chrome.storage.local`
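A minimal sketch of how per-conversation history might be persisted, assuming the 50-entry cap mentioned below; the `history:` key scheme and function names are assumptions, not the extension's actual code.

```javascript
const MAX_ENTRIES = 50;

// Pure helper: append an entry, keeping only the most recent 50.
function appendEntry(history, entry) {
  const next = [...history, entry];
  return next.length > MAX_ENTRIES ? next.slice(-MAX_ENTRIES) : next;
}

// Persist one token-count snapshot for a conversation (assumed key scheme).
function saveEntry(conversationId, entry) {
  const key = `history:${conversationId}`;
  chrome.storage.local.get(key, (items) => {
    const history = appendEntry(items[key] || [], entry);
    chrome.storage.local.set({ [key]: history });
  });
}
```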
| Model | Context Window |
|---|---|
| Gemini 2.5 Pro | 1,048,576 tokens (1M) |
| Gemini 2.5 Flash | 1,048,576 tokens (1M) |
| Gemini 2.5 Flash Lite | 1,048,576 tokens (1M) |
| Gemini 2.0 Flash | 1,048,576 tokens (1M) |
| Gemini 1.5 Pro | 2,097,152 tokens (2M) |
| Gemini 1.5 Flash | 1,048,576 tokens (1M) |
Unknown models fall back to 1M and log a warning to the browser console.
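The lookup-with-fallback behavior can be sketched like this (limits taken from the table above; the constant and function names are illustrative, not necessarily those in `constants.js`):

```javascript
const MODEL_LIMITS = {
  'gemini-2.5-pro': 1048576,
  'gemini-2.5-flash': 1048576,
  'gemini-2.5-flash-lite': 1048576,
  'gemini-2.0-flash': 1048576,
  'gemini-1.5-pro': 2097152,
  'gemini-1.5-flash': 1048576,
};
const DEFAULT_LIMIT = 1048576; // 1M-token fallback for unknown models

function contextLimit(modelId) {
  if (modelId in MODEL_LIMITS) return MODEL_LIMITS[modelId];
  console.warn(`[token-counter] unknown model "${modelId}", assuming 1M context`);
  return DEFAULT_LIMIT;
}
```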
gemini_counter_v2/
├── manifest.json # Extension manifest (MV3)
├── icons/
│ ├── icon16.png
│ ├── icon48.png
│ └── icon128.png
├── assets/
│ └── screenshot.png # UI screenshot
└── src/
├── styles.css # Counter widget styles
├── vendor/
│ └── o200k_base.js # Local tokenizer fallback
├── injected/
│ └── bridge.js # Page-context fetch interceptor
└── content/
├── constants.js # Model limits & config
├── bridge-client.js # Messaging between contexts
├── tokens.js # Token counting logic
├── ui.js # Widget rendering
└── main.js # Entry point & orchestration
- UTF-8 corruption fix — `TextDecoder` now uses a single decoder with the correct `{ stream: true }` flag, preventing multi-byte character corruption at chunk boundaries
- postMessage origin hardened — all `window.postMessage` calls target `window.location.origin` instead of `'*'`
- Ghost duplicate widget fixed — `attach()` removes the container from its current parent before re-inserting
- DOM observer debounced — `MutationObserver` debounced 200 ms to avoid layout thrashing on Gemini's SPA
- Dead hash API removed — unused `kind: 'hash'` handler removed from `bridge.js`
- `accumulatedText` dead variable removed — cleaned up unused stream handler variable
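The UTF-8 fix is worth a quick illustration: a multi-byte character split across two network chunks is corrupted if each chunk gets a fresh `decode()` call, but survives a single decoder in streaming mode.

```javascript
const encoded = new TextEncoder().encode('héllo'); // 'é' is 2 bytes in UTF-8
const chunkA = encoded.slice(0, 2); // ends mid-character
const chunkB = encoded.slice(2);

// Wrong: a fresh decode per chunk mangles the split character
// into replacement characters (U+FFFD).
const broken = new TextDecoder().decode(chunkA) + new TextDecoder().decode(chunkB);

// Right: one decoder, { stream: true } on every chunk but the last,
// so partial bytes are held until the rest arrives.
const decoder = new TextDecoder();
const fixed = decoder.decode(chunkA, { stream: true }) + decoder.decode(chunkB);

// broken !== 'héllo', fixed === 'héllo'
```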
- Output token display (`N out`)
- Tiered warning states (normal → orange at 75% → red at 90%+)
- Bar threshold markers at 75% and 90%
- Dynamic tooltip with live % context usage
- Unknown model console warnings
- Conversation history storage (last 50 entries per conversation)
- `gemini-2.5-flash-lite` added to known models
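The tiered warning states reduce to a small threshold check; a sketch using the 75%/90% cutoffs listed above (the function name is illustrative):

```javascript
// Map context usage to a bar state: normal < 75% <= orange < 90% <= red.
function barState(usedTokens, limit) {
  const pct = (usedTokens / limit) * 100;
  if (pct >= 90) return 'red';
  if (pct >= 75) return 'orange';
  return 'normal';
}
```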
Contributions are welcome! Please read CONTRIBUTING.md before opening a PR.
MIT — free to use, modify, and distribute.