
fix: correct gpt-5-mini context window to 272K#143

Closed
jeff-atriumn wants to merge 1 commit into main from fix/gpt5mini-context-window

Conversation

@jeff-atriumn
Member

Summary

The pricing table listed gpt-5-mini with a 400K context window, but OpenAI's actual limit is 272K. As a result, pre-pass chunks sized at 75% of the window (75% of 400K = 300K tokens) still exceeded the real limit.
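The arithmetic above can be sketched roughly as follows. This is a minimal illustration only; the function and constant names are hypothetical, not taken from the actual codebase:

```python
def prepass_chunk_budget(context_window: int, fraction: float = 0.75) -> int:
    """Tokens allotted to a pre-pass chunk, as a fraction of the model's
    context window (0.75 per the PR description)."""
    return int(context_window * fraction)

# With the 400K table entry, chunks come out at 300K tokens,
# which would exceed a 272K hard limit:
budget = prepass_chunk_budget(400_000)
assert budget == 300_000
assert budget > 272_000
```

This shows why the chunk size only depends on the configured window: if the table overstates the window, every chunk overshoots the real limit regardless of whether chunking is enabled.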

Test plan

  • 439 tests pass

🤖 Generated with Claude Code

Was set to 400K but OpenAI's actual limit is 272K tokens for
gpt-5-mini. This caused pre-pass chunks to exceed the limit
despite chunking being enabled.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@jeff-atriumn
Member Author

Closing — tokencost confirms 400K is correct. The issue was insufficient chunking headroom, fixed in #144.

