Skip to content

Conversation

@TheSethRose
Copy link

@TheSethRose TheSethRose commented Dec 20, 2025

Summary

This PR adds support for the OpenCode Zen BYOK (Bring Your Own Key) provider to the GitHub Copilot Chat extension. The OpenCode Zen provider allows users to connect to various AI models through the OpenCode Zen API service.

Changes

  1. OpenCodeZenLMProvider (src/extension/byok/vscode-node/openCodeZenProvider.ts):

    • New provider class extending BaseOpenAICompatibleLMProvider
    • Dynamic model discovery from /models API endpoint
    • Smart endpoint routing based on model family:
      • GPT models → /responses endpoint
      • Claude models → /messages endpoint
      • Gemini models → /models/{id} endpoint
      • Other models → /chat/completions endpoint
  2. Provider Registration (src/extension/byok/vscode-node/byokContribution.ts):

    • Added OpenCodeZenLMProvider to the BYOK contribution system
    • Proper dependency injection for required services
  3. Test Suite (src/extension/byok/vscode-node/test/openCodeZenProvider.spec.ts):

    • Comprehensive unit tests for model fetching
    • Endpoint routing tests
    • Mock API responses for testing

Supported Models

The provider supports multiple model families from OpenCode Zen:

  • GPT Series: gpt-5.2, gpt-5.1, gpt-5, etc.
  • Claude Series: claude-sonnet-4-5, claude-haiku-4-5, etc.
  • Gemini Series: gemini-3-pro, gemini-3-flash, etc.
  • OpenAI-Compatible: GLM, Kimi, Qwen, Grok Code Fast 1, etc.

Testing

  • All tests pass
  • Extension compiles successfully
  • Models are dynamically discovered from API rather than using static fallbacks

Checklist

  • Code compiles without errors
  • Tests pass
  • Follows existing code patterns and architecture
  • Proper error handling and logging
  • Comprehensive test coverage
  • Documentation in commit messages

Please review and let me know if any changes are needed!

- Add OpenCodeZenLMProvider class that extends BaseOpenAICompatibleLMProvider
- Implement model fetching from /models API endpoint
- Add endpoint routing for different model families (GPT, Claude, Gemini, etc.)
- Add comprehensive test suite for the provider
- Register provider in BYOK contribution system

The provider supports multiple model families from OpenCode Zen:
- GPT models (gpt-5.2, gpt-5.1, etc.) using /responses endpoint
- Claude models (claude-sonnet-4-5, etc.) using /messages endpoint
- Gemini models (gemini-3-pro, etc.) using /models/{id} endpoint
- OpenAI-compatible models (GLM, Kimi, Qwen, Grok, etc.) using /chat/completions

All models are dynamically discovered from the API rather than using static fallbacks.
@TheSethRose TheSethRose force-pushed the feature/opencode-zen-only branch from 112ca15 to 1414c4f Compare December 20, 2025 06:01
@TheSethRose
Copy link
Author

More Information: https://opencode.ai/docs/zen/

Copy link
Author

@TheSethRose TheSethRose left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

I've updated the package.json to include the opencodezen provider in the languageModelChatProviders contribution point. This ensures the provider is correctly registered and visible in the VS Code model list. I also fixed the compilation errors in the provider implementation.

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Projects

None yet

Development

Successfully merging this pull request may close these issues.

2 participants