Conversation

@TheSethRose

Description

This PR fixes an issue where Gemini models accessed via OpenRouter would fail with a 400 error when using tools. The error was caused by missing reasoning_details (thought signatures) in follow-up requests; OpenRouter (and Google AI Studio behind it) requires these to be preserved and replayed.

Error fixed:
Gemini models require OpenRouter reasoning details to be preserved in each request. ... Function call is missing a thought_signature in functionCall parts.
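
In practice this means the assistant turn that carried the tool call must be replayed with its reasoning_details intact on the next request. The snippet below only illustrates that payload shape; the field values and the detail type string are placeholders, not taken from the provider's actual code.

```typescript
// Illustration only: an assistant turn replayed to OpenRouter on the follow-up
// request. All values are placeholders; reasoning_details carries the opaque
// "thought signature" that Gemini expects to see echoed back.
const replayedAssistantTurn = {
	role: 'assistant',
	content: '',
	tool_calls: [
		{
			id: 'call_abc123',
			type: 'function',
			function: { name: 'read_file', arguments: '{"path":"src/index.ts"}' },
		},
	],
	// Omitting this field is what triggers the 400 above.
	reasoning_details: [
		{ type: 'reasoning.encrypted', data: '<opaque thought signature>' },
	],
};
```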

Changes:

  • Implemented OpenRouterEndpoint to capture reasoning_details from streaming responses.
  • Added a reasoningCache in OpenRouterLMProvider to associate reasoning blocks with tool call IDs.
  • Injected cached reasoning_details back into outgoing request messages for assistant turns containing tool calls (see the sketch after this list).
  • Added comprehensive unit tests in src/extension/byok/vscode-node/test/openRouterProvider.spec.ts.
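
As referenced above, here is a minimal sketch of the cache-and-inject flow. The type shapes and method names are illustrative assumptions for this description, not the extension's real OpenRouterLMProvider interfaces.

```typescript
interface ReasoningDetail {
	// e.g. 'reasoning.encrypted' for Gemini thought signatures (illustrative)
	type: string;
	data: string;
}

interface AssistantToolCall {
	id: string;
	// function name/arguments omitted: only the id matters for cache lookup
}

interface ChatMessage {
	role: 'system' | 'user' | 'assistant' | 'tool';
	content: string;
	tool_calls?: AssistantToolCall[];
	reasoning_details?: ReasoningDetail[];
}

export class ReasoningCache {
	private readonly byToolCallId = new Map<string, ReasoningDetail[]>();

	/** While streaming a response: remember which reasoning_details arrived
	 *  alongside which tool call. */
	store(toolCallId: string, details: ReasoningDetail[]): void {
		this.byToolCallId.set(toolCallId, details);
	}

	/** When building the next request: re-attach cached reasoning_details to
	 *  assistant turns that carry tool calls, so Gemini sees its own thought
	 *  signatures again. */
	inject(messages: ChatMessage[]): ChatMessage[] {
		return messages.map(message => {
			if (message.role !== 'assistant' || !message.tool_calls?.length) {
				return message;
			}
			const details = message.tool_calls.flatMap(call => this.byToolCallId.get(call.id) ?? []);
			return details.length > 0 ? { ...message, reasoning_details: details } : message;
		});
	}

	/** Drop entries whose tool calls are no longer part of the conversation,
	 *  so the cache does not grow without bound. */
	cleanup(liveToolCallIds: ReadonlySet<string>): void {
		for (const id of [...this.byToolCallId.keys()]) {
			if (!liveToolCallIds.has(id)) {
				this.byToolCallId.delete(id);
			}
		}
	}
}
```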

Testing

  • Added unit tests covering:
    • Model fetching and parsing.
    • Reasoning detail caching and cleanup.
    • Injection of reasoning details into request bodies (a hedged test sketch follows this list).
  • Verified the fix by compiling the extension and testing the VSIX.
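
A sketch of what such a test might look like, assuming a vitest-style runner and the ReasoningCache sketch shown earlier (the import path is hypothetical, not the provider's real module layout):

```typescript
import { describe, expect, it } from 'vitest';
// Hypothetical path; refers to the ReasoningCache sketch above.
import { ReasoningCache } from './reasoningCache';

describe('reasoning detail caching', () => {
	const detail = { type: 'reasoning.encrypted', data: 'signature' };

	it('injects cached reasoning_details into assistant turns with tool calls', () => {
		const cache = new ReasoningCache();
		cache.store('call_1', [detail]);

		const [turn] = cache.inject([
			{ role: 'assistant', content: '', tool_calls: [{ id: 'call_1' }] },
		]);

		expect(turn.reasoning_details).toEqual([detail]);
	});

	it('cleans up entries for tool calls that are no longer live', () => {
		const cache = new ReasoningCache();
		cache.store('call_1', [detail]);
		cache.cleanup(new Set<string>());

		const [turn] = cache.inject([
			{ role: 'assistant', content: '', tool_calls: [{ id: 'call_1' }] },
		]);

		expect(turn.reasoning_details).toBeUndefined();
	});
});
```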

Checklist

  • Signed CLA (Verified via GitHub profile)
  • Tests added/updated
  • Followed project architecture and coding standards
  • Verified compilation output

Fixes: Copilot Request id: 8aa4dd19-4578-4e8c-abe8-986364545cd9

@TheSethRose
Author

@microsoft-github-policy-service agree
