Replace @inngest/ai adapter system with Vercel AI SDK #294
Draft
djfarrelly wants to merge 6 commits into main from
Conversation
Swap out the custom @inngest/ai abstraction layer and per-provider adapters
(OpenAI, Anthropic, Gemini, Azure, Grok) in favor of the Vercel AI SDK's
`generateText()` and `LanguageModelV1` interface. Users now pass AI SDK
model instances directly instead of using re-exported factory functions.
- Add `ai` package as dependency; remove `@inngest/ai`
- Rewrite AgenticModel to use generateText() + step.run() instead of
step.ai.infer() + raw HTTP calls
- Add src/converters.ts for Message[] <-> CoreMessage[] bridging
- Replace AiAdapter.Any with LanguageModelV1 across agent.ts and network.ts
- Delete src/adapters/ directory and src/models.ts
- Update tests to use mock LanguageModelV1 instead of fetch mocks
BREAKING CHANGE: Model configuration now requires Vercel AI SDK provider
packages (e.g. @ai-sdk/openai) instead of @inngest/ai factory functions.
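The Message[] <-> CoreMessage[] bridge in src/converters.ts can be sketched as follows. The types below are simplified local stand-ins for the AgentKit Message and AI SDK CoreMessage shapes, not the real library types:

```typescript
// Simplified stand-in for AgentKit's Message shape (assumption).
type Message = {
  role: "system" | "user" | "assistant";
  type: "text";
  content: string;
};

// Simplified stand-in for the AI SDK's CoreMessage shape (assumption).
type CoreMessage = {
  role: "system" | "user" | "assistant";
  content: string;
};

// Bridge AgentKit messages to the shape generateText() consumes.
function messagesToCoreMessages(messages: Message[]): CoreMessage[] {
  return messages.map((m) => ({ role: m.role, content: m.content }));
}
```

In the real converter the roles carry different payload types, but the direction of the mapping is the same: AgentKit's internal history becomes the AI SDK's message list before each inference call.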
djfarrelly commented Mar 5, 2026
  if (step) {
-   result = (await step.ai.infer(stepID, {
+   const doInference = async (): Promise<SerializableResult> => {
+     const result = await generateText({
this is the core change here; need to test for observability (o11y)
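The core change above (step.ai.infer() replaced by generateText() inside step.run()) can be sketched as below. Both generateText and the step object are minimal stand-ins, not the real `ai` or Inngest APIs:

```typescript
type SerializableResult = { text: string };

// Stand-in for the AI SDK's generateText() (assumption: real call takes
// a model plus messages and returns a richer result object).
async function generateText(opts: { prompt: string }): Promise<{ text: string }> {
  return { text: `echo: ${opts.prompt}` };
}

// Stand-in for Inngest's step tools: step.run() memoizes the callback's
// serializable result so function replays skip re-running the inference.
const step = {
  run: async <T>(id: string, fn: () => Promise<T>): Promise<T> => fn(),
};

async function infer(stepID: string, prompt: string): Promise<SerializableResult> {
  // The inference is factored into a closure so it can run either inside
  // a durable step (when step tools are available) or directly.
  const doInference = async (): Promise<SerializableResult> => {
    const result = await generateText({ prompt });
    return { text: result.text };
  };
  return step ? step.run(stepID, doInference) : doInference();
}
```

The key property worth testing for observability is that the AI SDK call now appears as an ordinary memoized step rather than a first-class `step.ai.infer` span.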
- Fix mapToolChoice return type, removing the misleading "none" variant
- Deduplicate identical system/user/assistant branches in messagesToCoreMessages
- Remove type assertions in resultToMessages in favor of typed locals
- Remove the triplicate RoutingConstructor interface declaration in agent.ts
- Add 24 unit tests covering all converter functions
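The mapToolChoice fix can be sketched as follows. The input and output unions here are assumptions inferred from the commit message, not the exact project types:

```typescript
// Assumed AgentKit-side tool_choice values: "auto", "any", or a tool name.
type AgentToolChoice = "auto" | "any" | string;

// Assumed AI SDK-side ToolChoice shape. Note there is no "none" variant
// in the return type, matching the fix described above.
type CoreToolChoice =
  | "auto"
  | "required"
  | { type: "tool"; toolName: string };

function mapToolChoice(choice: AgentToolChoice): CoreToolChoice {
  switch (choice) {
    case "auto":
      return "auto";
    case "any":
      // "any" means the model must call some tool.
      return "required";
    default:
      // A specific tool name forces that particular tool.
      return { type: "tool", toolName: choice };
  }
}
```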
- Text-only response
- Tool call response (with tool definitions passed)
- Mixed text + tool call response
- Error propagation from generateText() (the main medium-priority gap)
- Non-Error exception propagation (e.g. thrown strings)
- Verifies tools/toolChoice aren't passed when no tools are provided
- createAgenticModelFromLanguageModel factory function
- Add try/catch around z.toJSONSchema() in toolsToAiTools so incompatible
schemas (e.g. Zod v3 from MCP) fall back to an open object schema
instead of crashing inference
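The fallback described above can be sketched generically. Here the schema converter is passed in as a parameter standing in for z.toJSONSchema(), so the shape of the try/catch is visible without depending on Zod itself:

```typescript
// Open object schema used when a tool's schema can't be converted
// (assumption: this mirrors the fallback described in the commit).
const OPEN_OBJECT_SCHEMA: Record<string, unknown> = {
  type: "object",
  additionalProperties: true,
};

// Convert a tool schema to JSON Schema, falling back to an open object
// schema if the converter throws (e.g. an incompatible Zod v3 schema
// arriving from an MCP server).
function safeToolSchema(
  schema: unknown,
  toJSONSchema: (s: unknown) => Record<string, unknown>
): Record<string, unknown> {
  try {
    return toJSONSchema(schema);
  } catch {
    return OPEN_OBJECT_SCHEMA;
  }
}
```

The trade-off is that a tool with an unconvertible schema loses argument validation but remains callable, which is preferable to crashing the whole inference.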
- Extract createMockModel into shared __tests__/test-helpers.ts, removing
duplication across routing-with-done, network, and model test files
- Add standalone agent.run() and withModel() tests (6 new tests)
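The shared createMockModel helper can be sketched as a factory for an object with a LanguageModelV1-like surface. The interface below is a trimmed stand-in, not the full AI SDK type:

```typescript
// Trimmed stand-in for the AI SDK's LanguageModelV1 interface (assumption:
// only the fields the tests exercise are modeled here).
interface MockLanguageModel {
  specificationVersion: "v1";
  provider: string;
  modelId: string;
  doGenerate(): Promise<{ text: string; finishReason: string }>;
}

// Returns a model that always answers with the given canned text,
// letting tests exercise agent/network logic without fetch mocks.
function createMockModel(responseText: string): MockLanguageModel {
  return {
    specificationVersion: "v1",
    provider: "mock",
    modelId: "mock-model",
    doGenerate: async () => ({ text: responseText, finishReason: "stop" }),
  };
}
```

Centralizing this in __tests__/test-helpers.ts keeps every test file's mock consistent, so a change to the model surface only needs updating in one place.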