feat: Add OpenAI-compatible provider for local LLMs and custom APIs #1
d33pstatetech-stack wants to merge 2 commits into main from
Conversation
Add a new 'Custom (OpenAI-Compatible)' provider option that supports any service implementing the OpenAI chat completions API:
- Local LLMs: Ollama, llama.cpp, vLLM, LM Studio, LocalAI
- Cloud APIs: Together AI, Groq, Fireworks, Deepseek, Mistral, etc.

Changes:
- lib/openai-compatible-auth.ts: New auth module (flexible base URL, optional API key)
- lib/ai-client.ts: OpenAICompatibleClient class, updated AIProviderType
- lib/settings.ts: New provider type, free-form model getter
- app/api/settings/route.ts: CRUD for base URL, model name, API key
- app/api/settings/test/route.ts: Connection test with friendly errors
- app/settings/page.tsx: New provider tab with config UI and test button
- .env.example: Documented new env vars with common endpoint examples
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 9b6736e999
```diff
 }

-function ProviderToggle({ value, onChange }: { value: 'anthropic' | 'openai' | 'minimax'; onChange: (v: 'anthropic' | 'openai' | 'minimax') => void }) {
+function ProviderToggle({ value, onChange }: { value: 'anthropic' | 'openai' | 'minimax' | 'openai_compatible'; onChange: (v: 'anthropic' | 'openai' | 'minimax' | 'openai_compatible') => void }) {
```
Render a Custom provider option in the provider toggle
The settings UI adds openai_compatible to provider state and save logic, but this toggle still renders only three buttons (Anthropic/OpenAI/MiniMax). As a result, users cannot switch to the new custom provider from the UI unless they manually set the DB/API value first, which makes the main feature path effectively inaccessible.
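One way to address this suggestion (a sketch only; the options array and helper below are assumptions, not Siftly's actual component) is to drive the toggle from a single provider list, so a newly added provider cannot be missed in the render path:

```typescript
// Provider union mirrors the diff above; the options array is a hypothetical
// refactor so every provider renders a button automatically.
type Provider = 'anthropic' | 'openai' | 'minimax' | 'openai_compatible'

const PROVIDER_OPTIONS: { value: Provider; label: string }[] = [
  { value: 'anthropic', label: 'Anthropic' },
  { value: 'openai', label: 'OpenAI' },
  { value: 'minimax', label: 'MiniMax' },
  { value: 'openai_compatible', label: 'Custom' },
]

// In the JSX, PROVIDER_OPTIONS.map(...) would render one button per entry,
// calling onChange(option.value) on click; this helper just exposes the labels.
function providerLabels(): string[] {
  return PROVIDER_OPTIONS.map((o) => o.label)
}
```

With this shape, adding a fifth provider later is a one-line change to the array rather than a new hand-written button.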
```diff
+if (openaiCompatibleModel !== undefined) {
+  const trimmed = openaiCompatibleModel.trim()
```
Validate OpenAI-compatible fields before calling trim
These new settings branches call .trim() without checking that the incoming JSON value is a string. If a client sends a malformed payload (for example {"openaiCompatibleModel": 123}), the route throws a runtime TypeError and returns a 500 instead of a clean 400 validation error. The older key handlers already guard with typeof ... === 'string'; these branches should do the same to avoid needless server errors.
…ble input types

Address Codex review feedback:
- Render the 'Custom' button in ProviderToggle so users can actually switch to the openai_compatible provider from the UI
- Add typeof string checks on openaiCompatibleModel, openaiCompatibleBaseUrl, and openaiCompatibleApiKey before calling .trim(), matching the pattern used by the existing key handlers to return 400 instead of 500 on malformed payloads
Summary
Adds a new "Custom (OpenAI-Compatible)" provider option that lets Siftly work with any service implementing the OpenAI chat completions API.
Supported Endpoints
Local LLMs:
- Ollama (http://localhost:11434/v1)
- llama.cpp (http://localhost:8080/v1)
- vLLM (http://localhost:8000/v1)
- LM Studio (http://localhost:1234/v1)

Cloud APIs:
- Together AI, Groq, Fireworks, Deepseek, Mistral, etc.
Changes
- lib/openai-compatible-auth.ts: new auth module (flexible base URL, optional API key)
- lib/ai-client.ts: OpenAICompatibleClient class + updated AIProviderType union
- lib/settings.ts: new provider type, free-form model getter
- app/api/settings/route.ts: CRUD for base URL, model name, API key
- app/api/settings/test/route.ts: connection test with friendly errors
- app/settings/page.tsx: new provider tab with config UI and test button
- .env.example: documented OPENAI_COMPATIBLE_BASE_URL and OPENAI_COMPATIBLE_API_KEY with common examples

How It Works
When "Custom (OpenAI-Compatible)" is selected, the user enters a base URL (e.g. http://localhost:11434/v1 for Ollama) and a model name (e.g. llama3.2), plus an optional API key.

All existing AI features (categorization, search, image analysis, mind maps) work with the new provider.
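The flow above can be sketched as request construction (a minimal illustration with assumed names; the real OpenAICompatibleClient in lib/ai-client.ts may differ): the only provider-specific inputs are the base URL, the model name, and an optional API key.

```typescript
// Config shape is an assumption for illustration, not Siftly's actual types.
interface CompatibleConfig {
  baseUrl: string   // e.g. http://localhost:11434/v1 for Ollama
  model: string     // e.g. llama3.2
  apiKey?: string   // optional: many local servers ignore auth entirely
}

// Builds the pieces of an OpenAI-style chat completions request.
// The Authorization header is only attached when an API key is configured.
function buildChatRequest(cfg: CompatibleConfig, prompt: string) {
  const headers: Record<string, string> = { 'Content-Type': 'application/json' }
  if (cfg.apiKey) headers['Authorization'] = `Bearer ${cfg.apiKey}`
  return {
    // Strip a trailing slash so the path joins cleanly.
    url: `${cfg.baseUrl.replace(/\/$/, '')}/chat/completions`,
    headers,
    body: JSON.stringify({
      model: cfg.model,
      messages: [{ role: 'user', content: prompt }],
    }),
  }
}
```

Because every listed endpoint (Ollama, vLLM, Groq, Together AI, ...) accepts this same request shape at /chat/completions, swapping providers reduces to changing the config.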