feat(litellm): add support for local proxy without API key#46

Open
vinit13792 wants to merge 2 commits into repowise-dev:main from vinit13792:main

Conversation

@vinit13792

Summary

  • Add litellm to the interactive provider selection menu
  • Support LITELLM_BASE_URL for local proxy deployments (no API key required)
  • Auto-add openai/ prefix when using api_base for proper LiteLLM routing
  • Add dummy API key for local proxies (OpenAI SDK requirement)
  • Add validation and tests for litellm provider configuration

Changes

CLI (packages/cli)

  • Added litellm to _PROVIDER_ENV and _PROVIDER_SIGNUP in ui.py
  • Updated _detect_provider_status() to check for LITELLM_BASE_URL
  • Updated interactive_provider_select() to skip API key prompt for local proxy
  • Added litellm handling in helpers.py resolve_provider() and validate_provider_config()
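The status-detection change above can be sketched roughly like this (a minimal illustration, not the actual ui.py code; the function name and return values are assumptions):

```python
import os

def detect_litellm_status() -> str:
    """Hypothetical sketch of the provider-status check: a local LiteLLM
    proxy counts as configured when LITELLM_BASE_URL is set, with or
    without an API key."""
    if os.environ.get("LITELLM_BASE_URL"):
        return "configured"  # local proxy; skip the API key prompt
    if os.environ.get("LITELLM_API_KEY"):
        return "configured"  # hosted endpoint authenticated by key
    return "not configured"
```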

Core (packages/core)

  • Updated LiteLLMProvider to auto-add openai/ prefix when api_base is set
  • Added dummy API key (sk-dummy) for local proxy without authentication

Tests

  • Added 3 litellm validation tests in tests/unit/cli/test_helpers.py
  • Created new tests/unit/test_providers/test_litellm_provider.py with 14 tests
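One of the validation tests might look roughly like this (a self-contained sketch; `resolve_litellm_key` is a hypothetical stand-in for the logic under test, not the actual helper name):

```python
import os

def resolve_litellm_key() -> str:
    # Hypothetical stand-in for the validation logic under test
    if os.environ.get("LITELLM_BASE_URL"):
        # local proxy: fall back to the dummy key
        return os.environ.get("LITELLM_API_KEY") or "sk-dummy"
    key = os.environ.get("LITELLM_API_KEY")
    if not key:
        raise ValueError("set LITELLM_API_KEY or LITELLM_BASE_URL")
    return key

def test_local_proxy_needs_no_key(monkeypatch):
    monkeypatch.delenv("LITELLM_API_KEY", raising=False)
    monkeypatch.setenv("LITELLM_BASE_URL", "http://localhost:4000")
    assert resolve_litellm_key() == "sk-dummy"
```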

Test Plan

  • All existing tests pass
  • New unit tests added and passing
  • Manual testing with local litellm proxy at localhost:4000
  • Code formatted with ruff
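The manual check against a local proxy presumably amounts to something like this (the port is from the PR; the CLI invocation itself is omitted since the exact command isn't shown here):

```shell
# Point the provider at a local LiteLLM proxy; no API key needed,
# the provider falls back to the sk-dummy placeholder internally.
export LITELLM_BASE_URL=http://localhost:4000
unset LITELLM_API_KEY
echo "LITELLM_BASE_URL=$LITELLM_BASE_URL"
```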

🤖 Generated with Claude Code

@RaghavChamadiya
Collaborator

This is clean. The sk-dummy workaround is exactly what the LiteLLM docs recommend for local proxies, and the openai/ prefix auto-add is the right UX touch. 14 tests is solid coverage.

One small request before merge: add a comment next to the sk-dummy value so it does not trip future secret-scanning tools:

# LiteLLM requires a non-empty key even for unauthenticated local proxies (OpenAI SDK requirement)
call_kwargs["api_key"] = "sk-dummy"

This appears in two places in litellm.py (_generate_with_retry and stream_chat). Once that comment is added, I am ready to merge.

… false positives

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@vinit13792
Author

Yup, makes sense! Updated. Ready for you to merge.

