
fix(samples): Use native Gemini integration instead of LiteLLM #603

Open

mattcarrollcode wants to merge 2 commits into google:main from mattcarrollcode:fix/use-native-gemini-integration

Conversation

@mattcarrollcode commented Feb 5, 2026

Summary

  • Fix the warning about using Gemini via LiteLLM:
    $ uv run .
    A2UI/samples/agent/adk/restaurant_finder/agent.py:107: UserWarning: [GEMINI_VIA_LITELLM] gemini/gemini-2.5-flash: You are using Gemini via LiteLLM. For better performance, reliability, and access to latest features, consider using Gemini directly through ADK's native Gemini integration. Replace LiteLlm(model='gemini/gemini-2.5-flash') with Gemini(model='gemini-2.5-flash'). Set ADK_SUPPRESS_GEMINI_LITELLM_WARNINGS=true to suppress this warning.
      model=LiteLlm(model=LITELLM_MODEL),
    
  • Replace the LiteLlm wrapper with ADK's native Gemini class for better performance, reliability, and access to the latest features
  • Rename env var from LITELLM_MODEL to GEMINI_MODEL

Test plan

  • Run each sample and verify no LiteLLM warning appears

Replace the LiteLlm wrapper with ADK's native Gemini class for better
performance, reliability, and access to the latest features. This removes
the warning about using Gemini via LiteLLM.

Changes:
- Import Gemini from google.adk.models.google_llm instead of LiteLlm
- Rename env var from LITELLM_MODEL to GEMINI_MODEL
- Update model format from "gemini/gemini-2.5-flash" to "gemini-2.5-flash"
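
For reference, the resulting wiring in each sample looks roughly like the sketch below. Only the Gemini import path, the GEMINI_MODEL env var, and the model argument come from this PR; the LlmAgent import path, the agent name, and the commented-out "before" lines are illustrative assumptions.

    import os

    from google.adk.agents import LlmAgent  # assumed import; the samples may differ
    from google.adk.models.google_llm import Gemini

    # Before (emitted the GEMINI_VIA_LITELLM warning):
    #   from google.adk.models.lite_llm import LiteLlm
    #   LITELLM_MODEL = os.getenv("LITELLM_MODEL", "gemini/gemini-2.5-flash")
    #   agent = LlmAgent(model=LiteLlm(model=LITELLM_MODEL), ...)

    # After: ADK's native Gemini class with the bare model name.
    GEMINI_MODEL = os.getenv("GEMINI_MODEL", "gemini-2.5-flash")
    agent = LlmAgent(
        name="restaurant_finder",  # illustrative name, not from the diff
        model=Gemini(model=GEMINI_MODEL),
        # ...rest of the agent configuration unchanged...
    )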

google-cla bot commented Feb 5, 2026

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.

Contributor

@gemini-code-assist bot left a comment

Code Review

This pull request correctly replaces the LiteLlm wrapper with the native Gemini class across several sample agent files, which should improve performance and reliability. The changes are mostly straightforward replacements of imports, environment variable names, and model instantiation. I've identified a minor inconsistency in variable naming for the Gemini model across the modified files and have provided suggestions to align them with Python's PEP 8 style guide for local variables (snake_case). This will improve code consistency and maintainability. Overall, this is a solid improvement.

Comment on lines 185 to 187

  GEMINI_MODEL = os.getenv("GEMINI_MODEL", "gemini-2.5-flash")
  agent = LlmAgent(
-     model=LiteLlm(model=LITELLM_MODEL),
+     model=Gemini(model=GEMINI_MODEL),

Severity: medium

The variable GEMINI_MODEL should be renamed to gemini_model to follow Python's snake_case convention for local variables. This improves consistency with the style used in samples/agent/adk/rizzcharts/__main__.py and adheres to PEP 8.

Suggested change

- GEMINI_MODEL = os.getenv("GEMINI_MODEL", "gemini-2.5-flash")
- agent = LlmAgent(
-     model=Gemini(model=GEMINI_MODEL),
+ gemini_model = os.getenv("GEMINI_MODEL", "gemini-2.5-flash")
+ agent = LlmAgent(
+     model=Gemini(model=gemini_model),
References
  1. The repository style guide (line 9) states that code should follow relevant style guides for each language. For Python, PEP 8 is the standard, which recommends snake_case for local variables. (link)
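
Applied in place, the bot's suggestion would leave the snippet looking roughly like this (a sketch: the import lines, the agent name, and any other LlmAgent arguments are assumptions based on the surrounding diff, not part of the suggestion itself):

    import os

    from google.adk.agents import LlmAgent  # assumed import path
    from google.adk.models.google_llm import Gemini

    # snake_case local variable per PEP 8, matching rizzcharts/__main__.py
    gemini_model = os.getenv("GEMINI_MODEL", "gemini-2.5-flash")

    agent = LlmAgent(
        name="sample_agent",  # illustrative; not part of the suggestion
        model=Gemini(model=gemini_model),
    )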
