Dynamic model listing #4
It turns out there is unfortunately no clear API from Fireworks to get a list of models that are available and that return valid chat results. I surveyed their two endpoints (the OpenAI-compatible one and the Fireworks-native one) and the ways you can filter them to generate some candidate sets of models, as follows:

```shell
$ curl \
    -H "Authorization: Bearer $LLM_FIREWORKS_KEY" \
    https://api.fireworks.ai/inference/v1/models | \
    jq -r '.data[] | .id' \
    > inf_models.txt
$ curl \
    -H "Authorization: Bearer $LLM_FIREWORKS_KEY" \
    https://api.fireworks.ai/inference/v1/models | \
    jq -r '.data[] | select(.supports_chat) | .id' \
    > inf_models_chat.txt
$ curl \
    -H "Authorization: Bearer $LLM_FIREWORKS_KEY" \
    'https://api.fireworks.ai/v1/accounts/fireworks/models?filter=supports_serverless=true' | \
    jq -r '.models[] | .name' \
    > fw_serverless.txt
```

These formed my three sets of possible candidates. I then tried hitting the chat endpoint with each candidate model. The results are as follows (as of today, 2026-04-04):
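Once the lists exist as files, they can be compared with standard tools. A minimal sketch using `comm`, with invented stand-in data in place of the real API output (in practice the inputs are the files produced by the curl commands above):

```shell
# Invented stand-in lists; in practice these come from the API calls above.
# comm requires sorted input, which these are.
printf 'model-a\nmodel-b\nmodel-c\n' > inf_models_chat.txt
printf 'model-b\nmodel-c\nmodel-d\n' > fw_serverless.txt

# -12 suppresses lines unique to each file, keeping only the intersection.
comm -12 inf_models_chat.txt fw_serverless.txt
# model-b
# model-c
```

`comm -23` and `comm -13` give the models unique to each list, which is how the mismatches between the endpoints show up.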
What this reveals is that there is seemingly no clearly documented way to get all the models that are valid for chatting out of the box. Every way of querying for a list of models misses at least one valid option, and every one returns at least one invalid option. This is cursed. I reached out in Fireworks' Discord for clarification on what is up here. Hopefully they can fix this.
Summary
This PR replaces the static list of models supported by Fireworks with a call to their API to list the currently available models.
Additionally, it adds handling for the `supports_image_input` and `supports_tools` fields returned in the list of models, so that tools and vision can be used with Fireworks-hosted inference.

Closes #2
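As a sketch of how those capability fields can be consumed, here is an invented sample payload filtered with jq; the field names are the ones this PR handles, but the sample data itself is made up:

```shell
# Invented sample of a models payload, using the capability
# fields this PR handles (supports_image_input, supports_tools).
cat <<'EOF' > models.json
{"data": [
  {"id": "model-a", "supports_image_input": true,  "supports_tools": true},
  {"id": "model-b", "supports_image_input": false, "supports_tools": true}
]}
EOF

# Keep only models that advertise both vision and tool support.
jq -r '.data[] | select(.supports_image_input and .supports_tools) | .id' models.json
# model-a
```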
Demos
Model listing:
Tool calling support:
Vision support: