can't prompt, Ollama not supported #7

@antlep

Description

hello Korben Dallas 😛

I've downloaded the deepseek model via ollama as you suggested, but in the interface I can't prompt:

Provider 'ollama' not supported. Available providers: black-forest-labs,cerebras,cohere,fal-ai,featherless-ai,hf-inference,fireworks-ai,groq,hyperbolic,nebius,novita,nscale,openai,ovhcloud,replicate,sambanova,together
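For reference, the provider list in the error can be inspected directly. A minimal Python sketch, where the string is copied verbatim from the error message above; the reading that these are hosted inference providers (as opposed to a local runtime like Ollama) is my assumption, not something the error states:

```python
# Provider list copied verbatim from the error message above.
available = ("black-forest-labs,cerebras,cohere,fal-ai,featherless-ai,"
             "hf-inference,fireworks-ai,groq,hyperbolic,nebius,novita,"
             "nscale,openai,ovhcloud,replicate,sambanova,together")

providers = available.split(",")

# 'ollama' does not appear anywhere in the supported set,
# which matches the "not supported" error in the interface.
print("ollama" in providers)  # → False
print(len(providers))         # → 17
```

So the failure is not about the model download itself: the model is present locally, but the interface only routes requests to the providers listed above, none of which is `ollama`.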
