
Allow URI in model name #3

@kfsone

To support using more than one ollama-style endpoint or endpoints on multiple machines/containers, allow specifying a URI in model names:

OLLAMA_API_URL=http://localhost:11434
DEFAULT_MODELS=http://ollama-pi:11434/gemma3:1b,http://ollama-pc:11434/gemma3:4b,http://ollama-mini:11434/gemma3:12b,http://myrunpod:11434/deepseek-r1:70b,llama3.2:2b

Even without concurrent requests, the time saved by not having to unload and reload models on a single endpoint would speed things up a lot.
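
For illustration, here is a minimal sketch of how a DEFAULT_MODELS entry could be split into an endpoint and a model name, falling back to OLLAMA_API_URL for bare names. This is just an assumption of how it might be parsed, not the project's implementation; the function name `split_model_spec` and the fallback constant are hypothetical.

```python
import os
from urllib.parse import urlsplit

# Hypothetical fallback; in practice this would come from OLLAMA_API_URL.
DEFAULT_API_URL = os.environ.get("OLLAMA_API_URL", "http://localhost:11434")

def split_model_spec(spec: str, default_url: str = DEFAULT_API_URL) -> tuple[str, str]:
    """Split a model spec into (endpoint URL, model name).

    "http://ollama-pi:11434/gemma3:1b" -> ("http://ollama-pi:11434", "gemma3:1b")
    "llama3.2:2b"                      -> (default_url, "llama3.2:2b")
    """
    if spec.startswith(("http://", "https://")):
        parts = urlsplit(spec)
        endpoint = f"{parts.scheme}://{parts.netloc}"
        # The path carries the model name, e.g. "/gemma3:1b".
        return endpoint, parts.path.lstrip("/")
    return default_url, spec

if __name__ == "__main__":
    default_models = "http://ollama-pi:11434/gemma3:1b,http://ollama-pc:11434/gemma3:4b,llama3.2:2b"
    for spec in default_models.split(","):
        print(split_model_spec(spec))
```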
