
bug: "Forbidden" using ollama as provider #8131

@AlbertoSinigaglia

Description


Version: 0.7.9

Describe the Bug

Every time I try to get Ollama completions, I get a "Forbidden" error.
However, listing models works with no problems when I add the provider.

Screenshots / Logs

[Screenshot: completion request fails with "Forbidden"]

yet the model list works fine:

[Screenshot: model list loads successfully]

I've also tested the completion manually:

(base) ➜  ~  curl -i --max-time 30 \
  -H 'Content-Type: application/json' \
  -d '{"model":"gemma4:31b","messages":[{"role":"user","content":"Reply with exactly: OK"}],"max_tokens":16,"temperature":0}' \
  http://127.0.0.1:6000/v1/chat/completions
HTTP/1.1 200 OK
Content-Type: application/json
Date: Mon, 11 May 2026 15:24:45 GMT
Content-Length: 357

{"id":"chatcmpl-215","object":"chat.completion","created":1778513085,"model":"gemma4:31b","system_fingerprint":"fp_ollama","choices":[{"index":0,"message":{"role":"assistant","content":"","reasoning":"The user wants me to reply with exactly \"OK\".\n\n    "},"finish_reason":"length"}],"usage":{"prompt_tokens":21,"completion_tokens":16,"total_tokens":37}}

So, to be fair, I'm completely lost (and yes, I've already set OLLAMA_ORIGINS="*").

Not sure what's happening, because the vLLM provider works like a charm.
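Since the same endpoint returns 200 to curl but "Forbidden" to the app, one difference worth checking is the request headers, in particular a browser-style `Origin` header, which is what `OLLAMA_ORIGINS` gates. Below is a minimal diagnostic sketch that rebuilds the exact request from the curl transcript above, with an optional `Origin` header. The URL and model name come from this report; the `Origin` value and function name are assumptions for illustration.

```python
import json
import urllib.request

# Endpoint and payload copied from the curl command in the report.
BASE_URL = "http://127.0.0.1:6000/v1/chat/completions"

def build_request(origin=None):
    """Rebuild the chat-completion request; optionally add an Origin header
    to simulate an app/browser-originated call (hypothetical diagnostic)."""
    body = json.dumps({
        "model": "gemma4:31b",
        "messages": [{"role": "user", "content": "Reply with exactly: OK"}],
        "max_tokens": 16,
        "temperature": 0,
    }).encode()
    headers = {"Content-Type": "application/json"}
    if origin is not None:
        headers["Origin"] = origin  # this is what OLLAMA_ORIGINS filters on
    return urllib.request.Request(BASE_URL, data=body, headers=headers,
                                  method="POST")
```

Sending `build_request(None)` versus `build_request("http://localhost:3000")` through `urllib.request.urlopen` and comparing status codes would show whether the 403 only appears when an `Origin` header is present, i.e. whether the origin check (rather than the completion itself) is what's being rejected.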
