Tags: anuraaga/ollama
openai: align chat temperature and frequency_penalty options with completion (ollama#6688)
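The entry above concerns Ollama's OpenAI-compatible chat endpoint accepting the same sampling options as the completion endpoint. A minimal sketch of building a request body that carries both options; the helper name and model name are hypothetical, and the value ranges follow the OpenAI API convention (temperature in [0, 2], frequency_penalty in [-2, 2]), which may not match Ollama's exact validation:

```python
def build_chat_payload(model, messages, temperature=1.0, frequency_penalty=0.0):
    """Build an OpenAI-style chat request body carrying both sampling options.

    Hypothetical helper for illustration; range checks mirror the OpenAI
    API convention, not necessarily Ollama's own validation.
    """
    if not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature must be in [0, 2]")
    if not -2.0 <= frequency_penalty <= 2.0:
        raise ValueError("frequency_penalty must be in [-2, 2]")
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "frequency_penalty": frequency_penalty,
    }

# Example: both options now travel in the chat request just as they
# would in a completion request.
payload = build_chat_payload(
    "llama3",  # model name is an assumption for illustration
    [{"role": "user", "content": "Hello"}],
    temperature=0.7,
    frequency_penalty=0.5,
)
```

Such a payload would be POSTed to the OpenAI-compatible `/v1/chat/completions` route; the point of the change is that both fields are honored there, not only on the completion route.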
Add findutils to base images (ollama#6581). Its absence caused missing internal files.
Only enable numa on CPUs (ollama#6484). The NUMA flag may have a performance impact on multi-socket systems under GPU loads.
gpu: Group GPU Library sets by variant (ollama#6483). The recent CUDA variant changes uncovered a bug in ByLibrary, which failed to group GPU types by common variant.
Split rocm back out of bundle (ollama#6432). We're over budget for GitHub's maximum release artifact size with ROCm plus two CUDA versions. This splits ROCm back out as a discrete artifact, but keeps the layout so it can be extracted into the same location as the main bundle.
CI: remove directories from dist dir before upload step (ollama#6429)
Merge pull request ollama#6424 from dhiltgen/cuda_v12: fix overlapping artifact name on CI