# ollama

Here are 236 public repositories matching this topic...

A microservices framework that retrofits existing agentic workflows to opportunistically route inference to local compute when your GPU is free, with built-in benchmarking, wake-on-LAN, and automatic cloud fallback. Includes a Windows tray app that monitors GPU load, gates Ollama network access automatically, and notifies the user about running jobs.

  • Updated Mar 21, 2026
  • Rust
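The routing idea in the description above, preferring local inference when the GPU is idle and falling back to the cloud otherwise, can be sketched minimally in Rust. This is an illustrative sketch, not the repository's actual code: the `Backend` enum, the `route` function, and the utilization threshold are all assumptions introduced here.

```rust
/// Hypothetical sketch of opportunistic routing: send a job to a
/// local Ollama instance when the GPU is free, otherwise fall back
/// to a cloud endpoint. Names and thresholds are assumptions, not
/// taken from the repository itself.
#[derive(Debug, PartialEq)]
enum Backend {
    Local, // e.g. an Ollama instance on the LAN
    Cloud, // e.g. a hosted inference API
}

/// Pick a backend from the current GPU utilization (0.0..=1.0).
/// Below `busy_threshold`, the GPU counts as free and the job
/// stays local; otherwise it is routed to the cloud fallback.
fn route(gpu_utilization: f32, busy_threshold: f32) -> Backend {
    if gpu_utilization < busy_threshold {
        Backend::Local
    } else {
        Backend::Cloud
    }
}

fn main() {
    // Idle GPU: keep inference on local compute.
    assert_eq!(route(0.05, 0.30), Backend::Local);
    // GPU busy (a game or another job is running): use the cloud.
    assert_eq!(route(0.85, 0.30), Backend::Cloud);
    println!("routing sketch ok");
}
```

A real implementation would replace the utilization argument with a live probe (e.g. querying the GPU driver, as the tray app described above does) and add wake-on-LAN before attempting the local route.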
