Belullama is a comprehensive AI application that bundles Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI) into a single, easy-to-use package.
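For a sense of what such a bundle wires together, here is a minimal sketch using the stock Ollama and Open WebUI images; this is illustrative only, not Belullama's actual installer:

    # Illustrative only: run Ollama, then point Open WebUI at it
    docker run -d --name ollama -p 11434:11434 -v ollama:/root/.ollama ollama/ollama
    docker run -d --name open-webui -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
      ghcr.io/open-webui/open-webui:main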
An Nginx proxy server in a Docker container that authenticates and proxies requests to Ollama from the public internet via a Cloudflare Tunnel.
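The general pattern, sketched in shell with hypothetical names and paths (not this repo's actual config): put an Nginx server block with HTTP basic auth in front of Ollama's default port, run it in Docker, then expose it through a Cloudflare quick tunnel:

    # Sketch only: basic-auth Nginx in front of Ollama, exposed via cloudflared
    htpasswd -bc .htpasswd alice 's3cret'        # from apache2-utils
    cat > default.conf <<'EOF'
    server {
      listen 80;
      location / {
        auth_basic "Ollama";
        auth_basic_user_file /etc/nginx/.htpasswd;
        proxy_pass http://host.docker.internal:11434;
      }
    }
    EOF
    docker run -d --name ollama-proxy -p 8080:80 \
      --add-host=host.docker.internal:host-gateway \
      -v "$PWD/default.conf:/etc/nginx/conf.d/default.conf:ro" \
      -v "$PWD/.htpasswd:/etc/nginx/.htpasswd:ro" \
      nginx:alpine
    cloudflared tunnel --url http://localhost:8080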
Opinionated macOS setup
Streamline coding and speed up your dev process. Your own personal senior engineer, for free!
A simple, lightweight shell script to interact with OpenAI, Ollama, Mistral AI, LocalAI, or ZhipuAI from the terminal, enabling intelligent system management with no dependencies (pure shell).
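The core of such a script can be very small. A minimal sketch against Ollama's documented /api/generate endpoint follows; the model name is an assumption, and the sed-based JSON extraction is deliberately crude (it breaks on quotes in the model's output):

    # Minimal sketch: query a local Ollama model with curl and plain shell
    ask() {
      curl -s http://localhost:11434/api/generate \
        -d "{\"model\":\"llama3\",\"prompt\":\"$1\",\"stream\":false}" |
        sed -e 's/.*"response":"//' -e 's/","done.*//'   # crude field extraction
    }
    ask "Why is the sky blue?"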
Self-hosted app store and runtime for AI agents. Install third-party agents, run them on your infrastructure with your own model providers (Ollama, Bedrock, OpenAI, etc.). Container isolation, credential injection, default-deny egress.
One-click guide to running the SixGPT miner on the Vana Network
An oh-my-zsh plugin that integrates Ollama-hosted AI models to provide command suggestions
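In spirit, such a plugin boils down to something like this hypothetical zsh function (not the plugin's actual code; the model name is assumed):

    # Sketch: ask a local model for a command suggestion
    suggest() {
      ollama run llama3 "Reply with a single shell command, nothing else, that does: $*"
    }
    suggest list the ten largest files in this directory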
Drop the faff, dodge the judgment. Another bloody AI commit generator, but this one stays local 🦙
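The underlying pattern is essentially a one-liner; a sketch, assuming a locally pulled llama3 model and not this tool's actual prompt:

    ollama run llama3 "Write a concise one-line commit message for this diff: $(git diff --staged)"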
The Ollama Toolkit is a collection of powerful tools designed to enhance your experience with Ollama, an open-source framework for running large language models locally. Think of it as a one-stop shop for streamlining workflows and unlocking the full potential of Ollama!
A one-file Ollama CLI client written in bash
This shell script installs a GUI version of privateGPT for Linux.
Self-contained offline environment providing local AI chat, offline Wikipedia/content archives, IRC communication, audio streaming, file server, and development tools. Designed for zero internet dependency - download once, run anywhere. Perfect for remote areas, emergency scenarios, or escaping surveillance capitalism.