Put up to 8 AI models on every coding task — blind spots surface before you ship. Claude Code plugin.
Updated Apr 15, 2026 · Shell
🚀 Self-hosted AI automation platform. Deploy n8n, Ollama, Flowise, RAG, Supabase & 30+ tools with one command. Auto HTTPS. Free Zapier/Make alternative.
A 100% offline, fully portable, zero-trace AI (Ollama + Llama 3 + AnythingLLM) that runs natively from a USB drive on Windows and Mac.
Belullama is a comprehensive AI application that bundles Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI) into a single, easy-to-use package.
An Nginx proxy server in a Docker container that authenticates and proxies requests to Ollama from the public internet via Cloudflare Tunnel.
18 AI personas deliberate your hardest decisions across multiple LLM providers. Aristotle, Feynman, Kahneman, Torvalds & more — structured multi-round deliberation with genuine model diversity. One command: /council
Opinionated macOS setup
Streamline Coding & Speed Up Dev Process. Your Own Personal Senior Engineer For Free!
A simple, lightweight shell script to interact with OpenAI, Ollama, Mistral AI, LocalAI, or ZhipuAI from the terminal, enhancing intelligent system management with no dependencies (pure shell).
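Terminal-to-LLM scripts like this typically talk to Ollama's local REST API. A minimal sketch in shell plus `curl`, assuming an Ollama server on its default `http://localhost:11434` endpoint; the function names `json_escape` and `ollama_ask` are hypothetical, not taken from any repo above:

```shell
#!/usr/bin/env bash
# Sketch only: assumes a local Ollama server on its default port 11434
# and curl installed; function names here are illustrative.

# Escape backslashes and double quotes so the prompt embeds safely in a
# hand-built JSON string.
json_escape() {
  local s=$1
  s=${s//\\/\\\\}   # backslash -> \\
  s=${s//\"/\\\"}   # double quote -> \"
  printf '%s' "$s"
}

# Send one non-streaming prompt to Ollama's /api/generate endpoint and
# print the raw JSON response.
ollama_ask() {
  local model=$1 prompt
  prompt=$(json_escape "$2")
  curl -s http://localhost:11434/api/generate \
    -d "{\"model\":\"$model\",\"prompt\":\"$prompt\",\"stream\":false}"
}

# Usage (requires a running Ollama server):
#   ollama_ask llama3 'Explain what a symlink is in one sentence.'
```

Because the prompt is interpolated into JSON by hand, the escaping step matters; fuller scripts often delegate it to `jq` when that tool is available.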
Self-hosted app store and runtime for AI agents. Install third-party agents, run them on your infrastructure with your own model providers (Ollama, Bedrock, OpenAI, etc.). Container isolation, credential injection, default-deny egress.
Portable multi-agent AI developer setup for Claude Code + Ollama. Role-based local LLM orchestration via Bash — plan, code, review, commit. Zero Dependency. Works with any language stack.
Drop the faff, dodge the judgment. Another bloody AI commit generator, but this one stays local 🦙
ask is an AI-powered CLI tool for developers who live in the terminal. It brings multi-provider LLM support, agent capabilities, and shell-native intelligence to your fingertips.
An oh-my-zsh plugin that integrates Ollama-served models to provide command suggestions.