Ollama Copilot

A proxy that allows you to use Ollama as a coding assistant similar to GitHub Copilot.

[Zed demo]

Table of Contents

  • Features
  • Requirements
  • Installation
  • Usage
  • IDE Configuration
  • Troubleshooting

Features

  • Works as a drop-in replacement for GitHub Copilot
  • Compatible with multiple IDEs
  • Uses local LLMs through Ollama for privacy and control
  • Customizable model selection

Requirements

  • Go 1.20 or higher
  • Ollama installed and running
  • At least 16GB RAM (depending on the LLM model used)

Installation

Install Go

  1. Download Go from the official website:
# For Linux (adjust version as needed)
wget https://go.dev/dl/go1.25.0.linux-amd64.tar.gz
sudo tar -C /usr/local -xzf go1.25.0.linux-amd64.tar.gz

# For macOS (using Homebrew)
brew install go
  2. Add Go to your PATH in ~/.bashrc, ~/.zshrc, or equivalent:
export PATH=$PATH:/usr/local/go/bin
export GOPATH=$HOME/go
export PATH=$PATH:$GOPATH/bin

# Apply the changes
source ~/.bashrc  # or source ~/.zshrc
  3. Verify the installation:
go version

Install Ollama

Ensure Ollama is installed:

# For Linux
curl -fsSL https://ollama.com/install.sh | sh

# For macOS (using Homebrew)
brew install ollama
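
You can confirm the installation and check the installed version with:

ollama --version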

Download the required model

ollama pull qwen3-coder:30b

You can use other models as well; qwen3-coder:30b is simply the default that ollama-copilot uses when no --model flag is given.
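
For example, to try a smaller model, pull it now and select it later with the --model flag (see Command Line Options below):

ollama pull qwen2.5-coder:7b
ollama-copilot --model qwen2.5-coder:7b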

Install ollama-copilot

go install github.com/josuemontano/ollama-copilot@latest
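
go install places the binary in $GOPATH/bin (usually ~/go/bin), which the PATH changes from the Go installation step already cover. A quick check that the binary is reachable:

which ollama-copilot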

Usage

Basic Usage

  1. Make sure Ollama is running:
ollama serve
  2. In a separate terminal, start ollama-copilot:
ollama-copilot
  3. Configure your IDE to use the proxy (see IDE Configuration below)
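
Before configuring an editor, you can confirm the proxy is listening on its default ports (a generic check; adjust the ports if you changed the flags listed under Command Line Options):

# default HTTP port and the HTTPS proxy port used by the IDE configs below
lsof -i :11437
lsof -i :11435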

Command Line Options

Flag               Default           Description
--port             :11437            HTTP port to listen on
--proxy-port       :11438            HTTP proxy port to listen on
--port-ssl         :11436            HTTPS port to listen on
--proxy-port-ssl   :11435            HTTPS proxy port to listen on
--cert             ""                Certificate file path (*.crt) for HTTPS
--key              ""                Key file path (*.key) for HTTPS
--model            qwen3-coder:30b   LLM model to use with Ollama
--num-predict      200               Maximum number of tokens to predict
--prompt-template  <|fim_prefix|> {{.Prefix}} <|fim_suffix|>{{.Suffix}} <|fim_middle|>   Fill-in-middle template for prompts
--verbose          false             Enable verbose logging mode

Example with custom options:

ollama-copilot --model qwen2.5-coder:7b --num-predict 300 --verbose
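
The --prompt-template flag takes a Go text/template string with {{.Prefix}} and {{.Suffix}} placeholders; the default uses Qwen-style fill-in-middle tokens. If you run a model family with different FIM tokens, pass a matching template. A rough sketch for a CodeLlama-style model (the tokens and model tag here are illustrative; check your model's documentation):

ollama-copilot --model codellama:7b-code \
  --prompt-template '<PRE> {{.Prefix}} <SUF>{{.Suffix}} <MID>'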

Environment Variables

If Ollama is running on a different host or port, point ollama-copilot at it with the OLLAMA_HOST environment variable:

OLLAMA_HOST="http://192.168.133.7:11434" ollama-copilot

IDE Configuration

Neovim

  1. Install copilot.vim
  2. Configure variables in your Neovim config:
let g:copilot_proxy = 'http://localhost:11435'
let g:copilot_proxy_strict_ssl = v:false

VSCode

  1. Install the GitHub Copilot extension
  2. Add the following to your settings.json (File > Preferences > Settings, or Ctrl+,):
{
  "github.copilot.advanced": {
    "debug.overrideProxyUrl": "http://localhost:11437"
  },
  "http.proxy": "http://localhost:11435",
  "http.proxyStrictSSL": false
}

Zed

  1. Open settings (Ctrl+,)
  2. Set up edit completion proxying:
{
  "features": {
    "edit_prediction_provider": "copilot"
  },
  "show_completions_on_input": true,
  "edit_predictions": {
    "copilot": {
      "proxy": "http://localhost:11435",
      "proxy_no_verify": true
    }
  }
}

Emacs

(Experimental)

  1. Install copilot-emacs
  2. Configure the proxy in your Emacs config:
(use-package copilot
  :straight (:host github :repo "copilot-emacs/copilot.el" :files ("*.el"))  ;; if you don't use straight.el, install the package another way
  :ensure t
  ;; :hook (prog-mode . copilot-mode)
  :bind (
         ("C-<tab>" . copilot-accept-completion)
         )
  :config
  (setq copilot-network-proxy '(:host "127.0.0.1" :port 11434 :rejectUnauthorized :json-false))
  )

Troubleshooting

  • If you encounter connection issues, make sure Ollama is running (see the check below)
  • Verify that the correct ports are accessible
  • Check the logs by running ollama-copilot with the --verbose flag
  • Ensure your Go bin directory ($GOPATH/bin) is on your PATH so the ollama-copilot binary can be found
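
A quick way to rule out Ollama itself is to query its API directly; the /api/tags endpoint lists the models you have pulled:

curl http://localhost:11434/api/tags

If this request fails, the problem is on the Ollama side rather than with ollama-copilot.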
