Custom ComfyUI Nodes for interacting with Ollama using the ollama python client.
Easily integrate the power of LLMs into ComfyUI workflows, or just experiment with LLM inference.
To use these nodes, you need a running Ollama server reachable from the host that is running ComfyUI.
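A quick way to verify that the server is reachable is to call it with the ollama Python client directly (the same client these nodes use). A minimal sketch, assuming the default host and port:

```python
# Sanity check: list the models available on a local Ollama server.
# Adjust the host if Ollama runs elsewhere (e.g. in Docker or on another machine).
import ollama

client = ollama.Client(host="http://127.0.0.1:11434")
print(client.list())  # raises an error if the server is unreachable
```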
Instructions
Install on Linux

```
curl -fsSL https://ollama.com/install.sh | sh
```

CPU only

```
docker run -d -p 11434:11434 -v ollama:/root/.ollama --name ollama ollama/ollama
```

NVIDIA GPU

```
docker run -d -p 11434:11434 --gpus=all -v ollama:/root/.ollama --name ollama ollama/ollama
```

Use ComfyUI's built-in extension manager to install the nodes. Search for comfyui-ollama by Stav Sapir.
Or
If you prefer ComfyUI-Manager, search for ollama and select the one by stavsap
Or
- git clone this repository into the `custom_nodes` folder inside your ComfyUI installation, or download it as a zip and unzip the contents to `custom_nodes/comfyui-ollama`.
- `pip install -r requirements.txt`
- Start/restart ComfyUI
Configuration
If you are using Ollama Cloud templates that require authentication, you must provide your Ollama public key.
You can do this automatically with the CLI:
```
ollama signin
```

Or by manually adding your public key at ollama.com/settings/key.
Your public key is located at:
| OS | Path to id_ed25519.pub |
|---|---|
| macOS | ~/.ollama/id_ed25519.pub |
| Linux | /usr/share/ollama/.ollama/id_ed25519.pub |
| Windows | C:\Users\<username>\.ollama\id_ed25519.pub |
See Ollama's FAQ for more details.
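On macOS and Linux you can simply `cat` the file; as a cross-platform option, this short sketch reads the key from the paths in the table above (the Linux path assumes the default system install):

```python
# Print the Ollama public key so it can be pasted at ollama.com/settings/key.
from pathlib import Path
import platform

paths = {
    "Darwin": Path.home() / ".ollama" / "id_ed25519.pub",       # macOS
    "Linux": Path("/usr/share/ollama/.ollama/id_ed25519.pub"),
    "Windows": Path.home() / ".ollama" / "id_ed25519.pub",
}
print(paths[platform.system()].read_text().strip())
```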
OllamaGenerate
A node that provides the ability to set the system prompt and the prompt.
The context can be saved locally in the node (enable/disable).
Inputs:
- OllamaConnectivity (optional)
- OllamaOptions (optional)
- images (optional)
- context (optional), a context from another OllamaGenerate node
- meta (optional), passes the metadata of the OllamaConnectivity and OllamaOptions from another OllamaGenerate node.
Notes:
- For this node to be operational, either OllamaConnectivity or meta must be connected.
- If images are inputted and meta is chained, the images must also be passed to each subsequent node in the chain.
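For reference, the node's behavior maps onto the client's generate endpoint. A minimal sketch of the equivalent API calls, assuming a default local server; the model name and prompts are placeholders:

```python
import ollama

client = ollama.Client(host="http://127.0.0.1:11434")

# System prompt + prompt; the response includes a context for follow-ups.
first = client.generate(
    model="llama3.2",  # placeholder: any model you have pulled
    system="You are a concise assistant.",
    prompt="Suggest a prompt for a cyberpunk cityscape.",
    # images=[...],    # optional: image bytes for multimodal models
)

# Passing the saved context continues from the previous exchange,
# mirroring the node's enable/disable "save context" behavior.
follow_up = client.generate(
    model="llama3.2",
    prompt="Shorten it to ten words.",
    context=first["context"],
)
print(follow_up["response"])
```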
OllamaChat
A node for conversational interaction using the dedicated Ollama chat endpoint (ollama.chat). It manages the full conversation history natively and allows for chained sequences of chat nodes.
- Functionality: Unlike OllamaGenerate, this node is designed specifically for multi-turn conversations and for cloud models authenticated with your Ollama public key (see the Configuration section).
- Key Features:
  - Conversation history is handled natively within the node instance.
  - Dedicated history outputs for chaining multiple chat nodes.
  - Option to reset the current conversation history on demand.
Inputs:
- OllamaConnectivity (optional)
- OllamaOptions (optional)
- images (optional)
- meta (optional), passes the metadata of the OllamaConnectivity and OllamaOptions from another OllamaChat node.
- history (optional), passes the history ID from another OllamaChat node.
Outputs:
- result: The generated text
- thinking: The thinking text
- meta: The metadata to pass to the next OllamaChat node
- history: The history ID to pass to the next OllamaChat node
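For reference, the chat endpoint the node wraps takes the full message history on every call; the node stores this history for you and exposes it via the history output. A minimal sketch of the underlying pattern, assuming a default local server and a placeholder model name:

```python
import ollama

client = ollama.Client(host="http://127.0.0.1:11434")
history = []  # the node manages this internally; shown explicitly here

def ask(text: str) -> str:
    history.append({"role": "user", "content": text})
    reply = client.chat(model="llama3.2", messages=history)
    history.append({"role": "assistant", "content": reply["message"]["content"]})
    return reply["message"]["content"]

print(ask("Name three visual styles for a portrait."))
print(ask("Which one did you list first?"))  # answered from the history
# Resetting the conversation is just clearing the list: history.clear()
```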
OllamaConnectivity
A node responsible only for connectivity to the Ollama server.
OllamaOptions
A node for full control of the Ollama API options.
Each option has an enable/disable toggle; only enabled options are passed in the API call to the Ollama server.
Ollama API options can be found in this table.
Note: There is an additional option, debug, that enables debug printing in the CLI; it is not part of the Ollama API.
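At the API level, "enabled" options simply end up as keys in the options mapping of the request. A minimal sketch with a few common options (values are illustrative):

```python
import ollama

client = ollama.Client(host="http://127.0.0.1:11434")

options = {
    "temperature": 0.7,  # sampling temperature
    "seed": 42,          # fixed seed for reproducible output
    "num_ctx": 4096,     # context window size
}
result = client.generate(model="llama3.2", prompt="Hello!", options=options)
print(result["response"])
```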
Old V1 nodes are still available, but please replace them with the nodes above. Here's the documentation of the V1 nodes.
Please see the example_workflows folder or use ComfyUI's template browser.
The custom Text Nodes in the examples can be found here: https://github.com/pythongosssss/ComfyUI-Custom-Scripts