macai (macOS AI) is a simple yet powerful native macOS client made to interact with modern AI tools (ChatGPT- and Ollama-compatible APIs are supported).
You can download the latest binary, notarized by Apple, from the Releases page.
You can also support the project on Gumroad.
Check out the main branch and open the project in Xcode 14.3 or later.
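A minimal build sketch from the terminal, assuming the repository URL and the `macai.xcodeproj` project file name (adjust both if they differ in your checkout):

```sh
# Clone the repository and open the project in Xcode (URL and project file name assumed)
git clone https://github.com/Renset/macai.git
cd macai
open macai.xcodeproj
```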
- Support for ChatGPT, Ollama, and other compatible APIs
- Organized with chats, where each chat has its own context
- Customizable system messages (instructions) per chat
- System-defined light/dark theme
- Backup and restore your chats
- Customizable context size
- Any model with compatible API can be used
- Formatted code blocks
- Formatted tables with copy-as-JSON support
- Tabs for working with multiple chats simultaneously
- Data is stored using CoreData
- Streamed responses
- Automatically generated chat names
The project is under active development.
Contributions are welcome. Take a look at the macai project page and the Issues page to see planned features and bug fixes, or create a new issue.
To run macai with ChatGPT, you need a ChatGPT API token. You can get it here. Add the token in the settings and you are ready to go. Note: by default, the gpt-3.5-turbo model is selected. You can change it in the New Chat settings.
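If you want to confirm that your token works before configuring macai, a quick check from the terminal might look like the sketch below. It uses the standard OpenAI Chat Completions endpoint; `OPENAI_API_KEY` is a placeholder for your own token.

```sh
# Sanity-check a ChatGPT API token against the Chat Completions endpoint
# (OPENAI_API_KEY is a placeholder for your token)
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```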
Run with Ollama
Ollama is an open-source backend for running various LLM models locally. Running macai with Ollama is easy:
- Install Ollama from the official website
- Follow the installation guide
- After installation, select a model (llama3 is recommended) and run Ollama using the command:
ollama run llama3
- In macai LLM settings, set the ChatGPT/LLM API URL to
http://localhost:11434/api/chat
- In macai New Chat settings, set the model to
llama3
- Changing the default instructions is recommended
- Test and enjoy! (A quick curl check of the Ollama endpoint is sketched below.)
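To verify that Ollama is reachable at the URL above before pointing macai at it, a terminal check could look like this (assuming the default Ollama port 11434 and the llama3 model pulled as described):

```sh
# Verify the local Ollama chat endpoint (default port and llama3 model assumed)
curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [{"role": "user", "content": "Hello"}],
  "stream": false
}'
```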
An example of a custom system message and ChatGPT responses:
Code blocks in ChatGPT responses are syntax-highlighted (185 languages supported):
In most cases, tables in ChatGPT responses can be formatted as follows: