Releases · ableinc/local-ai
v1.1.2
v1.1.0
Release Notes
- Significantly smaller bundle/binary size
- Conversations now have memory (no more repeating yourself!)
- Faster load times
Note: This is a breaking change. When running Ollama locally, you must have the nomic-embed-text embedding model installed on your machine. It's very lightweight, so it won't consume much storage or RAM/VRAM. To install:
ollama pull nomic-embed-text
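To confirm the model is present before launching, you can list your locally installed models with the Ollama CLI (assuming a default install with ollama on your PATH):

ollama list | grep nomic-embed-text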
Also, I recommend updating your .zshrc or .bashrc by adding an alias that points to com.capable.localai, the package name for Local AI.
alias localai="com.capable.localai"
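For example, to add the alias for zsh and pick it up in your current shell (a sketch; use .bashrc instead if you're on bash):

echo 'alias localai="com.capable.localai"' >> ~/.zshrc
source ~/.zshrc
localai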
v1.0.0
- 🤖 Local AI Integration: Connect to Ollama models running on your machine (see the connectivity check after this list)
- 💬 Persistent Chat History: SQLite database stores all conversations locally
- 🔄 Streaming Responses: Real-time AI responses with typing indicators
- 📂 Multiple Conversations: Create and manage multiple chat sessions
- 🎨 Modern UI: Clean, responsive interface with dark/light theme support
- 📱 Mobile-Friendly: Responsive design with collapsible sidebar
- 🚀 Fast Performance: Built with Vite for lightning-fast development and builds
- 📄 Paginated History: Efficient loading of chat history with scroll-based pagination
- 🔧 Model Selection: Choose from available Ollama models via dropdown
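Since the app talks to a local Ollama server, a quick way to verify Ollama is reachable before opening Local AI (a sketch, assuming Ollama's default port 11434):

curl http://localhost:11434/api/tags

This returns a JSON list of the models you have installed locally, which should match the models offered in the model-selection dropdown.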