Releases: ableinc/local-ai

v1.1.2

18 Jul 15:51
  • Settings menu
  • Toggle to control when memory is used and when it is not
  • Better memory context handling

v1.1.0

16 Jul 18:38

Release Notes

  • Significantly smaller bundle/binary size
  • Conversations now have memory (no more repeating yourself!)
  • Faster load times

Note: This is a breaking change. When running Ollama locally, you must have the nomic-embed-text embedding model installed on your machine. It's very lightweight, so it won't consume much storage or RAM/VRAM. To install:

ollama pull nomic-embed-text
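
If you want to verify the model is responding, you can call Ollama's embeddings endpoint directly. A minimal TypeScript sketch (Node 18+ with built-in fetch; this illustrates the standard Ollama API, not Local Ai's internal code):

```ts
// Sanity check: ask the local Ollama server (default port 11434)
// for an embedding from nomic-embed-text.
async function embed(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = (await res.json()) as { embedding: number[] };
  return data.embedding;
}

embed("hello world").then((v) => console.log(`dimensions: ${v.length}`));
```

If this prints a vector length, memory-enabled conversations have everything they need.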

Also, I recommend updating your .zshrc or .bashrc with an alias that points to com.capable.localai, the package name for Local Ai:

alias localai="com.capable.localai"

v1.0.0

16 Jul 13:00
  • 🤖 Local AI Integration: Connect to Ollama models running on your machine
  • 💬 Persistent Chat History: SQLite database stores all conversations locally
  • 🔄 Streaming Responses: Real-time AI responses with typing indicators
  • 📂 Multiple Conversations: Create and manage multiple chat sessions
  • 🎨 Modern UI: Clean, responsive interface with dark/light theme support
  • 📱 Mobile-Friendly: Responsive design with collapsible sidebar
  • 🚀 Fast Performance: Built with Vite for lightning-fast development and builds
  • 📄 Paginated History: Efficient loading of chat history with scroll-based pagination (see the sketch after this list)
  • 🔧 Model Selection: Choose from available Ollama models via dropdown
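
For a sense of how scroll-based pagination over a local SQLite store can work, here is a minimal keyset-pagination sketch. The messages table, its columns, and the use of better-sqlite3 are illustrative assumptions, not Local Ai's actual schema or code:

```ts
// Keyset pagination over a hypothetical SQLite chat-history table:
// messages(id INTEGER PRIMARY KEY, conversation_id, role, content, created_at).
import Database from "better-sqlite3";

const db = new Database("chat.db");

const page = db.prepare(`
  SELECT id, role, content, created_at
  FROM messages
  WHERE conversation_id = ? AND id < ?
  ORDER BY id DESC  -- newest first; the UI prepends older rows as you scroll up
  LIMIT ?
`);

// Pass Number.MAX_SAFE_INTEGER as the cursor for the first page, then the
// smallest id seen so far each time the user scrolls to the top.
function loadOlderMessages(conversationId: number, beforeId: number, pageSize = 50) {
  return page.all(conversationId, beforeId, pageSize);
}
```

Keyset pagination (WHERE id < cursor) stays fast as history grows, since SQLite seeks the primary-key index instead of scanning past an OFFSET.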