This release brings powerful local LLM support, smoother request management, and new ways to save and share AI insights.
🚀 New Features
Local LLM Support (Ollama)
- Run AI locally using Ollama models directly on your machine.
- No API costs: save money while keeping everything private.
- Fast and offline-friendly request explanations.
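Under the hood, local inference means talking to an Ollama daemon on your machine. As a minimal sketch, a client can call Ollama's documented `/api/generate` endpoint on its default port 11434 (the model name and prompt below are illustrative, not what this release necessarily uses):

```python
import json
import urllib.request

# Ollama's default local endpoint; adjust if your daemon runs elsewhere.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return OLLAMA_URL, json.dumps(payload).encode("utf-8")

def explain_request(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama instance and return its reply."""
    url, body = build_generate_request(model, prompt)
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running Ollama daemon with the model pulled,
    # e.g. `ollama pull llama3` beforehand.
    print(explain_request("llama3", "Explain this request: GET /admin HTTP/1.1"))
```

Because everything stays on localhost, no request data ever leaves your machine.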
Save AI Explanations and Attack Suggestions
- New save feature for Request Explanations and Attack Vector Suggestions.
- Easily share insights with teammates during pentests.
- Reduce token usage (RPD/TPM) by reusing saved results.
🎨 UI and Workflow Enhancements
Group Deletion for Multi-Tab Capture
- Delete any capture group when too many domains accumulate.
- Smooth fade-out animation for a cleaner experience.
- No modal dialogs: removal is instant and seamless.
🛠️ Improvements and Fixes
- Polished UI transitions and animations.
- Improved performance during heavy multi-tab captures.
- Minor fixes related to grouping and AI interactions.