Legal Assistant is an innovative application that leverages RAG (Retrieval-Augmented Generation) technology to deliver personalized legal advice and guidance based on Moroccan law.
Updated Apr 19, 2024 - HTML
A super simple RAG-supported HTML-to-web code generator.
Django LLM Portal
Image detection with Ollama and LLaVA.
Click a keyword, read a Markdown document, then click another keyword, and so on. Uses Gemini Flash or Ollama (gemma2:2b).
Monitoring and evaluating LLM apps with Langfuse. Presented at PyConZA 2024.
Web Client For Ollama - Llama LLM
Template for building microservice-based apps with a frontend, backend, LLM serving engine (e.g., vllm), and nginx.
Backend for an AI chatbot that provides business insights and startup news, customizable to your niche.
Ollama Local Docker - A simple Docker-based setup for running Ollama's API locally with a web-based UI. Easily deploy and interact with Llama models like llama3.2 and llama3.2:1b on your local machine. This repository provides an efficient, containerized solution for testing and developing AI models using Ollama.
This is a very early clone of the Arc Search "Browse for Me" feature. It is a simple Python script that uses Ollama and DuckDuckGo to search for a query and generate a report of the results.
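The "search, then summarize" flow described above can be sketched in a few lines. This is a minimal illustration, not the repository's actual code: it assumes the `duckduckgo_search` and `ollama` Python packages and a locally running Ollama server, and the function names and model tag (`llama3.2`) are placeholders.

```python
def build_report_prompt(query, results):
    """Format a list of search results (dicts with 'title' and 'body')
    into a single summarization prompt for the LLM."""
    lines = [f"Write a short report answering: {query}", "", "Sources:"]
    for i, r in enumerate(results, 1):
        lines.append(f"{i}. {r['title']} - {r['body']}")
    return "\n".join(lines)


def browse_for_me(query, model="llama3.2"):
    """Search DuckDuckGo for the query, then ask a local Ollama model
    to write a report from the results. Assumes Ollama is running."""
    from duckduckgo_search import DDGS  # assumed dependency
    import ollama                       # assumed dependency

    results = DDGS().text(query, max_results=5)
    prompt = build_report_prompt(query, results)
    reply = ollama.chat(model=model,
                        messages=[{"role": "user", "content": prompt}])
    return reply["message"]["content"]
```

The prompt builder is pure and easy to test in isolation; only `browse_for_me` touches the network and the model server.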
36 commonly used Linux commands, with every 3 commands forming a separate course (12 courses in total); each course is a Club workspace. The instructor covers each 3-command course in 8 minutes, after which learners can practice directly in club.cloudstudio.
AI image analyzer for Ollama, mistral.rs, and Molmo on a Mac M2 Max (screen-capture and camera-capture analyzers).
This open-source software helps healthcare institutions utilize historical case reports by embedding data into a vector database. AI integration enhances scalability, making it ideal for clinical decision education and training.