Demystifying Insurance Policies with AI
PolicyPal is an intelligent web application designed to decode complex insurance documents using the power of artificial intelligence. Upload any policy PDF, ask your question in plain English, and receive a clear, fact-grounded answer — instantly, accurately, and transparently.
🔗 Live Demo: Try PolicyPal Now
Understanding insurance policies shouldn't require legal expertise. PolicyPal bridges the gap between dense insurance jargon and everyday clarity, powered by Retrieval-Augmented Generation (RAG) and best-in-class LLM APIs.
This repository contains the fully deployed version of PolicyPal, shaped by the technical pivots and architectural decisions described below.
Our original vision centered around a fully self-hosted AI pipeline powered by Google's open-source Gemma model. While we successfully built this system (available here), deployment limitations on available infrastructure prompted a strategic pivot.
To ensure a seamless user experience, we transitioned to cloud-based APIs for inference and embeddings, allowing us to showcase the application's full potential — without compromising its logic, responsiveness, or integrity.
💡 Our self-hosted version remains the technical foundation of this project. Explore it here:
🔗 Gemma Architecture (Self-Hosted)
This version follows a clean, modular, three-tier microservice architecture:

- Frontend (Client)
  - Framework: React.js
  - Role: Provides a responsive, elegant interface for file uploads and user queries
  - Hosting: Vercel
- Backend (API Gateway)
  - Framework: Node.js (Express.js)
  - Role: Serves as a secure API gateway between the client and the AI service
  - Hosting: Render
- AI Service (Document Q&A)
  - Framework: FastAPI (Python)
  - Role: Core document question-answering logic using RAG (a minimal sketch of this flow follows the list)
  - APIs Used:
    - Inference: Groq (Llama 3)
    - Embeddings: Cohere
  - Hosting: Render
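To make the RAG flow concrete, here is a minimal, hypothetical sketch of what the AI service does per query, assuming Cohere for embeddings and Groq-hosted Llama 3 for inference as listed above. The function names, chunking strategy, similarity search, and model IDs are illustrative assumptions, not PolicyPal's actual code.

```python
# rag_sketch.py - illustrative RAG loop, not the project's real implementation.
import os
import numpy as np
import cohere
from groq import Groq

co = cohere.Client(os.environ["COHERE_API_KEY"])
groq_client = Groq(api_key=os.environ["GROQ_API_KEY"])

def embed(texts: list[str], input_type: str) -> np.ndarray:
    """Embed text with Cohere (model ID is illustrative)."""
    resp = co.embed(texts=texts, model="embed-english-v3.0", input_type=input_type)
    return np.array(resp.embeddings)

def answer(question: str, policy_chunks: list[str], top_k: int = 4) -> str:
    """Retrieve the most relevant policy excerpts, then ask Llama 3 via Groq."""
    doc_vecs = embed(policy_chunks, "search_document")
    query_vec = embed([question], "search_query")[0]
    # Cosine similarity between the query and every chunk.
    scores = doc_vecs @ query_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
    )
    context = "\n\n".join(policy_chunks[i] for i in np.argsort(scores)[::-1][:top_k])
    chat = groq_client.chat.completions.create(
        model="llama3-70b-8192",  # illustrative Groq Llama 3 model ID
        messages=[
            {"role": "system",
             "content": "Answer using ONLY the provided policy excerpts."},
            {"role": "user",
             "content": f"Policy excerpts:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return chat.choices[0].message.content
```

In the deployed service, logic along these lines sits behind FastAPI routes and runs over text chunks extracted from the uploaded policy PDF.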
| Feature | Description |
|---|---|
| 📄 PDF Analysis | Parses and processes complex insurance policy documents |
| 🗣️ Natural Language Q&A | Accepts user queries in plain English — no jargon required |
| 🛡️ Fact-Grounded Answers | Each response is backed by actual excerpts from the document |
| 🔍 Transparent Reasoning | Reveals which parts of the policy informed the answer |
| 🧱 Structured Output | Uses Pydantic models for predictable, validated AI output (see the sketch below this table) |
| 💻 Modern UI | Clean, responsive, and intuitive design |
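As an illustration of the structured-output idea, a Pydantic model like the one below lets the service validate the LLM's answer and carry the supporting excerpts alongside it. The field names here are hypothetical; PolicyPal's actual schema may differ.

```python
# A hypothetical response schema; PolicyPal's real field names may differ.
from pydantic import BaseModel, Field

class SourceExcerpt(BaseModel):
    """A verbatim passage from the uploaded policy that supports the answer."""
    text: str
    page: int | None = None

class PolicyAnswer(BaseModel):
    """Validated, predictable shape for every AI response."""
    answer: str = Field(description="Plain-English answer to the user's question")
    excerpts: list[SourceExcerpt] = Field(default_factory=list)
    confidence: str = Field(default="medium", description="low / medium / high")

# Parsing the model's JSON output raises a clear validation error if the
# structure is wrong, instead of passing malformed data to the frontend.
raw = (
    '{"answer": "Water damage from burst pipes is covered.", '
    '"excerpts": [{"text": "Section 4.2: sudden and accidental water discharge...", "page": 12}]}'
)
validated = PolicyAnswer.model_validate_json(raw)
print(validated.answer)
```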
Set up the entire system on your machine in minutes.
```bash
git clone https://github.com/muskan-khushi/PolicyPal-Deployed.git
cd PolicyPal-Deployed
```

Create an environment file for the AI service:
```env
# /doc_qa_backend/.env
GROQ_API_KEY="your_groq_api_key"
COHERE_API_KEY="your_cohere_api_key"
```

Then set up and start the Python AI service:

```bash
cd doc_qa_backend
python -m venv venv
# On Windows:
venv\Scripts\activate
# On Mac/Linux:
source venv/bin/activate
pip install -r requirements.txt
uvicorn app.main:app --reload --port 8000
```

The service runs at http://localhost:8000.
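To confirm the AI service is up, you can query FastAPI's auto-generated OpenAPI endpoint. This assumes the app keeps FastAPI's default /openapi.json route enabled, which is the framework's out-of-the-box behavior.

```python
# check_ai_service.py - quick sanity check that the FastAPI service is running.
# Assumes FastAPI's default /openapi.json route has not been disabled.
import requests

resp = requests.get("http://localhost:8000/openapi.json", timeout=5)
resp.raise_for_status()
print("AI service is up. Available routes:")
for path in resp.json().get("paths", {}):
    print(" ", path)
```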
Next, start the Node.js API gateway:

```bash
cd ../server
npm install
npm start
```

The server runs at http://localhost:5000.
Finally, start the React frontend:

```bash
cd ../client
npm install
npm start
```

The app opens at http://localhost:1234.
PolicyPal represents the intersection of practical AI engineering and real-world problem solving. What started as an ambitious self-hosted AI project evolved into a production-ready application that demonstrates both technical depth and deployment pragmatism.

Key Achievements:

- ✅ End-to-end RAG implementation from document processing to response generation
- ✅ Production deployment across multiple cloud platforms
- ✅ Architectural flexibility: a seamless transition from self-hosted to cloud APIs
- ✅ User-centric design: complex insurance logic translated into clear, actionable insights
This project showcases not just the ability to build sophisticated AI systems, but also the engineering judgment to adapt and deploy them under real-world constraints. PolicyPal makes insurance accessible, one query at a time.
| Name | Role | GitHub |
|---|---|---|
| Rupali Kumari | Team Leader & Backend Developer | 🔗 @Rupali2507 |
| Shanvi Dixit | Frontend Developer | 🔗 @shanvid19 |
| Prisha Garg | ML Engineer | 🔗 @prishagarg |
| Muskan | ML Engineer (yours truly) 💫 | 🔗 @muskan-khushi |
Built with ❤️ and lots of ☕