ScottLL/Chat_bot

DeFi Q&A Bot 🤖💰

A modern, AI-powered question-answering system specialized in Decentralized Finance (DeFi) topics. Built with LangGraph orchestration, FastAPI, and React, featuring real-time streaming responses and advanced semantic search capabilities.

You can try the live app here (initial load may take a while). Note that the Fly.io trial ends on July 19th, so the link will no longer be available after that date.

🚀 Features

Core Capabilities

  • 🧠 Intelligent Q&A: Semantic search over a curated DeFi knowledge base
  • ⚡ Real-time Streaming: Word-by-word response streaming via WebSocket
  • 🔍 High Accuracy: OpenAI embeddings with similarity-based retrieval
  • 📊 Confidence Scoring: Each answer includes a confidence percentage
  • 🎯 DeFi Specialized: Covers lending, DEXs, yield farming, staking, and more
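At its core, retrieval compares an embedded question against precomputed answer embeddings by cosine similarity. The sketch below uses toy 3-dimensional vectors in place of real OpenAI embeddings, and the similarity-to-percentage mapping is an illustrative assumption, not the app's actual confidence formula:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(question_vec, kb):
    """Return the best-matching answer and a confidence percentage.

    kb maps an answer string to its (toy) embedding vector.
    """
    best_answer, best_score = None, -1.0
    for answer, vec in kb.items():
        score = cosine_similarity(question_vec, vec)
        if score > best_score:
            best_answer, best_score = answer, score
    return best_answer, round(best_score * 100, 1)  # similarity as a confidence %

# Toy "embeddings" standing in for OpenAI vectors
kb = {
    "Staking locks tokens to secure a network.": [0.9, 0.1, 0.0],
    "A DEX swaps tokens via liquidity pools.": [0.1, 0.9, 0.2],
}
answer, confidence = retrieve([0.85, 0.15, 0.05], kb)
```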

Technical Features

  • 🌐 Concurrent Handling: Support for multiple simultaneous users
  • 🔄 Session Management: Isolated user sessions with automatic cleanup
  • 📈 Monitoring & Metrics: Comprehensive logging and Prometheus metrics
  • 🛡️ Error Resilience: Robust error handling with user-friendly messages
  • 🚀 Production Ready: Docker containerization and cloud deployment configs
  • ⚡ Rate Limiting: Configurable request rate limiting for API protection
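Configurable rate limiting is typically built on a token-bucket scheme; the minimal sketch below illustrates the idea and is not the project's actual middleware:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: `rate` tokens/second, burst up to `capacity`."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=2)  # 5 requests/second, burst of 2
results = [bucket.allow() for _ in range(3)]  # third call exceeds the burst
```

In an API server, a rejected call would map to an HTTP 429 response.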

📋 Quick Start

Prerequisites

  • Python 3.11+
  • Node.js 18+
  • OpenAI API Key

1. Clone & Setup

git clone https://github.com/yourusername/Chat_bot.git
cd Chat_bot

# Backend setup
cd backend
python -m venv env
source env/bin/activate  # On Windows: env\Scripts\activate
pip install -r requirements.txt

# Frontend setup
cd ../frontend
npm install

2. Environment Configuration

Create a .env file in the backend directory:

cp backend/environment_template.txt backend/.env

Edit .env with your settings:

OPENAI_API_KEY=your_openai_api_key_here
ENVIRONMENT=development
HOST=127.0.0.1
PORT=8000

3. Start the Application

# Terminal 1: Start Backend
cd backend
python -m uvicorn main:app --reload

# Terminal 2: Start Frontend
cd frontend
npm start

4. Access the Application

  • Frontend: http://localhost:3000 (React dev server default)
  • Backend API: http://localhost:8000
  • Interactive API docs: http://localhost:8000/docs

πŸ—οΈ Architecture Overview

System Components

graph TB
    UI[React Frontend] --> WS[WebSocket Connection]
    UI --> HTTP[HTTP API]
  
    WS --> MAIN[FastAPI Backend]
    HTTP --> MAIN
  
    MAIN --> LG[LangGraph Agent]
    MAIN --> SM[Session Manager]
    MAIN --> MON[Monitoring System]
  
    LG --> EMB[Embedding Service]
    LG --> CACHE[Cache Manager]
    LG --> DS[Dataset Loader]
  
    EMB --> OPENAI[OpenAI API]
    CACHE --> FILES[File System]
    DS --> JSON[DeFi Dataset]
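The LangGraph agent wires these components into a retrieve-then-answer flow over shared state. The plain-Python sketch below mirrors that shape without the LangGraph API; the node names and the keyword-overlap matcher are illustrative stand-ins (the real app embeds the question via OpenAI):

```python
def load_dataset(state):
    """Dataset-loader node: attach the Q&A pairs to the state."""
    state["dataset"] = [{"q": "What is staking?", "a": "Locking tokens to secure a network."}]
    return state

def embed_and_retrieve(state):
    """Embedding-service node: the real app calls OpenAI; here, keyword overlap."""
    question = set(state["question"].lower().split())
    state["match"] = max(
        state["dataset"],
        key=lambda pair: len(question & set(pair["q"].lower().split())),
    )
    return state

def answer(state):
    """Final node: emit the matched answer."""
    state["answer"] = state["match"]["a"]
    return state

def run(question):
    # A graph agent is, at heart, an ordered set of state-transforming nodes
    state = {"question": question}
    for node in (load_dataset, embed_and_retrieve, answer):
        state = node(state)
    return state["answer"]

result = run("what is staking?")
```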

🎯 Usage

Web Interface

  1. Ask Questions: Type DeFi-related questions in the input field
  2. Real-time Responses: Watch answers stream in word-by-word
  3. Confidence Scores: See how confident the AI is in each answer
  4. Connection Status: Monitor your WebSocket connection status
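Word-by-word streaming amounts to chunking the answer and sending each piece over the WebSocket as it becomes available. A sketch of the server-side generator logic (the delay value and the `stream_words` name are assumptions for illustration):

```python
import asyncio

async def stream_words(answer: str, delay: float = 0.02):
    """Yield an answer one word at a time, as a WebSocket handler would send it."""
    for word in answer.split():
        yield word
        await asyncio.sleep(delay)  # pace the stream so the UI animates

async def collect(answer):
    # Stand-in for `await websocket.send_text(word)` in a FastAPI handler
    return [word async for word in stream_words(answer, delay=0)]

words = asyncio.run(collect("Impermanent loss affects liquidity providers"))
```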

Example Questions

  • "What is impermanent loss in liquidity provision?"
  • "How does Compound's interest rate model work?"
  • "What are the risks of yield farming?"
  • "How do flash loans work on Aave?"
  • "What is the difference between APR and APY in DeFi?"

🐳 Docker Deployment

Quick Docker Setup

# Build and run with Docker
cd backend
docker build -t defi-qa-bot .
docker run -p 8000:8000 --env-file .env defi-qa-bot

Docker Compose (Full Stack)

# Copy environment template
cp docker.env.example .env
# Edit .env with your OPENAI_API_KEY

# Start all services
docker-compose up --build -d

# Access application
# Frontend: http://localhost
# Backend: http://localhost:8000

🚀 Cloud Deployment

Recommended Platform: Fly.io

This application is deployed on Fly.io because its large ML dependencies rule out serverless platforms (see the Vercel limitation below). You can access the live demo at: https://defi-qa-frontend.fly.dev/

Deployment Challenges & Solutions

⚠️ Vercel Limitation

Important: This application cannot be deployed on Vercel due to the 250MB serverless function limit. The combination of:

  • Large ML/AI dependencies (OpenAI, LangGraph, sentence transformers)
  • Extensive NLP libraries and models
  • DeFi dataset and embeddings cache
  • FastAPI and associated dependencies

results in a bundle size that exceeds Vercel's 250MB unzipped limit for serverless functions. Although a vercel.json configuration file exists in the repository, deployment will fail with a "Serverless Function has exceeded the unzipped maximum size" error.

✅ Fly.io Solution

Fly.io was chosen as the deployment platform because:

  • No function size limits: Supports applications with large dependencies
  • Persistent storage: Better handling of cache files and datasets
  • Container-based: Full Docker support for complex Python applications
  • WebSocket support: Native support for real-time features
  • Global edge network: Fast performance worldwide

Supported Deployment Platforms

| Platform | Configuration File | Status            | Notes                |
|----------|--------------------|-------------------|----------------------|
| Fly.io   | fly.toml           | ✅ Recommended    | Currently deployed   |
| Heroku   | Procfile           | ✅ Compatible     | Large slug size      |
| Railway  | railway.toml       | ✅ Compatible     | Good Docker support  |
| Render   | render.yaml        | ✅ Compatible     | Blueprint deployment |
| Vercel   | vercel.json        | ❌ Not Compatible | 250MB limit exceeded |

Fly.io Deployment

# Install Fly CLI
curl -L https://fly.io/install.sh | sh

# Deploy the application
fly deploy

# Set environment variables
fly secrets set OPENAI_API_KEY=your-key-here

Environment Variables for Production

OPENAI_API_KEY=your-openai-key
ENVIRONMENT=production
DEBUG=false
ALLOWED_ORIGINS=https://your-domain.com

🛠️ Development

Project Structure

Chat_bot/
├── backend/                # FastAPI backend (see backend/README.md)
│   ├── main.py             # Application entry point
│   ├── agents/             # LangGraph agents
│   ├── services/           # Business logic
│   ├── infrastructure/     # Monitoring, logging
│   └── tests/              # Test suite
├── frontend/               # React frontend (see frontend/README.md)
│   └── src/
├── data/                   # DeFi Q&A dataset
└── cache/                  # Cached embeddings

Running Tests

# Backend tests
cd backend
python -m pytest tests/ -v

# Integration tests
python test_integration.py
python test_api.py

Development Environment

# Backend development mode
cd backend
python -m uvicorn main:app --reload --host 0.0.0.0 --port 8000

# Frontend development mode
cd frontend
npm start

📊 Monitoring

Access the monitoring dashboard at /dashboard for:

  • Health Score: Overall system health (0-100)
  • Performance Metrics: Response times and request counts
  • System Resources: Memory and CPU usage
  • WebSocket Activity: Active connections and message flow
  • Error Tracking: Real-time error monitoring

Key Endpoints

  • Health Check: GET /health
  • API Documentation: GET /docs
  • Metrics: GET /metrics (Prometheus format)
  • Dashboard: GET /dashboard
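/metrics serves the standard Prometheus text exposition format. The hand-rolled sketch below shows what such a payload looks like; the metric name is illustrative, and a real app would normally use a client library rather than formatting by hand:

```python
def render_metrics(counters: dict) -> str:
    """Render counters in Prometheus text exposition format."""
    lines = []
    for name, (help_text, value) in counters.items():
        lines.append(f"# HELP {name} {help_text}")
        lines.append(f"# TYPE {name} counter")
        lines.append(f"{name} {value}")
    return "\n".join(lines) + "\n"

payload = render_metrics({
    "qa_requests_total": ("Total Q&A requests served.", 42),
})
```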

📊 Performance

Benchmarks

  • Response Time: < 2 seconds average for semantic search
  • Concurrent Users: Tested with 50+ simultaneous connections
  • Throughput: 200+ requests per minute per instance
  • Memory Usage: ~200MB baseline, scales with concurrent sessions
  • Cache Hit Rate: 80%+ for repeated questions
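The cache-hit figure comes from persisting embeddings so that repeated questions skip the OpenAI call entirely. A minimal file-cache sketch, where the hash-based key and JSON-per-file layout are assumptions about the cache/ directory rather than the project's actual format:

```python
import hashlib
import json
import tempfile
from pathlib import Path

class EmbeddingCache:
    """Cache question embeddings on disk, keyed by a hash of the text."""

    def __init__(self, directory):
        self.directory = Path(directory)
        self.directory.mkdir(parents=True, exist_ok=True)

    def _path(self, text: str) -> Path:
        return self.directory / (hashlib.sha256(text.encode()).hexdigest() + ".json")

    def get_or_compute(self, text: str, compute):
        """Return the cached vector, or compute, store, and return it."""
        path = self._path(text)
        if path.exists():
            return json.loads(path.read_text())  # cache hit: no API call
        vector = compute(text)
        path.write_text(json.dumps(vector))
        return vector

calls = []
def fake_embed(text):  # stand-in for an OpenAI embeddings call
    calls.append(text)
    return [0.1, 0.2, 0.3]

cache = EmbeddingCache(tempfile.mkdtemp())
first = cache.get_or_compute("what is staking?", fake_embed)
second = cache.get_or_compute("what is staking?", fake_embed)  # served from disk
```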

🤝 Contributing

We welcome contributions! Here's how to get started:

Development Setup

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/amazing-feature
  3. Follow the development setup in Quick Start
  4. Make your changes and add tests
  5. Run the test suite: python -m pytest
  6. Commit with conventional commits: git commit -m "feat: add amazing feature"
  7. Push and create a Pull Request

Areas for Contribution

  • 🔍 Expand Dataset: Add more DeFi Q&A pairs
  • 🧠 Improve AI: Enhance semantic search accuracy
  • 🎨 UI/UX: Improve frontend design and user experience
  • 📊 Analytics: Add more monitoring and analytics features
  • 🌐 Integrations: Add support for more data sources

📞 Support

  • Backend Documentation: See backend/README.md
  • Frontend Documentation: See frontend/README.md
  • Issues: Report bugs via GitHub Issues
  • Discussions: Join our GitHub Discussions for questions

📜 License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments

  • OpenAI: For providing the GPT and embedding models
  • LangGraph: For the excellent graph-based agent framework
  • FastAPI: For the high-performance async web framework
  • React: For the modern frontend framework

Built with ❤️ for the DeFi community
