A sophisticated, enterprise-grade HR assistant system that revolutionizes recruitment and employee management
This AI-powered HR assistant combines cutting-edge artificial intelligence, biometric verification, and intelligent document processing to streamline HR operations. Built with enterprise scalability in mind, it provides a comprehensive solution for resume screening, candidate verification, and intelligent HR query handling.
- 🤖 Intelligent Conversational AI: Context-aware chatbot powered by OpenAI's GPT models for natural HR interactions
- 📄 Automated Resume Processing: Extract, analyze, and categorize candidate information with high accuracy
- 🔐 Biometric Verification: Secure face verification using Face++ API for identity authentication
- 🔍 Smart Knowledge Retrieval: FAISS-powered vector search for instant access to HR policies and documentation
- 🚀 Production-Ready: Containerized architecture with Nginx for scalable, enterprise deployment
- 💾 Robust Data Management: MongoDB integration for secure storage of verification records and user profiles
- 🔌 RESTful Architecture: Clean, well-documented API endpoints for seamless integration
```
┌─────────────────┐
│   Nginx Proxy   │
└────────┬────────┘
         │
┌────────▼────────────────────────────┐
│         FastAPI Application         │
├─────────────────────────────────────┤
│  ┌──────────────────────────────┐   │
│  │   Chatbot Service (GPT-5)    │   │
│  └──────────────────────────────┘   │
│  ┌──────────────────────────────┐   │
│  │   Resume Processing Engine   │   │
│  └──────────────────────────────┘   │
│  ┌──────────────────────────────┐   │
│  │  Face Verification (Face++)  │   │
│  └──────────────────────────────┘   │
│  ┌──────────────────────────────┐   │
│  │     FAISS Knowledge Base     │   │
│  └──────────────────────────────┘   │
└─────────────┬───────────────────────┘
              │
      ┌───────▼────────┐
      │    MongoDB     │
      └────────────────┘
```
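As a rough sketch of how the services in the diagram can be wired together, the FastAPI entry point might look like the following; the router names here are illustrative placeholders, not the project's actual modules:

```python
# Minimal wiring sketch for the architecture above (illustrative only).
from fastapi import APIRouter, FastAPI

app = FastAPI(title="AI HR Assistant")

# Hypothetical routers standing in for the chatbot, resume, and
# face-verification services shown in the diagram.
chat_router = APIRouter(prefix="/chat", tags=["chatbot"])
resume_router = APIRouter(prefix="/resume", tags=["resume"])
verification_router = APIRouter(prefix="/face-verification", tags=["verification"])

for router in (chat_router, resume_router, verification_router):
    app.include_router(router)

@app.get("/health")
def health() -> dict:
    # Response body is an assumption; the README only documents the /health path.
    return {"status": "ok"}
```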
- Recruitment Automation: Screen hundreds of resumes in minutes
- Identity Verification: Verify candidate identity during virtual interviews
- HR Query Management: Instant answers to policy and procedure questions
- Onboarding Support: Guide new employees through documentation and processes
- Compliance Tracking: Maintain verification records for audit purposes
Ensure you have the following installed and configured:
- Python: Version 3.12 or higher
- Docker: Latest stable version
- Docker Compose: Version 2.0+
- API Credentials:
  - OpenAI API key with GPT-5 access
  - Face++ API key and secret
  - MongoDB connection URI
- Clone the Repository

  ```bash
  git clone <repository-url>
  cd ai-hr-assistant
  ```

- Configure Environment Variables

  Create a `.env` file in the project root:

  ```
  # OpenAI Configuration
  OPENAI_API_KEY=sk-your-openai-api-key
  MODEL=gpt-5

  # Face++ Configuration
  API_KEY=your-faceplus-api-key
  API_SECRET=your-faceplus-secret

  # Database Configuration
  MONGODB_URI=mongodb://username:password@localhost:27017/hr_assistant

  # Application Configuration
  APP_HOST=0.0.0.0
  APP_PORT=8000
  LOG_LEVEL=INFO
  ```

- Install Python Dependencies

  ```bash
  pip install -r requirements.txt
  ```

- Prepare Knowledge Base

  Place your HR documents (policies, procedures, FAQs) in the knowledge base directory:

  ```bash
  mkdir -p data/knowledge_base
  # Copy your documents to data/knowledge_base/
  ```

- Build Knowledge Base Index

  ```bash
  python ingest.py
  ```

  This creates FAISS vector indices for efficient document retrieval (a sketch of this step appears after this list).
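The indexing step is not spelled out in this README; below is a minimal sketch of what a LangChain-based ingest.py could look like. The loader choice, function name, and everything other than the data/knowledge_base/ and faiss_index/ paths are assumptions, not the actual script:

```python
# ingest.py (sketch) — build a FAISS index from the HR documents.
# Assumes LangChain + OpenAI embeddings; chunk sizes mirror the
# CHUNK_SIZE/CHUNK_OVERLAP values described later in this README.
from langchain_community.document_loaders import DirectoryLoader
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

def build_index(source_dir: str = "data/knowledge_base", index_dir: str = "faiss_index") -> None:
    # Load every document in the knowledge base directory.
    documents = DirectoryLoader(source_dir).load()

    # Split documents into overlapping chunks for retrieval.
    splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
    chunks = splitter.split_documents(documents)

    # Embed the chunks and persist the FAISS index to disk.
    vectorstore = FAISS.from_documents(chunks, OpenAIEmbeddings())
    vectorstore.save_local(index_dir)

if __name__ == "__main__":
    build_index()
```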
For local development with hot-reload:

```bash
uvicorn com.mhire.app.main:app --host 0.0.0.0 --port 8000 --reload
```

Access the application at http://localhost:8000.
Deploy using Docker Compose for production environments:

```bash
# Build and start all services
docker-compose up -d --build

# View logs
docker-compose logs -f

# Stop services
docker-compose down
```

Production endpoint: http://localhost:3014
Verify the application is running:

```bash
curl http://localhost:8000/health
```

Once running, access comprehensive API documentation:

- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
```
POST /chat
Content-Type: application/json

{
  "message": "What is the company's leave policy?",
  "session_id": "user-123"
}
```

```
POST /resume
Content-Type: multipart/form-data

file: resume.pdf
```

```
POST /face-verification
Content-Type: multipart/form-data

image: photo.jpg
user_id: "candidate-456"
```

```
GET /verification/{user_id}
```

All endpoints return JSON responses with consistent structure:

```json
{
  "status": "success",
  "data": { /* response data */ },
  "message": "Operation completed successfully"
}
```
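For reference, the endpoints above can be exercised from Python roughly as follows; the base URL matches the local development setup and the field names mirror the request examples, while the sample file names are placeholders:

```python
# Sketch of calling the API with the `requests` library.
# Assumes the service is running locally on port 8000 (see Quick Start).
import requests

BASE_URL = "http://localhost:8000"

# Chatbot query
chat = requests.post(
    f"{BASE_URL}/chat",
    json={"message": "What is the company's leave policy?", "session_id": "user-123"},
)
print(chat.json())

# Resume upload (multipart/form-data)
with open("resume.pdf", "rb") as f:
    resume = requests.post(f"{BASE_URL}/resume", files={"file": f})
print(resume.json())

# Face verification (image plus user_id form field)
with open("photo.jpg", "rb") as f:
    verification = requests.post(
        f"{BASE_URL}/face-verification",
        files={"image": f},
        data={"user_id": "candidate-456"},
    )
print(verification.json())

# Retrieve a stored verification record
record = requests.get(f"{BASE_URL}/verification/candidate-456")
print(record.json())
```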
```
ai-hr-assistant/
│
├── com/mhire/app/
│   ├── main.py                     # FastAPI application & routing
│   ├── config/
│   │   └── settings.py             # Configuration management
│   ├── database/
│   │   └── mongodb.py              # Database connection handler
│   ├── models/
│   │   ├── chat.py                 # Chat data models
│   │   ├── resume.py               # Resume data models
│   │   └── verification.py         # Verification data models
│   └── services/
│       ├── chatbot/
│       │   ├── chain.py            # LangChain conversation chain
│       │   └── prompts.py          # System prompts
│       ├── resume/
│       │   ├── parser.py           # Resume parsing logic
│       │   └── analyzer.py         # Resume analysis
│       ├── verification/
│       │   └── handler.py          # User verification logic
│       └── verification_system/
│           └── face_verification/
│               ├── detector.py     # Face detection
│               └── comparator.py   # Face comparison
│
├── data/
│   └── knowledge_base/             # HR documents & policies
│
├── faiss_index/                    # Vector store indices (generated)
│
├── nginx/
│   └── nginx.conf                  # Nginx configuration
│
├── tests/                          # Unit & integration tests
│   ├── test_chatbot.py
│   ├── test_resume.py
│   └── test_verification.py
│
├── docker-compose.yml              # Multi-container orchestration
├── Dockerfile                      # Container image definition
├── requirements.txt                # Python dependencies
├── ingest.py                       # Knowledge base indexing script
├── .env.example                    # Example environment variables
├── .gitignore                      # Git ignore rules
└── README.md                       # This file
```
- OpenAI GPT-5: State-of-the-art language understanding
- LangChain: LLM application framework
- FAISS: Efficient similarity search (Facebook AI)
- Face++: Enterprise face detection and recognition
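At query time these components typically combine as sketched below; the faiss_index/ path follows the project layout, while the prompt wording and function are illustrative rather than the project's actual chain.py:

```python
# Sketch of a retrieval-augmented answer using FAISS + LangChain + OpenAI.
# The MODEL value and faiss_index/ path come from this README; everything
# else is illustrative.
import os

from langchain_community.vectorstores import FAISS
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

def answer(question: str) -> str:
    # Load the index produced by ingest.py and fetch the most relevant chunks.
    vectorstore = FAISS.load_local(
        "faiss_index", OpenAIEmbeddings(), allow_dangerous_deserialization=True
    )
    docs = vectorstore.similarity_search(question, k=4)
    context = "\n\n".join(doc.page_content for doc in docs)

    # Ask the chat model to answer strictly from the retrieved context.
    prompt = ChatPromptTemplate.from_messages([
        ("system", "You are an HR assistant. Answer using only the provided context."),
        ("human", "Context:\n{context}\n\nQuestion: {question}"),
    ])
    llm = ChatOpenAI(model=os.getenv("MODEL", "gpt-5"))
    return (prompt | llm).invoke({"context": context, "question": question}).content
```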
- All sensitive credentials stored in environment variables
- API keys never committed to version control
- MongoDB authentication enforced
- HTTPS recommended for production deployments
- Face verification data encrypted at rest
- Temporary face images deleted after processing
- Compliance with GDPR and CCPA regulations
- API key authentication for all endpoints
- Rate limiting configured in Nginx
- Session management for chatbot conversations
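To make the temporary-image handling above concrete, here is a hedged sketch of a Face++ comparison call that deletes the uploaded image once a result is returned; the endpoint and field names follow the public Face++ Compare API, while the function itself is illustrative:

```python
# Sketch: compare two face images via the Face++ Compare API, then delete
# the temporary candidate image regardless of the outcome.
import os

import requests

FACEPP_COMPARE_URL = "https://api-us.faceplusplus.com/facepp/v3/compare"

def compare_and_cleanup(reference_path: str, temp_image_path: str) -> float:
    try:
        with open(reference_path, "rb") as ref, open(temp_image_path, "rb") as tmp:
            response = requests.post(
                FACEPP_COMPARE_URL,
                data={"api_key": os.environ["API_KEY"], "api_secret": os.environ["API_SECRET"]},
                files={"image_file1": ref, "image_file2": tmp},
                timeout=30,
            )
        response.raise_for_status()
        # "confidence" is Face++'s similarity score (0-100).
        return response.json().get("confidence", 0.0)
    finally:
        # Delete the temporary image after processing, per the policy above.
        if os.path.exists(temp_image_path):
            os.remove(temp_image_path)
```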
```bash
# Never commit .env files
echo ".env" >> .gitignore

# Use strong MongoDB passwords
# Rotate API keys regularly
# Enable MongoDB encryption at rest
# Configure HTTPS with valid SSL certificates
```

Run the test suite:
```bash
# Install test dependencies
pip install pytest pytest-asyncio pytest-cov

# Run all tests
pytest

# Run with coverage
pytest --cov=com.mhire.app --cov-report=html

# Run specific test file
pytest tests/test_chatbot.py -v
```
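As a starting point for new tests, here is a minimal smoke test in the same style; it assumes only the /health endpoint and the app import path used in the uvicorn command above:

```python
# tests/test_health.py (sketch) — smoke test for the health endpoint.
from fastapi.testclient import TestClient

from com.mhire.app.main import app

client = TestClient(app)

def test_health_endpoint_returns_ok():
    response = client.get("/health")
    assert response.status_code == 200
```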
The application is configured through the following environment variables:

| Variable | Description | Required | Default |
|---|---|---|---|
| `OPENAI_API_KEY` | OpenAI API authentication key | ✅ | - |
| `MODEL` | OpenAI model identifier | ✅ | `gpt-5` |
| `API_KEY` | Face++ API key | ✅ | - |
| `API_SECRET` | Face++ API secret | ✅ | - |
| `MONGODB_URI` | MongoDB connection string | ✅ | - |
| `APP_HOST` | Application host | ❌ | `0.0.0.0` |
| `APP_PORT` | Application port | ❌ | `8000` |
| `LOG_LEVEL` | Logging level | ❌ | `INFO` |
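These variables are typically loaded in config/settings.py; the sketch below uses pydantic-settings and mirrors the table above, but the actual settings module may differ:

```python
# Sketch of a settings module loading the variables listed above from .env.
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    OPENAI_API_KEY: str
    MODEL: str = "gpt-5"
    API_KEY: str
    API_SECRET: str
    MONGODB_URI: str
    APP_HOST: str = "0.0.0.0"
    APP_PORT: int = 8000
    LOG_LEVEL: str = "INFO"

settings = Settings()
```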
Update ingest.py to customize document processing:

```python
# Supported document types
SUPPORTED_FORMATS = ['.pdf', '.docx', '.txt', '.md']

# Chunking parameters
CHUNK_SIZE = 1000
CHUNK_OVERLAP = 200
```

- FAISS Index: Use GPU-accelerated FAISS for large knowledge bases
- Caching: Implement Redis for chatbot session caching
- Load Balancing: Deploy multiple Nginx instances behind a load balancer
- Database: Configure MongoDB replica sets for high availability
Integrate monitoring tools:
- Prometheus: Metrics collection
- Grafana: Visualization dashboards
- Sentry: Error tracking and reporting
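For Prometheus specifically, one lightweight option (not currently part of the codebase) is prometheus-fastapi-instrumentator:

```python
# Sketch: expose request metrics at /metrics for Prometheus to scrape.
# Requires `pip install prometheus-fastapi-instrumentator`; typically added
# in main.py right after the FastAPI app is created.
from prometheus_fastapi_instrumentator import Instrumentator

from com.mhire.app.main import app

Instrumentator().instrument(app).expose(app)
```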
We welcome contributions! Please follow these guidelines:
- Fork the repository
- Create a feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
- Follow PEP 8 style guide
- Add docstrings to all functions
- Write unit tests for new features
- Update documentation as needed
See CHANGELOG.md for version history and updates.
Issue: FAISS index not found

```bash
# Solution: Rebuild the index
python ingest.py
```

Issue: MongoDB connection failed

```bash
# Solution: Verify MongoDB is running
docker ps | grep mongo
# Check connection string in .env
```

Issue: Face++ API rate limit exceeded

```bash
# Solution: Implement request queuing or upgrade plan
```
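Until request queuing is in place, a simple exponential backoff around the Face++ call can soften rate-limit errors; a hedged sketch (the wrapped function is whatever performs the HTTP request):

```python
# Sketch: retry a Face++ request with exponential backoff when the API
# signals throttling (commonly HTTP 403/429 — check the error_message field).
import time

def call_with_backoff(request_fn, max_attempts: int = 5):
    """request_fn is a zero-argument callable returning a requests.Response."""
    response = request_fn()
    for attempt in range(1, max_attempts):
        if response.status_code not in (403, 429):
            break
        time.sleep(2 ** attempt)  # wait 2s, 4s, 8s, ... before retrying
        response = request_fn()
    return response
```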
For issues and questions:

- Bug Reports: Open an issue on GitHub
- Feature Requests: Submit via GitHub issues
- Security Concerns: Contact nazmulislam45213@gmail.com