AI-Powered Automated RFP & Tender Response System with Intelligent Opportunity Discovery
An enterprise-grade system that automatically discovers open tenders, analyzes RFP requirements, generates competitive proposals, and submits responses to maximize revenue through automated bid management.
- Prospect Discovery: Automated scanning of UN/NGO procurement and funding opportunities
- Decision Maker Intelligence: AI identification of key contacts and influencers
- Opportunity Scoring: Smart qualification and prioritization of prospects (see the scoring sketch after this list)
- Pipeline Management: Automated tracking from lead to closed revenue
- Outreach Automation: AI-powered contact campaigns and relationship building
- Win Rate Analytics: Continuous improvement of conversion strategies
- Revenue Forecasting: Predictive modeling for business planning
- Competitive Intelligence: Real-time monitoring of market opportunities
- FastAPI REST API: Production-ready API with comprehensive lead management endpoints
- Multi-Channel Integration: LinkedIn, email, CRM, and procurement platform connections
- Real-Time Monitoring: Continuous scanning for new opportunities and market changes
- Anti-Scraping System: Advanced rate limiting, proxy management, and CAPTCHA solving
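To make the scoring idea concrete, here is a hypothetical sketch: the fields, weights, and caps are invented for illustration and are not the shipped model.

```python
# Hypothetical opportunity-scoring sketch; field names and weights are illustrative.
from dataclasses import dataclass

@dataclass
class Opportunity:
    budget_usd: float          # estimated contract value
    days_to_deadline: int      # time left to respond
    capability_match: float    # 0..1 overlap with our service catalogue
    past_wins_with_buyer: int  # prior awards from this organisation

def score_opportunity(opp: Opportunity) -> float:
    """Return a 0..100 priority score used to rank the pipeline."""
    value = min(opp.budget_usd / 1_000_000, 1.0)         # cap at $1M for scaling
    urgency = max(0.0, 1.0 - opp.days_to_deadline / 60)  # nearer deadlines rank higher
    relationship = min(opp.past_wins_with_buyer / 3, 1.0)
    return 100 * (0.4 * opp.capability_match + 0.3 * value
                  + 0.2 * relationship + 0.1 * urgency)

# Example: strong capability match, mid-size budget, new buyer, 3 weeks out.
print(round(score_opportunity(Opportunity(500_000, 21, 0.9, 0)), 1))
```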
```
┌───────────────────────────────────────────────────────────────────┐
│                       ProposalMaster Layer                        │
│  ┌─────────────────┐  ┌─────────────────┐  ┌─────────────────┐    │
│  │  Task Planning  │  │  Orchestration  │  │    Progress     │    │
│  │   & Breakdown   │  │  & Scheduling   │  │    Tracking     │    │
│  └─────────────────┘  └─────────────────┘  └─────────────────┘    │
└───────────────────────────────────────────────────────────────────┘
┌───────────────────────────────────────────────────────────────────┐
│                  Automated RFP Discovery Layer                    │
│  ┌─────────────────┐  ┌─────────────────┐  ┌─────────────────┐    │
│  │     Tender      │  │   Opportunity   │  │    Deadline     │    │
│  │    Scanning     │  │    Filtering    │  │   Management    │    │
│  └─────────────────┘  └─────────────────┘  └─────────────────┘    │
└───────────────────────────────────────────────────────────────────┘
┌───────────────────────────────────────────────────────────────────┐
│                  AI Proposal Generation Layer                     │
│  ┌─────────────────┐  ┌─────────────────┐  ┌─────────────────┐    │
│  │  Requirements   │  │     Content     │  │   Compliance    │    │
│  │    Analysis     │  │   Generation    │  │  Verification   │    │
│  └─────────────────┘  └─────────────────┘  └─────────────────┘    │
└───────────────────────────────────────────────────────────────────┘
┌───────────────────────────────────────────────────────────────────┐
│                          Core Services                            │
│  ┌─────────────────┐  ┌─────────────────┐  ┌─────────────────┐    │
│  │     Portal      │  │    Document     │  │   Submission    │    │
│  │   Integration   │  │   Processing    │  │   Automation    │    │
│  └─────────────────┘  └─────────────────┘  └─────────────────┘    │
└───────────────────────────────────────────────────────────────────┘
┌───────────────────────────────────────────────────────────────────┐
│                           API Layer                               │
│        FastAPI REST Endpoints + Interactive Documentation         │
└───────────────────────────────────────────────────────────────────┘
```
- Python 3.8+ with pip
- 8GB+ RAM (recommended for ML models)
- API Keys: OpenAI, Anthropic, or Perplexity for AI features
- Clone and set up the environment:

```bash
git clone <repository-url>
cd proposal_master
python3 -m venv .venv
source .venv/bin/activate   # macOS/Linux
# .venv\Scripts\activate    # Windows
pip install -r requirements.txt
```
The system uses a `.env` file to manage API keys and other secrets. To get started, copy the example file:

```bash
cp .env.example .env
```

Then edit the `.env` file and add your API keys. At least one of the following AI provider keys is required for the system to function correctly:

- `ANTHROPIC_API_KEY`: For using Anthropic's Claude models.
- `OPENAI_API_KEY`: For using OpenAI's GPT models.
- `PERPLEXITY_API_KEY`: For using Perplexity's models for research.

The other keys listed in the `.env.example` file are optional and can be configured as needed.
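As an illustration, keys placed in `.env` might be loaded at startup like this, assuming `python-dotenv` (a common companion to `.env` files; whether the project uses it is an assumption):

```python
# Minimal sketch of loading .env keys at startup, assuming python-dotenv.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the project root into the process environment

# Fail fast if no AI provider is configured.
providers = {
    "anthropic": os.getenv("ANTHROPIC_API_KEY"),
    "openai": os.getenv("OPENAI_API_KEY"),
    "perplexity": os.getenv("PERPLEXITY_API_KEY"),
}
if not any(providers.values()):
    raise RuntimeError("Set at least one AI provider key in .env")
```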
- Configure API keys:

```bash
cp .env.example .env
# Edit .env with your API keys:
# OPENAI_API_KEY="sk-proj-..."
# ANTHROPIC_API_KEY="sk-ant-api03-..."
# PERPLEXITY_API_KEY="pplx-..."
```
Start the API server:

```bash
python start_api.py
# API:  http://localhost:8000
# Docs: http://localhost:8000/docs
```
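Once the server is up, a quick smoke test (assuming the `requests` package; the local health path mirrors the `/api/v1/health` route referenced in the deployment section below and is an assumption about the local route layout):

```python
# Smoke test against the locally running API server.
import requests

resp = requests.get("http://localhost:8000/api/v1/health", timeout=5)
print(resp.status_code, resp.json())
```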
Run the main application:

```bash
python main.py
```

This section details how to deploy the Proposal Master system to a Google Cloud Platform (GCP) environment using Pulumi for infrastructure management and Kubernetes (k3s) for container orchestration.
- Google Cloud SDK (`gcloud`): Installed and authenticated.
- Pulumi CLI: Installed.
- Docker: Installed locally for building container images.
- A Docker/container registry: Such as Google Container Registry (GCR) or Docker Hub, to store built images.
- SSH key: An SSH public key located at `~/.ssh/id_rsa.pub`.
The infrastructure is defined in `__main__.py` and managed by Pulumi. It creates a GCP Compute Engine instance and configures it with Docker and k3s.
- Configure Pulumi for GCP: Set your GCP project and region for Pulumi.

```bash
pulumi config set gcp:project YOUR_GCP_PROJECT_ID
pulumi config set gcp:region YOUR_GCP_REGION
```

- Deploy the infrastructure: Run `pulumi up` to preview and deploy the resources.

```bash
pulumi up
```

After a successful deployment, Pulumi will output the `instance_ip` of your new server.
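To give a sense of what `__main__.py` might contain, here is a minimal sketch of the Pulumi program using the `pulumi_gcp` provider. The machine type, image, and zone are illustrative, and the startup script that installs Docker and k3s is elided.

```python
# Minimal sketch of a Pulumi program for the GCP instance; sizes are illustrative.
import pulumi
import pulumi_gcp as gcp

instance = gcp.compute.Instance(
    "proposal-master",
    machine_type="e2-standard-2",
    zone="us-central1-a",
    boot_disk=gcp.compute.InstanceBootDiskArgs(
        initialize_params=gcp.compute.InstanceBootDiskInitializeParamsArgs(
            image="ubuntu-os-cloud/ubuntu-2204-lts",
        ),
    ),
    network_interfaces=[
        gcp.compute.InstanceNetworkInterfaceArgs(
            network="default",
            # An empty access config requests an ephemeral public IP.
            access_configs=[gcp.compute.InstanceNetworkInterfaceAccessConfigArgs()],
        )
    ],
    # metadata_startup_script would install Docker and k3s here.
)

# Exported so later steps can SSH in: `pulumi stack output instance_ip`.
pulumi.export("instance_ip",
              instance.network_interfaces[0].access_configs[0].nat_ip)
```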
The application is split into two containers: one for the backend and one for the frontend.
- Build the backend image:

```bash
docker build -f Dockerfile.backend -t YOUR_REGISTRY/proposal-master-backend:latest .
```

- Build the frontend image:

```bash
docker build -f Dockerfile.frontend -t YOUR_REGISTRY/proposal-master-frontend:latest .
```

- Push the images to your registry:

```bash
docker push YOUR_REGISTRY/proposal-master-backend:latest
docker push YOUR_REGISTRY/proposal-master-frontend:latest
```
Note: Remember to replace `YOUR_REGISTRY` with your container registry's path (e.g., `gcr.io/your-gcp-project-id`).
- Update image names in `k8s.yml`: Open the `k8s.yml` file and replace the placeholder image names (`your-registry/proposal-master-backend:latest` and `your-registry/proposal-master-frontend:latest`) with the actual image paths from the previous step.
- Connect to the server: SSH into the newly created GCP instance.

```bash
ssh ubuntu@<INSTANCE_IP>
```

- Apply the Kubernetes manifests: Once on the server, apply the Kubernetes configuration. The `k8s.yml` file needs to be copied to the server or otherwise made available; for simplicity, you can copy its contents and recreate the file there.

```bash
# On the GCP instance
kubectl apply -f /path/to/k8s.yml
```

A simpler alternative is to use the k3s kubeconfig on your local machine to deploy remotely.
Once the Kubernetes pods are running, you can access the application in your browser using the server's IP address.
- Frontend UI: `http://<INSTANCE_IP>/`
- Backend API: `http://<INSTANCE_IP>/api/` (e.g., `http://<INSTANCE_IP>/api/v1/health`)
The NGINX Ingress controller will automatically route requests to the correct service.
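As a quick post-deploy check, the sketch below probes both routes through the Ingress, assuming the `requests` package; replace the placeholder IP with the `instance_ip` Pulumi output.

```python
# Post-deploy smoke test against the Ingress-exposed services.
import requests

INSTANCE_IP = "203.0.113.10"  # placeholder: use `pulumi stack output instance_ip`

for url in (f"http://{INSTANCE_IP}/", f"http://{INSTANCE_IP}/api/v1/health"):
    resp = requests.get(url, timeout=10)
    print(url, "->", resp.status_code)
```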
```
proposal_master/
├── src/
│   ├── core/                # Core system components
│   │   ├── document_processor.py
│   │   ├── rag_system.py
│   │   └── similarity_engine.py
│   ├── modules/             # Automated RFP processing modules
│   │   ├── discovery/       # Tender scanning and opportunity detection
│   │   ├── analysis/        # RFP requirement extraction and compliance
│   │   ├── intelligence/    # Competitive analysis and win probability
│   │   ├── generation/      # Automated proposal content creation
│   │   └── submission/      # Portal integration and automated bidding
│   ├── agents/              # AI agent implementations
│   ├── prompts/             # AI prompt templates
│   ├── utils/               # Utility functions
│   └── api/                 # REST API endpoints
├── data/                    # Data storage
│   ├── documents/           # Document collections
│   ├── embeddings/          # Vector embeddings
│   └── cache/               # Temporary files
├── logs/                    # Application logs
├── main.py                  # Main application entry point
├── requirements.txt         # Python dependencies
└── README.md                # This file
```
- Create a module directory under `src/modules/`
- Implement the core functionality
- Add a corresponding agent in `src/agents/`
- Update API routes in `src/api/routes/`
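As an illustration of these steps, here is a hypothetical skeleton; the `pricing` module name, class names, and method signatures are invented for the example and are not taken from the codebase.

```python
# src/modules/pricing/estimator.py (hypothetical)
from dataclasses import dataclass
from typing import List

@dataclass
class PricingEstimate:
    total_usd: float
    confidence: float

class PricingEstimator:
    """Core functionality for the hypothetical `pricing` module."""

    def estimate(self, requirements: List[str]) -> PricingEstimate:
        # Toy heuristic: a base fee plus a per-requirement amount.
        total = 10_000.0 + 2_500.0 * len(requirements)
        return PricingEstimate(total_usd=total, confidence=0.5)

# src/agents/pricing_agent.py (hypothetical)
class PricingAgent:
    """Thin wrapper exposing the module to the agent/orchestration layer."""

    def __init__(self) -> None:
        self.estimator = PricingEstimator()

    def run(self, requirements: List[str]) -> dict:
        est = self.estimator.estimate(requirements)
        return {"total_usd": est.total_usd, "confidence": est.confidence}

# Usage:
print(PricingAgent().run(["24/7 support", "data migration"]))
```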
This project follows Python best practices:
- PEP 8 style guidelines
- Type hints for all functions
- Docstrings for classes and methods
- Black for code formatting
- Flake8 for linting
- MyPy for type checking
Run tests and quality checks:

```bash
pytest tests/
black src/
flake8 src/
mypy src/
```

The system supports integration with various AI services (see the sketch after this list):
- OpenAI GPT models for content generation
- Anthropic Claude for analysis and reasoning
- Sentence Transformers for embeddings
- Custom fine-tuned models for domain-specific tasks
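For instance, a proposal section generator might fall back across providers depending on which key is configured. This is a minimal sketch using the official `openai` and `anthropic` Python SDKs; the model names are illustrative, and how the codebase actually routes providers is an assumption.

```python
# Minimal dual-provider content-generation sketch; model names are illustrative.
import os

def generate_section(prompt: str) -> str:
    if os.getenv("OPENAI_API_KEY"):
        from openai import OpenAI
        client = OpenAI()  # reads OPENAI_API_KEY from the environment
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content
    if os.getenv("ANTHROPIC_API_KEY"):
        import anthropic
        client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY
        resp = client.messages.create(
            model="claude-3-5-sonnet-latest",
            max_tokens=1024,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.content[0].text
    raise RuntimeError("No AI provider key configured")
```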
- RFP Samples: `data/documents/rfp_samples/`
- Case Studies: `data/documents/case_studies/`
- Industry Reports: `data/documents/industry_reports/`
- Templates: `data/documents/templates/`
- Vector Index: `data/embeddings/rag_index.pkl`
- Temporary Files: `data/cache/temp_files/`
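As a sketch of how the vector index might be produced, the snippet below embeds documents with Sentence Transformers (listed above) and pickles them to the `rag_index.pkl` path; the dictionary layout is an assumption, not the actual index format.

```python
# Sketch: build and pickle a vector index at data/embeddings/rag_index.pkl.
import pickle
from pathlib import Path
from sentence_transformers import SentenceTransformer

docs = ["Sample RFP requirement text...", "Past case study summary..."]
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(docs)  # numpy array, one row per document

index_path = Path("data/embeddings/rag_index.pkl")
index_path.parent.mkdir(parents=True, exist_ok=True)
with index_path.open("wb") as f:
    pickle.dump({"documents": docs, "embeddings": embeddings}, f)
```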
- Secure document handling and storage
- API authentication and authorization
- Compliance checking for industry standards
- Data privacy and confidentiality measures
- Complete AI agent implementations
- Web-based user interface
- Advanced analytics dashboard
- Integration with CRM systems
- Multi-language support
- Cloud deployment options
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes
- Run tests and ensure code quality
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is open source. Feel free to modify and distribute as needed.
For questions, issues, or feature requests, please open an issue in the repository.
Proposal Master - Transforming proposal management with AI intelligence