APLMinT is a versatile and robust Python-based Telegram bot that acts as a proxy to various Large Language Models (LLMs). It allows users to interact with multiple state-of-the-art AI models directly from their Telegram chat, with features designed for a seamless and controlled user experience.
Usage: The bot is accessible on Telegram at @aplmint_bot. Simply start a chat, select a model using the `/model` command, and start prompting!
- Multi-Model Support: Interact with a variety of powerful LLMs through the OpenRouter API.
- Interactive Model Selection: Users can easily switch between models using a clean, interactive `/model` command with inline keyboard buttons.
- Intelligent Defaults: The bot remembers the last model a user selected for subsequent queries, providing a smooth conversational flow.
- Rate Limiting: A built-in daily query limit per user helps manage API usage and costs (see the sketch after this list).
- Concurrency Control: The bot intelligently handles simultaneous requests from a single user, preventing spam and ensuring orderly responses.
- Persistent Logging: All query metadata (user, model used, timestamp) is logged to a lightweight SQLite database for analytics and monitoring.
- Dockerized for Deployment: Comes with a `Dockerfile` for easy, consistent, and isolated deployment.
- Modern Development Environment: Developed using `uv` for fast and efficient package management.
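The rate limiting and logging features both revolve around a small table of query metadata in SQLite. The following is a minimal sketch of that bookkeeping; the table name, columns, and daily limit are illustrative and not the project's actual schema (which lives in `src/database.py`).

```python
import sqlite3
from datetime import datetime, timezone

DAILY_LIMIT = 50  # hypothetical per-user daily cap


def init_db(path: str = "aplmint.db") -> sqlite3.Connection:
    """Create the query-log table if it does not exist yet."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS queries (
               user_id   INTEGER NOT NULL,
               model     TEXT    NOT NULL,
               timestamp TEXT    NOT NULL
           )"""
    )
    return conn


def queries_today(conn: sqlite3.Connection, user_id: int) -> int:
    """Count how many queries a user has made since midnight UTC."""
    today = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    row = conn.execute(
        "SELECT COUNT(*) FROM queries WHERE user_id = ? AND timestamp LIKE ?",
        (user_id, f"{today}%"),
    ).fetchone()
    return row[0]


def log_query(conn: sqlite3.Connection, user_id: int, model: str) -> None:
    """Record one query's metadata for analytics and rate limiting."""
    conn.execute(
        "INSERT INTO queries (user_id, model, timestamp) VALUES (?, ?, ?)",
        (user_id, model, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()
```

A handler would then check `queries_today(conn, user_id) >= DAILY_LIMIT` before forwarding a prompt to the LLM, and call `log_query(...)` after a successful response.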
- Language: Python 3.13
- Bot Framework: python-telegram-bot (v22.1+)
- LLM Gateway: OpenRouter
- HTTP Client: httpx (asynchronous; see the request sketch after this list)
- Database: SQLite
- Containerization: Docker
- Package Manager: uv
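For context on how the OpenRouter and httpx pieces fit together: OpenRouter exposes an OpenAI-compatible chat completions endpoint, so a single async POST request is enough to query any supported model. The sketch below is illustrative only; the function name and error handling are not taken from the project's `llm_service.py`.

```python
import httpx

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"


async def ask_llm(api_key: str, model: str, prompt: str) -> str:
    """Send one prompt to OpenRouter and return the model's reply text."""
    payload = {
        "model": model,  # e.g. the slug chosen via /model
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {"Authorization": f"Bearer {api_key}"}
    async with httpx.AsyncClient(timeout=60) as client:
        response = await client.post(OPENROUTER_URL, json=payload, headers=headers)
        response.raise_for_status()
        data = response.json()
    return data["choices"][0]["message"]["content"]
```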
To run this project locally, you will need Python 3.13+, `uv`, and Docker installed.
```bash
git clone https://github.com/heliomancer/aplmint.git
cd aplmint
```
This project uses `uv` for managing the virtual environment and dependencies.
```bash
# Create the virtual environment
uv venv

# Activate the environment
source .venv/bin/activate

# Install the required packages
uv pip install -r requirements.txt
```
The bot requires API keys to function. These are managed through a `.env` file.
- Create a file named `.env` in the root of the project directory.
- Add your secret keys to this file. You will need:
  - A Telegram Bot Token from @BotFather.
  - An OpenRouter API Key from openrouter.ai.
```env
# .env file content
TELEGRAM_BOT_TOKEN="123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11"
OPENROUTER_API_KEY="sk-or-your-long-open-router-api-key"
```
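For reference, loading these secrets in `config.py` can be as simple as reading the `.env` file into the process environment, for example with python-dotenv. This is a minimal sketch under that assumption; the project's actual loading mechanism may differ.

```python
import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # reads KEY=value pairs from .env into the process environment

TELEGRAM_BOT_TOKEN = os.getenv("TELEGRAM_BOT_TOKEN")
OPENROUTER_API_KEY = os.getenv("OPENROUTER_API_KEY")

if not TELEGRAM_BOT_TOKEN or not OPENROUTER_API_KEY:
    raise RuntimeError("TELEGRAM_BOT_TOKEN and OPENROUTER_API_KEY must be set in .env")
```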
Ensure your virtual environment is activated and the `.env` file is configured.
Run the bot as a Python module from the project's root directory:
```bash
python -m src.bot
```
You should see log output indicating the database has been initialized and the bot is running. You can now interact with it on Telegram.
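To illustrate the `/model` selection flow described in the features above, a stripped-down handler pair built on python-telegram-bot's inline keyboards might look like the following. The model list, the use of `context.user_data` as storage, and the handler names are illustrative; the project's real handlers live in `src/handlers.py`.

```python
from telegram import InlineKeyboardButton, InlineKeyboardMarkup, Update
from telegram.ext import Application, CallbackQueryHandler, CommandHandler, ContextTypes

MODELS = ["openai/gpt-4o-mini", "anthropic/claude-3.5-sonnet"]  # illustrative slugs


async def model_command(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    """Show an inline keyboard with one button per available model."""
    keyboard = [[InlineKeyboardButton(m, callback_data=m)] for m in MODELS]
    await update.message.reply_text(
        "Choose a model:", reply_markup=InlineKeyboardMarkup(keyboard)
    )


async def model_chosen(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    """Remember the chosen model so later prompts default to it."""
    query = update.callback_query
    await query.answer()
    context.user_data["model"] = query.data
    await query.edit_message_text(f"Model set to {query.data}")


def build_app(token: str) -> Application:
    """Wire the handlers into a python-telegram-bot Application."""
    app = Application.builder().token(token).build()
    app.add_handler(CommandHandler("model", model_command))
    app.add_handler(CallbackQueryHandler(model_chosen))
    return app
```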
The included `Dockerfile` allows you to build a self-contained image of the bot for easy deployment.
From the project's root directory, run:
```bash
docker build -t aplmint .
```
Run the image as a detached container, passing in the `.env` file to provide the necessary secrets.
```bash
docker run -d --name my-telegram-bot --env-file .env aplmint
```
Useful Docker commands:
- To view the logs of the running container: `docker logs my-telegram-bot`
- To follow the logs in real time: `docker logs -f my-telegram-bot`
- To stop the container: `docker stop my-telegram-bot`
- To remove the stopped container: `docker rm my-telegram-bot`
The project follows a clean, modular structure within the `src/` directory.
```
aplmint/
├── src/
│   ├── __init__.py     # Makes 'src' a Python package
│   ├── bot.py          # Main entry point: initializes and runs the bot
│   ├── config.py       # Loads configuration and secrets
│   ├── database.py     # All SQLite database logic
│   ├── handlers.py     # All Telegram command and message handlers
│   └── llm_service.py  # Logic for communicating with the LLM API
├── .env.example        # Example environment file
├── .gitignore
├── Dockerfile
└── README.md
```
- Implement conversation context/memory for each user.
- Store conversation history in the SQLite database for persistence.
- Add a token counting mechanism for more precise history management.
- Introduce a system for user-defined system prompts.
This project was created as part of a Python course.