TF-Agents: A reliable, scalable and easy to use TensorFlow library for Contextual Bandits and Reinforcement Learning.
A lightweight contextual bandit & reinforcement learning library designed to be used in production Python services.
🐯 Replica of "Auction-based combinatorial multi-armed bandit mechanisms with strategic arms"
Deep contextual bandits in PyTorch: Neural Bandits, Neural Linear, and Linear Full Posterior Sampling with comprehensive benchmarking on synthetic and real datasets
Thompson Sampling for Bandits using UCB policy
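Several entries above mention Thompson Sampling and UCB; these are distinct exploration strategies. As a point of reference, here is a minimal, self-contained sketch of Beta-Bernoulli Thompson Sampling on a simulated bandit; the arm means and horizon are illustrative and not taken from any repository listed here:

```python
import numpy as np

rng = np.random.default_rng(0)
true_means = np.array([0.2, 0.5, 0.7])   # illustrative Bernoulli arm probabilities
k, horizon = len(true_means), 2000

# Beta-Bernoulli Thompson Sampling: keep a Beta(alpha, beta) posterior per arm,
# sample a plausible mean for each arm, and pull the arm with the largest sample.
alpha, beta = np.ones(k), np.ones(k)
for _ in range(horizon):
    arm = int(np.argmax(rng.beta(alpha, beta)))
    reward = float(rng.random() < true_means[arm])
    alpha[arm] += reward
    beta[arm] += 1.0 - reward

print("posterior means:", alpha / (alpha + beta))
```

A UCB1 agent would instead deterministically pull the arm maximizing the empirical mean plus a sqrt(2 ln t / n_i) confidence bonus, rather than sampling from a posterior as above.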
Python library of bandits and RL agents in different real-world environments
Python implementation of common RL algorithms using OpenAI gym environments
Code for our PRICAI 2022 paper: "Online Learning in Iterated Prisoner's Dilemma to Mimic Human Behavior".
Multi-armed Bandit Gymnasium Environment
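For orientation, a multi-armed bandit under the Gymnasium API typically reduces to one-step episodes with one discrete action per arm. A minimal sketch, assuming only the standard gymnasium interface; the class name and arm probabilities are illustrative, not taken from the repository above:

```python
import numpy as np
import gymnasium as gym
from gymnasium import spaces

class BernoulliBanditEnv(gym.Env):
    """Illustrative k-armed Bernoulli bandit following the Gymnasium API."""

    def __init__(self, arm_probs=(0.1, 0.5, 0.8)):
        self.arm_probs = np.asarray(arm_probs)
        self.action_space = spaces.Discrete(len(arm_probs))
        # Bandits are stateless; a constant dummy observation suffices.
        self.observation_space = spaces.Discrete(1)

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        return 0, {}

    def step(self, action):
        reward = float(self.np_random.random() < self.arm_probs[action])
        # One-step episodes: every pull terminates immediately.
        return 0, reward, True, False, {}
```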
Code for our paper "Bandits with Preference Feedback: A Stackelberg Game Perspective"
This project provides a simulation of multi-armed bandit problems, based on the following paper: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0322757
Code and data for the paper "A Combinatorial Multi-Armed Bandit Approach to Correlation Clustering", DAMI 2023
Play Rock, Paper, Scissors (Kaggle competition) with Reinforcement Learning: bandits, tabular Q-learning and PPO with LSTM.
Repository for the course project done as part of CS-747 (Foundations of Intelligent & Learning Agents) at IIT Bombay in Autumn 2022.
Implementation of the prophet inequalities
A study of the interplay between communication and feedback in a cooperative online learning setting.
Foundations of Intelligent and Learning Agents
Python package (work in progress) implementing sampling policies (known and new) for (contextual) bandits and valid statistical inference methods