TF-Agents: A reliable, scalable and easy to use TensorFlow library for Contextual Bandits and Reinforcement Learning.
Materials for the Practical Sessions of the Reinforcement Learning Summer School 2019: Bandits, RL & Deep RL (PyTorch).
A lightweight contextual bandit & reinforcement learning library designed to be used in production Python services.
Another A/B test library
A lightweight contextual bandit library for TypeScript/JavaScript
Code associated with the NeurIPS19 paper "Weighted Linear Bandits in Non-Stationary Environments"
🐯 Replication of "Auction-based combinatorial multi-armed bandit mechanisms with strategic arms"
Thompson Sampling for Bandits using UCB policy
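Several of the entries above mention Thompson Sampling. As a rough, self-contained illustration (not code from any listed repository), a Beta-Bernoulli Thompson Sampling loop can be sketched as below; the arm probabilities, horizon, and function name are assumptions made for the example.

```python
import numpy as np

def thompson_sampling(probs, horizon=1000, seed=0):
    """Beta-Bernoulli Thompson Sampling on a k-armed bandit (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    k = len(probs)
    successes, failures = np.ones(k), np.ones(k)   # Beta(1, 1) priors per arm
    total_reward = 0.0
    for _ in range(horizon):
        theta = rng.beta(successes, failures)      # sample a plausible win-rate per arm
        arm = int(np.argmax(theta))                # play the arm with the best sample
        reward = float(rng.random() < probs[arm])  # Bernoulli reward
        successes[arm] += reward
        failures[arm] += 1.0 - reward
        total_reward += reward
    return total_reward

print(thompson_sampling([0.2, 0.5, 0.8]))  # assumed arm probabilities, for illustration only
```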
Deep contextual bandits in PyTorch: Neural Bandits, Neural Linear, and Linear Full Posterior Sampling with comprehensive benchmarking on synthetic and real datasets
Python library of bandits and RL agents in different real-world environments
Python implementation of common RL algorithms using OpenAI gym environments
Code for our PRICAI 2022 paper: "Online Learning in Iterated Prisoner's Dilemma to Mimic Human Behavior".
Collaborative project for documenting ML/DS learnings.
Multi-armed Bandit Gymnasium Environment
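To show what a bandit environment built on the standard Gymnasium `Env` API typically looks like (this is a minimal sketch, not the repository's actual code), here is a k-armed Bernoulli bandit; the class name and reward probabilities are assumptions for the example.

```python
import gymnasium as gym
import numpy as np
from gymnasium import spaces

class BernoulliBanditEnv(gym.Env):
    """Minimal k-armed Bernoulli bandit as a Gymnasium environment (illustrative sketch)."""

    def __init__(self, probs=(0.2, 0.5, 0.8)):
        super().__init__()
        self.probs = np.asarray(probs, dtype=float)
        self.action_space = spaces.Discrete(len(self.probs))
        # Bandits are stateless; expose a constant dummy observation.
        self.observation_space = spaces.Discrete(1)

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        return 0, {}

    def step(self, action):
        reward = float(self.np_random.random() < self.probs[action])
        # One-step episodes: each arm pull terminates immediately.
        return 0, reward, True, False, {}

env = BernoulliBanditEnv()
obs, info = env.reset(seed=0)
obs, reward, terminated, truncated, info = env.step(env.action_space.sample())
```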
Code for our ICDMW 2018 paper: "Contextual Bandit with Adaptive Feature Extraction".
Deep Reinforcement Learning agents in PyTorch, built as a modular framework
C++ implementation of Multi-Armed Bandits (Gaussian and Bernoulli)
Code for our paper "Bandits with Preference Feedback: A Stackelberg Game Perspective"
Code for our AJCAI 2020 paper: "Online Semi-Supervised Learning in Contextual Bandits with Episodic Reward".
This project provides a simulation of multi-armed bandit problems. The implementation is based on the following paper: https://arxiv.org/abs/2308.14350
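Several entries above also cover index-based policies such as UCB. As a rough point of reference (not taken from any listed repository), a plain UCB1 loop on a Bernoulli bandit can be sketched as follows; arm probabilities and horizon are assumed values.

```python
import numpy as np

def ucb1(probs, horizon=1000, seed=0):
    """UCB1 on a k-armed Bernoulli bandit (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    k = len(probs)
    counts = np.zeros(k)
    means = np.zeros(k)
    total_reward = 0.0
    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1                                   # play each arm once to initialise
        else:
            bonus = np.sqrt(2.0 * np.log(t) / counts)     # exploration bonus
            arm = int(np.argmax(means + bonus))
        reward = float(rng.random() < probs[arm])
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]  # incremental mean update
        total_reward += reward
    return total_reward

print(ucb1([0.2, 0.5, 0.8]))  # assumed arm probabilities, for illustration only
```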