Classic papers and resources on recommendation
Updated Oct 16, 2025 - Python
OpenDILab Decision AI Engine. The Most Comprehensive Reinforcement Learning Framework B.P.
Python implementations of contextual bandits algorithms
Code to reproduce the experiments in Sample Efficient Reinforcement Learning via Model-Ensemble Exploration and Exploitation (MEEE).
PyTorch implementation of the ICML 2018 paper "Self-Imitation Learning".
Code for the NeurIPS 2022 paper "Exploiting Reward Shifting in Value-Based Deep RL".
The official code release for Provable and Practical: Efficient Exploration in Reinforcement Learning via Langevin Monte Carlo, ICLR 2024.
The official code release for "Langevin Soft Actor-Critic: Efficient Exploration through Uncertainty-Driven Critic Learning", ICLR 2025
A short implementation of bandit algorithms - ETC, UCB, MOSS and KL-UCB
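The ETC, UCB, MOSS, and KL-UCB algorithms listed above all build on the optimism-in-the-face-of-uncertainty principle. A minimal UCB1 sketch of that idea (my own illustration, not code from the repository; the `pull` callback and variable names are assumptions):

```python
import math
import random

def ucb1(pull, n_arms, horizon):
    """Minimal UCB1: play each arm once, then repeatedly play the arm with
    the largest empirical mean plus a confidence bonus."""
    counts = [0] * n_arms   # pulls per arm
    sums = [0.0] * n_arms   # cumulative reward per arm
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1     # initialization round: try every arm once
        else:
            arm = max(
                range(n_arms),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2.0 * math.log(t) / counts[a]),
            )
        counts[arm] += 1
        sums[arm] += pull(arm)
    return counts

# Usage: two Bernoulli arms with means 0.2 and 0.8; the better arm
# should accumulate the vast majority of the 2000 pulls.
random.seed(0)
means = [0.2, 0.8]
counts = ucb1(lambda a: 1.0 if random.random() < means[a] else 0.0, 2, 2000)
```

The square-root bonus shrinks as an arm is pulled more often, so under-explored arms are periodically retried even when their empirical mean looks poor.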
Research Thesis - Reinforcement Learning
The official code release for "More Efficient Randomized Exploration for Reinforcement Learning via Approximate Sampling", Reinforcement Learning Conference (RLC) 2024
The GitHub repository for "Accelerating Approximate Thompson Sampling with Underdamped Langevin Monte Carlo", AISTATS 2024.
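The Langevin Monte Carlo repositories above approximate posterior sampling when exact sampling is intractable. In the conjugate Beta-Bernoulli case, Thompson sampling can be written exactly in a few lines (an illustrative sketch under that assumption, not code from those releases):

```python
import random

def thompson_bernoulli(pull, n_arms, horizon):
    """Beta-Bernoulli Thompson sampling: draw one sample from each arm's
    Beta posterior and play the argmax. Here the posterior is conjugate,
    so sampling is exact; the Langevin-based methods approximate this
    step for general posteriors."""
    alpha = [1.0] * n_arms  # Beta(1, 1) uniform priors
    beta = [1.0] * n_arms
    counts = [0] * n_arms
    for _ in range(horizon):
        samples = [random.betavariate(alpha[a], beta[a]) for a in range(n_arms)]
        arm = samples.index(max(samples))
        r = pull(arm)        # Bernoulli reward in {0, 1}
        alpha[arm] += r
        beta[arm] += 1 - r
        counts[arm] += 1
    return counts

# Usage: the arm with mean 0.8 should dominate the pull counts.
random.seed(0)
means = [0.2, 0.8]
counts = thompson_bernoulli(lambda a: 1 if random.random() < means[a] else 0, 2, 2000)
```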
Over-parameterization = exploration?
Uses GloVe embeddings + bandit-style exploration/exploitation with adaptive diversification, UCB-driven cluster search, and stagnation recovery.
Comparison of two methods for handling the exploration-exploitation dilemma in multi-armed bandits.
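A common baseline in such bandit comparisons is epsilon-greedy, which explores uniformly at random with a small probability and otherwise exploits the best empirical mean. A minimal sketch assuming Bernoulli arms (illustrative only; not code from the repository):

```python
import random

def epsilon_greedy(pull, n_arms, horizon, eps=0.1):
    """Epsilon-greedy: with probability eps pick a uniformly random arm,
    otherwise pick the arm with the best empirical mean so far."""
    counts = [0] * n_arms
    sums = [0.0] * n_arms
    for _ in range(horizon):
        if random.random() < eps or 0 in counts:
            arm = random.randrange(n_arms)   # explore (or finish initialization)
        else:
            arm = max(range(n_arms), key=lambda a: sums[a] / counts[a])
        counts[arm] += 1
        sums[arm] += pull(arm)
    return counts

# Usage: with means 0.2 and 0.8, exploitation concentrates pulls on arm 1.
random.seed(1)
means = [0.2, 0.8]
counts = epsilon_greedy(lambda a: 1.0 if random.random() < means[a] else 0.0, 2, 2000)
```

Unlike UCB, the exploration rate here is constant, so epsilon-greedy keeps paying a fixed exploration cost even after the best arm is clearly identified.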
A reinforcement learning project where a snake learns to navigate and survive in a dynamic environment through Q-learning.
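Tabular Q-learning, as used by such snake agents, reduces to one update rule: Q(s, a) += alpha * (r + gamma * max_a' Q(s', a') - Q(s, a)). A minimal sketch on a toy two-state chain (my own illustration, not the project's code):

```python
def q_learning_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step: move Q(s, a) toward the bootstrapped
    target r + gamma * max over next-state actions."""
    best_next = max(Q[s_next])
    Q[s][a] += alpha * (r + gamma * best_next - Q[s][a])

# Toy chain: action 1 in state 0 yields reward 1 and lands in terminal
# state 1, whose Q-values stay zero. Q[0][1] converges toward 1.0.
Q = {s: [0.0, 0.0] for s in (0, 1)}
for _ in range(100):
    q_learning_update(Q, s=0, a=1, r=1.0, s_next=1)
```

In a real snake environment the state would encode the board (or local features) and exploration would typically use an epsilon-greedy policy over these Q-values.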
A systematic parameter study of exploration–exploitation trade-offs in an Active Inference agent under varying precision and sensory noise.
Deep Intrinsically Motivated Exploration in Continuous Control