A clean PyTorch implementation of PPO, SAC, and TD3 made from scratch. It is built for testing and comparing continuous control RL algorithms on complex environments such as BipedalWalker-v3.
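Several of the repositories below center on PPO's clipped surrogate objective. As a point of reference (not code from any listed repo), a minimal PyTorch sketch of that loss looks like:

```python
import torch

def ppo_clip_loss(log_probs, old_log_probs, advantages, clip_eps=0.2):
    """PPO clipped surrogate loss (returned as a value to minimize).

    log_probs:     log pi_theta(a|s) under the current policy
    old_log_probs: log pi_theta_old(a|s) from the rollout policy
    advantages:    advantage estimates (e.g. from GAE)
    """
    # Probability ratio r_t(theta) = pi_theta / pi_theta_old
    ratio = torch.exp(log_probs - old_log_probs)
    unclipped = ratio * advantages
    # Clip the ratio to [1 - eps, 1 + eps] to limit the policy update
    clipped = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    # Negate because optimizers minimize; PPO maximizes the surrogate
    return -torch.min(unclipped, clipped).mean()

# Example: identical policies (ratio = 1) leave the objective unclipped
loss = ppo_clip_loss(torch.zeros(2), torch.zeros(2), torch.tensor([1.0, -1.0]))
```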
A collection of reinforcement learning projects exploring different algorithms in various simulated environments.
Quadruped robot dog walking simulation and RL policy training
Implemented Behavior Cloning, DAgger, Double Q-Learning, Dueling DQN, and Proximal Policy Optimization (PPO) in a simulated environment and analyzed/compared their performance in terms of efficiency, stability, and generalization.
Clean DQN-to-PPO reinforcement learning implementations in PyTorch
A novel approach to solve Contextual Reinforcement Learning
3D container loading optimization using classical RL (PPO, A3C) and quantum-classical hybrid RL (Quantum PPO, Quantum A3C) with VQE actor and QAOA critic circuit
Implementations of modern machine-learning papers, including PPO, PPG, and POP3D
An implementation of PPO based on openai/baselines, but more readable.
A smart traffic-light management project using the PPO algorithm and SUMO.
Implementation of policy based reinforcement learning algorithms
A prototype RAG system that explains CatBoost code in housing data use-cases.
Final Project for CSE592- Foundations of AI
Deep RL agent that solves LunarLander-v2 using PPO.
ROS 2 Jazzy + Gazebo Sim + PPO project for autonomous mobile robot navigation using reinforcement learning
Deep Reinforcement Learning agent that learns to play League of Legends
Project work for Autonomous and Adaptive Systems, UNIBO 2022
Unofficial PyTorch reproduction of MSPM: A Modularized and Scalable Multi-Agent RL System for Portfolio Management (PLOS ONE, 2022)