
Voxel SERL

Contributions

  • robot_controllers: Impedance controller for the UR5 robot arm
  • box_picking_env: Environment setup for the box-picking task
  • vision: Point-cloud based encoders
  • utils: Point-cloud fusion and voxelization
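The fusion and voxelization step in utils can be illustrated with a minimal sketch: fused camera points are binned into a fixed occupancy grid that the point-cloud encoders consume. The grid size, workspace bounds, and binary occupancy encoding below are assumptions for illustration, not the repo's actual API.

```python
import numpy as np

def voxelize(points, grid_size=32, bounds=(-0.5, 0.5)):
    """Convert an (N, 3) point cloud into a binary occupancy voxel grid."""
    lo, hi = bounds
    # normalize points into [0, 1) over the assumed workspace bounds
    norm = (np.asarray(points) - lo) / (hi - lo)
    idx = np.floor(norm * grid_size).astype(int)
    # drop points that fall outside the grid
    mask = np.all((idx >= 0) & (idx < grid_size), axis=1)
    idx = idx[mask]
    grid = np.zeros((grid_size,) * 3, dtype=np.float32)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = 1.0
    return grid
```

Fusing first (concatenating the clouds from all cameras in a common frame) and voxelizing once keeps the encoder input shape fixed regardless of how many cameras are active.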

Quick start guide for box picking with a UR5 robot arm

Without cameras (TODO modify the bash files)

  1. Follow the installation instructions in the official SERL repo.
  2. Check envs and either use the provided box_picking_env or set up a new environment using the provided one as a template. (New environments have to be registered here.)
  3. Use the config file to configure all robot-arm-specific parameters, as well as gripper and camera information.
  4. Go to the box picking folder and modify the bash files run_learner.py and run_actor.py. If no images are used, set camera_mode to none. WandB logging can be deactivated by setting debug to True.
  5. Record 20 demonstrations using record_demo.py in the same folder. Double-check that camera_mode and all environment wrappers are identical to those in drq_policy.py.
  6. Execute run_learner.py and run_actor.py simultaneously to start RL training.
  7. To evaluate a policy, modify and execute run_evaluation.py with the desired checkpoint path and step.
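Step 6 above can be sketched with Python's subprocess module: the learner runs in the background while the actor collects transitions in the foreground. The two script names come from the guide; the launcher function itself is illustrative, and running the two bash files in separate terminals achieves the same thing.

```python
import subprocess

def launch_training(learner_cmd=("python", "run_learner.py"),
                    actor_cmd=("python", "run_actor.py")):
    """Start the learner in the background, run the actor, then clean up."""
    learner = subprocess.Popen(list(learner_cmd))  # background learner process
    try:
        # the actor blocks until training is stopped or it exits on its own
        return subprocess.run(list(actor_cmd)).returncode
    finally:
        learner.terminate()
        learner.wait()
```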

Modality examples

TODOs

  • improve readme
  • add paper link
  • document how to use in a real setting

About

SERL2025

Languages

  • Python 100.0%