Stars: 4 results for source starred repositories
Llama-style transformer in PyTorch with multi-node / multi-GPU training. Includes pretraining, fine-tuning, DPO, LoRA, and knowledge distillation. Scripts for dataset mixing and training from scratch.
An implementation for training the HiFi-GAN component of the XTTSv2 model using Coqui/TTS.