Welcome to the Llama Cookbook! This is your go-to guide for building with Llama: getting started with inference, fine-tuning, and RAG. We also show you how to solve end-to-end problems using Llama mode…
Code for "TabZilla: When Do Neural Nets Outperform Boosted Trees on Tabular Data?"
Code for the paper "LLark: A Multimodal Instruction-Following Language Model for Music" by Josh Gardner, Simon Durand, Daniel Stoller, and Rachel Bittner.
Audio captioning - DCASE challenge 2023 task 6a
A benchmark for distribution shift in tabular data
Audiocraft is a library for audio processing and generation with deep learning. It features the state-of-the-art EnCodec audio compressor/tokenizer, along with MusicGen, a simple and controllable…
Tools and tutorials for the OpenMIC-2018 dataset.
Best theme for Pelican Static Blog Generator
Perform transfer learning for MIR using Jukebox!
[ICLR 2024] Fine-tuning LLaMA to follow Instructions within 1 Hour and 1.2M Parameters
Repository for Multimodal AutoML Benchmark
Open-sourced codes for MiniGPT-4 and MiniGPT-v2 (https://minigpt-4.github.io, https://minigpt-v2.github.io/)
Instruct-tune LLaMA on consumer hardware
PyTorch implementation of the TabNet paper: https://arxiv.org/pdf/1908.07442.pdf
An open-source framework for training large multimodal models.
High-speed download of LLaMA, Facebook's 65B-parameter GPT model
A repo for transfer learning with deep tabular models
Code for "Language-Interfaced Fine-Tuning for Non-Language Machine Learning Tasks."
Research on Tabular Deep Learning: Papers & Packages
Example of how to use Weights & Biases on Slurm