# AdapterHub Notebooks

Here you can find a collection of notebooks for AdapterHub and the `adapters` library.

The tables below list the notebooks provided by AdapterHub in this folder, illustrating the different ways of working with adapters.

As `adapters` is fully compatible with Hugging Face Transformers, you can also use the large collection of official and community notebooks available there: 🤗 Transformers Notebooks.

## Chapter 1: The Basics

| Notebook | Description |
|---|---|
| 1: Training an Adapter | How to train a task adapter for a Transformer model. |
| 2: Using Adapters for Inference | How to download and use pre-trained adapters from AdapterHub. |

## Chapter 2: Modularity & Composition

| Notebook | Description |
|---|---|
| 3: Adapter Fusion | How to combine multiple pre-trained adapters on a new task using `Fuse` composition. |
| 4: Cross-lingual Transfer | How to perform zero-shot cross-lingual transfer between tasks using the MAD-X setup (`Stack` composition). |
| 5: Parallel Adapter Inference | How to use the `Parallel` composition block for inference. |
| 6: Adapter Merging and Task Arithmetics | How to merge multiple adapters into a new one via task arithmetic. |
| 7: Complex Adapter Configuration | How to flexibly combine multiple adapter methods in complex setups using `ConfigUnion`. |

## Chapter 3: Additional Notebooks

| Notebook | Description |
|---|---|
| Text Generation | How to train an adapter for language generation. |
| QLoRA Llama Finetuning | How to finetune a quantized Llama model using QLoRA. |
| Training a NER Adapter | How to train an adapter on a named entity recognition task. |
| Adapter Drop Training | How to train an adapter using AdapterDrop. |
| Inference Example for id2label | How to use the `id2label` dictionary for inference. |
| NER on WikiANN | How to evaluate adapters on NER with the WikiANN dataset. |
| Finetuning Whisper with Adapters | How to fine-tune Whisper using LoRA. |
| Adapter Training with ReFT | How to fine-tune using ReFT adapters. |