🇺🇦
declare variables not war
- Fort Collins, Colorado (UTC -07:00)
- https://duskvirkus.com/links.html
Stars: transformers (6 repositories)
- microsoft/unilm: Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
- yandex/YaLM-100B: Pretrained language model with 100B parameters
- ThilinaRajapakse/simpletransformers: Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
- adapter-hub/adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning
- lucidrains/x-transformers: A concise but complete full-attention transformer with a set of promising experimental features from various papers