- University of Sheffield
- jasivan.github.io
- @jasivan_s
- in/jasivan-s-0b29a97a
Stars
Deep-learning transfer learning models of the NTUA-SLP team, submitted to the IEST of WASSA 2018 at EMNLP 2018.
mourga / awd-lstm-lm
Forked from salesforce/awd-lstm-lm
LSTM and QRNN Language Model Toolkit for PyTorch 1.2.0!
PyTorch implementation of Variational LSTM and Monte Carlo dropout.
Source code for the ACL 2019 paper "Attention-based Conditioning Methods for External Knowledge Integration"
Code for evaluating uncertainty estimation methods for Transformer-based architectures in natural language understanding tasks.
Code for the EMNLP 2021 paper "Active Learning by Acquiring Contrastive Examples" and the ACL 2022 paper "On the Importance of Effectively Adapting Pretrained Language Models for Active Learning".
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal domains, for both inference and training.