Stars
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal tasks, for both inference and training.
Open Source Neural Machine Translation and (Large) Language Models in PyTorch
Unsupervised Word Segmentation for Neural Machine Translation and Text Generation
A PyTorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
A Structured Self-attentive Sentence Embedding
[ICSE 2021] - InferCode: Self-Supervised Learning of Code Representations by Predicting Subtrees
The implementation of the IJCAI 2018 paper: Code Completion with Neural Attention and Pointer Networks