Stars
Processing the OntoNotes 5.0 corpus
Chinese NER using Lattice LSTM. Code for ACL 2018 paper.
Dataset for the Emerging & Novel Entity NER task (WNUT '17)
ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Papers in the natural language processing field (with reading notes), model reproductions, data processing, and more (code available in both TensorFlow and PyTorch versions)
Example codes in the medium post titled "Optuna meets Weights and Biases."
GuwenModels: A collection of Classical Chinese natural language processing models and resources gathered from across the Internet.
CLUENER2020: Chinese Fine-Grained Named Entity Recognition
🧑‍🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), ga…
Materials accompanying "Deep Learning with PyTorch: A 60 Minute Blitz"
An easy/swift-to-adapt PyTorch Lightning template. A wrapper template, simple to use: with only minor changes to your original PyTorch code, you can adapt it to Lightning. You can translate your previous PyTorch code much easier using this template, and keep your freedom to edit a…
This project covers the knowledge points and code implementations commonly asked in Machine Learning, Deep Learning, and NLP interviews — the theoretical fundamentals every algorithm engineer should master.
Interview notes for 2018/2019 campus recruiting (spring/autumn) covering Natural Language Processing (NLP), Deep Learning, Machine Learning, C/C++, and Python; also includes questions from every ML/DL interview write-up the author has seen. Beyond DL/ML, other computer science topics relevant to algorithm roles are recorded as well, but questions specific to roles such as front-end, testing, Java, or Android are not included.
Scripts and links to recreate the ELI5 dataset.
This repository contains the NarrativeQA dataset. It includes the list of documents with Wikipedia summaries, links to full stories, and questions and answers.
Source code for the transferable dialogue state generator (TRADE; Wu et al., 2019). https://arxiv.org/abs/1905.08743
SUMBT: Slot-Utterance Matching for Universal and Scalable Belief Tracking (ACL 2019)
Zero-shot dialogue state tracking (DST)
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal domains, for both inference and training.