- Nanjing University
- Nanjing, Jiangsu, China
- http://weibo.com/mmmoqi
Stars
The best way to write secure and reliable applications. Write nothing; deploy nowhere.
Making large AI models cheaper, faster and more accessible
Code for the paper "Language Models are Unsupervised Multitask Learners"
Fast and memory-efficient exact attention
Microsoft Cognitive Toolkit (CNTK), an open-source deep-learning toolkit
LAVIS - A One-stop Library for Language-Vision Intelligence
Open Source Neural Machine Translation and (Large) Language Models in PyTorch
Keras implementation of Transformers for humans
TensorFlow port of Image-to-Image Translation with Conditional Adversarial Nets https://phillipi.github.io/pix2pix/
An Open-Source Package for Neural Relation Extraction (NRE)
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale pre-trained Chinese ALBERT models
Implementation of BERT that can load official pre-trained models for feature extraction and prediction
Code for the paper "Fine-tune BERT for Extractive Summarization"
SwissArmyTransformer is a flexible and powerful library to develop your own Transformer variants.
Code for the paper "DeepType: Multilingual Entity Linking by Neural Type System Evolution"
Virtual-Taobao simulators with an OpenAI Gym interface
Distantly supervised relation extraction models: PCNN+MIL (Zeng, 2015) and PCNN+ATT (Lin, 2016)
Implementation with some extensions of the paper "Snowball: Extracting Relations from Large Plain-Text Collections" (Agichtein and Gravano, 2000)
Exposing DeepFake Videos By Detecting Face Warping Artifacts
Code for training a Neural Open IE model (NAACL 2018)
Chinese Open Information Extraction (Tree-based Triple Relation Extraction Module)
Framework for converting QA-SRL to Open IE and evaluating Open IE parsers.
AFET: Automatic Fine-Grained Entity Typing (EMNLP'16)