Stars
Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
Comment opinion extraction and aspect-level sentiment analysis models built with PaddleNLP, plus an aspect-level sentiment analysis Web system on a decoupled front-end/back-end architecture; fine-grained sentiment analysis helps users and merchants make better decisions.
[WWW'23] TFE-GNN: A Temporal Fusion Encoder Using Graph Neural Networks for Fine-grained Encrypted Traffic Classification
Recommendation system paper implementations, including sequence recommendation, multi-task learning, meta-learning, etc.
A deep learning network anomaly detection system.
Code for the AAAI'23 paper "Yet Another Traffic Classifier: A Masked Autoencoder Based Traffic Transformer with Multi-Level Flow Representation"
Low-Quality Training Data Only? A Robust Framework for Detecting Encrypted Malicious Network Traffic
A machine-learning-based malicious encrypted traffic identification system.
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
Use ChatGPT to summarize arXiv papers, accelerating the full research workflow: full-text summarization, professional translation, polishing, reviewing, and review responses.
Code implementation for the paper "A Novel Multimodal Deep Learning Framework for Encrypted Traffic Classification"
Source code for the AAAI 2022 paper "Unified Named Entity Recognition as Word-Word Relation Classification"
Graph Convolutional Networks for Text Classification. AAAI 2019
Meta-BERT: Learning to Learn fast For Low-Resource Text Classification
SOTA punctuation restoration model (e.g., for automatic speech recognition output), a deep learning model based on pre-trained BERT
Code for the paper "How to Fine-Tune BERT for Text Classification?"
A BERT-CNN-LSTM model for punctuation restoration
Chinese text classification using CNN/RNN layers on top of Google BERT fine-tuning.
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"