HKUST-GZ · Guangzhou, China · https://dblp.org/pid/301/6349.html
Stars
Perform data science on data that remains in someone else's server
[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Supports Llama-3/3.1, Llama-2, LLaMA, BLOOM, Vicuna, Baichuan, TinyLlama, etc.
A collection of implementations of deep domain adaptation algorithms
This is an open-source toolkit for Heterogeneous Graph Neural Networks (OpenHGNN) based on DGL.
[arXiv] Discrete Diffusion in Large Language and Multimodal Models: A Survey
An Open-Source Toolkit for Heterogeneous Information Network Embedding (HINE)
An automated scoring function to facilitate and standardize the evaluation of goal-directed generative models for de novo molecular design
[NeurIPS 2022] A Fast Post-Training Pruning Framework for Transformers
Repository for SMILES-based RNNs for reinforcement learning-based de novo molecule generation
Token-Mol 1.0: tokenized drug design with large language model
Due to the huge vocabulary size (151,936) of Qwen models, the embedding and LM head weights are excessively heavy. Therefore, this project provides a tokenizer-vocabulary shearing solution for Qwen…
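The vocabulary-shearing idea above can be sketched in plain Python: keep only the embedding rows for the token ids you actually need, and remap old ids to new ones. The function name, shapes, and list-of-lists representation are illustrative assumptions, not the project's actual API.

```python
# Minimal sketch of vocabulary "shearing" (illustrative, not the project's API):
# slice the embedding matrix down to the kept token ids, which shrinks both
# the embedding and any tied LM head weights proportionally.

def shear_vocab(embedding, kept_ids):
    """Return a smaller embedding (rows for kept_ids only) plus an
    old-id -> new-id mapping for re-encoding the tokenizer."""
    new_embedding = [embedding[i] for i in kept_ids]
    remap = {old: new for new, old in enumerate(kept_ids)}
    return new_embedding, remap

# Toy example: a 6-token vocabulary with 2-dimensional embeddings.
emb = [[float(i), float(i) + 0.5] for i in range(6)]
small_emb, remap = shear_vocab(emb, kept_ids=[0, 2, 5])
```

On a real Qwen checkpoint the same row-selection would apply to the 151,936-row embedding and LM head tensors, so the saved memory scales with the fraction of the vocabulary dropped.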
Structured Neuron Level Pruning to compress Transformer-based models [ECCV'24]