Stars
Awesome Datasets for Music Recommendation
A beginner's tutorial on model compression; PDF download at https://github.com/datawhalechina/awesome-compression/releases
Build, evaluate, and train general multi-agent assistants with ease
This is the official repository of the paper "FunReason: Enhancing Large Language Models’ Function Calling via Self-Refinement Multiscale Loss and Automated Data Refinement"
Benchmark dataset for the paper "Towards Next-Generation Recommender Systems: A Benchmark for Personalized Recommendation Assistant with LLMs"
😎 A curated list of tensor decomposition resources for model compression (a toy low-rank compression sketch follows this list).
A flexible Federated Learning framework based on PyTorch, simplifying your Federated Learning research (a toy FedAvg aggregation sketch follows this list).
A curated list for Efficient Large Language Models
FlashMLA: Efficient Multi-head Latent Attention Kernels
Streaming Factor Trajectory Learning for Temporal Tensor Decomposition (NeurIPS 2023)
A new markup-based typesetting system that is powerful and easy to learn.
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Intro to Quantum Computing Website
Hands-on programming tutorials for the "Dive into LLMs" (《动手学大模型》) series
Repository for the Paper "Multi-LoRA Composition for Image Generation"
EasyTPP: Towards Open Benchmarking Temporal Point Processes
Python for Tensor Network: Tutorial. The lecture videos (in Chinese) can be found at https://space.bilibili.com/401005433
(WWW'24 + LinkedIn) The first recommender system (RS) that tightly combines an LLM with an ID-based RS
Chinese NLP solutions (large models, data, models, training, inference)
Large Language Model-enhanced Recommender System Papers
Implementation of "Intensity-Free Learning of Temporal Point Processes" (Spotlight @ ICLR 2020)
SliceNStitch: Continuous CP Decomposition of Sparse Tensor Streams (ICDE'21); a toy CP decomposition sketch follows this list.
TensorCodec: Compact Lossy Compression of Tensors without Strong Data Assumptions (ICDM'23)
NeuKron: Constant-Size Lossy Compression of Sparse Reorderable Matrices and Tensors (WWW'23)
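
Several entries above (the model compression tutorial and the curated tensor decomposition list) revolve around low-rank factorization for model compression. As a rough, non-authoritative illustration, here is a minimal sketch of truncated-SVD compression of a single weight matrix; the layer shape, the rank, and the low_rank_factors helper are illustrative assumptions, not code from any repository listed above.

import numpy as np

def low_rank_factors(W: np.ndarray, rank: int):
    """Factor W (m x n) into A (m x rank) @ B (rank x n) via truncated SVD."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # absorb singular values into the left factor
    B = Vt[:rank, :]
    return A, B

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 1024))   # stand-in for a dense layer weight
A, B = low_rank_factors(W, rank=64)

print("params:", W.size, "->", A.size + B.size)
print("relative error:", np.linalg.norm(W - A @ B) / np.linalg.norm(W))

Replacing a dense layer's weight with the two smaller factors trades a controllable amount of reconstruction error for a large reduction in parameters, which is the basic trade-off the compression resources above study.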
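
Several of the tensor entries above (SliceNStitch and the streaming factor trajectory paper in particular) are built around low-rank tensor models such as CP (CANDECOMP/PARAFAC). The sketch below shows a plain batch CP decomposition using TensorLy's parafac as a stand-in; the tensor shape and rank are illustrative, and none of the streaming or compression machinery of those papers is reproduced here.

import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)
X = tl.tensor(rng.standard_normal((20, 30, 40)))   # small dense 3-way tensor

cp = parafac(X, rank=5)          # one factor matrix per mode, plus weights
X_hat = tl.cp_to_tensor(cp)      # reconstruct the tensor from its CP factors

print("factor shapes:", [f.shape for f in cp.factors])
print("relative reconstruction error:",
      float(tl.norm(X - X_hat) / tl.norm(X)))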
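
For the federated learning entry, the core server-side step in most FL frameworks is weighted averaging of client updates (FedAvg). The sketch below shows only that aggregation step with toy nn.Linear clients; the fed_avg helper and the dataset sizes are illustrative assumptions, not the listed framework's actual API.

import copy
import torch
import torch.nn as nn

def fed_avg(state_dicts, weights):
    """Weighted average of client state dicts; weights should sum to 1."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = sum(w * sd[key] for w, sd in zip(weights, state_dicts))
    return avg

clients = [nn.Linear(4, 2) for _ in range(3)]      # toy client models
sizes = torch.tensor([100.0, 50.0, 50.0])          # local dataset sizes
weights = (sizes / sizes.sum()).tolist()

global_model = nn.Linear(4, 2)
global_model.load_state_dict(fed_avg([c.state_dict() for c in clients], weights))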