University of Wisconsin-Madison, Madison
https://wql2002.github.io/
Lists (27)
AI
awesome-repo
Backend Dev
Benchmark
C/C++
Database
Distributed Systems
Embedded System
Go
healthcare
Linux (Linux kernel, development, application-related topics)
LLM
LoRa
MLsys (machine learning system-related topics)
Networks (networks-related topics: distributed, transportation, video, etc.)
NSDI
OSDI
Python
RasPi
RDMA
Rust
SOSP
System (canonical system-related topics)
Tier Memory
Time-Series
Toolkits (toolkits for research and development)
Vibe Coding
Stars
Public release of the code for "Accelerating Vision Transformers with Adaptive Patches"
Official implementation of ICLR26 (Oral): Decentralized Attention Fails Centralized Signals: Rethink Transformers for Medical Time Series
Tile primitives for speedy kernels
AI agents running research on single-GPU nanochat training automatically
Mantis: Lightweight Calibrated Foundation Model for User-Friendly Time Series Classification
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
PyTorch implementation of MAE https://arxiv.org/abs/2111.06377
Minimalistic 4D-parallelism distributed training framework for education purposes
Conditional Memory via Scalable Lookup: A New Axis of Sparsity for Large Language Models
A course of learning LLM inference serving on Apple Silicon for systems engineers: build a tiny vLLM + Qwen.
Demystify RAG by building it from scratch. Local LLMs, no black boxes - real understanding of embeddings, vector search, retrieval, and context-augmented generation.
[ICLR 2024] Official implementation of "🦙 Time-LLM: Time Series Forecasting by Reprogramming Large Language Models"
Source code and datasets for Ekya, a system for continuous learning on the edge.
A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Automatic Speech Recognition and Text-to-Speech)
NASBench: A Neural Architecture Search Dataset and Benchmark
[ICLR 2025] NeuroLM: A Universal Multi-task Foundation Model for Bridging the Gap between Language and EEG Signals
[ICLR 2025] BrainUICL: An Unsupervised Individual Continual Learning Framework for EEG Applications
Public repository associated with "Deep Learning for ECG Analysis: Benchmarks and Insights from PTB-XL"
[ICML'2025] From Token to Rhythm: A Multi-Scale Approach for ECG-Language Pretraining
[ICLR 2025] CBraMod: A Criss-Cross Brain Foundation Model for EEG Decoding
A fast GPU memory copy library based on NVIDIA GPUDirect RDMA technology
[ICLR 2024 spotlight] Large Brain Model for Learning Generic Representations with Tremendous EEG Data in BCI
Benchmark code for the paper: "Dreem Open Datasets: Multi-Scored Sleep Datasets to compare Human and Automated sleep staging"