Stars
Show usage stats for OpenAI Codex and Claude Code, without having to log in.
The repo is finally unlocked. Enjoy the party! The fastest repo in history to surpass 100K stars ⭐. Built in Rust using oh-my-codex. Join Discord: https://discord.gg/5TUQKqFWd
Code for "WebVoyager: Building an End-to-End Web Agent with Large Multimodal Models"
High-Performance Triton Ops: RMSNorm+RoPE Fusion, Gated MLP Fusion & FP8 Quantized GEMM for Transformers
zwxandy / VLMEvalKit
Forked from open-compass/VLMEvalKit. Open-source evaluation toolkit for large multi-modality models (LMMs), supporting 220+ LMMs and 80+ benchmarks
An Open Phone Agent Model & Framework. Unlocking the AI Phone for Everyone
A curated list of recent efficient video generation methods.
AcadHomepage: A Modern and Responsive Academic Personal Homepage
TurboDiffusion: 100–200× Acceleration for Video Diffusion Models
Official implementation of paper "Think-at-Hard: Selective Latent Iterations to Improve Reasoning Language Models"
Open-source evaluation toolkit for large multi-modality models (LMMs), supporting 220+ LMMs and 80+ benchmarks
✨✨Latest Advances on Multimodal Large Language Models
Qwen3-VL is the multimodal large language model series developed by the Qwen team at Alibaba Cloud.
A Survey of Efficient Attention Methods: Hardware-efficient, Sparse, Compact, and Linear Attention
This repo summarizes papers for efficient PPML across protocol, model, and system levels.
🧑🚀 Summary of the world's best LLM resources (multimodal generation, Agents, coding assistance, AI paper review, data processing, model training, model inference, o1 models, MCP, small language models, vision-language models)
This project shares technical principles and practical experience with large models (LLM engineering, LLM application deployment)
AcadHomepage: A Modern and Responsive Academic Personal Homepage
Doing simple retrieval from LLMs at various context lengths to measure accuracy
🔥 How to efficiently and effectively compress CoTs, or directly generate concise CoTs during inference while maintaining reasoning performance, is an important topic!
Chain of Thought (CoT) is so hot! So long! We need shorter reasoning processes!