- Ho Chi Minh, Vietnam
- https://vtrnnhlinh.github.io/
- in/vtrnnhlinh
Stars
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal tasks, for both inference and training.
Robust Speech Recognition via Large-Scale Weak Supervision
Python tool for converting files and office documents to Markdown.
Making large AI models cheaper, faster and more accessible
Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
TensorFlow code and pre-trained models for BERT
A modular graph-based Retrieval-Augmented Generation (RAG) system
Improve your resumes with Resume Matcher. Get insights and keyword suggestions, and tune your resumes to job descriptions.
🤗 smolagents: a barebones library for agents that think in code.
gpt-oss-120b and gpt-oss-20b are two open-weight language models by OpenAI
Tongyi Deep Research, the Leading Open-source Deep Research Agent
Automate browser-based workflows with AI
tiktoken is a fast BPE tokeniser for use with OpenAI's models.
LLM agents built for control. Designed for real-world use. Deployed in minutes.
Ongoing research on training transformer models at scale
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
Use PEFT or Full-parameter to CPT/SFT/DPO/GRPO 500+ LLMs (Qwen3, Qwen3-MoE, Llama4, GLM4.5, InternLM3, DeepSeek-R1, ...) and 200+ MLLMs (Qwen3-VL, Qwen3-Omni, InternVL3.5, Ovis2.5, Llava, GLM4v, Ph…
MiniMax-M1, the world's first open-weight, large-scale hybrid-attention reasoning model.
E-Ink Display with a Raspberry Pi and a Web Interface to customize and update the display with various plugins
GraphRAG using local LLMs - features a robust API and multiple apps for indexing, prompt tuning, query, chat, visualization, etc. This is meant to be the ultimate GraphRAG/KG local LLM app.
Dataset of Linus Torvalds' rants classified by negativity using sentiment analysis
LongLLaMA is a large language model capable of handling long contexts. It is based on OpenLLaMA and fine-tuned with the Focused Transformer (FoT) method.
Source code and dataset for ACL 2019 paper "ERNIE: Enhanced Language Representation with Informative Entities"
Transformer seq2seq model: a program that builds a language translator from a parallel corpus
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
Neural Graph Collaborative Filtering, SIGIR 2019