Stars
An Open Source Machine Learning Framework for Everyone
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal domains, for both inference and training.
Graph Neural Network Library for PyTorch
A Deep Learning-based project for colorizing and restoring old images (and video!)
GitHub Pages template based upon HTML and Markdown for personal, portfolio-based websites.
Running large language models on a single GPU for throughput-oriented scenarios.
Open Source Neural Machine Translation and (Large) Language Models in PyTorch
Java 1-21 Parser and Abstract Syntax Tree for Java, with advanced analysis functionality.
[ICLR'24 spotlight] An open platform for training, serving, and evaluating large language models for tool learning.
Easily train or fine-tune SOTA computer vision models with one open-source training library. The home of YOLO-NAS.
It is my belief that you, the postgraduate students and job-seekers for whom the book is primarily meant, will benefit from reading it; however, it is my hope that even the most experienced research…
Code for the paper "Evaluating Large Language Models Trained on Code"
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entr…
Datasets, tools, and benchmarks for representation learning of code.
Guide to using pre-trained large language models of source code
Deep learning with dynamic computation graphs in TensorFlow
TensorFlow GNN is a library to build Graph Neural Networks on the TensorFlow platform.
Code repo for "WebArena: A Realistic Web Environment for Building Autonomous Agents"
TensorFlow code for the neural network presented in the paper: "code2vec: Learning Distributed Representations of Code"
Public repo for the NeurIPS 2023 paper "Unlimiformer: Long-Range Transformers with Unlimited Length Input"
TensorFlow implementations of Graph Neural Networks
Implementation of Memorizing Transformers (ICLR 2022), an attention net augmented with indexing and retrieval of memories using approximate nearest neighbors, in PyTorch
Code for the model presented in the paper: "code2seq: Generating Sequences from Structured Representations of Code"
PaL: Program-Aided Language Models (ICML 2023)
Official Repository of "A Fair Comparison of Graph Neural Networks for Graph Classification", ICLR 2020