- Seattle, WA
- http://iwsmith.net
Stars
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal domains, for both inference and training.
Models and examples built with TensorFlow
The simplest, fastest repository for training/finetuning medium-sized GPTs.
TensorFlow code and pre-trained models for BERT
A toolkit for developing and comparing reinforcement learning algorithms.
⚡ A Fast, Extensible Progress Bar for Python and CLI
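The tqdm entry above is the "wrap any iterable" progress bar; a minimal sketch of its core usage (assuming `tqdm` is installed) looks like:

```python
from tqdm import tqdm

# Wrapping an iterable in tqdm() yields its items unchanged while
# rendering a live progress bar on stderr; `desc` labels the bar.
total = 0
for i in tqdm(range(1000), desc="summing"):
    total += i

print(total)  # 499500
```

The same wrapper works around file handles, DataLoader objects, or any other iterable, and `tqdm.write()` can be used for log lines that should not clobber the bar.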
Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes.
Graph Neural Network Library for PyTorch
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Best Practices on Recommendation Systems
State-of-the-Art Text Embeddings
DALL·E Mini - Generate images from a text prompt
1 Line of code data quality profiling & exploratory data analysis for Pandas and Spark DataFrames.
An open-source NLP research library, built on PyTorch.
A terminal spreadsheet multitool for discovering and arranging data
Natural Language Processing Best Practices & Examples
Model parallel transformers in JAX and Haiku
A Python implementation of LightFM, a hybrid recommendation algorithm.
STUMPY is a powerful and scalable Python library for modern time series analysis
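STUMPY's time series analysis centers on the matrix profile: for each subsequence of a series, the z-normalized distance to its nearest non-trivial match. This is a naive O(n²) NumPy sketch of that idea, not STUMPY's own API (`naive_matrix_profile` is a hypothetical helper; STUMPY's optimized equivalent is `stumpy.stump`):

```python
import numpy as np

def naive_matrix_profile(ts, m):
    """Distance from each length-m subsequence to its nearest neighbor."""
    n = len(ts) - m + 1
    subs = np.array([ts[i:i + m] for i in range(n)])
    # z-normalize each subsequence so matches are shape-based, not scale-based
    z = (subs - subs.mean(axis=1, keepdims=True)) / subs.std(axis=1, keepdims=True)
    profile = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(z - z[i], axis=1)
        # exclusion zone around i to ignore trivial self-matches
        lo, hi = max(0, i - m // 2), min(n, i + m // 2 + 1)
        d[lo:hi] = np.inf
        profile[i] = d.min()
    return profile

# Low values in the profile mark repeated motifs; high values mark anomalies.
mp = naive_matrix_profile(np.sin(np.linspace(0, 8 * np.pi, 120)), m=10)
```

STUMPY replaces this quadratic loop with FFT-based distance computations and supports parallel, GPU, and streaming variants.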
Implementation and experiments of graph embedding algorithms.
Simplest working implementation of StyleGAN2, the state-of-the-art generative adversarial network, in PyTorch. Enabling everyone to experience disentanglement
🔮 A refreshing functional take on deep learning, compatible with your favorite libraries