OpenAI
- San Francisco, California
- tomdlt.github.io/
- @tomdlt10
Stars
Lightweight coding agent that runs in your terminal
Easily run Python at the shell! Magical, but never mysterious.
Package for extracting and mapping the results of every single tensor operation in a PyTorch model in one line of code.
You like pytorch? You like micrograd? You love tinygrad! ❤️
Named tensors with first-class dimensions for PyTorch
A tiny library for coding with large language models.
todo.txt manager for Linux, Windows and MacOS, free and open-source (FOSS)
🧑🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), ga…
Fast and modular sklearn replacement for generalized linear models
Material for a 24-hour course on Scientific Python
Voxelwise Encoding Model tutorials from the Gallant lab.
Machine learning in Python with scikit-learn MOOC
Fast solver for L1-type problems: Lasso, sparse Logistic regression, Group Lasso, weighted Lasso, Multitask Lasso, etc.
A tiny scalar-valued autograd engine and a neural net library on top of it with PyTorch-like API
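The core idea behind such a scalar autograd engine fits in a few lines of pure Python. The sketch below is a minimal illustration of reverse-mode autodiff on scalars, not micrograd's actual API; the `Value` class and its attributes are hypothetical names chosen for clarity:

```python
class Value:
    """A scalar that records the operations producing it, for reverse-mode autodiff."""
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._local_grads = local_grads  # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other), (other.data, self.data))

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                order.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, g in zip(v._parents, v._local_grads):
                p.grad += g * v.grad

# d(a*b + a)/da = b + 1 = -2 ;  d(a*b + a)/db = a = 2
a, b = Value(2.0), Value(-3.0)
c = a * b + a
c.backward()
```

Gradients accumulate with `+=` because a node (here `a`) can feed into several downstream operations; each path's contribution must be summed, exactly as in the multivariate chain rule.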
Making your ML and optimization benchmarks simple and open
Machine Learning From Scratch. Bare bones NumPy implementations of machine learning models and algorithms with a focus on accessibility. Aims to cover everything from linear regression to deep lear…
A Python module for getting the status of NVIDIA GPUs programmatically via nvidia-smi
Words of the same length with related meanings.
Latex code for making neural networks diagrams
Pycortex is a python-based toolkit for surface visualization of fMRI data
Experiments for "Distributed Convolutional Dictionary Learning (DiCoDiLe): Pattern Discovery in Large Images and Signals"
Convolutional dictionary learning for time-series
scikit-learn: machine learning in Python
Optimised tools for group-indexing operations: aggregated sum and more
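In pure Python, a group-indexed aggregated sum reduces to a dictionary keyed by group label; the sketch below illustrates the operation's semantics only (the library itself relies on optimised vectorised kernels, and `aggregate_sum` is a hypothetical name):

```python
from collections import defaultdict

def aggregate_sum(group_idx, values):
    """Sum `values` into buckets given by the parallel `group_idx` labels."""
    out = defaultdict(float)
    for g, v in zip(group_idx, values):
        out[g] += v
    return dict(out)

sums = aggregate_sum([0, 1, 0, 2, 1], [10.0, 1.0, 5.0, 7.0, 2.0])
# group 0 sums 10.0 + 5.0, group 1 sums 1.0 + 2.0, group 2 is just 7.0
```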
Preview GitHub README.md files locally before committing them.