Queen's University
Kingston, Ontario, Canada
https://digitalearthscience.com/
Lists (23)
- 401: rich
- acou
- adj: solver/seis adjacent code
- cfd
- cool: random, misc.
- ditherpunk
- drone
- gamedev
- geomech
- gzip: knn
- helpful
- image
- juliautils: utilities for julia
- lisp
- ML: libs, example code
- Prog Utils
- river
- sat
- sci: other solvers
- seis: utils, algos
- stress
- topo
- waveprop: solvers
Starred repositories
- Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
- A latent text-to-image diffusion model
- Using Low-rank adaptation to quickly fine-tune diffusion models.
- Flax is a neural network library for JAX that is designed for flexibility.
- An annotated implementation of the Transformer paper.
- Taming Transformers for High-Resolution Image Synthesis
- Easily train or fine-tune SOTA computer vision models with one open source training library. The home of YOLO-NAS.
- Jupyter notebooks for the Natural Language Processing with Transformers book
- Efficient few-shot learning with Sentence Transformers
- [CVPR 2021] Official PyTorch implementation for Transformer Interpretability Beyond Attention Visualization, a novel method to visualize classifications by Transformer-based networks.
- A suite of image and video neural tokenizers
- Koç University deep learning framework.
- Code for the Interspeech 2021 paper "AST: Audio Spectrogram Transformer".
- Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
- Python supercharged for the fastai library
- Implementation of the Transformer model (originally from Attention Is All You Need) applied to time series.
- Croissant is a high-level format for machine learning datasets that brings together four rich layers.
- Kolmogorov-Arnold Networks (KAN) using Chebyshev polynomials instead of B-splines.
- The fast Continuous Wavelet Transform (fCWT) is a library for fast calculation of the CWT.
- SeisBench - A toolbox for machine learning in seismology
- An implementation of demixed Principal Component Analysis (a supervised linear dimensionality reduction technique)
- High performance implementation of Extreme Learning Machines (fast randomized neural networks).
- Travel-time calculator based on the fast-marching method solution to the Eikonal equation.
- Python worked examples and problems from Reservoir Engineering textbooks (Brian Towler SPE Textbook Vol. 8, etc.)
- nvImageCodec, a library of GPU- and CPU-accelerated codecs featuring a unified interface
- Teaching material for the ErSE 210 Seismology course