- UofT, @NVIDIA
- Toronto, CA
- https://rishitdagli.com
- @rishit_dagli
- https://www.cs.toronto.edu/~rishit
Starred repositories
GigaTrain: An Efficient and Scalable Training Framework for AI Models
⚙️ Minimal parallelizable physics simulator supporting differentiation entirely in torch
Squeeze3D: Your 3D Generation Model is Secretly an Extreme Neural Compressor
GPU-optimized version of the MuJoCo physics simulator, designed for NVIDIA hardware.
Ray tracing and hybrid rasterization of Gaussian particles
New repo collection for NVIDIA Cosmos: https://github.com/nvidia-cosmos
Simulate a mesh, Gaussian Splat, or a NeRF
Official code for NeRF-US: Removing Ultrasound Imaging Artifacts from Neural Radiance Fields in the Wild
Fast and memory-efficient exact attention
Official code for SEE-2-SOUND: Zero-Shot Spatial Environment-to-Spatial Sound
Schedule-Free Optimization in PyTorch
TORAX: Tokamak transport simulation in JAX
A PyTorch CUDA extension implementation of instant-ngp (SDF and NeRF), with a GUI.
Simple code for generating a color-coded LaTeX table from raw data
This repository contains the official implementation of Astroformer, an ICLR Workshop 2023 paper.
[ICLR 2024 Oral] Generative Gaussian Splatting for Efficient 3D Content Creation
ImageBind: One Embedding Space to Bind Them All
A collection of utility functions to prototype geometry processing research in python
3D Transforms is a library to easily work with 3D data and make 3D transformations. This library originally started as a few functions here and there for my own work, which I then turned into a library.
An implementation of Invariant Point Attention from Alphafold 2
An implementation of "Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks"
Official Implementation of ICML 2023 paper: "A Generalization of ViT/MLP-Mixer to Graphs"
Easily train or fine-tune SOTA computer vision models with one open source training library. The home of YOLO-NAS.
My notes from UofT CS Theory Research Group seminars
An implementation of the Nyströmformer, using Nystrom method to approximate standard self attention