Stars
A continual learning optimizer that mitigates catastrophic forgetting and loss of plasticity
Harness for Codex-style agents working on matching decompilation goals
[WACV 2025] Official implementation of "Online-LoRA: Task-free Online Continual Learning via Low Rank Adaptation" by Xiwen Wei, Guihong Li and Radu Marculescu
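For context, a minimal sketch of the low-rank adaptation (LoRA) reparameterization that Online-LoRA builds on: a frozen pretrained weight is augmented with a trainable low-rank product, so only the small A and B matrices are updated during continual learning. The class name, rank, and scaling below are illustrative, not taken from the Online-LoRA code.

```python
# Generic LoRA layer sketch (illustrative; not the Online-LoRA implementation).
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_features: int, out_features: int, rank: int = 4, alpha: float = 8.0):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)   # frozen pretrained weight
        self.base.bias.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(rank, in_features) * 0.01)  # trainable down-projection
        self.B = nn.Parameter(torch.zeros(out_features, rank))        # trainable up-projection, zero-init
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = W x + scale * B (A x); only A and B receive gradients
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)
```

Because only the rank-r factors are trainable, the per-update parameter count stays small, which is what makes online, task-free adaptation of a large backbone tractable.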
A lightweight and flexible framework for Hebbian learning in PyTorch.
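Hebbian learning replaces the backpropagated error signal with a purely local rule ("cells that fire together wire together"). A generic sketch of such an update, not the starred framework's API:

```python
# Batch-averaged Hebbian update: dW = lr * post^T @ pre, no backpropagation involved.
import torch

def hebbian_update(weight: torch.Tensor, pre: torch.Tensor, post: torch.Tensor, lr: float = 1e-3) -> torch.Tensor:
    # weight: (out, in), pre: (batch, in), post: (batch, out)
    delta = post.T @ pre / pre.shape[0]   # outer product of pre/post activities, averaged over the batch
    return weight + lr * delta

# Usage on a single linear layer.
with torch.no_grad():
    w = torch.randn(16, 32) * 0.1
    x = torch.randn(8, 32)
    y = x @ w.T                           # "post-synaptic" activity
    w = hebbian_update(w, x, y)
```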
Code for CoLLAs-2025 paper "Self-Regulated Neurogenesis for Online Data-Incremental Learning"
Lifelong Learning with Dynamically Expandable Networks, ICLR 2018
Progressive Neural Networks (PNNs) are a fundamental architecture-based continual learning concept that offers explicit support for transferring knowledge across incremental tasks, creating dedicate…
A modern, GPU-accelerated Reservoir Computing library for PyTorch. ResDAG brings the power of Echo State Networks (ESNs) and reservoir computing to PyTorch with a clean, modular API. Built for rese…
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on PyTorch. EchoTorch is the only Python module available to easily create Deep Reservoir Computing models.
Repository for Layered Attention-Enhanced Reservoir Computer (LAERC) trained on the OpenWebText dataset with GPT2 tokenization.
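The common thread of the three reservoir-computing repos above is the Echo State Network recipe: a fixed random recurrent reservoir rescaled to a target spectral radius, leaky-integrator state updates, and only a linear readout trained (e.g., by ridge regression). A minimal sketch under those usual assumptions; names are illustrative, not the ResDAG or EchoTorch API.

```python
# Minimal ESN: the reservoir weights are random and never trained.
import torch

class MiniESN:
    def __init__(self, n_in: int, n_res: int, spectral_radius: float = 0.9, leak: float = 0.3):
        self.W_in = torch.randn(n_res, n_in) * 0.1
        W = torch.randn(n_res, n_res)
        rho = torch.linalg.eigvals(W).abs().max()
        self.W = W * (spectral_radius / rho)      # rescale to the target spectral radius
        self.leak = leak

    def run(self, inputs: torch.Tensor) -> torch.Tensor:
        # inputs: (T, n_in); returns reservoir states (T, n_res) to feed a separately trained readout
        x = torch.zeros(self.W.shape[0])
        states = []
        for u in inputs:
            pre = torch.tanh(self.W_in @ u + self.W @ x)
            x = (1 - self.leak) * x + self.leak * pre   # leaky integration
            states.append(x)
        return torch.stack(states)
```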
aliasoblomov / mhr-vps-worker
Forked from denuitt1/mhr-cfw
A customized fork of mhr-cfw that replaces Cloudflare Workers with a private Node.js worker on your VPS. Bypasses DPI directly via IP
A Domain-Fronting Relay that routes traffic through GAS (Google Apps Script) and forwards it to Cloudflare Workers. Designed to bypass DPI.
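The domain-fronting idea these relays rely on fits in a single request: the TLS handshake (SNI) names an innocuous front domain while the inner HTTP Host header names the real destination, so DPI only sees the front. A conceptual sketch with placeholder domains, not real endpoints:

```python
# Conceptual domain-fronting request; both hostnames are placeholders.
import requests

resp = requests.get(
    "https://front.example.com/relay",           # outer TLS/SNI -> harmless front domain
    headers={"Host": "hidden-backend.example"},  # inner HTTP Host -> actual destination
    timeout=10,
)
print(resp.status_code)
```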
The official implementation of DiM: Diffusion Mamba for Efficient High-Resolution Image Synthesis
This repository has implementations of various alternatives to backpropagation for training neural networks.
Code for Latent Speech-Text Transformer (LST)
Code for the paper "Predictive Coding Approximates Backprop along Arbitrary Computation Graphs"
Domain-fronted HTTP/SOCKS5 proxy tunneling traffic through Google Apps Script with MITM TLS interception, HTTP/1-2 multiplexing, and DPI evasion.
PyTorch implementation of the groundbreaking paper "NoProp: Training Neural Networks Without Backpropagation or Forward Propagation".
[NeurIPS 2023] MeZO: Fine-Tuning Language Models with Just Forward Passes. https://arxiv.org/abs/2305.17333
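MeZO's core is a zeroth-order (SPSA-style) gradient estimate built from two forward passes at θ+εz and θ−εz, with the perturbation z regenerated from a saved seed so memory stays at inference level. A simplified sketch of that idea, not the official implementation:

```python
# Zeroth-order step: perturb, evaluate, un-perturb, then move along the estimated gradient.
import torch

def perturb(params, seed: int, eps: float, sign: float):
    torch.manual_seed(seed)                   # same seed -> same z, no copy of the weights needed
    for p in params:
        z = torch.randn_like(p)
        p.add_(sign * eps * z)                # in-place perturbation

@torch.no_grad()
def mezo_step(params, loss_fn, lr: float = 1e-6, eps: float = 1e-3):
    seed = torch.randint(0, 2**31 - 1, (1,)).item()
    perturb(params, seed, eps, +1.0)          # theta + eps*z
    loss_plus = loss_fn()
    perturb(params, seed, eps, -2.0)          # theta - eps*z
    loss_minus = loss_fn()
    perturb(params, seed, eps, +1.0)          # restore theta
    grad_scale = (loss_plus - loss_minus) / (2 * eps)
    torch.manual_seed(seed)
    for p in params:
        z = torch.randn_like(p)
        p.add_(-lr * grad_scale * z)          # SGD step along the estimated gradient
```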
Repository for paper "ZO-ASR: Zeroth-Order Fine-Tuning of Speech Foundation Models without Back-Propagation"
Receives unencrypted data via MITM, then sends it using domain fronting
[KDD 2026] Voxlect: A Speech Foundation Model Benchmark for Modeling Dialects and Regional Languages Around the Globe
Bypass DPI with IP/TCP-Header manipulation
Time-Annealed Perturbation Sampling for Diversity in Diffusion Language Models
⚡ InstaFlow! One-Step Stable Diffusion with Rectified Flow (ICLR 2024)