- University of Modena and Reggio Emilia
- Modena
- https://orcid.org/0000-0001-7594-993X
Stars
BioReason-Pro: Advancing Protein Function Prediction with Multimodal Biological Reasoning
Model merging, task-vector rebasin, and fine-tuning for vision models and LLMs.
GradFix is the official codebase for the ICLR 2026 paper: "Gradient-Sign Masking for Task Vector Transport Across Pre-Trained Models"
nw_wrld is an event-driven sequencer for triggering visuals using web technologies. It enables users to scale up audiovisual compositions for prototyping, demos, exhibitions, and live performances.…
PyTorch code for the NeurIPS 2025 paper "Accurate and Efficient Low-Rank Model Merging in Core Space"
[ICML 2025] No Task Left Behind: Isotropic Model Merging with Common and Task-Specific Subspaces (official repository)
A Django web application to track 2v2 (spikeball) matches internally in my lab
Open source code for AlphaFold 2.
Code for the CVPR 2024 paper "Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters"
Official implementation of the NeurIPS 2025 paper "Soft Thinking: Unlocking the Reasoning Potential of LLMs in Continuous Concept Space"
[NeurIPS 2022] “M³ViT: Mixture-of-Experts Vision Transformer for Efficient Multi-task Learning with Model-Accelerator Co-design”, Hanxue Liang*, Zhiwen Fan*, Rishov Sarkar, Ziyu Jiang, Tianlong Che…
Combating hidden stratification with GEORGE
Official implementation of BPA (CVPR 2022)
An open-source framework for training large multimodal models.
An open source implementation of CLIP.
Pre-trained models, data, code & materials from the paper "ImageNet-trained CNNs are biased towards texture; increasing shape bias improves accuracy and robustness" (ICLR 2019 Oral)
[ICLR 2025] Official implementation of "Mitigating Parameter Interference in Model Merging via Sharpness-Aware Fine-Tuning"
Biomni: a general-purpose biomedical AI agent
Repository for the paper "U-Net Transplant: The Role of Pre-training for Model Merging in 3D Medical Segmentation", accepted at MICCAI 2025
Official codebase of "Update Your Transformer to the Latest Release: Re-Basin of Task Vectors" - ICML 2025
Task Singular Vectors: Reducing Task Interference in Model Merging. Merges models while avoiding task interference through separable task representations.
Pruna is a model optimization framework built for developers, enabling you to deliver faster, more efficient models with minimal overhead.
[NeurIPS'24] Official PyTorch implementation for paper "Knowledge Composition using Task Vectors with Learned Anisotropic Scaling"
Codebase for "C2M3: Cycle-Consistent Multi-Model Merging".