vtrnnhlinh
0 or 1, or both 1 and 0?

75 starred repositories written in Python

Finetune mistral-7b-instruct for sentence embeddings (a generic embedding sketch appears after this list)

Python · 87 stars · 18 forks · Updated May 2, 2024

Code implementation for FastToG

Python · 87 stars · 9 forks · Updated Apr 13, 2025

Python · 77 stars · 22 forks · Updated Jul 4, 2021

[MICCAI2024] "FedFMS: Exploring Federated Foundation Models for Medical Image Segmentation". A framework for fine-tuning SAM (Segment Anything) in the federated learning paradigm for medical image …

Python · 61 stars · 7 forks · Updated Feb 16, 2025

Selective Aggregation for Low-Rank Adaptation in Federated Learning [ICLR 2025]

Python · 51 stars · 10 forks · Updated Apr 17, 2025

[NeurIPS 2024] Text-space Graph Foundation Models: Comprehensive Benchmarks and New Insights

Python · 47 stars · 4 forks · Updated Dec 21, 2024

Python · 31 stars · 17 forks · Updated Nov 11, 2024

[ICML 2025] Generalization Principles for Inference over Text-Attributed Graphs with Large Language Models

Python · 15 stars · 1 fork · Updated Jul 15, 2025

Elixir: Train a Large Language Model on a Small GPU Cluster

Python · 15 stars · 5 forks · Updated Jun 8, 2023

Python · 14 stars · 1 fork · Updated Jul 30, 2025

Code for the paper "KG-Adapter: Enabling Knowledge Graph Integration in Large Language Models through Parameter-Efficient Fine-Tuning"

Python · 14 stars · 1 fork · Updated Oct 21, 2025

Python · 12 stars · 1 fork · Updated Jun 28, 2024

[ICML 2024] Code for the paper "MoE-RBench: Towards Building Reliable Language Models with Sparse Mixture-of-Experts"

Python · 9 stars · Updated Jul 1, 2024

Job scheduling with deadlines using quantum annealing (a toy QUBO sketch appears after this list)

Python · 3 stars · Updated Nov 12, 2023

Python · 1 star · Updated Nov 4, 2025
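
The first entry above, fine-tuning mistral-7b-instruct for sentence embeddings, can be pictured with a short generic sketch. This is not that repository's code: the checkpoint id, mean pooling, max length of 512, and the example sentences are assumptions. It only shows the usual way of turning a decoder-only LLM's hidden states into sentence vectors with Hugging Face transformers.

    import torch
    from transformers import AutoModel, AutoTokenizer

    # Assumed checkpoint; the starred repository may target a different one.
    model_id = "mistralai/Mistral-7B-Instruct-v0.2"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token  # Mistral ships without a pad token
    model = AutoModel.from_pretrained(model_id, torch_dtype=torch.float16, device_map="auto")
    model.eval()

    def embed(sentences):
        # Batch-tokenize, run the base model, and mean-pool the last hidden states.
        batch = tokenizer(sentences, padding=True, truncation=True,
                          max_length=512, return_tensors="pt").to(model.device)
        with torch.no_grad():
            hidden = model(**batch).last_hidden_state         # (batch, seq, dim)
        mask = batch["attention_mask"].unsqueeze(-1)          # zero out padding positions
        pooled = (hidden * mask).sum(1) / mask.sum(1)         # mean pooling over real tokens
        return torch.nn.functional.normalize(pooled, dim=-1)  # unit-norm sentence vectors

    print(embed(["federated learning", "knowledge graphs"]).shape)  # torch.Size([2, 4096])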
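
The quantum-annealing entry near the end of the list can be grounded the same way: deadline scheduling is typically encoded as a QUBO whose low-energy states are feasible schedules. The sketch below is a toy version with unit-length jobs, made-up deadlines, and an assumed penalty weight; it brute-forces the QUBO so it runs anywhere, whereas an annealer (or any other QUBO sampler) would minimize the same energy function.

    from itertools import product

    deadlines = {"job0": 1, "job1": 3, "job2": 2}  # job j must finish by slot deadlines[j]
    slots = range(3)                               # three unit-length time slots, one machine
    A = 10.0                                       # assumed penalty weight for hard constraints

    def qubo_energy(x):
        # x[(j, t)] = 1 means job j runs in slot t; energy adds penalties for violations.
        e = 0.0
        for j, d in deadlines.items():
            e += A * (sum(x[(j, t)] for t in slots) - 1) ** 2  # schedule each job exactly once
            e += sum(x[(j, t)] for t in slots if t + 1 > d)    # soft cost for missing the deadline
        for t in slots:                                        # at most one job per slot
            here = [x[(j, t)] for j in deadlines]
            e += A * sum(a * b for i, a in enumerate(here) for b in here[i + 1:])
        return e

    variables = [(j, t) for j in deadlines for t in slots]
    best = min((dict(zip(variables, bits)) for bits in product((0, 1), repeat=len(variables))),
               key=qubo_energy)
    print(sorted(v for v, bit in best.items() if bit))  # [('job0', 0), ('job1', 2), ('job2', 1)]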