Labels: enhancement (New feature or request)
Description
I am trying to do federated training of adapters using the Flower framework, but I am unable to find a way to get and set the adapter state_dict, similar to PEFT's set_peft_model_state_dict. Here is the standard Flower code for getting and setting parameters:
from collections import OrderedDict

import torch
from flwr.common import NDArrays
from peft import get_peft_model_state_dict, set_peft_model_state_dict

def set_parameters(model, parameters: NDArrays) -> None:
    """Change the parameters of the model using the given ones."""
    # Pair each PEFT state_dict key with the corresponding NumPy array.
    peft_state_dict_keys = get_peft_model_state_dict(model).keys()
    params_dict = zip(peft_state_dict_keys, parameters)
    state_dict = OrderedDict({k: torch.Tensor(v) for k, v in params_dict})
    set_peft_model_state_dict(model, state_dict)

def get_parameters(model) -> NDArrays:
    """Return the parameters of the current net."""
    state_dict = get_peft_model_state_dict(model)
    return [val.cpu().numpy() for _, val in state_dict.items()]
How do I go about doing this with adapters instead of PEFT?
Any help is appreciated.
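For context, here is one possible workaround I have been considering: a minimal sketch, assuming model.train_adapter(...) has already been called so that only the adapter weights still have requires_grad=True, which lets the trainable parameters be exchanged directly via named_parameters(). The helper names get_adapter_parameters and set_adapter_parameters are my own, not part of the adapters API.

from collections import OrderedDict

import torch
from flwr.common import NDArrays

def get_adapter_parameters(model) -> NDArrays:
    """Return only the trainable (adapter) parameters as NumPy arrays."""
    # Assumption: after model.train_adapter(...), only adapter weights
    # are left trainable, so requires_grad identifies them.
    return [
        p.detach().cpu().numpy()
        for _, p in model.named_parameters()
        if p.requires_grad
    ]

def set_adapter_parameters(model, parameters: NDArrays) -> None:
    """Overwrite the trainable (adapter) parameters with the given arrays."""
    trainable = [(n, p) for n, p in model.named_parameters() if p.requires_grad]
    state_dict = OrderedDict(
        {n: torch.tensor(v) for (n, _), v in zip(trainable, parameters)}
    )
    # strict=False so the partial state_dict only touches the listed keys.
    model.load_state_dict(state_dict, strict=False)

This relies on the iteration order of named_parameters() being stable between get and set, so a dedicated get/set pair in the adapters library would still be preferable.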