
Add functionality to get and set state_dict of a particular adapter #756

@itsskofficial

Description

I am trying to do federated training of adapters using the Flower framework, but I cannot find a way to get and set the state_dict of a particular adapter, similar to get_peft_model_state_dict / set_peft_model_state_dict in PEFT. Here is the standard Flower code for getting and setting parameters:

from collections import OrderedDict

import torch
from flwr.common import NDArrays
from peft import get_peft_model_state_dict, set_peft_model_state_dict


def set_parameters(model, parameters: NDArrays) -> None:
    """Change the parameters of the model using the given ones."""
    peft_state_dict_keys = get_peft_model_state_dict(model).keys()
    params_dict = zip(peft_state_dict_keys, parameters)
    state_dict = OrderedDict({k: torch.Tensor(v) for k, v in params_dict})
    set_peft_model_state_dict(model, state_dict)


def get_parameters(model) -> NDArrays:
    """Return the parameters of the current net."""
    state_dict = get_peft_model_state_dict(model)
    return [val.cpu().numpy() for _, val in state_dict.items()]

How do I go about doing this with adapters instead of PEFT?

Any help is appreciated.
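
For reference, below is the kind of thing I was hoping to write with the adapters library. It is only a rough sketch based on the assumption that an adapter's weights can be identified by filtering model.named_parameters() for the adapter name (I have not verified this against the adapters API); the adapter name "my_adapter" and the helper get_adapter_state_dict are placeholders of mine, not library functions.

from collections import OrderedDict

import torch
from flwr.common import NDArrays

ADAPTER_NAME = "my_adapter"  # placeholder: the name passed to model.add_adapter(...)


def get_adapter_state_dict(model, adapter_name: str):
    """Collect only the parameters whose names contain the adapter name.

    Assumption: adapter weights carry the adapter name in their parameter
    names, so a substring filter is enough to isolate them.
    """
    return OrderedDict(
        (name, param)
        for name, param in model.named_parameters()
        if adapter_name in name
    )


def get_parameters(model) -> NDArrays:
    """Return the adapter parameters as a list of NumPy arrays."""
    state_dict = get_adapter_state_dict(model, ADAPTER_NAME)
    return [param.detach().cpu().numpy() for param in state_dict.values()]


def set_parameters(model, parameters: NDArrays) -> None:
    """Overwrite the adapter parameters with the given NumPy arrays."""
    keys = get_adapter_state_dict(model, ADAPTER_NAME).keys()
    new_state_dict = OrderedDict(
        (k, torch.tensor(v)) for k, v in zip(keys, parameters)
    )
    # strict=False leaves all non-adapter weights untouched
    model.load_state_dict(new_state_dict, strict=False)

If there is an official equivalent of get_peft_model_state_dict / set_peft_model_state_dict in adapters, that would be much nicer than filtering parameter names like this.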

Labels: enhancement (New feature or request)