Support safetensors for PEFT adapters or raise an exception when safe_serialization=True is specified #546

Description

@crowsonkb

I saved my LoRA weights with `model.save_pretrained(path, safe_serialization=True)` and expected the output to actually be safe, but it was saved in PyTorch pickle format instead (I unzipped `adapter_model.bin` to check, and it contained a pickle). Silently writing an unsafe checkpoint when safe serialization was requested is dangerous: people will share and load checkpoints under the false assumption that they cannot execute arbitrary code. Please either support safetensors for PEFT adapter weights (greatly preferred!) or raise an exception (or at least print a warning) when `safe_serialization=True` is requested, so that people do not mistake unsafe checkpoints for safe ones.
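For anyone who wants to reproduce the check or needs a stopgap, here is a minimal sketch. It assumes `model` is the PEFT-wrapped model from above and that the installed `peft` version exports `get_peft_model_state_dict`; it verifies that the saved `adapter_model.bin` is a torch pickle archive, then saves the adapter weights with safetensors directly, bypassing `save_pretrained`:

```python
import zipfile

from peft import get_peft_model_state_dict
from safetensors.torch import save_file

# torch.save has written a zip archive containing a data.pkl entry
# since PyTorch 1.6, so the presence of data.pkl confirms pickle format.
with zipfile.ZipFile("adapter_model.bin") as archive:
    assert any(name.endswith("data.pkl") for name in archive.namelist())

# Extract only the adapter (LoRA) weights from the PEFT-wrapped model.
state_dict = get_peft_model_state_dict(model)

# safetensors requires contiguous CPU tensors.
state_dict = {k: v.detach().cpu().contiguous() for k, v in state_dict.items()}

save_file(state_dict, "adapter_model.safetensors")
```

To load the weights back, the reverse should work with `safetensors.torch.load_file` plus `set_peft_model_state_dict` (again assuming your `peft` version exports it), but proper `safe_serialization` support in `save_pretrained` would make this unnecessary.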
