
[Bug]: ModelPatchLoader errors with the new Z-Image-Turbo-Fun-Controlnet-Union-2.0.safetensors #1487

@ne7359

Description


App Version

ModelPatchLoader

Error(s) in loading state_dict for ZImage_Control:
Unexpected key(s) in state_dict: "control_layers.10.adaLN_modulation.0.bias", "control_layers.10.adaLN_modulation.0.weight", "control_layers.10.after_proj.bias", "control_layers.10.after_proj.weight", "control_layers.10.attention.k_norm.weight", "control_layers.10.attention.q_norm.weight", "control_layers.10.attention.out.weight", "control_layers.10.attention.qkv.weight", "control_layers.10.attention_norm1.weight", "control_layers.10.attention_norm2.weight", "control_layers.10.feed_forward.w1.weight", "control_layers.10.feed_forward.w2.weight", "control_layers.10.feed_forward.w3.weight", "control_layers.10.ffn_norm1.weight", "control_layers.10.ffn_norm2.weight", "control_layers.11.adaLN_modulation.0.bias", "control_layers.11.adaLN_modulation.0.weight", "control_layers.11.after_proj.bias", "control_layers.11.after_proj.weight", "control_layers.11.attention.k_norm.weight", "control_layers.11.attention.q_norm.weight", "control_layers.11.attention.out.weight", "control_layers.11.attention.qkv.weight", "control_layers.11.attention_norm1.weight", "control_layers.11.attention_norm2.weight", "control_layers.11.feed_forward.w1.weight", "control_layers.11.feed_forward.w2.weight", "control_layers.11.feed_forward.w3.weight", "control_layers.11.ffn_norm1.weight", "control_layers.11.ffn_norm2.weight", "control_layers.12.adaLN_modulation.0.bias", "control_layers.12.adaLN_modulation.0.weight", "control_layers.12.after_proj.bias", "control_layers.12.after_proj.weight", "control_layers.12.attention.k_norm.weight", "control_layers.12.attention.q_norm.weight", "control_layers.12.attention.out.weight", "control_layers.12.attention.qkv.weight", "control_layers.12.attention_norm1.weight", "control_layers.12.attention_norm2.weight", "control_layers.12.feed_forward.w1.weight", "control_layers.12.feed_forward.w2.weight", "control_layers.12.feed_forward.w3.weight", "control_layers.12.ffn_norm1.weight", "control_layers.12.ffn_norm2.weight", "control_layers.13.adaLN_modulation.0.bias", 
"control_layers.13.adaLN_modulation.0.weight", "control_layers.13.after_proj.bias", "control_layers.13.after_proj.weight", "control_layers.13.attention.k_norm.weight", "control_layers.13.attention.q_norm.weight", "control_layers.13.attention.out.weight", "control_layers.13.attention.qkv.weight", "control_layers.13.attention_norm1.weight", "control_layers.13.attention_norm2.weight", "control_layers.13.feed_forward.w1.weight", "control_layers.13.feed_forward.w2.weight", "control_layers.13.feed_forward.w3.weight", "control_layers.13.ffn_norm1.weight", "control_layers.13.ffn_norm2.weight", "control_layers.14.adaLN_modulation.0.bias", "control_layers.14.adaLN_modulation.0.weight", "control_layers.14.after_proj.bias", "control_layers.14.after_proj.weight", "control_layers.14.attention.k_norm.weight", "control_layers.14.attention.q_norm.weight", "control_layers.14.attention.out.weight", "control_layers.14.attention.qkv.weight", "control_layers.14.attention_norm1.weight", "control_layers.14.attention_norm2.weight", "control_layers.14.feed_forward.w1.weight", "control_layers.14.feed_forward.w2.weight", "control_layers.14.feed_forward.w3.weight", "control_layers.14.ffn_norm1.weight", "control_layers.14.ffn_norm2.weight", "control_layers.6.adaLN_modulation.0.bias", "control_layers.6.adaLN_modulation.0.weight", "control_layers.6.after_proj.bias", "control_layers.6.after_proj.weight", "control_layers.6.attention.k_norm.weight", "control_layers.6.attention.q_norm.weight", "control_layers.6.attention.out.weight", "control_layers.6.attention.qkv.weight", "control_layers.6.attention_norm1.weight", "control_layers.6.attention_norm2.weight", "control_layers.6.feed_forward.w1.weight", "control_layers.6.feed_forward.w2.weight", "control_layers.6.feed_forward.w3.weight", "control_layers.6.ffn_norm1.weight", "control_layers.6.ffn_norm2.weight", "control_layers.7.adaLN_modulation.0.bias", "control_layers.7.adaLN_modulation.0.weight", "control_layers.7.after_proj.bias", 
"control_layers.7.after_proj.weight", "control_layers.7.attention.k_norm.weight", "control_layers.7.attention.q_norm.weight", "control_layers.7.attention.out.weight", "control_layers.7.attention.qkv.weight", "control_layers.7.attention_norm1.weight", "control_layers.7.attention_norm2.weight", "control_layers.7.feed_forward.w1.weight", "control_layers.7.feed_forward.w2.weight", "control_layers.7.feed_forward.w3.weight", "control_layers.7.ffn_norm1.weight", "control_layers.7.ffn_norm2.weight", "control_layers.8.adaLN_modulation.0.bias", "control_layers.8.adaLN_modulation.0.weight", "control_layers.8.after_proj.bias", "control_layers.8.after_proj.weight", "control_layers.8.attention.k_norm.weight", "control_layers.8.attention.q_norm.weight", "control_layers.8.attention.out.weight", "control_layers.8.attention.qkv.weight", "control_layers.8.attention_norm1.weight", "control_layers.8.attention_norm2.weight", "control_layers.8.feed_forward.w1.weight", "control_layers.8.feed_forward.w2.weight", "control_layers.8.feed_forward.w3.weight", "control_layers.8.ffn_norm1.weight", "control_layers.8.ffn_norm2.weight", "control_layers.9.adaLN_modulation.0.bias", "control_layers.9.adaLN_modulation.0.weight", "control_layers.9.after_proj.bias", "control_layers.9.after_proj.weight", "control_layers.9.attention.k_norm.weight", "control_layers.9.attention.q_norm.weight", "control_layers.9.attention.out.weight", "control_layers.9.attention.qkv.weight", "control_layers.9.attention_norm1.weight", "control_layers.9.attention_norm2.weight", "control_layers.9.feed_forward.w1.weight", "control_layers.9.feed_forward.w2.weight", "control_layers.9.feed_forward.w3.weight", "control_layers.9.ffn_norm1.weight", "control_layers.9.ffn_norm2.weight", "control_noise_refiner.0.after_proj.bias", "control_noise_refiner.0.after_proj.weight", "control_noise_refiner.0.before_proj.bias", "control_noise_refiner.0.before_proj.weight", "control_noise_refiner.1.after_proj.bias", 
"control_noise_refiner.1.after_proj.weight".
size mismatch for control_all_x_embedder.2-1.weight: copying a param with shape torch.Size([3840, 132]) from checkpoint, the shape in current model is torch.Size([3840, 64]).
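The log reports two distinct problems: keys present in the checkpoint but absent from the current model (extra `control_layers.6`–`control_layers.14` blocks and `control_noise_refiner` projections), and a shape mismatch in `control_all_x_embedder.2-1.weight` (132 input channels in the checkpoint vs. 64 in the model), which together suggest the loaded code expects an older, smaller ControlNet layout. A minimal sketch of how such a mismatch can be diagnosed by comparing key/shape maps (the shapes below are hypothetical stand-ins for what `model.state_dict()` and the safetensors file would actually report):

```python
# Sketch: diff two state_dict shape maps to surface unexpected keys,
# missing keys, and size mismatches, mirroring PyTorch's load error.

def diff_state_dicts(model_shapes, ckpt_shapes):
    """Return (unexpected keys, missing keys, {key: (ckpt, model)} mismatches)."""
    unexpected = sorted(set(ckpt_shapes) - set(model_shapes))
    missing = sorted(set(model_shapes) - set(ckpt_shapes))
    mismatched = {
        k: (ckpt_shapes[k], model_shapes[k])
        for k in set(model_shapes) & set(ckpt_shapes)
        if ckpt_shapes[k] != model_shapes[k]
    }
    return unexpected, missing, mismatched

# Hypothetical shapes mirroring the error above: the current model stops at
# control_layers.5 and uses a 64-channel embedder, while the 2.0 checkpoint
# ships layers 6..14 and a 132-channel embedder.
model = {
    "control_all_x_embedder.2-1.weight": (3840, 64),
    "control_layers.5.attention.qkv.weight": (5760, 1920),
}
ckpt = {
    "control_all_x_embedder.2-1.weight": (3840, 132),
    "control_layers.5.attention.qkv.weight": (5760, 1920),
    "control_layers.6.attention.qkv.weight": (5760, 1920),
}

unexpected, missing, mismatched = diff_state_dicts(model, ckpt)
print("Unexpected:", unexpected)
print("Missing:", missing)
print("Mismatched:", mismatched)
```

If the checkpoint keys look like this diff, the fix is usually on the code side (a model definition updated for the 2.0 ControlNet), not the file.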

Expected Behavior

Z-Image-Turbo-Fun-Controlnet-Union-2.0.safetensors

Actual Behavior

MacBook Pro
Chip: Apple M4 Pro
Memory: 24 GB

Steps to Reproduce

macOS Tahoe 26.2

Debug Logs

Browser Logs

No response

Settings JSON

No response

Other

No response
