Torch 1.13 - 2.3 Onnx Scope name not correct! #90439
Thanks for reporting. It looks like the discrepancy was due to a fix for Sequential modules being skipped by torch in scope recording; we will need to investigate.

```python
# Run this without patching numerical atom `_unqualified_variable_name`
import io

import onnx
import torch


class MainModule(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.module_list = torch.nn.ModuleList(
            [torch.nn.Linear(10, 10) for _ in range(2)]
        )

    def forward(self, x):
        y = self.module_list[0](x)
        z = self.module_list[1](y)
        return z


module = MainModule()
print(module)

f = io.BytesIO()
torch.onnx.export(module, torch.randn(1, 10), f, verbose=True)
f.seek(0, 0)

onnx_model = onnx.load(f)
for node in onnx_model.graph.node:
    print(node.name)
```
where it should be …
@BowenBao thanks for looking into this. I am getting the following output with the model you suggested: …
I tested:

```python
module = torch.hub.load("intel-isl/MiDaS:f28885af", "MiDaS_small")
input = torch.randn(1, 3, 320, 640)
```

All onnx node names are missing.
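For reference, a self-contained version of this check might look like the following sketch (the in-memory buffer and node-name printing mirror the earlier repro; the hub revision is the one quoted above):

```python
import io

import onnx
import torch

# Load the MiDaS_small model from torch hub, as in the report above.
module = torch.hub.load("intel-isl/MiDaS:f28885af", "MiDaS_small")
module.eval()

f = io.BytesIO()
torch.onnx.export(module, torch.randn(1, 3, 320, 640), f)
f.seek(0, 0)

onnx_model = onnx.load(f)
# Print each node name to see which carry a module scope prefix.
for node in onnx_model.graph.node:
    print(node.name)
```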
@kevalmorabia97 thanks for providing another great example. I took another look, and it turns out pytorch populates the scope name at … This is also the reason why … A more precise solution is probably to store the fully qualified name of modules as the scope name, instead of relying on the pytorch scope tree structure. However, it may have implications on other components.
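For context, the "fully qualified name" here is the dotted path that `nn.Module.named_modules()` yields; a quick illustration, reusing the `MainModule` class from the repro above:

```python
# Prints the fully qualified submodule names:
# <root>, module_list, module_list.0, module_list.1
for name, submodule in MainModule().named_modules():
    print(name or "<root>", "->", type(submodule).__name__)
```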
Thanks @BowenBao for the explanation. Do you have any rough estimate of how much effort the fix requires, and whether it could make it into the next release? I find the pytorch scope name super useful, since it maps onnx nodes back to pytorch modules and gives insight at the pytorch module level.
I will bring this to the attention of our team. We need to figure out if there is enough bandwidth.
@BowenBao any updates on this?
Could you retest this, please?
Still seeing the same issue with torch 2.3.0 on the …
🐛 Describe the bug
In torch 1.13.0, the Onnx Scope name was introduced, replacing the old onnx node name format (`Conv_0`, `Relu_1`, ...) with a new format that captures the pytorch submodule name. But the actual onnx node name may not always correspond to a submodule, and instead needs some further post-processing.

Example:
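A minimal sketch of such an example, assuming torchvision's `resnet18` (the exact code in the original report may have differed):

```python
# Hypothetical reconstruction: export ResNet18 and print the onnx node names.
import io

import onnx
import torch
import torchvision

module = torchvision.models.resnet18()
module.eval()

f = io.BytesIO()
torch.onnx.export(module, torch.randn(1, 3, 224, 224), f)
f.seek(0, 0)

onnx_model = onnx.load(f)
for node in onnx_model.graph.node:
    print(node.name)  # e.g. /layer1/layer1.0/conv1/Conv
```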
Output: …
As we can see here for ResNet18, there is no Conv submodule named `layer1.layer1.0.conv1` in the model, but rather `layer1.0.conv1`. So its onnx node name should be `/layer1/0/conv1/Conv`.

Sidenote: ideally it would have been even more helpful to have the submodule name as the onnx node name, i.e. `layer1.0.conv1` instead of `/layer1/layer1.0/conv1/Conv`.
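Until this is fixed, the duplicated scope atoms can be undone in post-processing. A sketch (the helper name and the exact normalization rule are hypothetical, inferred from the `/layer1/layer1.0/conv1/Conv` pattern above):

```python
# Hypothetical helper: recover a dotted pytorch module path from a buggy
# onnx node name, e.g. "/layer1/layer1.0/conv1/Conv" -> "layer1.0.conv1".
def onnx_node_to_module_name(node_name: str) -> str:
    atoms = node_name.strip("/").split("/")[:-1]  # drop the op type ("Conv")
    parts: list[str] = []
    for atom in atoms:
        # An atom may itself be a dotted qualified name (e.g. "layer1.0");
        # skip pieces that merely repeat the previously accumulated piece.
        for piece in atom.split("."):
            if not parts or piece != parts[-1]:
                parts.append(piece)
    return ".".join(parts)


assert onnx_node_to_module_name("/layer1/layer1.0/conv1/Conv") == "layer1.0.conv1"
```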
.Versions