
Shape inference fails in Concat layer with dynamic batch #7445

@gcunhase

Description


Bug Report

Is the issue related to model conversion?

No, this is regarding shape inference.

Describe the bug

Shape inference fails with strict_mode=True for models whose Concat layer has one non-static dimension.


The expectation is that, as long as the concatenation axis is static, inference should succeed; in practice, however, it only succeeds when all dimensions are static.

System information

  • OS Platform and Distribution: Linux Ubuntu 24.04
  • ONNX version: 1.19.0
  • Python version: 3.12.3

Reproduction instructions

  1. Download models: models.zip
  2. Run script:
import onnx

onnx_path = "<path to ONNX model>.onnx"  # model_dynamic.onnx or model_static.onnx
onnx_model = onnx.load(onnx_path)

# Raises for model_dynamic.onnx, succeeds for model_static.onnx
onnx_model_infer = onnx.shape_inference.infer_shapes(onnx_model, strict_mode=True)

Observed behavior

Shape inference on model_dynamic.onnx fails with the following error:

Shapes inference failed in strict mode: [ShapeInferenceError] Inference error(s): (op_type:Reshape, node name: Reshape): [ShapeInferenceError] Inferred shape and existing shape differ in dimension 0: (0) vs (-1)

Shape inference on model_static.onnx succeeds.

Expected behavior

Shape inference succeeds for both models.

Notes

Solving #7100 (adding an infer_types function) could serve as a temporary workaround (WAR) for my use case, but it wouldn't be a permanent solution.
