
🐞 Unspecified behavior of output names when exporting to ONNX #3024

@Matthew-DgNg

Description

Describe the bug

The ONNX conversion produces different results depending on how the same export code is run: once from source code and once from a packaged .exe. The .exe run yields different output names, which makes it difficult for me to control the outputs. I expected the output to be named "anomaly_map", but instead it is named "onnx::Mul..".

(Image attached.)
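For reference, a minimal way to inspect the exported graph's output names and compare the source-code run against the .exe run (the model path below is only an assumption about where the export lands under the export root; adjust it to your setup):

import onnx

# Hypothetical location of the exported model under the export root; adjust as needed.
exported = onnx.load("f32/weights/onnx/model.onnx")
print([output.name for output in exported.graph.output])
# The source-code run is expected to print ['anomaly_map', ...];
# the .exe run instead prints auto-generated names such as 'onnx::Mul_...'.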

Dataset

N/A

Model

N/A

Steps to reproduce the behavior

from anomalib.deploy import ExportType
from anomalib.engine import Engine
from anomalib.models import AnomalibModule

def anomaly2onnx(engine: Engine, model: AnomalibModule, imgWidth: int, imgHeight: int, out_dir: str | None = None):
    path_onnx = out_dir
    engine.export(
        model=model.to("cuda"),
        # Note: "\f" inside an f-string is a form-feed escape; the backslash must be doubled
        # (or the path built with pathlib/os.path.join) to get the intended ".../f32" directory.
        export_root=f"{path_onnx}\\f32",
        input_size=(imgHeight, imgWidth),  # adjust based on your needs
        export_type=ExportType.ONNX,
    )
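As a stop-gap while the behavior is unspecified (this is not an anomalib API, just a sketch that assumes only the first graph output needs renaming and that the model sits at the hypothetical path below), the auto-generated output name can be rewritten back to "anomaly_map" after export:

import onnx

model_path = "f32/weights/onnx/model.onnx"  # hypothetical export location
m = onnx.load(model_path)
old_name = m.graph.output[0].name           # e.g. "onnx::Mul_123" in the .exe run
m.graph.output[0].name = "anomaly_map"
# Rewire the node that produced the old name so the graph stays consistent.
for node in m.graph.node:
    for i, out in enumerate(node.output):
        if out == old_name:
            node.output[i] = "anomaly_map"
onnx.save(m, model_path)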

OS information

  • Python version: 3.10.0
  • Anomalib version: 1.1.1
  • PyTorch version: 2.8.0+cu128
  • GPU models and configuration: [e.g. 2x GeForce RTX 5060Ti]
  • Any other relevant information: [e.g. I'm using a custom dataset]

Expected behavior

The exported ONNX model should keep the documented output name ("anomaly_map") whether the export runs from source or from a packaged .exe, rather than falling back to auto-generated names such as "onnx::Mul..".

Screenshots

No response

Pip/GitHub

GitHub

What version/branch did you use?

main

Configuration YAML

None

Logs

None

Code of Conduct

  • I agree to follow this project's Code of Conduct
