Export to ONNX - fail to load converted model with ONNX #10489
Unanswered
GotG asked this question in code help: CV
Replies: 1 comment 1 reply
-
can you share the error? or a minimal reproducible script to check the issue? |
-
Hello, I am trying to convert the FlowNetS model from https://github.com/hmorimitsu/ptlflow to ONNX.
The export succeeds (I only get the warning below).
However, I do not know how to run inference on the converted model with ONNX Runtime.
Notebook:
https://colab.research.google.com/drive/11WtAyF6Rb5E6NaTQa3IqydUQXBEYsZQL?usp=sharing
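For anyone hitting the same wall: a minimal sketch of loading an exported model and feeding it dummy inputs with `onnxruntime`. The model path `flownets.onnx` and the batch default are assumptions, and FlowNetS expects an image pair, so check `session.get_inputs()` for the actual input names and shapes your export produced.

```python
import numpy as np

def resolve_shape(shape, default=1):
    """Replace dynamic/symbolic ONNX dimensions (strings, None, -1)
    with a concrete default so we can allocate a dummy tensor."""
    return [d if isinstance(d, int) and d > 0 else default for d in shape]

def run_onnx(model_path, default_dim=1):
    """Load an exported ONNX model and run it once on random inputs.

    The model path is an assumption; pass the file your export wrote.
    """
    import onnxruntime as ort  # local import: helper above works without it

    session = ort.InferenceSession(
        model_path, providers=["CPUExecutionProvider"]
    )
    # Inspect what the exported graph actually expects.
    for inp in session.get_inputs():
        print(inp.name, inp.shape, inp.type)

    # Build float32 dummy inputs matching the reported (resolved) shapes.
    feeds = {
        inp.name: np.random.rand(
            *resolve_shape(inp.shape, default_dim)
        ).astype(np.float32)
        for inp in session.get_inputs()
    }
    return session.run(None, feeds)

# Example (hypothetical filename):
# outputs = run_onnx("flownets.onnx")
# print([o.shape for o in outputs])
```

If `InferenceSession` itself fails to load the file, the error message usually names the unsupported operator or opset; sharing that message here would help narrow the issue down.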