I successfully converted a model from PyTorch to ONNX. However, when I try to convert the ONNX model, which contains a Slice operator, to OpenVINO, I encounter the following error:
OpenVINO runtime found in: C:\Users\zhizhin\PycharmProjects\gtrcn_model_wenv\Lim_narabotki\venv\lib\site-packages\openvino
OpenVINO runtime version: 2022.1.0-7019-cdb9bec7210-releases/2022/1
Model Optimizer version: 2022.1.0-7019-cdb9bec7210-releases/2022/1
[ ERROR ] -------------------------------------------------
[ ERROR ] ----------------- INTERNAL ERROR ----------------
[ ERROR ] Unexpected exception happened.
[ ERROR ] Please contact Model Optimizer developers and forward the following information:
[ ERROR ] Check 'node->visit_attributes(visitor)' failed at C:\j\workspace\private-ci\ie\build-windows-vs2019@3\b\repos\openvino\src\core\src\pass\serialize.cpp:926:
Visitor API is not supported in v0::NullNode NullNode_9 () -> (dynamic...)
[ ERROR ] Traceback (most recent call last):
File "C:\Users\zhizhin\PycharmProjects\gtrcn_model_wenv\Lim_narabotki\venv\Lib\site-packages\openvino\tools\mo\main.py", line 533, in main
ret_code = driver(argv)
File "C:\Users\zhizhin\PycharmProjects\gtrcn_model_wenv\Lim_narabotki\venv\Lib\site-packages\openvino\tools\mo\main.py", line 493, in driver
ret_res = moc_emit_ir(ngraph_function, argv)
File "C:\Users\zhizhin\PycharmProjects\gtrcn_model_wenv\Lim_narabotki\venv\lib\site-packages\openvino\tools\mo\moc_frontend\serialize.py", line 44, in moc_emit_ir
serialize(ngraph_function, (orig_model_name + ".xml").encode('utf-8'), (orig_model_name + ".bin").encode('utf-8'))
RuntimeError: Check 'node->visit_attributes(visitor)' failed at C:\j\workspace\private-ci\ie\build-windows-vs2019@3\b\repos\openvino\src\core\src\pass\serialize.cpp:926:
Visitor API is not supported in v0::NullNode NullNode_9 () -> (dynamic...)
I also tried exporting my Torch model to ONNX with the "dynamic_axes" parameter, but I received a similar error related to the Slice operator.
I hope you can answer my question or suggest a workaround. I need to convert the model with OpenVINO version 2022.1.0 specifically.
Hope to hear from you soon!
Version of ONNX: 1.16.2
Version of Torch: 2.1.2+cu118
Version of OpenVINO and OpenVINO-dev: 2022.1.0
Step-by-step reproduction
import math

import torch
import torch.nn as nn
import onnx
from onnxsim import simplify


class DummyModel(nn.Module):
    def __init__(self, emb_dim, emb_ks, emb_hs, n_head=4):
        super().__init__()
        self.emb_dim = emb_dim
        self.emb_ks = emb_ks
        self.emb_hs = emb_hs
        self.n_head = n_head

    def __getitem__(self, item):
        return getattr(self, item)

    def forward(self, x):
        B, C, old_T, old_F = x.shape
        # Pad T and F up so that (length - emb_ks) is a multiple of emb_hs.
        T = math.ceil((old_T - self.emb_ks) / self.emb_hs) * self.emb_hs + self.emb_ks
        F = math.ceil((old_F - self.emb_ks) / self.emb_hs) * self.emb_hs + self.emb_ks
        x = nn.functional.pad(x, (0, F - old_F, 0, T - old_T))
        # Slice the padded frequency dimension back to its original size.
        inter_rnn = x[:, :, :, :old_F]
        return inter_rnn


stream_model = DummyModel(emb_dim=64, emb_hs=2, emb_ks=4)

noisy = torch.randn(1, 64, 100, 65)  # example input (B, C, T, F); not defined in the original snippet

input_names = ["inp_noisy"]
output_names = ["out_enh"]
onnx_name = "Dummy_Model.onnx"

torch.onnx.export(
    stream_model,
    args=(noisy,),
    input_names=input_names,
    output_names=output_names,
    f=onnx_name,
    verbose=False,
    opset_version=13,
)

onnx_model = onnx.load(onnx_name)
onnx.checker.check_model(onnx_model)

model_simp, check = simplify(onnx_model)
assert check, "Simplified ONNX model could not be validated"
onnx.save(model_simp, "simplified_Dummy_Model.onnx")
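For clarity, the padding arithmetic in forward can be checked in isolation (pure Python, with the emb_ks/emb_hs values from the reproduction above):

```python
import math

# Mirrors the padding arithmetic in DummyModel.forward: the padded length is
# the smallest value >= old such that (length - emb_ks) is a multiple of emb_hs.
def padded_len(old, emb_ks, emb_hs):
    return math.ceil((old - emb_ks) / emb_hs) * emb_hs + emb_ks

# With emb_ks=4, emb_hs=2 (the values in the reproduction):
print(padded_len(100, 4, 2))  # 100 -> already aligned, no time padding
print(padded_len(65, 4, 2))   # 66  -> one extra frequency bin is padded on,
                              #        then sliced back off by x[:, :, :, :old_F]
```

It is exactly that trailing slice back to old_F that becomes the ONNX Slice operator in the exported graph.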
After conversion to ONNX, I am trying to convert my model to OpenVINO with: mo --input_model "simplified_Dummy_Model.onnx" --output_dir "path_to_output_dir"
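As an aside, one workaround that sometimes helps with NullNode serialization failures in older Model Optimizer builds is to pin the model to a static input shape at conversion time, so the Slice bounds become constants. The shape below is illustrative; --input_shape is a standard MO 2022.1 option:

```shell
mo --input_model "simplified_Dummy_Model.onnx" --input_shape "[1,64,100,65]" --output_dir "path_to_output_dir"
```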
ONNX graph: (image attached)
Issue submission checklist
- I'm reporting an issue. It's not a question.
- I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
- There is reproducer code and related data files such as images, videos, models, etc.
ndzhizhin changed the title to "[Bug]: ONNX Slice Operator Fails During OpenVINO Conversion" on Jan 16, 2025.
Your version of OpenVINO is too old; I am not sure we can support it. Is it possible for you to move to the latest release (2024.6.0)? It provides direct PyTorch model conversion and multiple important fixes and features for ONNX model conversion. Check our documentation.
OpenVINO Version: 2022.1.0
Operating System: Windows
Device used for inference: CPU
Framework: ONNX
Model used: No response