
[Bug]: ONNX Slice Operator Fails During OpenVINO Conversion #28485

Open
3 tasks done
ndzhizhin opened this issue Jan 16, 2025 · 1 comment
Assignees
Labels
bug Something isn't working support_request

Comments


ndzhizhin commented Jan 16, 2025

OpenVINO Version

2022.1.0

Operating System

Windows System

Device used for inference

CPU

Framework

ONNX

Model used

No response

Issue description

Hello!

I successfully converted a model from PyTorch to ONNX, and I am now trying to convert the ONNX model to OpenVINO. My ONNX model contains a Slice operator, and the conversion fails with the following error:

OpenVINO runtime found in:      C:\Users\zhizhin\PycharmProjects\gtrcn_model_wenv\Lim_narabotki\venv\lib\site-packages\openvino
OpenVINO runtime version:       2022.1.0-7019-cdb9bec7210-releases/2022/1
Model Optimizer version:        2022.1.0-7019-cdb9bec7210-releases/2022/1
[ ERROR ]  -------------------------------------------------
[ ERROR ]  ----------------- INTERNAL ERROR ----------------
[ ERROR ]  Unexpected exception happened.
[ ERROR ]  Please contact Model Optimizer developers and forward the following information:
[ ERROR ]  Check 'node->visit_attributes(visitor)' failed at C:\j\workspace\private-ci\ie\build-windows-vs2019@3\b\repos\openvino\src\core\src\pass\serialize.cpp:926:
Visitor API is not supported in v0::NullNode NullNode_9 () -> (dynamic...)

[ ERROR ]  Traceback (most recent call last):
  File "C:\Users\zhizhin\PycharmProjects\gtrcn_model_wenv\Lim_narabotki\venv\Lib\site-packages\openvino\tools\mo\main.py", line 533, in main
    ret_code = driver(argv)
  File "C:\Users\zhizhin\PycharmProjects\gtrcn_model_wenv\Lim_narabotki\venv\Lib\site-packages\openvino\tools\mo\main.py", line 493, in driver
    ret_res = moc_emit_ir(ngraph_function, argv)
  File "C:\Users\zhizhin\PycharmProjects\gtrcn_model_wenv\Lim_narabotki\venv\lib\site-packages\openvino\tools\mo\moc_frontend\serialize.py", line 44, in moc_emit_ir
    serialize(ngraph_function, (orig_model_name + ".xml").encode('utf-8'), (orig_model_name + ".bin").encode('utf-8'))
RuntimeError: Check 'node->visit_attributes(visitor)' failed at C:\j\workspace\private-ci\ie\build-windows-vs2019@3\b\repos\openvino\src\core\src\pass\serialize.cpp:926:
Visitor API is not supported in v0::NullNode NullNode_9 () -> (dynamic...)

I also tried exporting my Torch model to ONNX with the "dynamic_axes" parameter, but I received a similar error about the Slice operator.

I hope you can answer my question or suggest a workaround. I need to convert the model with OpenVINO version 2022.1.0 specifically.
Hope to hear from you soon!

Version of ONNX: 1.16.2
Version of Torch: 2.1.2+cu118
Version of OpenVINO and OpenVINO-dev: 2022.1.0
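For reference, the padding arithmetic in the reproducer below can be checked in plain Python without torch. The sizes 100 and 65 are only illustrative assumptions (the issue does not state an input shape); with emb_ks=4 and emb_hs=2, a frequency dimension of 65 is padded up to 66, and the forward pass then slices the tensor back to the original 65. It is this trailing slice (x[..., :old_F]) that becomes the ONNX Slice operator in question.

```python
import math

def padded_size(old, emb_ks, emb_hs):
    """Size after padding so that (size - emb_ks) is a multiple of emb_hs."""
    return math.ceil((old - emb_ks) / emb_hs) * emb_hs + emb_ks

# Illustrative sizes only; the issue does not give the real input shape.
print(padded_size(100, 4, 2))  # -> 100 (no padding needed along time)
print(padded_size(65, 4, 2))   # -> 66 (one frame of frequency padding)
```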

Step-by-step reproduction


import math
import torch
import torch.nn as nn
import onnx
from onnxsim import simplify


class DummyModel(nn.Module):
    def __init__(self, emb_dim, emb_ks, emb_hs, n_head=4):
        super().__init__()
        self.emb_dim = emb_dim
        self.emb_ks = emb_ks
        self.emb_hs = emb_hs
        self.n_head = n_head

    def __getitem__(self, item):
        return getattr(self, item)

    def forward(self, x):
        B, C, old_T, old_F = x.shape
        # Pad T and F up to the next size compatible with the kernel/hop sizes.
        T = math.ceil((old_T - self.emb_ks) / self.emb_hs) * self.emb_hs + self.emb_ks
        F = math.ceil((old_F - self.emb_ks) / self.emb_hs) * self.emb_hs + self.emb_ks
        x = nn.functional.pad(x, (0, F - old_F, 0, T - old_T))

        # This trailing slice is exported as the ONNX Slice operator.
        inter_rnn = x[:, :, :, :old_F]
        return inter_rnn


stream_model = DummyModel(emb_dim=64, emb_hs=2, emb_ks=4)

# Example input; the exact shape is not stated in the issue.
noisy = torch.randn(1, 64, 100, 65)

input_names = ["inp_noisy"]
output_names = ["out_enh"]

onnx_name = "Dummy_Model.onnx"
torch.onnx.export(
    stream_model,
    args=(noisy,),
    input_names=input_names,
    output_names=output_names,
    f=onnx_name,
    verbose=False,
    opset_version=13,
)

onnx_model = onnx.load(onnx_name)
onnx.checker.check_model(onnx_model)

model_simp, check = simplify(onnx_model)
assert check, "Simplified ONNX model could not be validated"
onnx.save(model_simp, "simplified_Dummy_Model.onnx")

After conversion to ONNX, I am trying to convert my model to OpenVINO with: mo --input_model "simplified_Dummy_Model.onnx" --output_dir "path_to_output_dir"

ONNX GRAPH

(screenshot of the exported graph attached in the original issue)

Relevant log output

Issue submission checklist

  • I'm reporting an issue. It's not a question.
  • I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
  • There is reproducer code and related data files such as images, videos, models, etc.
@ndzhizhin ndzhizhin added bug Something isn't working support_request labels Jan 16, 2025
@ndzhizhin ndzhizhin changed the title [Bug]: [Bug]: Conversion of a model from ONNX with a Slice operator to OpenVINO Jan 16, 2025
@ndzhizhin ndzhizhin changed the title [Bug]: Conversion of a model from ONNX with a Slice operator to OpenVINO [Bug]: Conversion Issue: ONNX Slice Operator to OpenVINO Jan 16, 2025
@ndzhizhin ndzhizhin changed the title [Bug]: Conversion Issue: ONNX Slice Operator to OpenVINO [Bug]: ONNX Slice Operator Fails During OpenVINO Conversion Jan 16, 2025
@rkazants
Member

rkazants commented Jan 17, 2025

Hi @ndzhizhin,

Your version of OpenVINO is too old, and I am not sure we can support it. Is it possible for you to move to the latest one (2024.6.0), where we provide direct PyTorch model conversion and multiple important fixes and features for ONNX model conversion? Check our documentation:

  1. direct conversion of PyTorch models, with no need for the ONNX format: https://docs.openvino.ai/2024/openvino-workflow/model-preparation/convert-model-pytorch.html
  2. conversion of ONNX models: https://docs.openvino.ai/2024/openvino-workflow/model-preparation/convert-model-onnx.html

The new conversion API is more user-friendly and simplified. Note that newly generated IRs are supported only by the new inference API, which is also more convenient and refined: https://docs.openvino.ai/2024/learn-openvino/openvino-samples/hello-classification.html

Best regards,
Roman

@rkazants rkazants self-assigned this Jan 17, 2025