Mask RT-DETR exported model fails to load with TensorRT in C++ deployment #9152
Comments
The error message Conversion to JSON format is not supported indicates that there is an issue with the result type being returned by the model, specifically the SegmentationResult. This suggests that FastDeploy might not be able to serialize the output in the expected JSON format.
Hi, based on the error message you provided, there are several issues:
Hello, I exported the model following the tutorial, and it runs fine in CUDA and CPU modes. The documentation mentions using the --trt flag when exporting RT-DETR series models for TRT; was this part perhaps missed in the Mask-RTDETR export code? The file name corresponds to cuda12.3_cudnn9.0.0_trt8.6.1.6.
Hello, since some tensors in Mask RT-DETR depend on the values of the input tensors (scale_factor and im_shape), when using TRT you may need to ensure that the scale_factor and im_shape values used during shape collection match real data, rather than constructed dummy data.
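To make the "match real data" point concrete, here is a minimal sketch of computing im_shape and scale_factor from an actual image size, in the style of PaddleDetection's resize preprocessing (the function name and the 640x640 target are illustrative assumptions, not the toolkit's API):

```python
def make_extra_inputs(orig_h, orig_w, target_h=640, target_w=640):
    """Compute im_shape and scale_factor the way a fixed-size resize
    step would, so shape-collection runs see realistic values instead
    of dummy ones. Names and defaults here are illustrative only.

    im_shape:     height/width of the resized image fed to the model
    scale_factor: ratio of resized size to original size (y, x)
    """
    scale_y = target_h / orig_h
    scale_x = target_w / orig_w
    im_shape = [float(target_h), float(target_w)]
    scale_factor = [scale_y, scale_x]
    return im_shape, scale_factor

# Example: a 1080x1920 frame resized to the 640x640 model input
im_shape, scale_factor = make_extra_inputs(1080, 1920)
```

Feeding values derived this way from real images (rather than ones filled with dummy constants) during the shape-collection pass keeps the collected ranges consistent with inference-time inputs.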
This may involve how the toolkit's code is used in detail. @liu-jiaxuan, could you take a look?
Hi, after enabling it, did you make sure that the scale_factor and im_shape values used during shape collection match the real data? This must be guaranteed regardless of whether EnableTunedTensorRtDynamicShape() is enabled. We do not yet have example code for this specific problem; you can adjust the parameters based on your data characteristics and the official tutorials.
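One way to sanity-check this requirement is to verify that the values fed at inference time actually fall inside the per-tensor min/max ranges recorded during shape collection. The helper below is a hypothetical validator (the function and data layout are illustrative, not a FastDeploy or Paddle Inference API):

```python
def covers(collected, real):
    """Return True if every real value lies within the [min, max]
    range recorded during shape collection.

    collected: tensor name -> (min_values, max_values)
    real:      tensor name -> values seen at inference time
    """
    for name, values in real.items():
        mins, maxs = collected[name]
        for lo, hi, v in zip(mins, maxs, values):
            if not (lo <= v <= hi):
                return False
    return True

# Illustrative ranges for the two value-dependent input tensors
collected = {
    "im_shape": ([320.0, 320.0], [640.0, 640.0]),
    "scale_factor": ([0.3, 0.3], [1.0, 1.0]),
}
real = {"im_shape": [640.0, 640.0], "scale_factor": [0.59, 0.33]}
covers(collected, real)  # -> True
```

If a real input falls outside the collected ranges, the shape-collection pass should be rerun with data that includes it.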
Issue confirmation / Search before asking
Bug Component
Deploy
Describe the Bug
- Windows
- C++
- Export environment: paddle 3.0b1 / PaddleDetection develop
- Inference environment: Paddle Inference 3.0.0 beta1
Loading the model on CPU: works.
Loading the model on CUDA: works.
Loading the model with TRT fails with the following error:
```
C++ Traceback (most recent call last):
Not support stack backtrace yet.

Error Message Summary:
InvalidArgumentError: paddle::get failed, cannot get value (desc.GetAttr("dim")) by type class std::vector<int,class std::allocator<int> >, its type is class std::vector<__int64,class std::allocator<__int64> >. (at C:\home\workspace\Paddle\paddle\fluid\inference\tensorrt\op_teller.cc:2329)
```
Environment
- Windows
- C++
- Export environment: paddle 3.0b1 / PaddleDetection develop
- Inference environment: Paddle Inference 3.0.0 beta1
Bug description confirmation
Are you willing to submit a PR?