
Ascend 310P3 cannot use dynamic shape #10583

Open
ly303550688 opened this issue Oct 31, 2024 · 1 comment

ly303550688 commented Oct 31, 2024

The model is the PaddleNLP text correction (CSC) model. The config code is below; if the dynamic shape setting is removed, the model runs normally.

# Snippet taken from inside a class method; imports are assumed to be:
#   import os
#   from paddlelite.lite import CxxConfig, Place, TargetType, PrecisionType, create_paddle_predictor
config = CxxConfig()
current_dir = os.getcwd()
config.set_model_file("model/model.pdmodel")
config.set_param_file("model/model.pdiparams")
places = [
    Place(TargetType.NNAdapter, PrecisionType.INT8),
    Place(TargetType.NNAdapter, PrecisionType.FP16),
    Place(TargetType.ARM, PrecisionType.INT8),
    Place(TargetType.ARM, PrecisionType.FP32),
]
config.set_nnadapter_device_names(["huawei_ascend_npu"])
# Removing this dynamic shape setting lets the model run normally.
config.set_nnadapter_dynamic_shape_info(
    {"input_ids": [[-1, -1]], "pinyin_ids": [[-1, -1]]}
)
config.set_nnadapter_context_properties(
    "HUAWEI_ASCEND_NPU_SELECTED_DEVICE_IDS=0;"
    + f"HUAWEI_ASCEND_NPU_DUMP_MODEL_FILE_PATH={current_dir}/model/cache/;"
    + "HUAWEI_ASCEND_NPU_ENABLE_DYNAMIC_SHAPE_RANGE=true;"
    + "HUAWEI_ASCEND_NPU_PRECISION_MODE=allow_fp32_to_fp16;"
    + "HUAWEI_ASCEND_NPU_OP_SELECT_IMPL_MODE=high_precision_for_all;"
)
config.set_nnadapter_model_cache_dir(
    os.path.join(current_dir, "model", "cache")
)
config.set_valid_places(places)
self.predictor = create_paddle_predictor(config)
self.predictor.save_optimized_pb_model(config.nnadapter_model_cache_dir())

Error message:
[W 10/31 23:18: 0.423 .../src/driver/huawei_ascend_npu/utility.cc:57 InitializeAscendCL] CANN version mismatch. The build version is 0.0.0, but the current environment version is 8.0.1.
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:42 Context] properties: HUAWEI_ASCEND_NPU_SELECTED_DEVICE_IDS=0;HUAWEI_ASCEND_NPU_DUMP_MODEL_FILE_PATH=/work/source/paddle/csc/model/cache/;HUAWEI_ASCEND_NPU_ENABLE_DYNAMIC_SHAPE_RANGE=true;HUAWEI_ASCEND_NPU_PRECISION_MODE=allow_fp32_to_fp16;HUAWEI_ASCEND_NPU_OP_SELECT_IMPL_MODE=high_precision_for_all;
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:67 Context] selected device ids:
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:69 Context] 0
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:79 Context] profiling path:
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:89 Context] dump model path: /work/source/paddle/csc/model/cache/
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:99 Context] precision mode: allow_fp32_to_fp16
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:121 Context] op select impl mode: high_precision_for_all
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:131 Context] op type list for impl mode:
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:141 Context] enable compressw weight:
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:151 Context] auto tune mode:
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:161 Context] enable dynamic shape range: true
[I 10/31 23:18: 0.423 ...r/src/driver/huawei_ascend_npu/engine.cc:177 Context] initial buffer length of dynamic shape range: -1
[W 10/31 23:18: 0.423 ...ter/nnadapter/src/runtime/compilation.cc:334 Finish] Warning: Failed to create a program, No model and cache is provided.
[W 10/31 23:18: 0.423 ...le-lite/lite/kernels/nnadapter/engine.cc:149 LoadFromCache] Warning: Build model failed(3) !
[I 10/31 23:18: 0.423 ...r/nnadapter/src/operation/elementwise.cc:63 operator()] Cannot broadcast input0: -1, input1: 0
[I 10/31 23:18: 0.423 ...r/nnadapter/src/operation/elementwise.cc:63 operator()] Cannot broadcast input0: -1, input1: 0
[W 10/31 23:18: 0.444 ...nnadapter/nnadapter/src/runtime/model.cc:86 GetSupportedOperations] Warning: Failed to get the supported operations for device 'huawei_ascend_npu', because the HAL interface 'validate_program' is not implemented!
[W 10/31 23:18: 0.444 ...kernels/nnadapter/converter/converter.cc:171 Apply] Warning: Failed to get the supported operations for the selected devices, one or more of the selected devices are not supported!
[I 10/31 23:18: 0.444 ...r/src/driver/huawei_ascend_npu/driver.cc:70 CreateProgram] Create program for huawei_ascend_npu.
[2024-10-31-15:18:08.497.916]1111769 WARNING: Option input_shape_range is deprecated and will be removed in future version,please use input_shape instead
[2024-10-31-15:18:08.721.473]1111769 WARNING: Option input_shape_range is deprecated and will be removed in future version,please use input_shape instead
[2024-10-31-15:18:08.776.681]1111769 WARNING: Option input_shape_range is deprecated and will be removed in future version,please use input_shape instead
[2024-10-31-15:18:08.831.448]1111769 WARNING: Option input_shape_range is deprecated and will be removed in future version,please use input_shape instead
[F 10/31 23:18: 8.721 .../src/driver/huawei_ascend_npu/utility.cc:315 BuildOMModelToBuffer] Check failed: (reinterpret_castge::graphStatus(aclgrphBuildModel(ir_graph, options, om_buffer)) == ge::GRAPH_SUCCESS): 1343266818!==0 1343266818 Unknown ATC error code(1343266818)
[F 10/31 23:18: 8.721 .../src/driver/huawei_ascend_npu/utility.cc:315 BuildOMModelToBuffer] Check failed: (reinterpret_castge::graphStatus(aclgrphBuildModel(ir_graph, options, om_buffer)) == ge::GRAPH_SUCCESS): 1343266818!==0 1343266818 Unknown ATC error code(1343266818)

[F 10/31 23:18: 8.891 ...ter/nnadapter/src/runtime/compilation.cc:98 ~Program] Check failed: device_context: No device found.
[F 10/31 23:18: 8.891 ...ter/nnadapter/src/runtime/compilation.cc:98 ~Program] Check failed: device_context: No device found.

terminate called after throwing an instance of 'nnadapter::logging::Exception'
what(): NNAdapter C++ Exception:
[F 10/31 23:18: 8.891 ...ter/nnadapter/src/runtime/compilation.cc:98 ~Program] Check failed: device_context: No device found.

Aborted (core dumped)

Worker processes ForkServerPoolWorker-2 through ForkServerPoolWorker-9 then crash, and their tracebacks are interleaved in the console output. Each worker reports the same pair of exceptions (shown once below):

Traceback (most recent call last):
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/pool.py", line 131, in worker
    put((job, i, result))
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/queues.py", line 377, in put
    self._writer.send_bytes(obj)
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/connection.py", line 411, in _send_bytes
    self._send(header + buf)
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/pool.py", line 136, in worker
    put((job, i, (False, wrapped)))
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/queues.py", line 377, in put
    self._writer.send_bytes(obj)
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/connection.py", line 411, in _send_bytes
    self._send(header + buf)
  File "/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe

/root/miniconda3/envs/paddle/lib/python3.9/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 41 leaked semaphore objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '

@cmcamdy
Contributor

cmcamdy commented Nov 4, 2024

You can try the dynamic shape configuration method described in the documentation.
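
For reference, a minimal sketch of what that might look like, assuming the documented approach expects concrete candidate shapes per input rather than all-(-1) placeholders. The batch size and sequence lengths below are illustrative assumptions, not values from the actual model:

# Hedged sketch, not verified on the 310P3: enumerate concrete candidate
# shapes for each input instead of [[-1, -1]]. Batch size 1 and sequence
# lengths 32/64/128 are assumed values chosen only for illustration.
config.set_nnadapter_dynamic_shape_info(
    {
        "input_ids": [[1, 32], [1, 64], [1, 128]],
        "pinyin_ids": [[1, 32], [1, 64], [1, 128]],
    }
)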
