TFLite initializes Coral device but still runs inference on CPU #857
Labels
comp:model (Model related issues)
comp:thirdparty (Third-party related issues)
Hardware:USB Accelerator (Coral USB Accelerator issues)
subtype:ubuntu/linux (Ubuntu/Linux build/installation issues)
type:bug (Bug)
type:performance (Performance issues)
Description
I have a model that I want to run on an RPi-4 with a Coral USB device connected to it. I've set the context manager verbosity to the maximum, and I can see that the library is communicating with the device; however, inference still runs on the CPU. I have verified this by measuring inference time and checking CPU utilization.
Notice the line "INFO: Created TensorFlow Lite XNNPACK delegate for CPU." in the middle of the logs.
Here's the code to reproduce the behavior:
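(A minimal sketch of the overall flow, assuming the libedgetpu C API from edgetpu_c.h and the TFLite C++ interpreter; the model path "model_edgetpu.tflite" is a placeholder, and the model is assumed to be compiled with edgetpu_compiler.)

```cpp
#include <cstdio>
#include <memory>

#include "edgetpu_c.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Load a model compiled for the Edge TPU. A plain CPU .tflite file will
  // quietly run on CPU/XNNPACK even when the delegate is attached.
  auto model = tflite::FlatBufferModel::BuildFromFile("model_edgetpu.tflite");
  if (!model) {
    std::fprintf(stderr, "Failed to load model\n");
    return 1;
  }

  // Enumerate connected Edge TPU devices (the USB Accelerator should be listed).
  size_t num_devices = 0;
  std::unique_ptr<edgetpu_device, decltype(&edgetpu_free_devices)> devices(
      edgetpu_list_devices(&num_devices), &edgetpu_free_devices);
  if (num_devices == 0) {
    std::fprintf(stderr, "No Edge TPU device found\n");
    return 1;
  }

  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  if (tflite::InterpreterBuilder(*model, resolver)(&interpreter) != kTfLiteOk) {
    std::fprintf(stderr, "Failed to build interpreter\n");
    return 1;
  }

  // Attach the Edge TPU delegate before AllocateTensors(). If this call does
  // not return kTfLiteOk, TFLite silently falls back to the CPU path.
  TfLiteDelegate* delegate = edgetpu_create_delegate(
      devices.get()[0].type, devices.get()[0].path, nullptr, 0);
  if (interpreter->ModifyGraphWithDelegate(delegate) != kTfLiteOk) {
    std::fprintf(stderr, "Failed to apply Edge TPU delegate\n");
    return 1;
  }

  if (interpreter->AllocateTensors() != kTfLiteOk) {
    std::fprintf(stderr, "Failed to allocate tensors\n");
    return 1;
  }

  // ... fill the input tensor here ...

  if (interpreter->Invoke() != kTfLiteOk) {
    std::fprintf(stderr, "Invoke failed\n");
    return 1;
  }
  return 0;
}
```

As far as I understand, the Edge TPU delegate only executes the custom op produced by edgetpu_compiler, so a model that has not been compiled for the Edge TPU stays entirely on the CPU and the XNNPACK delegate gets created for those ops.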
Issue Type
Bug
Operating System
Ubuntu
Coral Device
USB Accelerator
Other Devices
Raspberry Pi 4
Programming Language
C++
Relevant Log Output