Issue deploying a model with esp-tflite-micro

Hi,
I am working on deploying a TFLite model to an ESP32-CAM device. I got the model running with Arduino code, but it is very slow (~35 s per inference for a minimal model), so I switched to the esp-idf framework in the hope that it would speed up inference.
However, I hit an issue when running inference with `esp-tflite-micro`, specifically in `fully_connected.cc` (https://github.com/espressif/esp-tflite ... ed.cc#L100). Here is the backtrace shown in my serial monitor:
```
abort() was called at PC 0x400e21a8 on core 0
0x400e21a8: tflite::(anonymous namespace)::FullyConnectedEval(TfLiteContext*, TfLiteNode*) at ***/esp-tflite-example/managed_components/espressif__esp-tflite-micro/tensorflow/lite/micro/kernels/esp_nn/fully_connected.cc:100 (discriminator 1)
Backtrace: 0x40081c09:0x3ffbdac0 0x40088fd9:0x3ffbdae0 0x4008eec5:0x3ffbdb00 0x400e21a8:0x3ffbdb70 0x400db2a9:0x3ffbde10 0x400daf7e:0x3ffbde40 0x400da4d9:0x3ffbde60 0x400da3ba:0x3ffbde80 0x400898c5:0x3ffbdea0
0x40081c09: panic_abort at ***/esp/v5.3/esp-idf/components/esp_system/panic.c:463
0x40088fd9: esp_system_abort at ***/esp/v5.3/esp-idf/components/esp_system/port/esp_system_chip.c:92
0x4008eec5: abort at ***/esp/v5.3/esp-idf/components/newlib/abort.c:38
0x400e21a8: tflite::(anonymous namespace)::FullyConnectedEval(TfLiteContext*, TfLiteNode*) at ******/esp-tflite-example/managed_components/espressif__esp-tflite-micro/tensorflow/lite/micro/kernels/esp_nn/fully_connected.cc:100 (discriminator 1)
0x400db2a9: tflite::MicroInterpreterGraph::InvokeSubgraph(int) at ***/esp-tflite-example/managed_components/espressif__esp-tflite-micro/tensorflow/lite/micro/micro_interpreter_graph.cc:194
0x400daf7e: tflite::MicroInterpreter::Invoke() at ***/esp-tflite-example/managed_components/espressif__esp-tflite-micro/tensorflow/lite/micro/micro_interpreter.cc:294
0x400da4d9: loop at ***/esp-tflite-example/main/main_functions.cc:106
0x400da3ba: tf_main() at ***/esp-tflite-example/main/main.cc:14 (discriminator 1)
0x400898c5: vPortTaskWrapper at ***/esp/v5.3/esp-idf/components/freertos/FreeRTOS-Kernel/portable/xtensa/port.c:134
```
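For context, the frame at `main_functions.cc:106` is the usual `Invoke()` call. Below is a minimal sketch of that part of the loop, with extra logging of the tensor dtypes that can help narrow down which kernel path the FULLY_CONNECTED op takes (variable names are illustrative, not copied verbatim from the repo):
```
// Minimal sketch of the invoke path around main_functions.cc:106.
// `interpreter` and the logging lines are illustrative placeholders.
#include "tensorflow/lite/c/common.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_log.h"

extern tflite::MicroInterpreter* interpreter;  // created during setup

void loop() {
  TfLiteTensor* input = interpreter->input(0);

  // Log the input/output dtypes before invoking: the FULLY_CONNECTED
  // kernel dispatches on tensor types, so an unexpected dtype
  // (e.g. float32 where int8 is expected) is a useful thing to rule out.
  MicroPrintf("input type=%s, output type=%s",
              TfLiteTypeGetName(input->type),
              TfLiteTypeGetName(interpreter->output(0)->type));

  // ... fill input->data.f / input->data.int8 here ...

  // Per the backtrace, this call is what ends up in FullyConnectedEval.
  if (interpreter->Invoke() != kTfLiteOk) {
    MicroPrintf("Invoke() failed");
  }
}
```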
Note that I can run all 3 official examples (https://github.com/espressif/esp-tflite ... r/examples) without any issue.
For anyone who wants to reproduce the issue, I have put the minimal code in a repo: https://github.com/wayinone/esp-tflite-example
* in the repo you can also find the Python code used to generate the custom model; a rough sketch of the C++ side of the setup follows below.
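The interpreter setup in the repo follows the standard esp-tflite-micro pattern, roughly this shape (the arena size, op list, and the `g_model_data` symbol are placeholders; the exact values live in the repo):
```
// Rough shape of the setup; arena size, op count, and the model
// symbol name are placeholders, not the exact repo code.
#include <cstdint>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_log.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_model_data[];  // converted .tflite as a C array

namespace {
constexpr int kTensorArenaSize = 20 * 1024;  // placeholder size
uint8_t tensor_arena[kTensorArenaSize];
tflite::MicroInterpreter* interpreter = nullptr;
}  // namespace

void setup() {
  const tflite::Model* model = tflite::GetModel(g_model_data);

  // Register only the ops the model uses; FULLY_CONNECTED is the
  // kernel that shows up in the backtrace above.
  static tflite::MicroMutableOpResolver<1> resolver;
  resolver.AddFullyConnected();

  static tflite::MicroInterpreter static_interpreter(
      model, resolver, tensor_arena, kTensorArenaSize);
  interpreter = &static_interpreter;

  if (interpreter->AllocateTensors() != kTfLiteOk) {
    MicroPrintf("AllocateTensors() failed");
  }
}
```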
Configuration
* ESP device
ESP32-CAM (by AI Thinker). In PlatformIO's esp-idf extension, the target is `esp32`.
* IDE
I am building this esp-idf project in PlatformIO, with the esp-idf extension.
* esp-idf version
Defined in `./main/idf_component.yml`; I am currently running ESP-IDF v5.3.0.
* Python (for generating the model and converting it to a TFLite model)
Python 3.10
TensorFlow 2.15.1
Appreciate any help!