Search found 4 matches
- Mon Feb 26, 2024 11:24 am
- Forum: ESP-WHO (Chinese discussion board)
- Topic: Using TVM to auto-generate a model deployment project - error generating the deployment project
- Replies: 1
- Views: 9244
Re: Using TVM to auto-generate a model deployment project - error generating the deployment project
Currently, export_onnx_model.py only supports single-input, single-output models. You can adapt the model's inputs and outputs to your own model in the generate_project() function of export_onnx_model.py and in app_main.c. In addition, the support for TVM in ESP-DL is not very ma...
- Thu Feb 01, 2024 9:51 am
- Forum: ESP-WHO
- Topic: ESP32-S3 TVM Example crash after 1st inference
- Replies: 5
- Views: 9450
Re: ESP32-S3 TVM Example crash after 1st inference
This problem can be reproduced. It is caused by the memory for the model's output being declared as const, which leads to undefined behavior. The fix is shown in commit 97e1c52 in https://github.com/espressif/esp-dl.git. In addition, the support for T...
- Tue Jan 16, 2024 6:33 am
- Forum: ESP-WHO (Chinese discussion board)
- Topic: Using TVM to auto-generate a model deployment project - calibration dataset format
- Replies: 2
- Views: 33772
Re: Using TVM to auto-generate a model deployment project - calibration dataset format
The dataset must be in numpy format; check whether your dataset is still a pickle. The calibration dataset format is described at https://docs.espressif.com/projects/esp-dl/en/latest/esp32/tutorials/deploying-models-through-tvm.html#step-1-3-quantize-the-model: "Create an input data reader: First, an input data reader will be created to read the calibration data from the data so...
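A minimal sketch of the conversion suggested above (the file names are assumptions, not part of the ESP-DL docs): load a pickled calibration set and re-save it as a numpy array for the quantization step.

```python
# Convert a pickled calibration dataset to numpy format, as the post advises.
import pickle

import numpy as np


def pickle_to_npy(pickle_path: str, npy_path: str) -> np.ndarray:
    """Load a pickled calibration set and re-save it as a .npy array."""
    with open(pickle_path, "rb") as f:
        data = pickle.load(f)
    arr = np.asarray(data, dtype=np.float32)
    np.save(npy_path, arr)
    return arr


# Example: a pickled batch of two 1x4 calibration samples.
with open("calib.pickle", "wb") as f:
    pickle.dump([[0.1, 0.2, 0.3, 0.4], [0.5, 0.6, 0.7, 0.8]], f)

arr = pickle_to_npy("calib.pickle", "calib.npy")
```

The resulting .npy file can then be fed to the input data reader described in the linked tutorial.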
- Tue Nov 28, 2023 6:28 am
- Forum: ESP-WHO
- Topic: Change ONNX model at runtime
- Replies: 2
- Views: 45562
Re: Change ONNX model at runtime
If you use the TVM approach, the TVM architecture does not support changing the model at runtime, and ESP-DL does not support this either. First, ESP-DL has no graph-parsing function, so there is no interface for importing ONNX models. Second, there is no mechanism for swapping models at runtime. Mayb...