Inference on ESP32?


I have some self-trained computer vision models, trained with Gluon and exported to symbol/params files. I'm looking for a way to run the models on the smallest possible edge device, so that I don't have to send the video feed to a server and do inference remotely.

I would like to do inference on an ESP32 with a camera using my own MXNet models. Does anyone have experience with this? I found that TensorFlow is supported on the ESP32 to some extent, but I could not find anything similar for MXNet or ONNX.


I don't think that's realistic. Try a Raspberry Pi 4 / Jetson Nano or something more powerful than the ESP32.

TensorFlow itself doesn't support the ESP32. TensorFlow Lite for Microcontrollers

TensorFlow Lite for Microcontrollers is a separate project under the TensorFlow brand, and it supports only a limited set of boards (chips), with limitations.

I haven't heard of anything like this for MXNet or ONNX.

Ok, good to know. Thank you.