I trained an MNIST model (.h5) in TensorFlow and converted it to an int8 quantized model (mnist_quant.tflite) with the TFLite converter. The following shows the converted model.
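For context, the conversion followed the standard TFLite full-integer quantization recipe. A minimal sketch, assuming a Keras .h5 model; the file names and the random calibration images are placeholders, not my actual setup:

```python
import numpy as np

def convert_to_int8(h5_path="mnist_model.h5", out_path="mnist_quant.tflite",
                    rep_images=None):
    """Full-integer (int8) TFLite conversion of a Keras .h5 model."""
    import tensorflow as tf  # imported here so the sketch parses without TF installed

    model = tf.keras.models.load_model(h5_path)

    if rep_images is None:
        # Placeholder calibration data; in practice, feed ~100 real MNIST images.
        rep_images = np.random.rand(100, 28, 28, 1).astype(np.float32)

    def representative_dataset():
        for img in rep_images:
            yield [img[np.newaxis, ...]]

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_dataset
    # Force every op, plus the input/output tensors, to int8.
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8
    converter.inference_output_type = tf.int8

    tflite_model = converter.convert()
    with open(out_path, "wb") as f:
        f.write(tflite_model)
```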
To run the model on the EK-RA6M3 board, I translated it using e-AI translator V2.3.0 plugged into e^2 studio V22.10.0. When the quantized model was translated with the C_INT8 output format option of the e-AI translator, the translated model made correct predictions. The following is the prediction result, in which each row shows the softmax output [1,10] in response to MNIST image arrays for digits 0 to 9.
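To read these rows, recall that a fully quantized TFLite model emits int8 values that map back to real numbers by the affine rule real = scale * (q - zero_point); for a softmax output, the TFLite quantization spec fixes scale = 1/256 and zero_point = -128. A small illustration (the row values below are made up, not taken from my actual log):

```python
def dequantize(q, scale, zero_point):
    """Map an int8 quantized value back to a real number (TFLite affine scheme)."""
    return scale * (q - zero_point)

# TFLite's fixed quantization parameters for a softmax output tensor.
scale, zero_point = 1.0 / 256, -128

# One illustrative output row: near-zero probability everywhere except class 3.
row = [-128, -128, -128, 127, -128, -128, -128, -128, -128, -128]
probs = [dequantize(q, scale, zero_point) for q in row]
predicted = probs.index(max(probs))
print(predicted)  # → 3 (q = 127 dequantizes to 255/256 ≈ 0.996)
```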
However, in the case of the CMSIS_INT8 output format option, the translated model produced abnormal outputs, as shown below.
For reference, the following is the dnn_compute function generated by the C_INT8 output format option.
The following shows the case with the CMSIS_INT8 option.
Could someone at Renesas look into the e-AI translation with the CMSIS_INT8 output option?
Could you confirm the version of the Flexible Software Package (FSP)?
As mentioned in the e-AI Translator V2.3.0 User's Manual, the supported FSP versions are 3.5 to 3.7.1. (Refer to "1.1. Operating Environment".)
Thanks for posting your question online.
Have you taken section "3.7. How to use CMSIS_INT8" of the user's manual into consideration? You need to add the "Arm CMSIS NN Library Source" stack.
I hope it helps.
I had already added the Arm CMSIS-NN/DSP library to the FSP stack configuration before translating the model with the CMSIS_INT8 output format option.
Thank you for your response.
We are investigating your issue internally and we will come back with an answer shortly.
I'm wondering whether you have an answer to this issue yet. I look forward to your reply.
As mentioned in the e-AI Translator V2.3.0 User's Manual, the supported FSP versions are 3.5 to 3.7.1. (Refer to "1.1. Operating Environment".)
We plan to support later FSP versions in the next release of e-AI Translator.
We aim to release it next spring. (The detailed schedule is not fixed yet.)
Let us know if it helped.
I missed the required FSP versions for e-AI translator V2.3.0. I had installed the latest e^2 studio (V2022-10), and it seems FSP V4.0.0 was installed by default.
Thank you for your reply. I appreciate your support.
Please let us know if it works with the FSP versions suggested in the manual.
After changing the FSP version from 4.0.0 to 3.7.0, the MNIST model works well with the CMSIS_INT8 option.
Thank you again!