A model translated with CMSIS_INT8 output format option shows abnormal inference outputs on EK-RA6M3 board

Hello,

I trained an MNIST model (.h5) in TensorFlow and converted it to an int8 quantized model (mnist_quant.tflite) with the TFLite converter. The following shows the converted model.
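For context, full-integer TFLite models use an affine quantization scheme, real_value = scale * (q - zero_point). The sketch below illustrates that mapping in plain Python; the scale and zero_point values are illustrative placeholders, not taken from mnist_quant.tflite.

```python
# Affine int8 quantization as used by full-integer TFLite models:
#   real_value = scale * (quantized_value - zero_point)
# The scale/zero_point below are illustrative, not from the actual model.

def quantize(x, scale, zero_point):
    """Map a float to int8, clamping to the valid range [-128, 127]."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))

def dequantize(q, scale, zero_point):
    """Map an int8 value back to an approximate float."""
    return scale * (q - zero_point)

# Typical parameters for an input image normalized to [0, 1].
scale, zero_point = 1.0 / 255.0, -128

q = quantize(0.5, scale, zero_point)
x = dequantize(q, scale, zero_point)  # close to 0.5, within one scale step
```

The round trip is lossy by at most one quantization step (one scale unit), which is why the C_INT8 build's softmax rows can match the float model closely without being bit-identical.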

To run the model on the EK-RA6M3 board, I translated it using e-AI Translator V2.3.0 plugged into e² studio V22.10.0. When the quantized model was translated with the C_INT8 output format option of the e-AI Translator, the translated model made correct predictions. The following is the prediction result, in which each row shows the softmax output [1,10] in response to the MNIST image arrays for digits 0 to 9.

However, with the CMSIS_INT8 output format option, the translated model produced abnormal outputs, as shown in the following.
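To make "abnormal" concrete, one can compare each build's softmax rows against the expected digit (the argmax of row i should equal i). A minimal sketch in Python; the example rows are made-up placeholders, not the actual results pictured above:

```python
def predicted_digit(softmax_row):
    """Index of the largest softmax entry, i.e. the predicted class."""
    return max(range(len(softmax_row)), key=softmax_row.__getitem__)

def accuracy(rows):
    """Fraction of rows whose argmax equals the row index (digit 0..9)."""
    return sum(predicted_digit(r) == i for i, r in enumerate(rows)) / len(rows)

# Made-up 3-class examples standing in for the real [1,10] softmax rows.
good = [[0.90, 0.05, 0.05],   # like the C_INT8 build: argmax matches the digit
        [0.10, 0.80, 0.10],
        [0.00, 0.10, 0.90]]
bad  = [[0.10, 0.10, 0.80],   # like the CMSIS_INT8 build: argmax is wrong
        [0.80, 0.10, 0.10],
        [0.30, 0.40, 0.30]]
```

Running the same check on both builds' captured outputs would show whether the CMSIS_INT8 build fails on every digit or only on some, which may help narrow down which kernel misbehaves.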

For reference, the following is the dnn_compute function generated with the C_INT8 output format option.

The following shows the corresponding function generated with the CMSIS_INT8 option.

Could a Renesas engineer please look into the e-AI translation with the CMSIS_INT8 output format option?

B.R.
Y.H.Lee




[locked by: Sai (SWF) at 12:49 (GMT 0) on 2 Feb 2023]
  • Hi Y.H.Lee,

    Could you confirm the version of the Flexible Software Package (FSP) you are using?

    As mentioned in the e-AI Translator V2.3.0 User's Manual, the supported FSP versions are 3.5 to 3.7.1. (Refer to "1.1. Operating Environment".)

    We plan to support later FSP versions in the next release of e-AI Translator.

    We aim to release it next spring. (The detailed schedule is not fixed yet.)

    Let us know if this helps.

    Regards,

    AZ
