ADC conversion time varies unexpectedly for single channel

Hello all,

I am using the PTP example to synchronise time, and after that I start a timer with a pulse width (positive width) of 10 us and a period of 20 us.
I am using the IPLS and MINT interrupts to detect the rising and falling edges, so that one of the two interrupts fires every 10 us.
In both timer edge-detection ISRs I generate a software ELC event to trigger the ADC, so the ADC is basically triggered every 10 us on a single channel.
I am using 12-bit ADC conversion in normal mode, without sample-and-hold activation, on channel 0 of ADC unit 0 (pin P000).
The ADC ISR is triggered after the A/D conversion finishes, and the conversion-end interrupt has the highest priority (0). In the ADC ISR a GPIO pin is raised high, the ADC result is read into a variable, and the GPIO pin is lowered.
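
In case it helps, the arrangement looks roughly like this in FSP-style C. The handle names (g_elc_ctrl, g_adc0_ctrl), the software event number and the debug pin are placeholders rather than my exact project code:

    #include "hal_data.h"

    /* Placeholder debug pin (P000 itself is the analog input). */
    #define DEBUG_PIN    BSP_IO_PORT_01_PIN_00

    static volatile uint16_t g_adc_result;

    /* Called from both the IPLS and MINT edge ISRs: fire the software
     * ELC event that is linked to the ADC0 start trigger. */
    void edge_isr_common (void)
    {
        (void) R_ELC_SoftwareEventGenerate(&g_elc_ctrl, ELC_SOFTWARE_EVENT_0);
    }

    /* ADC scan-end callback (priority 0): raise the debug pin, read the
     * channel 0 result, lower the pin. R_BSP_PinAccessEnable() must be
     * called once at init for R_BSP_PinWrite() to take effect. */
    void adc0_callback (adc_callback_args_t * p_args)
    {
        if (ADC_EVENT_SCAN_COMPLETE == p_args->event)
        {
            R_BSP_PinWrite(DEBUG_PIN, BSP_IO_LEVEL_HIGH);
            (void) R_ADC_Read(&g_adc0_ctrl, ADC_CHANNEL_0, (uint16_t *) &g_adc_result);
            R_BSP_PinWrite(DEBUG_PIN, BSP_IO_LEVEL_LOW);
        }
    }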

The problem is that the conversion time after the trigger varies. Most of the time it takes around 5 us, but sometimes it takes more than 10 us, and this causes trouble because the next trigger is missed.

Can anyone suggest how to make the conversion time deterministic?

See the following screen grab. The top trace shows the 10 us triggers; at every edge I trigger the ADC via the software ELC link. The trace immediately below it is shown with persistence, and the trigger itself is quite deterministic, i.e. it does not vary in time. The third trace shows the GPIO toggle in the ADC ISR, which appears on average about 5 us after each edge in the top trace.

The interesting observation is the bottom trace, which shows the variation of the ADC scan-end ISR. The negative pulse width measurement varies from 2.9899 us to 20.039 us, which is a huge, undesired range. Given that a single ADC scan is triggered every 10 us, the gap should be 10 us, as the mean value of 9.7067 us confirms, but the wild variation between the minimum and maximum values derails the signal acquisition.

Any lead on making the ADC conversion time more deterministic would be highly appreciated.

  • Hello,

    Thanks for reaching out to Renesas Engineering Community.

    What is the frequency of PCLKC, the A/D conversion clock?

    Also, what is the impedance of the signal source? The maximum permissible signal source impedance is 1 kOhm.

    Please let us know.

    Regards

  • Hi AZ,

    I think the issue is the interrupt latency of the IPLS and MINT interrupts. I am toggling a GPIO just before generating the software ELC event, and the results are quite startling: I have given both interrupts the highest priority, and the ISR is still entered after a non-deterministic delay. I also have an event linked directly to a GPIO, and that toggle is quite accurate, but the time until the ISR is entered is very random. See the same picture, but now the bottom trace shows the glitch produced when I trigger the ADC; you can see that the ADC itself is triggered at a varying time after the interrupt.

    So my question now is: how can I minimise the interrupt latency of the IPLS and MINT interrupts?
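
    For comparison, routing the timer event straight to the ADC through the ELC (the same hardware mechanism that makes my event-linked GPIO toggle accurate) would take the ISR out of the trigger path entirely. A rough sketch, assuming FSP, GPT channel 0 as the timer and the ADC configured for trigger mode ADC_TRIGGER_SYNC_ELC; the exact elc_event_t names vary between devices:

        #include "hal_data.h"

        void link_timer_to_adc (void)
        {
            (void) R_ELC_Open(&g_elc_ctrl, &g_elc_cfg);

            /* Start ADC unit 0 directly from the GPT0 overflow event in
             * hardware, with no CPU involvement and hence no interrupt
             * entry latency. A second link (e.g. the GPT0 compare-match
             * event) would cover the mid-period edge. */
            (void) R_ELC_LinkSet(&g_elc_ctrl, ELC_PERIPHERAL_ADC0, ELC_EVENT_GPT0_COUNTER_OVERFLOW);

            (void) R_ELC_Enable(&g_elc_ctrl);
        }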

  • Try searching for 'RTOS Resources' under Window -> Show View -> Other.

    I am also triggering the ADC at every MINT interrupt via the ELC, and triggering the DMA after the ADC conversion end using the ELC (this chain is sketched after this reply).

    So it sounds like everything is working as expected. What is the current problem? The above waveforms are a bit difficult to understand.

    Regards
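
    For reference, the ADC-to-DMAC part of the chain quoted above would look roughly like this with the FSP DMAC driver. This is a sketch: the g_transfer0 names and the buffer length are assumptions based on a typical generated configuration:

        #include "hal_data.h"

        #define SAMPLE_COUNT    512            /* illustrative buffer length */
        static uint16_t g_samples[SAMPLE_COUNT];

        void adc_dma_start (void)
        {
            /* The generated configuration would carry the key settings:
             *   activation_source = ELC_EVENT_ADC0_SCAN_END (start on scan end)
             *   p_src  = ADC0 channel 0 data register (fixed address)
             *   p_dest = g_samples (incremented after each transfer)
             *   length = SAMPLE_COUNT
             */
            (void) R_DMAC_Open(&g_transfer0_ctrl, &g_transfer0_cfg);
            (void) R_DMAC_Enable(&g_transfer0_ctrl);
        }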

  • So it sounds like everything is working as expected. What is the current problem? The above waveforms are a bit difficult to understand.

    This is the problem. I am opening the 'Memory' window and plotting the contents of the buffer into which the DMA is copying the data, so ideally I should see a continuous wave. But because of the latency in servicing the DMA interrupt I am losing samples, and you see this discontinuous graph.
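
    A common mitigation, sketched here as an assumption rather than something already tried in this thread, is ping-pong buffering: the DMAC completion callback immediately re-arms the transfer into the other half of the buffer, so only the single R_DMAC_Reset call sits in the latency-critical path while the CPU drains the half that just finished. Names are illustrative:

        #include <stdbool.h>
        #include "hal_data.h"

        #define BLOCK_LEN    256               /* samples per half, illustrative */

        static uint16_t  g_ping[BLOCK_LEN];
        static uint16_t  g_pong[BLOCK_LEN];
        static uint16_t *g_filling = g_ping;   /* half currently being filled */
        static volatile bool g_block_ready = false;

        /* DMAC transfer-end callback: re-arm into the other half first,
         * then flag the finished half for processing at task level. */
        void dmac_callback (transfer_callback_args_t * p_args)
        {
            (void) p_args;

            uint16_t *next = (g_filling == g_ping) ? g_pong : g_ping;

            /* NULL keeps the previous source (the ADC data register). */
            (void) R_DMAC_Reset(&g_transfer0_ctrl, NULL, next, BLOCK_LEN);

            g_filling     = next;
            g_block_ready = true;
        }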