Hello all,

I am using the PTP example to synchronise time and, after that, starting a timer with a pulse time (positive width) of 10 us and a period of 20 us. I am using the IPLS and MINT interrupts to detect the rising and falling edges, so that I get an interrupt every 10 us. In both edge-detection ISRs I generate a software ELC event to trigger the ADC on a single channel, so the ADC is effectively triggered every 10 us. I am using 12-bit ADC conversion in normal mode, without sample-and-hold, on channel 0 of ADC unit 0 (pin P000). The ADC conversion-end interrupt fires after the conversion is finished and has the highest priority (0). In the ADC ISR a GPIO pin is raised high, the ADC result is read into a variable, and the GPIO is lowered.

The problem is that the ADC conversion time after the trigger varies: most of the time it takes around 5 us, but sometimes it takes more than 10 us, which means the next trigger is missed. Can anyone suggest how to make the conversion time deterministic?

See the following screen grab. The top window shows the 10 us triggers; at every edge I trigger the ADC via the software ELC link. The window immediately below it shows persistence, and the trigger itself is quite deterministic, i.e. it does not vary in time. The third window shows the GPIO toggle in the ADC ISR, which is generally seen about 5 us on average after each edge in the upper window. The interesting observation is the bottom-most window, which shows the variance of the ADC scan-end ISR: the negative pulse width measurement varies from 2.9899 us to 20.039 us, which is a huge and undesirable range. Given that the ADC single scan is triggered every 10 us, the gap should be 10 us, as the mean value of 9.7067 us suggests, but the wild variation between the minimum and maximum values derails the signal acquisition. Any lead on making the ADC conversion time more deterministic would be highly appreciated.
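For reference, here is a minimal sketch of my trigger chain, assuming the FSP drivers; the instance names (g_elc_ctrl, g_adc0_ctrl, g_ioport_ctrl), the callback names and DEBUG_PIN are placeholders from my own configuration, not part of the PTP example:

    #include "hal_data.h"   /* FSP-generated instances and driver headers */

    #define DEBUG_PIN    BSP_IO_PORT_01_PIN_00   /* scope marker pin (placeholder) */

    /* Called from the IPLS / MINT edge-detection ISRs: generate the software
     * ELC event that is linked to the ADC0 start trigger in the ELC
     * configuration, so one single scan starts every 10 us. */
    void edge_trigger_adc (void)
    {
        R_ELC_SoftwareEventGenerate(&g_elc_ctrl, ELC_SOFTWARE_EVENT_0);
    }

    /* ADC scan-end callback: raise the debug pin, read channel 0, lower the pin. */
    void adc0_callback (adc_callback_args_t * p_args)
    {
        if (ADC_EVENT_SCAN_COMPLETE == p_args->event)
        {
            R_IOPORT_PinWrite(&g_ioport_ctrl, DEBUG_PIN, BSP_IO_LEVEL_HIGH);

            uint16_t sample;
            R_ADC_Read(&g_adc0_ctrl, ADC_CHANNEL_0, &sample);

            R_IOPORT_PinWrite(&g_ioport_ctrl, DEBUG_PIN, BSP_IO_LEVEL_LOW);
        }
    }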
Hello,
Thanks for reaching out to Renesas Engineering Community.
What is the frequency of PCLKC, which is the A/D conversion clock?
Also, what is the output impedance of the signal source? The maximum permissible signal source impedance is 1 kOhm.
Please let us know.
Regards
Hi AZ,

I think the issue is the interrupt latency of the IPLS and MINT interrupts. I am toggling a GPIO just before generating the software ELC event, and the results are quite startling. I have given both interrupts the highest priority and the ISR is still called after a non-deterministic time. I also have an event linked directly to a GPIO which toggles the pin, and that is quite accurate, but the time until the ISR is called is very random. See the same picture, but now the bottom trace is the glitch produced when I trigger the ADC: you can see the ADC itself is triggered at a varying time after the interrupt.

So my question now is: how can I minimise the interrupt latency of the IPLS and MINT interrupts?
I am using/modifying the PTP example and I haven't added any threads beyond those the example uses. I am also trying to use only the IPLS interrupt, on both rising and falling edges, to trigger hardware ELC events, but it seems the ADC is not getting triggered.
Also, on page 900 there is no mention of an IPLS trigger, but in code it is possible to create the link. Can you clarify whether IPLS can be linked with the ELC or not?
Yes, the ELC_EVENT_EPTPC_IPLS event can be linked with ELC.
Can you make sure that the IPLS interrupts occur?
Do you call R_ELC_Open and then R_ELC_Enable in your code?
The example is using an RTT thread. Can you try deleting this?
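In case it helps, here is a minimal sketch of the link setup using the FSP ELC and ADC drivers; g_elc_ctrl, g_elc_cfg and g_adc0_ctrl are the usual FSP-generated instance names and may differ in your project, and the ADC instance must be configured for ELC/hardware triggering:

    #include "hal_data.h"   /* FSP-generated instances */
    #include <assert.h>

    /* Route the EPTPC IPLS event straight to the ADC0 hardware trigger input
     * through the ELC, so the scan starts with no ISR in the path. */
    void link_ipls_to_adc0 (void)
    {
        fsp_err_t err;

        err = R_ELC_Open(&g_elc_ctrl, &g_elc_cfg);
        assert(FSP_SUCCESS == err);

        /* Connect the IPLS event to the ADC0 start trigger. */
        err = R_ELC_LinkSet(&g_elc_ctrl, ELC_PERIPHERAL_ADC0, ELC_EVENT_EPTPC_IPLS);
        assert(FSP_SUCCESS == err);

        err = R_ELC_Enable(&g_elc_ctrl);
        assert(FSP_SUCCESS == err);

        /* The ADC must be configured for ELC/hardware trigger mode and armed
         * with R_ADC_ScanStart() so that it waits for the linked event. */
        err = R_ADC_ScanStart(&g_adc0_ctrl);
        assert(FSP_SUCCESS == err);
    }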
AZ [Renesas] said: Yes, the ELC_EVENT_EPTPC_IPLS event can be linked with ELC.
I tried, but it is not working. I have an IPLS ISR from which I am changing the GPIO level using 'R_IOPORT_PinEventOutputWrite'. Even in the hardware manual there is no mention of this event, neither in the ELC chapter nor in the EPTPC chapter.
AZ [Renesas] said: Can you make sure that the IPLS interrupts occur?
Yes, there are two pins which toggle, as you can see in the screenshot, and as mentioned above I am calling the driver function from the IPLS and MINT ISRs.
AZ [Renesas] said: Do you call R_ELC_Open and then R_ELC_Enable in your code?
Yes. The reason you see no variance in the 10 us waveform is that the ELC is triggering the GPIO directly.
AZ [Renesas] said: The example is using an RTT thread. Can you try deleting this?
The following image shows the result after deleting the RTT thread. It has improved the interrupt latency, but it still varies by around 1 us, which is still catastrophic for the project (see the bottom-most variance graph). Is there any other way? Thanks for the help.
Actually it is as bad as before. If I keep the system running, within a few minutes the latency becomes worse and worse; see the following image.
The interrupt latency is not improved by deleting the RTT thread. Just look at how much worse it gets over a period of time. Every alternate 10 us the IPLS or MINT ISR is called, but the time variance of the interrupt latency degrades over time.
It sounds like the PTP thread does not have enough execution time. Can you extract the execution times of the threads using the 'RTOS Resources' view in e2 studio? You can follow this note:
www.renesas.com/.../partner-rtos-aware-debugging-ra
This example is using ThreadX and the guide is for FreeRTOS. Can you please point me towards something similar for ThreadX? Also, the 'Renesas Views' menu is not the same for me as shown in the above-mentioned guide.
My GUI
Guide
I am also triggering the ADC at every MINT interrupt via the ELC, and triggering the DMA after the ADC conversion ends, also using the ELC; in the DMA ISR I am switching the buffer. But if I look at memory, I see many glitches in the waveform, and I cannot really understand why these glitches are happening. I suspect the latency in serving the DMA ISR is causing samples to be lost in memory, and that is why the glitch appears. See the video.
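For clarity, this is roughly how the DMA part is set up, as a sketch only: the DMAC activation source is the ADC0 scan-end event (selected in the FSP configurator), and in the callback I re-arm the transfer onto the other half of a ping-pong buffer. The instance name (g_transfer_adc_ctrl), the callback prototype, BUF_LEN and the ADDR[0] register access are from my own configuration and FSP version, so they may need adjusting:

    #include "hal_data.h"                    /* FSP-generated instances */

    #define BUF_LEN    (512U)                /* samples per half-buffer (placeholder) */

    static uint16_t g_buffer[2][BUF_LEN];    /* ping-pong destination buffers */
    static volatile uint8_t g_active_half = 0U;

    /* DMAC transfer-end callback (name and prototype as generated by the FSP
     * configurator for this instance). The DMAC activation source is the ADC0
     * scan-end event, so one 16-bit sample is copied from ADDR0 to RAM after
     * every conversion; this callback runs after BUF_LEN samples. */
    void g_transfer_adc_callback (dmac_callback_args_t * p_args)
    {
        FSP_PARAMETER_NOT_USED(p_args);

        /* Switch halves and re-arm the DMAC for the next BUF_LEN samples.
         * The source stays fixed on the ADC0 channel 0 data register. */
        g_active_half ^= 1U;
        (void) R_DMAC_Reset(&g_transfer_adc_ctrl,
                            (void const *) &R_ADC0->ADDR[0],
                            &g_buffer[g_active_half][0],
                            BUF_LEN);

        /* The other half (g_buffer[g_active_half ^ 1U]) is now complete and can
         * be plotted. If this callback is served too late and the next scan-end
         * event arrives first, a sample is lost, which is exactly the glitch
         * described above. */
    }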
Try to search for 'RTOS Resources' from Window->Show view->Other.
444 said: I am also triggering the ADC at every MINT interrupt via the ELC, and triggering the DMA after the ADC conversion ends, also using the ELC.
So it sounds like everything is working as expected. What is the current problem? The above waveforms are a bit difficult to understand.
AZ [Renesas] said: So it sounds like everything is working as expected. What is the current problem? The above waveforms are a bit difficult to understand.
This is the problem. I am opening the 'Memory' window and plotting the contents of the buffer into which the DMA is copying the data. Ideally I should be seeing a continuous waveform, but because of the latency in serving the DMA interrupt I am losing samples, and you see this discontinuous graph.
This data is coming from the ADC and the ADC is triggered every 10 us from the MINT interrupt, right?
How do you expect a continuous waveform from an ADC that converts samples every 10 us?
This is how: https://github.com/renesas/ra-fsp-examples/blob/master/example_projects/ek_ra6m3/adc_gpt_periodic_sampling/adc_gpt_periodic_sampling_ek_ra6m3_ep/adc_gpt_periodic_sampling_notes.md
The graph is a collection of points (basically memory locations); the software connects the dots so it looks like a continuous graph. In my case I am using the PTP example to generate the MINT interrupt every 20 us, and the rest of the process is like the ADC example I quoted. So where that example uses a timer, I am using PTP.
AZ [Renesas] said: This data is coming from the ADC and the ADC is triggered every 10 us from the MINT interrupt, right?
Yes.
What is the frequency of your analog signal?
1 kHz
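For reference, the arithmetic, assuming the figures above:

    sample rate      = 1 / 10 us          = 100 kSa/s
    points per cycle = 100 kSa/s / 1 kHz  = 100 samples per period

So the plotted buffer should indeed look like a continuous 1 kHz waveform; any visible discontinuity points to lost samples (for example a delayed DMA re-arm) rather than to under-sampling.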