Hello all,

I am using the PTP example to synchronise time, and after that I start a timer with a pulse width (positive width) of 10 us and a period of 20 us. I am using the IPLS and MINT interrupts to detect the rising and falling edges, so I get an interrupt every 10 us. In both edge-detection ISRs I generate a software ELC event to trigger the ADC on a single channel, so the ADC is basically triggered every 10 us on a single channel. I am using 12-bit ADC conversion in normal mode, without sample-and-hold activation, on channel 0 of ADC unit 0 (pin P000). The ADC ISR fires after the A/D conversion is finished, and the conversion-end interrupt has the highest priority (0). In the ADC ISR a GPIO pin is driven high, the ADC result is read into a variable, and the GPIO is driven low again.

The problem is that the ADC conversion time varies after the trigger. Most of the time it takes around 5 us, but sometimes it takes more than 10 us, and this causes trouble because the next trigger is missed. Can anyone suggest how to make the conversion time deterministic?

See the following screen grab. The top window shows the 10 us triggers; at every edge I trigger the ADC via the software ELC link. The window immediately below it shows 'persistence', and the trigger is quite deterministic, i.e. it does not vary in time. The third window shows the GPIO toggle in the ADC ISR, which is generally seen about 5 us on average after each edge in the top window. The interesting observation is the bottom window, which shows the variance of the ADC scan-finish ISR. This varies a lot: the negative pulse width measurement ranges from 2.9899 us to 20.039 us, which is a huge, undesired range. Given that a single ADC scan is triggered every 10 us, the gap should be 10 us, as the mean value of 9.7067 us suggests, but the wild variation between the minimum and maximum values derails the signal acquisition. Any lead on making the ADC conversion time more deterministic will be highly appreciated.
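For reference, my conversion-end ISR does essentially the following (a minimal sketch; g_adc0_ctrl, g_ioport_ctrl and the scope pin are assumed names from my configurator output, so adjust them to your project):

```c
#include "hal_data.h"

static volatile uint16_t g_adc_result;

/* ADC0 scan-end callback: mark entry on a scope pin, read the single
 * channel result, then drop the pin again. */
void adc0_callback(adc_callback_args_t *p_args)
{
    if (ADC_EVENT_SCAN_COMPLETE == p_args->event)
    {
        R_IOPORT_PinWrite(&g_ioport_ctrl, BSP_IO_PORT_01_PIN_00, BSP_IO_LEVEL_HIGH);

        /* Read the channel 0 conversion result into a variable */
        (void) R_ADC_Read(&g_adc0_ctrl, ADC_CHANNEL_0, (uint16_t *) &g_adc_result);

        R_IOPORT_PinWrite(&g_ioport_ctrl, BSP_IO_PORT_01_PIN_00, BSP_IO_LEVEL_LOW);
    }
}
```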
Hello,
Thanks for reaching out to Renesas Engineering Community.
What is the frequency of PCLKC, which is the A/D conversion clock?
Also, what is the impedance of the signal source? The maximum permissible signal source impedance is 1 kOhm.
Please let us know.
Regards
Hi AZ,

I think the issue is the interrupt latency of the IPLS and MINT interrupts. I am toggling a GPIO just before the software ELC event, and the results are quite startling. I have given both interrupts the highest priority, and still the ISR is called after a non-deterministic delay. I also have an event linked to a GPIO which toggles the pin, and that is quite accurate, but the time until the ISR is called is very random. See the same picture, but now the bottom trace is the glitch produced when I trigger the ADC: you can see that the ADC itself is triggered at a varying time after the interrupt.
So my question now is: how do I minimise the interrupt latency for the IPLS and MINT interrupts?
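For reference, each edge ISR currently does roughly this (a minimal sketch; the callback name, g_elc_ctrl, g_ioport_ctrl and the diagnostic pin are assumed names from my project):

```c
#include "hal_data.h"

/* Body of the IPLS/MINT edge ISR: toggle a scope pin right before the
 * software ELC event, so the latency from the hardware edge to this
 * point can be measured. */
void edge_event_callback(void)
{
    R_IOPORT_PinWrite(&g_ioport_ctrl, BSP_IO_PORT_01_PIN_01, BSP_IO_LEVEL_HIGH);

    /* Software ELC event that starts the ADC scan */
    (void) R_ELC_SoftwareEventGenerate(&g_elc_ctrl, ELC_SOFTWARE_EVENT_0);

    R_IOPORT_PinWrite(&g_ioport_ctrl, BSP_IO_PORT_01_PIN_01, BSP_IO_LEVEL_LOW);
}
```

The hardware-linked GPIO toggle is deterministic, so everything up to the ELC is fine; the jitter appears only between the hardware event and the first instruction of this callback.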
This example uses ThreadX, and the guide is for FreeRTOS. Can you please point me towards something similar for ThreadX? Also, the 'Renesas Views' menu is not the same for me as shown in the above-mentioned guide.

My GUI
Guide
I am also triggering the ADC at every MINT interrupt via the ELC, and triggering the DMA after the ADC conversion end, also via the ELC; in the DMA ISR I am switching buffers. But when I look at memory, I see many glitches in the waveform. I cannot really understand why these glitches are happening. I suspect that latency in serving the DMA ISR is causing samples to be lost in memory, and that is what causes the glitches. See the video.
Try to search for 'RTOS Resources' from Window->Show view->Other.
444 said: I am also triggering the ADC at every MINT interrupt via the ELC, and triggering the DMA after the ADC conversion end, also via the ELC
So it sounds like everything is working as expected. What is the current problem? The above waveforms are a bit difficult to understand.
AZ [Renesas] said: So it sounds like everything is working as expected. What is the current problem? The above waveforms are a bit difficult to understand.
This is the problem. I am opening the 'Memory' window and plotting the contents of the buffer where the DMA is copying the data. Ideally I should be seeing a continuous wave, but because of the latency in serving the DMA interrupt I am losing samples, and you see this discontinuous graph.
This data is coming from the ADC, and the ADC is triggered every 10 us by the MINT interrupt, right?
How do you expect a continuous waveform from an ADC that converts a sample only every 10 us?
This is how: https://github.com/renesas/ra-fsp-examples/blob/master/example_projects/ek_ra6m3/adc_gpt_periodic_sampling/adc_gpt_periodic_sampling_ek_ra6m3_ep/adc_gpt_periodic_sampling_notes.md

The graph is a collection of points (basically memory locations); the software connects the dots, so it looks like a continuous graph. In my case I am using the PTP example to generate a MINT interrupt every 20 us, and the remaining process is like the ADC example I quoted. So where that example uses a timer, I am using PTP.
AZ [Renesas] said: This data is coming from the ADC, and the ADC is triggered every 10 us by the MINT interrupt, right?
Yes.
What is the frequency of your analog signal?
1 kHz
AZ [Renesas] said:Try to search for 'RTOS Resources' from Window->Show view->Other.
This is the message!
I do not think that this is caused by DMA latency; that latency is very small.
It seems like your application (thread) is not running all the time; that is why I asked you about investigating the execution times of your threads.
In the original example, there is a call to tx_thread_sleep(1) in the ptp_thread_entry.c file, inside the while(1) loop. This function causes the thread to suspend for 1 tick, and because there are (by default) 100 ticks per second, the thread suspends for 10 ms.
Please check if this call exists in your code, and if it does, modify it to "tx_thread_sleep(0)" so the service returns immediately.
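A minimal sketch of the change, assuming the default thread loop from the example (the entry function name follows the FSP-generated ptp_thread_entry.c):

```c
#include "tx_api.h"

/* Relevant part of the PTP thread after the change */
void ptp_thread_entry(void)
{
    while (1)
    {
        /* ... PTP housekeeping from the example ... */

        /* Was tx_thread_sleep(1): 1 tick = 10 ms at the default
         * 100 ticks per second, so the thread was suspended for
         * 10 ms on every loop iteration. With an argument of 0 the
         * service returns immediately and the thread keeps running. */
        tx_thread_sleep(0);
    }
}
```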
The values in memory are updated by the DMA almost automatically, except when the DMA interrupt has to be serviced; so where the waveform is broken, sampling instants were missed because the DMA was busy at the time of sampling. I am toggling a GPIO in every DMA ISR; look in the following image at how the variance of the ISR triggers reaches the sampling boundary, which is a multiple of 10 us. You can clearly see an overlap. That is when the DMA is unavailable although data is ready to be picked up, and that is what causes the disruptions in the shape of the waveform.
AZ [Renesas] said: Please check if this call exists in your code, and if it does, modify it to "tx_thread_sleep(0)" so the service returns immediately.
I changed it, but look at the following. There is no improvement.
This waveform looks "more" continuous than the previous one. Why do you say that there is no improvement?
It is just a matter of when I take the screenshot; just look at this.
In this screenshot too, the improvement is clear, as the signal values form a continuous line. That is happening because your thread is no longer suspended for a period of time and is executing all the time.
I think the problem you are referring to is these abnormal edges of the signal that do not exist in a normal sine wave. However, these can be caused by noise (or other factors) affecting the signal generator or even the ADC. This has nothing to do with the DMA.
AZ [Renesas] said: However, these can be caused by noise (or other factors) affecting the signal generator or even the ADC. This has nothing to do with the DMA.
This is the example project; look at the signal, it is much cleaner. Everything is the same: same ADC, same signal generator.
And this is the signal I am getting, which is much more disrupted. Just look at the cursor in the following image.
This is a zoomed-in version of the above image.
This is the zoomed-in image of the second glitch.
Is there a difference between an event link created in the 'Event Links' tab and one set in the ADC properties under 'Stacks'?
And what happens if I have both? Does the ADC trigger twice?
The configuration in the second image already makes use of the ELC module, so you do not need to create an additional user event to trigger the ADC.
Can you share your DMA configurations?
https://drive.google.com/file/d/1OApjEs-8x5yFlcQlyPl8xlLTfhtAy6Ge/view?usp=sharing
This is the whole project. I am using two DMA channels: one triggered on ADC0 conversion complete and the other on ADC1 conversion complete. The ADCs are triggered by the rising and falling edges of the EPTPC timer. The DMAs write to alternate locations and are switched between buffers.
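The buffer switching in each DMA callback is roughly like this (a sketch; g_transfer0_ctrl, the callback name, the buffer size and the ADC data-register address are assumptions based on my project, so check them against your configuration and device header):

```c
#include <stdbool.h>
#include "hal_data.h"

#define SAMPLES_PER_BUFFER 512U /* assumed size */

/* Ping-pong buffers filled alternately by one DMAC channel */
static uint16_t g_buf_a[SAMPLES_PER_BUFFER];
static uint16_t g_buf_b[SAMPLES_PER_BUFFER];
static volatile bool g_next_is_a = false;

/* DMAC transfer-end callback: re-arm the channel on the other buffer.
 * g_transfer0_ctrl is the assumed name of the DMAC channel that is
 * activated by the ADC0 conversion-end ELC event. */
void g_transfer0_callback(dmac_callback_args_t *p_args)
{
    (void) p_args;

    uint16_t *p_next = g_next_is_a ? g_buf_a : g_buf_b;
    g_next_is_a = !g_next_is_a;

    /* Reload source (ADC0 data register for channel 0), destination
     * and transfer count for the next block */
    (void) R_DMAC_Reset(&g_transfer0_ctrl,
                        (void const *) &R_ADC0->ADDR[0],
                        p_next,
                        SAMPLES_PER_BUFFER);
}
```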
I am also sending the buffer out as UDP packets, and when I plot them on the receiving machine the waveform is clean, without any glitches. So the issue might be simultaneous access to the buffer: while the debugger is accessing the buffer, the DMA is updating it at the same time. But if this is true, then why does the example project produce clean graphs?
Indeed, the signal data are stored as expected.
I think the glitches could be caused by the debugger's Memory view window. One possible solution would be to enable "Real-time Refresh" by pressing this button in the Memory window.