RTC slows down, affecting DMA and serial output
Posted: Tue Apr 06, 2021 2:51 pm
I am trying to track down an issue in a fairly complex application, but the symptom is very odd. The application is a clock, so one thing it does is display the time. It also plays sounds, which it reads from an SD card and streams to I2S using DMA (it does not use the onboard DACs). It maintains an active WiFi connection and synchronizes its time using SNTP.
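For context, the audio path is set up roughly like this (a trimmed sketch using the ESP-IDF legacy I2S driver, not the actual project code; the sample rate, DMA buffer sizes and port number are placeholders):

#include "driver/i2s.h"

// Placeholder I2S/DMA configuration for an external codec
// (the real application's values differ).
static const i2s_config_t i2s_cfg = {
    .mode = (i2s_mode_t)(I2S_MODE_MASTER | I2S_MODE_TX),  // external codec, not the onboard DACs
    .sample_rate = 44100,
    .bits_per_sample = I2S_BITS_PER_SAMPLE_16BIT,
    .channel_format = I2S_CHANNEL_FMT_RIGHT_LEFT,
    .communication_format = I2S_COMM_FORMAT_STAND_I2S,
    .intr_alloc_flags = 0,
    .dma_buf_count = 8,    // DMA descriptors
    .dma_buf_len = 256,    // samples per descriptor
    .use_apll = false,
};

void audioInit() {
    i2s_driver_install(I2S_NUM_0, &i2s_cfg, 0, NULL);
    // i2s_set_pin(...) for the codec pins omitted here.
    // The streaming task reads blocks from the SD file and pushes them with:
    //   size_t written;
    //   i2s_write(I2S_NUM_0, buf, len, &written, portMAX_DELAY);
}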
After some time, the RTC clock slows. The evidence is that the time display slows down, the sound output slows down (the music plays slower), the debug output over serial becomes unreadable because the effective baud rate changes, and the WiFi becomes unusable. Losing the serial output is a particular shame: I compiled with core debug enabled, and it is clearly trying to output information.
So my question is, what could do this? The effect seems to act at a very fundamental level, since it hits all of these different functions at once, so I doubt that any single RTOS task could cause it. Am I wrong?
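One thing I plan to try, to confirm it really is a core clock change: log the reported CPU and APB frequencies from a low-priority task, since the UART baud rate and the I2S bit clock are both derived from the APB clock. A rough sketch, assuming the Arduino-ESP32 core helpers getCpuFrequencyMhz() and getApbFrequency(); the task itself is only illustrative:

#include <Arduino.h>

// Periodically report the clock frequencies the peripherals derive from.
// If these numbers change when the symptoms start, something is rescaling
// the core clocks (illustrative task, not part of the application).
void logClocksTask(void *arg) {
    for (;;) {
        Serial.printf("CPU: %u MHz, APB: %u Hz\n",
                      (unsigned)getCpuFrequencyMhz(),
                      (unsigned)getApbFrequency());
        vTaskDelay(pdMS_TO_TICKS(5000));
    }
}

// In setup(): xTaskCreate(logClocksTask, "clklog", 4096, NULL, 1, NULL);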