esp_timer_get_time() and vTaskDelay() behaviour
Posted: Wed Feb 28, 2024 9:19 pm
Hi, I was trying the example provided by Espressif on this page: https://docs.espressif.com/projects/esp ... speed.html
For my specific function I got 766 microseconds per invocation, but including vTaskDelay(pdMS_TO_TICKS(2)) inside the loop changes the result, as described below.
I wanted to test it in my code while making the following change:
Code:
#include <stdio.h>
#include "esp_timer.h"

void measure_important_function(void) {
    const unsigned MEASUREMENTS = 5000;
    uint64_t start = esp_timer_get_time();
    uint64_t retries = 0;
    while (retries < MEASUREMENTS) {
        important_function();
        retries++;
    }
    uint64_t end = esp_timer_get_time();
    printf("%u iterations took %llu milliseconds (%llu microseconds per invocation)\n",
           MEASUREMENTS, (end - start) / 1000, (end - start) / MEASUREMENTS);
}
Code:
while (retries < MEASUREMENTS) {
    vTaskDelay(pdMS_TO_TICKS(2));
    important_function();
    retries++;
}
I get 2000 microseconds per invocation. I was expecting something around 2766 microseconds per invocation, since my function is still called inside the while loop. Removing the function and leaving only vTaskDelay(pdMS_TO_TICKS(2)) gives the same result: 2000 microseconds per invocation.
Is this the typical behaviour?