vTaskDelay(pdMS_TO_TICKS(1000)); vs vTaskDelay(1000 / portTICK_PERIOD_MS);
Posted: Mon Jul 08, 2019 10:35 am
So I've been seeing more examples use vTaskDelay(pdMS_TO_TICKS(x)); instead of the, IMHO, traditional vTaskDelay(x / portTICK_PERIOD_MS);.
It doesn't seem right to use vTaskDelay(pdMS_TO_TICKS(x)); if it gives the same result but just adds an extra macro call at every use.
I am just curious why this is, because when I look them up they are defined as follows.
Code:
//#define pdMS_TO_TICKS( xTimeInMs ) ( ( ( TickType_t ) ( xTimeInMs ) * configTICK_RATE_HZ ) / ( TickType_t ) 1000 )
vTaskDelay(pdMS_TO_TICKS(1000));
//#define portTICK_PERIOD_MS ( ( TickType_t ) 1000 / configTICK_RATE_HZ )
vTaskDelay(1000 / portTICK_PERIOD_MS);