I am running FreeRTOS at the default 1000 Hz tick rate on a 32-bit PIC running at 200 MHz.
Keeping that tick rate:
- can I use vTaskDelayUntil() to create delays on the order of microseconds (i.e. thousands of times shorter than the tick period)?
- generally, what is the precision/accuracy of vTaskDelayUntil()?
If you are using the vTaskDelayX functions then the tick count is the time base, so with a 1 ms tick period you have 1 ms resolution. Note this means a 1-tick delay will actually block for anywhere between just over 0 ms and just under 1 ms, depending on how far through the current tick period you are when the delay starts.
If you want something faster you would need to use a peripheral timer.
Thank you Richard,
you mention vTaskDelay; I assume that also includes vTaskDelayUntil, correct?
When you say use a peripheral timer, do you mean the classic way, or is it better to use some FreeRTOS API? If the FreeRTOS API, then which one, and where do I find some examples?
Finally, I read it is better to keep the FreeRTOS tick at 1,000 Hz. But I have a PIC32 running at 200 MHz. Is that still the case or not?
Many thanks again!
By the X in vTaskDelayX, Richard means all the related functions, because they are all based on the FreeRTOS system tick. The Systick is the only timer used by FreeRTOS itself.
Therefore there is no FreeRTOS API that uses any other timer, and if you want to make use of another peripheral timer you have to implement it yourself.
There is not really a general rule or value for the FreeRTOS Systick, i.e.
configTICK_RATE_HZ. It's up to you to find the best matching value based on the actual clock speed and the desired FreeRTOS Systick resolution. When using time slicing, the Systick also determines the granularity with which the scheduler runs/preempts two or more tasks of the same priority. Keep in mind that a higher Systick frequency increases the overhead consumed by the FreeRTOS scheduler itself, because it is triggered more often (on every Systick).
I did not pick up on the X.
Thank you Hartmut and Richard.
One quick comment: 1000 Hz is NOT the 'default' rate of the tick interrupt, it is just the rate used by the demos, in part to stress the system.
I find I tend to use a significantly lower tick rate in most systems. The biggest restriction on tick rate is the precision you need for timeouts, since they tend to be -1/+0 ticks, but that normally isn't that tight. A second factor is the rate/precision you need for purely time-periodic tasks, but I find in most cases this tends to be 'human' time-scale stuff (so 1 ms is still fast), or is actually driven by some other device/interrupt rather than being purely time based.
That makes sense. Thank you Richard