I’m a beginner in RTOS and I’m confused about the tick rate. What is configTICK_RATE_HZ, and what changes occur when changing configTICK_RATE_HZ? Could you please explain, or give a link to help me understand the tick rate?
RTOSes such as FreeRTOS configure a time source to generate a periodic interrupt, which is called the tick interrupt. The interrupt is used to unblock tasks that are blocked with a timeout, and to time slice between tasks that run at equal priority.
Time is measured in ‘tick periods’, i.e. as a count of tick interrupts. So, for example, a task may want to block using the vTaskDelay() API function, and when it does it specifies how long it wants to block for in ticks. For example, vTaskDelay( 100 ) means block for 100 ticks. How long 100 ticks actually is depends on the frequency of the tick interrupt. You can convert milliseconds into ticks by dividing by the tick period in milliseconds, so 100 ms is ( 100 / portTICK_RATE_MS ).
If you set the frequency high then you get higher resolution timing, at the expense of having to process more tick interrupts. Normally an application would set the tick frequency at around 100 Hz.
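To make the relationship concrete, here is a minimal sketch (assuming configTICK_RATE_HZ is set to 100 in FreeRTOSConfig.h; the task name is hypothetical) of a task that blocks for a fixed number of milliseconds regardless of the tick frequency:

```c
#include "FreeRTOS.h"
#include "task.h"

/* Hypothetical example task. portTICK_RATE_MS is the tick period in
   milliseconds (1000 / configTICK_RATE_HZ), so with configTICK_RATE_HZ
   at 100 the division below works out to 100 / 10 = 10 ticks. */
void vExampleTask( void *pvParameters )
{
    for( ;; )
    {
        /* Do some periodic work here... */

        /* Block for 100 ms worth of tick periods. */
        vTaskDelay( 100 / portTICK_RATE_MS );
    }
}
```

Newer FreeRTOS versions also provide the pdMS_TO_TICKS() macro for the same milliseconds-to-ticks conversion.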
Thank you so much, okay, I understood something. Here, if I want a 500 ms delay, I should call the API function as vTaskDelay( 500 / portTICK_RATE_MS ), which means vTaskDelay( 500 / (1000 / 100) ), and that is equal to vTaskDelay( 50 ): 500 ms would take 50 tick interrupts if my tick frequency is 100 Hz, and 1 second = 100 ticks. Is all of this correct? If so, what happens if I set the tick frequency to 10 Hz - would it be vTaskDelay( 500 / (1000 / 10) ) = vTaskDelay( 5 )? That means 5 ticks for 500 ms, or 10 ticks for 1 second, and if so I can set up to vTaskDelay( 65535 ), which would give 109 minutes. But the documentation says that if configUSE_16_BIT_TICKS is 1 the maximum available delay is nearly 11 minutes.
> the documentation says that if configUSE_16_BIT_TICKS is 1 the maximum available delay is nearly 11 minutes.
The maximum delay can only be specified in tick periods, as the amount of calendar time it relates to is dependent on the tick frequency (as per the calculations you have in your post). Where does it say this in the documentation? Can you post a link - I will get it clarified if it is misleading.
In that discussion, I said that at 100 Hz (a fairly reasonable tick rate, and the rate that the questioner indicated) the time limit was about 11 minutes. Note that I said: “At 100Hz, a 16 bit tick will allow a maximum delay of 655 seconds, i.e. about 11 minutes.”.
The poster was asking about getting hours of delay, and I was pointing out that this wouldn’t work.
Sorry, it was my mistake. Please let me know whether everything else I mentioned here is correct. Let me ask some more things: can I set the frequency to 1 Hz? Then for 500 ms, would it be vTaskDelay( 500 / (1000 / 1) )? Is vTaskDelay( .5 ) - half a tick - possible? Why do you say 100 Hz is a fairly reasonable tick rate? And I think I can set a delay up to vTaskDelay( 65535000 / (1000 / 1) ) if configUSE_16_BIT_TICKS is 1; that means vTaskDelay( 65535 ), and 65535 seconds = 18 hrs?
With a 1 second tick rate you cannot delay for 0.5 seconds, as the parameter is an integral type (so 1/2 = 0 by integer math). Also, a delay of 1 tick means delay until the next tick occurs, which might be in a few microseconds or might be nearly a full tick interval, depending on how long it has been since the last tick. Thus a delay of 1 will give you 0-1 tick periods, a delay of 2 will give you 1-2 tick periods, and so on (and the upper end only applies if no higher priority task wants the time).
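A minimal sketch of the integer-math point, assuming a 1 Hz tick (the msToTicksCeil helper is hypothetical, not part of the FreeRTOS API):

```c
#include "FreeRTOS.h"
#include "task.h"

/* Hypothetical helper: convert ms to ticks, rounding up so a non-zero
   request never truncates to zero ticks. */
#define msToTicksCeil( xMs ) \
    ( ( ( xMs ) + portTICK_RATE_MS - 1 ) / portTICK_RATE_MS )

void vDelayExample( void )
{
    /* With configTICK_RATE_HZ = 1, portTICK_RATE_MS is 1000, so plain
       integer division truncates: 500 / 1000 == 0 ticks, no block at all. */
    vTaskDelay( 500 / portTICK_RATE_MS );

    /* Rounding up instead requests 1 tick, which (as described above)
       actually blocks for somewhere between 0 and 1 full tick period. */
    vTaskDelay( msToTicksCeil( 500 ) );
}
```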
I suggest 10 ms/100 Hz as a reasonable tick period out of experience. If you slow it down to 10 Hz/100 ms, then any user interface that works on timer ticks will exhibit noticeable lag at times, as people can perceive delays of that order.
I also rarely find a need for timeouts to be as accurate as a millisecond; tens of milliseconds seem to be good enough.
Sorry, I don’t understand you; it may be because this is my first time with an RTOS.
> I suggest 10 ms/100 Hz as a reasonable tick period out of experience. If you slow it down to 10 Hz/100 ms, then any user interface that works on timer ticks will exhibit noticeable lag at times, as people can perceive delays of that order.
I asked many things but I didn’t get the reply I expected. Could you please reply to each separately?
Suppose my configTICK_RATE_HZ is 100 Hz:
1) What does that mean? Will 100 ticks give a 1 second delay? // vTaskDelay( 100 ) = 1 sec delay, vTaskDelay( 50 ) = 500 ms delay?
2) vTaskDelay( 500 ) - in this API call it means delay or block for 500 ticks, and vTaskDelay( 500 / portTICK_RATE_MS ) means delay or block for 500 ms, right? Equivalent to vTaskDelay( 50 ).
Now suppose configTICK_RATE_HZ is 10 Hz:
3) vTaskDelay( 10 ) - here 10 ticks block for 1 second, and I can set a maximum delay of vTaskDelay( 6553500 / portTICK_RATE_MS ) = vTaskDelay( 65535 ) (the maximum unsigned short value), which gives 1.8 hrs of block time. Is that correct?
4) I’m using a 16-bit microcontroller; can I still set configUSE_16_BIT_TICKS to 0?
It means the tick frequency in hertz is 100. So there are 100 ticks per second, or one tick every one hundredth of a second.
> Will 100 ticks give a 1 second delay?
Yes.
> 2) vTaskDelay( 500 ) - in this API call it means delay or block for 500 ticks, and vTaskDelay( 500 / portTICK_RATE_MS ) means delay or block for 500 ms, right?
Yes.
> Equivalent to vTaskDelay( 50 ).
If configTICK_RATE_HZ is 100 then asking for a delay of 50 ticks is asking for a delay of (50 * 0.01) = 0.5 seconds as there is one tick every one hundredth of a second.
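A minimal sketch of that equivalence, assuming configTICK_RATE_HZ is 100 (the function name is hypothetical):

```c
#include "FreeRTOS.h"
#include "task.h"

void vHalfSecondDelay( void )
{
    /* With configTICK_RATE_HZ = 100, portTICK_RATE_MS is 10, so both
       calls below request the same 50-tick (0.5 second) block. */
    vTaskDelay( 50 );
    vTaskDelay( 500 / portTICK_RATE_MS );   /* 500 / 10 = 50 ticks */
}
```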
> Now suppose configTICK_RATE_HZ is 10 Hz:
>
> 3) vTaskDelay( 10 ) - here 10 ticks block for 1 second
Yes.
> and I can set a maximum delay of vTaskDelay( 6553500 / portTICK_RATE_MS ) = vTaskDelay( 65535 ) (the maximum unsigned short value)
So, with a 10 Hz tick (0.1 seconds per tick), the maximum delay you can ask for with a 16-bit tick count is (0xffff * 0.1) = 6553.5 seconds = 1.82 hours.
Note that this only applies to vTaskDelay() and vTaskDelayUntil(). If you are blocking on a queue or semaphore, and INCLUDE_vTaskSuspend is set to 1 in FreeRTOSConfig.h, then the longest block time in ticks you can request is actually 0xfffe, as 0xffff is a special case that results in the task blocking indefinitely (without a timeout).
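A sketch of the two cases, assuming a queue of uint32_t created elsewhere with xQueueCreate() and passed in (the function name is hypothetical):

```c
#include <stdint.h>
#include "FreeRTOS.h"
#include "queue.h"

void vQueueBlockExample( QueueHandle_t xQueue )
{
    uint32_t ulValue;

    /* Block for at most 500 ms worth of ticks waiting for data. */
    if( xQueueReceive( xQueue, &ulValue, 500 / portTICK_RATE_MS ) == pdPASS )
    {
        /* Data arrived within the timeout. */
    }

    /* With INCLUDE_vTaskSuspend set to 1, portMAX_DELAY (0xffff when
       configUSE_16_BIT_TICKS is 1) is the special case described above:
       block indefinitely, with no timeout, until data arrives. */
    xQueueReceive( xQueue, &ulValue, portMAX_DELAY );
}
```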