sub-tick blocking delays

demanzano wrote on Saturday, March 06, 2010:

I’m quite new to FreeRTOS, and I’d like to ask for your hints on the best way to perform very small (sub-tick) blocking delays.
I’m working with Microchip PIC24H and FreeRTOS 6.0.3.
A classic example is LCD data writes: I’m thinking of using a classic “gatekeeper” task which actually performs the nibble-data writes (reading from an in-memory copy of the screen data), but the delays involved are about 0.5 millisecond (or even less).
I don’t think it is correct to increase the tick timer frequency to, say, 2 kHz just to be able to do a one-tick delay (or similar) with the FreeRTOS API.
The hardware runs at 80 MHz (40 MIPS), but I guess there are better ways.
I also thought about using real, looping blocking delays (like __delay()) and setting the LCD gatekeeper task to low priority, so it would probably be preempted and the delays would increase (the LCD controller would be happy, but “screen refresh” would be impacted as well).
What do you suggest for this case? Which other solutions do you use?
Thanks for any hints, suggestions, or ideas!



richard_damon wrote on Sunday, March 07, 2010:

There are a couple of ways to handle these short delays.

1) Put a busy loop in your code to just burn the time; wasteful of cycles, but simple.
2) Raise the tick rate. Note that even 2 kHz isn’t a complete answer: to be sure you get a full 0.5 milliseconds, you need to wait 2 ticks, since a 1-tick delay gives you anywhere from 0 to 0.5 milliseconds (depending on where you are in the current tick).
3) Just always delay a longer time than needed.
4) Set up an auxiliary timer channel. When you want a short delay, set up the timer to interrupt after the desired time and wait on a semaphore that the timer’s ISR gives.
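Option 4 could be sketched roughly as below, using the FreeRTOS 6-era binary semaphore API. This is only a sketch under assumptions: start_oneshot_timer_us() is a placeholder for the PIC24-specific timer register setup (e.g. programming Timer2 and its period register), and the ISR name and yield macro are port-dependent.

```c
#include "FreeRTOS.h"
#include "task.h"
#include "semphr.h"

/* Placeholder: program a spare hardware timer to fire one interrupt
   after uS microseconds.  On a PIC24 this would write TxCON/PRx. */
extern void start_oneshot_timer_us( uint32_t uS );

static SemaphoreHandle_t xDelaySemaphore;

void vSubTickDelayInit( void )
{
    /* vSemaphoreCreateBinary() creates the semaphore in the 'available'
       state, so take it once so the first delay actually blocks. */
    vSemaphoreCreateBinary( xDelaySemaphore );
    xSemaphoreTake( xDelaySemaphore, 0 );
}

/* Hypothetical timer ISR - on a PIC24 port this would be the
   _TxInterrupt handler, after clearing the interrupt flag. */
void vAuxTimerISR( void )
{
    portBASE_TYPE xHigherPriorityTaskWoken = pdFALSE;
    xSemaphoreGiveFromISR( xDelaySemaphore, &xHigherPriorityTaskWoken );
    /* Yield macro is port-specific; many ports use this form. */
    portEND_SWITCHING_ISR( xHigherPriorityTaskWoken );
}

/* Block the calling task for roughly uS microseconds without
   busy-waiting; other tasks run while we are blocked. */
void vSubTickDelay( uint32_t uS )
{
    start_oneshot_timer_us( uS );
    xSemaphoreTake( xDelaySemaphore, portMAX_DELAY );
}
```

The task blocks on the semaphore instead of spinning, so the CPU is free for other work during the delay; the cost is one spare timer channel and an extra interrupt.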

demanzano wrote on Tuesday, March 09, 2010:

Very useful hints, indeed!

I’m experimenting with 1), busy loops of about 0.5 ms, and it seems to work fine.
Actually, I think it also covers 3), because the task can be preempted by both ISRs and other tasks, so I’m sure I delay at least 0.5 ms: never less, but possibly more.
Solution 4) sounds interesting but maybe overkill, IMVHO, for this purpose (LCD controller).
About increasing the tick timer frequency (currently I’m running at 200 ticks/sec, 5 ms each): what’s the rule of thumb for choosing the “right” one? This PIC24 runs at 80 MHz / 40 MIPS, so a tick frequency of 1000 Hz makes 1 ms = 40K opcodes, which still sounds acceptable…

Many thanks for your answer!



richard_damon wrote on Tuesday, March 09, 2010:

Tick rate is a trade-off: lower rates give you less accuracy in delays, but higher tick rates increase the overhead of running the tick interrupt and scheduler on every tick.

The delay loop works fine as long as you are able to make the routine low priority, just burn the time, and still get acceptable performance.

#3 meant calling vTaskDelay(2), which will delay between 1 and 2 ticks: always long enough, but likely way too long. If that still gives acceptable performance, it doesn’t waste the cycles, in case other background tasks could make good use of them.