xTaskGetTickCount() resolution to microseconds

Hello,

I want to use the xTaskGetTickCount() function to get the time in microseconds.

I know that by using:

xTaskGetTickCount() * portTICK_PERIOD_MS;

whereby

#define portTICK_PERIOD_MS ( ( TickType_t ) 1000 / configTICK_RATE_HZ )

I can get the time in milliseconds. What is the best way to have the resolution in microseconds?
I haven’t found any examples so far.

Thankful for any suggestion!

You could just multiply the number by 1000 to get it in microseconds, but that doesn’t improve the accuracy of the value.

Basically, you need to read a higher-speed counter to get more resolution, and that is port-specific. You could look at the run-time statistics (performance counter) code for an example. Many processors have a counter that runs at the instruction rate, or timer peripherals that can run at a specified rate and can give you this information.

Thanks for answering. You mean (the inaccurate way) just:

xTaskGetTickCount() * 1000?

Richard means xTaskGetTickCount() * portTICK_PERIOD_MS * 1000

Yes, xTaskGetTickCount can’t have more accuracy than the tick, but you can give it as much resolution as you want. To get more real accuracy, you need to use a faster counter, the availability of which is very processor-specific.
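To make the resolution-versus-accuracy point concrete, here is a small self-contained sketch of that conversion. The FreeRTOS macros are reproduced as plain defines so it compiles outside a FreeRTOS project; in a real build they come from FreeRTOSConfig.h and the port layer. The multiplication is widened to 64 bits because ticks * 1000 can overflow a 32-bit TickType_t.

```c
#include <stdint.h>

/* Stand-ins for the FreeRTOS definitions (assumed 1 kHz tick). */
typedef uint32_t TickType_t;
#define configTICK_RATE_HZ   ( ( TickType_t ) 1000 )
#define portTICK_PERIOD_MS   ( ( TickType_t ) 1000 / configTICK_RATE_HZ )

/* Convert a tick count to microseconds. The result still only changes
 * once per tick - the extra digits are resolution, not accuracy. */
static uint64_t ticks_to_us( TickType_t ticks )
{
    return ( uint64_t ) ticks * portTICK_PERIOD_MS * 1000U;
}
```

At a 1 kHz tick rate this simply appends three zeros: 1234 ticks become 1234000 microseconds, but the value still jumps in 1000 µs steps.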

Yes, I concur with @richard-damon. You should use a hardware-specific counter/timer to get more resolution than the tick count.
In essence:
TO-DO:

  • Use a hardware timer/counter specific to your processor. Such details can be found in the data-sheet/reference-manual of the MCU.

Not TO-DO:

  • Don’t increase the tick frequency too much (think: a tick every microsecond). That will most likely be too much for a small processor, and the added context-switching overhead will also reduce performance.