FreeRTOS + SystemClock Behavior


I am using STM32L4A6VGTxP with CubeIDE and want to create a system with FreeRTOS.
However, when I change the SystemClock at run time under FreeRTOS, I run into the following problem.
Is there a solution?

・Custom board using the STM32L4A6VGTxP microcontroller
・STM32CubeIDE v1.9.0
・Firmware package STM32Cube FW_L4 V1.17.2
・FreeRTOS v10.3.1 with CMSIS v2

Problem Description
At microcontroller startup, FreeRTOS is initialized with the system clock set to HSI 16 MHz.
If the system clock is changed afterwards, the OS tick still behaves as if the clock were 16 MHz.
How can I keep the OS tick in sync when the system clock is changed?

For example:
(1) When the system clock is changed from 16 MHz (HSI) to 80 MHz (PLL):
→ osDelay(1000) actually delays about 200 ms.
  osEventFlagsWait(ef_id, flags, options, 1000) times out after about 200 ms.
  HAL_Delay(1000) still delays 1000 ms.

(2) When changing from 16 MHz (HSI) to 4 MHz (MSI):
→ osDelay(1000) actually delays 4000 ms.
  osEventFlagsWait(ef_id, flags, options, 1000) times out after 4000 ms.
  HAL_Delay(1000) still delays 1000 ms.

When the system clock is changed from 16 MHz to 80 MHz with SysTick as the HAL timebase source,
HAL_Delay(1000) and osDelay(1000) both delay 1000 ms.

However, when TIM6 is used as the timebase source,
HAL_Delay(1000) delays 1000 ms, but osDelay(1000) delays only 200 ms.

When using an RTOS with STM32CubeIDE, it is strongly recommended to use a HAL timebase source other than SysTick.

I do not understand why there is a difference in delay time between HAL_Delay and osDelay when the timebase is set to TIM6.
I would also like to know what reconfiguration is needed to keep 1 tick = 1 ms.
Thank you in advance for your help.

The key point is that when you change the rate of the system clock, you need to change the divider used by the timer that generates your tick interrupt so that the tick period stays constant. FreeRTOS itself doesn’t come with code to do this, as it is too hardware specific, and even for the porting layer it is too application specific and unusual. I suspect that 99+% of programs won’t need to handle this sort of case.

Your options:

  1. Write code yourself to change the divider for the timer being used by the system tick. You can look at how the timer is initialized in the port layer to see what you need to do.

  2. Write code yourself to change the system tick to be based on a timer that doesn’t change when you change the processor frequency. I will often use one of the “Low Power” timers that divides the 32768 Hz crystal used by the real-time clock to give me a 1024 Hz tick, or more likely a 102.4 Hz tick. You rarely really need 1 ms resolution on the tick, and being exactly x ms is also an unusual requirement, so that is close enough.

When you call osDelay(1000), which maps to vTaskDelay(1000), you are asking the task to delay for 1000 ticks. If SysTick is driven by the same clock as the core and you change the core clock, the FreeRTOS tick rate changes too. As a result, 1000 ticks will correspond to a different amount of wall-clock time.

What is the value of configTICK_RATE_HZ? If it is defined in terms of SystemCoreClock (and assuming that variable gets updated when you change the system clock), using pdMS_TO_TICKS should fix the issue:

vTaskDelay( pdMS_TO_TICKS(1000) );

HAL_Delay is a busy-loop delay; avoid it in tasks so you do not waste CPU cycles.

One more option is to drive the FreeRTOS tick from a different timer (one whose rate does not change with the processor frequency) by overriding the port layer’s weak vPortSetupTimerInterrupt() function.

Thank you for your answer.

configTICK_RATE_HZ is still 1000, but configSYSTICK_CLOCK_HZ changes when the SystemClock is changed.
So when the clock is changed during operation, I call vPortSetupTimerInterrupt() myself to update the FreeRTOS SysTick.
By updating it with vPortSetupTimerInterrupt() whenever the clock changes,
I was able to confirm that the FreeRTOS SysTick stays in sync with the new clock.

Very helpful.
Thank you very much.
I appreciate it very much.

Remember, configTICK_RATE_HZ doesn’t actually force the tick rate to always be the right speed; it only means the tick will be the right speed when the system clock matches configSYSTICK_CLOCK_HZ at the point the system initializes the timer that generates the tick.

The key thing to remember is that every “timer” based on the system clock may need to be reprogrammed when you change the system clock rate, so if you plan to change that dynamically, your code needs to handle that.

You might not want to just call vPortSetupTimerInterrupt(), as that might do some things that you don’t need (or want) done again, but you should perform the part of it that sets the timer dividers for the new frequency.

Thank you for your kind remarks.

I will keep in mind that when the system clock rate is changed,
all “timers” based on the system clock may need to be reprogrammed,
and I will check for this.

Thank you very much.