I have set up my application and it is working fine.
My peripheral clock is 40 MHz, and the CPU clock is 80 MHz.
The tick is correctly generated at 1000 Hz; I verified it with a scope.
So even though everything appears correct, my task uses a 500 ms delay:
const TickType_t smallDelay = 500 / portTICK_PERIOD_MS;
yet in reality the task is switched every 250 ms.
I tried both vTaskDelay() and vTaskDelayUntil(), but the result is the same.
I am quite baffled by this; I don't see where this factor of 2 comes from.
The only way I've found to fix it is to set the peripheral clock to 80 MHz in FreeRTOSConfig.h.
But that isn't the true value:
#pragma config FNOSC    = PRIPLL   // Oscillator Selection
#pragma config FPLLIDIV = DIV_2    // PLL Input Divider (PIC32 Starter Kit: use divide by 2 only)
#pragma config FPLLMUL  = MUL_20   // PLL Multiplier
#pragma config FPLLODIV = DIV_1    // PLL Output Divider
#pragma config FPBDIV   = DIV_2    // Peripheral Clock Divisor
The main oscillator is an 8 MHz crystal: divided by 2 that gives 4 MHz, and multiplied by 20 gives an 80 MHz system clock. The peripheral clock is that divided by 2, so 40 MHz, exactly as stated in the config.
#define configTICK_RATE_HZ          ( ( portTickType ) 1000 )
#define configCPU_CLOCK_HZ          ( ( unsigned long ) 80000000UL )
#define configPERIPHERAL_CLOCK_HZ   ( ( unsigned long ) 40000000UL )
Does anyone know what's happening?
And please, how do I stop code from appearing in bold on this forum?