hello, i'm attaching the function void vPortSetupTimer() from the port.c file. please explain the subroutines that are called within the function, because I don't understand how it sets up the timer to generate the tick interrupt.
/* Sets up and enables the timer interrupt. */
/* Read the current 64-bit machine timer value (mtime) from its two
   32-bit memory-mapped halves. */
time = mtime->val_low;
time |= ((uint64_t)mtime->val_high << 32);
/* Add one tick period's worth of timer counts
   (configCPU_CLOCK_HZ / configTICK_RATE_HZ). */
time += (configCPU_CLOCK_HZ / configTICK_RATE_HZ);
/* Write the result to mtimecmp; the hardware raises the timer
   interrupt once mtime >= mtimecmp. */
mtimecmp->val_low = (uint32_t)(time & 0xFFFFFFFF);
mtimecmp->val_high = (uint32_t)((time >> 32) & 0xFFFFFFFF);
/* Enable the machine timer interrupt by setting the MTIE bit
   (bit 7) in the mie CSR. */
__asm volatile("csrs mie,%0"::"r"(0x80));
A few pointers: we don't know which platform you are using; it would have helped to mention at least the processor manufacturer.
Also, setting up the tick timer interrupt is a fundamental part of getting FreeRTOS going, and it would help if you read through the extensive documentation available online before starting.
As a starting point, I would recommend reading the new project creation pages, then looking through your platform's demos.
thanks for the reply. I'm working on an RTG4 FPGA that hosts a RISC-V processor running at 50 MHz. The problem is that I'm having delay-related issues: the delays I get are not accurate, so my timing calculations do not come out exact.
As far as I know, the tick interrupt rate is set by configTICK_RATE_HZ, right? I have set that value properly.
Please tell me: does configCPU_CLOCK_HZ affect the tick rate?
I think the timer setup that generates the tick interrupt, defined within port.c, affects the delay timing. Is that right?
configCPU_CLOCK_HZ influences the tick rate, as it is used when calculating the value written into the tick timer's compare register. So you want to set it to 50000000UL. Also make sure that configTICK_RATE_HZ is set to no more than 1000.
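For reference, the relevant FreeRTOSConfig.h entries would look something like the excerpt below. This assumes the mtime timer is clocked at the full 50 MHz CPU clock with no prescaler; if your design prescales the timer clock, configCPU_CLOCK_HZ must reflect the timer's actual rate instead.

```c
/* FreeRTOSConfig.h (excerpt) -- values assume the timer driving the
   tick runs at the full 50 MHz CPU clock with no prescaler. */
#define configCPU_CLOCK_HZ      ( 50000000UL )
#define configTICK_RATE_HZ      ( ( TickType_t ) 1000 )
```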
I have already set the values as you suggested, but the delay is still not exact. Is there anything else I might have missed?
What’s the variation in the tick frequency? Can you toggle a pin within the timer tick interrupt and measure that compared to what you expect?
I created a program with two tasks: the first task prints values over UART, and the second toggles an LED. I set configTICK_RATE_HZ = 1000 and call vTaskDelay(1000).
By calculation the delay should be 1000 ms (1 s), but in my case I measure 100 s. Every delay comes out 100 times its expected value (1 s * 100 = 100 s).
One thing to note: configCPU_CLOCK_HZ is slightly misnamed, as it really needs to be the clock rate of the timer being used (which may be prescaled from the real CPU clock).
Second, that setup looks a bit strange unless the tick interrupt does something similar and keeps advancing mtimecmp to the desired point of time in the future.