FreeRTOS: Studying impact of delays in tasks

Hello. I am new to FreeRTOS and currently learning the fundamentals. I am trying to study the impact of delays and function calls on the execution time of a task. Consider this simple snippet from a task I wrote:

TickType_t xStart, xEnd, xDifference;

xStart = xTaskGetTickCount();
xil_printf(<A 14400 byte string>);
xEnd = xTaskGetTickCount();
xDifference = xEnd - xStart;
xil_printf("Time diff: %lu ticks\n\r", xDifference);

Now, since I am printing the string on a serial interface with a baud rate of 115200 (which I took to be 14400 bytes/sec), the print statement should take exactly 1 second to execute (tick rate set to 1000 Hz, so one tick = 1 ms). But the time difference measured is 1250 ticks, i.e. 1.250 seconds. Can anyone please explain this to me? Why is it taking more time than it should?

Also, if I use vTaskDelay( pdMS_TO_TICKS( 1000UL ) ); instead of the print statement, the measured difference is 1000 ticks, which is correct.

Thank you.

You are converting the baud rate to characters per second incorrectly. When a character is sent over an asynchronous port, there is first 1 start bit, then typically 8 data bits, then typically 1 stop bit (some modes add a parity bit, may use only 7 data bits, or may use 2 stop bits). A typical frame is therefore 10 bits per character, so at 115200 baud you get 11520 characters per second, not 14400. And 14400 / 11520 = 1.25, which matches the 1.250 seconds you measured.

Thank you very much for the answer.