Is delay in a task not functioning as expected?

OK, I am experiencing some interesting behavior.
The idea is to read sensor values every X seconds (3 s here) on a Nordic MCU, but when I check the logic analyzer, the reads seem to happen a lot more frequently despite the delay in the read loop.

Another issue is with xQueueReceive, which somehow ends up at an unknown address after it executes (I couldn't debug it myself; I may check with the Nordic team on it).

// system.cpp

SystemTask::SystemTask(Sensor& _sensor, QueueHandle_t& _q) : sensor(_sensor), q(_q) {}

void SystemTask::Start()
{
    if (xTaskCreate(SystemTask::Process, "Run", 350, this, 0, &taskHandle) != pdPASS)	
    {
        APP_ERROR_HANDLER(NRF_ERROR_NO_MEM);
    } 
}

void SystemTask::Process(void* instance)
{
    auto* app = static_cast<SystemTask*>(instance);
    app->Run();
}

void SystemTask::Run()
{
    sensor.Start();

    while(true)
    {
       /* should be blocking the thread but for some reason ends up in an unknown state
        if (xQueueReceive(mTaskQueue, &mMessage, portMAX_DELAY) == pdPASS)      
        {
            
        }
      */
       vTaskDelay(pdMS_TO_TICKS(10000));
    }
}


// sensor.cpp
void Sensor::Start()
{
    if (xTaskCreate(Sensor::Process, "Process", 100, this, 0, &mTaskHandle) != pdPASS)	
    {
        APP_ERROR_HANDLER(NRF_ERROR_NO_MEM);
    } 
}

void Sensor::Process(void* instance)
{
    auto* app = static_cast<Sensor*>(instance);
    app->Run();
}

void Sensor::Run()
{
    ConfigI2C();
    Write(buffer, 1);  // write byte
   
    // read loop every 3s
    while(true)
    {
        Read();
        vTaskDelay(pdMS_TO_TICKS(3000));
    }
}

// main.cpp
int main()
{
   static constexpr uint8_t queueSize = 10;
   static constexpr uint8_t itemSize  = 1;
   QueueHandle_t q = xQueueCreate(queueSize, itemSize);

   Sensor sensor;
   SystemTask systemTask(sensor, q);
   systemTask.Start();
   vTaskStartScheduler();
}

Here's the logic analyzer capture: Screenshot-2023-05-28-at-11-30-44-PM (hosted at ImgBB).
Clearly the read calls are made a lot more frequently than every 3000 ms, so I am curious what may be going on here despite the 3 s delay.

There’s nowhere else Read() is invoked

If the MCU is a Cortex-M, the main stack is reset and used for ISRs after the scheduler starts, as documented. If sensor is stored as a reference, that would be a problem here, since the referenced object is a local variable in main. Unfortunately the related code (the class definitions) is missing.
If the SysTick is configured correctly, vTaskDelay works, of course.

The local var problem Hartmut pointed out of course also applies to the queue handle q.
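
Concretely, annotating the posted main() as I read it, all of these live in main()'s stack frame:

int main()
{
   QueueHandle_t q = xQueueCreate(10, 1); // the handle variable itself is a main-stack local
   Sensor sensor;                         // the whole Sensor object is a main-stack local
   SystemTask systemTask(sensor, q);      // stores references to the two locals above

   systemTask.Start();
   vTaskStartScheduler();                 // the main stack is reclaimed here; ISRs then reuse that memory
}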

I think I read about this a while back, but so what's happening here is that ISRs reuse the main stack, overwriting the local variables defined in main?

Is there a reason why this happens? And if no ISR had fired, would this not be an issue?

Since the tick and the PendSV that performs the scheduler request are both ISRs, the time before the first ISR fires is basically non-existent.

I wouldn't waste time speculating about it. Just follow the rules and don't use the main stack if you have an affected MCU.

It’s not wasting time though? Isn’t it always good to understand the HW before working on it?

We reset the main stack here - FreeRTOS-Kernel/port.c at main · FreeRTOS/FreeRTOS-Kernel · GitHub

After the scheduler starts, whatever you created on the main stack will likely get overwritten by ISRs. The solution to your problem is to make the variables created in main global, or file-level static.
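
For example, a minimal sketch of what that could look like with the class interfaces shown above (the header names are just placeholders for your project's own):

// main.cpp
#include "FreeRTOS.h"
#include "queue.h"
#include "task.h"
#include "sensor.h"   // placeholder include, adjust to your project
#include "system.h"   // placeholder include, adjust to your project

static constexpr uint8_t queueSize = 10;
static constexpr uint8_t itemSize  = 1;

// File-scope objects live in static storage, not on the main stack,
// so they survive the stack-pointer reset done when the scheduler starts.
static QueueHandle_t q;
static Sensor sensor;

int main()
{
    q = xQueueCreate(queueSize, itemSize);

    static SystemTask systemTask(sensor, q);  // function-local static: also not on the stack
    systemTask.Start();

    vTaskStartScheduler();

    // Should never be reached while the scheduler is running.
    for (;;) {}
}

If you want to avoid the heap entirely, statically allocated variants (xQueueCreateStatic / xTaskCreateStatic, with configSUPPORT_STATIC_ALLOCATION enabled) would be another option.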

Yes, but what if no ISRs fire? Would the variables in main still get overwritten?

But since the scheduler on Cortex-M processors (among others) is started with a PendSV interrupt, that happens as soon as the scheduler is started. It isn't just your application ISRs that do this; ALL ISRs do it.


Thanks for the link. Can you also please elaborate on how this snippet clears the main stack?

It doesn't "clear" the stack; it reloads the main stack pointer (MSP) with the initial stack value stored at the start of the vector table.
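
For reference, this is roughly what prvPortStartFirstTask() in the GCC ARM_CM4F port does (paraphrased; see the linked port.c for the exact code):

static void prvPortStartFirstTask( void )
{
    __asm volatile (
        " ldr r0, =0xE000ED08 \n" /* Locate the vector table via SCB->VTOR.           */
        " ldr r0, [r0]        \n" /* r0 = address of the vector table.                */
        " ldr r0, [r0]        \n" /* First entry = initial main stack pointer value.  */
        " msr msp, r0         \n" /* Reload the MSP with that initial value.          */
        " cpsie i             \n" /* Globally enable interrupts and faults.           */
        " cpsie f             \n"
        " dsb                 \n"
        " isb                 \n"
        " svc 0               \n" /* SVC call starts the first task.                  */
    );
}

From that point on, the main stack belongs only to interrupt handlers, which is why anything main() left on it can be overwritten.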