I am facing a hard-to-debug issue around queue set usage.
My queue set is made of two queues, Q1 and Q2, which are filled from different contexts.
I followed the queue set construction rules by setting the size of the whole set to the sum of the lengths of Q1 and Q2.
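For reference, here is roughly how I create the set (the queue lengths, item type and names are placeholders, not my real code):

```c
#include <stdint.h>
#include "FreeRTOS.h"
#include "queue.h"

#define Q1_LEN 8   /* placeholder lengths */
#define Q2_LEN 8

static QueueHandle_t xQ1, xQ2;
static QueueSetHandle_t xSet;

void vSetupQueues(void)
{
    xQ1 = xQueueCreate(Q1_LEN, sizeof(uint32_t));
    xQ2 = xQueueCreate(Q2_LEN, sizeof(uint32_t));

    /* The set must be able to hold an event for every item that can be
     * queued on its members, so its size is the sum of the lengths. */
    xSet = xQueueCreateSet(Q1_LEN + Q2_LEN);

    /* Queues must be empty when they are added to the set. */
    xQueueAddToSet(xQ1, xSet);
    xQueueAddToSet(xQ2, xSet);
}
```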
Q1 is filled using xQueueSend from three different tasks: T1, T2 and T3.
Q2 is filled using xQueueSendFromISR, always from the same interrupt source.
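The producer side looks roughly like this (the message type, the delay and the function names are hypothetical; the handles are the ones created at init):

```c
#include <stdint.h>
#include "FreeRTOS.h"
#include "task.h"
#include "queue.h"

extern QueueHandle_t xQ1, xQ2;   /* created at init, members of the set */

/* Body shared by T1, T2 and T3: each posts events to Q1. */
void vProducerTask(void *pvParams)
{
    uint32_t ulMsg = 0;
    for (;;)
    {
        /* Block if Q1 is full rather than drop the event. */
        xQueueSend(xQ1, &ulMsg, portMAX_DELAY);
        ulMsg++;
        vTaskDelay(pdMS_TO_TICKS(10));
    }
}

/* The single interrupt source posts to Q2. */
void vMyISR(void)
{
    BaseType_t xWoken = pdFALSE;
    uint32_t ulEvt = 1;

    xQueueSendFromISR(xQ2, &ulEvt, &xWoken);
    portYIELD_FROM_ISR(xWoken);
}
```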
T1 is the task that makes decisions based on the data coming in on Q1/Q2, and it is the only task waiting for events from Q1/Q2 via the queue set.
When T1 waits for the queue set to unblock on Q1 or Q2, it calls xQueueSelectFromSet, generally with a portMAX_DELAY timeout.
Once I get the member handle back, I consume the Q1 or Q2 data with xQueueReceive using a 0 tick timeout.
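The T1 event loop is essentially this sketch (handling code elided, names hypothetical):

```c
#include <stdint.h>
#include "FreeRTOS.h"
#include "queue.h"

extern QueueHandle_t xQ1, xQ2;
extern QueueSetHandle_t xSet;

void vConsumerTaskT1(void *pvParams)
{
    for (;;)
    {
        /* Block until one of the member queues has data. */
        QueueSetMemberHandle_t xActive =
            xQueueSelectFromSet(xSet, portMAX_DELAY);

        uint32_t ulMsg;
        if (xActive == xQ1)
        {
            /* 0 timeout: the set guarantees one item is available. */
            xQueueReceive(xQ1, &ulMsg, 0);
            /* ... act on Q1 data ... */
        }
        else if (xActive == xQ2)
        {
            xQueueReceive(xQ2, &ulMsg, 0);
            /* ... act on Q2 data ... */
        }
    }
}
```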
As stated above, T1 itself sometimes sends data to Q1, but Q1 can also be filled from T2 or T3.
99% of the time this mechanism works fine. HOWEVER, I have found a weird issue: I am losing an event in Q1, and the number of messages waiting in Q1 + Q2 no longer matches the number of events waiting in the queue set!
How can this be possible?
Maybe I am doing something wrong, please advise.