I have set up a control task that starts a frequency generator and then delays itself for a set number of millisecond ticks. The idea is that when it resumes it turns the frequency generator off, so that an exact, known number of rising edges should have occurred.
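For reference, here is a minimal sketch of the kind of task I mean (assuming a FreeRTOS-style API; frequency_generator_start() and frequency_generator_stop() are hypothetical wrappers around my hardware driver, not shown here):

/* Sketch only: start the generator, sleep for 1000 ms-ticks, stop it. */
#include <stdint.h>
#include "FreeRTOS.h"
#include "task.h"

extern void frequency_generator_start(uint32_t hz);  /* hypothetical driver call */
extern void frequency_generator_stop(void);          /* hypothetical driver call */

static void vControlTask(void *pvParameters)
{
    (void)pvParameters;

    frequency_generator_start(1000u);        /* 1 kHz output                  */
    vTaskDelay(pdMS_TO_TICKS(1000));         /* block for 1000 one-ms ticks   */
    frequency_generator_stop();              /* expect exactly 1000 rising edges */

    vTaskDelete(NULL);                       /* one-shot task: delete ourselves */
}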
My question is: just how accurate is tick timing? Presumably, I could run the frequency generator at 1 kHz for 1000 ticks and I'd get exactly 1000 rising edges. What are the odds I'd get 999 or 1001?
Now, pushing the limits: if I were running at 2, 5, or 10 kHz, how likely am I to be able to stop the frequency generator at exactly the right number of rising edges?
For the sake of argument, let's say there is only ONE task running.
Thanks for your input,