Hello!
I made a big (well, for me it is big) state machine (SM).
There are a few while loops (queued SMs, Q-SMs). Each one has a case structure inside, and before the case structure an element is dequeued. The queued data is a cluster with a typedef and a variant: the typedef selects the case, and the variant carries the data used inside that case. It is a pretty standard queued SM with queues.
So there is:
(1) One Q-SM for the data I/O; for now it is a simulation, but later on it will be real hardware.
(2) One main SM.
(3) An error log SM.
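To make sure I am describing the pattern clearly, here is a rough text-based sketch (in Python, just as pseudocode for the idea, since LabVIEW is graphical) of what one of my Q-SM loops does. The names `state` and `data` stand for the typedef and the variant; this is only an illustration of the structure, not real LabVIEW code:

```python
import queue

def error_log_sm(q):
    """One Q-SM loop: dequeue an element, then a 'case structure'
    keyed on the typedef decides what to do with the variant data."""
    log = []
    while True:
        state, data = q.get()   # dequeue blocks until an element arrives
        if state == "LOG":      # the typedef selects the case...
            log.append(data)    # ...and the variant is used inside it
        elif state == "STOP":
            return log

# Feed the queue the way another loop would.
q = queue.Queue()
q.put(("LOG", "overtemp"))
q.put(("STOP", None))
print(error_log_sm(q))  # -> ['overtemp']
```

Note that in this sketch the loop naturally sleeps inside the blocking dequeue, which is the behaviour my Q2 below is asking about.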
I have got some questions about how to get the timing right.
Right now I have the Wait (ms) function in the data-I/O loop waiting for 10 ms, AND the same Wait (ms) function in the main SM.
Q1: Is this the right way? Do I need the Wait (ms) function in all Q-SMs? Wouldn't it be better to have one SM telling the other SMs the speed of execution?
Q2: In the error log SM there is NO timer. Is this okay? In other words, will that Q-SM only execute when there is data dequeued?
Q3: Is Wait (ms) the right function? The name suggests that it waits for x ms (I use 10 ms). Does this 10 ms add on top of the rest of the code? That is, if the rest of the code takes 1 ms and I have a 10 ms Wait (ms), do I get a total of 11 ms per iteration? Is this true?
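To make Q3 concrete, here is a small Python timing sketch of the two behaviours I can imagine. The first variant sleeps for the full wait time after the work (so times add up to 11 ms); the second only tops up to a fixed period, like I understand Wait Until Next ms Multiple is supposed to do. The function names and timings are just my illustration, not anything from LabVIEW:

```python
import time

def sequential_iteration(work_s=0.001, wait_s=0.010):
    """Work, then a full wait: iteration time is work + wait (the 11 ms case)."""
    start = time.monotonic()
    time.sleep(work_s)      # stand-in for 1 ms of real per-iteration work
    time.sleep(wait_s)      # stand-in for a 10 ms Wait (ms) wired after the work
    return time.monotonic() - start

def fixed_period_iteration(work_s=0.001, period_s=0.010):
    """Work, then sleep only the remaining time: iteration time is ~the period."""
    start = time.monotonic()
    time.sleep(work_s)
    remaining = period_s - (time.monotonic() - start)
    if remaining > 0:
        time.sleep(remaining)   # top up to 10 ms total, not 10 ms extra
    return time.monotonic() - start

print(round(sequential_iteration(), 3))    # roughly 0.011 s
print(round(fixed_period_iteration(), 3))  # roughly 0.010 s
```

So my question in other words: which of these two behaviours does Wait (ms) give me when it sits in the same loop as the rest of the code?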
Thank you all for the great help!