Hi,
I'm using two target-to-host DMA FIFOs to transfer acquired data to the host PC (see picture).
In the FPGA VI, within the while loop, I use a Wait function configured in microseconds.
Whatever value "dt" I give to this Wait, the data I collect on the host shows a systematic timing error:
every 10 points, the elapsed time is not dt but (dt + 1 µs).
It's a small error, but it becomes significant when dt needs to go below 10 µs, as in my case, and it makes the analysis harder.
The behavior is perfectly repeatable, not random: I get ten good "dt" intervals, then one of dt + 1 µs, and then the pattern repeats.
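For what it's worth, the pattern looks like what integer-microsecond timestamps would show if each loop iteration actually took slightly longer than dt. As a rough illustration (not LabVIEW, just arithmetic — the 0.1 µs per-iteration overhead is purely a guess on my part, not something I measured):

```python
# Simulate a loop whose true period is dt + 0.1 us instead of dt,
# with timestamps quantized to whole microseconds.
# Working in 0.1-us ticks to avoid floating-point rounding surprises.

dt_tenths = 50            # requested wait: 5.0 us (example value), in 0.1-us ticks
overhead_tenths = 1       # hypothetical extra 0.1 us of loop overhead per iteration

period = dt_tenths + overhead_tenths          # true period: 5.1 us
stamps_us = [(i * period) // 10 for i in range(31)]   # integer-us timestamps
intervals = [b - a for a, b in zip(stamps_us, stamps_us[1:])]
print(intervals)
# -> nine intervals of 5 us, then one of 6 us, repeating
```

With this assumed overhead, the extra 0.1 µs per iteration accumulates and shows up as one dt + 1 µs interval every 10th point, which is essentially the pattern I observe.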
I wonder if I'm doing something wrong or if it's an intrinsic problem.
Thanks