Hello LabVIEW Community,
I am trying to implement a measurement setup where I use two ABZ counters simultaneously to obtain the position error of a DUT (ctr0) vs. a reference system (ctr1).
I use a dummy AI task to generate a common sample clock for synchronous clocking of both counters, and an Arm Start trigger so that both counter tasks start when the DUT index pulse is detected.
I use a flat sequence structure to ensure both counter tasks are started before the clock is provided. At the end, the calculated discrepancy (i.e. the error) is plotted against the corresponding position value of the reference system to obtain a dataset 'error vs. position on circumference'.
Now the problem: when I execute the VI, the dataset doesn't start at position 0°, but somewhere between 60° and 100°, depending on how fast the scale is turning during the measurement. That makes me think the counter buffers are being filled before the 'Read N Samples' VI gets to execute, so the N samples returned are not taken from the very beginning of the acquisition. I'm a bit at a loss here: how can I ensure that my measurement starts at 0° while the scale is in motion?
I tried forcing the Start Task of the dummy AI to execute after the Read N Samples VI via the flat sequence, hoping that providing the clock only after the samples have been requested would solve the issue, but to no avail.