Channel: Counter/Timer topics

Timing latency

We're trying to get accurate (millisecond-level) latency measurements across a chain consisting of a pre-processed analog signal, a Python program performing some processing, and a post-processed analog signal. Currently the two analog signals are fed into an NI USB-6212 box. Additional processing is performed by other systems both before the signal reaches our program and after it. What we'd like is to determine how much of the total lag occurs before the analog signal reaches the Python program and how much occurs after.

 

Since the NI DAQ box can take highly accurate time measurements relative to a recorded start time using LabVIEW, we figured that having our Python script also record the system time would suffice. However, based on our readings, we're not confident the clocks match up: our code appears to receive the signal before the NI DAQ box records it, which should be impossible.
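One likely culprit is that the Python "system time" and the DAQ's hardware clock are entirely separate clock domains, and the OS wall clock can itself be stepped or slewed by NTP mid-run. A minimal sketch (pure Python, no DAQ involved) showing the two clocks Python exposes and why neither is directly comparable to the DAQ's:

```python
import time

def timestamp_event():
    """Record both wall-clock and monotonic timestamps for an event.

    time.time() follows the OS wall clock, which NTP can step or slew,
    while time.perf_counter() is monotonic but counts from an arbitrary
    epoch. Neither shares a time base with the NI DAQ hardware, which is
    why cross-device comparisons can appear to run backwards.
    """
    return time.time(), time.perf_counter()

wall0, mono0 = timestamp_event()
time.sleep(0.01)  # stand-in for the signal processing step
wall1, mono1 = timestamp_event()

# Elapsed time is safe to compute on the monotonic clock; absolute
# wall-clock readings are not safe to compare against the DAQ's clock.
elapsed = mono1 - mono0
print(round(elapsed, 3))
```

This only measures durations inside the Python process; lining those durations up with DAQ timestamps still requires a shared reference, which is what the loopback idea below provides.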

 

We're considering having our program output a signal to the NI box so that all measurements are taken on the same clock; in that case, producing the lowest-latency output signal possible is important.
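If that loopback route works, every event is timestamped on the DAQ's own clock and splitting the total lag becomes simple arithmetic. A sketch with purely hypothetical timestamps (the variable names and numbers are illustrative, not measured):

```python
# Hypothetical event times, all on the DAQ's hardware clock (seconds).
t_analog_in  = 0.000  # pre-processed analog signal arrives at the DAQ
t_loopback   = 0.012  # pulse emitted by the Python program, seen by the DAQ
t_analog_out = 0.020  # post-processed analog signal arrives at the DAQ

# Lag from the analog input up to (and including) the Python program's
# output, versus everything downstream of the program.
lag_to_program    = t_loopback - t_analog_in
lag_after_program = t_analog_out - t_loopback

print(lag_to_program, lag_after_program)
```

The caveat is that `lag_to_program` still includes whatever latency the output path (Python write call, USB transfer, DAQ update) adds; that output latency would need to be bounded separately, e.g. by looping an output line straight back into an input line.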

 

The other approach is to move the NI DAQ code inside our own program (using PyDAQmx), but as I understand it, data collected this way arrives in batches, which doesn't sound like it guarantees more accurate timing.
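Batched reads need not hurt timing accuracy, though: in a hardware-timed buffered acquisition, the time a read returns in Python says little about when each sample was digitized, but per-sample times can be reconstructed from the acquisition start and the sample rate. A minimal sketch of that reconstruction (no hardware required; `t0` and `fs` are assumed inputs):

```python
def sample_times(t0, fs, n):
    """Hardware-clock timestamps for a buffered read of n samples.

    With a buffered DAQmx-style acquisition the driver hands back a
    whole batch at once, so instead of timestamping the read call,
    compute each sample's time from the acquisition start t0 and the
    hardware sample rate fs (samples are equally spaced by 1/fs).
    """
    return [t0 + i / fs for i in range(n)]

times = sample_times(t0=0.0, fs=1000.0, n=5)
print(times)  # one timestamp per sample, 1 ms apart at 1 kHz
```

On this view the batch boundary only adds reporting delay, not timing uncertainty; the remaining problem is still relating `t0` to events outside the DAQ, which is the same clock-domain issue as before.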

 

Any thoughts or advice would be appreciated. This is my first time working with an NI DAQ system.

