I'm doing what should be a pretty simple edge-counting/event-timestamping operation. I have random events occurring at unknown times after an external trigger signal, and I need to timestamp the interval between the trigger and each event. So I have a counter set up to count the 100 MHz internal timebase, using the external events as the sample clock: whenever an external event occurs, the current counter value is latched and stored. The trigger signal is used to reset the internal counter.
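For reference, the task is set up roughly like this. This is a sketch using the nidaqmx Python API; the device name (`Dev1`), counter (`ctr0`), and PFI terminals are placeholders for the actual wiring, not necessarily what either of us is using:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

with nidaqmx.Task() as task:
    # Count edges of the 100 MHz internal timebase on ctr0.
    ch = task.ci_channels.add_ci_count_edges_chan("Dev1/ctr0", edge=Edge.RISING)
    ch.ci_count_edges_term = "/Dev1/100MHzTimebase"

    # Reset the count on each edge of the external trigger.
    ch.ci_count_edges_count_reset_enable = True
    ch.ci_count_edges_count_reset_term = "/Dev1/PFI1"
    ch.ci_count_edges_count_reset_active_edge = Edge.RISING

    # Use the random external events as the sample clock: each event
    # latches the current count (ticks since the last trigger).
    task.timing.cfg_samp_clk_timing(
        rate=1_000_000,  # max expected event rate; actual pacing is external
        source="/Dev1/PFI0",
        active_edge=Edge.RISING,
        sample_mode=AcquisitionType.CONTINUOUS,
    )

    task.start()
    ticks = task.read(number_of_samples_per_channel=10)
    print([t / 100e6 for t in ticks])  # seconds from trigger to each event
```
This requires attached DAQ hardware, so treat it as a configuration sketch rather than something runnable as-is.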
I wrote and tested this using a USB-6353 at the office, and my customer is using a PCIe-6612. Right now we're just doing some testing with function generators. As best as we can tell, the function generators are configured identically, but we're getting different values from the reads. The test setup is just a 10 kHz repeating trigger with a second pulse 5 µs after the first. As is often the case, "it works fine for me": I read this value as 5 µs, but the customer sees ~6.3 µs. A 4 µs delay reads as 4 µs for me, and ~5.3 µs for the customer on his 6612.
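To put numbers on it, here's the tick arithmetic. The tick counts below are the ones implied by the readings above (count / 100 MHz), not actual captured data; note the discrepancy is the same ~1.3 µs (~130 ticks) in both tests:

```python
TIMEBASE_HZ = 100e6  # both the 6353 and 6612 count a 100 MHz timebase

def ticks_to_us(ticks):
    """Convert a latched counter value to microseconds after the trigger."""
    return ticks / TIMEBASE_HZ * 1e6

# A pulse 5 us after the trigger should latch 500 ticks; 4 us, 400 ticks.
print(ticks_to_us(500))  # 5.0  -> what I see on the 6353
print(ticks_to_us(630))  # 6.3  -> what the customer sees on the 6612

# The offset is constant across both test pulses (~1.3 us = ~130 ticks):
print(ticks_to_us(630) - ticks_to_us(500))  # ~1.3
print(ticks_to_us(530) - ticks_to_us(400))  # ~1.3
```
A constant offset like that (rather than a proportional error) is part of why I suspect a configuration difference such as an edge or reset default, not a timebase mismatch.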
Is there a difference between the 6353 and the 6612 when it comes to counter-based tasks? Maybe a default edge direction I'm forgetting to set, or a different default "counter reset" value?