Hi,
I am using a PCIe-6353 DAQ card and trying to figure out what the 'allowed' counter output frequencies are. I found some threads about it, but I am not sure I understood everything correctly. This article comes pretty close:
https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z000000P83OSAS&l=en-US
As far as I understand, the counter output (CO) can only generate signals whose period is an integer multiple of its internal timebase. For the PCIe-6353 that should be 100 MHz (I think; the datasheet lists different values under "Sample clock timebase"), i.e. a 10 ns resolution. So let's say I want to generate a clock at 140 kHz (period 7142.86 ns). With a 10 ns timebase, an integer number of ticks is therefore not possible. Following the linked article, the device will actually output the closest frequency that can be represented by an integer number of ticks: in this case 139.86 kHz (715 ticks) or 140.056 kHz (714 ticks), depending on whether the tick count is rounded up or down.
My questions are:
1) Did I understand this correctly?
2) The article above says one can use the DAQmx timing property node "SampleClk.Rate" to read back the actual frequency. When I do that with a simulated device, I get exactly 140 kHz, which should not be possible (see above). Is that because the device is simulated?
3) Isn't it better to calculate the allowed frequency myself and use that as the input? That way I know exactly what is actually happening. Imagine I have another external device where I can only synchronize the start, not the clocks themselves. In that case it makes a difference whether the NI card is set to 140 kHz (actually running at 140.056 kHz) while the other device runs at exactly 140 kHz. If I instead set the NI card to a value it can follow exactly, the synchronization should be better (assuming the other device can actually run at exactly 140 kHz).
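For reference, this is roughly how I would compute the achievable frequencies myself. This is just a sketch assuming the counter timebase really is 100 MHz on the PCIe-6353; the function name and structure are my own, not anything from DAQmx:

```python
import math

def achievable_freqs(f_requested, f_timebase=100e6):
    """Return the (lower, upper) frequencies the counter can actually
    generate around f_requested, given an integer tick count of the
    timebase (assumed 100 MHz here)."""
    ideal_ticks = f_timebase / f_requested        # non-integer in general
    lower = f_timebase / math.ceil(ideal_ticks)   # more ticks -> longer period -> lower freq
    upper = f_timebase / math.floor(ideal_ticks)  # fewer ticks -> shorter period -> higher freq
    return lower, upper

# 140 kHz request: 714.29 ideal ticks, so the card can do
# either 715 ticks (~139.86 kHz) or 714 ticks (~140.056 kHz)
print(achievable_freqs(140e3))
```

One could then pick whichever of the two values is closer (or whichever the driver is known to choose) and pass that as the requested rate, so the programmed and actual frequencies agree exactly.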
Thank you