Search the Community
Showing results for tags 'accuracy'.
Found 2 results
Hi all,

I am seeing an issue on multiple Digital Discovery devices where the timing accuracy is off by more than one percent, whereas I'd expect 10-100 ppm, i.e. at least two orders of magnitude better. The problem is easiest to see by having the device generate a simple 50% duty cycle clock on one of its output pins, e.g. 1 kHz. On the two Digital Discovery devices I tested, the scope reports a frequency below 990 Hz. The signals otherwise look as expected (good levels, stable, clean edges); only the frequency is off.

I confirmed the measurement with a high-quality Keysight 53230A frequency counter. Repeating the same measurement with an Analog Discovery 2 shows the expected performance (1 kHz accurate to about 10 ppm, which is quite good for a crystal oscillator without oven control).

I have attached screenshots of what I'm seeing on one of the Digital Discovery devices below; the other one is similar (about 987 Hz instead of 984.5 Hz). You can already see on the scope that the signal isn't a clean 1 kHz, and the Keysight measurement confirms that quite unambiguously. The Keysight measurement also shows that the signal not only has an unexpected frequency, but is less stable than one would expect.

I can provide serial numbers of the devices I tested and perform further measurements if that helps pinpoint what's going on. Any help would be appreciated. Given my very positive experience with the AD2, I can't believe that what I'm seeing is within spec. On the other hand, I verified this with two different DD devices purchased over six months apart, so it doesn't look like a single faulty device.

As a side note, I seem to be seeing the same thing on the inputs: if I sample a high-quality, externally generated 1 kHz clock (coming from an SRS FS740), the timing is noticeably off. However, as far as I'm aware, WaveForms doesn't provide an easy way to display trigger rate counts, so I want to focus on the clock generation problem first, because that's much easier to verify and reproduce.

If anyone else can repeat this simple measurement and share their findings, that would be great!

Best regards,
Sidney
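For anyone who wants to reproduce the test without clicking through the WaveForms GUI, here is a minimal sketch that generates the 1 kHz test clock through the WaveForms SDK (Python ctypes bindings, following the pattern of Digilent's DigitalOut samples). The library path and the choice of the first digital-out channel (DIO-24 on a Digital Discovery) are assumptions for illustration, not details from the post:

import sys
from ctypes import *

# Load the WaveForms runtime (path assumed for Linux; cdll.dwf on Windows)
dwf = cdll.dwf if sys.platform.startswith("win") else cdll.LoadLibrary("libdwf.so")

# Open the first available device (here, a Digital Discovery)
hdwf = c_int()
dwf.FDwfDeviceOpen(c_int(-1), byref(hdwf))
if hdwf.value == 0:
    raise RuntimeError("No device found")

# Query the internal clock that the pattern generator divides down from
hzSys = c_double()
dwf.FDwfDigitalOutInternalClockInfo(hdwf, byref(hzSys))

# 1 kHz, 50% duty on channel 0 (DIO-24 on the Digital Discovery):
# the divider sets the counter tick rate, and a 1/1 counter toggles the
# pin on every tick, so the tick rate must be 2x the target frequency.
pin = 0
dwf.FDwfDigitalOutEnableSet(hdwf, c_int(pin), c_int(1))
dwf.FDwfDigitalOutDividerSet(hdwf, c_int(pin), c_int(int(hzSys.value / 1e3 / 2)))
dwf.FDwfDigitalOutCounterSet(hdwf, c_int(pin), c_int(1), c_int(1))
dwf.FDwfDigitalOutConfigure(hdwf, c_int(1))

input("Nominal 1 kHz on DIO-24; measure with a counter, then press Enter...")
dwf.FDwfDeviceCloseAll()

With the output running, a frequency counter (or another device's scope) should read very close to 1.000 kHz if the timebase is within spec.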
The documentation for the DMM Shield states an accuracy of ±0.1% for voltage, current, and resistance, except for the specific ranges highlighted. Is this accuracy obtained with the factory calibration alone, or does the user also need to calibrate the equipment? If user calibration is needed, can the calibration factors be stored and applied by the Shield, or is an external program required to apply the corrections in real time?

At the moment, the Shield fails to produce measurements comparable to a Fluke 88V, which has a lower specified accuracy, for the current and resistance readings. For the voltage readings the accuracies are comparable, but the Shield still gives poor readings when its measured values are checked against calculated ones.
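For clarity on what "storing and applying calibration factors" would involve, here is a purely illustrative sketch of the usual two-point (gain/offset) correction; this is not the DMM Shield library's actual API, just the arithmetic such a calibration implies, with made-up reference values:

# Hypothetical two-point calibration: derive gain/offset from two
# reference measurements, then correct subsequent raw readings.
def calibrate(raw_lo, ref_lo, raw_hi, ref_hi):
    gain = (ref_hi - ref_lo) / (raw_hi - raw_lo)
    offset = ref_lo - gain * raw_lo
    return gain, offset

def correct(raw, gain, offset):
    return gain * raw + offset

# Example: device reads 0.998 V and 4.985 V against 1.000 V / 5.000 V references
gain, offset = calibrate(0.998, 1.000, 4.985, 5.000)
print(correct(2.490, gain, offset))  # corrected mid-scale reading, ~2.497 V

If the Shield can persist a gain/offset pair like this per range, user calibration would be a one-time procedure; otherwise the host program would have to apply the correction to every reading.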