Search the Community
Showing results for tags 'accuracy'.
Hi to everyone, I'm new here. While I was reading the AD2 specifications I noticed that the Analog Input and the Voltmeter have somewhat different accuracy specs: is there a technical explanation? Why does the Analog Input spec 0.5% more error?
Hi all, I am seeing an issue on multiple Digital Discovery devices, where the timing precision is off by over a percent (where I'd expect 10 to 100 ppm, at least two orders of magnitude better). The problem is easiest to see by having the device generate a simple 50% duty cycle clock on one of its output pins, e.g. 1 kHz. On two different Digital Discovery devices I tested, the scope reports a frequency below 990 Hz. The signals otherwise look as expected (good levels, stable, good edges) apart from the bad frequency behavior. I confirmed the measurement with a high-quality Ke…
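For anyone who wants to reproduce the test, here is a minimal sketch of generating the 1 kHz / 50% duty test clock with the WaveForms SDK Python bindings; the pin index and the divider/counter split are illustrative assumptions, so check them against your SDK version's samples:

```python
# Minimal sketch: 1 kHz, 50% duty clock on a Digital Discovery output pin
# via the WaveForms SDK ctypes bindings. Pin index and divider/counter
# values are assumptions; adjust for your setup.
from ctypes import cdll, c_int, c_double, byref
import sys

# Load the WaveForms runtime (library name differs per OS)
if sys.platform.startswith("win"):
    dwf = cdll.dwf
elif sys.platform.startswith("darwin"):
    dwf = cdll.LoadLibrary("/Library/Frameworks/dwf.framework/dwf")
else:
    dwf = cdll.LoadLibrary("libdwf.so")

hdwf = c_int()
dwf.FDwfDeviceOpen(c_int(-1), byref(hdwf))  # open first available device
if hdwf.value == 0:
    raise SystemExit("no device found")

hz_sys = c_double()
dwf.FDwfDigitalOutInternalClockInfo(hdwf, byref(hz_sys))  # base clock, e.g. 100 MHz

pin = 0            # assumed DIO index for the output under test
target_hz = 1e3
divider = 1000     # counter tick rate = hz_sys / divider
# 50% duty: equal low and high counts; f = hz_sys / (divider * (low + high))
count = int(hz_sys.value / (divider * target_hz) / 2)

dwf.FDwfDigitalOutEnableSet(hdwf, c_int(pin), c_int(1))
dwf.FDwfDigitalOutDividerSet(hdwf, c_int(pin), c_int(divider))
dwf.FDwfDigitalOutCounterSet(hdwf, c_int(pin), c_int(count), c_int(count))
dwf.FDwfDigitalOutConfigure(hdwf, c_int(1))  # start pattern generation

input("generating test clock, press Enter to stop...")
dwf.FDwfDeviceCloseAll()
```

With a 100 MHz base clock this gives divider 1000 and counts of 50/50, i.e. exactly 1 kHz nominal, so any frequency error the scope reports should come from the device's timebase rather than rounding in the setup.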
The documentation for the DMM Shield states an accuracy of ±0.1% for voltage, current, and resistance, except for the specific ranges highlighted. Is this accuracy obtained with factory calibration alone, or does the user also need to calibrate the equipment? If the user needs to calibrate, can the calibration factors be stored and used by the Shield, or is an external program required to make adjustments in real time? At the moment, the Shield fails to produce measurements comparable with a Fluke 88V, which has lower rated accuracy for current and resistance readings. For the voltage readings, the accuracie…
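As a general illustration of the kind of user calibration being asked about (this is not the DMM Shield's actual calibration procedure, which its documentation would define), here is a sketch of deriving and applying a two-point gain/offset correction against a reference meter; all readings and numbers below are hypothetical:

```python
# Sketch of a two-point (gain/offset) user calibration, the usual linear
# correction: reading_true ~= gain * reading_raw + offset.
# The readings below are hypothetical; in practice they would come from
# the shield and a trusted reference (e.g. a calibrator or bench DMM).

def solve_two_point(raw_lo, ref_lo, raw_hi, ref_hi):
    """Fit gain/offset so both calibration points map exactly."""
    gain = (ref_hi - ref_lo) / (raw_hi - raw_lo)
    offset = ref_lo - gain * raw_lo
    return gain, offset

def correct(raw, gain, offset):
    """Apply the stored correction to a raw reading."""
    return gain * raw + offset

# Hypothetical calibration points on a 10 V range:
gain, offset = solve_two_point(raw_lo=0.012, ref_lo=0.000,   # shorted input
                               raw_hi=9.981, ref_hi=10.000)  # 10 V reference

# One (gain, offset) pair per range is what would need to be persisted
# (e.g. in nonvolatile memory) and applied to every subsequent reading.
print(f"gain={gain:.6f} offset={offset:+.6f}")
print(f"corrected 5 V reading: {correct(4.997, gain, offset):.4f} V")
```

The practical question in the post then reduces to where such per-range factors live: in the Shield's own nonvolatile storage, applied transparently, or in host software that corrects each reading after the fact.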