Showing results for tags 'datalogger'.
Found 3 results
Hi guys, I would like to find out if it is possible to log an analog signal at 100 MS/s for about 5 to 10 minutes. I read that the buffer of the AD2 is very limited, so it won't save a continuous file without any gaps, but this would be fundamental. If there are gaps, is it possible to estimate their length? If this works, what are suitable solutions for saving the data? I thought a Raspberry Pi would be perfect for this, but I read that the AD2 has some problems with it. How would the WaveForms software save the data? Is it easy to export it to Excel? Would be great if you can help me. Pascal
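For context, a back-of-the-envelope calculation shows why gapless capture at 100 MS/s is so demanding. This is just arithmetic on the numbers in the post (assuming 2 bytes of storage per sample, since the AD2's 14-bit samples are typically stored as 16-bit words), not tied to any particular driver API:

```python
# Rough data-rate estimate for gapless logging at 100 MS/s.
# Assumption: 2 bytes per sample (14-bit ADC stored as 16-bit words).
sample_rate = 100e6        # samples per second
bytes_per_sample = 2
duration_s = 5 * 60        # 5 minutes

rate_mb_s = sample_rate * bytes_per_sample / 1e6
total_gb = sample_rate * bytes_per_sample * duration_s / 1e9

print(f"Sustained rate: {rate_mb_s:.0f} MB/s")   # 200 MB/s
print(f"Total for 5 min: {total_gb:.0f} GB")     # 60 GB
```

A sustained 200 MB/s is far beyond what a USB 2.0 link (roughly 40 MB/s in practice) can carry, which is why continuous 100 MS/s capture over the AD2's USB connection cannot be gap-free; gapless "record" streaming only works at much lower sample rates.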
I am a scientist who is frequently tasked with characterizing instrumentation (as opposed to calibrating to NIST standards). I noted with interest on the OpenScope Kickstarter page: "monitor long-term (create an IoT device to capture, calculate, and log readings over many hours or days)". Life would be a LOT easier if I could use an affordable solution like an OpenScope-WaveForm with two 16-bit sensors at <<1000 sps each. Then I read the part about the 12-bit, 48-channel ADC. Well, I would settle for 12-bit, but I have to ask. Question: Is it practical to sacrifice bandwidth and most of the 48 channels to achieve some semblance of 16-bit resolution? I appreciate that Digilent should not be expected to cannibalize their commercial offerings in favor of the OpenScope project. I am hoping that EE wizards will find a 2-channel, 16-bit, 1000 sps OpenScope to be no threat to their product line/market. On the other hand, it would be of tremendous help to scientists! 😬 Thanks for your patience, and congratulations on a spectacular Kickstarter!
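On the bandwidth-for-resolution trade-off: the standard oversample-and-decimate rule of thumb says each extra bit of effective resolution costs roughly 4× the sample rate, so going from 12-bit to ~16-bit takes about 4^4 = 256× oversampling. A minimal sketch in plain NumPy (the rates, the 3.3 V span, and the noise level are illustrative assumptions, not OpenScope specifics):

```python
import numpy as np

# Oversample-and-decimate sketch: averaging 4**k consecutive 12-bit samples
# gains roughly k bits of effective resolution, provided there is enough
# noise/dither to decorrelate the quantization error.
extra_bits = 4                        # 12-bit -> ~16-bit
oversample_factor = 4 ** extra_bits   # 256 samples averaged per output point

adc_rate = 256_000                    # hypothetical input rate, samples/s
output_rate = adc_rate / oversample_factor  # 1000 sps out

rng = np.random.default_rng(0)
true_level = 1.2345                   # volts, an arbitrary DC test level
lsb = 3.3 / 2**12                     # 12-bit step over an assumed 3.3 V span
noisy = true_level + rng.normal(0, lsb, adc_rate)   # 1 s of dithered input
quantized = np.round(noisy / lsb) * lsb             # 12-bit quantization
averaged = quantized.reshape(-1, oversample_factor).mean(axis=1)

print(f"Output rate: {output_rate:.0f} sps")
print(f"Mean error after averaging: {abs(averaged.mean() - true_level)*1e6:.1f} uV")
```

The residual error of the averaged output is far below one 12-bit LSB (~0.8 mV here), which is the effect the poster is after. Whether the OpenScope's front end has the noise floor and linearity to make those extra bits meaningful is a separate hardware question.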
Hi all, I struggled for a while with unexpected DC readings from the datalogger. After some research I found a DC temperature drift error; I think this affects the oscilloscope and voltmeter tools as well. The attached picture shows a 2-hour record of a 3.3 V DC channel (1) and a zero-volt (shorted) channel (2). You can see a temperature/voltage drift ranging from -38.75 mV at 22 °C to +2.4 mV at 43.75 °C. The only point where this drift is zero is after 1 hour. For me this DC drift makes the mV range useless, and even in the volt range I have to wait an hour before I get more or less reliable results. The Range setting in the settings pull-down menu defaults to maximum and does not get saved with the workspace, so it always gives the maximum error at any point in time. I made this record with the 10 V range selected in the settings pull-down; if it had been on the default maximum, it would have been much worse. Both channels are in the 100 mV range. Calibrating the device after one hour makes no difference to the temperature dependency. Perhaps this is only the case with my device, but that's why we have a forum, isn't it? Now my question: where am I wrong? Thanks in advance, Hans.
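From the two readings in the post one can estimate the offset drift coefficient of the shorted channel. This is just a quick calculation from the figures given above, not an official spec:

```python
# Offset drift coefficient estimated from the reported readings:
# -38.75 mV at 22.00 degC and +2.4 mV at 43.75 degC on a shorted channel.
v1_mv, t1_c = -38.75, 22.00
v2_mv, t2_c = 2.4, 43.75

drift_mv_per_c = (v2_mv - v1_mv) / (t2_c - t1_c)
print(f"Offset drift: {drift_mv_per_c:.2f} mV/degC")   # ~1.89 mV/degC
```

Nearly 2 mV of offset shift per degree of warm-up would indeed swamp measurements on a 100 mV range, which matches the observation that the mV range is unusable until the device reaches thermal equilibrium.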