
Introduction, hello!


mor.paul


@xc6lx45, @zygot, and all

Thank you for your input. As I mentioned earlier, I'm limited in time on this project; I have many things to do besides collecting the data. I managed to borrow a Tektronix TDS 3034 scope, and it is much better than the Rigol 1104 I was using. I have stopped looking for an alternative ADC method and have stopped working on the Eclypse Z7. Thanks for all the input. I'll update the results from this project as I get some real data.

 

Paul


On 7/19/2020 at 2:44 PM, xc6lx45 said:

(oscilloscopes I've worked with tended towards lower resolution e.g. 8-bit-ish, compared to dedicated data acquisition cards)

ADC and DAC resolution is a very interesting and complicated topic. The place to get this information is not the top of a converter datasheet or a distributor's filtered part query. Very few n-bit ADC devices deliver close to n bits of resolution; just read carefully through the datasheet for clues on what to investigate. A lot of 12-bit ADCs are closer to having 8-bit resolution, and that's before you analyze the analog front end and the other pertinent pieces that aren't part of the device but are necessary in order to use it. I'm just saying, when it comes to ADC hardware, don't assume anything or take marketing hype at face value.

At least with professional test equipment you are provided with tested and guaranteed performance specifications, and with a bit less wobble between claimed and actual real-world performance. As long as an ADC vendor delivers n bits of data (or something) out of an ADC, they can claim that it's an n-bit converter, even if it's really more like a converter with 8 ENOB, as admitted somewhere deep in the datasheet text. If you bother to research it, you'll be somewhat dismayed at how many variables can limit the true resolution and other performance metrics of a converter.
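If you want to put a number on it, the standard rule of thumb relating a measured SINAD figure to effective bits is easy to check yourself. A minimal Python sketch (the function name and the 50 dB example figure are mine, not from any particular datasheet):

```python
# ENOB implied by a measured SINAD, per the standard full-scale
# sine-wave rule of thumb: SINAD = 6.02 * ENOB + 1.76 dB.
def enob(sinad_db):
    """Effective number of bits from a SINAD measurement in dB."""
    return (sinad_db - 1.76) / 6.02

print(enob(74.0))  # ~12.0 bits: an ideal 12-bit converter
print(enob(50.0))  # ~8.0 bits: a "12-bit" part that behaves like an 8-bit one
```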


Well, what I meant is: equivalent to the quantization noise of an 8-bit(ish) Nyquist-rate converter, as if I'd rounded the samples to signed char, i.e. an SNR of about n*6 dB across the sampling bandwidth. That kind of back-of-the-napkin estimate.
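To make that napkin estimate concrete, here is a throwaway Python/NumPy sketch of the "round to signed char" experiment (the tone frequency is arbitrary):

```python
import numpy as np

# Quantize a full-scale sine to 8 bits ("round to signed char") and
# measure the SNR of the resulting quantization error.
n = 1 << 16
x = np.sin(2 * np.pi * np.arange(n) * 1001 / n)  # full-scale test tone

q = np.round(x * 127) / 127                      # 8-bit rounding
err = q - x
snr_db = 10 * np.log10(np.mean(x ** 2) / np.mean(err ** 2))
print(snr_db)  # ~49.9 dB, i.e. 6.02 * 8 + 1.76 dB across the band
```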

I'll be careful not to generalize this idea of mine (especially since I've rarely used scope digital data capture, more often turning the IQ baseband inputs of a spectrum analyzer into an overpriced two-channel oscilloscope), but I suspect that oscilloscopes are mainly designed to give a pretty picture on screen and cut corners e.g. on linearity. A dedicated digitizer (or rather, the ADC it's based on) can be orders of magnitude more accurate than anything the eye could pick out on a time-domain waveform.

For example, arbitrary-timebase resampling on a scope (the "fine" controls on the X-axis knob) is a standard feature on a scope box with a screen, but doing it in real time at anywhere near ADC quality is a near-impossible theoretical problem. On a digitizer (if it's an option at all), I can simply turn it off because I know it's rotten. On a scope, you may need detective work to even figure out what sampling rates are used internally and what sort of decimation algorithm is applied... oops, nearest-neighbor "0th-order" interpolation, and suddenly timing jitter defines the noise floor, which means it is largely proportional to the frequency of the highest signal component. Not a tool I'd pick by choice.
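That jitter-limited floor is easy to estimate: for a full-scale sine at frequency f sampled with RMS timing jitter tj, SNR = -20*log10(2*pi*f*tj). A quick Python sketch (the 10 ps jitter figure is just an assumed round number):

```python
import math

# Jitter-limited SNR of a sampled full-scale sine. It falls by 20 dB
# per decade of signal frequency, so the noise floor tracks the
# highest signal component, as described above.
def jitter_snr_db(f_hz, tj_s):
    return -20 * math.log10(2 * math.pi * f_hz * tj_s)

for f in (1e6, 10e6, 100e6):
    print(f"{f / 1e6:6.0f} MHz -> {jitter_snr_db(f, 10e-12):5.1f} dB")
# prints roughly 84, 64, and 44 dB for 10 ps RMS jitter
```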

And I dimly recall that even basic frequency accuracy (the sampling grid / timebase) was an issue with oscilloscope-digitized data when I tried coherent analysis (and for a budget unit without a 10 MHz reference clock input, I'd know that's the case just by looking at the back panel).


I don't disagree with anything that you've added. My point is that regardless of what you are using to capture data, you need to understand the context in which your measurements are made. This context includes things that we've both mentioned and quite a few things not mentioned. Part of both science and engineering is running experiments and taking measurements, and you can't correctly interpret the data from experiments if the measurements are flawed, except perhaps to realize that you need to try again. Bad data can come from the test equipment, from bad assumptions about how accurate the measurements are, or even from test equipment technique and settings. I have personally witnessed wrong conclusions drawn from bad data due to poor technique and a poor understanding of the equipment; sometimes this can result in deadly decisions. Yes, I've also made erroneous measurements and drawn incorrect conclusions myself.

I suppose that the reason that I brought up the matter of resolution at all is that I've been through so many converter datasheets and application notes over the years that I get a visceral reaction to any casual mention of it. So, my post is more about my internal turmoil than your post. Still, thinking about the subject is not a waste of time.


20 hours ago, xc6lx45 said:

I'll be careful not to generalize this idea of mine (especially since I've rarely used scope digital data capture,

Don't ask why this line just popped into my head...

If you've used a digital scope, chances are that you've used a decent digitizer. Even the better low-end digital scopes sample at 5X to 10X their analog bandwidth. Don't confuse the scope display with the scope digitizer's raw data output: usually what's collected when you download captured data is the raw data from the digitizer, though of course you need to verify this for your own scope. So, if you are prepared to process very large data files, your scope is just a digitizer with a fancy display that helps you find what you are looking for before trying to capture data. The cheaper scopes can frustratingly interfere with the data capture part until you figure out how to beat them into submission. But really, for most non-automated work a reasonably good digital scope is more useful and a lot cheaper than a high-end digitizer. It does depend on resolution requirements; oversampling can make up for deficiencies in resolution in some measurement situations.
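To illustrate that last point, here is a toy Python/NumPy sketch of the usual oversample-and-average trick (the DC level and noise amplitude are made-up numbers, and it assumes the input noise is large enough to dither the converter):

```python
import numpy as np

# An 8-bit quantizer reading a noisy DC level: averaging k oversampled
# readings shrinks the RMS error roughly as sqrt(k), i.e. about half a
# bit of extra resolution per doubling of k.
rng = np.random.default_rng(0)
true_level = 0.3217
for k in (1, 4, 16, 64):
    noisy = true_level + rng.normal(0.0, 1 / 127, size=(10000, k))
    codes = np.round(noisy * 127) / 127     # 8-bit quantization
    estimate = codes.mean(axis=1)           # k-sample average
    rms_err = np.sqrt(np.mean((estimate - true_level) ** 2))
    print(k, rms_err)
```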

I once had a couple of <$500 digital scopes that were just fine for most work. Both died within two years; I think they would still be working if they could still boot their controller processor. But the interface was surprisingly useful. You can expect cheap stuff to occasionally offer nice surprises but eventually disappoint. The gear from the big-name test equipment vendors is generally too expensive for individuals and most smaller companies, and even then it can disappoint.

