Rick314

Members
  • Content Count

    9

About Rick314

  • Rank
    Newbie


  1. So let me clarify for other readers -- everything in reply to my original 2/4/19 issue regarding the Electronics Explorer spectrum analyzer is unrelated to the root problem. The root problem is instead a defect in the Electronics Explorer arbitrary waveform generator (AWG) that generates the FM modulation signal. I tried what @attila suggested and the workaround works as shown below. Compare the following image to the image in my original post. Now the FM Bessel null (delta_f/f_mod = 2.405) correctly happens at an Electronics Explorer FM Index setting of 19.25%, derived by solving delta_f/f_mod = x*f_carrier/f_mod = x*12340/987.6 = 2.405 for x = 19.25%.
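The Index derivation in the post above can be checked numerically (a minimal sketch; the constant 2.405 is the first zero of the Bessel function J0, and the assumption is that the Index percentage x sets the peak deviation as a fraction of the carrier frequency):

```python
# First Bessel null: the FM carrier vanishes when the modulation index
# delta_f / f_mod equals 2.405 (the first zero of J0).
BESSEL_NULL_1 = 2.405

f_carrier = 12340.0  # Hz, AWG carrier frequency
f_mod = 987.6        # Hz, FM modulation frequency

# If the "Index" percentage x sets delta_f = x * f_carrier, the null
# occurs at x = 2.405 * f_mod / f_carrier.
x = BESSEL_NULL_1 * f_mod / f_carrier
print(f"Index setting for first Bessel null: {x:.2%}")  # -> 19.25%
```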
  2. @attila: I am still interested in getting my February 25 questions answered if you are able to help. In that post I also described what appears to be a misunderstanding on your part regarding the spectrum of an FM modulated signal. I think correcting that understanding will lead to other corrections in what you contributed to this topic.
  3. Thank you @attila, especially for the National Instruments "Understanding FFTs and Windowing" paper. It was very helpful. I still have the following related questions and issues. Given an AWG FM signal of 12.34 kHz carrier frequency, 987.6 Hz modulation frequency, and 19.25% "Index" (+/- 2.375 kHz deviation, or a Modulation Index of 2375/987.6 = 2.405), how can I configure the Spectrum Analyzer to most accurately indicate the first Bessel null, where the carrier level is at a minimum? I know it won't be exactly the null, but what settings come closest? You show spectrum analyzer images and mention their corresponding FFT capture periods. Where in the WaveForms user interface or online Reference Manual is it described what FFT capture period, windowing function, etc. result from given spectrum analyzer center frequency and span settings? In your February 12 post, you say "If you look at the frequency measurement which is performed progressively you see ~9kHz/~11kHz, but the FFT shows wrongly much wider spectrum." I believe this is an error in your interpretation and that the spectrum image is correct. It shows a 10 kHz carrier, 1 kHz modulation rate, and 10% (1 kHz) FM deviation. This is a modulation index of delta_f/f_mod = 1/1 = 1. Bessel functions say that the levels of the FM carrier and side-bands 1 through 5 in this situation should be -2, -7, -19, -34, -52, -72 dBc (dB relative to the un-modulated carrier) respectively. Within the resolution of the image, this is just what is being displayed.
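Those Bessel side-band levels can be verified with a short pure-Python computation (a sketch using the power series for J_n; scipy.special.jv gives the same values):

```python
import math

def bessel_j(n, x, terms=30):
    """Bessel function of the first kind J_n(x), via its power series."""
    return sum((-1)**k / (math.factorial(k) * math.factorial(k + n))
               * (x / 2)**(2*k + n) for k in range(terms))

beta = 1.0  # modulation index delta_f / f_mod
for n in range(6):  # n = 0 is the carrier, n >= 1 the side-bands
    level_dbc = 20 * math.log10(abs(bessel_j(n, beta)))
    print(f"n = {n}: {level_dbc:6.1f} dBc")
```

For β = 1 this prints approximately −2.3, −7.1, −18.8, −34.2, −52.1, and −72.0 dBc for n = 0 through 5.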
  4. @attila -- It appears the default FFT settings used in my test case introduce errors in the displayed spectrum analyzer result. I did not understand your reply, but I think I do understand FM, spectrum analysis, and FFTs. If possible, please answer my questions. I want to understand the source of the error in the spectrum analyzer display produced by the Electronics Explorer board and WaveForms software.
  5. Thank you for your response @attila but it led to more questions. My understanding is that in my test case the AWG is sweeping 12.34 kHz +/- 20.3%, or +/- 2.505 kHz. There is no problem with the AWG and it is behaving as its settings indicate. > your test the modulation frequency too high for the frequency transformation and produces odd result. So the issues relate to the Spectrum Analyzer and its FFT. > The FFT is performed on the entire (windowed) capture, approximating the entire waveform length with sine waves of different frequencies. I think of the FFT as the 40 MS/s sampler starting, sampling, stopping, processing, and posting one measurement. What do you mean by "entire (windowed) capture"? It would be best if you can point me to somewhere in the Reference Manual for such details. I am missing why you inserted "(windowed)". > Ideally the input waveform should be stable for the transformation period. What do you mean by "stable"? The input is an FM signal: 12.34 kHz carrier, 987.6 Hz modulation frequency, 2.505 kHz deviation. It is a "stable" signal as I understand the word. What do you mean by "the transformation period"? The sample period (or what I think of as the "window") of the 40 MS/s sampler, or the time when the FFT (the "transformation") is happening? A key question: Given an AWG FM signal of 12.34 kHz carrier frequency, 987.6 Hz modulation frequency, and 19.25% "Index" (+/- 2.375 kHz deviation, or a Modulation Index of 2375/987.6 = 2.405), how can I configure the Spectrum Analyzer to accurately indicate the first Bessel null, where the carrier level is at a minimum? > Here the top frequency is 20kHz, input period is 200ms: I understand what you are showing, but not "input period". Do you mean "sample period"? And how do you know it is 200ms? Again, pointing to the Reference Manual explanation would help. Thanks again for the time you are putting into this.
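One way to read "capture", "window", and "input period": the analyzer grabs N samples at some effective (decimated) rate fs, multiplies them by a window function such as Hann to taper the edges of the capture, and FFTs the product. The capture (input) period is then N/fs and the FFT bin spacing is fs/N. A minimal sketch of that arithmetic; the fs and N values below are illustrative assumptions chosen to reproduce the quoted 200 ms, not WaveForms' documented internals:

```python
fs = 81_920.0  # Hz, assumed effective sample rate after decimation
N = 16_384     # assumed FFT length (number of captured samples)

capture_period = N / fs  # seconds of signal in one FFT ("input period")
bin_spacing = fs / N     # Hz between adjacent FFT bins

# To resolve FM side-bands spaced 987.6 Hz apart, the bin spacing
# must be much smaller than 987.6 Hz.
print(f"capture period: {capture_period * 1e3:.0f} ms")  # -> 200 ms
print(f"bin spacing:    {bin_spacing:.1f} Hz")           # -> 5.0 Hz
```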
  6. In WaveForms 3.9.1, what does the FM Modulation Index setting actually set? The screenshot explains why this is unclear. On the Electronics Explorer board a single jumper wire connects AWG1 to the Scope 1 AC input. Frequency Modulation (FM) Modulation Index is defined as (peak frequency deviation) / (modulation frequency), or delta_f / f_mod. A value of 2.405 is the first Bessel null, meaning on a spectrum analyzer the carrier level goes to zero, as shown in the image. This is a common test scenario. So given that f_mod = 987.6 Hz, delta_f = 987.6 * 2.405 = 2.375 kHz. I found the carrier null by making small changes to the Amplitude/Index FM setting in WaveForms, and 20.3% nulls the carrier lower than either 20.2% or 20.4%. What logic makes a setting of 20.3% result in 2.375 kHz FM frequency deviation? I thought it might be "peak frequency deviation = 20.3% of the carrier frequency" but that would be 0.203 * 12.34 kHz = 2.505 kHz, so that doesn't work. Both Factory and my own calibration settings produce the same result.
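The arithmetic behind that mismatch, as a sketch: the deviation required for the first Bessel null versus the deviation implied by reading the observed 20.3% setting as a percentage of the carrier frequency.

```python
f_carrier = 12340.0    # Hz
f_mod = 987.6          # Hz
index_setting = 0.203  # the setting that actually nulled the carrier

# Deviation needed for the first Bessel null (modulation index 2.405).
deviation_needed = 2.405 * f_mod                          # ~2375 Hz
# Deviation implied if Index% means a fraction of the carrier frequency.
deviation_if_pct_of_carrier = index_setting * f_carrier   # ~2505 Hz

print(f"needed for null: {deviation_needed:.0f} Hz")
print(f"20.3% of carrier: {deviation_if_pct_of_carrier:.0f} Hz")
```

The ~130 Hz discrepancy between the two numbers is exactly the question this post raises.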
  7. > The high impedance mode is considered to be safest option and it is used as default. Thank you for your reply attila, but "is considered" by whom, and "to be the safest option" why? I described why this is a problem -- in my college Digital Logic class using 9 Electronics Explorer boards for the first time, over half the students experienced unexpected Static I/O LED outputs turning on for no apparent reason. This is a problem. Explaining what was happening had nothing to do with the lesson. There isn't really any "safety" problem with the default configuration being outputs, is there? (I am a software engineer with decades of test equipment user interface design experience.) It seems the highest-ROI fix for the problem is to make the default configuration of the Static I/O DIO pins outputs.
  8. Thank you attila, that makes sense. But then it seems that "LED" is an unwise default value of all Static I/O pins, being equivalent to "32 antennas sensing nearby DIO pins or other stray signals". Defaulting to some driven choice (an output with a default 0 or 1) seems like a better choice, so it takes intentional user action to turn them into antennas. Do you agree?
  9. Electronics Explorer Static I/O DIO pins configured as LEDs that are unused/unconnected will light when an adjacent driven LED lights. Details: Board power ON, nothing in the breadboard area, Supplies "Fixed Supply - Vcc" = 5V and "Master Enable" On, Static I/O DIO 31-8 = LEDs, DIO 7-0 = Bit I/O, Switch, Push/Pull. Connecting a wire from DIO 8 to Vcc lights both LED 8 and LED 9 solidly (DIO 9 has nothing connected to it). Similarly, connecting Vcc only to DIO 9 lights both LED 9 and LED 8. The same is true for LED pairs 10 and 11, 12 and 13, 14 and 15... The current workaround is to configure the LEDs adjacent to used ones as non-LED (e.g. a Button) to keep them out of consideration. But shouldn't unused LEDs always be off? And why is it only an issue with adjacent pairs?