D@n

Phantom logic in DD

Question

On one of my first attempts to use a DD, I captured a display showing several "phantom" transitions.  (See below, and attached)

dd-bouncing-display.png

I call these "phantom" transitions because when I zoomed in on any of these, they vanished.  You can see many of these just left of the 0.35ms line.  However, there are many others scattered throughout the plot.  For example, the DIO30 line shows a very slow logic waveform--with super fast phantom transitions on it as well.  In general, most of these phantom transitions are very narrow.  However, there are some larger and thicker phantoms.  For example, if you look at the "PP-CLK" trace, just after 0.98ms, there is a simple rise and fall.  However, the rise is thicker than the fall, even though there are no extra transitions there.

Dan


37 answers to this question

Recommended Posts


@D@n,

So, you haven't provided enough information. What's clocking the data sampler? Are all of the captured signals derived from the sampling clock? I've never used the DD but have used a number of similar "digital logic analyzer" tools, including my own. It's possible that what you are showing is an artefact of the display processing but I suspect that something else is going on.


@zygot,

The DD has an internal clock.  From the image, assuming others can see it (I can't), I had the clock set at 200MHz.  The signals are captured using this sampling clock.

Based upon what I've seen, I'm concluding that I'm seeing an artifact of the display processing.  The reason for this conclusion is that when I zoom in (without changing the collection data, or the collected buffer), the transitions vanish.

Let me try attaching the image again.

Dan

P.S. I found the bug in my own code ... the clock on the iCE40 wasn't locking.  I had never checked for that possibility over the last couple of months of struggling with this design.  My logic was not working reliably, and I couldn't figure it out until I saw a transition take place "faster" than my logic clock.  (Sample at 200MHz, logic clock at 40MHz)  That led me to look at the clock and then the light suddenly dawned on me.  Hence, it is now thanks to the DD that I was able to fix my design.
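The consistency check described above (a transition arriving faster than the logic clock could produce one) can even be automated over a capture. A minimal sketch in C++, using this thread's 200MHz sample / 40MHz logic clock numbers; the function name and structure are illustrative, not from any tool:

```cpp
#include <cstddef>
#include <vector>

// With a 200 MHz sampler (5 ns/sample), a 40 MHz logic clock (25 ns period)
// cannot legitimately produce two transitions fewer than 5 samples apart.
// Given a list of transition sample-indices, return those arriving
// impossibly fast.  (Sketch only; names are illustrative.)
std::vector<std::size_t> too_fast(const std::vector<std::size_t>& edges,
                                  std::size_t min_gap = 5) {
    std::vector<std::size_t> bad;
    for (std::size_t i = 1; i < edges.size(); ++i)
        if (edges[i] - edges[i - 1] < min_gap)
            bad.push_back(edges[i]);
    return bad;
}
```

Any non-empty result means either the probed logic or its clock is misbehaving, which is exactly how the unlocked PLL revealed itself.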

dd-bouncing-display.png


@D@n,

You write: "P.S. I found the bug in my own code"...

I'm glad to hear that you found an error in your HDL. I'm skeptical of your inference from the DD display, however. What you're telling me is that the DD sample clock is unrelated to any of the signals you are looking at. Whenever you are doing asynchronous sampling you will have "artifacts" that need interpretation.  How these will be displayed might be different on an $50K logic analyzer from HP than on a simple $50 logic analyzer from Amazon, but they will be there. When your sampling clock is a few orders of magnitude higher than the clock generating the target signals it's a bit easier to interpret artifacts. If state changes "vanish" when you zoom in on them I would certainly be concerned about the truthfulness of your logic analyzer ( which always involves display processing software ). If you don't understand when you tools are likely to lie to you the result is usually a trip down the rabbit hole where little makes sense.

The same idea about asynchronous sampling applies to logic in your FPGA, of course. If glitches are happening in the combinatorial logic between the clock edges and the timing isn't constrained properly, then you can end up sampling asynchronously even when your HDL says that you are using clocked processes properly. This is what makes FPGA development exciting. A favorable place and route one day can present a design as "working", and the next day a rebuild of the same HDL with portions routed differently can present intermittent errors. A good rule of thumb is to use as few combinatorial logic levels, and the simplest combinatorial logic, between clock edges as possible. And never use a clock derived from logic as a clock in your HDL.


@zygot,

I think you are misunderstanding what I am trying to describe.

I'm setting the logic analyzer to trigger on a condition.  It's triggering nicely.  I'm setting the logic analyzer to sample at 200MHz, and to record a large buffer.  Somewhere within it, running at 200MHz, logic levels are being converted to 1's and 0's and then written to memory.  This memory is then dumped to the waveforms software.  I am aware of problems that can take place during this process--particularly problems associated with sampling logic running at 40MHz with an unsynchronized 200MHz clock.  I have no reason to suspect any problems in this processing chain given the information I have seen from the device.  (Other than that the interface has crashed during four of those transfers, but we'll ignore those for now ... they're another issue.)

I am also aware of problems that can be created by sampling artifacts that vanish with a new sample buffer, or with a running capture that just captures and then recaptures data.  That's not the case here, as I have manually set the DD to trigger off of a condition, and the sample buffer is not changing as I am then examining it.

My problem is associated with displaying a given buffer.  At some display resolutions, there are phantom artifacts that then disappear at other display resolutions.  Having built displays like this before, I understand the difficulty of fitting more samples on a screen than there are pixels on a screen.  The likelihood of a bug in this part of the display software is non-zero--I've had to chase several down myself with my own software.  If I zoom into the display buffer, though, to the point where I can essentially see the individual samples, and these individual samples are about 20-pixels across or so--the phantom transitions go away.  During this examination process, the sample buffer is not changing--only the display parameters.
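For reference, the reduction step I'm describing is usually done by keeping the min and max of the samples that fall into each pixel column; a column containing both levels is drawn as a vertical line. A generic C++ sketch (not WaveForms' actual code; names are mine):

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

// Min/max decimation: reduce many samples to one (min, max) pair per
// pixel column.  A column whose pair is {0, 1} contains a transition and
// is drawn as a vertical line.  Any bug in mapping samples (or stroke
// widths) to columns shows up as an edge in the wrong place -- a phantom.
std::vector<std::pair<int, int>> decimate(const std::vector<int>& samples,
                                          std::size_t pixels) {
    std::vector<std::pair<int, int>> cols(pixels, {1, 0});  // {min, max}
    for (std::size_t i = 0; i < samples.size(); ++i) {
        std::size_t px = i * pixels / samples.size();
        cols[px].first  = std::min(cols[px].first,  samples[i]);
        cols[px].second = std::max(cols[px].second, samples[i]);
    }
    return cols;
}
```

If the renderer then positions those vertical lines any more coarsely than one pixel, nearby edges can merge or shift as the zoom changes.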

That's the evidence I'm dealing with.

As for finding the bug within my code ... this one snapshot doesn't show what I found.  The snapshot of the clock, dumped to a wire that I could then observe, showed the problem--since the clock was anything but regular.  Using a 200MHz sampling clock, I shouldn't have any aliasing problems with 40MHz signals ... and once I had the 40MHz clock working properly, the result was a *drastic* difference.  (My logic started working reliably then too ...)  Even still, though, this 40MHz clock wasn't clocking the logic analyzer, but rather my own logic within an iCE40 (slower) FPGA.  Hmm ... capturing some samples of what I found, showing the bug, might be a fun adventure ... I should be able to re-enact the bug based on the git history for the project ...

Dan

Posted (edited)

@D@n,

I'm sure that I've been understanding what you are trying to describe. Perhaps I'm not as good at describing my response. ???

When you hook up flying leads to pins there are a number of things that can be problematic. A $50K logic analyzer has pretty sophisticated probe conditioning between the grabber leads and the sampler. A cheap analyzer doesn't. For the DD, between the local DDR sample buffer and the PC software display buffer is a lot of USB software. I don't have access to the DD control software so I don't know how it operates. We can both appreciate, from experience, how easy it is to mess up displaying data with a limited number of pixels to map to.

What you don't see with things like the DD are samples where data is transitioning between 0 and 1. You can't because of the design; it's all digital. If the sampler were, say, a 4-bit A/D and the sampling clock period were about 1/10th the signal rise and fall times, you could see this transition as well as over-shoot and under-shoot and possibly coupling of channels. This isn't a criticism of the DD ( or any cheap logic analyzer ) as its price precludes such amenities.

Still, as likely as it is that the artifacts in the compressed display that you show are the result of poorly designed display software, I don't feel compelled to change my previous comments. The end result is that if your display doesn't represent a "true" picture of what the target signals are doing, it doesn't matter how sophisticated the analyzer is, because the display is the information.

While it's hard to see what you are seeing from the pictures ( my browser doesn't let me zoom in very far ), you no doubt noticed that the "phantom transitions" are all near transitions on other signals. I don't think that either of us can make a conclusion as to what exactly is going on.

 

Edited by zygot


@zygot,

Let's examine the evidence then:

  1. You don't have a DD.
  2. You haven't examined the picture I presented as evidence, nor have you asked questions so as to fully understand what I'm talking about.
  3. The problems you are describing don't match the evidence I've seen and described.  This would be apparent if you had a DD and saw and watched it work, and particularly what I was doing with it.
  4. I'm taking fixed snapshots, and watching the effect come and go *after* the snapshot has been downloaded to my PC.  There's only one transfer from USB to PC in this example, once that's finished there's no more room for further USB to PC errors.
  5. When zooming in and looking at the DD's samples, on a sample by sample basis, the samples transmitted over the USB to my PC look good and very believable.  They are not displaying any unexpected artifacts such as you are describing.
  6. You are comparing the DD to a $50K logic analyzer without realizing the impacts of 1) the fact that modern electronics has made high-rate sampling a cheap commodity, 2) the fact that, by running software on the PC on my desk, the logic analyzer doesn't need to include a high-priced display, 3) the fact that, by using my computer, this device doesn't need any of the knobs, switches, or other user-interface items that high-priced analyzers had to support, and 4) the fact that, by running on a PC and using the PC hardware, Digilent doesn't have to maintain a large support and maintenance staff to fix broken buttons, etc.

The problem I'm describing has to do with phantom transitions being displayed when massively zoomed out, but not when the display is zoomed in.  It takes place after the interaction over the USB has been completed and the data has been downloaded to the PC.  It seems particularly acute when the display line is thicker.  Sometimes the thick display line gets mixed with small display lines, with confusing results.  I'm sure this was intended to be a feature of the software, but I'm writing in to suggest that it isn't working.

Quit while you have the chance, friend, since the phenomena you are describing, while valid, haven't matched the evidence I'm describing.

I'm still hoping and waiting for @attila to comment.

Dan

Posted (edited)

@D@n,

Well....

1) True. I've said that already

2) Not true. But your picture doesn't provide the "evidence" that you mention. Perhaps a few more pictures would. I haven't asked any questions because I haven't used the DD interface. But here's one. Can you define the logic standard for the signals of interest? I could see a possible issue with threshold voltages and other logic specifications if your logic isn't compatible with what the FPGA in the DD is using. While I clearly haven't provided any ideas that make you happy, I disagree that I haven't examined the evidence. What you've described just seems to be evidence of something other than a display issue to me.

3) Can't think of anything to say

4) I've understood that from the beginning

5) Same as 4

6) Not true. I'm trying to convey some of the things that a tool that bills itself as a logic analyzer might do. I already mentioned that I don't expect tools in this price category to compete with professional-level tools. That doesn't mean that they shouldn't perform properly, to the level expected for the design. I don't know what you mean by high sampling rate, but 200 MHz isn't high, nor has it been for a few decades, unless you are capturing logic that toggles at about 1 MHz. I've tried to convey that there's a lot more to a logic analyzer than sampling rates. BTW, a real logic analyzer would have programmable threshold voltages. I'm just trying to lay out some ideas about what a logic analyzer might need to do to accomplish its duties. To be frank, I disagree with everything that you say in item 6.

Like you, I'm waiting for someone else to explain your issues. Just because it looks to you like a display issue doesn't mean that it is. Really, I'd be more understanding if the display missed logic transitions instead of putting non-existent ones into a plot where there were too many samples for the number of displayable pixels. That doesn't make sense to me. Which is why I think that perhaps what you are seeing is a symptom of something else going on.

Since it's just you and me so far discussing this I don't feel the need to seize an opportunity to quit until someone who knows what's going on reveals the answer.  If you don't like what I have to say then that's fine. If you want to dispute what I have to say then please do. Don't feel sorry for me, I haven't embarrassed myself... yet.

 

Edited by zygot


@D@n,

Here's a question. Does the DD allow you to save raw samples to a file? You could then do your own processing to see what the display ought to show.

I think that your assumptions about what's going on in sample processing before the display software even gets started might be wrong.


@HansV,

Not bad observations for someone who claims that the preceding discussion is over your head. You are correct that expensive equipment can mislead you if you aren't aware of its limitations and the physics of good probing. I don't know that crosstalk is @D@n's issue but it is certainly a possibility with a logic analyzer designed the way that the DD is. He is convinced that what he is looking at are representations of a complete record of contiguous samples; and he could be correct. I'm not so sure, based on past experience.

I've made my own "DD", without the display software, so I'm not totally ignorant about what can go wrong. 35 years of using the expensive equipment hunting down complex electronics misbehaviour is probably guiding my responses.


Let me try this again.  Attached, please find a figure showing two examples of the same data at different zoom levels.

dd-bounce-diagram.png

In red, I have drawn a circle on the left around a feature that looks like a thin impulse followed by a thicker impulse.  (Not the block feature, nor the phantom thin impulse before it, but rather the thick and thin impulse before that.)  Now, if you zoom in on that feature, WITHOUT MAKING A NEW COLLECT, WITHOUT TRANSFERRING ANY MORE DATA OVER USB, WITHOUT MODIFYING THE BUFFER, you find the well-defined logic pulses on each of the two traces to the right.

Let's see if I can draw a line around just the feature I am talking about on the left hand side a little clearer ...

dd-bounce-zoom.png

This is the feature I am talking about.  When zoomed in, there's only one pulse on each trace.

My argument is that when zoomed out, these SAME SIMPLE LOGIC PULSES look like a pair of pulses--one a thin vertical line next to a thicker vertical line.  But when zoomed in, only the simple logic pulses (one per trace) are apparent.

The problem appears to be associated with both the zoom level, and a "feature" in the display having to deal with the extra thick lines.  (Notice that these phantom signals have thin lines, whereas I have requested thicker lines from the display.)  If you zoom out far enough, it seems as though the waveforms software tries to replace the "thick lines" I have requested (due to my poor eyesight) with thin lines.  These thin lines, however, don't match up with the location of the original features.

I'd like to take this opportunity to say that I am in wholehearted opposition to ANY feature on a display that mis-characterizes the data upon which the display was built.  I am also in complete opposition to replacing thick lines (for my poor eyesight) with thin lines--especially if these thin lines create a view of the data that is not consistent with the actual data that is present.

I also think that part of this problem is related to the fact that the x-axis is getting pixellated at the line width, rather than at the underlying pixel width.  This is creating aliasing problems in the display where features appear to appear and disappear as the display zoom is adjusted.  I find this both annoying and confusing.
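To make that aliasing concern concrete, suppose (and this is only a hypothesis about the renderer, not confirmed behavior) that edge x-coordinates get rounded to multiples of the stroke width rather than to single pixels:

```cpp
// Hypothetical rounding: quantize an edge's x-coordinate to the stroke
// width instead of to a single pixel.  With a 3-px stroke, edges at
// pixels 7 and 8 both land on column 6, while a 1-px stroke keeps them
// distinct -- so features appear and disappear as the zoom level (and
// hence the edge x-positions) change.
int snap(int x, int stroke) { return (x / stroke) * stroke; }
```

A shift of up to stroke-minus-one pixels is plenty to make two nearby edges merge at one zoom level and separate at another.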

@zygot, You may wish to note as well that there are no "glitches" showing up when zoomed in.  I understand how such "glitches" can appear and show up in a data collector.  Were that the case here, I would expect that when I zoom in on the samples, I would see the glitches.  I also understand how such glitches might show up in one collect and not another.  That's not the issue here, because I'm talking about THE SAME COLLECT.  We are examining ONE SET OF DATA.  When zoomed in, there aren't any there.  When zoomed out, they appear.  This is why I call them "phantoms".

Dan


@D@n,

Yeah, your last set of pictures is a lot better at illustrating what you want to say. As of this moment I don't wish to change any of my previous comments. I still think that you are making incorrect assumptions about what the tool's software is doing.

You haven't answered my 2 questions:

1) can you dump data capture samples to a file in a format that you can read with your own program and analyze?

2) Do you have control over what the DD FPGA input IOSTANDARD is set to?

regards 


@zygot,

  1. Yes, I can save the data file.  It saved nicely to a 1.2GB CSV file.  I should be able to convert that to a VCD file for another viewer (GTKWave perhaps?), although ... I would rather have a viewer that worked out of the box without either phantoms or my needing to "prove" it.
  2. As for the voltage standard, it is set to 3.3V digital I/O with a 1.42V threshold.

Dan

 


@D@n,

BTW, I got your inference that you believe that the DD software is working on a single contiguous record of sample data the first time; you don't have to emphasize it each time you post here.


@zygot,

Ok, but then ... I don't get your comments.  They seem to me to be relevant only if there is a glitch in the capture portion of the software, something that I believe the data disproves.  It's not that such glitches can't or don't exist, it's just that ... I haven't seen any such glitches in the data I'm talking about.  Hence I can't understand why you keep going back and suggesting that a capture glitch is responsible for the artifacts I am seeing.

Dan


@D@n,

OK: "set to 3.3V digital I/O with a 1.42V threshold". What device(s) are you probing? For my own version of the "DD", or even for CSV-formatted dumps from my oscilloscope, I've written some simple programs to run through very large data sets to extract transitions. With a scope this is a bit more complicated as I have 8-bit A/D data, but everything is timestamped and the analysis program doesn't have to be very complicated.
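The sort of extraction program I mean is only a few lines. A C++ sketch (my own naming; it assumes the dump has already been parsed into one 0/1 sample per vector element):

```cpp
#include <cstddef>
#include <vector>

struct Edge {
    std::size_t index;  // sample number; multiply by the sample period for time
    int from, to;       // logic level before and after the transition
};

// Walk one channel of a capture and record every level change.
std::vector<Edge> extract_edges(const std::vector<int>& samples) {
    std::vector<Edge> edges;
    for (std::size_t i = 1; i < samples.size(); ++i)
        if (samples[i] != samples[i - 1])
            edges.push_back({i, samples[i - 1], samples[i]});
    return edges;
}
```

Running this over the channel with the confusing display tells you definitively whether the extra transitions exist in the data or only in the rendering.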


I am probing the pmod port P2 of an icoboard gamma.

C/C++ is my native language.  Converting from CSV to something (anything) else isn't difficult, it's just annoying.  ;)
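The core of a CSV-to-VCD conversion is small. A single-channel C++ sketch (this is not the genvcd.cpp attached later in the thread; the signal name is made up, and the 5 ns timescale assumes the DD's 200MHz sample clock):

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Emit a one-channel VCD stream from 0/1 samples.  VCD records only the
// changes, which is why a 1.2GB CSV can collapse to a few kB.
std::string to_vcd(const std::vector<int>& samples) {
    std::string out = "$timescale 5ns $end\n"
                      "$var wire 1 ! dio $end\n"
                      "$enddefinitions $end\n";
    int last = -1;  // impossible level, forces an initial value line
    for (std::size_t t = 0; t < samples.size(); ++t)
        if (samples[t] != last) {
            out += "#" + std::to_string(t) + "\n"
                 + std::to_string(samples[t]) + "!\n";
            last = samples[t];
        }
    return out;
}
```

A real converter would declare one `$var` per channel, but GTKWave should load even this minimal form.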

Dan


@D@n,

My comments rarely explicitly dive deeply into all of the ideas floating around in my mind. I like to start as simply as possible. I understand that you are having difficulty with my not making the same inferences as you are making. You can ( somewhat ) easily extract the transitions from the data dump for the channel that has the confusing display and see what's there. This should narrow the discussion considerably. Personally, I think that this whole post has been interesting and might be useful to everyone. Anyone reading it who wants to question any of the statements should weigh in.

My concern is that your tool might be another poorly supported product that doesn't quite live up to its billing. There are alternatives that are 4-5 times the price but much more robust. I have no problem with the concept as long as it does what it is advertised to do and doesn't oversell its capabilities to the less technically inclined. Using non-staff for technical support just isn't a very good thing for a vendor to be doing.


Hi @D@n

Finally I see the "glitches" you are talking about. It is very hard to notice on my high resolution monitor.

These thin vertical lines show up with Plot Width of 3 or 4. Please use width 1 or 2 until this gets fixed.

Thank you for the observation.

i1.png
i2.png


@D@n,

Yeah, I don't know that it's worth my time trying to figure out what the logic standard is from the device that you are probing. The threshold that you mention is probably OK for most 3.3V logic but it could be interesting to see if an alternate setting has different results. The decision threshold is but one factor in knowing if a signal is operating within specifications.


@zygot,

The whole point of this post was that the DD could be made better and with a little more help it might "live up to its billing".  I was writing for the purpose of letting the Digilent staff know what my observations were, so as to keep this from being a "poorly supported product."  To date, I've been pleased with the responsiveness of the Digilent staff to issues I've brought up on this forum.  (Thanks @attila for noticing the thread and my comments!)

My frustration with this thread is that it feels like your comments have pulled the thread off track from my observations, by entertaining hypotheses and arguments over those hypotheses which are not consistent with what I have been observing.

Dan

P.S. Give me another half hour or so, and I'll have all the transitions extracted.  It's not going to be hard at all to do, just annoying.


@D@n,

Well now that a staff member has weighed in to support your hypothesis I can stop making suggestions about possible issues. I am still interested in your findings. I still believe that the discussion is useful even if it hasn't revealed the nature of what's been causing your confusion. I'd expect this sort of issue to be discovered and fixed prior to releasing a product if it is in fact just an algorithmic issue. Admittedly, I have expectations for products that I purchase that used to be a given and now seem to be high.


Ok, here's my data set for reference, in a nice 4.3k VCD file.  (The CSV file took up 1.2GB!!!!)  It's not the first data sample set I posted, but it is the sample set I posted most recently.  (No, I didn't keep track of the trigger location, present within the CSV file, although I could have ...)

If you pull the data up with GTKWave, you'll see that there are no phantom transitions within it.  However, Digilent's Waveforms software has a nice continuous zoom capability that GTKWave doesn't have ... ;)  Further, I was enjoying the thicker trace lines in Waveforms that are not present in GTKWave.

Dan

P.S.  @attila, saving data into a VCD format would be a nice capability to add to the WaveForms package.  It's not hard to do, and it took me only a couple of minutes to convert from the CSV to a VCD format.  (Conversion program attached ...)

shotdata.vcd

genvcd.cpp


@zygot,

You are missing a piece of the context ... I didn't purchase the DD.  It was given to me by Digilent's marketing department together with a request that I review the product.  Part of the purpose of this post was to share what I had found.  :D

Dan

Posted (edited)

Here's another image showing what I consider to be a problem:

 

dd-mini-pulses.png

Notice the small, single-pixel-wide pulses, displayed even though the plot width is set for a thick line.  While you might argue that this is an accurate representation of very narrow pulses, if the line width is set to be thick, it would make more sense to display these pulses with thicker lines as well.

Dan

Edited by D@n
Oops -- I hit send too soon.

Posted (edited)

@D@n,

About the context that wasn't explicitly mentioned until now: knowing the vastness of your budget for such things, I surmised that this was the case. Your general tone bolstered this suspicion. My suggestion is that Digilent have you review products before they are released, and that all communication be kept off public forums until any issues are resolved ... then release the product. I'd trust you to do a thorough job of it. I have found the vast majority of your posts to be very helpful and generally technically correct. I respect you. Given the sketchy way that "reputation" is assessed on the Digilent forums, and general feedback about how the less technically knowledgeable view your opinions, I suggest you take some time to reconsider doing such work. I have grave concerns about solicited "product reviews", especially when there are payments involved, even if only in merchandise. Just because "everybody" is doing it doesn't mean that it's OK. This concern is especially true when all pertinent information about the relationship between the people involved in such a review is not explicitly and plainly stated prominently up front. I realize that your post wasn't the review itself, which accounts for the third sentence. I've worked for companies that valued their image, so Digilent's philosophy is a puzzlement to me, especially since their main customer base consists of students and educational institutions.

Edited by zygot

