All Activity


  1. Past hour
  2. Just to follow up on this: I did manage to remove the default demo app (after many attempts). Since then I find that my Cmod A7 is more accessible (probably 30-40% of the time, still not perfect, but better than the <5% I saw previously), so I am able to write bitstreams to the device pretty consistently. Another observation (now that I have removed the demo app) is that the first time I try to use the device after it has been disconnected from USB power for a while, it is detected almost 100% of the time. Just cycling the power once it has been up for a while does not do it. I now wonder whether this is because the device has had a chance to cool. Maybe my success after removing the demo app is not because the default demo is using the UART, but because my replacement app (which does nothing except illuminate an LED when a button is pressed) draws less power and thus is not thermally loading the device. Hmm. Anyway, I am far more productive now, so I am reasonably pleased with this strategy.
  3. The following observation is merely a philosophical viewpoint. If you've never built anything or flown an aircraft, would you try to assemble a DIY prop-plane kit with the intention of becoming a self-taught pilot? I realize that most beginners are motivated by grand dreams, and I'm not knocking that at all. Perhaps it's better to take the slow, difficult road of developing skills and conceptual understanding before tackling difficult projects. You need a solid foundation to learn from others' code. You might well succeed in replicating kwiber's experience, but it will be of limited usefulness.
  4. Today
  5. Hi @flying, there is a lot to unpack here. Firstly, I would like to draw your attention to the fact that this post was originally started by malkauns regarding the porting of the petalinux project with PCAM capabilities to the Zybo Z7-10 board, very similar to a previous topic which you started. If you follow the steps in the previous post by vicentiu you should have the HW configuration finished in Vivado, and then all you need to run in the Petalinux-Zybo-Z7-20 project is:
    petalinux-config --get-hw-description=<PATH-TO-HDF-DIRECTORY>
    petalinux-build
    petalinux-package --boot --force --fsbl images/linux/zynq_fsbl.elf --fpga images/linux/system_wrapper.bit --u-boot
    and you should have everything working the way it's described in the project's readme. Secondly (although off topic), regarding "the 'desktop' of the Petalinux": our petalinux project so far has no GUI in the FS (file system), therefore you will not be able to see anything on the HDMI display. The HDMI is currently configured as a pass-through in the petalinux project, at 720p, which means everything that is sent via the HDMI RX connector is forwarded to the HDMI TX connector without processing. All of this is routed through the FPGA, meaning that without programming the .bit into the FPGA, the HDMI will not work. For more details on this, please refer to the block diagram of the Zybo-Z7-20-base-linux, which is an intuitive description of what I explained here. Lastly, regarding the rest of the questions (also off topic), most are answered by reading through the petalinux user guide UG1144, which describes how petalinux works and the workflow. I would also recommend reading up on how the Zynq processor is used in bare metal and the role of the ps7_init files, the .bit, the .hdf and the BOOT.bin. Unfortunately, going into detail about these would be hard without making it into a tutorial or a step-by-step guide with page-long explanations at every step.
I hope this helped clarify some of your queries. -Ciprian
  6. There is some pretty messy aliasing happening in WaveForms Live's buffer view and also in the main chart. For example, below is the AWG connected to channel 1, zoomed out. It looks fine on the main chart when zoomed in (though the buffer view remains full of alias artefacts). It would be great if the signal envelope were represented properly in both the buffer view and the chart, regardless of zoom.
  7. So I've successfully logged some data to SD card on the OpenLogger MZ (0.1807.0); I can download the .log file and though I've yet to try it I see the dlog utility can be used to spit out a CSV file. How can you use WFL to view the logged data?
  8. This article series evaluated the lowRisc and SiFive cores. I went through the lowRisc getting started guide on my Nexys A7 and it worked as advertised.
  9. @jrosengarden There are Xilinx guides on creating IP and repositories as well as tutorials. I'm curious as to why you want to go to all of the bother? You say that you created your full adder using the board design flow? Why not create it with an HDL so that you can instantiate it any way you please.
  10. Thanks JColvin; I was able to reduce/eliminate the "fall behind" errors by reducing the sampling rate (should have tried this earlier!). I found that up to 100 kS/s seems to work reliably in terms of streaming data; however, interacting with the AWG at anything over about 20 kS/s causes "Could not set AWG parameters" errors. Hmm, I just noticed the official specs/FAQ state streaming over WiFi at 10 kS/s, so I guess this is working as expected. Also, I found the firmware version info is reported in the 'Settings' page you referred to, as is the WaveForms Live version. Re logging: setting the log to the console logs a boatload of debug-level messages, not errors. Indeed, I can't actually find the errors reported via the 'toast' notifications in the console logs at all when logging is enabled.
  11. Hi all. I'm still fairly new to Vivado (2018.3) and having trouble trying to figure something out. I HAVE tried searching the web and the forum for an answer but haven't come up with one. My problem: I've completed a Vivado project for a full adder. It was created with a schematic design (block design?). It works great: my test bench confirms everything, and it loaded and executed on my Basys 3 board perfectly. Now I'm trying to figure out how to "package" this up and save it as a new schematic symbol. I want to create a cascade of this full adder in another project (as in a 4-bit adder, an 8-bit adder, etc.). I can't seem to figure out how to do this. Any help/guidance would be greatly appreciated. P.S. - I KNOW there is already a 14-bit adder/subtractor in the IP library, but I'm trying to get to this point on my own without using the pre-built IP. THANKS! 🙂
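As a purely illustrative aside on the question above: the "cascade" only needs each stage's carry-out wired to the next stage's carry-in. Below is a small software model of that ripple-carry structure (Python, not HDL, and all names are made up), just to show the wiring that an HDL generate loop or a chain of packaged symbols would express:

```python
# Software model of a one-bit full adder and a ripple-carry cascade.
# This only illustrates the inter-stage wiring; it is not HDL and not
# a Vivado packaging recipe.

def full_adder(a: int, b: int, cin: int) -> tuple[int, int]:
    """One-bit full adder: returns (sum, carry_out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return s, cout

def ripple_carry_add(a_bits, b_bits):
    """Cascade full adders LSB-first; each stage's carry-out feeds the next."""
    carry = 0
    out = []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out, carry

# 4-bit example, LSB first: 0b0111 (7) + 0b0101 (5) = 0b1100 (12)
bits_a = [1, 1, 1, 0]
bits_b = [1, 0, 1, 0]
sum_bits, cout = ripple_carry_add(bits_a, bits_b)  # sum_bits == [0, 0, 1, 1]
```

Because the only connection between stages is the carry chain, describing the one-bit adder once in VHDL/Verilog and instantiating it in a generate loop scales it to 4, 8 or more bits without repackaging anything.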
  12. I'd like to run RISC-V with very basic (command line only, obviously) Linux on a Nexys A7 just to play around with it. Since I'm a beginner I find it hard to adapt steps for other boards. If anyone could suggest a tutorial that either matches Nexys A7 exactly, or would be as easy as possible for me to adapt, that would be great.
  13. Hello forum, I am a master's student with a project involving impedance measurements. I was thinking about buying the Analog Discovery 2 with the Impedance Analyzer, but I would like to use BNC probes, and I was wondering if I could use the BNC adapter to take the measurements instead of the input connections provided by the Impedance Analyzer. Thanks in advance.
  14. @attila This seems to be a bug with WaveForms. I successfully captured data with the same configuration (see screenshot above) on WaveForms 3.8.2 (Linux/Windows), while WaveForms 3.10.9 (Windows) and 3.11.4 beta (Windows) lost digital samples.
  15. I want to create a BSP for 2018.3 for the Zybo Z7-10. I am working with petalinux and Vivado 2018.3 on the Zybo Z7-10, and I found out that you have released BSPs only up to 2017.4, so I am trying to create a 2018.3 BSP myself. I referenced some other guides; they said I should add the petalinux EDK repository in XSDK after creating the HDF from Vivado, but I cannot find that repository in petalinux 2018.3 (/petalinux/2018.3/components). I am stuck at this step. Should I continue from here, or is there another way to create a BSP file?
  16. Thanks for the answer. I also tried without data compression and that doesn't fix the issue.
  17. Hi @Vroobee, I experienced this when the WiFi signal of our local network was not strong enough where the OpenScope was located. Also check whether you have a MAC filter or static IP addresses enabled in your local network. I think the Scope will also prompt that error when the connection has been refused by the other side for any reason. Regards, Fabian
  18. Hi @Phil_D, Where do you see ADG612? OpenScope MZ uses TS3A5017 to switch between 4 gains (0.075, 0.125, 0.25, 1). And as far as I can see from the schematic there is no gain switch for the OpenLogger. Are you sure you are asking your question in the correct (sub-)forum? Regards Fabian
  19. Sduru

    AXI4 and Vivado ILA

    In the link, it says that this error is related to the XDC file: "The following are some common causes of this issue. XDC constraints are case sensitive. These warnings can occur if the case type of the object name in the XDC is not the same as the signal in the RTL code." But in my constraints file, there is no case-sensitivity problem. I cannot solve the problem. Please help...
  20. Sduru

    AXI4 and Vivado ILA

    Thank you @zygot. I've created the block design without AXI4-Stream as in the following, but I am getting these errors:
    [Common 17-55] 'get_property' expects at least one object. Resolution: If [get_] was used to populate the object, check to make sure this command returns at least one valid object.
    [BD 41-1273] Error running post_propagate TCL procedure: ERROR: [Common 17-55] 'get_property' expects at least one object. ::digilentinc.com_ip_MIPI_D_PHY_RX_1.3::post_propagate Line 6
    Do you have any idea about these errors?
  21. Am I correct in assuming that there is no testbench and that rgb2grey is your top-level entity? That is, you just let Vivado decide how to simulate rgb2grey? Did you have a timing constraint for the clk period when you implemented the design? Was the timing score 0? Now that I'm trying to read your code, the organization needs work. Why is all of your logic in one process? Why is the process creating output when active_i is de-asserted? Also, I'm curious as to your reasoning for changing your internal registers to type integer from std_logic_vector. Lastly, I'm having a hard time correlating the simulation waveforms to your code. I don't see a rst or a 24-bit output in your code.
  22. I see in the documentation that the ADG612 gain switch selects between high gain and low gain into the ADC. I see in the WaveForms Spectrum Analyzer that there are many gain options: 0.01x, 0.1x, 1x, 10x, 100x. Which hardware gain setting is used in WaveForms for those gain settings? Thanks!
  23. I was able to recreate the image you show above following your instructions, so maybe all hope is not lost. I am still getting really strange readings, though, trying to do my project. I am using the AD2 to power a microcontroller, with positive V+ set to 3.3 V and with the grounds tied together. To get an idea of how much power the microcontroller is consuming, I placed a 10 ohm resistor in series with the V+ output from the AD2, and I'm trying to use the scope to measure the voltage drop across that resistor to estimate the current being drawn. Below is the view I am getting from the scope, which doesn't make any sense to me. Why would it be centered around -3.212 V? The voltage drop across the resistor should be only a few mV. The square wave does make sense, because the program on the microcontroller flashes an LED at that frequency. Any ideas?
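As a sanity check on what the scope should show, the expected shunt drop follows directly from Ohm's law (V = I·R). A quick sketch, where the current values are assumed illustrative figures, not measurements from this setup:

```python
# Expected voltage drop across the 10-ohm series shunt (Ohm's law, V = I*R).
# The current values below are assumed example figures, not measured values.

R_SHUNT = 10.0  # ohms, the series resistor from the post

for i_amps in (0.0005, 0.020):  # 0.5 mA and 20 mA, assumed examples
    v_drop = i_amps * R_SHUNT
    print(f"{i_amps * 1000:.1f} mA -> {v_drop * 1000:.1f} mV across the shunt")
    # prints: 0.5 mA -> 5.0 mV, then 20.0 mA -> 200.0 mV
```

Either way, the drop is in the millivolt range, so a trace centered around -3.212 V is orders of magnitude too large to be the shunt drop itself, which is consistent with the poster's suspicion that something else is being measured.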
  24. Yesterday
  25. Hi JColvin, I borrowed a Xilinx ZC702 board and brought it up. I built a Vivado project and generated the bit file and SDK, and I can run LED tests via GPIO on the ZC702. We really want to use the Zedboard since it is smaller. How much work is involved in bringing up the Zedboard? What is included in the Zedboard package? How many usable I/Os does the Zedboard have? Thanks, Shuguang
  26. These two photos may be a little more revealing about my problem.
  27. @askhunter It's not clear from your pictures what it is that you are referring to, since the time scales are different. The purpose of post place-and-route timing simulation is to show the relative signal path delays in your implemented design, as well as possible inferred latches or registers hidden by IP. The RTL simulation merely indicates whether your behavioural logic is performing as you intended (assuming that the testbench is well designed). It is merely a simplified (no delays, no setup, no hold times) idealistic representation of simple logic. If the timing simulation doesn't give the same results as the RTL simulation, then it's unlikely that your hardware will behave as you intend either. In the typical professional setting, a lot of people are working on parts of a large design effort simultaneously. No one can afford to schedule a design effort where everything is done sequentially. In such a case, timing simulations become a very important indicator of the risk of a project not making deadlines. It simply isn't possible to create a lot of hardware, software, test protocols etc. sequentially, or even in parallel, and then two weeks before shipment throw all that stuff together for the first time and figure out why things don't work. So we have a lot of ways to do simulation that offer increasingly more accurate, and hence reliable, views of how our design (after it's been optimized, re-worked and reduced to LUT equations) might actually work in a system before having to run it in hardware. When there are 10 engineers doing parts of one large FPGA design and all of those parts are integrated, it's not uncommon for some of them to start failing due to limited routing resources and clock line limitations.