digicloud14

Everything posted by digicloud14

  1. JColvin, I appreciate you doing that. I've been struggling for a while to figure out the VDMA driver, any help would be appreciated. Thanks, Chris
  2. Hi all, I have a Zybo Zynq-7000 and have been working on a custom design in Vivado 2014.4 which uses the VDMA IP core. I want to access the data stored by the VDMA from Linux userspace; I am running embedded Linux (an Ubuntu 12.04 distribution). I have generated a device tree for the design and have verified that Linux recognizes the VDMA as a device (it is listed in /sys/bus/platform/devices). My question now is how to actually use the VDMA. I've been reading around on the forums and online and see that you must either use the Xilinx VDMA driver for Linux or use mmap, but many people recommend using the driver instead of mmap. There does not seem to be a lot of documentation on the VDMA driver, so I was wondering if anyone could help me out with how to install/configure the VDMA driver and use it. I found this http://www.wiki.xilinx.com/DMA+Drivers+-+Soft+IPs#AXI%20VDMA but it is very brief. It mentions the driver can be configured through menuconfig, but it does not explain how to do that. I am very new to Linux and have no experience with menuconfig, but from what I've read I can mess up a lot of things if I don't know what I am doing with it. Can anyone show me how, or point me towards a good guide or tutorial for configuring and using the VDMA driver, or Linux drivers in general? Thank you, Chris
  3. Resolved the issue. Appended the bindings for the ethernet from the zybot device tree into my device tree and the ethernet now works. Thanks for the suggestion Mendeln! -Chris
  4. Hi Mendeln, Are you using the default device tree provided by Xillinux or some other? Yes, I made modifications to my device tree to add a UART and VDMA. I can't seem to figure out why doing this messed up the ethernet. If it's not too much trouble, could you post your device tree so that I can take a look and try to figure out why my ethernet isn't working? Thank you, Chris
  5. Hi Mendeln, I've since tried adding the Xillybus cores to my Vivado design in an attempt to get the graphical interface back, but couldn't manage to successfully synthesize/generate a bitstream for the project. There were more issues than I knew what to do with. I did have a question for you, however: are you able to use ethernet with your current setup? I have ethernet set up in my Vivado configuration and I have an entry for it in my device tree, but whenever I try to boot Linux and connect to the internet (to install programs/packages), I cannot. When I do an ifconfig, the only address that shows up is the loopback address. I've gone into /etc/network/interfaces and used a text editor to manually assign an IP address, but no luck. When I do a dmesg, something relevant I saw was "xemacps e000b000.ps7-ethernet: eth0: no PHY setup". I've tried changing the entry in the device tree but can't seem to get it to work. Any advice/tips? Thank you, Chris
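For anyone hitting the same "no PHY setup" message: it usually means the GEM ethernet node has no phy-handle pointing at a PHY child node. A hedged sketch of roughly what the binding can look like; the PHY address (1), labels, and compatible strings here are assumptions, so copy the real bindings from a working Zybo device tree (e.g. the Zybot one mentioned above):

```
/* Sketch only -- PHY address, labels and compatibles are assumptions */
ps7_ethernet_0: ps7-ethernet@e000b000 {
	compatible = "xlnx,ps7-ethernet-1.00.a";
	reg = <0xe000b000 0x1000>;
	phy-handle = <&phy0>;
	phy-mode = "rgmii-id";
	mdio {
		#address-cells = <1>;
		#size-cells = <0>;
		phy0: phy@1 {
			reg = <1>;
		};
	};
};
```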
  6. Mendeln, You're trying to make use of the VGA or HDMI out with this kernel configuration? I currently am just using PuTTY and the UART port to boot, but I am thinking I will try to add either VGA or HDMI out to make use of the graphical interface. I have a feeling I'll run into complications with using Xillybus IP cores in my design, but we'll see. Once I start working on it I'll report back here to let you know how it went. Thanks again. Josh, Haha, glad we could live up to your expectations of the forum. This place has been immensely helpful to me while working on my project, I cannot be thankful enough. Glad we were able to tackle this problem as well. Thanks all, Chris
  7. Mendeln, I cannot thank you enough. I just got it to boot following your instructions. For anyone else who comes along and has similar issues, here is what I did (thanks to Mendeln): 1. Write the Xillinux image to the SD card. 2. Get the uImage kernel from http://www.instructables.com/files/orig/FAJ/VKWE/I7CCBSBN/FAJVKWEI7CCBSBN.zip and replace the uImage on the SD card with the uImage found in the boot folder. 3. Generate a BOOT.bin with fsbl.elf, your custom Vivado design's .bit bitstream file, and the uboot.elf from https://www.dropbox.com/s/8ldvbwd66k0i94d/zybo_bsd_hdmi.zip?dl=0 4. Using either your own device tree or the one found at https://www.dropbox.com/s/ilwddtkzn2gdcmj/zybo_robot_dts.zip?dl=0 add the following line: bootargs = "console=ttyPS0,115200 root=/dev/mmcblk0p2 rw earlyprintk rootfstype=ext4 rootwait devtmpfs.mount=1"; 5. Add the devicetree.dtb, the BOOT.bin you generated, and your bitstream file to the SD card. You should now be able to boot into Xillinux.
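For anyone doing step 3 by hand: the SDK's Create Boot Image dialog effectively writes a .bif file like the one below and runs Xilinx's bootgen tool on it to produce BOOT.bin. The file names here are placeholders for your own FSBL, bitstream, and U-Boot; the order (bootloader first, then bitstream, then U-Boot) matters:

```
image : {
	[bootloader] fsbl.elf
	design_wrapper.bit
	u-boot.elf
}
```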
  8. Okay, glad to know that I am trying to do the same thing as you and you got it working. Just to clarify the steps: 1. I have the Xillinux image written to the SD card. 2. I already have a device tree that is specific to my custom Vivado design. I added the bootargs = "console=ttyPS0,115200 root=/dev/mmcblk0p2 rw earlyprintk rootfstype=ext4 rootwait devtmpfs.mount=1" to my device tree as you suggested. What do I do with the kernel downloaded from the instructables link? Do I replace the uImage that is on the SD card after writing the Xillinux image with the kernel image from that download? 3. I have my own Vivado design from which I created an fsbl.elf and a device tree. I am using a uboot.elf I found from Digilent. Using those, I made my own BOOT.bin. What uboot.elf did you use to make your BOOT.bin? Did you get it from the Linaro project files? Sorry for so many questions, I am very new to this but your help is immensely appreciated. Thanks, Chris
  9. Hello, I think we do want to do the same thing. I'd like to use Xillinux, but I'm not using any Xillinux (Xillybus) IP cores. Since I didn't think that would work, I tried booting Linaro from the above tutorial, but I am receiving the same error I did when trying to boot Xillinux. I have my own custom Vivado design, and I just want to boot Ubuntu on the Zybo. I am using Vivado 2014.4 and have a design I have been working on. From the design, I have generated the .bit bitstream and an fsbl.elf; I am using a uboot.elf from Digilent, and I created a device tree using the SDK. I can post my project or any relevant files you'd like to take a look at. At this point I don't care whether it's Xillinux Ubuntu or the Linaro Ubuntu, I just want to boot one of them. I am just confused as to whether I need a different fsbl.elf and uboot.elf. I don't really understand what a ramdisk is, but it appears that Xillinux uses one and Linaro does not? Using Linaro, if I make the change to my device tree with that bootargs statement, should I then be able to boot? Thanks, Chris
  10. Follow-up question: I followed the tutorial here http://www.instructables.com/id/Setting-up-the-Zybot-Software/?ALLSTEPS for booting Linaro (Ubuntu) on the Zybo. I partitioned the SD card as instructed and have all the files loaded on the SD card. However, instead of following the Zybot project, I'd like to use my own Vivado design. Whenever I loaded my device tree, BOOT.bin, and bitstream onto the ZYBO_BOOT partition of the SD card and tried to boot, I received an error saying "wrong ramdisk image format" / "ramdisk image is corrupt or invalid". Is this because I have the wrong uboot.elf or fsbl.elf? From my understanding, the Linaro tutorial is for booting without a ramdisk (there is no mention of it in the tutorial). I believe the uboot.elf and fsbl.elf that I am using in my BOOT.bin are for a Linux distribution that uses the ramdisk image. In this case, do I need to regenerate my BOOT.bin with an fsbl.elf and uboot.elf that are made for not using the ramdisk image? I hope what I am asking makes sense, I am very new to Linux and still learning.
  11. Mendeln, Thank you for the response. After a lot of searching, I don't think the Zybo Linux Reference Design exists or has yet to be released. I'll take a look at the Zedboard one and also the links you posted. I had been using Ubuntu 12.04 from Xillinux image as well. I really liked the graphical interface, but since I changed my design I removed the VGA out for the GUI and have just been using the UART instead. I'd like to continue using Xillinux if possible just because I have already written some scripts for the project as well, but if I HAVE to switch hopefully nothing I've written will have issues on a different distribution. Since I made a lot of changes in my Vivado custom design, I had to regenerate a new BOOT.bin to incorporate my new .bit bitstream file, and also a new device tree. When I tried loading the BOOT.bin, bitstream, device tree onto an SD card with the Xillinux Image written on it, I got a boot error that reads "wrong ramdisk image format" In any case, I'd just like to run Ubuntu on the Zybo. Thanks, Chris
  12. Hi all, I've been working on a custom Vivado design that uses embedded Linux. I had been using Xillinux 1.3, but I've switched to Vivado 2014.4, and since Xillinux is not compatible with anything beyond 2014.1, I am looking to switch to a different distribution of Linux. I have been following the directions of the following tutorial: https://www.digilentinc.com/Data/Products/ZYBO/Embedded_Linux_Hands-on_Tutorial.pdf I already have my own uboot.elf, fsbl.elf, boot.bin, bitstream, and device tree for the custom design. All I'm really looking for is a Linux kernel image for the Zybo Zynq that I can load onto an SD card with my other previously generated files. On page 25 of the tutorial, in the section "Test Kernel Image with Pre-built File System", there is a bullet that says "Pre-built File System Image: ramdisk Image is available in ZYBO Linux Reference Design". I've done a lot of googling and looking through the forums but I can't seem to find this "Zybo Linux Reference Design." Can anyone point me towards where I can find this? Or can someone provide me with the Linux kernel image (uImage) for the Zybo?
  13. Hey Sam, Thanks for your post. That link was pretty helpful in bettering my understanding of a device tree. I am fairly new to embedded linux (and linux in general) so the documentation is useful. I was able to compile the .dts into .dtb. Turns out resolving the syntax error that came before the fatal error fixed everything. The culprits were the lines for the interrupt channels on the vdma, they were both initialized to <-1> in the device tree, which apparently is not an acceptable value. I'm not quite sure what the values should be, but changing them to a positive integer resolved the error and allowed me to compile the .dtb. Thanks for the help, Chris
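To make the fix above concrete, here is a hedged sketch of what a VDMA node with valid interrupt specifiers can look like. The addresses, labels, and interrupt numbers are assumptions: on Zynq the first PL-to-PS fabric interrupt is GIC ID 61, written as shared-peripheral IRQ 29 in the device tree, but the real numbers for your design should come from the generated pl.dtsi or the Vivado block design:

```
/* Sketch only -- addresses and IRQ numbers are assumptions */
axi_vdma_0: axivdma@43000000 {
	compatible = "xlnx,axi-vdma-1.00.a";
	reg = <0x43000000 0x10000>;
	interrupt-parent = <&ps7_scugic_0>;
	/* mm2s_introut and s2mm_introut; <-1> is not a valid specifier.
	   <0 29 4> = SPI 29 (GIC ID 61), level-high trigger. */
	interrupts = <0 29 4>, <0 30 4>;
};
```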
  14. Run, Thank you, that makes a lot of sense. I'll also switch to the PLL for the clocking wizard. Marshall, If you find yourself with time to spare and wouldn't mind doing the design with the PS, I certainly won't say no to that. Thank you! Thanks, Chris
  15. Thanks for the responses. I have a virtual machine running now, but I am still having trouble using the DTC. Whenever I try to run the following command to compile the .dts, I get: chris@Chris-VirtualBox:~/device_tree_bsp_0$ dtc -I dts -O dtb -o system.dtb system.dts Error: pl.dtsi:76.22-23 syntax error FATAL ERROR: Unable to parse input tree It's driving me nuts. I've been trying for hours to compile the .dts and everything I've found online as suggestions to solve the issue are not working. Any ideas?
  16. Hey Marshall and Jieming, Thank you both for taking the time to help me out. After a few modifications to the XDC and design wrapper, I was able to get the DDC connections properly set up and successfully generated the bitstream for the design. I did get a critical warning stating that my design did not meet timing constraints, so I'm working on correcting those now using the information you posted above. You guys are extremely helpful; it is very much appreciated. **Question about timing (sorry!) Just to verify a few things: The Zybo has a max buffer bit clock of 600 MHz, so the recommended TMDS_Clk is 600/5 = 120 MHz. The DVI2RGB core requires a 200 MHz reference clock. In my design I am using the Processing System (PS), so similar to Run's design, I have an FCLK from the PS going into a clocking wizard. Following Marshall's post, I commented out the timing constraints in the dvi2rgb XDC. When configuring my clocking wizard, do I also need to change mine from MMCM to PLL, even though I am using the Processing System? I'm not sure because Marshall's design does not use the Processing System; he instead creates his own clock. Finally, to constrain the TMDS_Clk to 120 MHz, is this simply done when I connect the inputs to the DVI2RGB core and specify that the signal is a clock and enter 120 MHz, as shown in this image? http://i.imgur.com/SINVpSG.png Or do I need to additionally add something to the Master_Zybo or DVI2RGB XDC? I'm just a bit confused here because Jieming mentioned to add this to my XDC: create_clock -name sysclk -period 25 -waveform {0 12.5} [get_ports HDMI_CLK_P] But that gives a frequency of f = 1/(25 ns) = 40 MHz, unless I am missing something. I think my problem lies here. When attempting to generate a bitstream, it does so successfully without any errors, but I am still failing a timing constraint (much better than previously, when I was failing several). Here is a view of my timing report: http://i.imgur.com/WWZS1iu.png
As you can see, my CLK_OUT_5x_HDMI_clk timing fails with an actual pulse width of 1.212 nanoseconds. That gives a frequency of about 825 MHz, or 165 MHz x 5. So somewhere in the design there is a constraint keeping the TMDS clock at 165 MHz, even though I commented out the timing constraints in the dvi2rgb.xdc. When creating the ports for TMDS_Clk_p and TMDS_Clk_n I set them as 120 MHz clock signals as shown in the first image. But apparently this isn't holding? I'm thinking I need a line in the master XDC such as: create_clock -name sysclk -period 8.33 -waveform {0 4.165} [get_ports HDMI_CLK_P] So sorry for all the questions, but you have no idea how much you guys have helped me out. Thanks, Chris
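Working the arithmetic the other way: a 120 MHz TMDS clock has period 1/(120 MHz) ≈ 8.333 ns, and the high pulse in the waveform ends at half the period, ≈ 4.167 ns. A hedged sketch of the corresponding XDC constraint (the port name TMDS_Clk_p is an assumption; use whatever your top-level clock port is actually called):

```tcl
# 120 MHz -> period 8.333 ns, 50% duty cycle -> waveform {0 4.167}
create_clock -name TMDS_Clk -period 8.333 -waveform {0 4.167} [get_ports TMDS_Clk_p]
```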
  17. Hey Jieming, Thanks for the answer. Yes I'm running a version of Ubuntu on my Zynq right now, but all the Vivado related design has been done on my Windows computer. Sounds like I'll either need a virtual machine on my computer or I'll need to compile the dtb on the Zynq itself. I'll consider switching to Petalinux. Thank you for all your help. Chris
  18. Hi all, I have a Zynq-7000 development board and am using Vivado 2014.4. I have a block design that successfully synthesizes and generates a bitstream. My design uses Xilinx's VDMA core, and I want to use Xilinx's VDMA driver with it (http://www.wiki.xilinx.com/DMA+Drivers+-+Soft+IPs#AXI VDMA). The driver guide says "The device tree node for AXI VDMA will be automatically generated, if the core is configured in the HW design, using the Device Tree BSP." However, I am having trouble generating a device tree for the design. I have been referencing http://www.wiki.xilinx.com/Build+Device+Tree+Blob and have gotten as far as generating a .dts file, but I'm having trouble understanding how to compile a .dtb from that. From what I understand, it seems that I need to build a Linux kernel to use the device tree compiler? I have been building my project on a Windows 8 computer. Is there another way I can generate a .dtb file from the .dts file? Any help is appreciated. Thanks, Chris
  19. Hey Marshall, When you get a chance, could you post the ZYBO_Master.xdc file you used for your project? I can't seem to find it on Git. Thanks, Chris
  20. Run, I've tried every combination of removing the DDC port from the DVI2RGB core and commenting out the ports on the xdc, but I can't seem to get past the error. Jieming, in part 3 of your first post, you mentioned that Vivado no longer handles inout as a single port. I think this is the problem I'm running into currently. You said I need to make changes to the top level module? Something like: HDMI_SDA <= HDMI_SDA_O when HDMI_SDA_T = '0' else 'Z'; HDMI_SDA_I <= HDMI_SDA; Could you elaborate on this part? I noticed those names match those in the Zybo Master XDC for the HDMI SDA and SCL. Is that what you meant by the top level module? Additionally, if there's anything from your project you could share that you think may help, I would greatly appreciate that!
  21. Hey Marshall, The reason I'm trying to see if I can simply remove the DDC lines is because even after following your instructions (making external, checking the design wrapper and changing the xdc port names) I cannot get the implementation to build properly. I get an error that says: [Place 30-58] IO placement is infeasible. Number of unplaced terminals (2) is greater than number of available sites (0). The following Groups of I/O terminals have not sufficient capacity: IO Group: 0 with : SioStd: LVCMOS18 VCCO = 1.8 Termination: 0 TermDir: BiDi RangeId: 1 Drv: 12 has only 0 sites available on device, but needs 2 sites. Term: iic_0_scl_io Term: and iic_0_sda_io In my design wrapper, I have two signal names iic_0_scl_io, and iic_0_sda_io. I tried modifying the master xdc to reflect those names but I still get the above error. I've also tried without the DDC (disabled it in the customize ip menu as Run suggested) but it would not build either. I'll probably take a look at your project on the github and see what I can gather from there. Thank you for your help and enjoy your vacation! Chris
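For what it's worth, that "IO placement is infeasible" error often means the two I2C ports have no pin location or IOSTANDARD constraints at all (note the LVCMOS18 default in the message, while the Zybo HDMI bank runs at 3.3 V). A hedged sketch of the shape of the fix; the PACKAGE_PIN values XX/YY are placeholders, so copy the actual HDMI DDC pins from the Zybo master XDC and make the port names match the design wrapper:

```tcl
# Placeholder pins -- take the real HDMI DDC PACKAGE_PINs from the
# Zybo master XDC; port names must match the design wrapper.
set_property -dict { PACKAGE_PIN XX IOSTANDARD LVCMOS33 } [get_ports iic_0_scl_io]
set_property -dict { PACKAGE_PIN YY IOSTANDARD LVCMOS33 } [get_ports iic_0_sda_io]
```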
  22. Hey Marshall, In your post you mentioned you needed to have the DDC correctly assigned because it was causing problems. In my design, I'm not trying to output the input video to a display; its endpoint is the VDMA/Linux userspace. In this case, can I just leave the DDC pins on the DVI2RGB core disconnected, or will that still cause problems? In my Vivado synthesis it keeps giving me warnings about the unconnected DDC pins. Thanks, Chris
  23. Thanks for the update, Sam! The information is very helpful; thanks for providing your block diagram as well. I had a feeling the V4L2 driver would be the tricky part. I'll probably try to take the easy way out and just hardcode the resolution as you suggest. Will you come back and update here when you release the bare-metal design, or will that be its own post? Also, could you provide a link to the wiki you mention? I can't seem to find it through Google. Once again, thanks very much Sam, you have been incredibly helpful! One more thing: you mentioned that there is an additional pixel clock output on the DVI2RGB core. The core I am using does not have that additional clock output, but you did say it's the same frequency as the regular pixel clock. In my design, since I don't have that clock, do I just connect anything that would be connected to it to the regular pixel clock? I hope my question makes sense.
  24. Well, Run, looks like you've got all the help you need from Marshall! You can also check out this other thread I started here https://forum.digilentinc.com/topic/286-hdmi-sink-on-zybo-zynq/ where Sam has been helping me with the design I mentioned in the initial post, as he has been working on a similar design as well.
  25. Run, Sam from Digilent mentioned in another post that HDMI_OUT_EN needs to be driven low in order for it to act as a sink. That determines whether the 5V and HPD act as inputs or outputs, so if I'm understanding correctly there is no need for you to manually tie those high or low. I'm not 100% sure, however, so if anyone else knows please correct me. Also, Run, did you run into a problem such as this while making sure that the DVI2RGB port connections matched those in the XDC file: WARNING: [BD 41-1306] The connection to interface pin /dvi2rgb_0/TMDS_Data_n is being overridden by the user. This pin will not be connected as a part of interface connection TMDS I opened the XDC file to find the names of the ports, and created ports in the block diagram that matched those. When I connected the ports I just created to the pins of the DVI2RGB core, I received the above warning. Have I done something wrong or is this expected? *Just realized I may have been looking at the wrong XDC file. Run, do you know the proper names for the ports from the DVI2RGB core XDC? Are they TMDS_Data_n[2:0], TMDS_Data_p[2:0], TMDS_Clk_p, and TMDS_Clk_n?