Ciprian

Digilent Staff
  • Content Count: 31
  • Days Won: 2

Ciprian last won the day on January 14 and had the most liked content!

About Ciprian

  • Rank: Frequent Visitor


  1. Ciprian

    Video capture in Petalinux on Zybo

    Hi @Ben B,

    Regarding your question on using the Zybo Z7-20 to capture HDMI signals: it is possible, and using UIO is also an option, but because we are using the VDMA to get the video signal it's better to use a DMA driver. Unfortunately Xilinx does not provide a complete DMA driver for any of their DMA IPs, so I have been using this DMA driver, which includes the VDMA functionality as well. To make things as easy as possible, I generated an example project for you with the VDMA used to capture video streams and OpenCV functions to write a *.bmp file. What you need to do in order to get it working is:

    1. Load HDMI2BMP.elf to /home/root on the rootfs partition of your board.
    2. After the board boots, load the axi_dma_driver: root@Zybo-Z7-20:~# insmod /lib/modules/4.9.0-xilinx-v2017.4/extra/xilinx-axidma.ko
    3. Run HDMI2BMP.elf.

    This will generate a test.bmp in /home/root with the captured image. The source file for the app is in the SDK folder.

    The changes I had to make to the original Petalinux project (summarized in the sketch below) are:
    - create a new module in Petalinux: petalinux-create -t modules -n xilinx-axidma --enable
    - copy the necessary files to Petalinux-Zybo-Z7-20/Zybo-Z7-20/project-spec/meta-user/recipes-modules/xilinx-axidma/files and update the Makefile and the xilinx-axidma.bb
    - update the system-user.dtsi in /Petalinux-Zybo-Z7-20/Zybo-Z7-20/project-spec/meta-user/recipes-bsp/device-tree/files
    - write the demo program

    Hope this helps.

    -Ciprian

    Zybo-Z7-20-HDMI-RX_peta.zip
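
    A condensed sketch of those Petalinux-side steps, assuming the 2017.4 project layout used here (the packaging step and the default image names are typical values, not taken from the attached project):

        # From the Petalinux project root: create the out-of-tree module recipe and enable it
        petalinux-create -t modules -n xilinx-axidma --enable

        # Copy the driver sources into the new recipe (placeholder path), then adjust its Makefile and .bb file
        cp -r <path-to-driver-sources>/* project-spec/meta-user/recipes-modules/xilinx-axidma/files/

        # Rebuild the images after editing system-user.dtsi
        petalinux-build
        petalinux-package --boot --fsbl images/linux/zynq_fsbl.elf --fpga images/linux/system.bit --u-boot --force

        # On the booted board: insert the module, then run the demo
        insmod /lib/modules/4.9.0-xilinx-v2017.4/extra/xilinx-axidma.ko
        ./HDMI2BMP.elf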
  2. Ciprian

    Petalinux Microblaze: Sourcing BitBake failed?

    It is not recommended to install petalinux anywhere other than "/opt/pkg/petalinux"; we encountered similar issues when installing petalinux in other locations... That's why we usually specify in our petalinux projects how and where to install it (a minimal sketch is shown below). -Ciprian
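
    A minimal sketch of that install flow (the installer file name is just an example; use the release that matches the project you are building):

        # Create the recommended install location and make it writable by your user
        sudo mkdir -p /opt/pkg/petalinux
        sudo chown $USER:$USER /opt/pkg/petalinux

        # Run the PetaLinux installer into that directory
        ./petalinux-v2017.4-final-installer.run /opt/pkg/petalinux

        # Source the settings script before calling the petalinux-* tools
        source /opt/pkg/petalinux/settings.sh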
  3. Ciprian

    PYNQ-Z1 Zynq temperature in Linux

    Hi @lowuze,

    As far as I know there is no Digilent Linux for the PYNQ; we do, however, have Petalinux for the Arty Z7-20, which is similar to the PYNQ (the Arty does not have a microphone). It has XADC capabilities activated by default (driver in the kernel and a configured device tree). Unfortunately we do not have an example project on how to read the XADC from Linux, but you can access it in: /sys/bus/iio/devices/iio:device0 (a small shell sketch is shown below). For more information about how to use it please search for Xilinx XADC on Linux; THIS might also give you a better understanding of how the driver works and how to interface with it.

    -Ciprian
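
    A small shell sketch of reading the on-die temperature through the IIO interface, assuming the stock xilinx-xadc driver and its usual attribute names (the device index can differ between builds):

        cd /sys/bus/iio/devices/iio:device0

        # The driver exposes a raw reading plus an offset and scale for the conversion
        raw=$(cat in_temp0_raw)
        offset=$(cat in_temp0_offset)
        scale=$(cat in_temp0_scale)

        # Temperature in degrees Celsius = (raw + offset) * scale / 1000
        awk -v r="$raw" -v o="$offset" -v s="$scale" 'BEGIN { printf "%.1f C\n", (r + o) * s / 1000 }'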
  4. Ciprian

    Vivado synthesis fail.. Pcam

    Hi @Hong,

    Thank you for bringing up this issue. Unfortunately, without a log file we don't actually know what's wrong, because the message you have provided is a generic error message. Could you please provide the log file? Taking into account that 2018.3 is a very recent release (currently 4 days old), we have not yet updated our demo projects to this version. Differences between Vivado versions tend to come with changes which might make our demos fail. We usually recommend a specific Vivado version in our demo projects; if you go to the Zybo-Z7-20-pcam-5c demo you will notice that in the description of the project we specify "Created for Vivado 2017.4". We can guarantee that it will work in that version. If you need it to work with 2018.3, you will have to wait for us to update our demo or try to update it yourself.

    - Ciprian
  5. Ciprian

    I2C PMOD access under Linux

    My first suggestion was basically just replacing

        PmodTMP3_0: PmodTMP3@40900000 {
            compatible = "xlnx,PmodTMP3-1.0";
            interrupt-names = "I2C_Interrupt";
            interrupt-parent = <&microblaze_0_axi_intc>;
            interrupts = <4 2>;
            reg = <0x40900000 0x1000>;
        };

    with

        PmodTMP3_0: PmodTMP3@40900000 {
            clock-frequency = <100000000>;
            compatible = "xlnx,xps-iic-2.00.a";
            interrupt-names = "I2C_Interrupt";
            interrupt-parent = <&microblaze_0_axi_intc>;
            interrupts = <4 2>;
            reg = <0x40900000 0x1000>;
        };

    and it should work... in theory. Either way, I'm glad you got it going, and thank you for your tutorial.

    - Ciprian
  6. Ciprian

    Using PMOD RTCC with PetaLinux 2017.4 on Arty Z7

    Hi @troden,

    "/dev/misc/rtc" is where RTCs are instantiated once the RTC driver has been loaded and bound; for many embedded processors this is either part of their architecture, or there is an explicit RTC IC on the board with its driver in the kernel. Unfortunately this is not the case with the Pmod RTCC, which is an external RTC. You first need to add either the PmodRTCC IP or the AXI I2C IP to your hardware design, then update the dts to include the IP, and finally, before building the kernel, include the Xilinx I2C driver (and, optionally, if you manage to adapt the mcp7941x Linux driver to work with the Xilinx I2C driver, include that one as well). After building the kernel you can access the RTC at "/dev/misc/rtc" if the mcp7941x driver works; otherwise it will show up as a generic I2C device (a rough device-tree sketch is shown below). Here is a link on how to use the Xilinx I2C driver. Here is a link to the mcp7941x Linux driver discussion on a separate forum.

    - Ciprian
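
    A rough device-tree sketch of that second route (an AXI IIC controller with the RTCC declared as a child node); the base address, interrupt numbers and labels are placeholders for your design, and 0x6f is the usual MCP79410 RTCC slave address:

        axi_iic_0: i2c@40800000 {
            compatible = "xlnx,xps-iic-2.00.a";
            clock-frequency = <100000000>;
            reg = <0x40800000 0x10000>;
            interrupt-parent = <&intc>;
            interrupts = <29 4>;
            #address-cells = <1>;
            #size-cells = <0>;

            /* Pmod RTCC (MCP79410); picked up by the rtc-ds1307 driver if it is enabled in the kernel */
            rtc@6f {
                compatible = "microchip,mcp7941x";
                reg = <0x6f>;
            };
        };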
  7. Ciprian

    I2C PMOD access under Linux

    All the information regarding the Pmod TMP3 is here, including the pin layout and a link to the IC manufacturer's datasheet (for the register space configuration). As far as I can tell the IP is just a wrapper around an AXI IIC IP, so I'm guessing that if you instantiate it in the DTS using the Xilinx I2C driver you should be fine. If that doesn't work you can always try to simply replace the TMP3 IP with an AXI IIC IP and then, following the IC datasheet, you should get it going (a quick user-space check is sketched below).

    - Ciprian
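
    Once the sensor shows up behind a generic I2C controller, a quick user-space check with i2c-tools looks roughly like this (the bus number and the 0x48 slave address are assumptions based on the TCN75A's default address; adjust them to your design and jumper settings):

        # List the I2C adapters, then scan bus 0 for responding devices
        i2cdetect -l
        i2cdetect -y -r 0

        # Read the 16-bit ambient temperature register (0x00); note that the SMBus word
        # read returns the two bytes LSB-first, so swap them before converting
        i2cget -y 0 0x48 0x00 w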
  8. Ciprian

    device-tree (spartan 6)

    Unfortunately you are trying to do something hard on an old board using outdated tutorials, and it will not be easy because you seem to lack experience with embedded Linux as well. I'll try to help you as much as I can, but you will need to read up on the following terms and understand how they fit together and interact: hardware platform, device tree, first stage boot loader (aka FSBL), second stage boot loader (u-Boot), kernel, ramFS, root FS, user space and kernel space.

    Because you are using a soft-core processor (MicroBlaze) we cannot determine whether the dts (device tree source) is OK or not; the device tree is a representation of what you have in your hardware configuration and what base address each IP has, so without the EDK project, or at least the address editor, we cannot validate your source file.

    From what I've read, you seem to have everything up and going except Linux booting. I'm guessing there is a problem with your second stage boot loader (probably u-Boot) which has not been configured properly (u-Boot also has a defconfig which needs to have UART and debug capabilities active); unfortunately that's the most I can tell you without actually seeing the sources (one device-tree detail worth double-checking is sketched below). There is no tutorial that I know of which will take you step by step through this process, but I recommend you focus on this: https://xilinx-wiki.atlassian.net/wiki/spaces/A/pages/18842560/MicroBlaze

    - Ciprian
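
    One detail worth double-checking is that the device tree points the kernel console at your UART, otherwise Linux may be booting silently; a minimal sketch, assuming an AXI UART Lite console (the axi_uartlite_0 label is a placeholder for your design):

        aliases {
            serial0 = &axi_uartlite_0;
        };

        chosen {
            bootargs = "console=ttyUL0,115200";
            stdout-path = "serial0:115200n8";
        };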
  9. Ciprian

    Zynq book - tutorial 5 Zybo Z7

    Hi @n3wbie,

    I had a similar problem; for me it was the fact that I did not have enough space allocated to the stack in the linker script, so if you changed the dimension of the RecSamples variable then that might be the issue.

    Regarding the sine wave, I'm not sure what you want to do, but if you want to generate a sine wave from within the application and then play it back to the headphones then you can simply use the sin function in C (you need to add the math.h library and make sure you activate the library in the project settings, described here); a short sketch is shown below. Otherwise you can set Mic or Line In, connect the jack to your PC and play a sine-wave video from YouTube, then look at the recorded samples. I'm guessing you are more familiar with MATLAB; you can try that too. The idea is that as long as you are feeding it the right samples, either way works.

    Hope this helps,
    Ciprian
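
    A short sketch of filling a buffer with sine samples (the 48 kHz sample rate, 440 Hz tone and 16-bit amplitude are assumptions; adapt them to your I2S configuration, and remember to enable the math library as mentioned above):

        #include <math.h>
        #include <stdint.h>

        #ifndef M_PI
        #define M_PI 3.14159265358979323846
        #endif

        #define SAMPLE_RATE  48000            /* samples per second (assumed) */
        #define TONE_HZ      440.0            /* test tone frequency */
        #define NUM_SAMPLES  SAMPLE_RATE      /* one second of audio */

        static int16_t sine_samples[NUM_SAMPLES];

        void fill_sine(void)
        {
            for (int i = 0; i < NUM_SAMPLES; i++) {
                /* scale the [-1, 1] sin() output to a signed 16-bit sample */
                sine_samples[i] = (int16_t)(30000.0 * sin(2.0 * M_PI * TONE_HZ * i / SAMPLE_RATE));
            }
        }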
  10. Ciprian

    Zynq book - tutorial 5 Zybo Z7

    Hi @n3wbie,

    The working project is attached. What you have to take into account when using these audio codecs with Digilent products is that you need to configure the codec (using I2C) as well as receive the samples using the I2S IP core; basically one is for the control of the codec and the other one is for receiving the samples. I have written a small driver for both the I2S core and the I2C SSM2603, which is in the source files of the SDK project (in the sdk folder) and which configures the registers for the codec and the I2S IP core; the documentation for the codec can be found here. The IP core has not yet been documented, which is the main reason we have not added it to the Digilent vivado-ip library, but it needs a 100 MHz input to be able to synthesize the 12.288 MHz MCLK and the subsequent clocks for the I2S protocol.

    The demo project reads the buttons and, based on the ones you press, it will:
    BTN0 - Record 1s
    BTN1 - Set Mic input
    BTN2 - Set Line In input
    BTN3 - Playback 1s

    The project is not really optimized, so it uses a variable "RecSamples", allocated on the stack, which holds the recorded samples (48000 samples representing 1s at a 48 kHz sampling rate) and is also used for playback, so don't press playback before record (a small sketch of the buffer is shown below). The rest should be easily traceable from the comments in the driver and the main source code. If you have any other questions feel free to ask.

    Ciprian

    ZyboZ7Audio.zip
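
    A rough sketch of the buffer described above (the surrounding code is hypothetical and not the project's exact API; see the attached sources for the real implementation):

        #include "xil_types.h"

        #define SAMPLE_RATE_HZ  48000   /* 48 kHz sampling rate -> 48000 samples per second */

        int main(void)
        {
            /* One second of audio; RecSamples lives on the stack in the demo, so the
             * linker script must reserve enough stack space for it (see the note on
             * stack size in the earlier reply). */
            u32 RecSamples[SAMPLE_RATE_HZ];

            /* hypothetical flow mirroring the button mapping: record into RecSamples
             * on BTN0, select the input with BTN1/BTN2, play it back with BTN3 */
            (void)RecSamples;
            return 0;
        }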
  11. Ciprian

    Zynq book - tutorial 5 Zybo Z7

    Hi @n3wbie,

    I tried to rebuild your project, but there are some errors from downloading the project from Drive; for future reference, when uploading a project please make sure you archive it first. Regardless, I have started to build a simple demo that does exactly what you want to do. It will not be fully documented, but it will be documented enough so that you can understand how the IP works and how to use it in future projects. This will probably take a bit of time, so I will have the demo ready for you by the beginning of next week, and I will post the project and the description as a reply to this post.

    Regards,
    Ciprian
  12. Ciprian

    Genesys-2: Ethernet Interface - design migration not working

    Unfortunately it is not as simple as copying from one board to another. First of all, you must make sure that you check the configuration of the Ethernet IP in the hardware design and make sure it is configured correctly; compare it to our user demo, although I'm not sure if we are using the Ethernet Lite IP in our design. https://reference.digilentinc.com/learn/programmable-logic/tutorials/genesys-2-user-demo/start

    The second thing you must do, and this is most likely the issue in your case, is make sure that the IP ports in the XDC are mapped correctly. From a Linux and device-tree perspective, if you have the same IPs as the original design for the KC705 (and all the peripherals are present on the Genesys 2) then it should be OK. Basically, if only the board changes in your project, it is more likely a board-specific issue (wrongly mapped XDC, peripherals not present, etc.) rather than a software issue.

    Ciprian
  13. Ciprian

    booting from sd card nexys video

    Depending on the size of your code (.elf file) you can choose to either run it from the BRAM or from DDR (provided you have a DDR controller IP in the block design). If your .elf file is small enough to fit into the BRAM, you can splice it into the bit file using SDK. This is done by programming the FPGA from SDK and, in the Software Configuration section, putting your .elf file in instead of the bootloop (the blue part in the attached picture); then you go to <SDK workspace directory>\<Name of Hardware Platform> and copy the download.bit onto your SD card (a command-line alternative is sketched below). If your code is not small enough to fit into the BRAM, then you have to use QSPI.
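
    If you prefer the command line over the SDK dialog, the same merge can typically be done with updatemem; the paths and the processor instance name are placeholders for your own project:

        # Merge the application .elf into the bitstream's BRAM initialization
        updatemem -meminfo <hw_platform>/system.mmi \
                  -data <app>/Debug/app.elf \
                  -bit <hw_platform>/system.bit \
                  -proc <block_design_instance>/microblaze_0 \
                  -out download.bit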
  14. Ciprian

    Adding Nexys Video Board to Vivado HLS ???

    It is close; the version which I have is:

        <board name="Nexys_Video" display_name="Nexys Video" family="artix7"
               part="xc7a200tsbg484-1" device="xc7a200t" package="sbg484"
               speedgrade="-1" vendor="digilentinc.com" />

    Although I don't know if we have an official version for this board in HLS, the one I provided should work. If it does not, you can also just choose the correct part in the parts list; for HLS there are no board-specific settings besides the FPGA part.
  15. Ciprian

    The Complete HLS Procedure

    Well, generally you have a C code which you want to optimize, of which only a part can be optimized, so you are right so far. This is due to different limitations in the way HLS transforms your C code into HDL; basically, once you start writing code for HLS you need to understand how that code will be synthesised by HLS in order to obtain the best performance. You can find more details about the directives in UG902.

    I have unfortunately not worked with CPU-GPU acceleration so I don't know exactly how it works, but I assume that you have the GPU and the CPU in the same PC/laptop. If this is the case then you can offload some functions to the GPU without actually knowing the interface between the CPU and GPU; you just give it the task and when it's finished, it's finished. Now if there are a lot of small actions which the GPU has to do, there will be a back and forth between the GPU and the CPU for reading and writing data, in which case the bandwidth between the processors becomes an issue. At least this is how I understand the issue; please correct me if I'm wrong or the info is incomplete.

    When it comes to FPGAs, most of the time you don't have an FPGA in a PC/laptop (I have not heard of one so far, but there might be one somewhere), so you have to choose how to interface with your FPGA (USB, PCI, Ethernet, etc.); depending on the required speed and bandwidth you can choose what you like. Being on the Digilent forum, I assume you have one of our boards or are considering buying one, so you either have to make do with whatever high-speed interface you have on your dev board or you can buy a board which suits your needs. The idea is that, not knowing what HLS is going to be used for and how, the authors of UG871 will not focus on the bandwidth between the FPGA and something else, because they assume you will choose an interface which can handle the desired data at the desired speed. There is also the point that you might have a soft-core processor implemented in the FPGA and not want to send data to a different processor outside the FPGA, in which case you will have to focus on the throughput between the soft-core CPU and the HLS core. Or you could use a Zynq, which has an ARM processor next to an FPGA, but I digress...

    Now, from the HLS perspective, they do take into account the throughput of the data which will be flowing into the HLS core, and they do warn you about the limitations of the core through the initiation interval (aka 1/throughput) which you get in the synthesis report, but that is the maximum throughput which your HLS core can support, and it can be optimised using the HLS directives. If you need high-speed transmission of bulk data then you will need to interface your HLS core using an AXI-Stream interface (a small sketch is shown below). For example, if you want to accelerate your C code from a PC using PCI it would look something like this:

    CPU <=> PCI <=PC side============FPGA side=> PCI <=> PCI-to-AXI-Stream core <=> HLS core

    Hope this has clarified some of your questions...
    Ciprian
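
    As a small illustration of the AXI-Stream route, here is a generic HLS core with a streaming input and output and an AXI-Lite control interface; it is not tied to any Digilent project, just a sketch using the standard Vivado HLS types and pragmas:

        #include "ap_axi_sdata.h"
        #include "hls_stream.h"

        typedef ap_axis<32, 0, 0, 0> axis_word;

        // Reads 32-bit words from an AXI-Stream, scales them, and writes them back out.
        void hls_scale(hls::stream<axis_word> &in, hls::stream<axis_word> &out, int gain)
        {
        #pragma HLS INTERFACE axis port=in
        #pragma HLS INTERFACE axis port=out
        #pragma HLS INTERFACE s_axilite port=gain
        #pragma HLS INTERFACE s_axilite port=return

            axis_word w;
            do {
        #pragma HLS PIPELINE II=1
                w = in.read();
                w.data = w.data * gain;   // the "accelerated" work goes here
                out.write(w);
            } while (!w.last);            // stop at the end of the incoming packet
        }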