Ciprian

Digilent Staff
  • Content Count: 53
  • Joined
  • Last visited
  • Days Won: 3

Reputation Activity

  1. Like
    Ciprian got a reaction from ahmedengr.bilal in PCAM elf on PetaLinux from SD card   
    Hi @ahmedengr.bilal,
    As I mentioned in the previous post, there is no HDMI output from the Linux side; neither the embedded rootFS provided by PetaLinux nor the kernel configuration we give out is set up to accommodate this feature.
    Regarding the missing media-ctl and v4l2-ctl: you have not activated v4l-utils in the rootfs configuration of PetaLinux. To do this, navigate to your PetaLinux project folder and run:
    petalinux-config -c rootfs
    Once the menu appears, go to Filesystem Packages->misc->v4l-utils and activate v4l-utils, libv4l, and media-ctl. Rebuild the whole project and it should work.
    -Ciprian
  2. Like
    Ciprian got a reaction from Kris Persyn in Digital Twin   
    Hi @Kris Persyn,
    It depends on how you manage your resources. Driving immersive visuals on an HDMI display can be done in multiple ways at different resolutions; some are PL-taxing, others are DDR-taxing. You could generate entire frame buffers in PL or PS, you could find an optimal algorithm that changes only the previous frame, or you could allocate a high number of frame buffers and then run them in a loop (a minimal sketch of that last option is below).
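    For illustration, here is a small, untested C sketch of cycling through a ring of pre-allocated frame buffers; the buffer count, resolution, and the draw_scene/display_flip helpers are my assumptions, not demo code:
    #include <stdint.h>

    #define N_FRAMES 3          /* assumed triple buffering */
    #define WIDTH    1280       /* 720p */
    #define HEIGHT   720

    static uint32_t frames[N_FRAMES][HEIGHT * WIDTH];  /* one 32-bit word per pixel */

    void draw_scene(uint32_t *fb);    /* hypothetical drawing routine */
    void display_flip(uint32_t *fb);  /* hypothetical: point the VDMA at fb */

    /* Draw into the next buffer while the current one is being scanned out. */
    void render_loop(void) {
        int cur = 0;
        for (;;) {
            int next = (cur + 1) % N_FRAMES;
            draw_scene(frames[next]);
            display_flip(frames[next]);
            cur = next;
        }
    }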
    It also depends on how MATLAB synthesizes the IP you will need to add to your design.
    If you design your project properly and don't aim for a resolution higher than 720p (I'm being conservative; we managed to drive the HDMI at 1080p with processing filters without a problem), I think it should be enough for what you want to do, resource-wise.
    My suggestion: download and install Vivado, download and install the board files, create and implement your project, look at the resource consumption, and then buy a board.
    - Ciprian
  3. Like
    Ciprian got a reaction from jpeyron in Qustions about arty-z7 HDMI out demo   
    Hi @Mingfei,
    If you are referring to the Arty-Z7-XX-hdmi-out demo, where XX is either 10 or 20, then you have a series of options to make it work faster.
    Before I go into detail, let's talk a bit about what the demo actually does. It reads one frame (dispCtrl.framePtr[dispCtrl.curFrame]), takes every pixel, and inverts it into the output frame (dispCtrl.framePtr[nextFrame]). This is a software approach (based on how DemoInvertFrame() is written), which means that how fast the data can be processed depends on the processor. Therefore, things like printing text on the console (like printf("Volage.....)) will slow it down. As for the solutions, there are two ways to try to speed it up:
    1. Software: the VDMA has a feature called frame buffer parking; you can read all about it in the user guide (PG020 for v6.3). It basically parks the output on one buffer and lets you edit the other two frames without interrupting the video-out stream. This will increase your output to 50 fps, but the refresh rate of what you want to do, the actual processing, will still only run at 5 frames per second.
    2. Hardware: you could take advantage of your FPGA and write an IP core which does the inversion in HW, thus offloading the task from the processor and reducing the processing delay to almost nothing; this, of course, means you will have to redesign your project. I would recommend writing the IP in HLS, because it's easier, and placing it at the output stage between the VDMA and the Subset Converter; a rough sketch is below.
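    To give you an idea, here is a minimal, untested HLS sketch of such an inversion core (the type widths and names are my assumptions, not code from the demo):
    #include <ap_axi_sdata.h>
    #include <hls_stream.h>

    typedef ap_axiu<24, 1, 1, 1> pixel_t;   /* 24-bit RGB beat with AXI-Stream sidebands */

    /* Inverts every pixel of the stream; meant to sit between the VDMA
       and the Subset Converter on the output path. */
    void invert_stream(hls::stream<pixel_t> &in, hls::stream<pixel_t> &out, int n_pixels) {
    #pragma HLS INTERFACE axis port=in
    #pragma HLS INTERFACE axis port=out
        for (int i = 0; i < n_pixels; i++) {
    #pragma HLS PIPELINE II=1
            pixel_t p = in.read();
            p.data = ~p.data;                /* invert RGB, keep TLAST/TUSER as-is */
            out.write(p);
        }
    }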
    -Ciprian
  4. Like
    Ciprian got a reaction from jpeyron in Wrong colors and rendering output in HDMI with Linux base system   
    Hi @Luighi Vitón,
    I'm not sure it's an HDMI issue; have you tried using X11 over SSH? Do you get the same result?
    -Ciprian
  5. Like
    Ciprian got a reaction from jpeyron in Hdmi out from zybo   
    Try adding this:
    &i2c0 {
        clock-frequency = <100000>;
        status = "okay";
    };
    here: <petalinux_project>/project-spec/meta-user/recipes-bsp/device-tree/files/system-user.dtsi
    -Ciprian
  6. Like
    Ciprian got a reaction from Michael P in Hdmi out from zybo   
    Try adding this:
    &i2c0 {
        clock-frequency = <100000>;
        status = "okay";
    };
    here: <petalinux_project>/project-spec/meta-user/recipes-bsp/device-tree/files/system-user.dtsi
    -Ciprian
  7. Like
    Ciprian got a reaction from Cristian.Fatu in Digital Twin   
    Hi @Kris Persyn,
    It depends on how you manage your resources. Driving immersive visuals on an HDMI display can be done in multiple ways at different resolutions; some are PL-taxing, others are DDR-taxing. You could generate entire frame buffers in PL or PS, you could find an optimal algorithm that changes only the previous frame, or you could allocate a high number of frame buffers and then run them in a loop.
    It also depends on how MATLAB synthesizes the IP you will need to add to your design.
    If you design your project properly and don't aim for a resolution higher than 720p (I'm being conservative; we managed to drive the HDMI at 1080p with processing filters without a problem), I think it should be enough for what you want to do, resource-wise.
    My suggestion: download and install Vivado, download and install the board files, create and implement your project, look at the resource consumption, and then buy a board.
    - Ciprian
  8. Like
    Ciprian got a reaction from BogdanVanca in Digital Twin   
    Hi @Kris Persyn,
    It depends on how you manage your resources. Driving immersive visuals on an HDMI display can be done in multiple ways at different resolutions; some are PL-taxing, others are DDR-taxing. You could generate entire frame buffers in PL or PS, you could find an optimal algorithm that changes only the previous frame, or you could allocate a high number of frame buffers and then run them in a loop.
    It also depends on how MATLAB synthesizes the IP you will need to add to your design.
    If you design your project properly and don't aim for a resolution higher than 720p (I'm being conservative; we managed to drive the HDMI at 1080p with processing filters without a problem), I think it should be enough for what you want to do, resource-wise.
    My suggestion: download and install Vivado, download and install the board files, create and implement your project, look at the resource consumption, and then buy a board.
    - Ciprian
  9. Like
    Ciprian got a reaction from Amin in Zybo Z7-20 Petalinux 2018.3/Any linux Installation   
    Hi @Amin,
    It depends on what you are planning to do. If you only need Linux running on your Zybo Z7-20, then I can give you our pre-built BOOT.bin and kernel+rootfs images. This approach is based on our demo HW platform; you will have the benefit of a lot of IPs in the design, which allows a very versatile approach to the board. Unfortunately, it will not allow you to add any new IP to the design.
    If you want to build your own platform and base the on-board Linux on it, then you will need to install PetaLinux on Linux (I recommend Ubuntu) and build/customize it the way you want; @jpeyron sent you the links for this in the previous post.
    -Ciprian
  10. Like
    Ciprian got a reaction from theUltimateSource in Video capture in Petalinux on Zybo   
    Hi @Ben B,
    Regarding your question on using the Zybo Z7-20 to capture HDMI signals: it is possible, and using UIO is also an option, but because we are using the VDMA to get the video signal, it's better to use a DMA driver. Unfortunately, Xilinx does not provide a complete DMA driver for any of their DMA IPs, so I have been using this DMA driver, which includes the VDMA functionality as well. To make things as easy as possible, I generated an example project for you with the VDMA used to capture video streams and OpenCV functions to write a *.bmp file.
     
    What you need to do in order to get it working is:
    1. Load HDMI2BMP.elf to /home/root on the rootfs partition of your board.
    2. After the board boots, load the AXI DMA driver:
    root@Zybo-Z7-20:~# insmod /lib/modules/4.9.0-xilinx-v2017.4/extra/xilinx-axidma.ko
    3. Run HDMI2BMP.elf.
    This will generate a test.bmp in /home/root with the captured image.
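    In case it is useful, the core of writing such a frame to a .bmp looks roughly like this (a simplified, untested C sketch without OpenCV, assuming a packed 24-bit RGB frame and a little-endian host; the demo itself uses OpenCV):
    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    /* Write a packed RGB888 frame as a 24-bit .bmp file. */
    static int write_bmp(const char *path, const uint8_t *rgb, int32_t w, int32_t h) {
        int32_t row = (3 * w + 3) & ~3;            /* rows are padded to 4 bytes */
        uint32_t data = (uint32_t)row * h, size = 54 + data;
        uint8_t hdr[54] = { 'B', 'M' };
        memcpy(hdr + 2, &size, 4);                 /* file size */
        hdr[10] = 54;                              /* offset to pixel data */
        hdr[14] = 40;                              /* BITMAPINFOHEADER size */
        memcpy(hdr + 18, &w, 4);                   /* width */
        memcpy(hdr + 22, &h, 4);                   /* height */
        hdr[26] = 1;                               /* colour planes */
        hdr[28] = 24;                              /* bits per pixel */
        memcpy(hdr + 34, &data, 4);                /* pixel data size */
        FILE *f = fopen(path, "wb");
        if (!f) return -1;
        fwrite(hdr, 1, sizeof(hdr), f);
        for (int32_t y = h - 1; y >= 0; y--) {     /* BMP stores rows bottom-up */
            for (int32_t x = 0; x < w; x++) {
                const uint8_t *p = rgb + 3 * (y * w + x);
                uint8_t bgr[3] = { p[2], p[1], p[0] };  /* BMP expects BGR order */
                fwrite(bgr, 1, 3, f);
            }
            uint8_t pad[3] = { 0 };
            fwrite(pad, 1, row - 3 * w, f);
        }
        fclose(f);
        return 0;
    }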
     
    The source file for the app is in the SDK folder. Changes which I had to make to the original PetaLinux project are:
    - create a new module in PetaLinux:
    petalinux-create -t modules -n xilinx-axidma --enable
    - copy the necessary files to Petalinux-Zybo-Z7-20/Zybo-Z7-20/project-spec/meta-user/recipes-modules/xilinx-axidma/files and update the Makefile and the xilinx-axidma.bb
    - update the system-user.dtsi in /Petalinux-Zybo-Z7-20/Zybo-Z7-20/project-spec/meta-user/recipes-bsp/device-tree/files
    - write the demo program
     
     
    Hope this helps.
    -Ciprian
    Zybo-Z7-20-HDMI-RX_peta.zip
  11. Like
    Ciprian got a reaction from jpeyron in Video capture in Petalinux on Zybo   
    Hi @Ben B,
    Regarding your question on using the Zybo Z7-20 to capture HDMI signals: it is possible, and using UIO is also an option, but because we are using the VDMA to get the video signal, it's better to use a DMA driver. Unfortunately, Xilinx does not provide a complete DMA driver for any of their DMA IPs, so I have been using this DMA driver, which includes the VDMA functionality as well. To make things as easy as possible, I generated an example project for you with the VDMA used to capture video streams and OpenCV functions to write a *.bmp file.
     
    What you need to do in order to get it working is:
    1. Load HDMI2BMP.elf to /home/root on the rootfs partition of your board.
    2. After the board boots, load the AXI DMA driver:
    root@Zybo-Z7-20:~# insmod /lib/modules/4.9.0-xilinx-v2017.4/extra/xilinx-axidma.ko
    3. Run HDMI2BMP.elf.
    This will generate a test.bmp in /home/root with the captured image.
     
    The source file for the app is in the SDK folder. Changes which I had to make to the original PetaLinux project are:
    - create a new module in PetaLinux:
    petalinux-create -t modules -n xilinx-axidma --enable
    - copy the necessary files to Petalinux-Zybo-Z7-20/Zybo-Z7-20/project-spec/meta-user/recipes-modules/xilinx-axidma/files and update the Makefile and the xilinx-axidma.bb
    - update the system-user.dtsi in /Petalinux-Zybo-Z7-20/Zybo-Z7-20/project-spec/meta-user/recipes-bsp/device-tree/files
    - write the demo program
     
     
    Hope this helps.
    -Ciprian
    Zybo-Z7-20-HDMI-RX_peta.zip
  12. Like
    Ciprian got a reaction from JColvin in Vivado sysnthesis fail..Pcam   
    Hi @Hong,
    Thank you for bringing up this issue. Unfortunately, without a log file we don't actually know what's wrong, because the message you have provided is a generic error message. Could you please provide the log file?
    Taking into account that 2018.3 is a very recent release (currently 4 days old), we have not yet updated our demo projects to this version. Differences between Vivado versions tend to come with changes which might make our demos fail. We usually recommend a specific Vivado version in our demo projects; if you go to the Zybo-Z7-20-pcam-5c demo, you will notice that in the description of the project we specify "Created for Vivado 2017.4". We can guarantee that it will work in that version.
    If you need it to work with 2018.3, you will have to wait for us to update our demo or try to update it yourself.
    - Ciprian
  13. Like
    Ciprian got a reaction from n3wbie in Zynq book - tutorial 5 Zybo Z7   
    Hi @n3wbie,
    I had a similar problem; for me it was the fact that I did not have enough space allocated to the stack in the linker script. If you changed the dimension of the RecSamples variable, that might be the issue.
    Regarding the sine wave, I'm not sure what you want to do, but if you want to generate a sine wave from within the program and then play it back to the headphones, you can simply use the sin function in C (you need to add the math.h library and make sure you activate the library in the project settings, described here); see the sketch below. Otherwise you can set Mic or Line In as the input, connect the jack to your PC, and play a sine tone from YouTube; then you can look at the recorded samples. I'm guessing you are more familiar with MATLAB; you can try that too. The idea is that as long as you are feeding it the right samples, either way works.
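    Something along these lines (an untested sketch; the buffer name, 16-bit sample format, and tone frequency are my assumptions):
    #include <math.h>
    #include <stdint.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    #define SAMPLE_RATE 48000
    #define TONE_HZ     1000
    #define N_SAMPLES   48000                 /* 1 s of audio at 48 kHz */

    static int16_t samples[N_SAMPLES];        /* static, so it stays off the stack */

    void fill_sine(void) {
        for (int i = 0; i < N_SAMPLES; i++) {
            double t = (double)i / SAMPLE_RATE;
            samples[i] = (int16_t)(32767.0 * sin(2.0 * M_PI * TONE_HZ * t));
        }
    }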
    Hope this helps,
    Ciprian
  14. Like
    Ciprian got a reaction from n3wbie in Zynq book - tutorial 5 Zybo Z7   
    Hi @n3wbie,
    The working project is attached. What you have to take into account when using these audio codecs with Digilent products is that you need to configure the codec (using I2C) as well as receive the samples using the I2S IP core. Basically, one is for the control of the codec and the other is to receive the samples. I have written a small driver for both the I2S core and the I2C SSM2603, which is in the source files of the SDK project (in the sdk folder) and which configures the registers for the codec and the I2S IP core; the documentation for the codec can be found here. The IP core has not yet been documented, which is the main reason we have not added it to the Digilent vivado-ip library, but it needs a 100 MHz input to be able to synthesize the 12.288 MHz MCLK and the subsequent clocks for the I2S protocol.
    The demo project reads the buttons and based on the ones you press it will:
    BTN0 - Record 1s
    BTN1 - Set Mic input
    BTN2 - Set Line In input
    BTN3 - Playback 1s
    The project is not really optimized: it uses a variable, "RecSamples", allocated on the stack, which holds the recorded samples (48000 samples, representing 1 s at a 48 kHz sampling rate) and is also used for playback, so don't press playback before record (see the allocation sketch below). The rest should be easily traceable from the comments in the driver and the main source code.
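    If you enlarge the buffer and start hitting the stack limit (as in the other thread), a quick sketch of the usual fix, assuming a 32-bit sample type (the real type may differ):
    #include <stdint.h>

    #define N_SAMPLES 48000                /* 1 s at 48 kHz */

    /* File scope places the buffer in .bss rather than on the stack;
       alternatively, raise _STACK_SIZE in the SDK linker script (lscript.ld). */
    static uint32_t RecSamples[N_SAMPLES];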
    If you have any other questions, feel free to ask.
    Ciprian
    ZyboZ7Audio.zip
  15. Like
    Ciprian got a reaction from jpeyron in Zynq book - tutorial 5 Zybo Z7   
    Hi @n3wbie,
    The working project is attached. What you have to take into account when using these audio codecs with Digilent products is that you need to configure the codec (using I2C) as well as receive the samples using the I2S IP core. Basically, one is for the control of the codec and the other is to receive the samples. I have written a small driver for both the I2S core and the I2C SSM2603, which is in the source files of the SDK project (in the sdk folder) and which configures the registers for the codec and the I2S IP core; the documentation for the codec can be found here. The IP core has not yet been documented, which is the main reason we have not added it to the Digilent vivado-ip library, but it needs a 100 MHz input to be able to synthesize the 12.288 MHz MCLK and the subsequent clocks for the I2S protocol.
    The demo project reads the buttons and based on the ones you press it will:
    BTN0 - Record 1s
    BTN1 - Set Mic input
    BTN2 - Set Line In input
    BTN3 - Playback 1s
    The project is not really optimized: it uses a variable, "RecSamples", allocated on the stack, which holds the recorded samples (48000 samples, representing 1 s at a 48 kHz sampling rate) and is also used for playback, so don't press playback before record. The rest should be easily traceable from the comments in the driver and the main source code.
    If you have any other questions, feel free to ask.
    Ciprian
    ZyboZ7Audio.zip
  16. Like
    Ciprian got a reaction from n3wbie in Zynq book - tutorial 5 Zybo Z7   
    Hi @n3wbie,
    I tried to rebuild your project, but there are some errors from downloading the project from Drive; for future reference, when uploading a project, make sure you archive it first. Regardless, I started building a simple demo that does exactly what you want to do. It will not be fully documented, but it will be documented enough so that you can understand how the IP works and how to use it in future projects.
    This will probably take a bit of time, so I will have the demo ready for you by the beginning of next week, and I will post the project and the description as a reply to this post.
    Regards,
    Ciprian
  17. Like
    Ciprian got a reaction from jpeyron in The Complete HLS Procedure   
    Well, generally you have C code which you want to optimize, of which only a part can be optimized, so you are right so far. This is due to different limitations in the way HLS transforms your C code into HDL; basically, once you start writing code for HLS, you need to understand how that code will be synthesized by HLS in order to obtain the best performance. You can find more details about the directives in UG902.
    I have unfortunately not worked with CPU-GPU acceleration, so I don't know exactly how it works, but I assume that you have the GPU and the CPU in the same PC/laptop. If this is the case, you can offload some functions to the GPU without actually knowing the interface between the CPU and the GPU; you just give it the task, and when it's finished, it's finished. Now, if there are a lot of small actions which the GPU has to do, there will be a back-and-forth between the GPU and the CPU for reading and writing data, in which case the bandwidth between the processors becomes an issue. At least this is how I understand the issue; please correct me if I'm wrong or the info is incomplete.
    When it comes to FPGAs, most of the time you don't have an FPGA in a PC/laptop (I have not heard of one so far, but there might be one somewhere), so you have to choose how to interface with your FPGA (USB, PCI, Ethernet, etc.); depending on the required speed and bandwidth, you can choose what you like. Being on the Digilent forum, I assume you have one of our boards or are considering buying one, so you either have to make do with whatever high-speed interface you have on your dev board, or you can buy a board which suits your needs. The idea is that, not knowing what HLS is going to be used for and how, the authors of UG871 will not focus on the bandwidth between the FPGA and something else, because they assume you will choose an interface which can handle the desired data at the desired speed. There is also the possibility that you have a soft-core processor implemented in the FPGA and do not want to send data to a different processor outside the FPGA, in which case you will have to focus on the throughput between the soft-core CPU and the HLS core. Or you could use a Zynq, which has an ARM processor next to the FPGA, but I digress....
    Now, from the HLS perspective, they do take into account the throughput of the data which will be flowing into the HLS core, and they do warn you about the limitations of the core via the initiation interval (aka 1/throughput), which you will get in the synthesis report; but that is the maximum throughput which your HLS core can support, and it can be optimized using the HLS directives. If you need high-speed transmission of bulk data, you will need to interface your HLS core using an AXI-Stream interface. For example, if you want to accelerate your C code from a PC over PCI, it would look something like this:
    CPU <=> PCI <=PC side============FPGA side=> PCI <=> PCI-to-Axi-Stream core <=> HLS core 
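    To make the initiation-interval point concrete, here is a minimal, untested HLS C sketch (the function, widths, and names are mine, not from UG902): with the PIPELINE directive, HLS tries to accept one new stream beat per clock, and the synthesis report tells you whether II=1 was actually achieved.
    #include <hls_stream.h>
    #include <ap_int.h>

    /* Hypothetical accelerated function: scales each incoming 32-bit sample.
       The axis pragmas expose the ports as AXI-Stream interfaces. */
    void scale_stream(hls::stream<ap_int<32> > &in,
                      hls::stream<ap_int<32> > &out,
                      int len, int gain) {
    #pragma HLS INTERFACE axis port=in
    #pragma HLS INTERFACE axis port=out
        for (int i = 0; i < len; i++) {
    #pragma HLS PIPELINE II=1
            out.write(in.read() * gain);
        }
    }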
    Hope this has clarified some of your questions...
    Ciprian 
  18. Like
    Ciprian got a reaction from Alex in FPGA USB Audio   
    I would recommend something Zynq-based, because you already have the USB controller in the ARM of the Zynq (Zybo or Zed). If you want to build your own USB controller or use a custom one in the FPGA, then you have to find something with only an FPGA (Nexys Video or Genesys 2). All four of these boards have an audio codec on them which supports the I2S protocol.
    Stay away from the Pynq, Arty Z7, Nexys 4 DDR, and Nexys 4; these have an analog circuit which receives the audio samples via PWM or PDM, so no audio codec interface.
    Cipi
  19. Like
    Ciprian got a reaction from Bianca in FPGA USB Audio   
    I would recommend something Zynq-based, because you already have the USB controller in the ARM of the Zynq (Zybo or Zed). If you want to build your own USB controller or use a custom one in the FPGA, then you have to find something with only an FPGA (Nexys Video or Genesys 2). All four of these boards have an audio codec on them which supports the I2S protocol.
    Stay away from the Pynq, Arty Z7, Nexys 4 DDR, and Nexys 4; these have an analog circuit which receives the audio samples via PWM or PDM, so no audio codec interface.
    Cipi
  20. Like
    Ciprian got a reaction from Bianca in Github Digilent/linux-digilent repository   
    Hi Saurabh,
    Sadly, the documentation you are referring to is outdated; newer versions of the Linux kernel do not have the same config files as the ones presented in that document. Also, as far as I know, Digilent does not have an updated version of this document or a tutorial to guide you through the maze which is embedded Linux and Linaro.
    Now, depending on what you want, you have a couple of options:
    If you are looking only to use Linux on the board, for the sake of using Linux without actually adding or creating custom IP in the FPGA part of the Zynq, I recommend using a "plug and play" kind of solution called Xillinux, by Xillybus. It is sort of a black-box approach from a hardware perspective, with a hard-to-modify Linux kernel. You basically just have to build the hardware, copy the kernel and the image onto an SD card, and you are good to go. It is also more stable than Linaro, from what I tested.
    http://xillybus.com/xillinux
    If you are more into building your custom hardware and custom kernel, I recommend using the Analog Devices tutorial for the ZedBoard. Although it is harder to follow and has more steps than Xillinux, it will allow you to change things with relative ease. This approach will also teach you more about embedded Linux, because it will force you to self-study the subject.
    https://wiki.analog.com/resources/tools-software/linux-drivers/platforms/zynq#build_and_install_the_kernel_image   
    Also, if you are more inclined towards this solution, I strongly suggest you look over the Zybo Linux example from Digilent. Granted, it is not made for the ZedBoard, but the configuring and building of the kernel and hardware is very similar; you might get a better perspective on the whole embedded Linux world if you also look at the Zybo example.
    http://www.instructables.com/id/Embedded-Linux-Tutorial-Zybo/?ALLSTEPS
    There is no easy, beginner-friendly solution for what you want, nothing like the Raspberry Pi where you just plug in an SD card and flip a switch. But ultimately it will teach you more than the Pi, and with a little persistence it will also be a lot more "powerful" than the Pi.
    Cheers,
    Ciprian
  21. Like
    Ciprian got a reaction from Bianca in ZYBO not working with Vivado 2015.4   
    Hello Ag,
    I downloaded the "reference design" which was uploaded by kypropex. I took the board support files from the Digilent website (https://reference.digilentinc.com/vivado:boardfiles2015) and copied the zybo folder into D:\Xilinx\Vivado\2015.4\data\boards\board_files (I installed Vivado on drive D:\). The validation of his design was successful, and so was the bit generation. I retried this with the board files which you uploaded, and I didn't get an error when validating the project and generating the bit file. Make sure that the board files are in the correct folder: D:\Xilinx\Vivado\2015.4\data\boards\board_files\zybo\B.3. The good news is that, as far as I can tell, you are right: there is something wrong with the identification of the board files.
    Just to make sure that the board is functioning correctly: the SDK project from kypropex has a valid bit file (which was probably exported by him); try to program that file, along with the elf file, into the FPGA from SDK 2015.4. This is covered on page 32 of the Zynq Book.
    Ciprian
  22. Like
    Ciprian got a reaction from Bianca in ZYBO not working with Vivado 2015.4   
    1. It should work. I currently have ISE 14.7, Vivado 2014.3, Vivado 2014.4, Vivado 2015.1, and Vivado 2015.4 installed, so Xilinx products will probably not cause any issues among themselves. The only issues I have experienced so far from having so many versions of Vivado installed are a drastic loss of hard-disk space (every version is approx. 20 GB) and a problem with the environment variable, which is mostly used to determine the "default" (for lack of a better word) version of Vivado you are currently using.
    2. If you follow the tutorial from the Zynq Book, you aren't doing anything wrong. Most likely there is something wrong with your Vivado installation or the board package. This is also supported by the fact that the project from kypropex generates errors.
    3. TCC0_WAVEx_OUT, SDIO_0, and USBIND are ports of the Zynq processor which may be connected to the FPGA part of the Zynq. They are set and configured inside the processing_system. Because you are a novice at this, I strongly recommend you ignore them for the time being, or else you might run into complications and lose sight of your current project.
    Try a clean reinstall, and then redo the project from zero.
    Best regards,
    Ciprian