Search the Community

Showing results for tags 'video'.



Found 17 results

  1. I have been following this tutorial and have had no luck. I am uncertain about how to configure the QSPI IP, because the tutorial starts by assuming that I have done that part successfully, so I am not even sure if this is the root of my problem. I have tried these two configurations of the IP, compiled them, and exported them to the SDK, and neither of them solved the problem. I made sure JP4 is in the QSPI position. On step 3.1 of the tutorial, I can see that the FPGA is programmed successfully and I see the following output (since I chose not to comment out the VERBOSE define, as suggested in the tutorial). While programming the flash on step 4, I notice that my FPGA design is erased from the board (LEDs I had assigned to outputs turn off). Is that supposed to happen? At the end of the tutorial I get no "hello world" output on the terminal after resetting the board. The FPGA does seem to program from the flash successfully, so that portion works, but I can't get the C code to run from the flash. Here is the sdk_console_output.txt so you can see the steps I took in the SDK to program the board.
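For reference, a Zynq QSPI boot image like the one the tutorial above programs is normally assembled by bootgen from a .bif file of the following shape. The file names here are placeholders, not the project's actual files; if the application ELF is missing from the image, the bitstream can still load from flash while the C code never runs:

```
the_ROM_image:
{
    [bootloader] fsbl.elf
    system_wrapper.bit
    hello_world.elf
}
```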
  2. Serial digital interface (SDI) is a digital video interface used by most professional video cameras. It uses a BNC connector and operates at speeds of roughly 3 Gb/s, depending on the standard. A more detailed specification can be found on Wikipedia. Most SDI adapters for FPGAs use FMC connectors, like this one. There is no FMC connector on the Zybo Z7 board, but it does have multiple PMOD ports. Could they serve as a substitute? If some simple extra circuitry is required, we could perhaps build it on top of a generic PMOD adapter like this. My main doubt is whether such an interface would be fast enough; I do not know at what frequency a Zybo PMOD port and the custom circuitry attached to it could operate.
  3. I want to work on a video project. Which board will be better, the Zybo or the PYNQ? I have read that the PYNQ has unbuffered HDMI. Suggestions will be highly appreciated. Thanks
  4. I've been trying for a few days to understand how to use AXI-Stream IPs for video processing and display via VGA, but I can't seem to get any circuit to work. Here is a test circuit I created: I have a Video Test Pattern Generator connected to an AXI4-Stream to Video Out IP, driven by a Video Timing Controller IP. Here is a 100 ms simulation of the circuit: Vsync does not get generated, so clearly something is wrong with this circuit. All the examples I have found online include a MicroBlaze or Zynq processor in the design connected to the VTPG; could this be the reason my circuit is not working? Is it possible to do what I am trying without a processor? What exactly is the role of the processor in these circuits? My development board is a Nexys 4 DDR. I've gotten VGA to display in the past using IPs I created myself, but they weren't AXI compliant. I have attached the .tcl file to build my block design. Any guidance would be appreciated! design_1.tcl
  5. Hi, I have been looking on the Digilent site for a while now, but I haven't found the maximum operating temperature of the Nexys Video board. I use the board to test another in-house developed board, but I need to run a thermal test at 60°C. Is this possible with the Nexys Video? Kind regards, Oceley
  6. Dear experts, I am new to this field and have very little experience with the Zybo board. I have implemented the zybo_hdmi_in_demo, which is required for my master's thesis. The output video stream on the VGA monitor shows only a cropped part of my input video source. What should I do? Also, can I use an HDMI source other than my PC, and what is the preferred input HDMI video resolution? Any support or suggestions would be highly appreciated.
  7. I'm using the Nexys Video board and I'd like to use the FIFO capability of the FTDI chip (IC13 connected to J12) to get data from the FPGA quickly and easily while keeping the JTAG lines high-impedance. I would like to use the FT2232H FIFO port while using our own JTAG (J17). The JTAG lines on that chip are high impedance until the USB cable is plugged in and I'd like to keep them high impedance while using the USB port. If you don't know, can you send me the schematic page for IC13/J12?
  8. Dear experts, I have implemented the zybo_hdmi_in demo and it's working perfectly. Now I want to show a binary mask over a region of interest at the VGA output. My question: is it possible to do this only by modifying the video_demo.c file? Any coding-related ideas would be helpful. Thanks in advance - Shuvo
  9. Hello, I'm trying to understand the HDMI capabilities of the Nexys Video Artix-7. I don't own a board yet, so these queries are based on reading spec sheets; please excuse any errors or omissions on my part. The FPGA on the Nexys Video is the XC7A200T-1SBG484C, which supports 4 GTP transceivers at up to 3.75 Gbit/s [1]. However, based on my best interpretation of the Nexys Video reference manual [2], the HDMI ports aren't using the GTP transceivers; the GTPs are used for the DisplayPort and FMC connectors. Given that the HDMI ports aren't using the GTPs, what is the maximum data rate the FPGA can support on them? The HDMI input has an Analog Devices AD8195 buffer, which supports a 2.25 Gbps data rate [3]. The HDMI output has a TI TMDS141 buffer, which also supports a 2.25 Gbps data rate [4]. This seems to limit the Nexys to 720p60 or 1080p30, whatever the FPGA itself may be capable of. Though if these rates are per TMDS channel, then that's plenty for 1080p60. However, the Digilent HDMI demo shows a video format of 1080p60 [5]. In summary, can someone clarify what video formats and data rates the Nexys Video is capable of on HDMI input and output? Thanks in advance, Will For reference, the data rate of some common HDMI formats: 720p60 - 1.45 Gbit/s (HDMI 1.0+); 1080p30 - 1.58 Gbit/s (HDMI 1.0+); 1080p60 - 3.20 Gbit/s (HDMI 1.0+); 2160p30 - 6.18 Gbit/s (HDMI 1.4+); 2160p60 - 12.54 Gbit/s (HDMI 2.0+). [1] https://www.xilinx.com/support/documentation/data_sheets/ds181_Artix_7_Data_Sheet.pdf (page 50) [2] https://reference.digilentinc.com/_media/reference/programmable-logic/nexys-video/nexysvideo_rm.pdf [3] http://www.analog.com/en/products/audio-video/hdmidvi-transmitters/ad8195.html [4] http://www.ti.com/product/tmds141 [5] https://reference.digilentinc.com/learn/programmable-logic/tutorials/nexys-video-hdmi-demo/start
  10. Hello everyone! I'm not sure whether this forum is the right place to ask this question, but still. I have connected a low-cost OV7670 camera to this Digilent example: https://reference.digilentinc.com/learn/programmable-logic/tutorials/zybo-hdmi-demo/start?redirect=1 Here is what I've done. I took the OV7670 -> AXI4-Stream core from the link below and attached it in place of the HDMI input, changing the module to have not a 32-bit RGBA output but a 24-bit RGB one: https://lauri.xn--vsandi-pxa.com/hdl/zynq/xilinx-video-capture.html I also took the OV7670 controller from the link below and attached it to the design: https://lauri.xn--vsandi-pxa.com/hdl/zynq/zybo-ov7670-to-vga.html The system works OK. What I would like to do is remove the HDMI part from this design: I just want the image to be captured by the camera and shown on the VGA screen. If I understand it right, the axi_gpio_video and v_tc_1 IP cores send some interrupt essential for the stream to start. I have no understanding of what I have to do to remove the HDMI part from the design so that I always see the image from my OV7670. Do I have to somehow simulate the interrupts? Can I do this in C code? Thank you very much in advance for your response.
  11. Hi, I am new to this area. My current project requires knowing how to use the HDMI and VGA ports of the Zybo board. As a starting point I got a sample demo project (https://github.com/Digilent/Zybo-hdmi-in) by Digilent, but that was done in Vivado 2016.4. I am working with Vivado 2017.2 on a Windows PC. I successfully converted the project to the current version and was able to generate a bitstream. The problem is that when I launch the SDK, it gives me errors. Can anyone help me? Or can anyone give me a simple project from which I can get an idea of how to use the HDMI and VGA ports? Thanks.
  12. Hello! I'm a newbie with Xilinx, and I have one more problem, with MicroBlaze and DDR3. I want my MicroBlaze processor to access DDR3 memory without processor caches. I implemented a design and wrote some very simple code:

     #include "xparameters.h"

     int main() {
         int a = 0;
         for (;;) {
             a++;
         }
         return 0;
     }

     But I can't start debugging. When I start a debug session, it doesn't stop at main (though the thread is still running). When I pause it, I see in the disassembler that the processor is stuck at _hw_exception_handler. Attached you can see the system, the linker mapping, and the problem. Please help me.
  13. Hello everyone! I have finally finished my degree final project and obtained a good mark (9.4/10). I have been working on this project for more or less a year. Summarizing the project: it is based on the HDMI IN SDSoC project from the Digilent GitHub (you can read about this in the readme in my GitHub). I have extended it with a JPEG encoder, new image filters, and a plugin architecture for developing new filters without having to modify the internal code. At this link, you can get all the files and read about my DFP. I'm sorry, but the report is in Spanish. The system has good performance with the plugin architecture (38.82 FPS with 1920x1080 images). If you want to do something cool with this project, the next step would be to run a Linux OS on it. I have learned a lot on this forum! Thanks! Regards, Raúl.
  14. Hello everyone, I am using a Nexys 4 DDR for a school project. I am building a system that uses a video camera to detect and track human motion. Several questions: 1. Which port should I use to connect the camera? The most immediate one available on the board is the USB host connector, but is it possible to use it to connect a camera? 2. Are there any PMODs available to connect a camera module? 3. Any recommendations for a specific camera model for this project? Basically, I need to take the video input, perform some filtering to recognize a face and arms, downsample the video and store it in a memory buffer, and output the video in real time to a VGA monitor. Thanks!
  15. Greetings everyone, This is the first ever post of a beginner who has set out on the path to learn embedded systems. Please forgive me if I haven't followed the rules of posting. I took the embedded systems plunge a few weeks back: bought a strong laptop, a Zybo board, and an OV7670 camera, and installed Vivado. I read online tutorials like 'blinking LEDs' and 'HDMI-to-VGA out' (and the other ones in the Zynq Book) to get accustomed to Vivado. [Abbreviations in the text: PS = Processing System, PL = Programmable Logic] I have been visiting a blog lately and have found it quite helpful. A couple of weeks back I started this project mentioned on that blog (http://lauri.võsandi.com/hdl/zynq/xilinx-vdma.html). What I'm doing is a slightly simpler version of it, as I omitted some parts of the design which I thought weren't required; I'll say more about that later. It is pretty much a mixture with another project involving a test pattern generator (http://lauri.võsandi.com/hdl/zynq/xilinx-video-capture.html). My aim in this project is to get the stream from the OV7670 camera, take it through the PL (AXI VDMA IPs) to the PS, and then view the stream inside some window within Xillinux (Linux). I feel doing so will ensure a bit of learning on both the PL and PS sides. Following is my progress, along with the doubts I have so far:
      1. I have pasted a picture of my block design (called VDMA_Trial). After some trial and error, I managed to get around all the initial errors and successfully generated the bitstream, which was a big relief. If you compare my block design with the one on the link I pasted above, you'll notice mine has fewer IP blocks. I did not need the RGB, HSYNC and VSYNC outputs from the PL, so I omitted that part and focused only on taking the camera stream to the PS. Do you think this makes sense? Or do I need the complete set of IP blocks even if I don't wish to see the stream on HDMI (or VGA)?
      2. I'm a normal computer user who has used Windows PCs for most of my life, so I don't have much idea of CLI-based Linux. After reading some online resources, I booted a Xillinux image (downloaded from xillybus.com) on the Zybo through an SD card, since I wish to watch the stream from the camera inside the Xillinux GUI. From a bit of reading I got to know about something called V4L2, which I'm still trying to figure out how to install on Xillinux. Now, assuming the design in point 1 is fine, can I simply copy the bitstream of the project onto the SD card along with the Xillinux boot files (there are some other files on the SD card too for Xillinux: devicetree, uImage and xillydemo.bit)? Will the hardware design in the PL activate and start writing the video stream to PS memory (DDR) when I boot up the Zybo with this SD card? If not, what steps must I follow? Do I need to launch the SDK to write some code to tell the PS what to do? But I will already have the Xillinux OS running on the PS; am I correct in saying I'll have to write and compile some code in Xillinux to tell the PS to fetch the stream which is being written to PS memory by the VDMA (from the PL)?
      3. While reading different things, I came across this code related to V4L2 and the OV7670 (http://www.cs.fsu.edu/~baker/devices/lxr/source/2.6.31.13/linux/drivers/media/video/ov7670.c). To a layman like me, it looked like the code to set up the OV7670 camera using the V4L2 driver. Will I need to compile and run this code in Xillinux to watch the stream? I tried, but it just did not compile.
      My apologies for making this long and probably silly. I have just started traversing a steep learning curve, and it will take time to learn. I really look forward to your enlightening responses. If there's any more info you need, please let me know (in easy English). Regards, Haris.
  16. My goal is to send data from the FPGA to the PC using DSPI. This data, in this case RGB data, gets transformed into video using Unity. There are, however, several problems with sending data from the FPGA to the PC. Currently the PC receives data from the FPGA as follows.
      From the PC side:
      1. Open the device (NexysVideo) using the function DmgrOpen(&hif, "NexysVideo")
      2. Enable DSPI transfer using the function DspiEnable(hif)
      3. Set the SPI mode using the function DspiSetSpiMode(hif, idMod, fShRight)
      4. Set the master (PC) clock frequency using the function DspiSetSpeed(hif, frqReq, &pfrqSet)
      5. Set the slave select (SS) to the logic 0 state using the function DspiSetSelect(hif, fSel)
      6. Receive bytes from the slave (FPGA) using the function DspiGet(hif, fFalse, fFalse, bFill, rgbRcv, cbRcv, fFalse) in a while loop
      Several things are not working properly on the PC side. Looking at point 3, the SPI mode I would like to use is mode 1: shift data on the rising edge and sample data on the falling edge. However, I can't use mode 1, and I am currently using mode 0 (sample data on the rising edge and shift data on the falling edge). Another issue is point 4, setting the SCK clock frequency. I noticed that the maximum clock frequency is 30 MHz. When measuring the clock frequency at the FPGA, however, it is about 12 kHz, which is much lower than I would expect! I checked the clock frequency on the PC side using DspiGetSpeed(...) and got 30 MHz, but clearly the FPGA receives a lower frequency.
      From the FPGA side:
      - A shift register is connected to the FT2232H IC; I used the diagram on page 14 of http://www.europractice.stfc.ac.uk/vendors/nexys_video_rm.pdf for the connection layout
      - The clock from the PC to the FT2232H is connected to the shift register
      - If the slave select (SS) signal is a logic 0, then every clock cycle a bit is shifted into the FT2232H
      The ultimate goal is to connect a camera, which outputs raw RGB data, to the FPGA and feed it to the PC. To test whether transferring something simple works, I continuously send the bitstream "111111110000000000000000", which should be translated to 0xFF0000, or the colour red. This, however, does not seem to work; it seems as if the PC and FPGA are not in sync. The PC receives byte values such as (0, 127, 128). Is there something I missed?
  17. I plan to use the Nexys Video for image processing. I did a project before where the FPGA got the data straight from the sensor, but now I will be using one of the interfaces (HDMI, USB, etc.). There are a number of options, probably any of them valid; I'm just asking for suggestions. The most widely available cameras have a USB interface, but the Nexys does not have a 'pure' USB host, unless I'm wrong, and I'm not that willing to add an embedded processor just to interface USB to my video format stream. The second thing I thought of was to use the HDMI input, but I can't find many cameras with that output... Of course I want to avoid 'pro' and expensive cameras if I can get away with a £20 one. I've seen some USB-to-HDMI cables (I assume they have a chip somewhere); will they work? Has anybody used them for cameras, rather than for PC-to-screen? Third, I also saw many WiFi cameras, so another option is to add a WiFi module to the Nexys and connect to the camera that way. As there would be a number of protocols in the middle, again, has anybody tried that before? Lastly, I could just wire the CCD sensor to the I/O through the mezzanine connector, but then I wonder why the board deserves to be called a 'video' board. Any new suggestions also welcome, thanks guys! John