Showing results for tags 'ov7670'.
Found 4 results
Hi all, I'm working on a video streaming project with a ZedBoard and an OV7670 camera module. I found a similar project online, made by Mike Field, and I was able to make it work, but I want to build the project using custom IPs. I found this project: https://lauri.xn--vsandi-pxa.com/hdl/zynq/zybo-ov7670-to-vga.html I used the same (working) VHDL code, the same XDC file, the same setup, and followed his instructions, but for some reason, whenever I try to create custom IPs (with or without AXI), I can't make it work and the monitor says "no signal". Can anyone please help me figure out what I'm doing wrong here? Thanks, Shlomi.
Hello everyone! I'm not sure whether this forum is the right place to ask this question, but here goes. I have connected a low-cost OV7670 camera to this Digilent example: https://reference.digilentinc.com/learn/programmable-logic/tutorials/zybo-hdmi-demo/start?redirect=1 Here is what I've done. I took the OV7670 -> AXI4-Stream core from here (link below) and attached it in place of the HDMI input, changing the module to output 24-bit RGB instead of 32-bit RGBA: https://lauri.xn--vsandi-pxa.com/hdl/zynq/xilinx-video-capture.html I also took the OV7670 controller from here (link below) and attached it to the design: https://lauri.xn--vsandi-pxa.com/hdl/zynq/zybo-ov7670-to-vga.html The system works OK. What I would like to do is remove the HDMI part from this design: I just want the image to be captured by the camera and shown on the VGA screen. If I understand it right, the axi_gpio_video and v_tc_1 IP cores generate an interrupt that is essential for the stream to start. I don't understand what I have to do to remove the HDMI part of the design so that I always see the image from my OV7670. Do I have to somehow simulate the interrupts? Can I do this in C code? Thank you very much in advance for any response.
Greetings everyone, This is the first ever post of a beginner who has set out on the path to learn embedded systems. Please forgive me if I haven't followed the rules of posting. I took the embedded-systems plunge a few weeks back: I bought a strong laptop, a Zybo board, and an OV7670 camera, and installed Vivado. I read online tutorials like 'blinking LEDs' and 'HDMI-to-VGA out' (and the other ones in the Zynq Book) to get accustomed to Vivado. [Abbreviations in the text: PS = Processing System, PL = Programmable Logic] I have been visiting a blog lately and have found it quite helpful. A couple of weeks back I started the project described on that blog (http://lauri.võsandi.com/hdl/zynq/xilinx-vdma.html). What I'm doing is a slightly simpler version of it, as I omitted a part of the design that I thought wasn't required; I'll say more about that later. It's pretty much a mixture with another project involving a test pattern generator (http://lauri.võsandi.com/hdl/zynq/xilinx-video-capture.html). My aim in this project is to take the stream from the OV7670 camera through the PL (AXI VDMA IPs) to the PS, and then view the stream inside some window within Xillinux (Linux). I feel doing so will ensure a bit of learning in both the PL and the PS. Following is my progress, along with the doubts I have so far: 1- I have pasted a picture of my block design (called VDMA_Trial). After some trial and error, I managed to get around all the initial errors and successfully generated the bitstream, which was a big relief. If you compare my block design with the one on the link I pasted above, you'll notice mine has fewer IP blocks. I did not need the RGB, HSYNC, and VSYNC outputs from the PL, so I omitted that part and focused only on taking the camera stream to the PS. Do you think this makes sense? Or do I need the complete set of IP blocks even if I don't wish to see the stream on HDMI (or VGA)? 2- I'm a normal computer user who has used Windows PCs for most of my life.
So I don't have much experience with CLI-based Linux. After reading online resources, I booted a Xillinux image (downloaded from xillybus.com) on the Zybo through an SD card, since I wish to watch the stream from the camera inside the Xillinux GUI. From a bit of reading I learned about something called V4L2, which I'm still trying to figure out how to install on Xillinux. Now, assuming the design in point 1 is fine, can I simply copy the bitstream of the project to the SD card along with the Xillinux boot files (there are some other files on the SD card for Xillinux: devicetree, uImage, and xillydemo.bit)? Will the hardware design in the PL activate and start writing the video stream to PS memory (DDR) when I boot up the Zybo with this SD card? If not, what steps must I follow? Do I need to launch the SDK to write some code to tell the PS what to do? But I will already have the Xillinux OS running on the PS; am I correct in saying I'll have to write and compile some code in Xillinux to tell the PS to fetch the stream being written to PS memory by the VDMA (from the PL)? 3- While reading different things, I came across this code related to V4L2 and the OV7670 (http://www.cs.fsu.edu/~baker/devices/lxr/source/184.108.40.206/linux/drivers/media/video/ov7670.c). To a layman like me, it looked like code to set up the OV7670 camera using the V4L2 driver. Will I need to compile and run this code in Xillinux to watch the stream? I tried, but it just did not compile. My apologies for making this long and probably silly. I've just started traversing a steep learning curve, and it will take time to learn. I really look forward to seeing your enlightening responses. If there's any more info you need, please let me know (in easy English). Regards, Haris.
Hi JColvin, We used a Zybo for our board prototype, integrated with an OV7670 camera. Every time I run it, I get a warning like this: INFO: [Labtools 27-1434] Device xc7z010 (JTAG device index = 1) is programmed with a design that has no supported debug core(s) in it. WARNING: [Labtools 27-3123] The debug hub core was not detected at User Scan Chain 1 or 3. Resolution: 1. Make sure the clock connected to the debug hub (dbg_hub) core is a free running clock and is active OR 2. Manually launch hw_server with -e "set xsdb-user-bscan <C_USER_SCAN_CHAIN scan_chain_number>" to detect the debug hub at User Scan Chain of 2 or 4. To determine the user scan chain setting, open the implemented design and use: get_property C_USER_SCAN_CHAIN [get_debug_cores dbg_hub]. I hope you can help me fix it! Thanks and have a nice day!