Showing results for tags 'uvc'.
Hi all, I'm a beginner in FPGA (Zynq-7000). I want to implement a project that takes images from two cameras, one with a USB (UVC) interface and one with a CSI-2 interface. One thing to note is that I am not using both cameras simultaneously, only one at a time (switching over whenever required). With the first USB camera, I want to do some image processing functions like filtering and CLAHE (contrast-limited adaptive histogram equalization) on the captured image. The processed image is then displayed on an HDMI or RGB interface mini projector (DLP2000). Here I indicated both HDMI
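Before committing CLAHE to PL logic, it can help to prototype the algorithm on the host. Below is a minimal sketch of CLAHE-style processing in NumPy: per-tile histogram equalization with a clip limit, but without the bilinear blending between tiles that full CLAHE performs (so tile seams may be visible). The function name and parameters are my own illustration, not from any poster's code.

```python
import numpy as np

def clahe_like(img, tiles=(4, 4), clip_limit=0.01, nbins=256):
    """Simplified CLAHE: clipped histogram equalization per tile.

    Full CLAHE also interpolates between neighboring tile mappings;
    this sketch applies each tile's mapping directly.
    """
    img = np.asarray(img, dtype=np.uint8)
    out = np.empty_like(img)
    h, w = img.shape
    th, tw = h // tiles[0], w // tiles[1]
    for i in range(tiles[0]):
        for j in range(tiles[1]):
            y0, x0 = i * th, j * tw
            y1 = h if i == tiles[0] - 1 else y0 + th
            x1 = w if j == tiles[1] - 1 else x0 + tw
            tile = img[y0:y1, x0:x1]
            hist, _ = np.histogram(tile, bins=nbins, range=(0, 256))
            hist = hist.astype(np.float64) / tile.size
            # Clip the histogram and redistribute the excess uniformly;
            # this is what limits local contrast amplification.
            excess = np.maximum(hist - clip_limit, 0).sum()
            hist = np.minimum(hist, clip_limit) + excess / nbins
            cdf = np.cumsum(hist)
            lut = np.round(cdf * 255).astype(np.uint8)
            out[y0:y1, x0:x1] = lut[tile]
    return out
```

Once the behavior matches expectations on test images, the per-tile histogram/LUT structure maps fairly naturally onto an HLS implementation.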
Hello, everyone! I am working on a project which has an action cam connected to the HDMI input and filtered with the HLS Video Library, and then I want to use the Zybo as a generic webcam for the PC host. What I want to ask is: how do I configure my UVC Linux kernel so that it receives the video output from my PL, which I connect to my PS through an AXI memory interconnect, and is then ready to use as a USB webcam? Also, can I run an OpenCV app generated from Vivado SDSoC and output the video to the UVC? Any help and hints appreciated, thanks!
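On recent kernels the usual route is the configfs USB gadget interface: you create a gadget directory under /sys/kernel/config/usb_gadget, add a uvc function, link it into a configuration, and finally bind a UDC. Below is a sketch of just the directory layout, with the configfs root as a parameter so it can be tried against any directory; the VID/PID are placeholders, and a real gadget also needs the UVC streaming formats configured and the UDC name written before it enumerates.

```python
import os

def build_uvc_gadget(configfs_root, name="g1",
                     vid="0x1d6b", pid="0x0104"):
    """Create the configfs directory layout for a minimal UVC gadget.

    Against real configfs this must run as root; against a plain
    directory it just mirrors the layout for inspection.
    """
    g = os.path.join(configfs_root, "usb_gadget", name)
    os.makedirs(g, exist_ok=True)
    # Device descriptor identifiers (placeholder values)
    for fname, value in (("idVendor", vid), ("idProduct", pid)):
        with open(os.path.join(g, fname), "w") as f:
            f.write(value + "\n")
    # English (0x409) string descriptors
    strings = os.path.join(g, "strings", "0x409")
    os.makedirs(strings, exist_ok=True)
    with open(os.path.join(strings, "product"), "w") as f:
        f.write("PL video gadget\n")
    # One UVC function, linked into one configuration
    func = os.path.join(g, "functions", "uvc.0")
    cfg = os.path.join(g, "configs", "c.1")
    os.makedirs(func, exist_ok=True)
    os.makedirs(cfg, exist_ok=True)
    link = os.path.join(cfg, "uvc.0")
    if not os.path.lexists(link):
        os.symlink(func, link)
    return g
```

After the gadget is bound, the device side exposes a /dev/videoN output node; a userspace program (e.g. the uvc-gadget test app) then feeds it the frames coming from the PL.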
Hi all, I'm new to this world and I hope someone can help me with this issue. I'm working on a project at the university where I would like to use the Zedboard as a UVC device to stream video from the FPGA to the host. I'm able to build, configure, and run the kernel from SD, but I'm having some problems connecting the Zedboard in peripheral mode and getting it detected as a UVC gadget. In particular, I've configured the kernel to act in peripheral mode following these instructions: http://www.wiki.xilinx.com/Zynq+Linux+USB+Device+Driver To let the Zedboard act as a gadget I started from this gui
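When a board refuses to enumerate as a gadget, a first sanity check on the device is whether the kernel actually exposes a USB device controller (UDC): without an entry under /sys/class/udc, no gadget driver can bind, which usually points at the devicetree OTG/peripheral-mode setting or missing kernel config. A small sketch of that check, with the sysfs path as a parameter so it can be exercised anywhere (the real path on the board is /sys/class/udc):

```python
import os

def list_udcs(sysfs_udc_dir="/sys/class/udc"):
    """Return the names of USB device controllers the kernel exposes."""
    if not os.path.isdir(sysfs_udc_dir):
        return []  # no UDC class at all: gadget support missing
    return sorted(os.listdir(sysfs_udc_dir))

def check_udc(sysfs_udc_dir="/sys/class/udc"):
    udcs = list_udcs(sysfs_udc_dir)
    if not udcs:
        print("no UDC found: check OTG/peripheral mode in the "
              "devicetree and the kernel gadget config")
    else:
        print("available UDCs:", ", ".join(udcs))
    return udcs
```

If a UDC is listed but the host still sees nothing, the next things to check are whether the gadget driver is loaded and bound to that UDC, and the host-side dmesg output while plugging in.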