Search the Community

Showing results for tags 'camera'.



Found 13 results

  1. I am doing image processing (not video) with a Zybo Z7, and I am writing my own image-processing algorithm in C and C++. For simulation (without a camera), I have some photos ready that can be loaded from the SD card, and I want to test the performance of my algorithm on them. I know Xilinx provides the xsdps driver for the SD card, but I don't know how to use it in my project. For the real application, the image processing will be done on photos taken regularly by a camera, but my priority for now is to test my code in simulation first. Could someone point me to a tutorial or a sample project? All the tutorials on the internet are about processing video, not still images.
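A minimal C sketch of the simulation step described above: load a raw 8-bit grayscale image with plain stdio and run a trivial algorithm pass over it. The filename, the 128x128 size, and the threshold step are all assumptions for illustration. On the Zybo's standalone BSP, the same flow maps to f_mount/f_open/f_read from the xilffs (FATFS) library, which sits on top of the xsdps SD driver; the stdio version lets you validate the algorithm on a PC first.

```c
#include <stdio.h>
#include <stddef.h>

/* Hypothetical algorithm step for the simulation run: in-place
 * binary threshold over an 8-bit grayscale buffer. */
static void threshold(unsigned char *img, size_t n)
{
    for (size_t i = 0; i < n; i++)
        img[i] = (img[i] > 127) ? 255 : 0;
}

/* Load a raw 8-bit grayscale image of known size from a file.
 * Returns 1 on success, 0 on failure. */
static int load_raw(const char *path, unsigned char *buf, size_t n)
{
    FILE *f = fopen(path, "rb");
    if (!f) return 0;
    size_t got = fread(buf, 1, n, f);
    fclose(f);
    return got == n;
}

/* Usage (filename and 128x128 size are assumptions):
 *   unsigned char img[128 * 128];
 *   if (load_raw("test_photo.raw", img, sizeof img))
 *       threshold(img, sizeof img);
 */
```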
  2. Hi everyone, I am trying to connect an OV7670 camera to my Basys 3 board and program it so that it shows the live camera feed on a monitor connected to the Basys 3's VGA port. The only thing I have been able to do is connect the camera to the pins on the Basys 3. I've looked everywhere, including here, but cannot write the other components, like the frame buffer, myself. I would appreciate any help or suggestions about where to look. P.S. I have enough knowledge of VHDL and the Basys 3 board to implement an adder that uses the 7-segment display. Thanks.
  3. Hi, two questions about connecting to my new PYNQ-Z1 over its "USB HOST" interface: (1) will any USB hub do, e.g., when a USB WiFi dongle and a USB webcam are connected to the PYNQ-Z1 concurrently? (2) what "compatible" stereo camera module (cf. attachment) would you recommend for the PYNQ-Z1 board? Thanks.
  4. Hello, I am working on a project using a Z7-20 FPGA for computer-vision processing with a CMOS camera sensor. I am planning to run convolutions on the image data, for example a Sobel filter. (To avoid large RAM requirements, I will most likely use a line buffer to store the data while it is in use.) This means I will need a camera sensor that connects directly to the FPGA and outputs one pixel at a time (over multiple wires, one group per color), with a well-defined (or configurable) clock to synchronize the data transmission. I want something cheap, rugged, and easy to wire like the OV7670 (as this is for a robot, I cannot use anything too sensitive to vibration), but with somewhat higher resolution and frame rate (I was thinking 480p or 720p at 60 fps, or 1080p at 30 fps, but lower is fine if the price is too high). It should also be a breakout board that can be easily wired (I can make a custom pin-header adapter from the camera module to the FPGA's ports if needed, but I don't want to use a very small/thin one or anything that requires SMD soldering). Any suggestions? Thanks in advance.
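The line-buffer idea above can be sanity-checked in software before moving to HDL: for a 3x3 Sobel kernel you only ever need the two previous scanlines plus the current one, never the whole frame. A C sketch (the width is an assumption, and real hardware would use BRAM-backed shift registers rather than arrays):

```c
#include <stdint.h>
#include <stdlib.h>

#define WIDTH 8   /* illustrative; a real sensor line is 640+ pixels */

/* Compute |Gx| + |Gy| for one output row from three buffered rows,
 * clamped to 8 bits.  Border columns are written as 0. */
static void sobel_row(const uint8_t *above, const uint8_t *cur,
                      const uint8_t *below, uint8_t *out)
{
    out[0] = out[WIDTH - 1] = 0;
    for (int x = 1; x < WIDTH - 1; x++) {
        int gx = -above[x-1] + above[x+1]
                 - 2*cur[x-1] + 2*cur[x+1]
                 - below[x-1] + below[x+1];
        int gy = -above[x-1] - 2*above[x] - above[x+1]
                 + below[x-1] + 2*below[x] + below[x+1];
        int mag = abs(gx) + abs(gy);
        out[x] = (uint8_t)(mag > 255 ? 255 : mag);
    }
}
```

In the streaming version, the three row pointers rotate each time a new scanline arrives, so memory use stays at three lines regardless of frame height.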
  5. Hello! I have been investigating how multiple clock domains work and how to get data from a camera module into SDRAM as fast as possible (taking a picture). I am currently using a Nexys Video and a Zed board and wonder if I could get some tips. The problems I encountered during my research are: (1) The picture I take has to be stored ASAP, meaning I have to use the MIG 7 interface for the SDRAM from HDL. This will be hard, since it requires me to understand how the MIG 7 works and to write HDL adapted to it. (2) What is the maximum frequency of a MicroBlaze? Using the MicroBlaze would be easier for me, since I can read and write the SDRAM directly through the peripheral libraries; however, its maximum clock frequency is not very high, which makes that choice a bottleneck. (3) How should the communication between the camera and the SDRAM work? My initial idea was to use a buffer (BRAM), store my picture there, and then do pipelined reads from the MicroBlaze. Alternatively, I could send bytes to the MicroBlaze directly without a buffer, but I don't know if that is a good idea, since the camera and the MicroBlaze run at different clock frequencies. Would be glad if you could help me on the way. Regards, John
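The BRAM-buffer idea in the post can be modelled in software as a fixed-size ring buffer between a fast producer (the camera pixel clock) and a slower consumer (the processor). This is a behavioural sketch only, not hardware code; in the FPGA the same role is played by a dual-clock FIFO with gray-coded pointers (e.g. the Xilinx FIFO Generator IP), and the depth here is made up.

```c
#include <stdint.h>
#include <stddef.h>

#define BUF_DEPTH 1024  /* must be a power of two */

/* Single-producer / single-consumer ring buffer: the camera side
 * pushes pixels, the processor side pops them. */
typedef struct {
    uint8_t data[BUF_DEPTH];
    size_t  wr;   /* write index (camera side)    */
    size_t  rd;   /* read index (processor side)  */
} ring_t;

static void ring_init(ring_t *r) { r->wr = r->rd = 0; }

/* Returns 0 on overflow (pixel dropped), 1 on success. */
static int ring_push(ring_t *r, uint8_t pixel)
{
    size_t next = (r->wr + 1) & (BUF_DEPTH - 1);
    if (next == r->rd) return 0;   /* full: consumer too slow */
    r->data[r->wr] = pixel;
    r->wr = next;
    return 1;
}

/* Returns 0 if empty, otherwise 1 with the pixel in *out. */
static int ring_pop(ring_t *r, uint8_t *out)
{
    if (r->rd == r->wr) return 0;
    *out = r->data[r->rd];
    r->rd = (r->rd + 1) & (BUF_DEPTH - 1);
    return 1;
}
```

The model also shows the sizing question concretely: if the consumer cannot drain a full line before the next one arrives, ring_push starts returning 0 and pixels are lost, which is exactly the bottleneck concern raised above.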
  6. Hello, is it possible to sample an external analog signal using the XADC, or do I need the AMS101 evaluation card for that? I'm using the Sony XC-HR50 analog camera (datasheet: https://pro.sony.com/bbsc/ssr/product-XCHR50/ ), which has a CCD sensor. I am very new to the Zedboard and am trying to save the picture from the camera. Thank you, kind regards. The video output is shown below:
  7. I'm trying to understand how to set up clocks and read data from a MIPI camera sensor. The sensor (Omnivision OV5647) uses the MIPI CSI-2 protocol with D-PHY for the physical layer. The stage I am trying to reach is to observe SoT (Start of Transmission) signals, after which I can start parsing the CSI-2 protocol packets. In a short MIPI write-up at http://archive.eetasia.com/www.eetasia.com/ART_8800715969_499489_TA_a466fca2_3.HTM there are two statements to take into consideration when reading the data: "The high speed payload data from the transmitter is transmitted on both the edges of the High speed differential clock (DDR clock)" and "The high speed differential clock and the data transmitted from the transmitter are 90 degrees out of phase and with the data being transmitted first." Using VHDL and Vivado, how do I create logic to successfully read data from this sensor? I have the following code written (with notes/questions), but I'm pretty sure it's wrong; it was put together based on my limited understanding and on reading various other source code that does something similar: http://pastebin.com/FGvChHis I was told that in order to derive the correct delay value I would have to sample the output clock at the rising edge: if it is not 1, decrement the delay value; if it is 1, increment the delay value. This way the delay should always stay within +/- 1 of the ideal value. I have experimented with this code and tried to see how many SoTs I can detect, but the count is very low (<10 per minute), which is probably just random chance. Really need help on this one!
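The delay-tracking rule quoted in the post can be checked in software before committing it to VHDL. In this C model the edge position and tap counts are made up (a real IDELAYE2 has 32 taps of fixed duration), and sample_clock() stands in for registering the delayed clock in fabric; the point is only that the bang-bang rule settles and then dithers within one tap of the edge.

```c
#include <stddef.h>

#define IDEAL_TAP 17   /* made-up position of the clock edge */
#define MAX_TAP   31   /* tap range of the model             */

/* Stand-in for sampling the delayed clock at the rising edge:
 * returns 1 while the sample point is still before the edge,
 * 0 once the delay has pushed it past the edge. */
static int sample_clock(int tap) { return tap < IDEAL_TAP; }

/* The rule from the post: if the sample is 1, increment the tap
 * delay; if it is not 1, decrement it.  Whatever the starting tap,
 * the loop walks to the edge and then dithers within +/-1 tap. */
static int track_delay(int tap, size_t iterations)
{
    for (size_t i = 0; i < iterations; i++) {
        if (sample_clock(tap)) {
            if (tap < MAX_TAP) tap++;
        } else {
            if (tap > 0) tap--;
        }
    }
    return tap;
}
```

If a model like this converges but the hardware still sees almost no SoTs, the problem is more likely elsewhere (DDR capture with ISERDESE2, lane alignment, or the 90-degree clock/data phase) than in the tracking rule itself.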
  8. Hello everyone, I am using a Nexys 4 DDR for a school project. I am building a system that uses a video camera to detect and track human motion. Several questions: 1. Which port should I use to connect the camera? The immediately available one on the board is the USB host connector, but is it possible to use it to connect a camera? 2. Are there any Pmods available to connect a camera module? 3. Any recommendations for a specific camera model for this project? Basically, I need to take the video input, perform some filtering to recognize the face and arms, downsample the video and store it in a memory buffer, and output the video in real time to a VGA monitor. Thanks!
  9. Hello, I want to feed video from my smartphone camera to the Genesys board and display the video on a PC, but I don't know how to interface them. How should I connect my smartphone camera to the Genesys board? I also want to do the same thing with the Nexys 4 DDR board. Any information will be helpful. Thank you.
  10. Hi JColvin, we used a Zybo for our board prototype, integrated with an OV7670 camera. Every time I run it, I get a warning like this: INFO: [Labtools 27-1434] Device xc7z010 (JTAG device index = 1) is programmed with a design that has no supported debug core(s) in it. WARNING: [Labtools 27-3123] The debug hub core was not detected at User Scan Chain 1 or 3. Resolution: 1. Make sure the clock connected to the debug hub (dbg_hub) core is a free running clock and is active OR 2. Manually launch hw_server with -e "set xsdb-user-bscan <C_USER_SCAN_CHAIN scan_chain_number>" to detect the debug hub at User Scan Chain of 2 or 4. To determine the user scan chain setting, open the implemented design and use: get_property C_USER_SCAN_CHAIN [get_debug_cores dbg_hub]. I hope you can help me fix it! Thanks and have a nice day!
  11. aaleman

    Rpi Camera - FPGA

    Hi, I have been working with the Raspberry Pi Camera to build an interface with an FPGA and move the data from the camera into the FPGA's memory. I want to connect the camera module directly to the FPGA, but I have not found any document that explains the communication protocol or any procedure for connecting and configuring the camera with the FPGA. If anyone knows of a document or has an idea that could help my project, I would be very grateful. Thanks.
  12. I plan to use the Nexys Video for image processing. I did a project before where the FPGA got the data straight from the sensor, but now I will be using one of the interfaces (HDMI, USB, etc.). There are a number of options, probably all of them valid; I'm just asking for suggestions. The most widely available cameras have a USB interface, but the Nexys does not have 'pure' USB ports (unless I'm wrong), and I'm not keen on adding an embedded processor just to convert USB into my video-format stream. The second thing I thought of was the HDMI input, but I can't find many cameras with that output, and of course I want to avoid 'pro' and expensive cameras if I can get away with a £20 one. I've seen some USB-to-HDMI cables (I assume they have a chip somewhere); will they work? Has anybody used them with cameras, rather than PC-to-screen? Third, I also saw many WiFi cameras, so another option is to add a WiFi module to the Nexys and connect to the camera that way; as there will be a number of protocols in the middle, again, has anybody tried that before? Last, I could just wire a CCD sensor to the I/O through the mezzanine connector, but then I wonder why it deserves to be called a 'video' board. Any new suggestions also welcome, thanks guys! John
  13. Jay

    Nexys Video + Camera

    Hi, I have used the Atlys + VmodCAM for prototyping camera-enhancement algorithms. This time, an upgrade to the Nexys Video plus a camera is needed due to the discontinuation of the Atlys. Are there any suggestions for the best camera module on the market? Jay