takieddine

Interfacing VmodCam controller with BRAM

Question

Hi Jon,

I am working with the ISE demo project that configures the VmodCAM video feeds at a resolution of 1600x900 over the HDMI port. I made the following modifications:

- First of all, I removed the DDR2 framebuffer controller and replaced it with a BRAM controller.

- I am trying to read one frame from camera A and send it to a PC over UART. I implemented an RGB-to-grayscale converter to cut the memory usage by a factor of 3. Each time I store 1600 bytes in the BRAM and then send them through the UART transmit interface. However, there is a big difference between the data transfer rate of the VHDCI connector used to attach the VmodCAM to the Atlys board and the maximum data rate of the UART, which is 115200 bps. My questions: which camera parameter should I control to ensure correct reading of the data? Can I use the video capture mode to take a single frame of the video feed? The manual mentions a FIFO in the camera; how many pixels can this FIFO store? And finally, can I take the video timing controller out of the design?
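To put numbers on the rate mismatch (assuming one grayscale byte per pixel and standard 8N1 UART framing, i.e. 10 bits on the wire per byte):

```latex
\text{frame} = 1600 \times 900 = 1\,440\,000\ \text{bytes}, \qquad
\text{UART} = \frac{115\,200\ \text{bits/s}}{10\ \text{bits/byte}} = 11\,520\ \text{bytes/s},
\qquad \frac{1\,440\,000}{11\,520} \approx 125\ \text{s/frame}
```

So no camera-side setting can make the live feed keep up with the UART; whatever the depth of the sensor's internal FIFO, it cannot bridge a gap of this size, which is why I am freezing one frame into local memory and draining it out slowly.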

Thank you very much,

TE SAIDI

20 answers to this question

Hi TE SAIDI,

Unfortunately, we were only able to get the VmodCAM to work in the 2 resolutions with the 2 different designs. I'm glad to hear you were able to get the BRAM to work. You can get more information about the FIFO, the data transfer, and potentially whether you'd still need the video timing controller from the partial reference manual of the MT9D112 linked here. For the full MT9D112 datasheet, I would suggest contacting Micron.

thank you,

Jon 


Hi Jon,

Are you saying that the VmodCAM cannot work with a resolution between 1600x900 and 640x480? My problem is that I couldn't display the video feeds on an HD TV screen after I changed the resolution from 1600x900 to 640x480, despite the fact that the VGA demo project works just fine. I would therefore like to understand the difference between context A (preview mode) and context B (capture mode). I am waiting for your response.

Thank you,

Unfortunately, I didn't get the BRAM to work.


@takieddine,

That would depend upon what's not working.  :P  What happens when you try to read data from the camera controller?  Oh, and are you clocking at the same speed?  Often the DDR controller produces the clock for the rest of the design.  (I know it does this for my Arty design ...)  If you remove the DDR controller, you'll need to make certain that you are using the same clock as before ... or at least so I would assume.

Dan


Thank you Dan,

Actually, I think that the system controller (syscon) is the one that provides the camera controller with the PCLK, so what is really causing the problem?

 


@takieddine,

But you still haven't said what's not working.  Can you tell us what part of your design is failing?  It's really hard to evaluate what's wrong with your FPGA hardware, configuration, and design from a desk on a different continent when ... you haven't told us at what part of the process things are failing.

Just for humor, here's an example of a customer support phone call regarding someone who called in to say their software program wasn't working.  :D  Notice how many questions needed to be asked and answered before the tech support person was finally able to diagnose the problem.

The same sort of thing is true here, albeit without the humor.  :(  Can you tell us more about what is and isn't working?  Can you load a configuration to the board?  Can you talk to the board? Can you tell if the board is properly reading from the camera?  Can you tell if it is properly storing the image into memory?  Where is your system failing?

Thanks,

Dan


Thank you for being responsive,

For the moment I don't have a full-HD screen to test the project's resolution (1600x900), so I designed an image acquisition system that uses BRAM for pixel storage after converting RGB to grayscale. I tried to test the output of camera controller A (the input of port A of the DDR2 controller) after converting it to grayscale. With the DDR2 controller in the design, I can clearly see the LEDs blinking, indicating that consecutive pixels are being read from the camera; however, after excluding the DDR2 controller (fbctl) from the design, the reading stops. It should be noted that I didn't change anything in the clock domains; I am using the original design's clocking scheme for the camera controller. Can you please at least guide me on how to troubleshoot this issue? I hope that I gave you enough details about the problem.

Thank you for your help in advance,

TE SAIDI


Well, let's start with the clock.  Create a counter, using the clock that is synchronous with the pixels coming off the camera.  Set your LEDs to some number off that counter.  Is that clock toggling?
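A minimal sketch of that test in Verilog, assuming the pixel clock arrives on a signal I'll call pclk (the name and counter width are placeholders for whatever your design uses):

```verilog
// Heartbeat test: if the LED blinks, the pixel clock is toggling.
// 'pclk' and the counter width are assumptions; adjust to your design.
module pclk_heartbeat (
    input  wire pclk,        // clock recovered from the camera (assumed name)
    output wire led          // wire this to one of the Atlys LEDs
);
    reg [24:0] counter = 0;  // wide enough to divide ~25 MHz down to ~1 Hz

    always @(posedge pclk)
        counter <= counter + 1'b1;

    assign led = counter[24]; // MSB toggles slowly enough to see by eye
endmodule
```

If the LED never toggles, the clock itself is dead, and nothing downstream of it can work.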

Dan


Hi Dan,

New diagnosis: I implemented an 8-bit counter that increments with CamApclk. With the video timing controller excluded, everything works just fine, but after excluding the DVI transmitter module from the design the reading just stopped. I would like to understand the relationship between CamApclk and the DVI transmitter; I think that this is the key to troubleshooting this issue.

As always, thank you in advance for your help,

Taki


At this point, I like your approach: chase down the clocks and the clock differences until you find out the "cause" which has changed.

Be aware: you're not going to be able to do this like a black box.  You're going to need to get inside and understand what's going on.  In other words, don't be timid.  Get in there, and let us know what you find.

Dan


@takieddine,

You were doing pretty well there with the LEDs.   :D

I can understand, though, that LEDs might get old real fast.  Still --- one LED can give you a lot of diagnostic information.  A flashing LED can indicate the presence of a clock.  An LED that blinks 2x and is then off for a while conveys something different from an LED that blinks 4x and is then off for a while.
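Purely as an illustration of that blink-code idea, a sketch in Verilog (every name, width, and constant here is made up, not taken from the demo project):

```verilog
// Blink a status code: 'BLINKS' flashes, then a long pause, repeating.
// All names and widths are illustrative, not from the VmodCAM design.
module blink_code #(
    parameter [3:0] BLINKS = 4'd2   // number of flashes per burst (keep small)
) (
    input  wire clk,
    output reg  led = 1'b0
);
    reg [23:0] tick  = 0;   // slow time base derived from clk
    reg [3:0]  phase = 0;   // which on/off/pause slot we are in (12 total)

    always @(posedge clk) begin
        tick <= tick + 1'b1;
        if (&tick) begin                    // time base rolled over: next slot
            if (phase < {BLINKS, 1'b0})     // first 2*BLINKS slots: toggle LED
                led <= ~led;
            else
                led <= 1'b0;                // remaining slots: LED off (pause)
            phase <= (phase == 4'd11) ? 4'd0 : phase + 1'b1;
        end
    end
endmodule
```

Two instances with different BLINKS values, watched side by side, can tell you which of two conditions fired without any other test equipment.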

Should you wish to move beyond LED debugging, there are a lot of diagnostic tools out there.  Your choice may depend a lot on your personal preference.  Lots of folks use the Xilinx simulation tools, or even Xilinx ChipScope.  I'd recommend these tools to you, although I've never used them myself--not even on my very first Verilog project.  I myself use Verilator a lot--although it is difficult to debug hardware using Verilator.

Whatever tool you use, you'll need to get inside the design, understand what is going on within it, and then figure out where your problem is coming from.  Comparing the working against the non-working design will be a big help if you can do it.

Dan


Hi Dan,

Sorry for not replying sooner. In my debugging I figured out that the signal "PLL_lock" is shared between the syscon and fbctl components; my guess is that this is the signal causing the issue discussed previously. Any suggestions?

TE SAIDI


@takieddine,

If you have a PLL_lock signal being driven by two components, then I would immediately check to see if I had other signals with multiple drivers.

My suggestion would be to find the PLLs in your design and merge them into one PLL if possible.  The lock signal should come out of this PLL.  If you can't place them into one PLL, then build as many as you need--but make sure you pay attention to which wires are PLL outputs and which are inputs.

I would also suggest, as a matter of design principle, that all of your PLLs and indeed all of your Xilinx-specific logic should be either in your top-level design file, or in a special Xilinx-specific file near that top-level file.  This practice helps to make your code more portable between FPGA architectures, since you have all of your device-specific logic in one place.  (Yes, I know, Xilinx IP doesn't do this--my experience being with the MIG IP ... but I would still recommend it as a matter of principle and good engineering practice.)  Even better, once you have done this, you'll know where your clocks are coming from and where they are generated.
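As a sketch of what "one PLL at the top level, lock signal fanned out from there" might look like, assuming the Atlys's Spartan-6 and its PLL_BASE primitive; the periods and multipliers below are placeholders, not values from the demo project:

```verilog
// One PLL, instantiated at the top level; everything downstream takes
// its clock and its lock status from here.  Parameter values are placeholders.
module clkgen (
    input  wire clk_100mhz,   // board oscillator (assumed 100 MHz)
    output wire clk_pixel,    // example derived clock
    output wire pll_locked    // single source of truth for "clocks are ready"
);
    wire clk_fb, clk_pixel_unbuf;

    PLL_BASE #(
        .CLKIN_PERIOD  (10.0),  // 100 MHz input (placeholder)
        .CLKFBOUT_MULT (8),     // VCO at 800 MHz (placeholder)
        .CLKOUT0_DIVIDE(32)     // 25 MHz example pixel clock (placeholder)
    ) pll_i (
        .CLKIN   (clk_100mhz),
        .CLKFBIN (clk_fb),
        .CLKFBOUT(clk_fb),      // feedback loop closed locally
        .CLKOUT0 (clk_pixel_unbuf),
        .CLKOUT1 (), .CLKOUT2 (), .CLKOUT3 (), .CLKOUT4 (), .CLKOUT5 (),
        .LOCKED  (pll_locked),
        .RST     (1'b0)
    );

    BUFG bufg_pixel (.I(clk_pixel_unbuf), .O(clk_pixel));
endmodule
```

With this structure there is exactly one LOCKED wire, driven by one primitive, and any module that needs to wait for clocks simply takes pll_locked as an input rather than driving its own copy.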

Dan


Thank you very much,

That's exactly what I was intending to do: design my own clock generator. I will proceed with the debugging and see the results.

Taki Eddine

 

 

