
Nexys Video DPTI Virtual Screen


robfinch

Question

I was just wondering if it's possible to create a virtual screen displayed on the PC monitor through DPTI? It seems like it should be, but before I do a lot of work I was wondering about any drawbacks or gotchas. Can DPTI be put into a fast loop for screen refresh? I was thinking primarily of a text display: 8 kB * 30 Hz = 240 kB/s.


11 answers to this question

Recommended Posts

@robfinch

Hey this is an interesting question. It sounds so simple on the face of it but has so many unknowns when extrapolated to a final implementation.

First of all, it's not clear to me what you are trying to do. Do you want to create your own Putty application but using a USB data stream? (I think such an application already exists.) Let's put aside questions about how you want to present your data and start with overall data transfer rates.

Let's say we want to transfer 8 KB of 8-bit bytes every 33.333 ms. This amounts to 8192 x 30 x 8 = 1966080 bps. I've found that using a UART at a 921600 baud rate with Putty or Python on a PC is pretty robust. Of course, with a UART we usually need 10 bits per character to provide start and stop bits, so we'd need to bump that up to a 3686400 baud rate if we want to constrain ourselves to standard baud rates in order to handle the desired data rate. Even so, a UART seems to be a reasonable possibility. The UART as a COM port is pretty easy to work with, though Microsoft doesn't make creating a serial application for Windows easy. I can say that at 921600 baud it's pretty difficult to read that much text scrolling through a virtual screen in Putty, so perhaps you have a different presentation in mind. Rendering a display of, say, 80x100 characters as a static screen will require some thinking about how both the PC application and the FPGA will operate. Do you really want to send redundant data every 33 ms? Perhaps. Technically, there is no problem with a 3686400 baud rate as far as the hardware goes, but getting a PC application to work robustly at that rate might be difficult. I've never had a reason to try it.
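The data-rate arithmetic above is easy to sanity-check in a few lines of Python (a sketch only; the 10-bit framing and the short list of standard baud rates are the assumptions stated in the text):

```python
# Payload and UART rates for an 8 KB text frame refreshed at 30 Hz.
FRAME_BYTES = 8 * 1024   # 8 KB frame
REFRESH_HZ = 30          # screen refreshes per second
BITS_PER_CHAR = 10       # 1 start + 8 data + 1 stop bit on the wire

payload_bps = FRAME_BYTES * REFRESH_HZ * 8
uart_baud_needed = FRAME_BYTES * REFRESH_HZ * BITS_PER_CHAR

# Smallest "standard" baud rate that covers the requirement.
standard_bauds = [921600, 1843200, 3686400]
chosen = min(b for b in standard_bauds if b >= uart_baud_needed)

print(payload_bps)       # 1966080
print(uart_baud_needed)  # 2457600
print(chosen)            # 3686400
```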

As far as USB 2.0 goes, you should have no problem streaming data at 10x the desired rate if you pay attention to the details. You still need to render the data into the desired presentation, though. The problem with USB is that protocol overhead can become an issue with low data rates and short transfer lengths. You can always pad your data to overcome these issues; that is, you might need to transfer a lot more bytes than strictly needed to accomplish your overall goals.

2 hours ago, robfinch said:

before I do a lot of work I was wondering about any drawbacks or gotchas?

So here comes the advice, and it probably isn't the advice you want. In terms of data transfer you certainly could use DPTI or a UART interface. The question is how you write a PC application to render it. Once you've decided on how the text will be presented, you will need to figure out whether your OS will allow your application time to get the data and render it. Certainly rendering 8 KB at a 30 Hz rate isn't going to be an issue for a modern PC. If there is a delay in rendering the data, is this a problem? That depends. So here's the advice: create some intermediate projects to experiment with the different elements of such a project. I'd start with the PC application to render your 80x100 text screen. Then I'd add the UART or DPTI interface to the application and get a feel for what the issues are. Since the format for a UART is fixed and there is no packet protocol to deal with, this is the easiest interface to work with, with the fewest surprises. DPTI has a lot more considerations to avoid undesirable performance penalties. I urge anyone wanting to use a USB interface to read the available standards and understand the protocol before trying to use it. You'll still need to experiment, as the OS layers will have a large impact, but at least you will have a foundation for doing intelligent experimentation.

Hopefully this will kick off a useful discussion addressing your question.


Thanks for the reply. I'm already building an interface using a serial port, hopefully to run at 1M baud. DPTI seemed like a natural interface to use to me since it's potentially faster. Building a PC app shouldn't be a problem. I've done some emulation of virtual screens and games before. 

In the CMOD there's a UART interface connected to a DMA channel which continuously dumps a buffer through the serial port - there's no software involved. To send data, all the micro has to do is store data in the buffer. I was planning on using 16 0xFF bytes as a sync marker, which will set the position of the data on-screen. I've set the serial port to buffer 4 kB at a time before signalling a receive event, so the interface will hopefully not use up too much time on the PC. It still has to parse the sync position and transfer data to a display buffer. Eventually the app may be made into a custom control.
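The sync-marker scheme described above might be parsed on the PC side along these lines. This is only a sketch: the 2-byte little-endian screen offset following the marker is a hypothetical framing choice, not something specified in the post.

```python
# Hypothetical parser for a chunk containing the 16-byte 0xFF sync marker.
SYNC = b"\xff" * 16

def parse_chunk(chunk: bytes):
    """Locate the sync marker; return (screen_offset, payload) or None."""
    i = chunk.find(SYNC)
    if i < 0:
        return None                      # marker not yet in this chunk
    pos = i + len(SYNC)
    offset = int.from_bytes(chunk[pos:pos + 2], "little")  # assumed field
    return offset, chunk[pos + 2:]

# Example: marker, offset 80 (row 2 of an 80-column screen), then text.
chunk = b"junk" + SYNC + (80).to_bytes(2, "little") + b"Hello"
print(parse_chunk(chunk))  # (80, b'Hello')
```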

Also keystrokes are being sent back through the interface. 


@robfinch

I need to clarify a point I made in the previous post. Obviously, real UART serial ports haven't been seen on a PC for some time now; so even though the UART interface is easier to grasp from an HDL viewpoint, it is still a USB device from the PC's viewpoint. The PC driver and application overhead is therefore the same as it would be using DPTI. At really low baud rates you may not notice this at all. At really high baud rates you likely will.

In a few of my Project Vault submissions I've included VHDL code for a UART from Opencores as part of a testbench. When clocked at 100 MHz this particular implementation starts to show problems above about 7 Mbaud on an Artix device. That's just an anecdotal round-about figure from a bit of experimentation I did a few months ago. This amounts to about 14 baud period samples so you should be able to envision what possible problems might be encountered at that rate. I'm just talking about FPGA implementation here.
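The "about 14 samples" figure falls straight out of the clock-to-baud ratio; a short sketch makes the comparison with more comfortable rates explicit:

```python
# Oversampling ratio of a UART: receiver clock divided by baud rate.
def samples_per_baud(clk_hz: float, baud: float) -> float:
    return clk_hz / baud

print(round(samples_per_baud(100e6, 7e6), 1))     # ~14.3 samples per bit
print(round(samples_per_baud(100e6, 921600), 1))  # ~108.5 samples per bit
```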

If frame-to-frame latency is an issue and you can create multiple frames faster than every 33 ms on your FPGA platform, then clearly there might be an advantage to using DPTI over a USB COM port device. I've never had an impetus to investigate latency for USB UART applications. If you do choose DPTI, I suggest sending the largest blocks of data within reason (you'll need a large buffer in the FPGA) and avoiding partial packets. It's astonishing how slow USB can be if you aren't careful.


Is there a secret to getting the serial port to work? I haven't had any luck: there's no data-received event occurring in Windows, and not even garbage is received. The port (COM6) is successfully opened. I had some confusion about which way the data signals in the CMOD are travelling, so I tried switching them around; still no luck. The signals are labelled "uart_rxd_out", which I assume is output from the UART to the FPGA (in other words rxd for the FPGA), and "uart_txd_in", which is actually output from the FPGA? I haven't been able to get the serial port to work on a different board either.

I tried lowering the baud rate to 9600 baud. Still no luck.

It sounds from your post like 1M baud should be achievable. I have a 16 MHz clock acting directly as a 16x clock for the UART. I'm tempted to use an LFSR to feed txd to see if anything makes it through the interface.
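If you do go the LFSR route, a small software model lets the PC predict the exact byte sequence to expect. This sketch uses a maximal-length 16-bit Galois LFSR with tap mask 0xB400; the seed and the choice to transmit the low byte of the state are illustrative assumptions, and the HDL side would need to match them.

```python
def lfsr16_bytes(seed: int, count: int) -> bytes:
    """Low bytes of a 16-bit Galois LFSR with tap mask 0xB400."""
    state = seed
    out = bytearray()
    for _ in range(count):
        lsb = state & 1
        state >>= 1
        if lsb:
            state ^= 0xB400       # apply taps when the shifted-out bit is 1
        out.append(state & 0xFF)  # transmit the low byte of the state
    return bytes(out)

expected = lfsr16_bytes(0xACE1, 4)
print(expected.hex())  # 70389c4e
```

Comparing received bytes against this sequence tells you not only that data is getting through, but whether any bytes are being dropped or corrupted along the way.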


No secrets. Look around in the Project Vault part of the forum. There are several projects that use the UART, including most of mine. I agree that the nomenclature can be confusing to people new to Digilent FPGA boards. There is no "official" standard for naming port signals for UART interfaces. Digilent chooses names from the viewpoint of the FTDI device, so uart_rxd_out is an input to the FPGA and an output from the USB device. They are consistent, which is all that we can ask of them.

16 MHz seems pretty low to clock an HDL UART for typical implementations. My Project Vault submissions have a testbench for simulation so that you can delve into the inner workings. Simulation won't solve hardware issues like outputs driving outputs, but it is key to good FPGA development. As for running hardware: always read the FPGA board schematic and the datasheet for any interface device you are using, and do a sanity check on who's driving which pins... before powering any implementation. It's a good habit to foster.

Once you understand how the hardware works and have a simulation that seems to work you can check things like baud period to be sure that your implementation has a chance of working on hardware.
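As part of that experimentation, a host-side loopback check helps separate PC problems from FPGA problems. A minimal sketch: in practice the `link` object would be a pyserial `serial.Serial("COM6", 9600, timeout=1)` (that tooling choice is an assumption, not from the thread); the fake link below just exercises the logic without hardware.

```python
def loopback_test(link) -> bool:
    """Write a probe pattern and check the same bytes come back.
    `link` is anything with write() and read(n) methods."""
    probe = b"U" * 16            # 0x55: alternating bit pattern on the wire
    link.write(probe)
    return link.read(len(probe)) == probe

# In-memory stand-in for a serial port with txd looped back to rxd.
class FakeLoopback:
    def __init__(self):
        self.buf = b""
    def write(self, data):
        self.buf += data
    def read(self, n):
        out, self.buf = self.buf[:n], self.buf[n:]
        return out

print(loopback_test(FakeLoopback()))  # True
```

With the FPGA's txd and rxd tied together (or an echo loop in the FPGA), the same test run against the real port shows immediately whether the PC side of the link is healthy.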


16 MHz is the 16x baud rate clock, which was switched to 14.7456 MHz so that it's 16 times the 921600 baud rate, which is the maximum that shows up in the Device Manager window in Windows. Rather than use a high-speed clock, divide it down, and generate a clock enable, the clock is being used directly and the clock enable is a constant 1. A lot of the system is clocked at a lower frequency to reduce power consumption.
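The 16x clock figure is just baud rate times oversampling ratio, and the same arithmetic hints at why a bare 16 MHz clock used directly as the 16x clock could not talk reliably to a host COM port set to 921600 baud:

```python
BAUD = 921600
OVERSAMPLE = 16

print(BAUD * OVERSAMPLE)  # 14745600 -> the 14.7456 MHz 16x clock

# Baud error if a plain 16 MHz clock were used directly as the 16x clock:
err_pct = (16e6 / OVERSAMPLE - BAUD) / BAUD * 100
print(round(err_pct, 1))  # 8.5 -> far outside normal UART tolerance
```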

I tried transmitting to the CMOD and the serial activity light goes on, so it seems able to receive data. I then connected txd and rxd together in the FPGA and it still doesn't seem able to send data.

I will have a look at the project vault.


Got it to work (serial version). First version, with a hard-coded "Hello World!". The whole SoC with transmit and receive is only about 300 LUTs - probably a lot fewer than would be required for a video display. Now to get the micro working.

HelloWorld.png


The serial transmitter and DMA channel are on one side of a dual-ported block RAM, which has its own clock domain. Ditto for the receiver. The other side of the block RAM is connected to the micro on a different clock domain. In short, the block RAM is used to cross domains. It should be okay. There is also a global high-speed clock (100 MHz) for some timing.


On 6/23/2019 at 10:15 PM, robfinch said:

16 MHz is the 16x baud rate clock, which was switched to 14.7456 MHz so that it's 16 times the 921600 baud rate, which is the maximum that shows up in the Device Manager window in Windows. Rather than use a high-speed clock, divide it down, and generate a clock enable, the clock is being used directly and the clock enable is a constant 1. A lot of the system is clocked at a lower frequency to reduce power consumption.

For some reason this question has stuck with me and interrupted what I was doing (despite the passing of a few weeks).

I wanted to add a few thoughts.

Trying to get a design to work and trying to get a working design operating at minimal energy consumption should usually be two different efforts. Optimizing a working design for minimal current consumption is generally far easier than trying to optimize an untested design. One other advantage of having both a non-optimized and an optimized design is that you can compare the actual difference in power consumption and get some feedback on how well your efforts offer a return on investment (the investment in this context being mostly time).

One of the concepts behind RS-232 and UARTs in general is that they ought to work with some significant level of baud rate error. Clocking a UART HDL design with a minimal clock might reduce energy consumption appreciably, but it might also reduce your design's robustness when connected to another device with an acceptable but significant baud rate offset. It depends, of course, on how you do it. I think of this as a sampling issue: if your UART uses too few samples per baud period, then robustness will decrease as baud rates and baud rate offsets between connected devices increase.
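The sampling argument can be made concrete with a back-of-the-envelope bound: the receiver re-synchronizes on the start edge, then must still sample the last bit of a 10-bit frame inside the correct bit period. This idealized sketch ignores oversampling granularity and edge jitter:

```python
FRAME_BITS = 10                 # start + 8 data + stop
last_sample = FRAME_BITS - 0.5  # bit periods from start edge to last mid-bit sample
max_offset = 0.5 / last_sample  # fractional clock mismatch before mis-sampling

print(round(max_offset * 100, 1))  # ~5.3 % total, shared between both ends
```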

In the world of UART serial interfaces, baud rates are less of an absolute and more of a suggested number with error boundaries.

