
Are FPGAs right for me?


chris235

Question

Hi,

this is my first post (after introducing myself in the New Users forum). I hope my question isn't inappropriate here...

 

TL;DR: How feasible is it to program e.g. an Artix-7 (or Spartan-6) at a "low level" with respect to instantiation, placement, routing, etc.?

 

I'm just a hobbyist, indeed electronics/programming is only one of my hobbies... so I don't actually *need* to use FPGAs. I'd be doing it for fun/satisfaction/distraction. Unfortunately, I suffer from severe depression (not Corona-related) so I'm looking to *avoid* a particular sort of frustration. (ie. fighting the tools).

I've started reading some books (a VHDL textbook, FPGA Prototyping by Verilog examples -- Pong Chu, Intro To FPGA Book -- Mike Field etc) and gone through a bunch of Xilinx user guides, and even a couple of YouTube videos and technical EE papers...

... and I'm worried that FPGAs might not be a good fit for me :-(

In reading about whether to learn VHDL or Verilog I've come across the Ada vs. C language analogy. With regard to automatic floor-planning I've come across the "trying to overcome bad coding by using an optimizing compiler" analogy... etc.

Now, I'm weird. For small projects I actually *prefer* ARM assembly code to C, let alone Java or, God forbid!, C++. In electronics I prefer to place components and route circuits manually.

So, I guess I'd be wanting to instantiate things rather than have the software infer them, and to have more, rather than less (or no), control over placement and routing. I understand that for big projects or large teams you just want/need something that meets requirements, that can be ported to a different device, or maintained by someone else. But that's not the situation I'm in.

From what I've read so far I get the impression that one might need to fight the software, endlessly rewriting one's HDL in the hope that the software will eventually infer whatever it is one actually wanted... Is this still the case? Or am I 20 years out of date? :-)

On top of all that, I'm on a Mac, so to even install/try out the Xilinx software, I'd need to dig out my Linux laptop...


7 answers to this question


You might have a look at the open-source Lattice toolchain. It'll probably work in a Linux virtual machine (e.g. VirtualBox) on any host (haven't tried).

Fighting the tools is, unfortunately, a large part of the FPGA experience.

In principle, you should be able to instantiate the hardware-level primitives, e.g. the LUTs found within the CLBs; see Xilinx UG474.
You will not have direct control over the actual routing, though, but that's understandable (e.g. to prevent hardware damage from conflicting driver assignments).
For an example, check what has been written about the PicoBlaze implementation. I think that IP block may be pretty close to the FPGA equivalent of hand-coded assembler.
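To make that concrete: this is a sketch of what direct instantiation looks like on a 7-series part (primitive name and INIT format are per UG474/the unisim library; the instance and net names here are made up). Instead of writing behavioral logic and hoping the synthesizer infers what you want, you instantiate the LUT yourself:

```verilog
// Hand-instantiated 7-series LUT6 primitive (see UG474).
// INIT is the 64-bit truth table: one output bit per {I5..I0} input value.
// Bit 63 set, all others clear => output is 1 only when all six inputs
// are 1, i.e. a 6-input AND gate.
LUT6 #(
    .INIT(64'h8000_0000_0000_0000)
) and6_inst (
    .O (y),                 // LUT output
    .I0(a), .I1(b), .I2(c),
    .I3(d), .I4(e), .I5(f)
);
```

If you also want manual placement, you can pin such a cell to a specific slice with a location constraint (e.g. `set_property LOC SLICE_X0Y0 [get_cells and6_inst]` in an XDC file), which is about as close to hand-placing components as the flow gets.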

If you're looking for a new obsession, check James Bowman's J1 CPU on FPGA (you can emulate it very nicely with Verilator, so there's no need to fight the tools if you don't want to). If you can allow yourself some solution-looking-for-a-problem thinking, it can be awesome with the right problem. I used it as the control processor for real-time fractal generation on an FPGA; check the projects section if interested. Be aware that there are a few hurdles that aren't immediately obvious - some not even rooted in technical topics - that make scaling it to other problems very difficult (a true FORTH enthusiast will of course deny their existence ...)

 


2 hours ago, chris235 said:

From what I've read so far I get the impression that one might need to fight the software, endlessly rewriting one's HDL in the hope that the software will eventually infer whatever it is one actually wanted...

BTW, what you're describing might happen to someone who never bothers to dig deeper - for example, studying the output from the tool (warnings, timing analysis) and making sense of it. For example, BRAM at higher frequencies requires use of its hardware registers, and a DSP block even requires multiple register levels. This can have serious architectural implications (port e.g. a Xilinx J1 design to Lattice and you'll run into a few of those). Still, I'd argue that this "partnership" with the tool is much more efficient than reaching the same conclusions from low-level design work. And maybe we shouldn't overestimate our intellect as humans - the set of inequalities optimized by the FPGA tools is mind-boggling, and you may gain some new respect for the tools if you try to understand why they made such a mess of your nicely structured design, scattering gates all across the chip.
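The BRAM point can be sketched in plain inferred-RAM Verilog (a generic template, not vendor code; module and signal names are my own). The extra output register is what lets synthesis map onto the block RAM's built-in output pipeline stage instead of leaving a slow unregistered read path:

```verilog
// Inferred block RAM with a registered output stage.
// 'rdata' corresponds to the BRAM's internal read register;
// 'q' is the extra pipeline register that absorbs the BRAM's
// output delay at high clock frequencies.
module bram_reg #(
    parameter AW = 10,      // address width
    parameter DW = 8        // data width
)(
    input               clk,
    input               we,
    input  [AW-1:0]     addr,
    input  [DW-1:0]     din,
    output reg [DW-1:0] q
);
    reg [DW-1:0] mem [0:(1<<AW)-1];
    reg [DW-1:0] rdata;

    always @(posedge clk) begin
        if (we) mem[addr] <= din;
        rdata <= mem[addr];   // first register: inside the BRAM
        q     <= rdata;       // second register: BRAM output pipeline stage
    end
endmodule
```

The architectural implication is exactly the one mentioned above: every read now takes two cycles, and everything downstream has to be designed around that latency.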

There are also a few very common mistakes that make the "partnership" with the tool difficult. For example, use switches / buttons / LEDs without relaxing the timing constraints (false_path) and you may be posing the tool an almost impossible problem. With some thought, though, the "iteration process" - which may still be necessary - becomes systematic and constructive.
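For reference, relaxing those constraints in Vivado XDC syntax looks roughly like this (the port names `btn` and `led` are placeholders for whatever your top-level pins are called):

```tcl
# Exclude slow, asynchronous I/O from timing analysis.
# Without this, a button-to-LED path can dominate the timing report
# and make the tool chase a constraint that doesn't matter.
set_false_path -from [get_ports {btn[*]}]
set_false_path -to   [get_ports {led[*]}]
```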


... and finally:

Be aware that any ideas about connecting logic gates like 4000 or 7400-series circuits will most likely send you on a path to nowhere.

Modern FPGA design is based on the "synchronous logic" paradigm: signals are guaranteed to arrive before a clock edge and remain valid for a given time (the setup and hold concept; see static timing analysis), BUT what they do in between is unspecified.

Ignore this fact and FPGA design will become dark voodoo and implementation-dependent non-determinism.
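The whole paradigm fits in a few lines of Verilog (a minimal sketch; names are illustrative): combinational logic is allowed to glitch freely between edges, and its result only "exists" at the moment a register samples it.

```verilog
// The synchronous-logic idiom in miniature:
// combinational results are only trusted at the clock edge,
// where they are captured into a register.
module sync_example (
    input            clk,
    input      [7:0] a, b,
    output reg [7:0] sum
);
    wire [7:0] sum_comb = a + b;   // may glitch between clock edges

    always @(posedge clk)
        sum <= sum_comb;           // valid only at the sampling edge
endmodule
```

Designs that try to "use" `sum_comb` asynchronously - as one would wire up 7400-series gates - are exactly where the dark voodoo starts.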


Thanks xc6lx45.

The J1 looks like fun. I like the Magic-1 too! :-) http://www.homebrewcpu.com

 

51 minutes ago, xc6lx45 said:

BTW what you're describing might happen to someone who never bothers to dig deeper, for example studying the output from the tool, say warnings and timing analysis, and making sense of it.

I think I'll try some (very) simple stuff on some Lattice chips. Part of what I wanted to find out is what sort of output the tools provide.


4 hours ago, chris235 said:

From what I've read so far I get the impression that one might need to fight the software, endlessly rewriting one's HDL in the hope that the software will eventually infer whatever it is one actually wanted... Is this still the case? Or am I 20 years out of date?

The tools are very good at using VHDL and Verilog as design source input. If you are constantly rewriting your HDL because you don't get the results intended then it's because you aren't as good at HDL concepts as you should be. The burden of whether or not an HDL source correctly reflects your understanding of the problem and what the resulting logic will do is on you. By the time the tools get to analyze, optimize, and restructure it you should be confident that it works as intended. The way to do this is through verification, mostly simulation. The synthesis tools don't understand intent, just what you tell them. The FPGA vendors provide enough information for users to understand the tools and figure out how to communicate with the tools.

Of course software design is the same. Good software practice involves verification as well. For the casual hobbyist all of this is too much to bear. They just want to try out ideas and see results.

If you recognize yourself as a hobbyist who loses interest when the hobby starts looking like real work, then FPGA development isn't for you.

Programmable logic has a place but really you can do almost anything with a cheap SBC like the Raspberry Pi or specialized microprocessor. There's less to learn and you can write and try out code much quicker. FPGAs are for specialized applications where standard microprocessors don't do well.

I suggest that instead of taking a broad but nebulous view you think in terms of specifics. What exactly would you like to do? Can you do it using a Raspberry Pi? You might get better feedback asking well defined and specific questions from this forum.

The thing about programmable logic is that, unlike software where everything is built using libraries that take care of all of the little details, the HDL description of what you intend to do requires you to figure out all of the details at the lowest levels. Now, I understand that the GUI board design approach provides a limited set of toys to play with but that only gets you so far. Still, that could be enough.

The great thing about FPGA development is that you don't need to spend a penny to try it out or learn. Just download a reasonably sized free version of the tools and start playing. It doesn't matter what part you select. If you want to develop a skill in an HDL, just learn how to use the integrated simulator. Intel provides a free version of ModelSim that I use for my generic HDL code development whether I'm designing for a Xilinx or Intel FPGA. ModelSim is much friendlier for proper simulation than the Vivado simulator, and as long as I don't use vendor-specific primitives it really doesn't matter whose tools get to implement the logic design.
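To show how little is needed to start simulating, here is a minimal self-checking testbench sketch (the `counter` DUT and its port names are hypothetical; the pattern runs the same way in ModelSim, Vivado's xsim, Icarus Verilog, or Verilator):

```verilog
// Minimal self-checking testbench pattern: generate a clock,
// apply reset, wait a known number of cycles, check the result.
`timescale 1ns/1ps
module tb;
    reg        clk = 0;
    reg        rst = 1;
    wire [7:0] count;

    // Hypothetical device under test: an 8-bit synchronous counter.
    counter dut (.clk(clk), .rst(rst), .count(count));

    always #5 clk = ~clk;              // 100 MHz clock

    initial begin
        repeat (2) @(posedge clk);     // hold reset for two edges
        rst = 0;
        repeat (10) @(posedge clk);    // let the counter run
        $display("count after 10 cycles = %0d", count);
        if (count === 8'd0)
            $display("FAIL: counter did not advance");
        $finish;
    end
endmodule
```

Getting comfortable writing these before touching hardware is most of what "verification, mostly simulation" means in practice.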

4 hours ago, chris235 said:

I hope my question isn't inappropriate here..

I think that this is a great question for this forum.


17 hours ago, chris235 said:

I like the Magic-1 too! :-) http://www.homebrewcpu.com

Now that is serious business :-)

Approached from that corner, you'll find an FPGA is a breadboard the size of a basketball court, but without the wiring delay.

But heed my warning about "synchronous logic". Some of the breadboard thinking doesn't easily translate to FPGAs, or it becomes a topic for specialized experts (say, in ASIC emulation). Keywords for a systematic approach: "Moore / Mealy machines" - but forget any "entry-level" textbook material that e.g. builds flip-flops from logic gates...
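For the "Moore machine" keyword, the standard FPGA-friendly pattern is a registered state variable plus combinational next-state logic, with outputs depending on state only. A small sketch (names and encoding are my own) - a rising-edge detector on an already-synchronous input:

```verilog
// Textbook Moore FSM pattern: state register + next-state logic.
// Output depends on the current state only (the Moore property).
// Emits a one-cycle pulse when 'din' goes from 0 to 1.
module edge_detect (
    input  clk, rst, din,
    output pulse
);
    localparam IDLE = 2'd0, SEEN = 2'd1, WAIT = 2'd2;
    reg [1:0] state, next;

    always @(posedge clk)                  // state register
        state <= rst ? IDLE : next;

    always @* begin                        // next-state logic
        case (state)
            IDLE:    next = din ? SEEN : IDLE;
            SEEN:    next = din ? WAIT : IDLE;  // pulse over, wait for release
            WAIT:    next = din ? WAIT : IDLE;
            default: next = IDLE;
        endcase
    end

    assign pulse = (state == SEEN);        // function of state only
endmodule
```

Note that everything changes on the clock edge - the FPGA-native replacement for the cross-coupled-gate flip-flops in those entry-level textbooks.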


Archived

This topic is now archived and is closed to further replies.
