
The usefulness of Soft CPU Cores in FPGA Devices



@D@n, that prolific poster of mostly helpful information on any topic, suggested that I create this thread. I suggested that he create the thread, knowing how dear to his heart the subject is. (Actually, it's one dear to my heart as well... hence this thread.) Since he hasn't done so, I guess that it's up to me. I agree that a place to bounce around some thoughts on the topic is a good idea that might benefit a large audience. So, I'll start off the discussion with this:

I've been concentrating on FPGA development for a long time, but doing hardware and software development for a lot longer. I LOVE the idea of an SOC in an FPGA. I LOVE the fact that all of the FPGA vendors have teamed up with ARM to include hard CPU cores in some of their products. We aren't quite there yet as to having a real SOC, though the Cypress PSoC line is close, if limited in the area of programmable logic. No company that I've ever worked for uses a device because it's a cool thing to do. (OK, I'll back off that statement, because I did do work for a small company doing radar research and decided that using GaAs logic would help secure an SBIR grant.) There are a lot of things that one considers when designing a product that you want to make a profit on. I've used soft CPU cores in my own projects, but only one for a commercial application (and that was for a startup building equipment to demonstrate their IP, not for a product meant to be sold). To get to the point, I'll start off with a few assertions as to why I haven't, over the course of 20 years of FPGA development experience, used a soft CPU in a commercial product.

  1. Soft CPU cores use a lot of resources.
  2. Filling an FPGA with a soft CPU core, with no room for including the kinds of things that programmable logic is good at, makes absolutely no sense (to me).
  3. Why waste the valuable resources of an expensive, power-hungry FPGA when a low-power, highly integrated, high-performance micro will do a better job, offering more on-board memory, more functional blocks, lower power, more power-saving options... well, you get the idea.
  4. While the Xilinx support for Zynq is excellent, and in some areas better than the software toolchain for embedded processors from companies that only make micro-controllers, by and large the tool chains offered by the micro companies are more stable and easier to use.
  5. A CPU in an FPGA just adds to the headaches, tool bugs and work-arounds, and general effort of developing a design. Who needs more work?
  6. I'm all for using soft CPU cores in an FPGA... there just aren't a lot of applications that demand one.
  7. It's rare that I can't use traditional HDL constructs like state machines or programmable controllers to achieve the aims of an FPGA design. There may well be an application that requires a soft CPU core because the state machines get too complex, though I expect that I'd just use an FPGA with a hard ARM core for that case.
  8. If anyone reading this wants to go through the work of using a MicroBlaze or NIOS, using tutorials that might work with old versions of Vivado or Quartus but not the one that you have, as an educational exercise, I am wholeheartedly in support. And it will be an educational experience in many ways.

That should be enough to start the fireworks. I didn't even include the comments that prompted D@n to suggest this thread for the sake of harmony.


@zygot,

Some quick thoughts in an initial response:

  1. Soft cores don't need to use a lot of resources.  Just because MicroBlaze appears to be designed with the goal of making you purchase a more expensive FPGA doesn't mean that the MicroBlaze design is a particularly good one--for many of the reasons you cite above.
  2. Agreed.  The purpose of an FPGA is to support programmable logic.  If all you want is a CPU, there are faster/better/cheaper CPUs that don't use FPGAs.  If you want to do something with FPGA logic, then no matter what soft-core CPU you place on the FPGA ... there needs to be room left over for FPGA types of things--else it would've made more sense to purchase a regular CPU.  This was one of my opening points at my last ORCONF presentation (slides 3-6).
  3. Agreed, same reason.
  4. While the more typical micro tool-chains may be more complete, I'm going to argue that this shouldn't be a controlling argument.  A better tool-chain makes more sense in the case of a fixed set of known peripherals.  A less than complete tool-chain (i.e., doesn't support Linux, doesn't have drivers for all of the devices) can still be very valuable when working with FPGA fabric.  Indeed, this is sort of the development you want--programming from a very high level will leave you wrestling with your purpose: creating and interacting with new devices in the fabric.
  5. Just because adding a CPU can make a design more difficult doesn't mean that placing a soft-core within an FPGA is a bad idea in general.  It just means that it hasn't been done well yet.
  6. FPGAs are all about configurable logic.  Some logic makes a lot of sense in the fabric; other logic is cheaper within a soft-core.  Striking the right balance is sort of the point of this discussion.

Let me add that everything comes down to price.  Several prices are relevant: price per LUT, price per kB of block RAM, price for the tool-suite, the size, weight, and power (SWaP) price of integrating another chip on the circuit board, the price of what it takes to design using the chip/board, etc.  Judging by your comments above, you believe that a MicroBlaze increases your "price".  After examining MicroBlaze, I agreed (see slide 7).  This was my motivation in designing the ZipCPU: creating a CPU that can be placed within an FPGA without increasing your price.  Any CPU you place into your design needs to use a minimum of resources.  This was the reason for demonstrating the ZipCPU on a Spartan 6LX4 (the CMod-S6)--to prove that it could be placed within an FPGA without consuming too many resources.  (Yes, it can run a multi-tasking O/S, in less than 2,400 LUTs including peripherals.)

I am most certainly disappointed with Xilinx's code bloat.  Their DDR3 SDRAM code stole a quarter of my Arty's FPGA resources--before I even got to adding any of my own logic to it.  That's also without adding in the MicroBlaze processor.  Each of these has a cost.  Sure, there are times and places where you want a CPU core within your FPGA, but you still want to use your FPGA as an FPGA along the way.  Something that consumes all of your FPGA, leaving you with little left over, will also have you grumbling about how much you have to pay to do your task.

I want small, efficient FPGA components that 1) fit the job, and 2) leave me with most of my FPGA left over for whatever else I want to use.

This is also one of the reasons why I love the Wishbone bus (B4/pipelined) so much: the AXI bus is a huge, bloated combobulation consuming way more logic than I want to pay for.  Wishbone is much simpler, and the B4/pipelined version can easily handle one transfer per clock (when requests are pipelined).  Sure, it doesn't have all the features of AXI, but I'm going to argue that most of those features are unnecessary.
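
To make the "one transfer per clock" point concrete, here's a minimal sketch of a Wishbone B4/pipelined slave--a small register file.  The names and widths are illustrative assumptions, not anything pulled from the ZipCPU repository.  Because it never stalls, a master can issue a new request on every clock and collect one acknowledgement per clock, one cycle behind:

```verilog
// Hypothetical minimal Wishbone B4/pipelined slave: a small register file.
// One request can be accepted per clock; each is acknowledged a cycle later.
module wb_regfile #(
    parameter AW = 4,            // word-address width: 16 registers
    parameter DW = 32
) (
    input  wire           i_clk,
    input  wire           i_wb_cyc,   // bus cycle in progress
    input  wire           i_wb_stb,   // request strobe, one per transfer
    input  wire           i_wb_we,    // write enable
    input  wire [AW-1:0]  i_wb_addr,
    input  wire [DW-1:0]  i_wb_data,
    output reg            o_wb_ack,
    output wire           o_wb_stall, // never stalls: one transfer per clock
    output reg  [DW-1:0]  o_wb_data
);
    reg [DW-1:0] regs [0:(1<<AW)-1];

    assign o_wb_stall = 1'b0;   // always ready to accept a request

    always @(posedge i_clk) begin
        // Accept one request per clock while STB is high
        if (i_wb_stb && i_wb_we)
            regs[i_wb_addr] <= i_wb_data;
        o_wb_data <= regs[i_wb_addr];
        // Each accepted request is acknowledged exactly one cycle later
        o_wb_ack  <= i_wb_cyc && i_wb_stb;
    end
endmodule
```

Compare that with the handful of handshake channels a full AXI slave has to implement, and the "pay for features you don't use" complaint should be clear.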

So, let me offer some reasons for wanting a CPU within a design:

  1. A soft-core CPU just makes sense for some logic.  In some examples, the alternative is to add a micro-controller onto your design, add to your SWaP, and suffer communications latency issues between the high-speed FPGA logic and the CPU.
  2. It makes sense when trying to teach students how CPUs work--but only if all of the internals of the CPU are available for inspection.
  3. A CPU allows you to share logic resources (the CPU) between multiple applications within the FPGA.
  4. A CPU is the most generic state machine.

These are not reasons for including a MicroBlaze within a design.  For those reasons, you might wish to ask @Notarobot.  (I'm very opinionated in this regard against MicroBlaze, and against Vivado's pretty picture design method(s) for reasons we've discussed in other posts.)

Dan


1 hour ago, D@n said:

These are not reasons for including a MicroBlaze within a design.  For those reasons, you might wish to ask...

Have to admit that I don't understand that sentence... If you are saying that there are reasons for not using MicroBlaze, I have nothing to add. If you are saying that there are reasons for not always using a MicroBlaze in all of your designs, then I will repeat assertions that I've made elsewhere. This is a touchy subject in some quarters.

 

1 hour ago, D@n said:

Let me add that everything comes down to price.

Well, usually that's my experience. I have worked on products that don't care about price but have to have maximum control over power consumption and/or heat added to a system operating in a harsh environment where there are limited choices for cooling options.

1 hour ago, D@n said:

A better tool-chain makes more sense in the case of a fixed set of known peripherals. 

When I think of a software tool chain, I think of high-level language adherence to standards such as ISO C or ANSI C, so that syntax is less of an issue. I think of libraries supporting peripherals as an adjunct to the tool chain... but I'm not a software guy, so I accept whatever correction you deem appropriate. Really, why I bring up the tool chain is that I want to code with confidence that I will get expected results, that I will be able to control optimizations, use inline assembly, and, most importantly, not have to worry that every new release of the tools breaks my code developed a day before. I hate reviving a project from a few months previous that was in great shape and finding that I have to spend another week making it work with whatever version I am using now. I really don't want the tools to make my life more difficult or define my schedule. If I provide a working design to a client and a month later he moves to Vivado 20xx.xxx and it doesn't work, he will not be understanding, it will be my fault, and I lose.

More often than not I don't use peripheral libraries... but then again, most of my embedded software (or firmware, depending on who you talk to) development has been with DSPs, coding assembly or simple pre-processor languages, so maybe I'm not the best person to discuss libraries.

One other concern that is vital to understand is that of risk. In industry you are always pushed to meet unrealistic deadlines, and hiccups with tools can be deadly to a product schedule or an engineer's career. For that one NIOS design there was a lot of negative reaction to my choice to use a soft CPU. Looking back, it is still the correct choice because of the schedule (12-16 hour days, 5-6 days a week, with 3 FPGA/hardware engineers) and the fact that the hardware was always in development. When an experiment needed to be run, there was no time to spend hours changing FPGA source and rebuilding configuration files. Recompiling C for the NIOS provided the quick turnaround time to do useful experiments. Still... I paid a price for the tool shortcomings.

 

2 hours ago, D@n said:

So, let me offer some reasons for wanting a CPU within a design:

  1. A soft-core CPU just makes sense for some logic.  In some examples, the alternative is to add a micro-controller onto your design, add to your SWaP, and suffer communications latency issues between the high-speed FPGA logic and the CPU.
  2. It makes sense when trying to teach students how CPUs work--but only if all of the internals of the CPU are available for inspection.
  3. A CPU allows you to share logic resources (the CPU) between multiple applications within the FPGA.
  4. A CPU is the most generic state machine.

Hmmm... I'll have to cogitate a bit on this. I certainly disagree that a CPU is the most generic state machine. But we may have a difference of opinion on what should be called a CPU and what might be called a programmable controller. I've designed and coded bit-slice logic and wouldn't call it a CPU. I have no disagreement with helping students conceptualize the elements of what makes up a CPU and how it works. So there's no need to debate the educational case for using a soft CPU core that has source and simulation open to inspection. Ditto understanding the concepts of compiler construction and many other relevant topics. I'll hold off on your #3 assertion for now.


@zygot

"These are not reasons for including a Microblaze" means simply that.  My discussion was of soft-core CPU's in general, arguing that a soft-core CPU could be valuable in certain circumstances.  Consider this analogy, if you have screws (some problem sets) you don't need a hammer (soft-core CPU).  Other problem sets (nails) need a hammer (soft-core CPU).  But just because you need a hammer (soft-core CPU) doesn't mean you need a 4-lb sledge hammer (Microblaze) that requires two hands to lift and is difficult to apply to simple nails.  :o I'm not arguing that Microblaze is the right solution for all (or any) problems, but rather that there is a place for a soft-core CPU (not necessarily MicroBlaze) within the FPGA engineering disciplines.

If you want to hear from someone who is going to argue that MicroBlaze is a good, better, more appropriate solution ... then you are going to need to talk to someone else, because I'm not the one who is going to make that argument.  That was why I suggested @Notarobot might have a valuable opinion.  Others are welcome to share their opinions in this matter as well.  :D

Now, as for your comments about the tool chain, there are three peripherals where "ANSI" starts to make sense: the disk-drive (SD-card for example), the console (serial port), and the network.  If those are the only peripherals your CPU needs--then use your desktop or a Raspberry Pi.  If the peripherals you want are an OLEDrgb, an I2S sound port, a flash memory, a PS/2 mouse, buttons, switches, LEDs, a GPS time controller, or some other custom peripheral--then there is no ANSI solution.  In general, if you are working with an FPGA, your purpose is going to be to integrate such custom peripherals together--in which case there isn't going to be any ANSI solution.  In these examples, having to write a Linux driver for the routines, or having to handle the headaches of virtual memory just so that you can line up with a standard tool-chain (i.e. Linux), may be more work than a simple memory-mapped peripheral within an FPGA requires.
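
For anyone who hasn't built one, a memory-mapped peripheral really is this small.  Here's a hedged sketch--the bus signals and the two-register address map are assumptions for illustration, not any particular core's interface:

```verilog
// Hypothetical memory-mapped peripheral: LEDs and switches on a simple
// CPU bus.  The bus signals and address map are illustrative only.
module simple_gpio (
    input  wire       i_clk,
    input  wire       i_stb,     // access strobe from the CPU bus
    input  wire       i_we,      // write enable
    input  wire       i_addr,    // 0 = LED register, 1 = switch register
    input  wire [7:0] i_data,
    output reg  [7:0] o_data,
    output reg  [7:0] o_led,
    input  wire [7:0] i_sw
);
    always @(posedge i_clk) begin
        if (i_stb && i_we && (i_addr == 1'b0))
            o_led <= i_data;                        // CPU writes drive the LEDs
        o_data <= (i_addr == 1'b0) ? o_led : i_sw;  // reads return LED or switch state
    end
endmodule
```

Firmware (or a host over a debug bus) then reads and writes these registers as if they were memory locations; no ANSI-standard driver is involved.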

As for every new release breaking prior code, well, this is an industry problem that goes well beyond FPGA's.  Every time I upgrade my O/S I tend to break most of my applications.  It's an unfortunate reality.  I can keep this from happening, though, if I never upgrade my O/S.  ;)

I think the solution to upgrading tools breaking code is to separate the tools and the build chain from the code they produce.  For example, were you to use a ZipCPU, you would be welcome to download the version of binutils and gcc that goes with the CPU.  You'd then have the source, and the ability to maintain it.  Should you then upgrade Vivado, the ZipCPU would still work--because it's written in very portable Verilog.  Should you wish to upgrade the toolchain or the CPU itself (especially over any ISA break), well, then we're back to the struggles with upgrading the components of any system.  Still, unlike Vivado, you'd be able to go back and look at the code to find out what gets broken, why, and how--just like any other piece of open source software.

There's one more advantage to the soft-core CPU I'd like to mention: some peripherals have very complicated logic to set them up.  Often these peripherals can be made much simpler by sharing that logic between a soft-core CPU and the peripheral.  In this case, the PModCLP comes to mind, although it is only one example among *many*.  (An SD-card might make another good example.)  When I was last trying to build a controller for the 2-line display in Verilog, the controller was becoming quite complex and unmanageable.  However, if you adjust the controller's logic so that it gets shared between a soft-core CPU and the FPGA's HDL, the solution can be greatly simplified.  (I will entertain arguments that the problem was becoming unmanageable simply because of my inexperience at the time, as that was partly to blame.)
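
To illustrate the split, here's a sketch of what the HDL half can shrink to when the CPU owns the sequencing.  This assumes an HD44780-style parallel interface like the PModCLP's; the timing constants are illustrative guesses, not datasheet values.  The module only strobes one byte onto the display--deciding *which* bytes, in what order, with what delays, becomes a dozen lines of software instead of a large state machine:

```verilog
// Sketch of the "dumb" HDL half of a 2-line display controller.  Software
// sequences the init and data bytes; this module only handles the low-level
// enable-strobe timing of a single write.  Cycle counts are illustrative.
module lcd_write #(
    parameter SETUP = 10,   // clocks of data setup before E rises
    parameter EHIGH = 25,   // clocks E is held high
    parameter HOLD  = 10    // clocks of hold after E falls
) (
    input  wire       i_clk,
    input  wire       i_wr,        // pulse high one clock to start a write
    input  wire       i_rs,        // 0 = command, 1 = data
    input  wire [7:0] i_byte,
    output reg        o_busy  = 1'b0,
    output reg        o_lcd_e = 1'b0,
    output reg        o_lcd_rs,
    output reg  [7:0] o_lcd_d
);
    reg [7:0] count;

    always @(posedge i_clk) begin
        if (!o_busy && i_wr) begin
            o_busy   <= 1'b1;
            o_lcd_rs <= i_rs;
            o_lcd_d  <= i_byte;
            count    <= 8'd0;
        end else if (o_busy) begin
            count   <= count + 1'b1;
            // E is high only during the middle window of the cycle
            o_lcd_e <= (count >= SETUP) && (count < SETUP + EHIGH);
            if (count == SETUP + EHIGH + HOLD)
                o_busy <= 1'b0;
        end
    end
endmodule
```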

Dan


Gentlemen,

I don't have much to say in defense of soft CPUs, but I can see circumstances where they can be a very good choice--for example, for ASICs.

In my observation, the cost of labor for software development, especially on FPGAs, far exceeds the cost of hardware. One month of an experienced engineer's fully loaded salary might cost a company from $20k to $35k or even more. At current COTS hardware prices, any savings on hardware are only meaningful if mass production in very large quantities is the goal.

With this in mind, my choice is usually based not on hardware cost but on project specs and on the possibility of speedy development. I appreciate Zynq with ARM processors because of the large variety of libraries and solutions. It is relatively straightforward to implement I/O with the external world. I believe ARM will be supported for the foreseeable future, or at least long enough that I needn't worry.

With Zynq as my current choice, I haven't had a reason to use MicroBlaze.

Thank you for the good weekend discussion.


@D@n,

I won't argue the hammer-versus-sledge argument. It's been my experience that any soft CPU core is a sledge. As to Notarobot: I know his opinion. I respect the fact that his opinion makes perfect sense to him and doesn't track my experiences at all. No problem there. If @Notarobot chooses to weigh in on this, I'll appreciate whatever he has to say. I greatly prefer a plethora of perspectives over just one; and if they are at odds with mine, even better for me and any bystander. Anyone interested in this thread: bring it on! That's not meant to be confrontational; it's totally meant to be inclusive. I think that I know what a number of Digilent staff would say if they could speak freely; we've bantered the tool version issue before.

I don't understand your peripheral ANSI comment. My issues simply have to do with syntax and what the compiler thinks is an error. I used to read a monthly that had snippets from the people deciding what exactly the proper syntax for C is. I've wasted a lot of time with compilers that didn't adhere to any standard, and C isn't, in my experience, a cut-and-dried proposition, especially when it comes to micro-controller tools. My comment about standard C wasn't about peripherals at all.

I haven't used the ZipCPU, but I have used cores compatible with simple CPUs like those from Atmel, because I can use their tool chain to produce code. And, don't get me wrong, their tools can be a pain to use, but they are well supported and make less work for me. If I find a project where state machines or programmable controllers just aren't working, then I might give your ZipCPU a try. With no CPU there's no sharing of logic. Every time I create HDL for an interface, I've expanded my IP. That's the way that I want to work. I learn more, get better at complex logic, and if I have a similar interface, good commentary from old work helps frame the initial design approach. As you can guess, I like the low-level stuff better than the system design stuff. I understand where you are coming from, but I can't agree with your argument that "Often these peripherals can be made much simpler by sharing that logic between a soft-core CPU and the peripheral". If you design a nice stand-alone interface, you can use it anywhere. If you need to use a CPU to make it work, then what do you do when you don't have room for the CPU? What have you learned about basic logic design? Maybe that's why you like using a CPU... huh?

 


@Notarobot,

You got your post in while I was writing mine. Thanks for the view.

Embedded software design is very hard. It's hard to get it to produce correct results under all operating conditions, and it's hard to test for all of the operating conditions. You need code to test code. Even if your code were totally working, an alpha particle can come along and hit the CPU in the right place while it's executing the right subroutine (however rare that might be) and BAM! your CPU is accelerating the car instead of applying the brakes. (I don't want a drive-by-wire car... I want mechanical brake linkage.) I can't count the number of new languages in my 35+ years that were going to "fix" software development and have come and gone, while development is just more complex and difficult.

FPGA development is many times harder than software. There are way too many details to account for and way too many degrees of freedom for failure; the worst being "random", "intermittent" errors. Just getting the same results from the same source is a big issue. Some companies can't figure out how to use FPGAs. I know of large companies with dozens of FPGA engineers who have issues they can't resolve across their development teams. I know of companies that have abandoned FPGAs altogether. The tool-set issues are to blame. Poor planning and a lack of understanding of a good development process are to blame. The complexity is to blame. And yet I have worked for small companies who pumped out two new, very complex products a year, all having multiple FPGA devices, and were successful. This is a huge topic that has another spot on the forum.

I happen to like the ARM-based FPGA solutions too; that is, when an ARM is necessary. Sometimes this is the case. Ethernet stacks, USB stacks, visual user interfaces... I wouldn't want to do those without a well-supported ecosystem. Managing code that includes FPGA source and processes as well as software source and processes is more complicated than managing either alone.


@zygot,

Let me add another reason for wanting a CPU within a design: it's faster, cheaper, etc., to update the software within a design than it is to update the logic via the tools.  As a result, ad hoc tests and designs can be created and tried very quickly with a CPU--but not within the HDL.

Dan


@zygot,

I totally agree with you saying: "FPGA development is many times harder than software."

That is the point I wanted to make: leverage proven C-code solutions, and use HDL when it is necessary, when it gives you a substantial advantage in performance.

Nobody cares how you accomplished the task or how many problems you solved. In the end, the customers want it to work as they asked. It is also gratifying when you have made a smart, efficient system using the most efficient combination of tools.

Every task has many possible solutions. That is why I think putting emphasis on architecture is critical, and the Vivado block design puts it in your view and makes you analyze and question your design.


@Notarobot,

19 hours ago, Notarobot said:

One month of fully loaded salary of an experienced engineer might cost for a company from $20k to $35k or even more

Sorry, but I just won't be able to sleep until I comment on this. By "fully loaded" do you mean with bacon, and cheese, and broccoli, and guac and salsa? Sounds fattening. Or do you mean that the corporate controller has a drinking problem? Either you are doing something wrong, or, more likely, there is some magic accounting arithmetic involved. Since you mention COTS, I assume that you are a military contractor. I've done some work in that business, and most of them wouldn't spend a dime of their own money on productivity improvements even if it meant doubling net income. I'll bet your FPGA engineers would be surprised to know that they cost that much, you know, given their salaries and the tools that they have to work with. Generally the products that they make have no support for test and debugging, they don't want to pay to have product sitting around in the lab, and heaven forbid producing testable, debuggable versions of hardware that can't be sold to someone. So high development costs, and exorbitant costs for even simple upgrades, are built into the process. Of course, if the customer is willing to pay for that, why not?

As to your views on Vivado's point-and-click path to FPGA development, I think that you have a way too idealized view of how easy it really is and a way too rosy view of what the real costs are. But I respect your right to that view. It certainly helps sell larger and more expensive FPGA devices than necessary.


@D@n

15 hours ago, D@n said:

Let me add another reason for wanting a CPU within a design: it's faster, cheaper, etc., to update the software within a design than it is to update the logic via the tools.  As a result, ad hoc tests and designs can be created and tried very quickly with a CPU--but not within the HDL

This is a reason for (possibly, in some circumstances) including a CPU in a design during the development phase. But I'll argue that with some simple debug interfaces, using the PC as the CPU, I can do better... and have less of an impact on the actual design, support any kind of design more easily, and have no custom CPU code to deal with. This is a feature of most of the Digilent Project Vault projects that I've submitted. I have offered the one case in my experience where a soft CPU core was the correct choice... for a LAB environment. If the Zynq had been available when I was working on that project, I assuredly would have chosen that instead of NIOS (or, most likely, an Altera FPGA with an ARM, as Xilinx wasn't going to be a vendor for that project regardless of my input...). NIOS and MicroBlaze are good for entrapment, not good FPGA development. Third-party soft CPU cores are once in a blue moon a good choice.

Most products can't have post-configuration programmability, for a wide variety of reasons. If you are not creating a product, and your design only uses up 10% of an FPGA, and you want to add software maintenance and development, and your interfaces don't work without a soft CPU, and you just really like the idea of having a CPU in your FPGA, then you may have a point. It doesn't match my experience at all, except for one case. I'd be happy to hear about the projects that you've done that just can't be done without a CPU, for a client or for your own projects. I have nothing to offer in that regard, aside from the ones that need a full Ethernet or USB stack or heavy display presentation... and then I'd use a Zynq-based platform.

Except for playing-around projects, there just aren't many applications for the kind of situation that the above quote calls for. I'd much rather have a focused, fixed FPGA design with USB or very limited Ethernet connectivity that does some sort of processing internally and sends data to, and receives commands from, a PC. That way any analysis or control can be done using Python, C, or Ada, in my framework of choice if necessary, on hardware designed to do such work. Using an FPGA as a PC replacement just doesn't make sense for any project that I've wanted to do. I can have as much post-configuration programmability as I want with a USB UART for 98% of my designs. If there's a lot of data involved, then I'll just use the USB interface or, if the board has an FMC connector, USB 3.0 as the control and data pipe. Everything else will be a PC application. The best interface for this kind of work flow is the PCIe interface using the KC705 sitting in my computer.
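
For the curious, the "PC as the CPU" approach needs surprisingly little logic.  Here's a hedged sketch of a UART command bridge; the one-byte opcode protocol and the surrounding uart_rx/uart_tx modules are inventions for illustration, not from any of my Project Vault submissions:

```verilog
// Hypothetical UART debug bridge: lets host software peek and poke design
// registers.  Opcode byte ("W" or "R"), then an address byte, then (for
// writes) a data byte.  Assumes separate UART rx/tx modules exist.
module uart_dbg_bridge (
    input  wire       i_clk,
    input  wire       i_rx_valid,         // from an assumed UART receiver
    input  wire [7:0] i_rx_data,
    output reg        o_tx_valid = 1'b0,  // to an assumed UART transmitter
    output reg  [7:0] o_tx_data,
    output reg        o_we = 1'b0,        // internal register write strobe
    output reg  [7:0] o_addr,
    output reg  [7:0] o_wdata,
    input  wire [7:0] i_rdata             // assumed combinational readback
);
    localparam IDLE = 0, GETADDR = 1, GETDATA = 2, SEND = 3;
    reg [1:0] state    = IDLE;
    reg       is_write = 1'b0;

    always @(posedge i_clk) begin
        o_we       <= 1'b0;
        o_tx_valid <= 1'b0;
        if (state == SEND) begin          // o_addr settled on the prior clock
            o_tx_data  <= i_rdata;
            o_tx_valid <= 1'b1;
            state      <= IDLE;
        end else if (i_rx_valid) case (state)
            IDLE: begin                   // opcode: "W" = write, "R" = read
                is_write <= (i_rx_data == "W");
                if (i_rx_data == "W" || i_rx_data == "R")
                    state <= GETADDR;
            end
            GETADDR: begin
                o_addr <= i_rx_data;
                state  <= is_write ? GETDATA : SEND;
            end
            GETDATA: begin
                o_wdata <= i_rx_data;
                o_we    <= 1'b1;          // one-clock write strobe
                state   <= IDLE;
            end
            default: state <= IDLE;
        endcase
    end
endmodule
```

A Python script on the PC then pokes and peeks registers over the serial port, which is all the post-configuration programmability most of my designs ever need.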

 


@zygot,

Let me clarify my postings.

1. The term COTS is widely used in commercial industries as well. Most control-system integrators use COTS: PLCs, various measurement devices, and subsystems. There are thousands of such companies doing nothing but system design and integration using commercial off-the-shelf components.

2. If you look at the annual salary range in the US for an EE with FPGA skills on any job-seeking site, you might find that the range is between $75k and $140k, depending on location and expertise level. This translates into about $6.5k to $11.5k monthly. Now we should add benefits (medical and 401k add-ons) and multiply by the overhead expense coefficient, which ranges from 1.6 to almost 3 in some businesses. With all variations of benefits and overhead calculations, the total expense for such an engineer will easily reach the range I estimated above. All this expense should be charged to the project in order for the company to be profitable. Obviously, the customer should allow such expenses when signing the contract.

I think we all know how squeezed budgets are these days. Technical progress created the customer perception that advanced development can be done sooner and for less money than similar work previously. Under these conditions it is more efficient to get more expensive hardware if it allows for cutting the development time. The same is true regarding the software. BTW, as far as I know, expense on software is always an overhead. Customers might allow leasing software, but not buying it, on project funding.

I would be happy to hear what is wrong with my assessment.

Thank you for useful discussion.


@Notarobot

Item 1: No argument from me.

Item 2: Here's where we argue a bit. When I am negotiating a wage for an employee position, I'm told that XX is the maximum hourly wage for someone in my "classification" and that benefits are 30-40% of base salary. If I am negotiating work as a contractor, I'm told that YY is the maximum wage for a full-time employee and benefits are 10% of base salary. When the company that I work for is no longer in the design phase and I'm unemployed, I know exactly what the value of those benefits is when I use COBRA. As long as I've been an engineer, the 401K benefits have been a hoax, as it's rare that an engineer lasts long enough in one company to get vested in the company contributions. When companies offered pensions (before I started out), engineers nearing vestment status always got canned just prior to becoming eligible. In my experience, everything that I'm told is a tactic to get the cheapest labor possible. I can't think of anything that I've been told "this is company policy" prior to becoming an employee that has ended up being true. There are no hard rules.

If I get the job, I find that someone out of school and having no usable experience gets paid more than I do, and sometimes has more status, because in companies salary is the measure of status and credibility, not experience, or talent, or skills, or being technically correct; and I end up training that person. The classification scales are just a way to lower wages (unless unions are involved). Engineers are listed as exempt 40-hour/week employees. They are required to work 60-70 hour weeks, because that reduces the labor costs by 40%. It's all lies, and it's all about how you want to do your accounting. I've run my own business. I understand capital expenditure and depreciation. This game can be played any way you want to. I've yet to meet an employer who wants to play the game to the employee's advantage.

I've never worked at a place where an engineer doing the technical work gets more than $100K. Your numbers of $300-360K a year are a stretch even if we are talking about an occasional outlier. As an average it's unbelievable. I'm happy to do your engineering labor for $80/hr plus expenses... and it won't cost you anywhere near $300K/year; unless your accounting team finds a way to turn that expense into a profit.

Yes, we do agree that budgets are squeezed. And schedules unrealistic. And duress caused by monopolies and bad government policies. And other forces that will harm everyone in the long term. There are a lot of reasons for this. Mostly, though, most companies these days are pyramid schemes that benefit a shrinking class of elite. This is not the world that existed when I was a lad. But then, public companies weren't the playthings of stock brokers manipulating an imaginary pool of cash, trying to keep the biggest pyramid scheme of all time afloat.


@zygot

I agree with the picture you described. There are no illusions here.

However, I have had better luck with companies, and believe me, there are big and small companies paying more than $100k. Check for yourself, for example on indeed.com, and you will see what a new hire can get. It varies by state and location. To my knowledge, CA, northern VA, and MA offer pretty high-paying positions. In my opinion the best overall job is in R&D.

Have a good night!


@Notarobot,

Yeah... and we kind of got off the subject. Neither of us wants to get me going on the glamorous life of an electronics design engineer, especially one with a lot of experience. But the whole cost-of-development idea is valid and germane to the discussion thread.

So, does anyone want to make a case for the project that they've done that absolutely had to have a soft CPU core? I know that there are some. I'm just arguing the position that they are the unusual case and not the norm.


@zygot,

At issue is not whether or not a project needs a CPU soft-core, but first whether or not a project needs a CPU.  If you need a CPU, you can either use an external one--such as another chip on your board--or you can create one within the FPGA.  I will argue that from a BOM and layout perspective, placing one chip on a board is cheaper than placing two chips on a board.  Hence, there's value in a soft-core CPU.  If the soft-core CPU, on the other hand, forces you to need a more expensive chip, then the benefit of the soft-core CPU isn't nearly as clear.

You've done many designs so far with both a CPU and an FPGA on the same board. Let's just let these be your examples.  They reference designs that need both a CPU and an FPGA.  There's a time and place for both.  I get that.

In your case, the CPU was a separate chip on your BOM.  There was a cost to purchasing that extra chip, and to integrating it onto the board with everything else.  Consider the benefits if you could fit this within the FPGA, now, in terms of simplified circuit design.

Just something to think about.

One reason not to use a soft-core CPU, from a project I was involved in some time ago, was that the project needed to have an extremely low power mode.  We used the external CPU (an MSP430) to cut the power to the FPGA.  That's one CPU capability that cannot be done within an FPGA.  ;)

So let me reiterate my list of reasons for having a soft-core CPU within an FPGA, and let me add one more reason as well:

  1. A soft-core CPU just makes sense for some logic.  It can be the most appropriate tool for the job.
  2. It makes sense when trying to teach students how CPUs work--but only if all of the internals of the CPU are available for inspection.
  3. A CPU allows you to share logic resources (the CPU) between multiple applications within the FPGA.  (I.e., the CPU can do one task at one time, and another task at another time, without needing any more resources than the single CPU needed in the first place.)
  4. A CPU is the most generic state machine.
  5. It's faster/easier/cheaper to change the software on a soft-core CPU, and hence what the FPGA does in general, than it is to build a modified FPGA design.  (2 minutes vs. 10+ minutes.)  Hence, development time is faster.
  6. If you need a CPU + FPGA solution, getting both on the same chip can save both design and BOM costs.
  7. A soft-core CPU integrated into the internals of an FPGA can communicate information with the FPGA at a much higher speed (over whatever bus the CPU is connected to--AXI or Wishbone) than the FPGA can communicate with an off-chip processor.

Dan


@D@n,

10 hours ago, D@n said:

At issue is not whether or not a project needs a CPU soft-core, but first whether or not a project needs a CPU.

Uh... NO. That's not the topic at hand. Programmable logic has had a supporting role in products with a microprocessor or micro-controller since before the first FPGA device appeared. I know, because my career spans a period that predates the first programmable PAL from AMD. It almost spans the life of the microprocessor. No one wants to debate the topic of whether or not a CPU is necessary for a project or product (of course there are projects and products that need one). No one wants to debate the topic of whether or not a project with a CPU needs programmable logic (it doesn't). No one wants to debate whether or not having an FPGA with an embedded hard ARM processor is a great idea (I started out the thread stating that it is).

This topic was created as a result of another thread in which I stated that FPGA projects don't need a soft CPU core, and that it is a disservice to students and beginners to give the impression that you can't develop without one. Specifically, I chastised Digilent for selling FPGA boards with interfaces and peripherals that have no design support except through a MicroBlaze and the block diagram flow. All of their board demos use a MicroBlaze and the block diagram flow. Worse yet, their more expensive boards require using non-free IP and a temporary license to see a few of these interfaces work. That's the topic. But I applaud the attempt at reworking the topic to avoid the fact that you can't defend your position. Have you ever thought of being a lawyer, or applying for a job as the President's spokesperson?

This thread is, indirectly, about whether or not Digilent should be helping customers new to FPGAs with learning HDLs and FPGA development processes, or selling a concept and design flow that is not the norm or used in industry. It is, indirectly, about whether an FPGA board vendor should support its products so that customers have the freedom to take advantage of all of their resources without having to be tied to one design flow that is not necessary.

So, don't tell me about designs or projects that need a CPU. Don't change the topic. Let's discuss an FPGA project that you've done that required a soft CPU core to function. One that isn't for educational purposes, because we agree on that use case.

As to your revised list of assertions:

10 hours ago, D@n said:
  • A soft-core CPU just makes sense for some logic.  It can be the most appropriate tool for the job.

Sure, we agree that it might make sense for some logic. We just haven't discussed what logic you have in mind. So provide the case that illustrates your assertion.

11 hours ago, D@n said:

A CPU allows you to share logic resources (the CPU) between multiple applications within the FPGA.  (I.e., the CPU can do one task at one time, and another task at another time, without needing any more resources than the single CPU needed in the first place.)

This is just plain nonsense. Simple HDL constructs provide the greatest flexibility for processing tasks in parallel and sharing logic resources. A CPU is a bottleneck that executes instructions sequentially. It requires more effort to develop and maintain a whole new level of complexity, which is the code and the code development tools and maintenance. Simple HDL constructs allow you to share logic resources without any of this. A CPU requires significant amounts of resources and will have a negative impact on place and route and timing. You can use a soft CPU as a crutch, but it is not a basic and necessary construct for logic design.
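
To show what I mean by sharing logic without a CPU: here's a minimal sketch of one multiplier time-shared between two requesters with a simple alternating grant. The port names and widths are illustrative; the point is that a few lines of arbitration, not an instruction stream, do the sharing:

```verilog
// Sketch: one multiplier shared by two request streams.  The operands are
// muxed into a single multiply, so only one multiplier is inferred.
module shared_mult (
    input  wire        i_clk,
    input  wire        i_req_a, i_req_b,
    input  wire [15:0] i_a0, i_a1,   // operands from requester A
    input  wire [15:0] i_b0, i_b1,   // operands from requester B
    output reg         o_valid_a = 1'b0,
    output reg         o_valid_b = 1'b0,
    output reg  [31:0] o_product
);
    reg grant_b = 1'b0;   // alternate priority between the two requesters
    // Pick this cycle's winner, then mux its operands into ONE multiplier
    wire        pick_a = i_req_a && !(i_req_b && grant_b);
    wire [15:0] op0    = pick_a ? i_a0 : i_b0;
    wire [15:0] op1    = pick_a ? i_a1 : i_b1;

    always @(posedge i_clk) begin
        o_valid_a <= 1'b0;
        o_valid_b <= 1'b0;
        if (i_req_a || i_req_b) begin
            o_product <= op0 * op1;   // the single shared multiplier
            o_valid_a <= pick_a;
            o_valid_b <= !pick_a;
            grant_b   <= pick_a;      // the loser gets priority next time
        end
    end
endmodule
```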

11 hours ago, D@n said:

A CPU is the most generic state machine.

Nonsense. A state machine is the most generic state machine. A CPU is logic that has an instruction set for doing operations not normally performed by a state machine, and has a programmable control flow. A CPU can do control. A state machine can do control. Of course there are times when a CPU is useful. So make the case where your state machine has to be replaced with a CPU. But don't slip in a case where a CPU is necessary... that's another topic. Talk about a case where you can't design a state machine because the "CPU is more generic". The only requirement demanding the use of a soft CPU core is programmability. I can do mathematical algorithms without programmability or a CPU. I can do DSP processing without programmability or a CPU. It's rare that I need programmability, and if I do, I'll design a programmable state machine. A CPU is just overkill and unnecessary functionality; unless of course you need to use a CPU in your project. A CPU is not a generic state machine. But projects that require programmability are not what this topic is about.

11 hours ago, D@n said:

It's faster/easier/cheaper to change the software on a soft-core CPU, and hence what the FPGA does in general, than it is to build a modified FPGA design.  (2 minutes vs. 10+ minutes.)  Hence, development time is faster.

It's faster and cheaper to not have a CPU and not have software to deal with. It's better to learn how to do logic design properly so that you don't waste FPGA resources when you don't have to. Yes, one time in my experience I needed a CPU in my FPGA. That's because the team decided to use FPGA devices for all processing and not use a microprocessor. Again, this topic isn't about projects that require a microprocessor. Yes, it's usually quicker to modify and debug software than logic. But it's quicker to design correct logic without a CPU. It's better to develop the skills of logic design without a post-configuration-programmable crutch. In the real world, no one builds a product with nebulous functionality. If you want to work as an FPGA designer, you had better know how to cram as much functionality into the smallest, cheapest FPGA possible. And since your product is not likely to have field programmability available, I'll argue that I can get along just fine doing FPGA development without a soft CPU core.

11 hours ago, D@n said:

A soft-core CPU integrated into the internals of an FPGA can communicate information with the FPGA at a much higher speed (over whatever bus the CPU is connected to--AXI or Wishbone) than the FPGA can communicate with an off-chip processor

Rubbish. Publish the project of your choice demonstrating that your soft CPU core of choice transfers data at a higher speed than 8 lanes of PCIe, or even a single 5 Gbps transceiver. I've used both and never needed a soft CPU core to use either. Data transfer is limited by the interface, not the internal logic in an FPGA (unless you are doing something wrong...). Using a soft CPU core has nothing to do with transferring data from an FPGA to anything. Your CPU is a bottleneck if it's involved.

 


@zygot,

Sorry, I guess I hadn't realized that I had gotten off topic.

So let's assume then that your design needs both a CPU and an FPGA.  At that point the question becomes which is simpler to build, and which has a lower price.

But this was not where you took the discussion, so let me try to follow you.

You would like to change this discussion from whether or not soft-core CPUs are useful, to whether or not every one of Digilent's example designs should be using one.  To that question, I think my blog speaks for itself: I'm presenting discussions about how to program FPGAs in HDL (Verilog specifically), without using any of the vendor gimmicks to do so.  Further, because I like to use Verilator to test my designs, I am in many ways restricted from ever using proprietary IP (or, at least, I haven't figured out how to combine proprietary IP into a Verilator simulation ...).

For this reason, the example designs I've discussed on this forum and within my blog are pure Verilog creations.  I've shared designs showing how to handle block RAM, QSPI flash interactions, how to communicate with an SD-card, how to use a PModAMP2, the PModGPS, the PMod keypad, USBUART, OLEDrgb, the two-line PModCLS LCD driver, and more.  The more complicated designs include a soft-core CPU, the ZipCPU--which you may, or may not, include at your choice.  I've also got a demo design showing how to use the DDR3 memory on the Arty without using any more Xilinx IP beyond the MIG-generated core.  Things I'd like to present include a demonstration of how to generate a VGA signal, an HDMI signal, and even an open-source-focused method for adding and removing items from a design.  Upcoming blog posts will discuss things like how to use a CORDIC to generate sine and cosine waves, as well as how to build and test digital filters.  I'm still hoping to handle the differential PMod challenge, but just haven't gotten that far yet.  Posting on that topic will require discussing how to generate an arbitrary-frequency clock from an FPGA design, how to handle clock transfers, gear boxes, synchronization(s), and more.

The blog also has a particular emphasis on how to find errors within a design--something that is very difficult to do with Xilinx's pre-canned IP.

This has been my response to Digilent's focus: presenting the focus that I think would be more valuable.  If people like my approach better, then bonus.  If not ... then these ideas will get voted down.  In both cases, individuals can vote with their feet (er, mouse) as to which approach they like better.  That said, the blog did get over 13k hits just this last week.

Of course ... I'm not getting paid (much) to do this, so if you want to support me, then please feel free to become one of my Patreon sponsors ;)

Dan


13 hours ago, D@n said:

You've done many designs so far with both a CPU and an FPGA on the same board. Let's just let these be your examples.  They reference designs that need both a CPU and an FPGA.  There's a time and place for both.  I get that.

@D@n,

Forgot to reply to this. Yes, I have. But most of my FPGA work does not have an FPGA and a CPU, or an FPGA and a DSP, or an FPGA and any programmable controller. They were for hardware targets with just an FPGA. I don't think you realize how superfluous a CPU device is for most applications. Yes, for a lot of my work the system has a computing device somewhere. But the board with the FPGA doesn't. The functionality of the FPGA is used to support the computing device... not the other way around. This is my perspective. Logic, whether in the form of RTL (that is, resistor-transistor logic), MSI packages, or 1600-ball FPGA devices, is generally a supporting element that does what a general-purpose, fixed-function micro-controller can't do well.

So I'll pose another assertion: almost anything that you can do with a CPU and ROM, I can do with state machines using fewer resources, with less superfluous functionality, with better timing and fewer complications for optimal place and route of the design, and without the hassle of software.


@zygot

Ok, I'll bite on this one.

4 minutes ago, zygot said:

So I'll pose another assertion: almost anything that you can do with a CPU and ROM, I can do with state machines using fewer resources, with less superfluous functionality, with better timing and fewer complications for optimal place and route of the design, and without the hassle of software.

Yes, agreed.

However, there's one thing a state machine can't do: reconfigure itself for another problem.

Let me explain: many peripherals these days require more and more software-type interactions to get them started up.  Examples would be DDR3 memories, that OLEDrgb, the PMod CLP, and more.  Even the full interaction with a flash memory can generate a *very* complicated state machine.  While all of this logic could be accomplished via a state machine rather than inside a CPU, doing so will require a different state machine for each peripheral.  As an alternative, a soft-core CPU could handle *all* of these tasks, one after another, or even all three interleaved with each other, and so the state machines would no longer be required.  This was what I meant when I said earlier that a CPU is the most general form of a state machine.
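
Much of that start-up interaction is really just a list of "send this command, then wait this long" steps.  Here's a minimal sketch of a table-driven sequencer--the entry format and the ROM contents are illustrative, loosely modeled on an HD44780-style init--to show the shape of the thing that either a tiny FSM or a soft-core CPU ends up walking:

```verilog
// Minimal table-driven start-up sequencer: walks a ROM of {delay, command}
// entries, issuing each command and then waiting out its delay.
// Entry format and values are illustrative only.
module init_sequencer (
    input  wire       i_clk,
    output reg        o_cmd_valid = 1'b0,
    output reg  [7:0] o_cmd,
    output reg        o_done = 1'b0
);
    reg [3:0]  pc    = 4'd0;
    reg [15:0] delay = 16'd0;
    reg [23:0] rom [0:15];    // [23:8] = clocks to wait after the command,
                              // [7:0]  = command byte; a delay of 0 ends it

    initial begin             // illustrative HD44780-ish values only
        rom[0] = {16'd50000, 8'h38};  // function set, then wait
        rom[1] = {16'd2000,  8'h0C};  // display on, then wait
        rom[2] = {16'd2000,  8'h01};  // clear display, then wait
        rom[3] = {16'd0,     8'h00};  // terminator
    end

    always @(posedge i_clk) begin
        o_cmd_valid <= 1'b0;
        if (!o_done) begin
            if (delay != 0)
                delay <= delay - 1'b1;      // waiting out the last command
            else if (rom[pc][23:8] == 0)
                o_done <= 1'b1;             // end of script
            else begin
                o_cmd       <= rom[pc][7:0];  // issue the next command...
                o_cmd_valid <= 1'b1;
                delay       <= rom[pc][23:8]; // ...then wait its delay
                pc          <= pc + 1'b1;
            end
        end
    end
endmodule
```

The argument for the CPU is that one instruction stream can walk the DDR3 table, the OLEDrgb table, and the flash table, where pure logic needs a sequencer (however small) per peripheral.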

Dan


@D@n

5 minutes ago, D@n said:

So let's assume then that your design needs both a CPU and an FPGA. 

Let's not make this assumption as this is a different topic than the one that I posted.

6 minutes ago, D@n said:

You would like to change this discussion from whether or not soft-core CPU's are useful, to whether or not every one of Digilent's example designs should be using one

No, I would like to keep this particular thread on topic as to whether or not a soft CPU is necessary for doing FPGA development. The issue of Digilent's support of their products, and an assertion that Spartan 3 devices are ancient and somehow not worth using, prompted my assertion that they are not. It is a related and relevant side topic.

I appreciate your desire to turn this into an opportunity for self-promotion. May I suggest that your efforts at that would be better served by providing real examples that support your case? It's easy to state a position and avoid defending it. It's a bit riskier to publish actual examples open to inspection and debate. That's what I'm asking for: supporting examples of actual work illustrating why basic HDL constructs are inadequate for general FPGA development. A CPU is not a basic HDL construct but a complex design element useful for particular applications.


2 minutes ago, D@n said:

However, there's one thing a state machine can't do: reconfigure itself for another problem.

I probably wouldn't use a simple state machine to address that problem. I might design a programmable state machine without the extraneous logic of a CPU. I've actually been working on a demonstration of a programmable state machine for the Project Vault, so I've been thinking about this. The biggest issue is coming up with example applications for it that can't be done with standard state machines. I might add a UART interface to my design to change the functionality of a state machine post-configuration. I might even use partial reconfiguration (most FPGA vendors have this capability) to alter the functionality of an FPGA on the fly while the system is operating. I'd only use a CPU if that was absolutely necessary. You haven't come up with a single use case that requires any of those. I'm talking about real-world stand-alone applications that use an FPGA.
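
For readers wondering what a "programmable state machine" looks like without a CPU's instruction set: one common shape is a transition table held in RAM rather than fixed logic.  This sketch is my own illustration, not the Project Vault demonstration mentioned above; the table layout and widths are arbitrary assumptions.  A UART bridge (or any host interface) can rewrite the table post-configuration, changing the machine's behavior without a rebuild:

```verilog
// Sketch of a programmable state machine: the transition table lives in a
// small RAM, so behavior can change post-configuration without a rebuild.
// Table layout and widths are illustrative assumptions.
module prog_fsm (
    input  wire       i_clk,
    // Table write port: could hang off a UART bridge or any host interface
    input  wire       i_tbl_we,
    input  wire [5:0] i_tbl_addr,   // {state[3:0], input[1:0]}
    input  wire [7:0] i_tbl_data,   // {next_state[3:0], output[3:0]}
    // The machine itself
    input  wire [1:0] i_in,
    output reg  [3:0] o_out
);
    reg [7:0] tbl [0:63];      // the writable control store
    reg [3:0] state = 4'd0;

    always @(posedge i_clk) begin
        if (i_tbl_we)
            tbl[i_tbl_addr] <= i_tbl_data;        // rewrite one transition
        else                                      // (machine pauses while
            {state, o_out} <= tbl[{state, i_in}]; //  the table is written)
    end
endmodule
```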


@zygot,

A specific example?

How about a QSPI flash controller?  You can see the state machine I built for controlling a fairly generic QSPI flash here.  The controller is required to be able to both read and write (erase+program) the flash device.  Further, it is also required to be able to read the ID and the status/control/configuration words of the device.  (Reading the ID helps to prove the device works; you are likely familiar with the others--for those who are not, feel free to look up a QSPI flash data sheet or just ask.)

A much simpler QSPI flash controller, one that only provides read access to the device, can be found here.  I needed the simpler controller in order to fit on the Spartan 6/LX4 found on the CMod-S6.  (Incidentally, I'd recommend this flash approach to anyone struggling with a small design, such as one on a Spartan 3.)

There are two stressing use cases associated with the controller: rewriting the .bit file onto the flash (requires lots of reading and verifying), and reading values from the flash when running.  Both of these need to be done in as short a time as possible (a 5-minute flash rewrite can be annoying), subject to the requirement of using only a minimum number of resources on the device.

Does this operation require a CPU?  No.  RTL code was presented above that doesn't require a CPU, and I regularly use the controller without the CPU.

Could the controller be built simpler by using a CPU to handle the not-so-standard interactions (get device ID, erase, etc)?  I think so.
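
For readers who want the flavor without opening the repository, below is a heavily simplified, single-bit-SPI sketch of the read-only idea (my linked controllers are QSPI and do far more).  It issues the standard 0x03 READ command plus a 24-bit address and shifts in a single byte; the pin names and the half-rate clocking are illustrative choices:

```verilog
// Heavily simplified SPI (mode 0) flash read: 0x03 command + 24-bit address
// out, one data byte in.  SCK runs at half the system clock.
module spi_flash_read (
    input  wire        i_clk,
    input  wire        i_start,
    input  wire [23:0] i_addr,
    output reg         o_busy  = 1'b0,
    output reg  [7:0]  o_byte,
    output reg         o_valid = 1'b0,
    // SPI pins
    output reg         o_cs_n  = 1'b1,
    output reg         o_sck   = 1'b0,
    output reg         o_mosi,
    input  wire        i_miso
);
    reg [31:0] shreg;   // 8-bit command + 24-bit address, MSB first
    reg [5:0]  bits;    // SCK rising edges remaining

    always @(posedge i_clk) begin
        o_valid <= 1'b0;
        if (!o_busy && i_start) begin
            o_busy <= 1'b1;
            o_cs_n <= 1'b0;
            o_sck  <= 1'b0;
            o_mosi <= 1'b0;                   // MSB of the 8'h03 READ command
            shreg  <= {8'h03, i_addr} << 1;   // the remaining 31 output bits
            bits   <= 6'd40;                  // 32 bits out, then 8 bits in
        end else if (o_busy) begin
            o_sck <= !o_sck;
            if (!o_sck) begin                 // about to drive a rising edge:
                o_byte <= {o_byte[6:0], i_miso}; // sample MISO (last 8 matter)
                bits   <= bits - 1'b1;
                if (bits == 6'd1) begin       // that was the final bit
                    o_busy  <= 1'b0;
                    o_cs_n  <= 1'b1;
                    o_sck   <= 1'b0;
                    o_valid <= 1'b1;          // o_byte now holds the data
                end
            end else begin                    // about to drive a falling edge:
                o_mosi <= shreg[31];          // present the next output bit
                shreg  <= shreg << 1;
            end
        end
    end
endmodule
```

Everything beyond the READ path--ID, status words, erase, program--is where the state machine explodes, and where letting software sequence the odd commands starts to pay off.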

Dan


@D@n,

19 minutes ago, D@n said:

Let me explain: many peripherals these days are requiring more and more software-types of interactions to get them started up.  Examples would be DDR3 memories, that OLEDrgb, the PMod CLP, and more

Don't know how I forgot to address this. None of the things you refer to requires software. I've used every peripheral that you can think of without a CPU or software. The SDRAM and DDR happen to be a sore spot with me, as Xilinx and Altera make it very difficult to design with these types of memory without using their soft CPU core or vendor-specific flow. When an FPGA board vendor has one of these on their board, they should support it without requiring that design flow.


@zygot,

Not all CPUs are "complex design elements".  Consider this one as an example: it's based upon the generic Forth-based stack computer design, and is much simpler than even my favorite ZipCPU.  Simple commercial 8-bit designs exist as well--such as PicoBlaze, for an example.  These are simpler than a MicroBlaze, and might be better suited to the Spartan 3.

As for the demonstration you mentioned: I'd love to see it, and look forward to the time when you might share it.

Dan


