
Behavioral simulation ignores integer ranges.


FlyingBlindOnARocketCycle

Question

On a simple design that counts button presses up or down on a Basys3, I have the counter signal declared as an integer range 0 to 7. When implemented in hardware, it performs as expected: when counting past either end of the range, the 7-segment display wraps from 7 back to 0, or from 0 up to 7.

When running this design in the behavioral simulation, however, the range is ignored. Counting down from 0 does not show a value of 7 but a 32-bit vector of all 1's (negative 1).
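For reference, the counter looks something like this (a trimmed-down sketch; the entity and signal names here are illustrative, not my exact code):

library IEEE;
use IEEE.STD_LOGIC_1164.ALL;

entity press_counter is
    port ( clk      : in  STD_LOGIC;
           btn_up   : in  STD_LOGIC;   -- debounced, one pulse per press
           btn_down : in  STD_LOGIC;
           count    : out integer range 0 to 7 );
end press_counter;

architecture Behavioral of press_counter is
    signal count_i : integer range 0 to 7 := 0;
begin
    count <= count_i;

    process(clk)
    begin
        if rising_edge(clk) then
            if btn_up = '1' then
                count_i <= count_i + 1;   -- hardware wraps 7 -> 0
            elsif btn_down = '1' then
                count_i <= count_i - 1;   -- hardware wraps 0 -> 7; simulation shows -1
            end if;
        end if;
    end process;
end Behavioral;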

Thoughts?


2 answers to this question


I believe you can set a simulation flag to enforce range bounds checking, and it will error if you end up with an out-of-range value. It is off by default.
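If that option is off, you can also catch out-of-range values yourself with a testbench assertion. A minimal sketch, assuming the counter is visible to the testbench as an integer signal named count:

-- Testbench-only watchdog: a manual substitute for the simulator's
-- range check (names are illustrative).
process(count)
begin
    assert (count >= 0) and (count <= 7)
        report "count out of range: " & integer'image(count)
        severity error;
end process;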

You got lucky with your range testing on hardware. If you had used a range from 0 to 5, it would most likely still have counted through 6 and 7. The synthesis tool has this sort of internal dialogue:

- Hmm, this is an integer that will be used with values between 0 and 7, as that is what the HDL tells me.

- I could replace it with an UNSIGNED that can hold values between 0 and 7, as it only needs non-negative numbers.

- To hold the numbers between 0 and 7 I only need 3 bits.

- Great! I will implement this signal as a 3-bit unsigned binary value.

- Because this is a 3-bit value I can update it with a 3-bit adder.

...

It doesn't statically inspect the rest of the code to verify that you will only use numbers in the stated range. Nor does it add any extra logic to enforce the range (e.g. clamping or wrapping of results); you have to ensure that your design stays within the stated range.
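For example, if you want the wrap-around to be part of the design rather than an accident of the chosen bit width, you have to describe it explicitly. A minimal sketch, reusing the illustrative names from the counter above:

process(clk)
begin
    if rising_edge(clk) then
        if btn_up = '1' then
            if count_i = 7 then
                count_i <= 0;             -- wrap is now stated, not inferred
            else
                count_i <= count_i + 1;
            end if;
        elsif btn_down = '1' then
            if count_i = 0 then
                count_i <= 7;             -- wrap is now stated, not inferred
            else
                count_i <= count_i - 1;
            end if;
        end if;
    end if;
end process;

With this version the signal never leaves its declared range, so simulation and hardware agree even with range checking enabled.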

For this reason I prefer to use UNSIGNED(n downto 0), so I know exactly what the tools will produce. I am sure others will disagree and prefer the higher level of abstraction, and range checking is very useful in simulation (as long as the "range check" option is turned on!).
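As a sketch of that style (assuming IEEE.NUMERIC_STD and the same illustrative button inputs as above):

use IEEE.NUMERIC_STD.ALL;   -- defines unsigned and its modular arithmetic

signal count_u : unsigned(2 downto 0) := (others => '0');

process(clk)
begin
    if rising_edge(clk) then
        if btn_up = '1' then
            count_u <= count_u + 1;   -- 7 -> 0 in hardware AND in simulation
        elsif btn_down = '1' then
            count_u <= count_u - 1;   -- 0 -> 7 in hardware AND in simulation
        end if;
    end if;
end process;

numeric_std defines unsigned arithmetic to wrap modulo 2**3 for a 3-bit signal, so what you simulate is what you get.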

... diversion...

An important case where this matters is the inference of memory blocks. If you define a memory block with 768 entries, you might end up with a physical memory block that has 1024 entries, 256 of which go unused. Or it might be built from three blocks of 256 entries each, plus the logic required to make them behave as a single memory block.
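As a sketch of the kind of description that triggers this (names are illustrative):

type ram_t is array (0 to 767) of std_logic_vector(7 downto 0);
signal ram     : ram_t;
signal wr_addr : integer range 0 to 767;   -- needs 10 bits, which can encode 0 to 1023

process(clk)
begin
    if rising_edge(clk) then
        if we = '1' then
            ram(wr_addr) <= din;   -- nothing here rejects addresses 768 to 1023
        end if;
    end if;
end process;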

So why is this important?

Well, if you set the write address to an invalid index such as 1023 or 768 (only 0 to 767 are valid), you might find that you corrupt entry 511 or 256 by accident. Or maybe you won't today, depending on what the synthesis tools felt like doing with the address decode logic during the last build.

The tools are designed to take as many shortcuts as possible to give you exactly what you asked for, no more and no less, with the most optimized use of FPGA resources. Don't be surprised if unexpected inputs give unexpected outputs ;)


Hamster for the win!  .... again!

When I picked the range 0 to 7 for my little test, I did so because I believed the tools would set the binary digit limit required by the range. A 0 to 7 range rolls over on its own, but now I see that your suggested range of 0 to 5 would also have verified the behavior (it would still count through 6 and 7) and shown more clearly what the tools are doing. Like you pointed out, nothing enforces the range itself; what does get enforced is the bit-width limit dictated by the range. You would think that sort of check would be on by default in simulation.

Your answer was very educational. 

Thanks.


