kc5tja

Members
  • Content Count: 13
  • Joined
  • Last visited

About kc5tja
  • Rank: Member


  1. The bill came due because of laziness. Speculation which respects permission boundaries would have been perfectly fine; it is the fact that CPUs speculate without respect for permissions that led directly to Spectre (at least the variant that lets you read into kernel memory). That said, my plan is not to go hog-wild with runtime optimizations. An in-order pipeline is a natural, relatively inexpensive performance boost. As I indicated elsewhere, I already have a CPU that runs at 25MHz, but it needs a minimum of 3 and a maximum of 7 cycles per instruction. I'd like to drop that as much as I can. I really enjoy estimating performance by counting instructions and treating them as single-cycle abstractions. It's also a great help when bit-banging I/O. Out-of-order and/or speculative execution are necessary only to compensate for ultra-deep pipelines. Keep the pipes short, and you simply don't need speculation to meet your performance goals. Much of the performance gain you'd expect to come from superscalar execution can be had with macro-op fusion.
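To make the macro-op fusion idea concrete, here's a toy decoder sketch. The tuple encoding and the fused `li32` opcode are invented for illustration (they're not from any Kestrel core); the `lui`+`addi` pair is a commonly cited RISC-V fusion candidate.

```python
# Toy macro-op fusion: spot an adjacent lui+addi pair that targets the
# same register and fuse it into a single "load 32-bit immediate"
# macro-op.  Instruction tuples are (opcode, rd, operand) -- a made-up
# encoding for illustration only.

def fuse(instrs):
    out = []
    i = 0
    while i < len(instrs):
        a = instrs[i]
        b = instrs[i + 1] if i + 1 < len(instrs) else None
        # lui rd, hi  followed by  addi rd, lo  ->  one fused li32
        if b and a[0] == "lui" and b[0] == "addi" and a[1] == b[1]:
            out.append(("li32", a[1], (a[2] << 12) + b[2]))
            i += 2
        else:
            out.append(a)
            i += 1
    return out

prog = [("lui", "x5", 0x12345), ("addi", "x5", 0x678), ("add", "x6", None)]
print(fuse(prog))  # the lui/addi pair collapses into one macro-op
```

Two issue slots' worth of work comes out of one fused op, which is where the "superscalar-like" gain comes from without duplicating whole pipelines.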
  2. Talk of pipelines is poignant for me, as one of the biggest differences between the Kestrel-2DX's KCP53000 and the Kestrel-3's KCP53010 will, in fact, be that the latter has a 5-stage (maybe 6-stage, not sure yet) pipeline. They should otherwise be software compatible with each other. (The other being that some form of memory protection will be introduced; probably in the form of software-managed TLBs.)
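For readers unfamiliar with why a 5-stage pipeline helps, here's a toy scheduling model. The classic textbook stage names (IF/ID/EX/MEM/WB) are an assumption on my part; the KCP53010's actual stage breakdown may differ.

```python
# Toy model of instruction overlap in an ideal, stall-free 5-stage
# pipeline.  Each instruction still takes 5 clocks of latency, but
# with the pipe full, throughput approaches one instruction per clock.

STAGES = ["IF", "ID", "EX", "MEM", "WB"]

def schedule(n_instrs):
    """Return {(instr, stage): clock} for an ideal stall-free pipeline."""
    return {(i, s): i + j
            for i in range(n_instrs)
            for j, s in enumerate(STAGES)}

sched = schedule(4)
# Instruction 0 retires at clock 4, instruction 3 at clock 7:
# 4 instructions in 8 clocks instead of 4 * 5 = 20 unpipelined.
print(max(sched.values()) + 1)  # total clocks for 4 instructions -> 8
```

That latency-vs-throughput distinction is exactly why a pipelined KCP53010 can stay software-compatible with the KCP53000 while running much faster.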
  3. Yes. You are going to fail. You are going to fail hard. You are going to fail so hard, you'll want to flip your table, walk away, curse everything as a waste of time, and never look back. Do all of these things; except, I'd recommend not flipping that table. I find the cursing to be cathartic, and the walking to be mind-clearing. Maaaaaaaybeee try not to be as public about the cursing as *I* have been. I have a reputation. You might not, and it could damage yours. But if you must, curse into an empty room. Scream loudly if you must. Get it off your chest; then, get back on the wagon. Walk away; walk far, far away. Never look back; if you do, you'll drag some of that baggage along with you. Drop it like a moldy sack of hot potatoes. However, as I said before, don't flip that table! Even though you might not look back, that doesn't mean you won't *be* back. Life finds a way. It always does. It just takes longer than you'd like sometimes. Instead, strive for small victories. Remember where things last worked. You are exploring a multi-faceted design *space*, not a single path on a 2-dimensional map. My 14-step development plan I wrote above? It's just my current vision. It WILL change. And so will yours. Accept this as normal. Frustrating!! Absolutely! But definitely normal! Because after you walk away, eventually, you'll want to return. And when you do, you can wipe the table clean, and go back to the last thing you know worked. Pick up the pieces from there and build upon your successes. Your progress will be a slog, but eventually, you'll find a way towards your goal. I'll let you know when I've found mine.
  4. In order of mention...

Status on the Kestrel Project. I went back to working on the Kestrel-2, creating a refinement of its architecture. Instead of the 16-bit stack-architecture CPU, however, I replaced the core with my KCP53000 CPU, an M-mode-only RV64I RISC-V processor. This has allowed me to expand the design of the computer rather significantly relative to the original. The Kestrel-2's address space was laid out like so:

$0000-$7FFF  Program RAM
$8000-$BFFF  I/O space
$C000-$FFFF  Video RAM

The block RAMs were pre-loaded at synthesis time with the software to run. There was no ROM, and the video display was driven at 640x200 monochrome (bitmapped).

The Kestrel-2DX, the modern incarnation of the basic concept, is substantially renovated. As indicated above, the CPU is now a 64-bit RISC-V core, with a memory map as shown here: http://chiselapp.com/user/kc5tja/repository/kestrel-2dx/wiki/Memory Map

It has a proper ROM (implemented in Verilog as a giant case-statement, because I don't have enough block RAMs to use as a ROM) which holds a very minimal BIOS-like thing. This frees up quite a bit of space in RAM, where I am currently writing a dialect of Forth to serve as its system software. This design is, however, pushing the limits of the Digilent Nexys-2 FPGA board. Although I have plenty of logic left, the fact that the ROM is synthesized from LUTs is enough of a burden to drop the maximum clock speed to just above 26MHz, which is dangerously close to the 25MHz it's designed to run at.

Of all the computer designs I've made, I've been especially happy with this one. Despite not being finished yet, I'm having a total blast with it, which is exactly what I wanted from my neo-retro computer designs. It looks, feels, and behaves like a classic computer, despite having a modern 64-bit core. I've won. (I just need to finish Forth for it!)

The Kestrel-3 will be a new computer design with somewhat more modern capabilities.
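As an aside, the original Kestrel-2 decode above is cheap precisely because it only looks at the top two address bits. A quick model of that decode (the function name is mine, for illustration):

```python
# Model of the original Kestrel-2 address decode:
#   $0000-$7FFF Program RAM, $8000-$BFFF I/O, $C000-$FFFF Video RAM.
# In hardware this is just the top one or two address bits.

def region(addr):
    assert 0 <= addr <= 0xFFFF
    if addr & 0x8000 == 0:      # A15 = 0
        return "Program RAM"
    elif addr & 0x4000 == 0:    # A15 = 1, A14 = 0
        return "I/O space"
    else:                       # A15 = A14 = 1
        return "Video RAM"

print(region(0x7FFF), region(0x8000), region(0xC000))
# -> Program RAM I/O space Video RAM
```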
First and foremost, it'll be my first design based around the Chisel-3 DSL. I've finally learned enough to feel comfortable with it. (Another personal victory!) The K3 will be built using only open-source FPGA boards, though (e.g., BlackIce and/or icoBoard Gamma), which can be targeted with the Yosys development chain. There are several reasons for this, not the least of which is that I want to support that community. I'm planning on a computer with two boards: one comprising the CPU and RAM, and another comprising "the chipset" for the rest of the computer (e.g., video, SD card, keyboard, sound, etc.). Originally, I wanted to target the Altera/Terasic DE-1 FPGA board (since it's available dirt cheap these days), but I've received enough feedback from friends and followers of the project who wanted to follow along but were hesitant to install Altera's ginormous IDE on their box. They wanted something that could run reliably on a Raspberry Pi, and right now, that means Yosys. This fundamentally changes my plans for this computer, and it's not clear I have a good design for it yet. One thing is clear, though: the Kestrel-2DX will end up being an early development terminal for the Kestrel-3. I eat my own dogfood.

The Setback. This problem still exists. The Nexys-2's PSRAM chip remains dead to the world for me. I've long since given up on this chip. Near as I can tell, the *only* project that reports success with it is the Nexys-2 BIST bitstream, which leads me to simply not trust that BIST. I *have*, however, written designs to access the SRAM on the icoBoard and have confirmed my ability to read and write that board's SRAM chip. So I'll be going that route. That's another reason to use these boards instead of the DE-1: anything more complex than basic SRAM is straight-up frightening to me. I've been burned enough to never want to use it again.
Once I get a working platform that boots on its own with SRAM but without SDRAM, then I have a basis on which I can tweak the design and run software to exercise the SDRAM chips. With luck, things will work. But I want a known-good platform first and foremost.

The Future. I never made progress with my original Kestrel-3 design or intentions. Reverting to working on the Kestrel-2 and upgrading it to the new Kestrel-2DX design has restored my interest and faith in my abilities as a hobby hardware designer. While I still have plans for the Kestrel-3 (see http://chiselapp.com/user/kc5tja/repository/kestrel-3/wiki/Base Specs), it's not clear how I'll achieve these goals just yet. My current plan is to perform the following broad steps for development:

1. Develop a dumb GPIO adapter. If I stick with Wishbone B.4/Pipelined, this is already done. I've been strongly considering switching to TileLink TL-UL, though, which might give me wider access to parts written by others for the RISC-V ecosystem.
2. Develop a debug controller to which I can send read/write byte/half-word/word/double-word requests. Since I have access to raw GPIO on the Kestrel-2DX, this is not likely to use RS-232 framing or anything; it'll probably be bit-banged, for simplicity's sake. A few PMODs will be needed for this. This will serve as a surrogate for the final CPU design that I intend.
3. Make sure I can toggle LEDs using the debug port interactively from the Kestrel-2DX.
4. Port my Serial Interface Adapter core to the Kestrel-2DX. Confirm it works in loop-back mode.
5. Port the Serial Interface Adapter to both of the Kestrel-3 designs.
6. Interactively confirm that the serial link works on the Kestrel-3 in loop-back mode.
7. Interactively confirm that the serial link works between the 2DX and the 3.
8. Develop the final SRAM interface.
9. Make sure I can perform basic RAM tests interactively from the Kestrel-2DX.
10. Develop a "ROM" system using block RAMs. (From the CPU's perspective, it's ROM; from the debug interface, it's RAM.)
11. Make sure I can write to and read back from the "ROM" interactively from the Kestrel-2DX.
12. Port the KCP53000 to run on the new platform.
13. Write first-boot firmware that writes "Hello world" to the SIA or something. Upload it from the Kestrel-2DX.
14. Boot the Kestrel-3 for the first time, and hope for the best.

This will likely change as I learn more about the design. Note how none of this even concerns itself with the graphics, sound, or other goodies I've been looking for. Unlike the Kestrel-2DX, it doesn't even have the MGIA to fall back on. This is because the CPU will consume the overwhelming majority of the iCE40HX8K part; I'll probably need to off-load the niceties to a slave peripheral that's PMOD-accessible.
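To sketch what the bit-banged debug transactions in that plan might look like: the wire format below (2-bit op, 2-bit size code, 64-bit address, then data for writes) is entirely hypothetical, since the real protocol hasn't been designed yet, but it shows the general shape of shifting a request out over raw GPIO.

```python
# Hypothetical bit-banged debug request encoding.  Field widths and
# opcode values are invented for illustration; only the idea (shift a
# framed read/write request out over a GPIO pin) comes from the plan.

OPS = {"read": 0b01, "write": 0b10}
SIZES = {"byte": 0, "half": 1, "word": 2, "double": 3}  # log2(bytes)

def to_bits(value, width):
    """Serialize an integer LSB-first into a list of bits."""
    return [(value >> i) & 1 for i in range(width)]

def encode_request(op, size, addr, data=0):
    bits = to_bits(OPS[op], 2) + to_bits(SIZES[size], 2) + to_bits(addr, 64)
    if op == "write":
        bits += to_bits(data, 8 << SIZES[size])  # 8/16/32/64 data bits
    return bits

def bitbang(bits, set_pin):
    """set_pin(bit) would wiggle a real GPIO line; here it's a stub."""
    for b in bits:
        set_pin(b)

req = encode_request("write", "word", 0x1000, 0xDEADBEEF)
print(len(req))  # 2 + 2 + 64 + 32 = 100 bits
```

A controller like this needs only a clock line and a data line per direction, which is why a few PMODs suffice.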
  5. Hang tight folks; I'm catching up. Will respond shortly.
  6. Thanks! That did the trick. The circuit seems pretty flaky, but it does generate video. If I toggle SW0, the video it produces isn't always aligned with the video edges, but it is stable.
  7. I have the s3e1200 chip.
  8. @sbobrowicz Thanks; looking over the VHDL (at least what I can make of it), it's doing what my own state machines are doing. Unfortunately, I was unable to synthesize the project for my Nexys-2 due to errors involving collisions on IOBs.
  9. Two definitions of "it works" are: (1) I can use external RAM for a video frame buffer, and (2) the CPU runs no slower than 2 MIPS. So far, this chip's failure to adhere to its published timing parameters has forced me to degrade this project's features twice already, and leads me to believe that the Nexys-2 is simply inappropriate for what I'm trying to do. I'm not willing to do it a third time. I'm now evaluating whether my time would be better spent just porting the project to another FPGA board equipped with real static RAM.
  10. Adding registers to the address and/or data output buses (the only signals without registers) introduces a clock of latency. This means I'd now need _seven_ clocks per individual hit to RAM, and it would rule out support for 16-color 640x480 displays entirely. Like I said on IRC: accesses to memory must consume no more than six clocks. That is a hard limit. Also, does a Spartan-3-series FPGA even have ODDRs? I don't recall reading about them in the Spartan-3 family documentation. Maybe I missed them.
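A rough back-of-envelope shows where a limit like this comes from. The assumptions here are mine (60 Hz refresh, 4 bits per pixel, a 16-bit-wide RAM path, and the 25 MHz clock mentioned earlier), not figures from the post:

```python
# Back-of-envelope: why each extra clock of RAM latency hurts a
# 16-color 640x480 framebuffer.  All parameters below are assumed.

PIXELS = 640 * 480
BPP = 4                      # 16 colors
WORD_BITS = 16               # RAM bus width
REFRESH = 60                 # Hz
CLOCK = 25_000_000           # Hz

fetches_per_frame = PIXELS * BPP // WORD_BITS   # 76,800 words/frame
fetches_per_sec = fetches_per_frame * REFRESH   # 4,608,000 words/s
clocks_per_fetch = CLOCK / fetches_per_sec      # ~5.4 clocks/word

print(round(clocks_per_fetch, 1))
# Video refresh alone leaves only ~5.4 clocks per 16-bit word on
# average, so a 7-clock access time cannot sustain this mode even
# before the CPU gets a single cycle of memory bandwidth.
```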
  11. It works with the BIST code, but that's the only place it seems to work. Under any other circumstances I've attempted to use the PSRAM chip, it has either failed outright to commit writes, OR has failed to drive the DQ pins on read, as per my bullet list above. The EPP controller is inscrutable to me; I don't know what it's doing or how it does it. Moreover, it's bottlenecked with an 8-bit bus; I really need a 16-bit bus. Per your suggestion on IRC about using an on-chip logic analyzer/scope thing, I attempted to write one that would work for my hardware, but it ends up crashing ISim when I attempt to simulate it. I'm --><-- this close to just throwing in the towel on the project, after having spent years working on it. I'm about ready to tell people, "You get 48KB of block RAM, and that's that. Have fun." No more 16MB, not even 256KB. I'm just so very, very, very frustrated right now.
  12. Greetings everyone. I'm Samuel A. Falvo II, creator of the Kestrel-3 home-brew computer project. I'm currently using the Nexys-2 board as a development platform, since that's a known-working software/hardware configuration for me. My plan is to migrate to something different/better after I get a working reference model here. Related to this project, I'm completely ineffectual at getting the PSRAM chip to function. I was wondering if someone else has created a 16-bit Wishbone(-compatible/-like) interface to the PSRAM chip that I can re-use in my MPLv2-licensed project. After a month of trying to hack my own, I've come to the conclusion that my PSRAM chip is utterly dead on arrival:

  • When in asynchronous mode, I could never get the chip to respond to memory writes beyond a certain (and seemingly completely unpredictable) point. But it could read OK, as evidenced by a stable image of noise on my monitor (video refresh is configured to use external memory).
  • When in synchronous mode, I'm utterly unable to get the PSRAM chip to drive the DQ pins when trying to read from the chip. As a result, video shows a stable pattern of pixels corresponding to the last 16-bit word written out to memory (since the write precharges the DQ lines, the video circuit sees this state and treats it as valid pixel data). It is completely unknown whether the chip is actually responding to write transactions.

I just don't know, and honestly, I'm getting rather desperate now. So far as I'm able to tell, I'm well within the timing parameters for asynchronous and (especially) synchronous mode operation. If there's an existing design I can re-use to sanity-check my own, that would be especially helpful. Thanks in advance.