Jamie Iles
Software professional and hardware tinkerer
Jamie's posts

FPGA tools seem designed to take the fun out of FPGA design.  Over the last few evenings I've been working on optimizing the Oldland SDRAM controller to perform bursts to reduce the cache fill+miss penalty, and got everything working pretty quickly in simulation using a Micron model.  Once I put that on the FPGA, though, I saw memory corruption after a few seconds in u-boot.  SignalTap might have helped if it could run at the same time as another virtual JTAG user and didn't have such tiny buffers.

I suspected a timing issue that needed a change to the phase shift of the SDRAM PLL, and that was partly it, but compared to software development these tools are so far behind.  To calculate the phase shift you need the timings for the external device from the datasheet (fair enough), along with the timings for the FPGA design.  You then plug them into a couple of calculations to work out the phase shift, feed that back into the design tools, constrain the design with some horrible Tcl that seems to be reimplemented for every design, no two the same, and then resynthesise.  Finally you need to check that none of that has changed the FPGA timing you used in the calculations.
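
To give a flavour of those calculations (the numbers here are purely illustrative, not the real Oldland or Micron values): with a 10 ns SDRAM clock period, a device t_AC(max) of 5.4 ns and t_OH(min) of 2.7 ns, the read data driven from one SDRAM clock edge is valid from roughly 5.4 ns after that edge until 2.7 ns after the following edge, i.e. a window from 5.4 ns to 12.7 ns.  Subtract the board and FPGA input delays and setup/hold from each end, aim the capture edge at the middle of what's left, and the PLL phase shift is whatever moves the SDRAM clock so that it lands there.  Change anything on the FPGA side and the sums have to be redone.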

I can't believe that this is the recommended flow.  Why can't I tell the tools that a collection of pins is a synchronous interface whose data should be center-aligned to a given clock with the given external timings, and have them constrain the design and calculate the phase shift from that?  To me, this is the equivalent of constructing a delay in software by writing a for loop in C, looking at the disassembly to count the instructions, checking the instruction timings in the CPU datasheet and adjusting the loop count.

Anyway, it works now and SDRAM accesses have much higher bandwidth!

#quartus   #fpga  #timing

The Oldland CPU has progressed quite a bit since my last update and is now somewhat usable.  On the hardware side of things I have converted the caches to configurable N-way set-associative caches and added software-managed TLBs along with a user mode, so the system is now capable of running a real OS with privilege separation and memory protection.  The DE0-Nano board gained a Microchip ENC28J60 module so it can now talk to the outside world over Ethernet.
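
For anyone who hasn't met a set-associative cache before, the lookup is simple enough to sketch in C.  This is just an illustrative model in the spirit of the Oldland C simulator, not the actual RTL or its parameters — the sizes and the round-robin victim pointer are made up for the example:

    #include <stdbool.h>
    #include <stdint.h>

    #define NR_WAYS     4       /* associativity (hypothetical) */
    #define NR_SETS     128     /* sets per way (hypothetical) */
    #define LINE_WORDS  8       /* 32-byte lines of 32-bit words */

    struct cache_line {
        bool valid;
        uint32_t tag;
        uint32_t data[LINE_WORDS];
    };

    struct cache {
        struct cache_line lines[NR_SETS][NR_WAYS];
        unsigned victim[NR_SETS];   /* round-robin replacement pointer */
    };

    /*
     * Split the address into offset/index/tag and check every way of the
     * selected set -- a loop here, parallel comparators in the hardware.
     * Returns true on a hit and fills in *val.
     */
    static bool cache_read(struct cache *c, uint32_t addr, uint32_t *val)
    {
        uint32_t word = (addr >> 2) & (LINE_WORDS - 1);
        uint32_t index = (addr >> 5) & (NR_SETS - 1);
        uint32_t tag = addr >> 12;

        for (unsigned way = 0; way < NR_WAYS; ++way) {
            struct cache_line *line = &c->lines[index][way];

            if (line->valid && line->tag == tag) {
                *val = line->data[word];
                return true;
            }
        }

        /* Miss: the fill would go into way c->victim[index]++ % NR_WAYS. */
        return false;
    }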

In the software world I have ported u-boot, newlib and RTEMS so it's running real software.  Correctly handling interrupts in RTEMS was somewhat fiddly and exposed bugs in the RTL, simulator, gcc port and RTEMS itself!

The next tasks are optimization of the SDRAM controller, which is a bit of a bottleneck, and porting the Linux kernel.

#verilog   #rtems  #fpga

I've been hacking away on my 32-bit RISC CPU (https://github.com/jamieiles/oldland-cpu) and things are coming along nicely.  Instruction+data caches are working and I've also added an SPI master for talking to SD cards and other peripherals.
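
The SPI protocol itself is tiny.  As a rough model of what the master's shift register is doing, here's an illustrative bit-banged mode-0 transfer in C, with hypothetical gpio_* helpers — not the Oldland RTL:

    #include <stdint.h>

    /* Hypothetical helpers that drive the SPI pins. */
    void gpio_set_sclk(int level);
    void gpio_set_mosi(int level);
    int  gpio_get_miso(void);

    /*
     * Exchange one byte in SPI mode 0 (CPOL=0, CPHA=0): data is set up
     * while SCLK is low and sampled by both ends on the rising edge,
     * most significant bit first.
     */
    uint8_t spi_xfer_byte(uint8_t out)
    {
        uint8_t in = 0;

        for (int bit = 7; bit >= 0; --bit) {
            gpio_set_mosi((out >> bit) & 1);    /* set up MOSI while SCLK is low */
            gpio_set_sclk(1);                   /* rising edge: both ends sample */
            in = (in << 1) | (gpio_get_miso() & 1);
            gpio_set_sclk(0);                   /* falling edge, shift next bit */
        }

        return in;
    }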

Simulations using Icarus were a bit slow for my liking (tests took about 25 seconds each to run, compared to about 0.2 seconds in the C simulation), so I got the simulation running under Verilator, which takes only about 0.4 seconds.  It also warns about more issues than Icarus does, and in a fashion that puts Quartus to shame.

On the software front I have a working port of GCC, with thanks to +Anthony Green's Moxie GCC port as an excellent reference.  There's still some optimization work to do, but that's true of the rest of the project too!

My hobby project 32-bit RISC processor.  Very much a work in progress, but the on-chip boot ROM and SRAM are working, along with SDRAM, UART and virtual JTAG.  Porting binutils was the most painful part, but working with Verilog and the new tools is a lot of fun.

Lessons learnt so far:
 - proprietary FPGA tools are much more painful to use than free software Verilog tools.
 - having a C model to verify the RTL simulation against is essential, as is a decent set of quick-to-run tests (see the sketch after this list).
 - staring at waveforms for debugging is a good way to go insane, but gtkwave's filter files and filter processes reduce the pain somewhat.
 - generate as much source and data from a single definition as possible.
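
As a rough illustration of the C model point in the list above (the names are hypothetical and this isn't the actual Oldland harness): run the C model and the RTL simulation in lock-step and compare architectural state after every retired instruction, so a divergence is reported at the first instruction that disagrees rather than thousands of cycles later.

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define NR_REGS 16

    /* Architectural state snapshot after one retired instruction. */
    struct cpu_state {
        uint32_t pc;
        uint32_t regs[NR_REGS];
    };

    /* Hypothetical hooks into the two simulators. */
    int model_step(struct cpu_state *st);   /* C model */
    int rtl_step(struct cpu_state *st);     /* RTL simulation */

    static void run_lockstep(unsigned long max_insns)
    {
        for (unsigned long n = 0; n < max_insns; ++n) {
            struct cpu_state m, r;

            if (model_step(&m) || rtl_step(&r))
                break;      /* one of the simulations finished or failed */

            if (m.pc != r.pc) {
                fprintf(stderr, "insn %lu: PC mismatch model=%08x rtl=%08x\n",
                        n, m.pc, r.pc);
                exit(1);
            }

            for (int i = 0; i < NR_REGS; ++i) {
                if (m.regs[i] != r.regs[i]) {
                    fprintf(stderr, "insn %lu: r%d mismatch model=%08x rtl=%08x\n",
                            n, i, m.regs[i], r.regs[i]);
                    exit(1);
                }
            }
        }
    }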

Overall Verilog is a nice change from software, and it makes me appreciate our software tools that much more!

GDB stub for debugging bare-metal apps on a Raspberry Pi.
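
The wire side of a stub like this is pleasingly simple: GDB's remote serial protocol frames every packet as '$' <payload> '#' <two hex checksum digits>, where the checksum is the sum of the payload bytes modulo 256, and the receiver acks each packet with '+'.  A minimal sketch of the transmit path in C (uart_putc is a hypothetical routine, not taken from the linked stub, and escaping of '$', '#' and '}' in payloads is omitted):

    #include <stdint.h>

    /* Hypothetical routine that writes one byte to the debug UART. */
    void uart_putc(char c);

    static const char hex[] = "0123456789abcdef";

    /* Send one remote-serial-protocol packet: $<payload>#<checksum>. */
    static void gdb_send_packet(const char *payload)
    {
        uint8_t csum = 0;

        uart_putc('$');
        for (const char *p = payload; *p; ++p) {
            csum += (uint8_t)*p;
            uart_putc(*p);
        }
        uart_putc('#');
        uart_putc(hex[csum >> 4]);
        uart_putc(hex[csum & 0xf]);
    }

    /* e.g. answer GDB's '?' (why did the target stop?) with SIGTRAP: */
    static void report_trap(void)
    {
        gdb_send_packet("S05");
    }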

mosh (http://mosh.mit.edu/) really is an excellent tool, especially when combined with flaky VPN connections!

The Quireboys, Bristol Fleece 23rd April 2012 (3 photos)

GDB built for ARM/Linux knows that you can't single-step instructions in the 0xffff0000-0xffffffff range on Linux, which is problematic if you are using it to test a gdbstub in a simulator with high vectors... as I found out the hard way.

The Answer + The Union, Bristol O2 Academy 2012-03-16 (3 photos)