This project aims to run a 64 × 32 RGB LED matrix from a TinyFPGA AX2, with two full RGB565 framebuffers (double buffering), and a UART interface.
Due to memory constraints, I'm aiming for a 16-bit framebuffer. Each pixel will require 2 bytes, laid out as standard RGB565:

    bits [15:11] - red   (5 bits)
    bits [10:5]  - green (6 bits)
    bits [4:0]   - blue  (5 bits)
To display this correctly, I'll need to be able to control the brightness of each LED with a resolution of at least 6 bits (the widest channel in RGB565 is the 6-bit green).
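The packing described above can be sketched in a few lines of Python (function names are my own, for illustration):

```python
def pack_rgb565(r: int, g: int, b: int) -> int:
    """Pack 8-bit-per-channel RGB into a 16-bit RGB565 word."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def unpack_rgb565(pixel: int) -> tuple[int, int, int]:
    """Expand RGB565 back to approximate 8-bit channels."""
    r = (pixel >> 11) & 0x1F
    g = (pixel >> 5) & 0x3F
    b = pixel & 0x1F
    # Replicate the high bits into the low bits so that
    # full-scale (31 or 63) maps back to exactly 255.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))
```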
As mentioned above, the Embedded Block RAM (EBR) available in the TinyFPGA AX2's MachXO2-1200 is 64 kbit.
Frame size: 64 × 32 pixels × 2 bytes = 4,096 bytes = 32 kbit
Two frames: 2 × 32 kbit = 64 kbit - exactly the available EBR
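As a sanity check, the memory budget works out like this (a quick Python sketch; the names are mine):

```python
WIDTH, HEIGHT = 64, 32
BYTES_PER_PIXEL = 2  # RGB565

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL  # 4096 bytes per frame
frame_kbit = frame_bytes * 8 / 1024             # 32 kbit per frame
double_buffer_kbit = 2 * frame_kbit             # 64 kbit - the whole EBR
```

So double buffering consumes the MachXO2-1200's EBR exactly, with nothing left over for other block-RAM uses.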
To ensure that the display stays clean and free of tearing, I plan to provide two buffers.
The FPGA can be rendering from buffer A, while the host is writing new data into buffer B. Once the new buffer is ready, the FPGA can switch and render from B, while the host writes to A.
The switch can be synchronised to the vertical sync.
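The ping-pong scheme above can be modelled in a few lines of Python (a toy model, not the FPGA implementation; class and method names are mine):

```python
class DoubleBuffer:
    """Toy model of the ping-pong scheme: the display scans out of the
    'front' buffer while the host writes into the 'back' buffer; swap()
    models the flip performed at vertical sync."""

    def __init__(self, size: int):
        self.buffers = [bytearray(size), bytearray(size)]
        self.front = 0  # index of the buffer the display renders from

    @property
    def back(self) -> int:
        return self.front ^ 1  # the buffer the host writes into

    def write(self, offset: int, data: bytes) -> None:
        self.buffers[self.back][offset:offset + len(data)] = data

    def swap(self) -> None:
        # Called at most once per vsync, once the host signals the
        # back buffer is complete.
        self.front ^= 1
```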
I'd really like to be able to update the display at ~30 fps.
With a UART, each byte is framed as 1× start bit, 8× data bits, and 1× stop bit - thus 10 symbols per byte, or 20 symbols per pixel. There is also likely to be some idle time between frames, so adding a 10% margin is sensible.
A quick bit of maths (4,096 bytes/frame × 10 symbols/byte × 30 fps × 1.1 ≈ 1.35 Mbit/s) implies that this should be possible with a baudrate of < 1.5 Mbit/s.
A baudrate of ~2 Mbit/s should be easy enough to work with, and should provide ample headroom.
The LED matrix is driven fast enough that each row refreshes well beyond the rate at which flicker is perceptible - visible flicker is something that seriously bugs me. This means that glancing away from or past the display won't leave flickering streaks - somewhat like Dianna outlines in her "The Projector Illusion" video.
To keep up with this, the display must be rendered from internal memory, and cannot feasibly be driven directly via the UART.
I've written more about the implementation of various parts of the project on their own pages: