If it's acknowledged that the cores differ (for example, each Espresso core implements the MESI cache coherency protocol, and the CPU as a whole implements MERSI, whereas Broadway implements MEI), why does the page still describe the Espresso as three "Broadway" cores rather than three "Espresso" cores?
The Wii's GPU is not called the GX. GX is the API that is used to perform graphical operations (similar to OpenGL, Vulkan, etc.).
I don't remember the actual name, but I remember seeing it being discussed in the Rare Gaming Dump.
GX is used on the GameCube, too.
Similarly, GX2 is not the name of the GPU on the Wii U. That is also just the API used to perform graphical operations.
The real name of the Wii U's GPU is "GPU7" (you can find many sources online by searching for that name). I assume the 7 comes from R700.
(Some trivia: since textures are tiled/swizzled on Wii U, a library named AddrLib (you can find variants of it online) was provided with the Wii U system libraries to perform texture tiling on the CPU. The variant provided on Wii U is indeed the R700 one.)
(The chip family on Wii U is 0x51, to be precise. I don't know if there's a table out there that tells us exactly what that is, though. Also, don't be misled by the class name being R600AddrLib; AddrLib uses the same class for both R600 and R700.)
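For readers unfamiliar with the term: "tiling" means storing pixels tile-by-tile rather than row-by-row, so that nearby pixels stay close together in memory. Here is a minimal C sketch of the idea, using a toy 4x4 tile layout for illustration only; the actual R600/R700 AddrLib address computations involve banks, pipes and macro-tiles and are far more involved than this:

```c
#include <stddef.h>

/* Toy example: map a pixel coordinate (x, y) to its offset in a tiled
 * image, where pixels are laid out 4x4-tile by 4x4-tile instead of row
 * by row. Assumes img_w is a multiple of TILE_W. NOT the real AddrLib
 * algorithm, just the basic concept. */
#define TILE_W 4
#define TILE_H 4

size_t tiled_offset(size_t x, size_t y, size_t img_w)
{
    size_t tiles_per_row = img_w / TILE_W;        /* image width in tiles */
    size_t tile_x = x / TILE_W, tile_y = y / TILE_H;
    size_t in_x   = x % TILE_W, in_y   = y % TILE_H;
    size_t tile_id = tile_y * tiles_per_row + tile_x;
    /* All 16 pixels of a tile are contiguous, then the next tile follows. */
    return tile_id * (TILE_W * TILE_H) + in_y * TILE_W + in_x;
}
```

For example, the pixel at (1, 1) in an 8-pixel-wide image lands at offset 5 in this tiled layout, instead of offset 9 in a plain linear layout.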
The Wii U does not support OpenGL at all. It does not target a specific version of GLSL either, as GX2 expects shaders to be provided as precompiled binaries. The SDK does provide a compiler, though; it only supports up to GLSL v3.3, but comes with extensions that enable features from newer GLSL versions.
MEM2 on the Wii U contains an extra section outside the default 1 GB reserved for user applications, called the "foreground bucket heap", which is 40 MB. This heap is usually used for graphics (as I will explain below); however, even though it is mentioned in the article, it is outside the graphics section:
"Be as it may, there’s a slight compromise: ‘foreground’ apps may claim an extra 40 MB as long as they are being displayed (a.k.a shown in the foreground). However, as soon as the user switches to a background app, this block gets automatically deallocated. This is not exactly a ‘simple’ feature, leaving it up to developers to find good use."
For extra clarification: color buffers (render targets and the TV render buffer) and depth/stencil buffers are usually allocated from the MEM1 heap, while TV/DRC scan buffers (the buffers sent to the displays) are usually allocated from the foreground bucket heap. This is the usual use of the foreground bucket heap, and it actually does make it a "simple" feature, contrary to what the article says, as it's very easy to dispose of and reconstruct scan buffers. (Just deallocate on leaving the foreground and reallocate on acquiring it.)
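The lifecycle pattern above can be sketched in a few lines of C. The callback names are hypothetical stand-ins for the real SDK hooks (on homebrew toolchains this corresponds to ProcUI's acquire/release callbacks), and the foreground bucket allocation is simulated with plain malloc/free so the pattern itself is runnable:

```c
#include <stdlib.h>

/* Simulated scan buffers that would live in the 40 MB foreground bucket. */
static void *tv_scan_buffer;
static void *drc_scan_buffer;

/* Hypothetical hook: the app gained the foreground, so the bucket is
 * available again and the scan buffers can be (re)created. */
void on_foreground_acquired(void)
{
    tv_scan_buffer  = malloc(1280 * 720 * 4); /* stand-in for a bucket alloc */
    drc_scan_buffer = malloc(854  * 480 * 4);
}

/* Hypothetical hook: the app is being sent to the background, so the
 * bucket is about to be reclaimed and everything in it must be dropped. */
void on_foreground_released(void)
{
    free(tv_scan_buffer);  tv_scan_buffer  = NULL;
    free(drc_scan_buffer); drc_scan_buffer = NULL;
}
```

Since scan buffers hold no state the game cares about between frames, rebuilding them on every foreground transition costs essentially nothing, which is why I'd call the feature simple.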
The article should make it a little more obvious, imo, that these uses of MEM1 and the foreground bucket are just the recommended ones. It's not like everything is automatically allocated for you, and afaik nothing stops you from using MEM2 for everything.
I have several comments on the Wii U article.