Cool. How does this compare to stuff like desktop PCs, taking into account the fact that it's a different architecture? I know GHz doesn't mean everything, but do these numbers tell us anything useful?
Here's some food for thought: running PS2 games on a PC will bring a single-core CPU clocked as high as 3.6-3.8 GHz to its knees. The highest-clocked chip in the PS2 was a little under 300 MHz (the MIPS R5900 core of the Emotion Engine), yet emulating it brings CPUs over 3000 MHz to a crawl.
The reason is parallel processing. Rather than putting in one high-clocked chip, consoles use many low-clocked chips that all work in parallel. In addition to the parallel execution, the chips in consoles aren't typical chips: they have specialized hardware for the kinds of workloads they handle (vector processors, encoding/decoding chips, texture compression/decompression units, etc.). You might recall our PC motherboards also have dedicated chips like NICs, memory controllers, and audio processors, but consoles have all of those on top of their many different CPUs hehe!
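To make the vector-processor point concrete, here's a toy sketch in plain Python (nothing PS2-specific; the 4-wide figure matches the PS2's VU0/VU1 vector units, which operate on x/y/z/w float quads in one instruction):

```python
VECTOR_WIDTH = 4  # PS2's VU0/VU1 process 4 floats (x, y, z, w) per instruction

def scalar_instructions(n_elements):
    # A scalar CPU issues one instruction per element.
    return n_elements

def vector_instructions(n_elements):
    # A vector unit covers VECTOR_WIDTH elements per instruction.
    return -(-n_elements // VECTOR_WIDTH)  # ceiling division

# Transforming 10,000 elements:
# scalar chip: 10,000 instructions; 4-wide vector unit: 2,500.
```

So even at a quarter of the clock speed, the vector unit keeps pace on this kind of workload, which is exactly what 3D geometry processing looks like.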
Even trying to run them on the multi-core CPUs found in PCs is non-trivial, because of how tightly synchronized the chips in consoles are.
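A hedged sketch of why that synchronization hurts emulators: to keep the guest chips in lock-step, the emulator has to interleave them in small time slices, and the finer the slice (i.e. the more accurate the timing), the more host cycles go to bookkeeping instead of real work. All numbers below are made up purely for illustration:

```python
def host_cost(n_chips, guest_cycles, slice_len, sync_overhead=100):
    # Emulate each chip for slice_len guest cycles, then re-synchronize.
    # Finer slices => more sync points => more overhead on the host CPU.
    slices = guest_cycles // slice_len
    work = n_chips * guest_cycles         # the actual emulation work
    return work + slices * sync_overhead  # plus per-slice bookkeeping

# Coarse timing vs cycle-accurate timing for 1M guest cycles, 3 chips:
coarse = host_cost(3, 1_000_000, 1000)  # 3,100,000 "host cycles"
accurate = host_cost(3, 1_000_000, 1)   # 103,000,000 "host cycles"
```

Same guest work, ~33x the host cost, just from synchronizing tightly, which is one reason a 3+ GHz PC struggles with a sub-300 MHz console.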
Are these specs good enough to make people who cry about it being inferior to MicroSony's stuff shut up?
Haha, not even maybe.
In my honest opinion, it's just a knee-jerk reaction founded in profound ignorance. I don't think the Wii U's performance issues are the fault of the Wii U's "power" but rather of these ports being rushed and half-assed, not even making an attempt at properly optimizing the engine for the Wii U's architecture.
The PS3 went through the same struggle in the beginning. Games were visibly crisper and more responsive on the 360 than on the PS3, even though the PS3's FLOPS (floating-point operations per second) and data throughput (how much data can be moved from RAM into the Cell processor's seven cores) put it in a league of its own in terms of raw processing power.
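For a sense of what that raw FLOPS figure means, peak throughput is usually estimated as cores x clock x FLOPs-per-cycle. The inputs below are the commonly cited ballpark for the Cell's SPEs (3.2 GHz, 4-wide single-precision multiply-add, i.e. 8 FLOPs per cycle), not official sustained figures:

```python
def peak_gflops(cores, clock_ghz, flops_per_cycle):
    # Theoretical peak: every core retires flops_per_cycle FLOPs each cycle.
    return cores * clock_ghz * flops_per_cycle

# 7 SPEs * 3.2 GHz * 8 FLOPs/cycle ~= 179 single-precision GFLOPS (peak)
```

The catch, of course, is that peak numbers assume perfectly fed SIMD pipelines, and badly optimized ports never get close, which is the whole point of the comparison above.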