I can see what they were trying to do, to be honest. Well, the CPU is underclocked by about 50%, but for the most part you're right. One thing you're leaving out is that there's a ton of cache, a common tradeoff for clock speed these days. However, I don't know how that compares to the PS4/X1; we'll probably need to wait until they're released.
One thing that frustrates me about Nintendo is that they always have to do the "weird thing". This time it's a slow CPU and a moderate GPU with a ton of extra cache. They get to say "we have something that nobody else has", but it's also always something developers don't seem to want. Having a standard makes it easy to develop for all consoles, yet they seem to expect third-party developers to do something special for their system. It's probably why exclusives generally look a lot better on Nintendo systems: they're designed for that hardware and nothing else.
The assumption was that the huge amount of cache would let developers ship huge volumes of floating-point calculations (the kind the Wii U's CPU is terrible at) to the GPU and back quickly and seamlessly, reducing the need for a strong CPU.
The problem that surfaced later was that the GPU turned out to be acceptable but nothing to write home about next to the Xbox One and the PS4. Those two machines won't have to resort to GPGPU, and even if they do, they'll have spare wiggle room. The Wii U will have to rely on it, so its already somewhat inferior GPU will have even more work to do outside of graphics. Moreover, developers aren't used to GPGPU yet, and unless the SDK does it automatically they'll just forget about it at times, which will lengthen the optimization process. Similarly, developers weren't 100% ready for 64-bit development in the early days of the Nintendo 64, nor for specialized multicore development in the early days of the PS3.
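The tradeoff being argued about can be sketched with a toy cost model: offloading a batch of float work to the GPU pays a fixed transfer cost, and a big cache / fast interconnect is exactly what shrinks that cost. All numbers here are made up for illustration, not real Wii U figures.

```python
# Toy model of the CPU-vs-GPGPU tradeoff. Every constant below is an
# illustrative assumption, not a measured spec.

def cpu_cost(n, cpu_per_elem=1.0):
    """Cost of doing n float ops on a weak CPU, one at a time."""
    return n * cpu_per_elem

def offload_cost(n, transfer_overhead=100.0, gpu_per_elem=0.1):
    """Cost of shipping n floats to the GPU, computing, and copying back.
    transfer_overhead is what a large cache is meant to shrink."""
    return transfer_overhead + n * gpu_per_elem

def worth_offloading(n):
    """Offload only pays off once the batch is big enough to amortize
    the round trip."""
    return offload_cost(n) < cpu_cost(n)

def break_even(transfer_overhead=100.0, cpu_per_elem=1.0, gpu_per_elem=0.1):
    """Smallest batch size where offloading wins."""
    return transfer_overhead / (cpu_per_elem - gpu_per_elem)

print(worth_offloading(50))   # small batch: CPU wins
print(worth_offloading(200))  # big batch: GPU wins
```

Note that halving `transfer_overhead` halves the break-even batch size, which is the whole argument for pairing a weak CPU with lots of cache: it makes GPGPU worthwhile for smaller, more frequent workloads.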