# WiiU clockspeeds found by marcan (Wii hacker)



## Feels Good Man (Nov 29, 2012)

Marcan is a well-known Wii hacker (if you didn't know), so the information is very unlikely to be baseless speculation.

> Wii U codenames worth knowing: system Cafe, CPU Espresso, GPU/SoC/etc. Latte, ARM secure processor Starbuck (we made that one up).
> 
> 1.243125GHz, exactly. 3 PowerPC 750 type cores (similar to Wii's Broadway, but more cache).
> 
> ...


 
https://twitter.com/marcan42

I don't know too much about hardware so I won't comment, though it does seem low.


----------



## Valwin (Nov 29, 2012)

seems legit


----------



## McHaggis (Nov 29, 2012)

Hacking news is coming pretty thick and fast for the Wii U.  Things are looking up.


----------



## Lanlan (Nov 29, 2012)

Cool. How does this compare to stuff like desktop PCs, taking into account the fact that it's a different architecture? I know GHz doesn't mean everything, but do these numbers tell us any good info?


----------



## Foxi4 (Nov 29, 2012)

Clock speeds are not a good representation of horsepower - everything depends on how much of a boost the cache provides, how polished the architecture is, and how much the GPGPU capabilities will be able to take the load off the CPU itself. That, and remember that it's a triple-core, so the communication between the cores also plays a factor, and that's improved via eDRAM.


----------



## Rydian (Nov 29, 2012)

Specu...
Specu...!
SPECU...!
SPECULATION, *HO!*

PowerPC 750CL: https://www-01.ibm.com/chips/techlib/techlib.nsf/products/PowerPC_750CL_Microprocessor
According to Wikipedia, this is the closest public match to what the Wii has, with the Wii's clocked at 729MHz.

PowerPC 750: https://www-01.ibm.com/chips/techlib/techlib.nsf/products/PowerPC_750_Microprocessor
According to Marcan, this is the architecture used in the Wii U: three cores, clocked at 1.25GHz...? No, this itself can't be right.

Looking at the date of the base 750 (a 1997 introduction, which is multiple years _before_ the Wii's 750CL was developed), *I highly doubt that "PowerPC 750" is the exact model*.

He's most likely just stating that the CPU is within the 750 family... but I don't know enough to guess whether it's another modification of the 750CL or whatever.
_However, it's likely almost the same in performance per-clock_!

Given that the architecture's the same and we know the clock rate and junk, we can now make educated guesses as to how much more powerful the Wii U is compared to the Wii. Now, this is just a rough guess, and Marcan DID say that the Wii U's CPU has more cache. If this is something like L1 or L2 cache (and not something with a lot more latency) then this would help a lot, as it's been shown that more L1 and L2 cache is actually a nice performance improvement. Given the simplicity of the 750, I doubt this is something like the shared L3 cache AMD pulled off...

The Wii U's using the same base architecture at a ~70% clock boost, with more low-latency cache and two extra cores.  _The CPU, in my guess, is about 4-5 times as powerful as the Wii's at best._

People, please remember that it's primarily the GPU that's responsible for graphics.  Just figured I needed to post that...


EDIT: HAHA MATH FAIL.  Fixed.  Also note that when running single-core programs, it wouldn't be nearly that much of an improvement.
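Rydian's back-of-envelope estimate above can be sketched as a few lines of arithmetic. This is a toy calculation only: the clock figures are the ones quoted in this thread, and perfect three-core scaling is an optimistic assumption, not a measurement.

```python
# Toy estimate, not a benchmark. Clocks are the figures reported in this
# thread; perfect multi-core scaling never happens in practice, so the
# "best case" number is an upper bound.

WII_CLOCK_MHZ = 729.0       # Wii "Broadway" CPU
WIIU_CLOCK_MHZ = 1243.125   # Wii U "Espresso", per marcan
WIIU_CORES = 3

clock_boost = WIIU_CLOCK_MHZ / WII_CLOCK_MHZ   # per-core boost, ~1.71x
best_case = clock_boost * WIIU_CORES           # ~5.1x with perfect scaling

print(f"per-core clock boost: {clock_boost:.2f}x")
print(f"best case over the Wii (3 cores): {best_case:.2f}x")
```

Single-threaded code only sees the per-core boost, which lines up with the note in the edit above.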


----------



## Snailface (Nov 29, 2012)

_"it involves Wii U hacks"_
_"we're calling the WiiU security processor the Starbuck (vs. Starlet on Wii). And it seems to be about equally vulnerable"_

This is bigger news than clock speed imo.
OP's name is oddly appropriate for this thread.


----------



## Gahars (Nov 29, 2012)

And here comes another round of spec-U-lation/specsulation (take your pick).

Still, it definitely sounds like hacks for the Wii U are no longer a question of if, but when.


----------



## the_randomizer (Nov 29, 2012)

He found this out, how?


----------



## Taleweaver (Nov 29, 2012)

Nice to hear. Unfortunately, since I haven't kept up with computer development, I have pretty much no idea whether this is good or not.


In a cynical way, here's the more important question:

Are these specs good enough to make people who cry about it being inferior to MicroSony's stuff shut up?


Not sure what to make of the remark about not wanting to say how he found it out. With statements like that, I'm actually more inclined to think they took a Wii U apart and did some investigating on the parts themselves than that they somehow hacked into the thing enough to run a benchmark.


----------



## McHaggis (Nov 29, 2012)

the_randomizer said:


> He found this out, how?





			
marcan's twitter said:

> sorry, I'd rather not talk about how I got that yet. It doesn't involve leaks, it involves Wii U hacks


 


Taleweaver said:


> Are these specs good enough to make people who cry about it being inferior to MicroSony's stuff shut up?


You can't really draw a direct comparison.  Clock speed isn't comparable across different chips and architectures.


----------



## the_randomizer (Nov 29, 2012)

McHaggis said:


> You can't really draw a direct comparison. Clock speed isn't comparable across different chips and architectures.


Fair enough


----------



## Foxi4 (Nov 29, 2012)

Like I said earlier, clock speeds alone are relatively easy to measure, but that and the architecture are not enough to determine the total horsepower. There are too many variables in the equation, such as cache and its implementation, communication between cores, embedded memory, its type, speed, and implementation, and all the other junk you can jam on the silicon to improve its performance. Before the chip is x-rayed, successfully decapped, etc., we may only draw rough estimates that have little to do with facts.


----------



## Forstride (Nov 29, 2012)

And not a single fuck was given.  "News" about system specs for anything should be banned, like sales figures.  All it does is cause arguments and bring out the elitism in people.


----------



## Qtis (Nov 29, 2012)

Forstride said:


> And not a single fuck was given. "News" about system specs for anything should be banned, like sales figures. All it does is cause arguments and bring out the elitism in people.


Considering Nintendo hasn't released anything official about it, it's news indeed. Maybe not to you, but someone else may find it interesting. Also, as GBAtemp is mainly about consoles (and in many ways pro-modding), I don't see how this doesn't fit the category. Sales figures, on the other hand, are a completely different matter and are easily manipulated into saying what you want: "LOL THE IPHONE IS NOT SELLING ANYMORE LOL!" and then we see the new phone released and people buying it like there is no tomorrow.

OT: This doesn't mean much to me in the sense that I won't probably get anything out of it. On the other hand the prospect of having a modified console is interesting if nothing else.


----------



## MrDiesel (Nov 29, 2012)

The fact that the security processor is almost equally vulnerable (as the Wii's) makes this post *more* interesting than the fact that the WiiU has a certain clock speed...


----------



## Hyro-Sama (Nov 29, 2012)

So when can we expect to begin playing backups on the Wii U?

That's all anyone really cares about.


----------



## DinohScene (Nov 29, 2012)

Forstride said:


> And not a single fuck was given. "News" about system specs for anything should be banned, like sales figures. All it does is cause arguments and bring out the elitism in people.


 
So you're saying that any news in hacking progress should be banned?


----------



## Feels Good Man (Nov 29, 2012)

Forstride said:


> And not a single fuck was given. "News" about system specs for anything should be banned, like sales figures. All it does is cause arguments and bring out the elitism in people.


 
Yes, let's not encourage discussion among users unless it's positive news. Believe it or not, not all discussion is calm and us getting along. That's just anti-discussion, and we're here to debate and argue with each other to prove each other wrong. That's how message boards survive.


----------



## dicamarques (Nov 29, 2012)

It would be so weird if the Wii U were hacked before the 3DS :S But well, that's progress.
Edit: Oh you, Ninty, keeping the Wii U insecure just like the Wii.


----------



## TVNewsIsBiased (Nov 29, 2012)

Lanlan said:


> Cool. How does this compare to stuff like desktop PCs, taking into account the fact that it's a different architecture? I know GHz doesn't mean everything, but do these numbers tell us any good info?


 
Here's some food for thought. Running PS2 games on a PC will bring a single-core CPU running at frequencies as high as 3.6-3.8GHz to its knees. The highest-clocked chip in the PS2 was a little under 300MHz (the MIPS R5900), yet it will bring CPUs over 3000MHz to a crawl.

The reason is parallel processing. Rather than putting in one high-clocked chip, consoles get _many_ low-clocked chips that all work in parallel. In addition to the parallel execution, the chips in consoles aren't typical chips. They have specialized instructions for the kinds of loads they handle (vector processors, encoding/decoding chips, texture compression/decompression chips, etc.). You might recall our PCs' boards also have dedicated chips like NICs, memory controllers, audio processors, etc., but in addition to the many different CPUs in consoles, they _also_ have those chips hehe!

Even trying to run them on the multi-core CPUs found in PCs is non-trivial, because of how tightly synchronized the chips in consoles are.



Taleweaver said:


> Are these specs good enough to make people who cry about it being inferior to MicroSony's stuff shut up?


 
Haha, not even maybe.

In my honest opinion, it's just knee-jerk reactions founded in profound ignorance. I don't think the Wii U's performance issues are a fault of the Wii U's "power" but rather the fact that these ports are rushed and half-assed, not even making an _attempt_ at properly optimizing the engine for the Wii U's architecture.

The PS3 went through the same struggle in the beginning. Games were visibly crisper and more responsive on the 360 than the PS3... even though the PS3's FLOPS (floating-point operations per second) and data throughput (how much data can be moved from RAM into the Cell processor's seven cores) put it in a league of its own in terms of raw processing power.


----------



## FAST6191 (Nov 29, 2012)

Interesting. On the matter of determining potential performance from assembly and architecture these days, if you see a phrase along the lines of "I just work there, I don't live there", you will probably get somewhere close to where I peg myself, so I am out for the time being. I hope some more stuff is revealed at 29C3 next month; the timing seems a bit tight, though given the whole Wii presentation a couple of years back, a month might be more than enough.


----------



## RupeeClock (Nov 29, 2012)

The WiiU can already interact with the 3DS; maybe this'll lead to some interesting discoveries on the 3DS side if the WiiU is supposedly so vulnerable.

For the sake of completeness, the SmashStack exploit in Brawl still works perfectly fine in Wii mode. It's nuts.


----------



## 9thSage (Nov 29, 2012)

Of COURSE people here'd automatically think any kind of hacking would equal backups running on WiiU. That's really not necessarily the case; in fact, I hope it's not. I do want the system to actually be successful.


----------



## FAST6191 (Nov 29, 2012)

Other than Hyro-Sama's jest post, I see nothing of the sort, 9thSage; instead I see discussion of what it could mean and how it might play out as far as the resulting games. I am not quite sure if that counts as projecting, but either way a bit less of it is probably in order.


----------



## Rockhoundhigh (Nov 29, 2012)

In marcan I trust, though honestly that clock speed seems a bit ridiculous. I mean, 1.24 GHz!? Unless the GPGPU is magically supposed to make up for that deficit, I don't know what Nintendo was thinking processor-wise. Still, at least it's three cores.


----------



## the_randomizer (Nov 29, 2012)

Newsflash: lower clock speeds =/= lower performance.

How else could a Core i7 at 1.8 GHz outperform a Core 2 Duo at 2.4 GHz? I really wish people would stop denouncing the Wii U. It's pissing me off.
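the_randomizer's point is the classic clock-times-IPC argument, and it can be shown with a toy model. The IPC (instructions per clock) values below are invented placeholders for illustration; they are not measured numbers for any real Intel chip.

```python
# Crude model: throughput ~ clock * IPC (instructions per clock).
# The IPC values are hypothetical, chosen only to illustrate why a
# slower-clocked chip can win; they are not real measurements.

def relative_perf(clock_ghz: float, ipc: float) -> float:
    return clock_ghz * ipc

core2duo_24 = relative_perf(2.4, ipc=1.0)  # older design, baseline IPC
core_i7_18 = relative_perf(1.8, ipc=1.6)   # newer design, higher IPC

print(core_i7_18 > core2duo_24)  # the slower-clocked chip comes out ahead
```

The same reasoning is why Espresso-vs-Xenon comparisons can't be settled by clock speed alone.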


----------



## Felipe_9595 (Nov 30, 2012)

9thSage said:


> Of COURSE people here'd automatically think any kind of hacking would equal backups running on WiiU. That's really not necessarily the case; in fact, I hope it's not. I do want the system to actually be successful.


 
Ps1, Ps2 and Wii says hi.


----------



## chyyran (Nov 30, 2012)

Great, we know the WiiU is vulnerable.

However, these numbers will just be an excuse for idiotic PS360 fanboys to bitch about how the WiiU is inferior. Like we don't have enough of those already.


----------



## the_randomizer (Nov 30, 2012)

Punyman said:


> Great, we know the WiiU is vulnerable.
> 
> However, these numbers will just be an excuse for idiotic PS360 fanboys to bitch about how the WiiU is inferior. Like we don't have enough of those already.


 
My point exactly, clock speeds only tell a chapter of the story, not the entire novel.


----------



## Devin (Nov 30, 2012)

I'm definitely glad they're getting somewhere in terms of learning how the thing works, and as a 360 fanboy I can honestly say that I don't care about clock speeds. As long as the games run, and run well, the WiiU is fine in my books. But sadly I have to wait till Christmas to open mine...


----------



## Valwin (Nov 30, 2012)

o nice to know the clock speed in WII MODE


----------



## tmv_josue (Nov 30, 2012)

^ I don't think that's in vWii mode.


----------



## DiscostewSM (Nov 30, 2012)

If the WiiU bombs, are we going to call it the PiiU?


----------



## 9thSage (Nov 30, 2012)

Felipe_9595 said:


> Ps1, Ps2 and Wii says hi.


Oh yeah?  Say hi back for me if you would.


----------



## Rydian (Nov 30, 2012)

So does half of gbatemp have me on ignore or something?

The number of "we can't make comparisons" and "this doesn't tell us the power" posts _made after my paragraph-level post with resource links and comparisons_ is just... sad.


----------



## Snailface (Nov 30, 2012)

Rockhoundhigh said:


> *In marcan I trust*, though honestly that clock speed seems a bit ridiculous. I mean, 1.24 GHz!? Unless the GPGPU is magically supposed to make up for that deficit, I don't know what Nintendo was thinking processor-wise. Still, at least it's three cores.


And you should trust him. He has, since he revealed the Wii U's alarmingly slow clock, elaborated on the architectural differences between the Espresso and the Xenon: basically, the Xenon doesn't have near the advantage most people speculate (despite the great difference in clock speeds).
https://twitter.com/marcan42/status/274182672652308480

There are many more tweets supporting this opinion if you care to read his full twitter feed.


----------



## the_randomizer (Nov 30, 2012)

Snailface said:


> And you should trust him. He has, since he revealed the Wii U's alarmingly slow clock, elaborated on the architectural differences between the Espresso and the Xenon: basically, the Xenon doesn't have near the advantage most people speculate (despite the great difference in clock speeds).
> https://twitter.com/marcan42/status/274182672652308480
> 
> There are many more tweets supporting this opinion if you care to read his full twitter feed.


 
Just read his post and all I can say is QFMFT.


----------



## Maxternal (Nov 30, 2012)

I know it's not meant to be a processing powerhouse or anything, but I'm kinda wondering how the clock speed of the Starbuck compares to that of the Starlet.

The next thing that comes to mind is how they compare to ARM procs in handheld game systems.


----------



## tronic307 (Nov 30, 2012)

Rockhoundhigh said:


> In marcan I trust, though honestly that clock speed seems a bit ridiculous. I mean, 1.24 GHz!? Unless the GPGPU is magically supposed to make up for that deficit, I don't know what Nintendo was thinking processor-wise. Still, at least it's three cores.


Those clock speeds honestly make no sense unless "cool to the touch" is Nintendo's ultimate performance metric. I thought it would at least be 1458MHz (2x Wii), and that the GPU would be 486MHz with memory at 729 or 972MHz, and the audio DSP would remain at 121.5MHz. What happened to Nintendo's even multipliers for reduced latency? If Marcan is right, at least we'd know why it doesn't get as hot as the Wii, or even the 3DS.


----------



## DiscostewSM (Nov 30, 2012)

tronic307 said:


> Those clock speeds honestly make no sense unless "cool to the touch" is Nintendo's ultimate performance metric. I thought it would at least be 1458MHz (2x Wii), and that the GPU would be 486MHz with memory at 729 or 972MHz, and the audio DSP would remain at 121.5MHz. What happened to Nintendo's even multipliers for reduced latency? If Marcan is right, at least we'd know why it doesn't get as hot as the Wii, or even the 3DS.


 
Nintendo has only been shown to use even multipliers in their handhelds. I have yet to see such comparisons with their consoles, but I think it has to do with handhelds being an all-in-one device, syncing with the refresh rate of the display.


----------



## pasc (Nov 30, 2012)

I gotta say:

I find it ridiculous how fast the WiiU is being cracked open, and yet the 3DS has been out for so long and remains unhacked... I would love to use my 3DS as a game controller for my PC, for example.

Or stream videos from my ftp server and stuff like that... meh.

Still... the WiiU seems too awesome... however, since I only got a Wii last February and still haven't finished Metroid Prime Trilogy (stuck in MP2 for now... work eats my time like crazy :/ )


----------



## DiscostewSM (Nov 30, 2012)

pasc said:


> I gotta say:
> 
> I find it ridiculous how fast the WiiU is being cracked open, and yet the 3DS has been out for so long and remains unhacked... I would love to use my 3DS as a game controller for my PC, for example.


 
Um, hacking into the Wii Mode of the Wii U is like hacking into the DS mode of the 3DS. And I don't recall Marcan saying he got the clock frequencies through Wii Mode.


----------



## tronic307 (Nov 30, 2012)

DiscostewSM said:


> Nintendo has only shown to be using even multipliers in their handhelds. I have yet to see such comparisons with their consoles, but I think it has to do with handhelds being an all-in-one device, syncing with the refresh rate of the display.


N64:
93.75MHz CPU
62.5MHz GPU
CPU is 1.5x GPU Clock

Gamecube:
81MHz Audio DSP @System BCLK
162MHz GPU 2x
324MHz RAM 4x
486MHz CPU 6x

Wii:
121.5MHz Audio DSP @System BCLK
243MHz GPU 2x
486MHz RAM 4x
729MHz CPU 6x
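tronic307's even-multiplier pattern can be checked mechanically. The clock figures below are exactly the ones listed above, and the final comparison uses marcan's reported Wii U number.

```python
# Verify the even-multiplier pattern in the GameCube/Wii clock lists.
clocks_mhz = {
    "GameCube": {"dsp": 81.0, "gpu": 162.0, "ram": 324.0, "cpu": 486.0},
    "Wii":      {"dsp": 121.5, "gpu": 243.0, "ram": 486.0, "cpu": 729.0},
}

for system, parts in clocks_mhz.items():
    base = parts["dsp"]  # the audio DSP runs at the system bus clock
    multipliers = {name: mhz / base for name, mhz in parts.items()}
    print(system, multipliers)  # dsp 1x, gpu 2x, ram 4x, cpu 6x both times

# marcan's Wii U figure is not a clean multiple of the Wii's bus clock:
print(1243.125 / 121.5)  # ~10.23x
```

If marcan's figure holds, the Wii U really does break the clean-ratio convention tronic307 describes.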


----------

