Hardware: Wii U has 3 times the memory bandwidth of the Xbox One.

Duo8

Well-Known Member
OP
Member
Joined
Jul 16, 2013
Messages
3,613
Trophies
2
XP
3,024
Country
Vietnam
Taking the information above into consideration, it's been speculated that the Wii U's total bandwidth, factoring in the possible 1024 bits per eDRAM macro and the GPU, which according to TechPowerUp clocks in at 550 MHz, would come out to around 563.2 GB/s. Keep in mind that the Xbox One manages about 170 GB/s of combined bandwidth between its DDR3 and eSRAM, as outlined by Xbit Labs.
This could explain the Wii U's ease of hitting 1080p at 60fps, with few complaints from those who actually enjoy working with Nintendo's hardware.
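For anyone who wants to sanity-check that figure, here's a quick back-of-the-envelope sketch. The eight-macro count and the 1024-bit width come from the speculation above, not from any confirmed spec:

```python
# Back-of-the-envelope check of the speculated Wii U eDRAM bandwidth.
# Assumptions (from the speculation, not confirmed specs): 8 eDRAM
# macros, each with a 1024-bit interface, clocked at the GPU's 550 MHz.

GPU_CLOCK_HZ = 550_000_000   # 550 MHz, per TechPowerUp
BITS_PER_MACRO = 1024        # speculated interface width per macro
MACRO_COUNT = 8              # assumed number of eDRAM macros

bits_per_second = GPU_CLOCK_HZ * BITS_PER_MACRO * MACRO_COUNT
gb_per_second = bits_per_second / 8 / 1e9   # bits -> bytes -> GB

print(f"{gb_per_second:.1f} GB/s")                            # 563.2 GB/s
print(f"vs Xbox One: {gb_per_second / 170:.1f}x of ~170 GB/s")  # ~3.3x
```

Note that 563.2 / 170 is closer to 3.3x than a flat 3x, which is where the thread title's "3 times" comes from.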

Some other info:
There are tons more interesting tidbits in the article, such as the system's DX11-equivalent capabilities, as well as its ability to achieve Shader Model 5.0-level effects. A source who spoke with CinemaBlend, and who wished to remain anonymous, stated that of course the Wii U can't actually use DirectX, since that's a Microsoft-owned API, nor Shader Model 5.0 itself, but it can achieve some of the same effects those APIs provide through workarounds.

Source:
:arrow: GaminRealm article
 

Yepi69

Jill-sandwiched
Member
Joined
Nov 29, 2010
Messages
2,862
Trophies
2
Age
28
Location
Behind you
XP
1,776
Country
Portugal
This kinda gets me thinking about AMD's new Mantle API, but does this mean a Wii U could handle any game an XBone could play? Probably not....

Well, according to Duo8, the Wii U seems to be more powerful than the XBone, so my guess is that since it's more powerful, the Wii U should be able to play XBone games as well. Of course the code would be different, but graphically it could keep up.
 

trumpet-205

Embrace the darkness within
Member
Joined
Jan 14, 2009
Messages
4,363
Trophies
0
Website
Visit site
XP
693
Country
United States
What's holding back the Wii U's performance is neither the GPU nor the memory bandwidth, but the CPU itself.
This kinda gets me thinking about AMD's new Mantle API, but does this mean a Wii U could handle any game an XBone could play? Probably not....
Mantle has nothing to do with consoles. It's a PC-only API designed to give developers the kind of low-level hardware control that was previously only available on consoles.
 

vayanui8

Well-Known Member
Member
Joined
Nov 11, 2013
Messages
1,086
Trophies
0
XP
908
Country
United States
For as much as people bash the Wii U for being underpowered, I find this to be extremely ironic. While it may still be less powerful than the PS4, this shows it isn't as bad as people say it is. Hopefully this will cut down on some of the bashing, but I'm certain the haters will just find some new reason, or intentionally stay in the dark.
 

Luckkill4u

4 guys in a car ( ͡° ͜ʖ ͡°)
Member
Joined
Jul 13, 2008
Messages
1,028
Trophies
1
Age
30
Location
Insomnia
Website
www.gbatemp.net
XP
1,131
Country
Canada
Well, according to Duo8, the Wii U seems to be more powerful than the XBone, so my guess is that since it's more powerful, the Wii U should be able to play XBone games as well. Of course the code would be different, but graphically it could keep up.

I'm pretty sure it doesn't work that way. The way I see it, the Wii U's GPU can move data approximately 4x faster than the XBone's, but that doesn't mean it can handle better graphics. This is telling us that the Wii U can get data from A to B faster than the XBone, which would explain why it's easier to get HD content running on the Wii U.

IMO the XBone is sad... no Titanfall in HD is a weak way to go if that's going to be their next big game. I played the beta on my PC at 1080p with ultra graphics and it was great, yup, great!
 
  • Like
Reactions: Ray Lewis

trumpet-205

Embrace the darkness within
Member
Joined
Jan 14, 2009
Messages
4,363
Trophies
0
Website
Visit site
XP
693
Country
United States
Keep in mind that numbers alone don't tell you a thing about overall performance. That's like saying whoever has the higher TFLOPS has the higher performance, which has been proven wrong in the past.
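A toy way to see why: effective throughput is capped by whichever is lower, the compute peak or what memory can actually feed it. All the numbers below are made up purely for illustration, not real console specs:

```python
# Toy roofline model: a chip's usable FLOPS is capped by how fast
# memory can feed it, not just by its headline peak TFLOPS.
# All figures are illustrative, not real console specs.

def attainable_gflops(peak_gflops, bandwidth_gbs, flops_per_byte):
    """Effective throughput is the lesser of the compute and memory limits."""
    return min(peak_gflops, bandwidth_gbs * flops_per_byte)

# Hypothetical "big TFLOPS, slow memory" chip vs a more balanced one:
print(attainable_gflops(peak_gflops=1300, bandwidth_gbs=68,  flops_per_byte=4))  # 272
print(attainable_gflops(peak_gflops=350,  bandwidth_gbs=563, flops_per_byte=4))  # 350
```

The "weaker" chip on paper ends up delivering more of its peak, which is exactly the trap with comparing raw numbers.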
 

grossaffe

Well-Known Member
Member
Joined
May 5, 2013
Messages
3,007
Trophies
0
XP
2,799
Country
United States
I've been saying for a long time that the memory hierarchy of the Wii U was well-engineered and a benefit to the system. People always want to talk about the frequency or the flops a processor can churn out under ideal conditions, but memory hierarchy is so under-appreciated (probably because it's not as easy to understand how it actually affects a system).

With a good memory hierarchy, you can get more out of your processors, as it limits the time spent waiting for data to be pulled from memory. On-chip registers are effectively instantaneous, cache is still rather fast, RAM is pretty slow, and HDDs move at a snail's pace. With a well-thought-out hierarchy, you can limit how often you have to call out to each level that's orders of magnitude slower than the one before it. All those gigaflops aren't gonna do much when the data needing processing isn't available.
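To put a rough number on that, here's the classic average-memory-access-time formula. The latencies are ballpark textbook figures, not measured Wii U values:

```python
# Average memory access time (AMAT) sketch: why a good cache
# hierarchy matters more than raw peak numbers. Latencies are
# generic textbook ballparks, not measured console values.

def amat(hit_time, miss_rate, miss_penalty):
    """Classic AMAT formula: hit cost plus the miss rate's share of the penalty."""
    return hit_time + miss_rate * miss_penalty

CACHE_TIME = 10   # cycles to hit in cache
RAM_PENALTY = 200 # extra cycles to go out to RAM on a miss

# Cutting the cache miss rate from 10% to 2% nearly halves
# the average cost of every memory access:
print(amat(CACHE_TIME, 0.10, RAM_PENALTY))  # 30.0 cycles
print(amat(CACHE_TIME, 0.02, RAM_PENALTY))  # 14.0 cycles
```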
 
  • Like
Reactions: Sterling

Nathan Drake

Obligations fulfilled, now I depart.
Member
Joined
Jan 2, 2011
Messages
6,192
Trophies
0
XP
2,707
Country
Does anybody with all that technical know-how want to educate us idiots on what this could actually mean in a situation where the Wii U was, say, performing? And what about the PS4 and Xbox One? Is the Wii U still too different for the average developer to give a damn about porting things to, or even developing for, a system that's totally different from the competition? Because from what I read, it basically just said that the Wii U still can't really use any of the standards the competition uses, so although it's capable, it's still at a disadvantage if you're developing for multiple consoles.
 

Sterling

GBAtemp's Silver Hero
Member
Joined
Jan 22, 2009
Messages
4,023
Trophies
1
Age
32
Location
Texas
XP
1,100
Country
United States
Does anybody with all that technical know-how want to educate us idiots on what this could actually mean in a situation where the Wii U was, say, performing?

Basically, the Wii U can move 4x the graphics data across the buses that need it. Say the Wii U and the XBone both need to redraw the screen (taking it back to basics). If the Xbox One has the same level of sprite detail, the Wii U will theoretically be able to render 4x the sprites, or the same number of sprites at 4x the frame rate. That's only if games relied solely on the graphics card, though. Since the CPU plays a huge factor, the PS4 and the One have the Wii U outclassed in any race.
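As a minimal sketch of that trade-off, assume a fixed per-frame bandwidth budget and a hypothetical 64x64 RGBA sprite (both are assumptions for illustration, and real games are never purely bandwidth-bound):

```python
# Sketch of the sprites-vs-frames trade-off: with a fixed per-frame
# bandwidth budget, 4x the bandwidth buys 4x the sprite traffic per
# frame OR the same traffic at 4x the frame rate. Purely illustrative;
# it assumes bandwidth is the only bottleneck, which it never is.

def sprites_per_frame(bandwidth_gbs, fps, bytes_per_sprite):
    budget_per_frame = bandwidth_gbs * 1e9 / fps   # bytes available per frame
    return budget_per_frame / bytes_per_sprite

BYTES_PER_SPRITE = 64 * 64 * 4   # hypothetical 64x64 RGBA sprite

base = sprites_per_frame(170, 60, BYTES_PER_SPRITE)
quad = sprites_per_frame(680, 60, BYTES_PER_SPRITE)
print(quad / base)                                            # 4.0x the sprites at 60 fps
print(sprites_per_frame(680, 240, BYTES_PER_SPRITE) / base)   # 1.0x the sprites at 4x the fps
```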

Adding to what grossaffe said, both the Wii and Wii U have been really well designed. If that buffer overflow hadn't existed in the first place, the Wii might have been the last to be hacked.
 

Sterling

GBAtemp's Silver Hero
Member
Joined
Jan 22, 2009
Messages
4,023
Trophies
1
Age
32
Location
Texas
XP
1,100
Country
United States
Edit: My university has the shittiest internet tonight. My b.

It also doesn't account for the bottlenecks that the RAM and storage media can cause. In a game like Galaxy Wars, most of the level data is kept in the CPU cache/RAM, so things rarely need to move from internal storage to the graphics card (it's also a really small game). Whereas in a game like Skyrim, which is frequently pulling things from everywhere, the bottleneck lies in the disk. That's why installing a game makes it run better: it can simply pull from the faster storage.
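A crude sketch of why installing helps: streaming the same hypothetical asset pack from an optical drive versus an installed copy on an HDD. The throughput figures are generic ballpark numbers, not measured console values:

```python
# Storage-bottleneck sketch: same asset pack, two sources.
# Throughput numbers are generic ballparks, not measured values.

ASSET_MB = 512   # hypothetical level's worth of assets
DISC_MBS = 22    # rough optical-drive sequential read speed
HDD_MBS = 100    # rough 5400rpm HDD sequential read speed

print(f"disc: {ASSET_MB / DISC_MBS:.1f}s")   # ~23.3s
print(f"hdd:  {ASSET_MB / HDD_MBS:.1f}s")    # ~5.1s
```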

EDIT: The above is why the PS4 might win out in the long run. It has the more efficient RAM streaming of the two.
 
  • Like
Reactions: Ray Lewis

grossaffe

Well-Known Member
Member
Joined
May 5, 2013
Messages
3,007
Trophies
0
XP
2,799
Country
United States
Basically, the Wii U can move 4x the graphics data across the buses that need it. Say the Wii U and the XBone both need to redraw the screen (taking it back to basics). If the Xbox One has the same level of sprite detail, the Wii U will theoretically be able to render 4x the sprites, or the same number of sprites at 4x the frame rate. That's only if games relied solely on the graphics card, though. Since the CPU plays a huge factor, the PS4 and the One have the Wii U outclassed in any race.

Adding to what grossaffe said, both the Wii and Wii U have been really well designed. If that buffer overflow hadn't existed in the first place, the Wii might have been the last to be hacked.
I'm actually curious as to how much the CPU will matter moving forward. In the past generation, the CPU was important because the GPUs were already overloaded, so things that are better done on the GPU, like physics, were pushed to the CPU. The Wii U was designed with GPGPU in mind, so it should be pushing more onto the GPU rather than the CPU. I'm also wondering how much the Xbone and PS4 will get out of the octo-core Jaguar. Last I knew, most games only really saw performance boosts up to three cores. Is parallelism going to take a huge step and actually utilize all those cores? (Well, aside from the resources being wasted on overhead; isn't one of them dedicating two cores to the OS?)
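For a feel of why more cores might not help much, here's Amdahl's law with made-up parallel fractions (illustrative only; nobody outside the studios knows the real numbers):

```python
# Amdahl's-law sketch: extra cores only speed up the parallel
# fraction of a game loop. The fractions below are illustrative.

def speedup(parallel_fraction, cores):
    """Amdahl's law: the serial part is unaffected by extra cores."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

# If only 60% of a frame's work parallelizes, six usable Jaguar
# cores (eight minus ~two reserved for the OS) don't buy much:
print(speedup(0.60, 3))   # ~1.67x
print(speedup(0.60, 6))   # 2.0x
print(speedup(0.95, 6))   # 4.8x -- better-threaded engines change the picture
```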
 
