Discussion in 'Wii U - Console, Accessories and Hardware' started by Duo8, Feb 25, 2014.
Some other info:
About time Nintendo decided to step in and balance games between quality and gameplay.
This kinda gets me thinking about AMD's new Mantle API, but does this mean that a Wii U could handle any game an XBone could play? Probably not....
Well, according to Duo8, the Wii U seems to be more powerful than the XBone, so my guess is that since it's more powerful, the Wii U should be able to run XBone games as well. Of course the coding would be different, but graphically it could.
What's holding back Wii U performance is neither the GPU nor the memory bandwidth, but the CPU itself.
Mantle has nothing to do with consoles. It's a PC-only API designed to give developers low-level hardware control, which was previously only available on consoles.
For as much as people bash the wii u for being underpowered, I find this to be extremely ironic. While it may still be less powerful than the ps4, this is proof that it isn't as bad as people say it is. Hopefully this will drop some of the bashing, but I'm certain the haters will just find some new reason, or intentionally stay in the dark.
Careful about saying anything positive about the WiiU, it's dangerous around these parts.
I'm pretty sure it doesn't work that way. The way I see it is that the Wii U's GPU can move data approximately 4x faster than the XBone's, but that doesn't mean it can handle better graphics. This is telling us that the Wii U can get data from A to B faster than the XBone, and would explain why it's easier to get HD content on the Wii U.
IMO the XBone is sad... No Titanfall in HD is a weak way to go if they're going to make this their next big game. I played the beta on my PC at 1080p ultra gfx and it was great, yup great!
Good news about the Wii U!?! Quick, get the Nintendoomers in here!
Keep in mind numbers alone don't tell you a thing about overall performance. That's like saying whoever has the higher TFLOPS has the higher performance, which has been proven wrong in the past.
I've been saying for a long time that the memory hierarchy of the Wii U was well-engineered and a benefit to the system. People always want to talk about the frequency or the flops a processor can churn out under ideal conditions, but memory hierarchy is under-appreciated (probably because it's not as easy to understand how it actually affects a system).
With a good memory hierarchy, you can get more out of your processors because it limits how long they sit waiting for data to be pulled from memory. On-chip registers are effectively instantaneous, cache is still quite fast, RAM is comparatively slow, and then HDDs move at a snail's pace. With a well-thought-out hierarchy, you can limit how often you have to call out to a memory level that's orders of magnitude slower than the one above it. All them gigaflops aren't gonna do much when the data needing processing isn't available.
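To put some code behind that, here's a rough Python sketch of why access patterns against the memory hierarchy matter. Everything here (the matrix size, the function names) is made up for illustration; the payoff is much bigger in a compiled language on real hardware, where the sequential walk hits cache lines in order:

```python
# Illustrative sketch: same total work, two memory access orders.
N = 1000
matrix = [[i * N + j for j in range(N)] for i in range(N)]

def sum_row_major(m):
    # Walks memory in the order rows are laid out -> good cache locality.
    total = 0
    for row in m:
        for value in row:
            total += value
    return total

def sum_column_major(m):
    # Jumps between rows on every access -> far more cache misses on
    # real hardware, even though the arithmetic is identical.
    total = 0
    n = len(m)
    for j in range(n):
        for i in range(n):
            total += m[i][j]
    return total

# Both compute the same answer; only the memory access pattern differs.
assert sum_row_major(matrix) == sum_column_major(matrix)
```

Same flops either way; the only difference is how often the CPU has to wait on a slower level of the hierarchy.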
Pfft, I only care about games, console power doesn't matter... unless my console of choice has a technical advantage in a select few categories, of course.
Does anybody with all that technical know-how want to educate us idiots on what this could actually mean in practice for the Wii U's performance? And what about the PS4 and Xbox One? Is the Wii U still too different for the average developer to bother porting to, or even developing for? Because from what I read, it basically said the Wii U still can't use any of the standards the competition uses, so even though it's capable, it's still at a disadvantage if you're developing for multiple consoles.
Where is your source? Sounds hokey.
Basically, the Wii U can move 4x the graphics data between the buses that need it. Say the Wii U and the XBone both need to redraw the screen (taking it back to basics). If the Xbox One has the same level of sprite detail, the Wii U will theoretically be able to render 4x the sprites, or the same number of sprites at 4x the frame rate. That's only if games relied solely on the graphics card, though. Since the CPU plays a huge factor, the PS4 and the One have the Wii U outclassed in any real race.
Adding to what gossaffe said, both the Wii and Wii U have been really well designed. If that buffer overflow hadn't existed in the first place, the Wii might have been the last to be hacked.
Thank you. That was precisely the type of explanation I was looking for.
I'll raise you one: I enjoy Michael Pachter's opinions.
It also doesn't account for the bottlenecks that the RAM and storage media can cause. In a game like Galaxy Wars, most of the level data is kept in the CPU cache/RAM, so things rarely have to move from internal storage to the graphics card (it's also a really small game). Whereas in a game like Skyrim, which is frequently pulling things from everywhere, the bottleneck lies in the disk. That's why installing a game makes it run better: it can simply pull from the faster storage.
EDIT: The above is why the PS4 might win out in the long run. It has the more efficient RAM streaming of the two.
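A quick back-of-the-envelope in Python for the install-speed point. The transfer rates and asset size below are made-up stand-ins, not measured console figures:

```python
# Sketch of why installing helps: the same asset pull at different
# transfer rates. All numbers below are illustrative assumptions.

def load_time_seconds(asset_bytes, bytes_per_second):
    return asset_bytes / bytes_per_second

ASSET = 256 * 1024**2          # a 256 MB level chunk (assumption)
OPTICAL_DISC = 22 * 1024**2    # ~22 MB/s optical drive (made-up figure)
HDD = 100 * 1024**2            # ~100 MB/s hard drive (made-up figure)

disc_time = load_time_seconds(ASSET, OPTICAL_DISC)  # ~11.6 s
hdd_time = load_time_seconds(ASSET, HDD)            # ~2.6 s

# The installed copy streams the same data several times faster.
assert hdd_time < disc_time
```

The exact ratio doesn't matter; the point is that a game like Skyrim pays that cost constantly, while a small game that fits in RAM pays it almost never.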
The Wii U was designed to rely more on the GPGPU than the CPU, which benefits the Wii U in this case.
I'm actually curious as to how much the CPU will matter moving forward. In the past generation, the CPU was important because the GPUs were already overloaded, so things better done on the GPU, like physics, were pushed to the CPU. The Wii U was designed with GPGPU in mind, so it should be pushing more onto the GPU rather than the CPU. I'm also wondering how much the Xbone and PS4 will get out of the octo-core Jaguar. Last I knew, most games only really saw performance boosts up to three cores. Is parallelism going to take a huge step and actually utilize all those cores? (Well, aside from the resources wasted on overhead; isn't each of them dedicating two cores to the OS?)
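The diminishing-returns worry can be sketched with Amdahl's law. The 70% parallel fraction below is an assumption I picked for illustration, not data from any actual game engine:

```python
# Amdahl's-law sketch of the "games only scale to ~3 cores" point.
# The parallel fraction is a made-up assumption, not game data.

def speedup(parallel_fraction, cores):
    # Amdahl's law: the serial part never shrinks; only the parallel
    # part divides across the cores.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

P = 0.70  # assume 70% of a frame's work can run in parallel

s3 = speedup(P, 3)   # ~1.88x over a single core
s6 = speedup(P, 6)   # ~2.40x over a single core

# Doubling the usable cores from 3 to 6 buys well under a 1.5x gain.
assert s6 / s3 < 1.5
```

So even if the Jaguar exposes six usable cores after the OS reservation, the payoff depends entirely on how much of the engine's work is actually parallelizable.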