PS4 & XONE CPU Architectures & Models Examined

Rydian

Resident Furvert™
OP
Member
Joined
Feb 4, 2010
Messages
27,880
Trophies
0
Age
36
Location
Cave Entrance, Watching Cyan Write Letters
Website
rydian.net
XP
9,111
Country
United States
Screen_Shot_2013_05_22_at_11_52_48_PM.png

Jaguar is an out-of-order design, its base architecture dates from 2010, and it's built on a 28nm fabrication process. It makes some performance/complexity tradeoffs to reduce power consumption and die size (down to 3.1mm^2 from Bobcat's 4.9mm^2), since Jaguar is designed to give tablets a big performance increase. It supports more instructions than AMD's previous low-power cores (SSE4.x, hardware AES, and more), has increased floating-point performance over Bobcat, and AMD claims ~15% instructions-per-clock gains over Bobcat.
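
For anybody curious whether their own CPU already has those extensions, here's a quick sketch using the GCC/Clang x86 feature-detection builtins (nothing console-specific, just a desktop check of the same SSE4.x/AES features Jaguar adds):

Code:
/* Quick runtime check for the instruction-set extensions Jaguar adds
   over Bobcat: SSE4.1, SSE4.2, and hardware AES (AES-NI).
   Build with GCC or Clang on an x86/x86-64 machine. */
#include <stdio.h>

int main(void)
{
    printf("SSE4.1: %s\n", __builtin_cpu_supports("sse4.1") ? "yes" : "no");
    printf("SSE4.2: %s\n", __builtin_cpu_supports("sse4.2") ? "yes" : "no");
    printf("AES-NI: %s\n", __builtin_cpu_supports("aes")    ? "yes" : "no");
    return 0;
}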

So from the Jaguar architecture, AMD is building two main chips.
  • Kabini is for the desktop market, and there are already model specifications out there, all with "AMD HD 8xxx" GPUs integrated. The 8xxx series supports DirectX 11.1, OpenGL 4.3, and OpenGL ES 3.0 (insert Dolphin mention here). The current models do not have Turbo enabled, though Jaguar itself supports it. TDP is 9-25W.

  • Temash is the second chip built on Jaguar, created for low-power environments. TDP is 3.9-9W, one of the publicly-available models will have Turbo (clocking up from 1.0GHz to 1.4GHz), and Temash comes in dual- and quad-core versions.

As for the specific models and setups that the PS4 and XONE are using, both of them are using eight-core setups... which means that both of them are using two four-core Jaguar units, a multi-CPU setup (rarely seen outside servers/farms and the failed Skulltrail). Since sharing data and processes across two different CPUs is a more taxing and limiting process than sharing across cores on one physical CPU, it seems that one of the four-core CPUs will be used for the main program and/or games, with the other four-core CPU used for overlays, background processes, possibly additional hardware control, etc.
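
As a rough illustration of why that split matters, here's a minimal Linux/pthreads sketch of how a game could pin its worker threads to the first four-core cluster and leave the other four cores free for background work. The core numbering and the game/background split are my assumption for the example, not a confirmed console detail:

Code:
/* Minimal sketch (Linux, glibc): pin a worker thread to cores 0-3,
   the hypothetical "game" cluster, leaving cores 4-7 for OS/overlay work. */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

static void *game_worker(void *arg)
{
    (void)arg;
    /* ... game/render work would happen here ... */
    return NULL;
}

int main(void)
{
    pthread_t worker;
    cpu_set_t game_cluster;

    CPU_ZERO(&game_cluster);
    for (int core = 0; core < 4; core++)      /* cores 0-3: assumed game cluster */
        CPU_SET(core, &game_cluster);

    pthread_create(&worker, NULL, game_worker, NULL);

    /* Restrict the worker to the game cluster so it never migrates onto
       the cores reserved (in this example) for background processes. */
    if (pthread_setaffinity_np(worker, sizeof(game_cluster), &game_cluster) != 0)
        perror("pthread_setaffinity_np");

    pthread_join(worker, NULL);
    return 0;
}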

And here's the chart most people are more interested in...
specs.png

The Jaguar architecture currently clocks up to 2.0GHz, but most rumors point to clocks of 1.6GHz for both the PS4 and XONE. The PS4 and XONE seem to have similar or even identical CPU specs, but as we've already heard, the PS4 is going to be using much higher-clocked RAM and 50% more GPU cores.
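
For some back-of-the-envelope math on what "50% more GPU cores" buys at equal clocks, here's a small sketch. The compute-unit count and clock below are placeholder guesses for illustration, not confirmed specs; only the 50% ratio comes from the rumors:

Code:
/* Illustrative only: with identical clocks, 50% more GCN compute units
   means 50% more theoretical shader throughput. CU count and clock are
   placeholders, not confirmed PS4/XONE specs. */
#include <stdio.h>

int main(void)
{
    const double clock_ghz = 0.8;              /* hypothetical GPU clock */
    const int xone_cus = 12;                   /* hypothetical baseline CU count */
    const int ps4_cus  = xone_cus * 3 / 2;     /* "50% more" per the rumors */
    const int flops_per_cu_per_cycle = 64 * 2; /* GCN: 64 lanes per CU, FMA = 2 FLOPs */

    double xone_tflops = xone_cus * flops_per_cu_per_cycle * clock_ghz / 1000.0;
    double ps4_tflops  = ps4_cus  * flops_per_cu_per_cycle * clock_ghz / 1000.0;

    printf("XONE: %.2f TFLOPS, PS4: %.2f TFLOPS (%.0f%% more)\n",
           xone_tflops, ps4_tflops, 100.0 * (ps4_tflops / xone_tflops - 1.0));
    return 0;
}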

If we knew the specs for the various AMD HD 8xxx cores, we could possibly figure out the closest CPU models for the PS4 and XONE and fill in the missing specs on both sides, but that info doesn't seem to be available just yet since the cores only launched commercially yesterday. It'll take time for the CPU-Z and GPU-Z databases to catalog all the exact differences between the models... and that also assumes the PS4 and XONE will be using the integrated GPUs and not a separate dedicated chip (the XONE motherboard appears to have no expansion cards, so the GPU is either the one on the CPU die or somewhere else on the motherboard).

Also, for anybody who's still holding out hope for 360 BC on the XONE...
The move away from PowerPC to 64-bit x86 cores means the One breaks backwards compatibility with all Xbox 360 titles. Microsoft won’t be pursuing any sort of a backwards compatibility strategy, although if a game developer wanted to it could port an older title to the new console. Interestingly enough, the first Xbox was also an x86 design - from a hardware/ISA standpoint the new Xbox One is backwards compatible with its grandfather, although Microsoft would have to enable that as a feature in software - something that’s quite unlikely.


Jaguar Source
PS4 & XONE Source



Also here's my obligatory "Consoles are just pre-setup desktops with a locked OS" line.
 

Celice

Well-Known Member
Member
Joined
Jan 1, 2008
Messages
1,920
Trophies
1
XP
628
Country
United States
Soo, the XOne is just an underpowered PS4 with more fees attached?
Why would I want this thing again?

It needs some REALLY good exclusives to attract any customers.
If it costs $100 less and the two systems continue to share, like, 80% of their total game library (excluding exclusives), I don't see why someone wouldn't get it, so long as the games aren't perpetually priced as high as possible (Xbox is looking a lot like Origin...).

I'm actually super interested in what the launch prices will be. If they're high enough, it might end up that getting a gaming PC would be more economical than going with a dedicated console. Sony at least is looking to offer exclusive games worth getting on their systems.
 

Rydian

Resident Furvert™
OP
Member
Joined
Feb 4, 2010
Messages
27,880
Trophies
0
Age
36
Location
Cave Entrance, Watching Cyan Write Letters
Website
rydian.net
XP
9,111
Country
United States
MS wants the Xbox One to be an all-in-one system, eliminating everything in the living room but the Xbox One and yet they force users to use the Xbox 360 along with the Xbox One. All-in-one, not so much.
I don't have an XBOX 360.

Is there something wrong with me?
 

Sakitoshi

GBAtemp Official Lolimaster
Member
Joined
May 8, 2012
Messages
2,256
Trophies
2
Age
33
Location
behind a keyboard or a gamepad
Website
sakiheru.blogspot.com
XP
2,911
Country
Chile
I don't have an XBOX 360.

Is there something wrong with me?
Not at all. All you need is a PS3 and a tuner (all TVs have one conveniently integrated) to have a multimedia center that plays Blu-ray, digital movies, music, and AAA games. You can add a Wii (U) for retro gaming, and if you want digital TV, providers give you a digital box when you purchase their services, with a free remote control!! No need to talk to your electronics!!!
 
  • Like
Reactions: kehkou

Foxi4

Endless Trash
Global Moderator
Joined
Sep 13, 2009
Messages
30,824
Trophies
3
Location
Gaming Grotto
XP
29,819
Country
Poland
As far as I know, both CPUs have been customized by AMD - comparing stock values is a little bit of a stretch since we don't know what kind of add-ons Sony and Microsoft chose for their versions. All we know right now is that both are based on AMD Jaguar, but by itself that doesn't mean much. ;)
 

chartube12

Captain Chaz 86
Member
Joined
Mar 3, 2010
Messages
3,921
Trophies
1
XP
2,280
Country
United States
Soo, the XOne is just an underpowered PS4 with more fees attached?
Why would I want this thing again?

It needs some REALLY good exclusives to attract any customers.

So, partially, was the 360. Didn't you read how Microsoft paid IBM for their share of the research used to make the PS3 in order to make the 360? I posted about it back when I first joined.
 

Rydian

Resident Furvert™
OP
Member
Joined
Feb 4, 2010
Messages
27,880
Trophies
0
Age
36
Location
Cave Entrance, Watching Cyan Write Letters
Website
rydian.net
XP
9,111
Country
United States
As far as I know, both CPUs have been customized by AMD - comparing stock values is a little bit of a stretch since we don't know what kind of add-ons Sony and Microsoft chose for their versions. All we know right now is that both are based on AMD Jaguar, but by itself that doesn't mean much. ;)
The sources include an XONE teardown and info from suppliers, which is why I included the actual numbers.
 

Sakitoshi

GBAtemp Official Lolimaster
Member
Joined
May 8, 2012
Messages
2,256
Trophies
2
Age
33
Location
behind a keyboard or a gamepad
Website
sakiheru.blogspot.com
XP
2,911
Country
Chile
So was partially the 360. didn't you read how microsoft paid IBM for their part of the research used to make the ps3, to make the 360? I posted it back when I first joined.
this...
The X360 CPU is a beta version of the PS3 Cell. Microsoft rushed the launch of its console and used the beta-stage CPU even though it was a poorer version of that hardware, but in that round of the console war the X360 had the better GPU, which was easier for developers to program for. The PS3's RSX is not as powerful as the X360's ATI card, but the Cell can handle everything the RSX can't and help in the rendering process; that's why ports sucked on the other console when a game was fully developed on one and then ported to the other, like Bayonetta and Final Fantasy XIII.

Now it's a very similar scenario, except that the PS4 has the more powerful GPU, according to the rumors anyway.
 
  • Like
Reactions: chartube12

FAST6191

Techromancer
Editorial Team
Joined
Nov 21, 2005
Messages
36,798
Trophies
3
XP
28,311
Country
United Kingdom
If it is two dies, I wonder if they will be merged onto a single one later in the console's life, like some of the server chips. We have already seen slightly varying hardware revisions and refinements hit in the past, but this could be quite interesting.

The low power stuff caught me a bit off guard though, it seemed like consoles were heading higher and higher there for a while.
 

Rydian

Resident Furvert™
OP
Member
Joined
Feb 4, 2010
Messages
27,880
Trophies
0
Age
36
Location
Cave Entrance, Watching Cyan Write Letters
Website
rydian.net
XP
9,111
Country
United States
So Rydian, would this mean next-gen games would be able to run on PC setups with the same specs? Or is there some kind of super optimization that has to be done?
They wouldn't be built for the same OS, so they'd need to be virtualized at the least, emulated at the worst.

EDIT:
The low power stuff caught me a bit off guard though, it seemed like consoles were heading higher and higher there for a while.
Well this is "low TDP" in comparison to PC stuff.
 

Foxi4

Endless Trash
Global Moderator
Joined
Sep 13, 2009
Messages
30,824
Trophies
3
Location
Gaming Grotto
XP
29,819
Country
Poland
the X360 CPU is a beta version of the PS3 Cell
...the two processors are nothing alike in the grand scheme of things? Work on the Cell began back in 2001, and it was backed by Sony, Toshiba, and IBM themselves; the idea was to create a central processing unit supported by specialized cores around it, hence the PPE+SPE combination.

Microsoft took no part in its creation or design - the Xenon CPU cores are merely based on the modified PPE. It doesn't make it a Cell processor or its "beta" - the PPE is just a normal PowerPC core. The whole point of the Cell was the introduction of the SPE cores, not the central PPE.
 

chartube12

Captain Chaz 86
Member
Joined
Mar 3, 2010
Messages
3,921
Trophies
1
XP
2,280
Country
United States
^^^^^^^^^^^

I think the point he was making is that the 360's CPU is based on unrefined and unfinished work. The rest of his statement is dead-on about what happened with the 360.
 

kehkou

does what Nintendon't
Member
Joined
Dec 19, 2009
Messages
798
Trophies
1
Location
The Duke City
XP
1,093
Country
United States
Handheld = Designed for lower power consumption to conserve battery -YAY!

Console = Designed for lower power consumption to... I don't know why. It's connected to mains power! Overclock the hell out of it, for Christ's sake!
 
