# Is the PS4's GDDR5 really all that it is cracked up to be?



## DiscostewSM (Mar 13, 2013)

I've gone over many other forums where people are quite ecstatic about the PS4 using GDDR5 as its main RAM bank, but I can't help but feel disturbed by it. I mean, yes, it has much higher bandwidth than DDR3, but isn't the latency of DDR3 far lower in comparison? GPUs are able to mask GDDR5's high latency because they are designed for parallel processing with their hundreds upon hundreds of cores (which is why it's called "Graphics"-DDR), but the CPU (in the case of the PS4) only has 8 cores, and it has to deal with the non-graphical components that make up pretty much the rest of a game. It's far more serial in design than a GPU is, so wouldn't that create a possible bottleneck? People on other forums dismiss this, saying that GDDR5 will work just as well for the CPU as DDR3 does, but no one has actually shown any proof of that; they just assert it.

Can anyone here clear this argument up, please?


----------



## Ergo (Mar 13, 2013)

I can clarify why this is getting talked up so much--it's the only bright spot in the PS4 hardware. Everything else is nothing to write home about, especially if you're a PC gamer.

So I guess I'd just say that it doesn't matter what the issues with the GDDR5 are: it's the PS4's equivalent of the PS3's Cell chip or the PS2's Emotion Engine, i.e. a talking point to rally the 'Sony builds another supercomputer' crowd, which is completely impervious to logic and reason.

Note: I am not going to become embroiled in a flame war over this but, please, knock yourself out if you must!


----------



## Guild McCommunist (Mar 13, 2013)

A thread about the power of the PS4?


----------



## Fishaman P (Mar 13, 2013)

From what I know, GDDR5 itself doesn't have high latency; the issue on PCs is the latency between the GPU and the CPU.  Since the PS4 isn't modular, they can be wired right to each other.


----------



## Foxi4 (Mar 13, 2013)

Ergo said:


> I can clarify why this is getting talked up so much--it's the only bright spot in the PS4 hardware. Everything else is nothing to write home about, especially if you're a PC gamer.


Sure. Most PCs these days are _equipped with hexacore CPUs with on-die integrated graphics chips connected by GPU-grade, fast shared memory, so that every asset is kept in one pool, nullifying the need for copying stuff back and forth. (Not.)_

It's not all about _"the memory used being GDDR5"_; it's about how the hardware is designed to be efficient at a low cost. If we were to compare it to a PC then yes, it's pretty _"weak"_, but so was every console ever made. The key lies in how it performs in comparison, and as we saw at the presentation, it performs perfectly fine.


----------



## Maxternal (Mar 13, 2013)

Also, if you're saying that it's a good thing to connect it to the GPU but not needed for the CPU, remember that here BOTH are supposedly using the same memory, so it IS what's connected to the GPU.


----------



## Foxi4 (Mar 13, 2013)

Maxternal said:


> Also, if you're saying that it's a good thing to connect to the GPU but not needed in the CPU, remember that here BOTH are supposedly using the same memory so it IS what's connected to the GPU.


There's another benefit of connecting the GPU and the CPU, and that's GPGPU: the CPU can pass floating-point operations directly to the GPU, which is far more efficient at them, and the GPU can pass the processed results right back to the CPU... which is great. That, and you only have one heat source to worry about rather than two.
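Nothing about the PS4's actual GPGPU interface is public, so the following is purely a CPU-side sketch of the idea being described: split a float-heavy job into chunks, hand them off to parallel workers (a thread pool standing in for the GPU's many cores), and collect the result back. The names `dot` and `dot_offloaded` are made up for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def dot(chunk):
    # The floating-point-heavy inner work: sum of squares of one chunk.
    return sum(x * x for x in chunk)

def dot_offloaded(values, workers=4):
    # Split the job into chunks, dispatch them to parallel workers
    # (standing in for the GPU here), then combine the partial results
    # back on the "CPU" side.
    n = max(1, len(values) // workers)
    chunks = [values[i:i + n] for i in range(0, len(values), n)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(dot, chunks))

print(dot_offloaded([1.0, 2.0, 3.0, 4.0]))  # 30.0
```

The win on real hardware comes from the dispatch and the result both living in the same memory pool, so neither direction involves a copy over a bus.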


----------



## Gahars (Mar 13, 2013)

My sources from Sony have told me, and I quote, "It is all that _and_ a bag of potato chips."


----------



## Maxternal (Mar 13, 2013)

Foxi4 said:


> There's another benefit of connecting the GPU and the CPU and that's GPGPU - the CPU can pass floating point operations directly to the GPU which is far more efficient at them and the GPU can pass the processed information right back to the CPU... which is great. That, and you only have one heat source to worry about rather than two.


Good point. Hadn't thought of that. Kinda reminds me of back in the pre-Pentium days, when you could have a separate floating-point coprocessor if you wanted a really powerful number-crunching machine... only better.


----------



## marcus134 (Mar 13, 2013)

I haven't found any numbers on GDDR5 latency, but as a general rule of thumb, latency (in cycles) increases as the clock speed increases.

This is also true of system RAM. For example, you can purchase DDR2-800 CL5 or DDR3-1600 CL10 (standard RAM modules stocked by any reputable computer shop as of today, March 13).

The CAS (or CL) latency is the number of cycles between a request and a response.
For the DDR2, it takes 5 cycles to start delivering the requested data, while the DDR3 takes 10 cycles.
However, the DDR2 performs 800 million transfers per second while the DDR3 performs 1600 million.
So the DDR3 takes twice as many cycles to answer as the DDR2, but runs twice as many cycles per second; in human time, both have the same latency.
(In this case the DDR3 also has twice the bandwidth and a lower price.)
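To make that arithmetic concrete, here's a quick sketch (my own helper, not anything official; it assumes DDR's two transfers per I/O clock, so DDR2-800 clocks at 400 MHz and DDR3-1600 at 800 MHz):

```python
def cas_latency_ns(transfer_rate_mts, cas_cycles):
    # CAS latency is counted in I/O clock cycles; DDR memory moves
    # two transfers per clock, so the clock runs at half the rate.
    clock_mhz = transfer_rate_mts / 2
    cycle_ns = 1000.0 / clock_mhz   # one clock period in nanoseconds
    return cas_cycles * cycle_ns

print(cas_latency_ns(800, 5))    # DDR2-800 CL5   -> 12.5 ns
print(cas_latency_ns(1600, 10))  # DDR3-1600 CL10 -> 12.5 ns (same!)
```

Twice the cycles at twice the clock cancels out exactly, which is the whole point: cycle counts alone say nothing about wall-clock latency.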

Again, without numbers it's pretty hard to compare that situation with GDDR5 vs DDR3.
Also, considering that Bobcat (and most likely the retail Jaguar CPU too) doesn't have a GDDR5 memory controller on die, and that we have no idea how much customization (investment) Sony made to the CPU, the memory controller could be located on the chipset, which would negate the advantage of low-latency RAM.


----------



## smf (Mar 13, 2013)

DiscostewSM said:


> I mean, yes, it has much higher bandwidth than DDR3, but isn't the latency of DDR3 far lower in comparison?


 
You have to look at the actual figures; latency in clock cycles doesn't tell you anything meaningful on its own. For example, compare memory with a 1 kHz clock and a latency of 1 clock cycle against memory with a 1 GHz clock and a latency of 10 clock cycles: the first answers in 1 ms, the second in 10 ns. The latency only affects opening a RAM line; further transfers from the same line can be done without that latency.

I imagine the CPU & GPU in the PS4 will be well tuned to utilize the bandwidth and to limit the amount of time they sit around waiting for a RAM line to open.


----------



## trumpet-205 (Mar 14, 2013)

You've got it all wrong. The use of GDDR5 is not what impressed people. The fact that the 8 GB of GDDR5 memory is _*unified*_ is what impressed people. Unified as in both the CPU and GPU use the same memory.

This is the complete opposite of the PS3, where the CPU and GPU each use a dedicated 256 MB memory pool, so data must be copied from the CPU's pool to the GPU's pool.

With unified memory, the GPU can access the needed data directly.
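A toy sketch of the difference (the function names and byte-buffer "pools" are made up purely for illustration): with split pools an asset has to be duplicated before the GPU can use it, while a unified pool just hands the GPU another view of the same bytes:

```python
def handoff_split(cpu_asset):
    # Split pools (PS3-style): the GPU can't see CPU memory, so the
    # asset is duplicated into GPU memory before use.
    return bytes(cpu_asset)          # full copy across pools

def handoff_unified(shared_asset):
    # Unified pool (PS4-style): CPU and GPU address the same bytes,
    # so the handoff is just another view of the buffer (zero copy).
    return memoryview(shared_asset)

texture = bytearray(16)             # stand-in for a texture asset
gpu_copy = handoff_split(texture)
gpu_view = handoff_unified(texture)

texture[0] = 0xFF                   # the CPU updates the asset...
print(gpu_view[0])                  # 255: the unified view sees it
print(gpu_copy[0])                  # 0: the split-pool copy is stale
```

Besides saving the copy itself, the unified version means both sides always see the current data, with no synchronization step to keep the two pools consistent.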


----------



## chyyran (Mar 14, 2013)

trumpet-205 said:


> You've got it all wrong. The use of GDDR5 is not what impressed people. The fact that the 8 GB of GDDR5 memory is _*unified*_ is what impressed people. Unified as in both the CPU and GPU use the same memory.
> 
> This is the complete opposite of the PS3, where the CPU and GPU each use a dedicated 256 MB memory pool, so data must be copied from the CPU's pool to the GPU's pool.
> 
> With unified memory, the GPU can access the needed data directly.


 

Personally, I didn't even think about the RAM being unified when I saw the presentation. Just that there's so much more of it than in the Wii U. :/


----------



## marcus134 (Mar 14, 2013)

trumpet-205 said:


> You've got it all wrong. The use of GDDR5 is not what impressed people. The fact that the 8 GB of GDDR5 memory is _*unified*_ is what impressed people.


 
That's odd; in the PC realm, shared memory is the bane of the low end. How could it become the holy grail of living-room gaming?

What's good is that they're using GDDR5, so the RAM bandwidth won't bottleneck as much as it would with DDR3, allowing devs to use bigger textures and anti-aliasing.

The problem with the Jaguar APU is that both the GPU and CPU components access the RAM through a single memory controller; because of that, they can only use one type of RAM.


----------



## BORTZ (Mar 14, 2013)

What we need to remember is that this isn't even the PS4's final form.


----------



## Rydian (Mar 14, 2013)

Foxi4 said:


> There's another benefit of connecting the GPU and the CPU and that's GPGPU - the CPU can pass floating point operations directly to the GPU which is far more efficient at them and the GPU can pass the processed information right back to the CPU... which is great. That, and you only have one heat source to worry about rather than two.


'Dat parallel processing.



marcus134 said:


> I haven't found any numbers on GDDR5 latency, but as a general rule of thumb, latency (in cycles) increases as the clock speed increases.


It's not the RAM's clock speed; it's that different types time themselves differently, and when a new type comes out it takes a while for production to reach the point where low latencies can be achieved consistently. It's why DDR2 had higher latencies than DDR for a while, DDR3 had higher latencies than DDR2 for a while, etc.


----------



## hisagishi (Mar 14, 2013)

GDDR5 isn't anything special. I mean, sure, it's nice, but at the end of the day it's just a modified version of DDR3.

Since we're getting DDR4 on Haswell-EX boards, I can assume high-end GPUs will start using GDDR6 within a year or two.

As far as comparing it to a PC... it's middle-of-the-road right now, but that's not what these consoles are about. A console is a dedicated platform where people can learn everything there is to know about it and use that to their advantage when making games, unlike PCs, where there are always so many different parts. (E.g. you can't make a game rely on an 8-core CPU or hyperthreading, since most people don't have the "high end"; doing so will just make you look like a sloppy console-port dev.)


----------



## wolfmanz51 (Mar 14, 2013)

Yeah, 8 GB of GDDR5 RAM is all it's cracked up to be. Many PS3 games aren't even 8 GB, so devs could load an entire game into RAM and access data from it faster than from the disc/HDD/cloud (that last one's going to be slow unless you have low-latency internet). Of course, we don't know how much of that RAM will be reserved for the feature-heavy OS, but even if it's half (unlikely), that's 4-6 GB of RAM just for games. Also, the AMD processor is highly compatible with GDDR5 chips.
What we should be criticizing is the plan to go with x86/x64 architecture for a "gaming" console.
It's great for devs porting PC games using middleware, great for Gaikai's server compatibility, and it's technically an up-to-date AMD Jaguar.
Still, this is outdated architecture, circa 1978, built up using workarounds to achieve boosts in performance (multiple cores, hyper-threading, that Jaguar tech mentioned earlier), and it's designed to maintain compatibility with Intel programming. There are better architectures out there for running video game code; just look at what has been done with ARM on Android/iOS devices and Resident Evil on 3DS. This is also why it's not BC with the PS3. It also means the Cell architecture in the PS3, which Sony went further into debt for, was a waste of their time and money, even though they could have updated Cell for the PS4.
Being realistic, it's the way these circuits are connected with the GDDR5 and the GPU that will make the difference next gen.


----------



## Foxi4 (Mar 14, 2013)

wolfmanz51 said:


> This also means the Cell architecture in the PS3, which Sony went further into debt for, was a waste of their time and money, *even though they could have updated Cell for the PS4.*


They could have shot themselves in the balls too, but why would they? Programming for the CELL was convoluted.


----------



## wolfmanz51 (Mar 14, 2013)

Foxi4 said:


> They could have shot themselves in the balls too, but why would they? Programming for the CELL was convoluted.


Yeah, that's a good point. Sony's no stranger to shooting themselves in the balls, what with a 15-year in-the-red streak, so I guess it's time they stopped. I was just trying to give an example of how they could have used a different architecture but chose this one to compete with PCs and the Nextbox. I suppose it's more cost-effective too, so my drunken point is null. I'm going to stop posting tonight and go to bed.


----------



## hisagishi (Mar 14, 2013)

wolfmanz51 said:


> Yeah, 8 GB of GDDR5 RAM is all it's cracked up to be. Many PS3 games aren't even 8 GB, so devs could load an entire game into RAM and access data from it faster than from the disc/HDD/cloud (that last one's going to be slow unless you have low-latency internet). Of course, we don't know how much of that RAM will be reserved for the feature-heavy OS, but even if it's half (unlikely), that's 4-6 GB of RAM just for games. Also, the AMD processor is highly compatible with GDDR5 chips.
> What we should be criticizing is the plan to go with x86/x64 architecture for a "gaming" console.
> It's great for devs porting PC games using middleware, great for Gaikai's server compatibility, and it's technically an up-to-date AMD Jaguar.
> Still, this is outdated architecture, circa 1978, built up using workarounds to achieve boosts in performance (multiple cores, hyper-threading, that Jaguar tech mentioned earlier), and it's designed to maintain compatibility with Intel programming. There are better architectures out there for running video game code; just look at what has been done with ARM on Android/iOS devices and Resident Evil on 3DS. This is also why it's not BC with the PS3. It also means the Cell architecture in the PS3, which Sony went further into debt for, was a waste of their time and money, even though they could have updated Cell for the PS4.
> Being realistic, it's the way these circuits are connected with the GDDR5 and the GPU that will make the difference next gen.


 
Why switch to something new for a potential performance increase of how much, exactly? 10%? 25%? Even a 50% increase in performance wouldn't really justify the huge learning curve most game devs would face to get it right. Oh, and 8 GB of RAM/VRAM fills up way faster than you might think.


----------



## Gnargle (Mar 14, 2013)

Foxi4 said:


> Sure. Most PCs these days are _equipped with hexacore CPUs with on-die integrated graphics chips connected by GPU-grade, fast shared memory, so that every asset is kept in one pool, nullifying the need for copying stuff back and forth. (Not.)_
> 
> It's not all about _"the memory used being GDDR5"_; it's about how the hardware is designed to be efficient at a low cost. If we were to compare it to a PC then yes, it's pretty _"weak"_, but so was every console ever made. The key lies in how it performs in comparison, and as we saw at the presentation, it performs perfectly fine.


A hexacore CPU is no good in and of itself. (Also, it's an octacore.) The Jaguar is designed as a mobile CPU, meaning it's mainly for use in laptops. It's only clocked at ~1.2 GHz per core, has a small L2 cache, yadda yadda yadda. Any modern PC with, say, an Intel Core i3 processor blows that shit away. Also, the GPU on-die is good, but it's a fairly low-end GPU, only outputting 1.84 TFLOPS. It's rumoured to be based on the Radeon 6670, a two-year-old mid-range card that gets rebranded every year or so.
Second: "GPU-grade RAM" is not really a good thing when it's shared between the CPU and GPU. GDDR5 is super high bandwidth, but also has REALLY high latency - fine for texture streaming and the like, tasks performed by a GPU, NOT GREAT for CPU tasks. That's why any PC with a discrete graphics card has a separate pool of RAM for the GPU: DDR3, with low latency but low bandwidth, for the CPU, and GDDR5, with high bandwidth but high latency, for GPU tasks.
Sony are all about the show. The specs of the PS4 are pretty shite compared to even a mid-range PC, but as soon as they turn up on stage and go HOLY SHIT LOOK AT ALL THE GEE DEE DEE ARR FIVE RAM IN IT, Sony fanboys suddenly think it's the most powerful thing in existence.
Also, the PS4 isn't really going to be built at that low a cost. GDDR5 is expensive as all hell, and the Jaguar is all new. Expect at least a $500 price tag on the PS4, and even that would be a loss considering marketing costs and the like.


----------



## Kouen Hasuki (Mar 14, 2013)

Wow, didn't think this was really a point of conversation, but aye, I can deffo see it making all the difference over DDR3.


----------



## hisagishi (Mar 14, 2013)

Gnargle said:


> A hexacore CPU is no good in and of itself. (Also, it's an octacore.) The Jaguar is designed as a mobile CPU, meaning it's mainly for use in laptops. It's only clocked at ~1.2 GHz per core, has a small L2 cache, yadda yadda yadda. Any modern PC with, say, an Intel Core i3 processor blows that shit away. Also, the GPU on-die is good, but it's a fairly low-end GPU, only outputting 1.84 TFLOPS. It's rumoured to be based on the Radeon 6670, a two-year-old mid-range card that gets rebranded every year or so.
> Second: "GPU-grade RAM" is not really a good thing when it's shared between the CPU and GPU. GDDR5 is super high bandwidth, but also has REALLY high latency - fine for texture streaming and the like, tasks performed by a GPU, NOT GREAT for CPU tasks. That's why any PC with a discrete graphics card has a separate pool of RAM for the GPU: DDR3, with low latency but low bandwidth, for the CPU, and GDDR5, with high bandwidth but high latency, for GPU tasks.
> Sony are all about the show. The specs of the PS4 are pretty shite compared to even a mid-range PC, but as soon as they turn up on stage and go HOLY SHIT LOOK AT ALL THE GEE DEE DEE ARR FIVE RAM IN IT, Sony fanboys suddenly think it's the most powerful thing in existence.
> Also, the PS4 isn't really going to be built at that low a cost. GDDR5 is expensive as all hell, and the Jaguar is all new. Expect at least a $500 price tag on the PS4, and even that would be a loss considering marketing costs and the like.


 
A 6670? Even a 7850 is "only" at 1.76 teraflops, and 1.2 GHz per core sounds painful.


----------



## Rydian (Mar 14, 2013)

hisagishi said:


> Oh and 8gb of ram/vram fills up way faster than you might think.


...Under what circumstances would that apply to a console?


----------



## Kouen Hasuki (Mar 14, 2013)

Rydian said:


> ...Under what circumstances would that apply to a console?


 
That's what I was thinking, since it doesn't have a whole OS, plus extras like Skype and antivirus etc. etc., eating at it.

Especially when you consider that the current-gen consoles have done fine on a tiny amount of RAM in comparison to PCs, lol.


----------



## hisagishi (Mar 14, 2013)

Fair enough. I was just thinking that, with having to load levels into RAM and share that space with the GPU's data, it might get a little cramped. Though I suppose I'm more used to using loads of VRAM for modded games like Skyrim, and most games don't even use a full 4 GB (I think PS2 uses 8 GB sometimes, but that's an MMOFPS).

Bad logic on my part.


----------



## trumpet-205 (Mar 14, 2013)

wolfmanz51 said:


> This also means Cell Architecture in the PS3 that Sony went further into debt for was a waste of their time, and money, even though they could have updated Cell for PS4.


 
Sony can't update the Cell. Cell development was stopped by IBM (IBM only agreed to continue producing the existing Cell CPU and providing support). The Cell really is a failure for IBM, because its only customer is Sony.


----------



## wolfmanz51 (Mar 14, 2013)

trumpet-205 said:


> Sony can't update the Cell. Cell development was stopped by IBM (IBM only agreed to continue producing the existing Cell CPU and providing support). The Cell really is a failure for IBM, because its only customer is Sony.


Well, if I weren't plastered I wouldn't have made the Cell argument, as I now remember reading what you just said, lol. Yet Sony paid out of pocket to help develop the Cell and sold consoles at a loss; IBM didn't lose any money selling Cell chips. In fact, they sold a shitload of them to Sony and the military, but now the military is using something different, I think it's Power architecture.


----------



## Deleted_171835 (Mar 14, 2013)

It's a welcome surprise considering most were expecting 4 GB, but it's not the Jesus-incarnate, going-to-cure-cancer-and-bring-on-world-peace type of awesome that some fanboys have been hyping it to be, as explained already.


----------

