Is the PS4's GDDR5 really all that it is cracked up to be?

Discussion in 'General Gaming Discussion' started by DiscostewSM, Mar 13, 2013.

  1. DiscostewSM
    OP

    Member DiscostewSM GBAtemp Psycho!

    Joined:
    Feb 10, 2009
    Messages:
    4,800
    Location:
    Sacramento, California
    Country:
    United States
    I've gone over many other forums where people are quite ecstatic about the PS4 using GDDR5 as its main RAM bank, but I can't help but feel disturbed by it. I mean, yes, it has much higher bandwidth than DDR3, but isn't DDR3's latency far lower in comparison? GPUs are able to mask GDDR5's high latency because they are designed for parallel processing across their hundreds upon hundreds of cores (which is why it is called "Graphics"-DDR), but the CPU (in the case of the PS4) only has 8 cores, and it has to deal with the non-graphical components, which pretty much make up the rest of a game. It is far more linear in design than a GPU, so wouldn't that create a possible bottleneck? People on other forums dismiss this possible bottleneck, saying that GDDR5 will work just as well with a CPU as DDR3 does, but no one has actually shown any proof of that; they only assert it.

    Can anyone here clear this argument up, please?
     


  2. Ergo

    Member Ergo GBAtemp Advanced Fan

    Joined:
    Oct 29, 2008
    Messages:
    614
    Country:
    United States
    I can clarify why this is getting talked up so much--it's the only bright spot in the PS4 hardware. Everything else is nothing to write home about, especially if you're a PC gamer.

    So I guess I'd just say that it doesn't matter what the issues with the GDDR5 are; it's the PS4's Cell chip, or its Emotion Engine, i.e. a talking point to rally the 'Sony builds another supercomputer' crowd, which is completely impervious to logic and reason.

    Note: I am not going to become embroiled in a flame war over this but, please, knock yourself out if you must!
     
  3. Guild McCommunist

    Member Guild McCommunist (not on boat)

    Joined:
    May 6, 2009
    Messages:
    18,151
    Location:
    The Danger Zone
    Country:
    United States
    A thread about the power of the PS4?
    [IMG]
     
  4. Fishaman P

    Member Fishaman P Speedrunner

    Joined:
    Jan 2, 2010
    Messages:
    3,176
    Location:
    Wisconsin
    Country:
    United States
    From what I know, GDDR5 itself doesn't have high latency; the issue on PCs is the latency between the GPU and the CPU. Since the PS4 isn't modular, they can be wired right to each other.
     
  5. Foxi4

    Reporter Foxi4 On the hunt...

    pip
    Joined:
    Sep 13, 2009
    Messages:
    22,736
    Location:
    Gaming Grotto
    Country:
    Poland
    Sure. Most PCs these days are equipped with hexacore CPUs with on-die integrated graphics chips connected by GPU-grade, fast shared memory, so that every asset is kept in one pool, nullifying the need for copying stuff back and forth. (Not.)

    It's not all about "the memory used being GDDR5"; it's about how the hardware is designed to be efficient at a low cost. If we were to compare it to a PC then yes - pretty "weak" - but so was every console ever made. The key lies in how it performs in comparison, and as we've seen in the presentation, it performs perfectly fine.
     
    kashin and Rydian like this.
  6. Maxternal

    Member Maxternal Peanut Gallery Spokesman

    Joined:
    Nov 15, 2011
    Messages:
    5,210
    Location:
    Deep in GBAtemp addiction
    Country:
    Costa Rica
    Also, if you're saying that it's a good thing to connect to the GPU but not needed for the CPU, remember that here BOTH are supposedly using the same memory, so it IS what's connected to the GPU.
     
  7. Foxi4

    Reporter Foxi4 On the hunt...

    pip
    Joined:
    Sep 13, 2009
    Messages:
    22,736
    Location:
    Gaming Grotto
    Country:
    Poland
    There's another benefit of connecting the GPU and the CPU and that's GPGPU - the CPU can pass floating point operations directly to the GPU which is far more efficient at them and the GPU can pass the processed information right back to the CPU... which is great. That, and you only have one heat source to worry about rather than two.
     
    kashin and Maxternal like this.
  8. Gahars

    Member Gahars Bakayaro Banzai

    Joined:
    Aug 5, 2011
    Messages:
    10,254
    Location:
    New Jersey
    Country:
    United States
    My sources from Sony have told me, and I quote, "It is all that and a bag of potato chips."
     
    Tom Bombadildo likes this.
  9. Maxternal

    Member Maxternal Peanut Gallery Spokesman

    Joined:
    Nov 15, 2011
    Messages:
    5,210
    Location:
    Deep in GBAtemp addiction
    Country:
    Costa Rica
    Good point. Hadn't thought of that. Kinda reminds me of back in the pre-Pentium days, when you could add a separate floating point coprocessor if you wanted a really powerful number-crunching machine... only better.
     
  10. marcus134

    Member marcus134 GBAtemp Advanced Fan

    Joined:
    May 7, 2011
    Messages:
    584
    Location:
    Québec
    Country:
    Canada
    I haven't found any numbers on GDDR5 latency, but as a general rule of thumb, latency (in cycles) increases as the clock rate increases.

    This is also true of system RAM. For example, you can buy DDR2-800 CL5 or DDR3-1600 CL10 (standard RAM modules stocked by any reputable computer shop as of today, March 13).

    The CAS (or CL) latency is the number of cycles between a request and an answer.
    For the DDR2, it takes 5 cycles to start sending the requested data, while for the DDR3 it takes 10 cycles.
    However, the DDR2 completes 800 million cycles per second while the DDR3 completes 1600 million.
    So the DDR3, compared to the DDR2, takes twice as many cycles to answer but completes twice as many cycles per second; in real time, both have the same latency.
    (In this case, the DDR3 also has twice the bandwidth and a lower price.)
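    To put rough numbers on the worked example above (a sketch using the post's simplified "cycle rate = transfer rate" figures, not real module datasheet timings; actual DRAM clocks the I/O bus at half the transfer rate, but the ratio works out the same either way):

    ```python
    # Absolute latency = cycles-to-answer / cycles-per-unit-time.
    # Using the simplified figures from the post: DDR2-800 CL5 vs DDR3-1600 CL10.

    def cas_latency_ns(cas_cycles, mega_cycles_per_s):
        """CAS latency in nanoseconds, given CL and the clock rate in MHz."""
        return cas_cycles / mega_cycles_per_s * 1000.0

    ddr2 = cas_latency_ns(5, 800)     # 6.25 ns
    ddr3 = cas_latency_ns(10, 1600)   # 6.25 ns -- identical in real time

    print(ddr2 == ddr3)  # True: doubling both CL and clock cancels out
    ```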

    Again, without numbers, it's pretty hard to compare this with the GDDR5 vs DDR3 situation.
    Also, considering that Bobcat (and most likely the retail Jaguar CPU too) doesn't have a GDDR5 memory controller on-die, and that we have no idea how much customization (investment) Sony made to the CPU, the memory controller could be located on the chipset, which would negate the advantage of low-latency RAM.
     
  11. smf

    Member smf GBAtemp Advanced Fan

    Joined:
    Feb 23, 2009
    Messages:
    839
    Country:
    United Kingdom
    You have to look at the actual figures; latency on its own doesn't tell you anything meaningful. For example, compare memory with a 1 kHz clock and a latency of 1 clock cycle against memory with a 1 GHz clock and a latency of 10 clock cycles. The latency only affects opening a RAM row; any further transfer from the same row can be done without additional latency.

    I imagine that the CPU & GPU in the PS4 will be tuned well to utilize the bandwidth and to limit the amount of time they sit around waiting for a row to open.
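    A toy model of that trade-off (the figures below are illustrative placeholders, not real DDR3/GDDR5 timings): the open-row penalty is paid once, then the rest of the transfer streams at full bandwidth, so raw bandwidth dominates on any bulk read.

    ```python
    # Time to fetch a block: one row-open penalty, then streaming at full
    # bandwidth. Note 1 GB/s is 1 byte per nanosecond, which keeps units simple.

    def transfer_us(bytes_requested, open_latency_ns, bandwidth_gb_s):
        """Total fetch time in microseconds for one contiguous read."""
        stream_ns = bytes_requested / bandwidth_gb_s  # GB/s == bytes/ns
        return (open_latency_ns + stream_ns) / 1000.0

    # 64 KB read: hypothetical high-latency/wide RAM vs low-latency/narrow RAM
    wide = transfer_us(64 * 1024, open_latency_ns=50, bandwidth_gb_s=176)
    narrow = transfer_us(64 * 1024, open_latency_ns=10, bandwidth_gb_s=25)

    print(wide < narrow)  # True: bandwidth wins once the burst is large
    ```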
     
  12. trumpet-205

    Member trumpet-205 Embrace the darkness within

    Joined:
    Jan 14, 2009
    Messages:
    4,363
    Country:
    United States
    You've got it all wrong. The use of GDDR5 is not what impressed people. The fact that the 8 GB of GDDR5 memory is unified is what impressed people - unified as in both the CPU and GPU use the same memory.

    This is the complete opposite of the PS3, where the CPU and GPU each use a dedicated 256 MB pool, so data must pass from the CPU's memory pool to the GPU's memory pool.

    With unified memory, the GPU can access the needed data directly.
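    A back-of-the-envelope illustration of why that copy step matters (the bus speed below is a made-up placeholder, not a real PS3 figure):

    ```python
    # With split pools, every asset the GPU needs is first shuttled across a
    # bus; with one unified pool, that copy simply never happens.

    def copy_cost_ms(megabytes, bus_gb_s):
        """Milliseconds to move data between separate CPU/GPU memory pools."""
        return megabytes / 1024.0 / bus_gb_s * 1000.0

    split_pool = copy_cost_ms(128, bus_gb_s=8)  # 128 MB of assets, 8 GB/s bus
    unified_pool = 0.0                          # same pool: no copy step

    print(split_pool)  # 15.625 ms -- nearly a whole frame at 60 fps
    ```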
     
  13. Ron

    Member Ron somehow a weeb now.

    Joined:
    Dec 10, 2009
    Messages:
    2,837
    Location:
    here
    Country:
    Canada

    Personally, I didn't even think about the RAM being unified when I saw the presentation - just that there's so much more of it than the Wii U has. :/
     
  14. marcus134

    Member marcus134 GBAtemp Advanced Fan

    Joined:
    May 7, 2011
    Messages:
    584
    Location:
    Québec
    Country:
    Canada
    That's odd - in the PC realm, shared memory is the bane of the low end, so how could it become the holy grail of living room gaming?

    What's good is that they're using GDDR5, so the RAM bandwidth won't bottleneck as much as it would with DDR3, allowing devs to use bigger textures and more anti-aliasing.

    The problem with the Jaguar APU is that both the GPU and CPU components access the RAM through a single memory controller; because of that, they can only use one type of RAM.
     
  15. BORTZ

    Global Moderator BORTZ wtf, nintendo

    Joined:
    Dec 2, 2007
    Messages:
    10,652
    Country:
    United States
    What we need to remember is that this isn't even the PS4's final form.
     
    Rydian likes this.
  16. Rydian

    Member Rydian Resident Furvert™

    Joined:
    Feb 4, 2010
    Messages:
    27,883
    Location:
    Cave Entrance, Watching Cyan Write Letters
    Country:
    United States
    'Dat parallel processing.

    It's not the RAM's clock speed; it's that different types time themselves differently, and when a new type comes out it takes a while for production to reach the point where low latencies can be achieved consistently. It's why DDR2 had higher latency than DDR for a while, DDR3 had higher latencies than DDR2 for a while, etc.
     
  17. hisagishi

    Member hisagishi GBAtemp Regular

    Joined:
    Feb 22, 2013
    Messages:
    276
    Country:
    United States
    GDDR5 isn't anything special. I mean, sure, it's nice, but at the end of the day it's just a modified version of DDR3.

    Since we're getting DDR4 on Haswell-EX boards, I assume high-end GPUs will start using GDDR6 within a year or two.

    As far as comparing it to a PC... it's middle-of-the-road right now, but that's not what these consoles are about. It's a dedicated platform where people can learn everything there is to know about it and use that to their advantage when making games - unlike with PCs, where there are always so many different parts. (I.e., you can't make a game rely on an 8-core CPU or hyperthreading, since most people don't have the "high end"; doing so would just make you look like a sloppy console-port dev.)
     
    Rydian likes this.
  18. wolfmanz51

    Member wolfmanz51 MrNintendosense

    Joined:
    Nov 24, 2008
    Messages:
    427
    Location:
    Somewhere in cali
    Country:
    United States
    Yeah, 8 GB of GDDR5 RAM is all it's cracked up to be. Many PS3 games aren't even 8 GB, so devs could load an entire game into RAM and access data from it faster than from the disc/HDD/cloud (that last one's going to be slow unless you have low-latency internet). Of course, we don't know how much of that RAM will be reserved for the feature-heavy OS, but even if it's half (unlikely), that's 4-6 GB of RAM just for games. Also, the AMD processor is highly compatible with GDDR5 chips.
    What we should be criticizing is the plan to go with x86/x64 architecture for a "gaming" console.
    It's great for devs porting PC games using middleware, great for Gaikai's server compatibility, and it's technically an up-to-date AMD Jaguar version.
    Still, this is outdated architecture, circa 1978, built up using workarounds to achieve boosts in performance (multiple cores, hyper-threading, that Jaguar tech mentioned earlier), and it's designed to maintain compatibility with Intel programming. There are better architectures out there for running video game code - just look at what has been done with ARM on Android/iOS devices, or Resident Evil on the 3DS. This is also why it's not backwards compatible with the PS3. It also means the Cell architecture in the PS3, which Sony went further into debt for, was a waste of their time and money, even though they could have updated Cell for the PS4.
    Realistically, it's the way these circuits are connected with the GDDR5 and the GPU that will make the difference next gen.
     
  19. Foxi4

    Reporter Foxi4 On the hunt...

    pip
    Joined:
    Sep 13, 2009
    Messages:
    22,736
    Location:
    Gaming Grotto
    Country:
    Poland
    They could have shot themselves in the balls too, but why would they? Programming for the CELL was convoluted.
     
    Rydian and wolfmanz51 like this.
  20. wolfmanz51

    Member wolfmanz51 MrNintendosense

    Joined:
    Nov 24, 2008
    Messages:
    427
    Location:
    Somewhere in cali
    Country:
    United States
    Yeah, that's a good point. Sony's no stranger to shooting themselves in the balls, with a 15-year streak in the red, so I guess it's time they stopped. I was just trying to give an example of how they could have used a different architecture, but they chose this to compete with PCs and the NextBox. I suppose it's more cost-effective too, so my drunken point is null. I'm going to stop posting tonight and go to bed.
     
