Upcoming Intel GPU Might Not Suck

Discussion in 'Computer Hardware, Devices and Accessories' started by Rydian, Apr 27, 2013.

  1. Rydian

    Rydian Resident Furvert™

    Feb 4, 2010
    United States
    Cave Entrance, Watching Cyan Write Letters
    So anybody who's into gaming and PC parts knows that while Intel has held the performance crown for CPUs for years now, their integrated GPUs have always been the bottom of the barrel, to the point that the OpenGL FAQ specifically notes how terrible Intel is at supporting it. We've seen some contrast from Intel, who did research projects on concepts like real-time ray tracing, but so far better hardware hasn't actually surfaced.

    Well, with AMD proving that integrated GPUs can actually compete with low-cost dedicated cards if you're willing to cut into your bottom line a little, it looks like Intel might actually start offering GPUs that don't suck, using such astounding technology as "actually have your own fucking RAM for the GPU instead of using a portion of the system RAM". That's 22nd-century thinking right there, I tell 'ya.

    On a more serious note, Intel's primary concerns were heat and cost, which is why these aren't being packaged for ultra-mobile devices. They look like they're aiming for the same power as a GT 650M, and while that's a mid-range mobile GPU, it's still multiple levels above the current Intel integrated offerings. If they can get that kind of performance into average consumer products, it might actually mean the minimum GPU performance in new computers rising yet again...

    Except the plans right now don't call for any separate CPUs with it; the only products will be built-into-motherboard packages for OEMs and the like.

    Still, it's a step.
    Sagat and raulpica like this.
  2. raulpica

    raulpica With your drill, thrust to the sky!

    Oct 23, 2007
    PowerLevel: 9001
    I've read around that the DRAM might be used as additional cache when a discrete GPU is in use. That sounds interesting.
  3. Gahars

    Gahars Bakayaro Banzai

    Aug 5, 2011
    United States
    New Jersey
    A wintel for Intel? Or just Intel-igence?
    kehkou likes this.
  4. kehkou

    kehkou does what Nintendon't

    Dec 19, 2009
    United States
    The Duke City
    Not holding my breath for cutting edge from Intel, but this at least shows some promise.
  5. Rydian

    Rydian Resident Furvert™

    Feb 4, 2010
    United States
    Cave Entrance, Watching Cyan Write Letters
    If so, it wouldn't be low-latency stuff. Sort of like when AMD came out with L3 cache: people, used to L1/L2 cache being such boons, were disappointed that L3's latencies meant it couldn't be used the same way.
  6. FAST6191

    FAST6191 Techromancer

    Reporter
    Nov 21, 2005
    United Kingdom
    Alas, I have heard this sort of thing several times before. Never quite "this is the year of the Linux desktop" levels of "yeah mate, good luck with that", but it is still a case of "I will believe it when I see it".
    Rydian, RodrigoDavy and Foxi4 like this.
  7. The Milkman

    The Milkman GBATemp's Official Asshat Milkman

    Jan 12, 2011
    United States
    Throwing milk at the bitches!
    Wow, a 650? That's better than my GPU :O!

    It's nice to see that Intel is finally getting into GPUs properly; now all these x86 devices that run on Intel chips won't just be for browsing the web and launching Word.
  8. trumpet-205

    trumpet-205 Embrace the darkness within

    Jan 14, 2009
    United States
    It is good to see Intel delivering better iGPUs for mobile and BGA solutions, but Intel needs to do a better job on LGA parts. Currently people don't care about the iGPU on LGA parts, because they have a dedicated GPU instead. What Intel has, and fails to exploit more seriously, is Quick Sync. Quick Sync is the only MPEG-2/H.264 hardware encoder that doesn't produce crap quality, and it's currently usable by only a small number of programs.
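    One of the few free tools that does expose Quick Sync is FFmpeg, through its QSV encoder. Here's a minimal sketch of what an invocation looks like; the file names are placeholders, and actually running it assumes an FFmpeg build compiled with Quick Sync support plus compatible Intel hardware, so the sketch just assembles and prints the command rather than executing it:

    ```shell
    # Hypothetical FFmpeg invocation using Intel's Quick Sync (QSV) H.264 encoder.
    # "input.mp4" and "output.mp4" are placeholder file names; a real run needs
    # an FFmpeg build with QSV support and an Intel iGPU that exposes Quick Sync,
    # so we only build and print the command here instead of running it.
    CMD="ffmpeg -hwaccel qsv -i input.mp4 -c:v h264_qsv -b:v 4M output.mp4"
    echo "$CMD"
    ```

    The `-hwaccel qsv` flag asks FFmpeg to use Quick Sync for decoding as well, and `-c:v h264_qsv` selects the hardware H.264 encoder instead of the software x264 path.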
    It is good to see Intel is delivering better iGPU for mobile and BGA solutions, but Intel needs to do a better job on LGA parts. Currently people don't care about iGPU if you are using LGA parts because people have dedicated GPU instead. What Intel has and fails to exploit it more seriously is Quick Sync. Quick Sync is the only MPEG2/H264 hardware encoder that doesn't produce crap quality, and currently only usable by small number of softwares.