A friend of mine recently bought himself a new computer (which I personally think is a bit of a ripoff compared to even my parents' laptop) with the following specs.

[quote="Advent DT2204 Desktop"]
Processor: Pentium® Dual Core G630 (2.70 GHz, 3MB cache)
Operating system: Windows 7 Home Premium 64-bit
RAM: 6GB (4GB + 2GB) DDR3, 1333 MHz, maximum expandable memory 16GB
Hard drive: 1 TB Seagate (variable)
Optical disk drive: multiple CD/DVD player 16x DVD, 48x CDR, 24x RW / recorder 24x DVD -/+R, 8x DL -/+R, 8x +RW, 6x -RW, 12x -RAM
USB: 9 ports (3 front, 6 rear)
Modem/Ethernet: 10/100/1000 Gigabit Ethernet
Audio interface: 3 x 3.5mm
Expansion card slots: 1 x PCIe x16, 1 x PCIe x1
[/quote]

That cost £320 without even an i3 processor. Then again, I'm not the best judge of what's a decent buy; maybe this is a reasonable deal for a prebuilt computer?

Anyway, it has an Intel Graphics 1000 (GT1) chip in it, which CPU-Z says has an 850 MHz core and 874MB of RAM available to it. I know Intel has stepped up its game in terms of graphics a lot these days, but I was wondering whether, for any kind of gaming, he might be better off with even my old DX9 card (a GeForce 7900 GS, which I believe has 256MB of VRAM and a 4xxMHz core). The main problem with that card is that he wouldn't get any DX10/11 features and no hardware acceleration. I remember that when I bought the card, DX9 was still widely supported despite the switchover to DX10 being underway (I was running XP anyway). No idea if that's still the case, or if many devs are dropping support for DX9 cards.

After doing a bit of research I think he might be better off with the integrated graphics, but I'm curious what you guys think. Tbh I'd forgotten just how old the card was.