So there's been a huge push for cloud gaming recently, especially with services like OnLive, but it hasn't been all that easy. One of the biggest problems right now is that the serious costs behind it make the whole idea look pretty impractical at the moment.
Nvidia, according to an announcement they just made today, may have found a way over that hurdle.
VentureBeat: Nvidia announced today that it has created unique features in its Kepler-based graphics chips that could make cloud-based gaming much more practical. The company has also formed partnerships with companies such as Gaikai to make cloud games much cheaper and more appealing to gamers and maybe even eliminate the need to create a new generation of consoles.
That latter possibility may be a remote one for now, but the fact that it is possible suggests the potential disruption that could occur if Nvidia executes its vision for cloud gaming: Multi-brained graphics chips in data centers will be able to handle computing tasks for home players.
...In an interview with GamesBeat, Nvidia senior vice president Dan Vivoli said Kepler-based graphics chips will be able to handle four times as many server-based games at the same time while using half the power and running at half the cost.
“The beauty of this approach is that GPUs will continue to improve all of the time,” said Vivoli. “Your hardware stays the same, but the data center hardware can be upgraded to handle better games.”
Game streaming companies such as OnLive, Gaikai, and Otoy have introduced cloud infrastructures in the past few years that enable users to play high-end games on low-end hardware. The serious computing is handled in the data center, and video is streamed to the user’s computer, where only a display is needed to view the game. Servers in the data centers can use high-end graphics cards to handle high-end games, but it usually takes one expensive graphics card to process one game being played by one user.
But with improved multitasking on Nvidia’s Kepler-based graphics chips, each data center server can handle multiple users at a lower cost. That makes cloud gaming more economical, Vivoli said.
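To see why that multitasking claim matters economically, here's a rough back-of-envelope sketch. The only numbers taken from the article are the ratios (four times as many concurrent sessions, half the cost); the dollar figure is a made-up placeholder, not a real GPU price:

```python
# Back-of-envelope per-session cost comparison, using the ratios cited
# in the article: 4x concurrent game sessions at half the hardware cost.
# The $1000 figure is an illustrative placeholder, not an actual price.

old_card_cost = 1000.0   # hypothetical cost of one previous-gen server GPU
old_sessions = 1         # "one expensive graphics card ... one user"

new_card_cost = old_card_cost * 0.5   # "running at half the cost"
new_sessions = old_sessions * 4       # "four times as many server-based games"

old_cost_per_session = old_card_cost / old_sessions   # 1000.0
new_cost_per_session = new_card_cost / new_sessions   # 125.0

print(old_cost_per_session / new_cost_per_session)    # prints 8.0
```

In other words, if both ratios hold, each streamed session would cost roughly an eighth as much in graphics hardware, which is the kind of shift that could change the economics OnLive and Gaikai have been fighting.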
The article goes into greater detail on the whole thing; if you're interested, it's definitely worth a read.
So, what do you think? Is cloud gaming the way of the future, or is it doomed to come crashing down like the video game equivalent of Chicken Little?