# HD vs Full HD vs 4K. FPS vs Res. What are the benefits?



## Amadren (Jul 21, 2015)

The title is pretty self explanatory. What are the differences between a game at:

60fps -> 1366x768
60fps -> 1920x1080
30fps -> 1366x768
30fps -> 1920x1080

I always prioritize fps over res because I really see no big difference between games at different resolutions. What about you, what's your opinion? And also, is 4K really useful?


----------



## VinsCool (Jul 21, 2015)

Since I'm not a graphics whore, and my PC runs games rather poorly, I always go for performance over resolution.


----------



## sarkwalvein (Jul 21, 2015)

There was a thread very similar to this one not so long ago.
The difference is normally very easy to see, unless you have a bad display.
4K resolution would look better, your eyes should be able to notice the difference if your display is good and big enough.
That said for most types of games fps/performance is way more important than resolution in order to get a good gameplay experience.


----------



## RevPokemon (Jul 21, 2015)

I mainly care about the FPS regardless of the res. Generally I can't tell the difference that well between 720p (provided the graphics look good) and 4K unless I look up close, but of course it depends on what I'm playing/watching, and the fact that I play games 8+ feet away from my TV affects my opinion too.


----------



## Amadren (Jul 21, 2015)

Honestly I don't see that much difference between HD and Full HD (never tried 4K). Anyway, I'm far-sighted ("hypermétrope" in French) and notice a lot of detail, so I can easily see the difference between 30 and 60 fps (60 fps looks smoother, while for me there's no difference between 1 and 30 fps xD)


----------



## Hungry Friend (Jul 25, 2015)

720p (and lower if it's an older game or has good AA) is fine, and much less time and money should be spent on making games pretty because it's FUCKING EXPENSIVE. Good art direction is where it's at; as long as the graphics are good enough (which is subjective), I couldn't care less about resolution and such, provided a game performs well. 60+fps is preferable in all cases of course, but 30 is often sufficient. For example, MGS3 Subsistence (PS2; the HD version is 60) plays just fine despite having fairly frequent hiccups. It's steady enough for the game to retain fairly precise controls, and as long as it's fun to play, the superficial stuff is inconsequential. Games need soul, not flash.

Raw horsepower is overrated anyway; look at games like Mario 3, World, Yoshi's Island, Super Metroid, Final Fantasy VI, Chrono Trigger, Chrono Cross, and Shadow of the Colossus, among many others on older systems, that STILL look great. Good art direction makes raw horsepower nearly irrelevant.


----------



## GamerzHell9137 (Jul 25, 2015)

More FPS = smoother gameplay
More resolution = crisper picture

If the game has fast gameplay then I prefer 60 FPS; if it's a slow-paced game then I prefer higher resolution.


----------



## JazzCat.CL (Jul 25, 2015)

Sometimes I actually prefer the resolution, because you can "look" at things in a game better.
Obviously, graphics are a good thing in terms of having detailed items, lighting and shadows.


----------



## DinohScene (Jul 25, 2015)

Performance, day in day out.
Fuck resolution, fuck graphics.
Game has to run good, that's all I care about.


----------



## mightymuffy (Jul 25, 2015)

4K is for the most part a great white elephant: only if you're rich will you benefit from a 4K TV, as the screen would have to be at least 66" if viewing from the usual distance....
Even on a monitor with yer face right up against it, 1440p is often considered the sweet spot, with the difference between that and 4K being rather minimal.

I have a 46" TV (yeah, it's a few years old now) and have a PS4, XO and a PC on it.... Tried Witcher 3 on both PS4 and XO, sitting about 7-8 feet away. PS4 = 1080p, XO = 900p: no difference when playing. Tried Battlefield 4 on them both however, 900p on PS4, 720p on XO, same distance away, and there was a difference (not HUGE, but apparent).
My point? I forgot that halfway through typing. But in the case of Witcher 3, the XO had the better framerate, so it was, in my opinion, from my viewing distance, the better version....


----------



## FAST6191 (Jul 25, 2015)

I wonder if one day we will get proper motion blur in games and can end this 60fps farce.

4k in games and moving pictures... yeah, I can leave it. But for 4k when working in CAD, spreadsheets, video editing and suchlike, I want more. To that end, if the public at large wants to help drive the price down a bit then I am OK with that.


----------



## Taleweaver (Jul 25, 2015)

It sort of depends on the game as well. A higher resolution means it'll show more on the screen, and in games like RTSes that isn't a minor difference. Obviously, if it's a turn-based RTS, resolution becomes even more important than the FPS.

In a first-person shooter, frames per second is really all that's important. Or in short: the FPS is all that matters in an FPS.


My monitor's a native 1080p, but I really can't tell the difference with this and 720 unless I do a side-by-side comparison and really look for it (I'm not one to play the highest of the highest games...if I do, the resolution is the first thing that goes down in an FPS or TPS). As such, I won't be upgrading to 4k soon (I game on a monitor...not 5 meters away from it).


----------



## Chary (Jul 25, 2015)

FPS is the most important thing, for sure. Resolution is nice, but what's the point of a pretty-looking game if it plays choppy?


----------



## Kioku_Dreams (Jul 25, 2015)

60FPS. I used to not care.. Until I built a high end machine. I was spoiled. I've also found that going on a lower res is noticeable. Especially if it's not the native resolution of your monitor/TV. I haven't experienced 4K yet, nor 1440P. So, I can't say much there.


----------



## HaloEffect17 (Jul 25, 2015)

I agree.  The game has to be playable.  And if the FPS is hindering the experience of the game, that's a problem.


----------



## Catastrophic (Jul 25, 2015)

1920x1080 at 60fps is currently the most kickass setup for what it's worth, as common displays use that resolution. 30fps is decent but isn't desirable for fast-paced, reaction-based games. 4K doesn't have enough support and requires far too expensive hardware for there to be a reason to invest in it for gaming.


----------



## G0R3Z (Jul 25, 2015)

It's not about how useful they are. 

If a game is terrible, it doesn't matter what resolution it's at or how fast it plays. I'd rather play a game at a lower resolution (or settings) as long as it runs smoothly. The gameplay experience is more important than graphics. Graphics whores are not real gamers, simple as. If you can't appreciate the game without it being photorealistic, then you have no business playing it.

As a PC gamer, playing at the native resolution is the most important thing. I have a 1080p monitor, overclocked to run at 75Hz. I'd rather set some settings lower to run at that resolution with 60fps or more. When that's fine, turning up the options is next. As someone with a pretty good machine, I use what's called supersampling to run some games at 2560x1440 on my monitor: the game renders at the full resolution and is scaled down to my 1080p screen, making it look very crisp.
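To illustrate the idea, here's a minimal conceptual sketch of supersampling: render at a higher resolution, then average each block of pixels down to one display pixel. The grayscale "frame" below is made up purely for illustration, not a real render:

```python
# Minimal sketch of supersampling (SSAA): render at 2x the target
# resolution, then average each 2x2 block down to one output pixel.
def downsample_2x(frame):
    """Average each 2x2 block of a high-res frame into one pixel."""
    h, w = len(frame), len(frame[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (frame[y][x] + frame[y][x + 1]
                     + frame[y + 1][x] + frame[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black/white edge at 2x res becomes a partially shaded pixel
# at 1x, which is exactly the anti-aliasing effect you see on screen.
hi_res = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
print(downsample_2x(hi_res))  # [[0.0, 255.0], [127.5, 255.0]]
```

The 127.5 value is the "smoothed" edge pixel: the crispness people describe comes from edges being resolved at the higher resolution before being averaged down.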

The experience is the most important. Resolution should be at your display's native, and 60fps is pretty awesome for gaming. You really notice the jump to fps if you've come from a 30fps console. My friends are flabbergasted when they see the difference.


----------



## Amadren (Jul 25, 2015)

Ok. So is a 3x monitor setup a good alternative? I mean, it -should- be smoother and there should be more detail than on a Full HD screen. For example, if I want to upgrade my gaming rig, should I go with 4K at 60fps or 3x 1080p at 60fps?


----------



## Selim873 (Jul 25, 2015)

This is a PC focused reply:
I'm honestly not sure in my opinion, as I love a high framerate and resolution.  I haven't really found my sweet spot yet.  I currently have two 1080p 144hz monitors in my setup, but I kind of want to sell them in exchange for a single 1440p 144hz display.  The one I want (The cheapest, actually) also has support for FreeSync.  I don't like my SLI setup for temperature reasons, plus my GTX 760's won't support full DX12 and I'd like to have that option if a developer decides to use it, and the R9 Nano (Or maybe the Fury X) looks really nice.

All in all, I think my main focus is actually framerate, since I use 144Hz displays that can output at up to 144fps rather than the standard 60. If you're after frame rate, I would definitely get a 144Hz display and use DisplayPort no matter what. Most games go way above 60 if your hardware's good enough, and that's easy to nail on newer cheap hardware; since developers are now trying to focus on 4K, 1080p or 720p performs SO much better than it used to. It's a little tough to hit 144, though, so you'll almost never have to use VSync.
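For a sense of why 144 is "a little tough to hit": the frame-time budget shrinks fast as the target framerate climbs. A quick back-of-the-envelope:

```python
# Frame-time budget: to sustain a given fps, every frame must be
# finished within 1000 / fps milliseconds.
def frame_budget_ms(fps):
    return 1000 / fps

for fps in (30, 60, 144):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# prints roughly 33.3, 16.7 and 6.9 ms respectively
```

Going from 60 to 144 more than halves the time the GPU and CPU have to produce each frame, which is why a rig that comfortably holds 60 can still fall well short of 144.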


----------



## G0R3Z (Jul 25, 2015)

Amadren said:


> Ok. So is a 3x monitor setup a good alternative? I mean, it -should- be smoother and there should be more detail than on a Full HD screen. For example, if I want to upgrade my gaming rig, should I go with 4K at 60fps or 3x 1080p at 60fps?



It isn't about what looks better, it's about the experience. For example, are you going to be playing many games that are going to take advantage of those three screens? It's down to your own preferences and how you prefer gaming. If you prefer a higher resolution, then 4K. If you prefer a wider field of view, then multi-monitor. Technically, three 1080p screens would be slightly easier to run.

I myself would opt for the three monitors as I've always wanted to play games like Skyrim and Fallout across those extra screens. If you play driving games or simulators like Elite Dangerous, they're apparently amazing experiences in multi-screen.


----------



## Amadren (Jul 25, 2015)

G0R3Z said:


> It isn't about what looks better, it's about the experience. For example, are you going to be playing many games that are going to take advantage of those three screens? It's down to your own preferences and how you prefer gaming. If you prefer a higher resolution, then 4K. If you prefer a wider field of view, then multi-monitor. Technically, three 1080p screens would be slightly easier to run.
> 
> I myself would opt for the three monitors as I've always wanted to play games like Skyrim and Fallout across those extra screens. If you play driving games or simulators like Elite Dangerous, they're apparently amazing experiences in multi-screen.



I already have two screens but only use one for gaming, because I code a lot and it's very helpful to have the documentation ready to use. As for games, I'll play The Witcher, Skyrim, Metro... so I think they should look beautiful across multiple monitors ^^


----------



## FAST6191 (Jul 25, 2015)

Yeah multi screens are odd when it comes to games. For the most part it is almost like trying to play widescreen 10 years ago. Fortunately said people doing the widescreen stuff have since gone on to cover multiple monitors and crazy high resolutions ( http://www.wsgf.org/mgl being a good start there).

For day to day computing I can not do without multiple monitors, preferably three but two will do, especially if I can have a portrait one. If I am typing a document with references to another, editing graphics, editing video or doing CAD then I will first seek another monitor, and even go buy a cheapo graphics card if the machine I am using does not handle it. It is that much of a dealbreaker for a lot of what I do these days.

For games, two monitors is not ideal, as the join between the screens would be right where your crosshairs/the horizon is, so you do want three. The exception is if you use one monitor like a normal single-screen game and another for inventory/map/radar/general data, but there are not many games that support such a setup, though nothing stops you from keeping extra info in a web browser/video player/chat window or something. These days most would probably look to something like Eyefinity from AMD for that sort of thing; back in the day the Matrox TripleHead2Go was the weapon of choice for things that might not technically have supported it.

Also, "4k at 60 fps" is easier said than done, and not just because that is a lot of numbers to crunch. Many 4k screens (I will leave the discussion of the various types of 4k for later) will only do 4K at 30Hz, especially over HDMI (the newer revisions of HDMI will support it, but they are not there yet in graphics cards or monitors, much less the ones mortals can afford). To that end you will probably want a DisplayPort setup, or possibly certain types of DVI (most monitors will opt for DisplayPort though). And though consumer 4k is quite literally four 1080p monitors stapled together, there are things that are perhaps not as nice about it, and dual monitors can still have some advantages.
The other side of that is that cheap 4K monitors might not be as nice as what you can get for the same money in the 1080p-or-above-but-not-4k world. Most 4K monitors that go for sane prices will be TN-type displays, whereas you can get some nice IPS stuff for similar money back in 1080p land. TN does tend to come with lower latencies by default, and modern panels do pretty well for colours and viewing angles compared to days of old, but you might care about such things. Much like consumers subsidising 4K for me, I would not mind them also doing it for colour-calibrated monitors, but that is a harder sell.
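The 30Hz-over-HDMI limitation mentioned above comes down to raw link bandwidth. A rough estimate, counting uncompressed pixel data only (ignoring blanking intervals and link encoding overhead, so real requirements are somewhat higher):

```python
# Uncompressed video bandwidth: width * height * refresh * bits per pixel.
# 24 bpp corresponds to 8-bit-per-channel RGB.
def gbit_per_s(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9

print(f"1080p @ 60Hz: {gbit_per_s(1920, 1080, 60):.1f} Gbit/s")
print(f"4K    @ 30Hz: {gbit_per_s(3840, 2160, 30):.1f} Gbit/s")
print(f"4K    @ 60Hz: {gbit_per_s(3840, 2160, 60):.1f} Gbit/s")
```

4K at 60Hz needs roughly four times the data rate of 1080p60, which lands above what HDMI 1.4-era links could carry; that is why 4K60 in this period meant DisplayPort, or waiting for HDMI 2.0 hardware.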


----------



## Amadren (Jul 25, 2015)

Oh, I didn't know that about 4K monitors... I think I'll go with 3x 1080p screens and upgrade my graphics cards (an HD 7970 CrossFire setup will do the trick?). Then I'll wait 4-6 years until 4K becomes more user friendly ^^ (and cheaper)


----------



## G0R3Z (Jul 25, 2015)

Amadren said:


> Oh, I didn't know that about 4K monitors... I think I'll go with 3x 1080p screens and upgrade my graphics cards (an HD 7970 CrossFire setup will do the trick?). Then I'll wait 4-6 years until 4K becomes more user friendly ^^ (and cheaper)



4K screens are already starting to hit low prices. And it isn't any less user friendly than multi-monitor setups, which sometimes need tweaking and configuration changes to make them work. And 4-6 years is quite a long time to wait for tech to progress; 4K should be pretty damn cheap by next year even. Having a rig to power 4K is another matter.

Another option would be a 21:9 monitor. Coders love them and they're great for gaming. Being one large screen, they don't have the ugly, distracting bezels of multi-monitor setups either. They're normally about 2560x1080, but there are also higher-res variants like Dell's 3440x1440 model.


----------



## Amadren (Jul 25, 2015)

G0R3Z said:


> 4K screens are already starting to hit low prices. And it isn't any less user friendly than multi-monitor setups, which sometimes need tweaking and configuration changes to make them work. And 4-6 years is quite a long time to wait for tech to progress; 4K should be pretty damn cheap by next year even. Having a rig to power 4K is another matter.
> 
> Another option would be a 21:9 monitor. Coders love them and they're great for gaming. Being one large screen, they don't have the ugly, distracting bezels of multi-monitor setups either. They're normally about 2560x1080, but there are also higher-res variants like Dell's 3440x1440 model.


I'll check it too, thanks ^^


----------



## tbb043 (Jul 25, 2015)

240p is all you need.


----------



## Amadren (Jul 25, 2015)

tbb043 said:


> 240p is all you need.


32*


----------



## emmanu888 (Jul 25, 2015)

I don't care about FPS. I mean, people complained that Batman wasn't at 60FPS, but it had a movie feel to it at 30FPS, to be honest


----------



## Hungry Friend (Jul 30, 2015)

60fps is obviously preferable and even needed for fighting games and other games that require pinpoint precision/timing, but the steadiness of a game's framerate is more important than 30 vs 60fps or whatever. Fluctuating framerates in action-heavy games can really fuck with your timing, so keeping the framerate consistent is the most important thing. That said, one of my favorite games is MGS3 Subsistence (PS2 version) and it has a pretty wonky framerate at times, but its gameplay is such that it doesn't bother me much. 60fps or a rock-solid 30 would obviously be preferable though. (The HD version is 60, but it fluctuates.)

Performance all the way. If you have to sacrifice resolution and general graphical fidelity to make a game run well, do it. If a game's art direction is good then it'll look good even if you have to strip it down for performance.


----------



## goober (Jul 30, 2015)

Honestly, if someone can't tell the difference between, say, GTA V running on a PS3/360 at 720p and on PC at the same quality settings at 1080p or higher, I really don't know what to say... An added benefit of higher resolutions is that the higher you go, particularly at 1440p or above, the less anti-aliasing you need: fewer jaggies. Considering that AA leaves less-than-perfect results and is a bit of a resource hog, this can be a great boon for squeezing out performance. Rather than FPS alone, FPS stability is most critical.

I personally can't stand lower-resolution games, at least when not at the native resolution of the monitor, and I can handle some FPS fluctuation, because depending on the game and engine you're just spec-whoring instead of graphics-whoring; hardly any victory there... But for racing games and certain fighting games (not that many fighting games require heavy specs anyway) I'll choose FPS over sheer resolution, simply because the fluidity there does make a difference.

Ultimately it's the hardware. People make these observations on certain monitors, stay in that class, and therefore can't imagine the other worlds. I'd definitely agree that 4K gaming isn't worth the FPS sacrifices needed these days, but 1440p is definitely the sweet spot of affordable performance and a tangibly improved gaming and desktop-space experience.


----------



## Amadren (Jul 30, 2015)

Honestly I don't see any difference between a game in 720p on a 720p monitor and a game in 1080p on a 1080p monitor ^^


----------



## Hells Malice (Aug 1, 2015)

emmanu888 said:


> I don't care about FPS. I mean, people complained that Batman wasn't at 60FPS, but it had a movie feel to it at 30FPS, to be honest



Can't tell if trolling


----------



## emmanu888 (Aug 1, 2015)

Hells Malice said:


> Can't tell if trolling


I'm not trolling. I do like having high FPS in a game, but sometimes I like that movie feel that 30FPS gives to games that run at 30FPS


----------



## Hells Malice (Aug 1, 2015)

emmanu888 said:


> I'm not trolling. I do like having high FPS in a game, but sometimes I like that movie feel that 30FPS gives to games that run at 30FPS



So this is why Ubisoft still sells games.
People like you _do_ exist.

I mean I shouldn't even be surprised anymore.


----------



## Hungry Friend (Aug 1, 2015)

I can certainly tell the difference between 720p and higher resolutions, as well as other details, but I just don't think graphics are that important as long as a game is fun to play. For example, I'd be perfectly fine with the FFVII remake having graphics on the level of FFXII (in at least 720p) or on the level of the 2006 tech demo; I'm not all that picky, and I hate seeing companies spend millions upon millions just to make a game pretty, often sacrificing gameplay & performance as a result.


----------



## G0R3Z (Aug 1, 2015)

emmanu888 said:


> I'm not trolling. I do like having high FPS in a game, but sometimes I like that movie feel that 30FPS gives to games that run at 30FPS



Just no. That's what consoles are for. PC gamers shouldn't have to put up with half finished crap like Arkham Knight. 30FPS is insulting.


----------



## emmanu888 (Aug 1, 2015)

G0R3Z said:


> Just no. That's what consoles are for. PC gamers shouldn't have to put up with half finished crap like Arkham Knight. 30FPS is insulting.



All i'm saying is i watched Arkham Knight being streamed on Twitch at 30FPS and it had a movie feel to it


----------



## GamerzHell9137 (Aug 1, 2015)

emmanu888 said:


> All i'm saying is i watched Arkham Knight being streamed on Twitch at 30FPS and it had a movie feel to it


Yeah, but it's a game, not a movie. You want smoother gameplay, not the other way round.


----------



## G0R3Z (Aug 1, 2015)

emmanu888 said:


> All i'm saying is i watched Arkham Knight being streamed on Twitch at 30FPS and it had a movie feel to it



Except it isn't a movie. If you want a movie, go watch The Dark Knight. We're PC gamers; we want the 60 FPS that we paid hundreds of pounds more to have.


----------



## Vipera (Aug 1, 2015)

Hells Malice said:


> So this is why Ubisoft still sells games.
> People like you _do_ exist.
> I mean I shouldn't even be surprised anymore.


My God what an obnoxious comment.



emmanu888 said:


> All i'm saying is i watched Arkham Knight being streamed on Twitch at 30FPS and it had a movie feel to it


You should play games at 60fps. Trust me, it's a whole different story.


----------



## Hells Malice (Aug 1, 2015)

Vipera said:


> My God what an obnoxious comment.



It sucks having taste, I know.


----------



## Vipera (Aug 1, 2015)

Hells Malice said:


> It sucks having taste, I know.


Doesn't mean you have to be a dick about it.


----------



## Hells Malice (Aug 1, 2015)

Vipera said:


> Doesn't mean you have to be a dick about it.



Oh, that was _far_ from being a dick. I thought I was quite polite about it.
Anywho, that's enough wasteful offtopic banter, eh?


----------



## The Real Jdbye (Aug 1, 2015)

I partially agree with the OP. Things look a lot smoother at 1080p compared to 720p, but that's the main difference. Games are more limited by the texture resolution IMO, so they don't really look more detailed at higher resolutions much of the time, though under certain conditions or with certain games they do. It's easier on the eyes because you don't see any jaggies, and for me that's worth it, but it requires a lot more GPU power than the improvement is really worth if your current hardware can't handle it well. Better to have a lower resolution with higher graphics settings and 60 FPS than a higher resolution with lower graphics settings and 60 FPS; the graphics settings make a bigger difference than the resolution. However, FPS is more important than anything else, up to a point. If the FPS gain is massive (like going up 20 FPS, from 20-40 or 40-60 for example), I'll turn down the settings, but if the gain isn't that big and the FPS is already 40+ and doesn't drop much below that, I'm fine with leaving the settings turned up.

As for 4K, you supposedly can't even tell the difference from 1080p unless you have abnormally good eyesight or you sit way too close to the screen. I'm talking about using a TV as a monitor (or using a 32" monitor) and sitting as close to it as you would to a smaller monitor. At that point you could tell the difference easily, but an average person sitting a normal distance away from a TV or a regular-sized monitor wouldn't be able to. That's just what I've read on some website; I don't remember which one it was, but it was one of the big ones. I think they're right though. It obviously depends on your screen and how far away you sit, as well as your eyesight, but if I can't see individual pixels and edges look smooth, how is upgrading to 4K going to improve anything?
I didn't notice much of a difference going from a 720p phone to a 1080p one; things might look a tiny bit smoother but it's so minuscule that I can't tell.

tl;dr It really depends on your screen size and how far away you sit whether it's worth it or not. For most people, 4K won't make much of a difference, but there are exceptions of course. 1080p is a reasonable screen resolution, and if you have the hardware to run games well at that resolution, it's preferable, but if you don't, it's not a big loss.


----------



## RevPokemon (Aug 1, 2015)

The Real Jdbye said:


> I partially agree with the OP. Things look a lot smoother at 1080p compared to 720p, but that's the main difference. Games are more limited by the texture resolution IMO, so they don't really look more detailed at higher resolutions much of the time, but under certain conditions or with certain games they do. It's easier on the eyes because you don't see any jaggies, and for me, that's worth it, but it requires a lot more GPU power than the improvement is really worth if your current hardware can't handle it well. Better to have a lower resolution with higher graphics settings and 60 FPS, than a higher resolution with lower graphics settings and 60 FPS, the graphics settings make a bigger difference than the resolution. However FPS is more important than anything else to a certain point. If the FPS gain is massive (like going up 20 FPS, from 20-40 or 40-60 for example), I'll turn down the settings, but if the FPS gain isn't that big and the FPS is already 40+ and doesn't drop much below that, I'm fine with leaving the settings turned up.
> 
> As for 4K, you supposedly can't even tell the difference from 1080p unless you have abnormally good eyesight, or you sit way too close to the screen. I'm talking about using a TV as a monitor (or using a 32" monitor) and sitting as close to it as you would a smaller monitor. At that point you would tell the difference easily, but an average person sitting a normal distance away from a TV or a regular-sized monitor wouldn't be able to tell the difference. That's just what I've read on some website, don't remember which one it was but it was one of the big ones. I think they're right though, it obviously depends on your screen and how far away you sit as well as your eyesight, but if I can't see individual pixels and edges look smooth, how is upgrading to 4K going to improve anything?
> I didn't notice much of a difference going from a 720p phone to an 1080p one, things might look a tiny bit smoother but it's so miniscule that I can't tell.
> ...


Oh, I know it's true, plus it depends on what you're looking at too


----------



## The Real Jdbye (Aug 1, 2015)

RevPokemon said:


> Oh, I know it's true, plus it depends on what you're looking at too


What's more important when it comes to screens nowadays is the color balance, contrast, and such. Those can make a big difference; people might get a better monitor with a higher resolution, but also better color balance and contrast, and think the resolution is what makes the difference, simply because it looks better.


----------



## RevPokemon (Aug 1, 2015)

The Real Jdbye said:


> What's more important when it comes to screens nowadays is the color balance, contrast, and such. Those can make a big difference and people might get a better monitor with a higher resolution, but also better color balance and contrast and think the resolution is what makes the difference, simply because it looks better.


Ohh I know and a good refresh rate


----------



## The Real Jdbye (Aug 1, 2015)

RevPokemon said:


> Ohh I know and a good refresh rate


Sure, that matters too, but not that much beyond 60Hz. I can easily tell the difference between 60 and 120Hz, but I'm sure 120Hz is rather overkill.


----------



## RevPokemon (Aug 1, 2015)

The Real Jdbye said:


> Sure, that matters too, but not that much beyond 60Hz. I can easily tell the difference between 60 and 120Hz, but I'm sure 120Hz is rather overkill.


Yeah, but really, for gaming higher can be bad, while for videos 120Hz is pretty nice


----------



## The Real Jdbye (Aug 2, 2015)

RevPokemon said:


> Yeah, but really, for gaming higher can be bad, while for videos 120Hz is pretty nice


You can't even get any videos that are 120fps.


----------



## RevPokemon (Aug 2, 2015)

The Real Jdbye said:


> You can't even get any videos that are 120fps.


I meant that watching stuff on a 120Hz display can look good


----------



## The Real Jdbye (Aug 2, 2015)

RevPokemon said:


> I meant that watching stuff on a 120Hz display can look good


It won't really look any better unless the source framerate supports it. You simply can't create detail that just isn't there.


----------



## RevPokemon (Aug 2, 2015)

The Real Jdbye said:


> It won't really look any better unless the source framerate supports it. You simply can't create detail that just isn't there.


Well, I always thought it affected the way the frames transition, but you know more than me.


----------



## DarkFlare69 (Aug 2, 2015)

I always go with performance. Once the game runs at 60FPS or faster, then I start tweaking settings to make it look better. I don't like laggy games with poor frame rates.


----------



## G0R3Z (Aug 2, 2015)

RevPokemon said:


> I meant that watching stuff on a 120Hz display can look good



Twitch-gameplay games are good at 120Hz - LoL, CS:GO. Games where you really need that extra edge.


----------



## FAST6191 (Aug 2, 2015)

The Real Jdbye said:


> You can't even get any videos that are 120fps.



My relatively cheap camera does 120fps, albeit at DVD resolution; various GoPros are up around there too at decent res, and plenty of cameras that cost a bit do 120fps as well.


----------



## The Real Jdbye (Aug 2, 2015)

FAST6191 said:


> My relatively cheap camera does 120fps, albeit at DVD sizes, various gopros are up around there too at decent res and plenty of cameras that cost a bit do 120fps as well.


Sure, but how often do you watch your own videos? Most of the time when you're watching something, it will be a movie, a TV show or a YouTube video, and none of those are ever available in 120 FPS. YouTube can do 60 FPS now, but I think that's the limit.


----------



## FAST6191 (Aug 2, 2015)

I suppose it is the same deal as with 4k: probably useless, or at least nothing to write home about, for most people, but if you create the content for people to see/use then that is a different matter entirely. Though most of the time 120fps footage will probably be slowed down to something far lower for use as slow motion.


----------



## Foxi4 (Aug 2, 2015)

I go for performance over resolution 9 out of 10 times. The only exception is my laptop, on which I have a tendency to select the native resolution of the panel whenever possible, but that's only because it tends to _"work better"_ when set up that way. As far as I'm concerned, 1080p is _"good enough"_ for now; I won't be jumping on the 4K bandwagon anytime soon. We're slowly but surely approaching the point where a higher screen resolution will be irrelevant, as we won't be able to perceive the individual pixels anyway. You already have to be ridiculously close to a 1080p screen in order to distinguish one pixel from another on an average-sized living room TV, so 4K doesn't provide as big of a _"boost"_ as better performance. The difference between 30FPS and 60FPS is more perceptible to me than the difference between 1080p and 4K in standard living room conditions at an optimum viewing distance.


----------



## Hungry Friend (Aug 2, 2015)

I can tell the difference between 720p & 1080p pretty easily, but the difference between 30 and 60fps is really, really obvious at all times. If you're using a small monitor, 1080p probably isn't necessary at all, but 60fps actually does improve gameplay/response time, so that's why I choose performance. Really though, it's all down to good game design. Plenty of great games have shitty resolutions and/or framerates and are still fun to play, but naturally a constant 60+fps is always nice. Playing a KOF or Street Fighter game at 30fps would be a fucking disaster.

Also, Okami is another good example of great art direction trumping raw horsepower. I just don't see the big deal about a game being low-res if it's fun to play, but that's probably because I mostly play old games. Suikoden 2, the game my avatar is inspired by, has some sexy-looking sprites, and like most 2D games in the 32-bit era, it runs at 60fps almost all the time. It's primitive as fuck by today's standards, but the art style is really nice, and the spell effects STILL look good. (FFIX has some sexy spell effects too.)


----------



## Amadren (Aug 3, 2015)

I tried playing the same game on two different monitors: the first is a Samsung 21.5", 1080p, 60Hz, using Dynamic Contrast and plugged in over HDMI; the second is a BenQ 19", 720p, 60Hz, plugged in over VGA. I always played at the highest settings and at 60fps. I saw no difference between the two games (except that I could see a wider area at 1080p than at 720p). Both were smooth and really good looking.

I tried:
The Witcher 3
World of Warcraft
Far Cry 3
Serious Sam 3 BFE.

So I think it has something to do with the screen size :/


----------



## Foxi4 (Aug 3, 2015)

Amadren said:


> So I think it has something to do with the screen size :/


It does - that, and the distance from the screen. If you pack 720 lines of pixels onto a tiny screen, those pixels will be smaller than they would be on a big screen. The bigger the screen, the bigger the pixels, and the closer you are to it, the easier it is to see them. At some point resolution will reach a plateau where we won't be able to distinguish one pixel from another, at which point we'll just stop bothering with resolution - and we're getting there with 4K and 8K. As far as standard-sized screens are concerned, 1080p is perfectly fine - you don't _need_ 4K.
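Foxi4's point can be put into rough numbers. Here is a minimal sketch (my own, not from the thread) that estimates the angular size of one pixel, assuming a flat 16:9 screen and the commonly cited ~1 arcminute figure for normal visual acuity:

```python
import math

def pixel_arcminutes(diagonal_in, horizontal_res, distance_in):
    """Approximate angular size of one pixel, in arcminutes."""
    # Width of a 16:9 screen, derived from its diagonal.
    width_in = diagonal_in * 16 / math.sqrt(16**2 + 9**2)
    pixel_in = width_in / horizontal_res
    # Angle subtended by one pixel at the given viewing distance.
    angle_rad = 2 * math.atan(pixel_in / (2 * distance_in))
    return math.degrees(angle_rad) * 60

# A 50" TV viewed from 8 feet (96"): 1080p pixels already sit below
# the ~1 arcminute acuity limit, and 4K pixels are half that size.
print(round(pixel_arcminutes(50, 1920, 96), 2))
print(round(pixel_arcminutes(50, 3840, 96), 2))
```

By this estimate a 1080p pixel on a 50" set at 8 feet subtends under an arcminute, which lines up with the "you have to be ridiculously close" observation above.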


----------



## TecXero (Aug 3, 2015)

Most games don't feel very responsive to me at lower framerates, and I always prefer gameplay performance over graphics. There are some games I can still enjoy at 30fps, but it's rare. Resolution isn't that big of a deal to me. I have most of my home theater stuff set to 720p instead of 1080p just because my furniture is far enough away that I can't tell the difference.


----------



## G0R3Z (Aug 3, 2015)

TecXero said:


> Most games don't feel very responsive to me at lower framerates, and I always prefer gameplay performance over graphics. There are some games I can still enjoy at 30fps, but it's rare. Resolution isn't that big of a deal to me. I have most of my home theater stuff set to 720p instead of 1080p just because my furniture is far enough away that I can't tell the difference.



I think it's silly how some people can honestly say that playing at 30fps is okay. It's a dreadful experience. I'd rather bump my game down to 720p than to play at 30fps.


----------



## sarkwalvein (Aug 3, 2015)

G0R3Z said:


> I think it's silly how some people can honestly say that playing at 30fps is okay. It's a dreadful experience. I'd rather bump my game down to 720p than to play at 30fps.


It depends on the type of game, really.
There are many games where it doesn't really matter to have "just 30 FPS", like point-and-click games, visual novels, tactics RPGs, most turn-based RPGs, and so on.
Of course it matters for action-based games where your reflexes come into play, like FPSes, platformers, racing games, etc.
E.g. I am suffering through Mass Effect 3, which runs at an average of 30FPS on my Wii U, but I couldn't care less if some Fire Emblem ran at 20FPS.

My rule of thumb for a home console game would be _*at least*_ 720p, 30FPS with more FPS being more welcome than more resolution.


----------



## FAST6191 (Aug 3, 2015)

G0R3Z said:


> I think it's silly how some people can honestly say that playing at 30fps is okay. It's a dreadful experience. I'd rather bump my game down to 720p than to play at 30fps.



30fps is fine for basically everything* if you have proper motion blur (note that most motion blur seen in games is very poor). However, so few games seem to want to do it that a higher frame rate does become something to consider for people.

*see also how most videos you have probably ever watched have been 24 fps, 25fps or very occasionally 30fps.
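The reason 24fps film looks smooth while a 24fps game would not comes down to per-frame exposure. A back-of-the-envelope sketch (my own numbers, for illustration only) of how far an object smears during one frame with a film-style 180-degree shutter:

```python
def blur_px(speed_px_per_s, fps, shutter_fraction=0.5):
    """Pixels an object smears across during one frame's exposure.

    shutter_fraction=0.5 models the film-standard 180-degree shutter,
    where the exposure lasts half of each frame interval.
    """
    return speed_px_per_s * shutter_fraction / fps

# A pan moving 2400 px/s at 24fps smears each object across 50 px,
# blending one frame into the next. A game renders the same motion as
# discrete 100 px jumps with no blur, which reads as judder instead.
print(blur_px(2400, 24))
```

The smear is what carries the illusion of motion between frames; a game that drew perfectly sharp frames at the same rate would look far choppier, which is FAST6191's point about "proper" motion blur.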


----------



## G0R3Z (Aug 3, 2015)

FAST6191 said:


> 30fps is fine for basically everything* if you have proper motion blur (note that most motion blur seen in games is very poor). However, so few games seem to want to do it that a higher frame rate does become something to consider for people.
> 
> *see also how most videos you have probably ever watched have been 24 fps, 25fps or very occasionally 30fps.



Indeed. But for gaming, 30 fps is just not fast enough.


----------



## FAST6191 (Aug 3, 2015)

I maintain that if you were shown 30fps CGI/computer game footage with proper motion blur - which is to say, blur relative to the objects, distances and speeds displayed, rather than the distant cousin of image morphing that games which try to do it pass off as motion blur - then you would be fine with it. The whole "30fps is cinematic" stuff that various game publishers have tried on is complete and utter horse shit, but that is perhaps a different discussion.

The input/reaction time argument (ooh, it is double...) has some marginal potential, especially at the ultra high end, which is not where most people operate. However, most of that is lost in the noise of screen latency and probably controller latency* as well.

*this gets to be fun. If you kick it old school (still a viable approach, so I am not complaining) and tie a controller read/debounce to vblanks rather than something a bit more stateful/event-driven, then it becomes a greater issue.
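FAST6191's point about frame-rate latency getting lost in the noise can be made concrete with a toy model. The 40ms display lag figure below is an assumption for illustration, not a measurement; the model assumes input is read once per vblank, as in the old-school scheme described above:

```python
def worst_case_latency_ms(fps, display_lag_ms=40.0):
    """Rough worst-case button-press-to-photon latency.

    Assumes input is sampled once per vblank (so a press can wait up
    to one full frame), the sampled input then takes one frame to
    render, and the display adds a fixed processing lag on top.
    """
    frame_ms = 1000.0 / fps
    return frame_ms + frame_ms + display_lag_ms

# Doubling the frame rate shaves off ~33ms, but a laggy TV dominates:
print(round(worst_case_latency_ms(30), 1))  # 30fps case
print(round(worst_case_latency_ms(60), 1))  # 60fps case
```

Under these assumptions, going from 30 to 60fps cuts worst-case latency from roughly 107ms to roughly 73ms - a real difference, but one that a slow display or controller can easily swamp.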


----------



## Foxi4 (Aug 3, 2015)

G0R3Z said:


> I think it's silly how some people can honestly say that playing at 30fps is okay. It's a dreadful experience. I'd rather bump my game down to 720p than to play at 30fps.


I always found 30 FPS to be perfectly fine as long as it's stable. Your brain will naturally adjust to the framerate and the response time without you realizing it; it's when the framerate fluctuates that you begin noticing issues. Stability of framerate is more important than the framerate itself - an unstable framerate is very noticeable and pulls you out of the game quickly.


----------



## WiiCube_2013 (Aug 3, 2015)

Speaking for myself, the games I play in high definition almost never look as pleasing as the gameplay videos on YouTube, but even so they're loads of fun to play. So yeah, it's something that doesn't go unnoticed, but I just let it slide since I don't mind it.


----------



## Hungry Friend (Aug 3, 2015)

A stable 30fps is fine for most games; the exceptions are those that require really, really precise timing, like fighting games, old-school shmups, some racing games and some FPS games. Generally speaking, though, I agree that a steady 30 is fine. If it's a choice between 1080p/30fps and 720p/60fps, I'll choose 720p/60 every time, however, even if that means less texture detail as well. Unless you're playing on a very large screen, resolution differences aren't going to matter much. There's a huge difference between 720p and 1080p on my 50 inch LCD TV, but on my 23 inch PC monitor it's not all that noticeable.

Games have always had fluctuating framerates at times, but from the 2600 through the SNES/Genesis eras, 60fps (50 for PAL games) was generally the target. When polygonal games began to take over, framerate/performance was often sacrificed for more graphical detail, and it's been that way ever since. 2D games during the 32-bit era were almost always 60fps as well, and a surprising number of PS2 games are 60fps too.


----------



## TecXero (Aug 3, 2015)

FAST6191 said:


> 30fps is fine for basically everything* if you have proper motion blur (note that most motion blur seen in games is very poor). However, so few games seem to want to do it that a higher frame rate does become something to consider for people.
> 
> *see also how most videos you have probably ever watched have been 24 fps, 25fps or very occasionally 30fps.


Watching it at 30fps, yeah, that's fine, but playing it isn't. Motion blur doesn't help make it feel more responsive, and it even gives me a bit of a headache. That's why I couldn't enjoy Sonic Colors on Wii, a game dependent on reaction time. It can break some games for me. That said, there are still games I can enjoy despite the low framerate, like most of the third-person LoZ games. I'd still love to see them at 60fps, but they got them to where they feel responsive enough for me to enjoy them.


----------



## FAST6191 (Aug 3, 2015)

TecXero said:


> Watching it at 30fps, yeah, that's fine, but playing it isn't. Motion blur doesn't help make it feel more responsive, and it even gives me a bit of a headache. That's why I couldn't enjoy Sonic Colors on Wii, a game dependent on reaction time. It can break some games for me. That said, there are still games I can enjoy despite the low framerate, like most of the third-person LoZ games. I'd still love to see them at 60fps, but they got them to where they feel responsive enough for me to enjoy them.



I have not seen more than a tech demo attempt proper motion blur as it appears in real life. Most games basically blur, or try to interpolate a frame in a manner similar to the classic morph-between-two-shapes transition effect*. If you use that weaker approach, then headaches, visual discomfort, reduced ability in games (watch a figure skater, a vert skateboarder, or anybody that spins around a lot, and see how they move their head during a spin - it is a related problem), and motion sickness from not being able to focus on distant objects properly are all to be expected. Real motion blur is a relative thing covering peripheral vision, distance to objects, speeds and all manner of other things - nothing too drastic really (maths-wise I reckon I could set a 3-hour exam for 18-year-olds learning physics to solve a short scene), but not entirely trivial. In real life the rendering is taken care of by reality itself, and even crazy low shutter speeds on a camera are still enough to get something. I do not know why they are sinking so much money into light/shadow rendering when this could be a thing, but hey.

*for example, just with less change as it is only between two frames and at many times a second as appropriate.


The reaction time thing is something of a... I do not want to say strawman, as there is at least the kernel of an argument there that could apply to some people. When programmed in a somewhat sensible fashion, though, it is more or less only likely to matter for those that are basically this guy (so not me, probably not anyone else on this site, and I dare say it is not going to be a common ability even among the professional twitch-game set).


----------

