Impressive realistic GTA V mod for PC (WIP)

Check out this video of Toddyhancer, a GTA V mod for PC that makes the game look amazingly realistic:



It is still a WIP (work in progress), but it looks mighty impressive so far. Do you have what it takes to run it? For more information, check out Martin Bergman's Facebook page.
 

omgpwn666

Guy gamer and proud!
Member
Joined
Jun 14, 2008
Messages
2,546
Trophies
0
Age
32
Location
Florida
XP
608
Country
United States
60FPS isn't even that good. 144Hz is where it's at. Going back to 60FPS is like a slideshow.

Can you explain what you mean? I'm a little lost, since 60FPS seems to be the most I can notice in an FPS change. I've played games locked at 30fps (eww), I've played them at 60fps, which I think looks beautiful, but I've also played games at 340-500 FPS and only noticed a 60 FPS feel with massive screen tearing. Also, with a 144Hz monitor, 30fps games still look like 30fps. Is that just my perception, or are 144Hz monitors supposed to make 30fps look like 60fps?
 

dfsa3fdvc1

Well-Known Member
Member
Joined
Jan 3, 2015
Messages
226
Trophies
0
XP
214
Country
Albania
but I've also played games at 340-500 FPS and only noticed a 60 FPS feel with massive screen tearing. Also, with a 144Hz monitor, 30fps games still look like 30fps. Is that just my perception, or are 144Hz monitors supposed to make 30fps look like 60fps?

Your GPU can output games at pretty much any frame rate (30, 60, 144, 500, 1000, 5000...), but you need a monitor that can display it to take advantage of that. So unless you went out and bought a 144Hz monitor, the highest frame rate you've ever seen is probably 60FPS.
144Hz monitors are only useful when you feed 144Hz video into them. If you hook up a console game running @ 30FPS, it'll look exactly like it does on any other TV. They're only useful when you have a GPU capable of running a game at that 144Hz frame rate.
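A toy sketch of that bottleneck (the function name here is made up for illustration): the number of unique frames you can actually see is capped by whichever is lower, the GPU's render rate or the panel's refresh rate.

```python
def effective_refresh(gpu_fps: float, monitor_hz: float) -> float:
    """Unique frames per second you actually see: capped by both the
    GPU's render rate and the monitor's refresh rate."""
    return min(gpu_fps, monitor_hz)

# A 300 FPS game on a 60 Hz panel still shows at most 60 unique frames/s;
# a 30 FPS console feed on a 144 Hz panel is still 30 FPS:
print(effective_refresh(300, 60))   # 60
print(effective_refresh(30, 144))   # 30
```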

Also, sidenote: those TVs you see at Best Buy advertising "Plasma screen 240Hz!" or "600Hz" are bullshit. That's frame interpolation, where the TV takes the 30FPS or 60FPS video it receives and generates fake in-between frames to make it appear smoother, but the effect is pretty bad IMO and nothing like what you see on a true 144Hz monitor displaying 144FPS from a PC game.
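For illustration only, here is roughly what the crudest form of frame interpolation amounts to, sketched as a linear blend over tiny 1-D "frames". Real TVs use motion-compensated interpolation, which is far more elaborate; every name below is made up.

```python
def interpolate(frame_a, frame_b, t):
    """Synthesise a fake in-between frame by linearly blending pixel
    values; t=0 returns frame_a, t=1 returns frame_b."""
    return [a + (b - a) * t for a, b in zip(frame_a, frame_b)]

# Two real frames (flat grayscale rows) and a synthetic midpoint frame:
f0 = [0.0, 0.25, 0.5]
f1 = [1.0, 0.75, 0.5]
print(interpolate(f0, f1, 0.5))  # [0.5, 0.5, 0.5]
```

The blended frame was never rendered by the source; it is manufactured by the display, which is why the advertised "240Hz"/"600Hz" figures say nothing about the input signal.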
 

Maximilious

Whistles a familiar tune
Member
Joined
Nov 21, 2014
Messages
2,571
Trophies
1
XP
1,855
Country
United States
People seem to have latched on to 60fps as the way forward, and it has almost become a mantra, which is seldom good.
To me that would then leave the framerate dip issue (losing 20fps from a baseline of 30 is very noticeable; losing 20 from a baseline of 60 is considerably less noticeable with the right screen setup) as the main benefit.

I agree. I grew up in the 8-bit/16-bit era, and all this FPS jargon has just muddied the waters. Most people can't even see the difference in anything higher than 40fps. I was going to mention that perhaps 60fps is the "baseline" because of the potential loss of frames depending on the system, but in the end, to me it really doesn't matter as long as I can play the game comfortably. As for the video above, I don't see any problem with the frame rate, and I know my PC could handle the extra load if it were more resource intensive.
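The frame-dip asymmetry mentioned in the quote can be put in frame-time terms with a little arithmetic (a rough sketch; the helper name is made up):

```python
def frame_time_ms(fps: float) -> float:
    """How long each frame sits on screen, in milliseconds."""
    return 1000.0 / fps

# Dropping 20 FPS from a 30 FPS baseline vs from a 60 FPS baseline:
dip_from_30 = frame_time_ms(10) - frame_time_ms(30)  # each frame lingers ~66.7 ms longer
dip_from_60 = frame_time_ms(40) - frame_time_ms(60)  # each frame lingers only ~8.3 ms longer
print(round(dip_from_30, 1), round(dip_from_60, 1))  # 66.7 8.3
```

The same 20 FPS loss costs roughly eight times more added frame time at the 30 FPS baseline, which is why dips from 30 feel so much worse.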
 

obs123194

Well-Known Member
Member
Joined
Mar 9, 2014
Messages
787
Trophies
0
Age
29
XP
953
Country
United States
60fps is all I need. The human eye can only notice that many frames anyway. When I play LoL or CS:GO, all those unnecessary frames don't feel any different from 60+.
 

dfsa3fdvc1

Well-Known Member
Member
Joined
Jan 3, 2015
Messages
226
Trophies
0
XP
214
Country
Albania
60fps is all I need. The human eye can only notice that many frames anyway. When I play LoL or CS:GO, all those unnecessary frames don't feel any different from 60+.

That's because you probably have a 60Hz monitor. Even if you can run the game at a higher frame rate (e.g. 300FPS), it's bottlenecked by your monitor. If you had a 144Hz monitor, you would definitely see the difference.
 

Steena

Well-Known Member
Member
Joined
Sep 13, 2009
Messages
647
Trophies
0
XP
763
Country
Italy
Can you explain in detail how you think motion blur in video games should evolve? Do you mean real-time calculated interpolation or something like that? If so, wouldn't that hurt games by producing inconsistent frame-by-frame visual results for the same situations? Arbitrary interpretation works in a movie because you only ever see everything once, so it doesn't really matter. But since games work on loops, wouldn't the same animation changing across a thousand viewings be a problem?
 

omgpwn666

Guy gamer and proud!
Member
Joined
Jun 14, 2008
Messages
2,546
Trophies
0
Age
32
Location
Florida
XP
608
Country
United States
Your GPU can output games at pretty much any frame rate (30, 60, 144, 500, 1000, 5000...), but you need a monitor that can display it to take advantage of that. So unless you went out and bought a 144Hz monitor, the highest frame rate you've ever seen is probably 60FPS.
144Hz monitors are only useful when you feed 144Hz video into them. If you hook up a console game running @ 30FPS, it'll look exactly like it does on any other TV. They're only useful when you have a GPU capable of running a game at that 144Hz frame rate.

Also, sidenote: those TVs you see at Best Buy advertising "Plasma screen 240Hz!" or "600Hz" are bullshit. That's frame interpolation, where the TV takes the 30FPS or 60FPS video it receives and generates fake in-between frames to make it appear smoother, but the effect is pretty bad IMO and nothing like what you see on a true 144Hz monitor displaying 144FPS from a PC game.

Awesome man. Thanks for the information, I do appreciate it.
 

jonthedit

Well-Known Member
Member
Joined
May 30, 2011
Messages
1,682
Trophies
0
XP
1,010
Country
Bangladesh
yes, put them in the crazy bin along with people who can't tell the difference between 320kbps vs FLAC and 720p (upscaled to 1080p) vs 1080p
Hey, I love my 144Hz 60fps+!
But I honestly can't tell the difference in sound from 192kbps to 320kbps. FLAC is a bit overkill; it's not that big a difference.
Awesome man. Thanks for the information, I do appreciate it.
Everything he said. Also note that Windows can run at 144Hz; it feels great :)
 

Arras

Well-Known Member
Member
Joined
Sep 14, 2010
Messages
6,318
Trophies
2
XP
5,421
Country
Netherlands
yes, put them in the crazy bin along with people who can't tell the difference between 320kbps vs FLAC and 720p (upscaled to 1080p) vs 1080p
[image: chart of listening-test results]

(source: Amandine Pras, et al. Subjective evaluation of mp3 compression for different musical genres. Audio Engineering Society Convention 127. Audio Engineering Society, 2009.)
Yeah okay.
 

FAST6191

Techromancer
Editorial Team
Joined
Nov 21, 2005
Messages
36,798
Trophies
3
XP
28,348
Country
United Kingdom
Can you explain in detail how you think motion blur in video games should evolve? Do you mean real-time calculated interpolation or something like that? If so, wouldn't that hurt games by producing inconsistent frame-by-frame visual results for the same situations? Arbitrary interpretation works in a movie because you only ever see everything once, so it doesn't really matter. But since games work on loops, wouldn't the same animation changing across a thousand viewings be a problem?

I should also mention the old CRT refresh rate stuff. CRTs did benefit from high refresh rates; nobody has mentioned it yet, but it still informs some people's thinking even if it no longer applies.

http://www.theregister.co.uk/2013/06/25/the_future_of_moving_images_the_eyes_have_it/ has a bit more on the subject in general.

If you are filming something with a video camera then physics gives you proper motion blur, give or take some of the shutter speed stuff, but even then it is still "real". Heavy CGI can trouble this by similar means to how games fail, which is presumably why the 48fps Hobbit stuff was so disagreeable to many.

Games, and even general CGI, then either basically blur/interpolate the whole frame and hope for the best, or -- I have seen a few (mainly racing games and ones you do not move too much horizontally in) do this -- apply some kind of... I guess I would call it a blur vignette (though this is more of an effect; in racing games it tends to come as you hit nitro/top speed/a turbo boost).

In reality though it is all relative in terms of speeds, and speeds relative to you/the camera. If you do not want to fake it then it really does boil down to "stop that, copy what physics does", which would mean items have the proper blur calculated from their velocities in the world and rendered accordingly. We already do backface culling, camera visibility culling (if something behind an obstruction can not be seen by the camera then it is not rendered) and multiple levels of shadow/light reflection, and calculation-wise a back-of-the-envelope session would put proper motion blur at a similar order of difficulty.

Humans are pretty attuned to motion blur as reality has it, and the weak attempts at it are presumably why so many dislike the present implementations. I can only hope that has not truly hurt its chances of happening.
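As a hypothetical illustration of the "blur from world velocities" idea, here is a toy 1-D accumulation-buffer sketch. Every name below is made up, and real renderers work with per-pixel velocity buffers in shaders rather than re-rendering sub-frames, but the principle is the same: a fast object smears in proportion to how far it travels during the shutter interval.

```python
WIDTH = 12  # pixels in our tiny 1-D "frame"

def render_square(x):
    """Render one 1-D frame: intensity 1.0 where a 2-pixel object sits."""
    return [1.0 if x <= i < x + 2 else 0.0 for i in range(WIDTH)]

def blurred_frame(x, velocity, samples=8):
    """Accumulation-buffer motion blur: average several sub-frame renders
    taken along the object's velocity over one shutter interval."""
    acc = [0.0] * WIDTH
    for s in range(samples):
        sub_x = int(x + velocity * s / samples)  # object position within the shutter
        for i, v in enumerate(render_square(sub_x)):
            acc[i] += v
    return [v / samples for v in acc]

sharp = blurred_frame(2, velocity=0)    # stationary object: crisp edges
smeared = blurred_frame(2, velocity=6)  # fast object: intensity spread along its path
print(max(sharp), max(smeared))         # the moving object's peak intensity is lower
```

Note that the total light energy is the same in both frames; it is only redistributed along the motion path, which is exactly what a physical camera does.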
 

Vipera

Banned!
Banned
Joined
Aug 22, 2013
Messages
1,583
Trophies
0
Location
Away from this shithole
XP
1,365
Country
United States
People seem to have latched on to 60fps as the way forward, and it has almost become a mantra, which is seldom good. Most general video does pretty well at 24-30, but usually manages this by having motion blur relative to the speed of things moving in the image (mainly as physics sorts all that out) -- the horizon will be fairly static, but your peripheral vision is a different matter, much less the ball speeding towards your face. The lack of motion blur is obscured (hah) somewhat by pumping the frame rate right up, but I am not sure it is the better route -- it is a lot of pixels to pump, and the bandwidth could probably be spent better. Conventional game-style motion blur basically morphs images ( http://research.microsoft.com/en-us/um/people/hoppe/proj/morph/ ) rather than blurring them relative to the motion, which many people seem quite sensitive to.
Not many game companies/graphics devs seem to be researching it, which is odd to me given the effort put into certain types of light physics/reflections/shadows and even the effort put into fog.

There is an argument for input/reaction times, though I would hold that 30 is good enough for most people who are not this guy. Likewise, it is possible to divorce control reads from graphics updates/framebuffers/vblanks. To me that would then leave the framerate dip issue (losing 20fps from a baseline of 30 is very noticeable; losing 20 from a baseline of 60 is considerably less noticeable with the right screen setup) as the main benefit.
I hate le pc masterrace hivemind and I hate all the enthusiasts who treat others like shit because they aren't as tech-savvy, but dude, 30 vs 60fps makes a HUGE difference. Try playing San Andreas with the frame limiter off and then on. The first time I did it, I felt nauseous.
 

FAST6191

Techromancer
Editorial Team
Joined
Nov 21, 2005
Messages
36,798
Trophies
3
XP
28,348
Country
United Kingdom
I hate le pc masterrace hivemind and I hate all the enthusiasts who treat others like shit because they aren't as tech-savvy, but dude, 30 vs 60fps makes a HUGE difference. Try playing San Andreas with the frame limiter off and then on. The first time I did it, I felt nauseous.
Never said it did not; I just said that the lack of proper motion blur in games means a 60fps-or-bust mindset has arisen, when 30fps could well work just fine if the motion blur were sorted, opening up room for other things.
 

Edgarska

Conjurer of cheap tricks
Member
Joined
Oct 24, 2011
Messages
797
Trophies
0
Age
34
XP
2,084
Country
United States
Never said it did not; I just said that the lack of proper motion blur in games means a 60fps-or-bust mindset has arisen, when 30fps could well work just fine if the motion blur were sorted, opening up room for other things.

But you're taking detail away with motion blur, on top of the lower frame rate. Motion blur is not needed in games because our eyes already take care of that.
 

Hells Malice

Are you a bully?
Member
GBAtemp Patron
Joined
Apr 9, 2009
Messages
7,122
Trophies
3
Age
32
XP
9,271
Country
Canada
People seem to have latched on to 60fps as the way forward, and it has almost become a mantra, which is seldom good. Most general video does pretty well at 24-30, but usually manages this by having motion blur relative to the speed of things moving in the image (mainly as physics sorts all that out) -- the horizon will be fairly static, but your peripheral vision is a different matter, much less the ball speeding towards your face. The lack of motion blur is obscured (hah) somewhat by pumping the frame rate right up, but I am not sure it is the better route -- it is a lot of pixels to pump, and the bandwidth could probably be spent better. Conventional game-style motion blur basically morphs images ( http://research.microsoft.com/en-us/um/people/hoppe/proj/morph/ ) rather than blurring them relative to the motion, which many people seem quite sensitive to.
Not many game companies/graphics devs seem to be researching it, which is odd to me given the effort put into certain types of light physics/reflections/shadows and even the effort put into fog.

There is an argument for input/reaction times, though I would hold that 30 is good enough for most people who are not this guy. Likewise, it is possible to divorce control reads from graphics updates/framebuffers/vblanks. To me that would then leave the framerate dip issue (losing 20fps from a baseline of 30 is very noticeable; losing 20 from a baseline of 60 is considerably less noticeable with the right screen setup) as the main benefit.

Well, once you stop being a shitty console gamer you'll notice a massive difference between 30 and 60 FPS. I can't see the difference any better than anyone else, but I can definitely FEEL it. Games feel disgusting when they run at subpar FPS. Some games work fine, but anything actiony does not work at all under 60. A stable framerate is the most important thing, of course, but anything under 60 feels like you're slogging through shit. Games that require precise movements also feel awful under 60 FPS (shooters and the like).

As usual I have literally no clue what you were actually talking about but at least your first sentence was in English.
 

FAST6191

Techromancer
Editorial Team
Joined
Nov 21, 2005
Messages
36,798
Trophies
3
XP
28,348
Country
United Kingdom
But you're taking detail away with motion blur, on top of the lower frame rate. Motion blur is not needed in games because our eyes already take care of that.
It is not needed if you go to higher frame rates, which make up for quite a bit; however, it could make lower frame rates work, and if we are all supposed to be hankering for realistic reflections, shadows, light and whatever else, then realistic motion blur belongs on that list too. Your eyes/visual system does not take care of it at all, though; it will fool you into seeing motion in said sequence of still images, but that is not motion blur. Likewise, a removal of detail is not necessarily a bad thing even if you do not want realism as much; indeed, part of the issue, if you are thinking about motion blur as you might have seen it in games, is the lack of selective detail removal.

Well, once you stop being a shitty console gamer you'll notice a massive difference between 30 and 60 FPS. I can't see the difference any better than anyone else, but I can definitely FEEL it. Games feel disgusting when they run at subpar FPS. Some games work fine, but anything actiony does not work at all under 60. A stable framerate is the most important thing, of course, but anything under 60 feels like you're slogging through shit. Games that require precise movements also feel awful under 60 FPS (shooters and the like).

As usual I have literally no clue what you were actually talking about but at least your first sentence was in English.

I never said that there was not a difference (if you compare current game-style 30fps vs 60fps then there is a very clear difference for just about everything that attempts motion in its graphics), just that the way people currently set about it is potentially not the only way, with proper motion blur being the way forward in that instance. I can see the desire for stable fps, especially at lower rates where a dip becomes far more noticeable, and with current monitor/input setups, but it need not be the be-all and end-all either. It is a fundamental difference, though not a terribly radical one, and most people will not have seen proper/realistic motion blur in 30fps CGI before; that is OK though, as pretty much everybody has seen conventional video, which works wonderfully at lower frame rates.
 
