People seem to have latched on to 60fps as the way forward, and it has almost become a mantra, which is seldom good. Most general video does fine at 24-30fps, but it usually manages that by having motion blur proportional to the speed of things moving in the image (camera physics sorts all that out for free) -- the horizon will be fairly static, but things in your peripheral vision are a different matter, much less the ball speeding towards your face. The lack of motion blur can be obscured (hah) somewhat by pumping the frame rate right up, but I am not sure it is the better route -- it is a lot of pixels to pump, and the bandwidth could probably be spent better. Conventional game-style motion blur basically morphs between images ( http://research.microsoft.com/en-us/um/people/hoppe/proj/morph/ ) rather than blurring things relative to their motion, and many people seem quite sensitive to the difference.
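To make the distinction concrete, here is a minimal CPU-side sketch of per-pixel velocity-based blur -- the kind that is relative to motion. All the types and buffer names are made up for illustration; a real engine would do something like this in a fragment shader against a velocity buffer rather than on the CPU:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Hypothetical framebuffer types, for illustration only.
struct Vec2  { float x, y; };
struct Color { float r, g, b; };

// Velocity-based motion blur: each pixel is averaged along its own
// screen-space motion vector, so a fast-moving ball smears while the
// static horizon stays sharp -- roughly what a camera does, and the
// opposite of morphing/interpolating between whole frames.
void motionBlur(const std::vector<Color>& src,      // shaded frame
                const std::vector<Vec2>&  velocity, // per-pixel motion, pixels/frame
                std::vector<Color>&       dst,
                int width, int height, int taps = 8)
{
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const int  idx = y * width + x;
            const Vec2 v   = velocity[idx];
            Color sum{0, 0, 0};
            // Sample backwards along the motion vector and average.
            for (int i = 0; i < taps; ++i) {
                float t  = static_cast<float>(i) / taps;
                int   sx = std::clamp(static_cast<int>(std::lround(x - v.x * t)), 0, width  - 1);
                int   sy = std::clamp(static_cast<int>(std::lround(y - v.y * t)), 0, height - 1);
                const Color& c = src[sy * width + sx];
                sum.r += c.r; sum.g += c.g; sum.b += c.b;
            }
            dst[idx] = { sum.r / taps, sum.g / taps, sum.b / taps };
        }
    }
}
```

The point is that the blur falls out of each pixel's own motion, with no extra frames needed, rather than falling out of the difference between two whole images.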
Not many game companies/graphics devs seem to be researching it, which is odd to me given the effort put into certain types of light physics/reflections/shadows, and even the effort put into fog.
There is an argument for input/reaction times, though I would hold that 30 is good enough for most people who are not this guy. Likewise, it is possible to divorce control reads from graphics updates/framebuffers/vblanks (see the sketch below). To me that would then leave the framerate dip issue as the main benefit of a 60fps baseline: losing 20fps from a baseline of 30 takes you from 33ms to 100ms per frame, which is very noticeable, while losing 20 from a baseline of 60 only goes from about 17ms to 25ms and is considerably less noticeable with the right screen setup.
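On divorcing control reads from the framerate, here is a rough sketch of one way to do it, assuming a simple threaded setup. PadState, pollPad and renderFrame are made-up stand-ins for whatever the platform actually provides:

```cpp
#include <atomic>
#include <chrono>
#include <cstdint>
#include <thread>

// Hypothetical controller snapshot; trivially copyable so it can sit in an atomic.
struct PadState { std::uint32_t buttons = 0; float stickX = 0, stickY = 0; };

PadState pollPad()                  { return {}; } // stub: real code reads the hardware
void     renderFrame(const PadState&) {}           // stub: real code draws a frame

std::atomic<PadState> g_latestInput{};

// Input thread: sample the controller at ~240Hz regardless of framerate,
// so even a 30fps game sees presses within about 4ms of when they happened.
void inputLoop(std::atomic<bool>& running)
{
    while (running.load()) {
        g_latestInput.store(pollPad());
        std::this_thread::sleep_for(std::chrono::microseconds(1'000'000 / 240));
    }
}

// Render loop: held to ~30fps (the sleep stands in for the vblank wait),
// but each frame picks up the freshest input rather than one read a whole
// frame ago.
void renderLoop(std::atomic<bool>& running)
{
    while (running.load()) {
        renderFrame(g_latestInput.load());
        std::this_thread::sleep_for(std::chrono::milliseconds(33));
    }
}

int main()
{
    std::atomic<bool> running{true};
    std::thread input(inputLoop, std::ref(running));
    std::thread render(renderLoop, std::ref(running));
    std::this_thread::sleep_for(std::chrono::seconds(1)); // run briefly for the demo
    running = false;
    input.join();
    render.join();
}
```

With something like that, input latency is bounded by the sampling thread rather than by the frame rate, which takes a fair bit of the air out of the reaction-time argument for 60fps.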