Microsoft cuts touchscreen lag to 1ms

shakirmoledina

Legend
OP
Member
Joined
Oct 23, 2004
Messages
6,613
Trophies
0
Age
34
Location
Dar es Salaam
Website
vfootball.co.nf
XP
830
Country
Tanzania
Most panels and controllers out there suffer from about 100ms of delay. For taps and slow swipes that's not an issue, but as you whip your finger around the screen faster and faster (say, while quickly doodling in a painting app), the lag becomes quite apparent. The powerful minds over at Microsoft Research have figured out a way to get that delay down to a measly 1ms.

Really amazing video demonstration. Skip to around 0:52.

P.S. - you can test it with your iPhone or iPad.

 

Qtis

Grey Knight Inquisitor
Member
Joined
Feb 28, 2010
Messages
3,817
Trophies
2
Location
The Forge
XP
1,737
Country
Antarctica
Looks cool. Not sure it has a real impact on gaming per se, since the lag can be compensated for in code (to the point of not noticing much difference), but in drawing I can really see the advantage. So is this basically a new screen type? I wonder how long it'll take to become reasonably priced.
 

Foxi4

Endless Trash
Global Moderator
Joined
Sep 13, 2009
Messages
30,818
Trophies
3
Location
Gaming Grotto
XP
29,789
Country
Poland
Yeah, Mr. Microsoft Guy, but you're forgetting that even with decreased latency, the line will be drawn only as fast as the display can refresh itself. Today's mobile devices run at either 30 or 60 frames per second - even if you have a latency of 1ms (the position of the touchpoint is calculated 1000 times a second), you're still going to refresh the screen just 60 times a second, creating another kind of lag: the CPU updating the position of the touchpoint faster than the display is capable of showing it.
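The numbers in this argument can be sketched in a few lines of Python (the 30/60 Hz and 1000 Hz figures come from the post itself, not from Microsoft's paper):

```python
# Back-of-the-envelope: display-imposed delay vs. touch sampling delay.
TOUCH_HZ = 1000    # touch positions sampled per second (1 ms latency)
DISPLAY_HZ = 60    # typical mobile refresh rate

touch_period_ms = 1000 / TOUCH_HZ    # 1.0 ms between touch samples
frame_period_ms = 1000 / DISPLAY_HZ  # ~16.7 ms between displayed frames

# A sample taken right after a frame is shown waits almost a full frame
# before it can appear on screen, regardless of how fast touch is sampled.
worst_display_wait_ms = frame_period_ms
samples_per_frame = TOUCH_HZ // DISPLAY_HZ  # ~16 samples queue up per frame

print(round(worst_display_wait_ms, 1), samples_per_frame)
```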
 
  • Like
Reactions: 3 people

saberjoy

Well-Known Member
Member
Joined
Oct 9, 2011
Messages
548
Trophies
0
Age
28
Location
somewhere you only dream of!
XP
283
Country
India
Yeah, Mr. Microsoft Guy, but you're forgetting that even with decreased latency, the line will be drawn only as fast as the display can refresh itself. Today's mobile devices run at either 30 or 60 frames per second - even if you have a latency of 1ms (the position of the touchpoint is calculated 1000 times a second), you're still going to refresh the screen just 60 times a second, creating another kind of lag: the CPU updating the position of the touchpoint faster than the display is capable of showing it.
Then maybe the 'way' they discovered involves a tablet with a higher refresh rate? Just speculation.
 

Foxi4

Endless Trash
Global Moderator
Joined
Sep 13, 2009
Messages
30,818
Trophies
3
Location
Gaming Grotto
XP
29,789
Country
Poland
Yeah, Mr. Microsoft Guy, but you're forgetting that even with decreased latency, the line will be drawn only as fast as the display can refresh itself. Today's mobile devices run at either 30 or 60 frames per second - even if you have a latency of 1ms (the position of the touchpoint is calculated 1000 times a second), you're still going to refresh the screen just 60 times a second, creating another kind of lag: the CPU updating the position of the touchpoint faster than the display is capable of showing it.
Then maybe the 'way' they discovered involves a tablet with a higher refresh rate? Just speculation.
That is correct, but to achieve this kind of precision one would have to refresh at 1000 FPS, which is fine and dandy when all you display is a white square, but not so much when you have to render a 3D scene.
 

saberjoy

Well-Known Member
Member
Joined
Oct 9, 2011
Messages
548
Trophies
0
Age
28
Location
somewhere you only dream of!
XP
283
Country
India
Yeah, Mr. Microsoft Guy, but you're forgetting that even with decreased latency, the line will be drawn only as fast as the display can refresh itself. Today's mobile devices run at either 30 or 60 frames per second - even if you have a latency of 1ms (the position of the touchpoint is calculated 1000 times a second), you're still going to refresh the screen just 60 times a second, creating another kind of lag: the CPU updating the position of the touchpoint faster than the display is capable of showing it.
Then maybe the 'way' they discovered involves a tablet with a higher refresh rate? Just speculation.
That is correct, but to achieve this kind of precision one would have to refresh at 1000 FPS, which is fine and dandy when all you display is a white square, but not so much when you have to render a 3D scene.
1000 FPS? Sheesh, that's just overkill. And to think I hadn't even realised there was any lag in the iPhone 4's screen.
 

Foxi4

Endless Trash
Global Moderator
Joined
Sep 13, 2009
Messages
30,818
Trophies
3
Location
Gaming Grotto
XP
29,789
Country
Poland
1000 FPS? Sheesh, that's just overkill. And to think I hadn't even realised there was any lag in the iPhone 4's screen.
It's not just overkill - it's pointless. The human eye can't perceive anything over approximately 72 FPS, so why refresh faster than that?

I can see what they're doing - they're trying to register more waypoints as the touchpoint moves and display the whole collection at the next possible frame, creating a better, more precise approximation of the route your finger traveled. That can indeed be done, but I would like to see it working in an actual rendered environment rather than just as a white box floating around - then we'll see if their new method, whatever it is, is usable in today's systems.
 

epicCreations.or

Well-Known Member
Member
Joined
Mar 13, 2010
Messages
356
Trophies
0
Location
Austin, TX
Website
whalecakes.com
XP
79
Country
United States
1. The human eye is not limited to arbitrary numbers like "72 FPS". We each perceive motion dynamically and differently, and our eyes compensate in different ways depending on what we're seeing. LINK

2. Just because the display can't refresh faster than 60Hz doesn't mean there won't be less lag. The lag we see is INPUT lag, not refresh lag.

EXAMPLE
In this test, for demonstration purposes, a finger moves along a single axis at 1 unit per millisecond. The lag therefore shows up as a 1 unit offset per millisecond of input lag at each output update, as shown in figure A. With a 50ms lag display, there is a 50 unit offset between what is happening and what is actually displayed; with a 1ms lag display, only a 1 unit offset.

[Image: spreadsheet.png]

Figure A: I destroy your argument with math.

As you can see, this technique of removing input lag is mostly unaffected by display refresh rate: the output is dramatically improved even on the same display. The results would be smoother still with a higher refresh rate, of course, but the refresh rate does not have to increase to see the fruits of this advance.
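The offset arithmetic in figure A can be reproduced in a couple of lines (a sketch of this post's model only, not of Microsoft's actual pipeline):

```python
def displayed_offset(input_lag_ms, speed_units_per_ms=1.0):
    """Offset between the finger and what is drawn, per the model above:
    the display shows where the finger was `input_lag_ms` milliseconds ago."""
    return input_lag_ms * speed_units_per_ms

# Same 60 Hz display, different input lag:
print(displayed_offset(50))  # 50 units behind the finger
print(displayed_offset(1))   # 1 unit behind the finger
```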

Q.E.D.
 
  • Like
Reactions: 3 people

Wizerzak

Because I'm a potato!
Member
Joined
May 30, 2010
Messages
2,784
Trophies
1
Age
27
Location
United Kingdom
XP
873
Country
1. The human eye is not limited to arbitrary numbers like "72 FPS". We each perceive motion dynamically and differently, and our eyes compensate in different ways depending on what we're seeing. LINK

2. Just because the display can't refresh faster than 60Hz doesn't mean there won't be less lag. The lag we see is INPUT lag, not refresh lag.

EXAMPLE
In this test, for demonstration purposes, a finger moves along a single axis at 1 unit per millisecond. The lag therefore shows up as a 1 unit offset per millisecond of input lag at each output update, as shown in figure A. With a 50ms lag display, there is a 50 unit offset between what is happening and what is actually displayed; with a 1ms lag display, only a 1 unit offset.

--snip--
Figure A: I destroy your argument with math.

As you can see, this technique of removing input lag is mostly unaffected by display refresh rate: the output is dramatically improved even on the same display. The results would be smoother still with a higher refresh rate, of course, but the refresh rate does not have to increase to see the fruits of this advance.

Q.E.D.

Very nicely put. I was in the process of replying to Foxi4 along these lines, but you've explained it much better (so I deleted what I'd written).

That's a very nice piece of technology they've been developing - now artists will be able to draw properly on a tablet without all that lag.
 

Foxi4

Endless Trash
Global Moderator
Joined
Sep 13, 2009
Messages
30,818
Trophies
3
Location
Gaming Grotto
XP
29,789
Country
Poland
1. The human eye is not limited to arbitrary numbers like "72 FPS". We each perceive motion dynamically and differently, and our eyes compensate in different ways depending on what we're seeing. LINK
72 is an approximation. Obviously people perceive the world differently - they have *different sets of eyes*. It also depends on the circumstances, level of tiredness, and so on.

2. Just because the display can't refresh faster than 60Hz doesn't mean there won't be less lag. The lag we see is INPUT lag, not refresh lag.

Figure A: I destroy your argument with math.
What I just said directly above your post: "I can see what they're doing - they're trying to register more waypoints as the touchpoint moves and display the whole collection at the next possible frame, creating a better, more precise approximation of the route your finger traveled. That can indeed be done, but I would like to see it working in an actual rendered environment rather than just as a white box floating around - then we'll see if their new method, whatever it is, is usable in today's systems."
Figure B: Reading is hard.
 

epicCreations.or

Well-Known Member
Member
Joined
Mar 13, 2010
Messages
356
Trophies
0
Location
Austin, TX
Website
whalecakes.com
XP
79
Country
United States
The way you argued your point created the impression that accuracy would not drastically improve on the same display with this new setup, as in the following quote:
That is correct, but to achieve this kind of precision one would have to refresh at 1000 FPS, which is fine and dandy when all you display is a white square, but not so much when you have to render a 3D scene.

The FPS of the program does not need to be raised! Whether it's a white box or a graphics-heavy 3D game, the lag would be massively reduced and the display greatly improved, because you only need to grab the input data when you actually use it in a frame.

I would like to see it working in an actual rendered environment rather then just as a white box floating around - then we'll see if their new method, whatever it is, is usable in today's systems.

Because of the way the technology is used and implemented, it is unnecessary to see it in a rendered environment to verify that it is useful - it will be an improvement (in responsiveness and accuracy) no matter what!
 
  • Like
Reactions: 2 people

Foxi4

Endless Trash
Global Moderator
Joined
Sep 13, 2009
Messages
30,818
Trophies
3
Location
Gaming Grotto
XP
29,789
Country
Poland
I suppose you're correct in that regard; I didn't think of it that way. I was just wondering whether it would be wasteful if the latency is way, way faster than what the system is actually rendering, and how it would impact a program in a real-life situation - that's why I'd like to see it implemented in a familiar system, to see the difference somewhat first-hand. After all, once it's implemented you would be reading a whole lot of values that most of the time (when not drawing or working in applications that require precise pointing) aren't even necessary.

Obviously it's better if the touchscreen responds faster. :)

EDIT: It's hard for me to put in words what I have in mind so I'll elaborate.

When drawing, you need numerous waypoints that will be connected, so I can see how it improves responsiveness there. When moving an object, though, the object is only redrawn 60 or so times a second (for the human eye, of course - it moves continuously in the background, it's just not displayed), once each cycle, at the corresponding position. 940 registered positions are discarded as useless unless you program in a degree of offset yourself. There, that sounds better.
 

epicCreations.or

Well-Known Member
Member
Joined
Mar 13, 2010
Messages
356
Trophies
0
Location
Austin, TX
Website
whalecakes.com
XP
79
Country
United States
I agree - there would be a lot of superfluous information, but that's just a result of how the technology works. I don't see it as a bad thing per se, but as a window onto new opportunities: because more information is captured anyway, should frame rates in programs and displays increase, there is still accurate input data to rely on.
 
  • Like
Reactions: 2 people

Foxi4

Endless Trash
Global Moderator
Joined
Sep 13, 2009
Messages
30,818
Trophies
3
Location
Gaming Grotto
XP
29,789
Country
Poland
I agree - there would be a lot of superfluous information, but that's just a result of how the technology works. I don't see it as a bad thing per se, but as a window onto new opportunities: because more information is captured anyway, should frame rates in programs and displays increase, there is still accurate input data to rely on.
Yeah, you're right. I was just focusing on lower-spec systems, such as handheld consoles, on which a whole lot of unused variables would not necessarily be welcome. I mean, 1000 refreshes a second introduces 2000 integers just for touchscreen input (an X and a Y per sample), and it's unlikely a developer would use all of them. This is obviously a big advancement; it's just not for every platform, per se.

A simple answer would be a way to query the touchscreen for input data manually, depending on the circumstances, and put it to sleep when input is not needed - then a latency of 1ms would be a godsend: input data exactly when you need it, stored in a handful of variables (the amount depends on the offset - it could even be two if you don't want any offset at all). :)
 

epicCreations.or

Well-Known Member
Member
Joined
Mar 13, 2010
Messages
356
Trophies
0
Location
Austin, TX
Website
whalecakes.com
XP
79
Country
United States
Hmm. I don't really know how the input protocols work, so I'm going to take some information from this page and hope it applies here. Depending on how things actually work, I imagine it could compensate for this. With device-initiated input, samples would probably be sent continuously in packets, and the host would decide what to use (e.g. the most recent coordinates during a frame). With host-initiated requests, the device would just send over the most recent data when asked (e.g. once per frame). Either way, only the data that is needed gets used; the rest is discarded or never sent in the first place.

Again, I don't know how this all works, so I can only imagine it works in one of those two ways.
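Both flows can be sketched roughly like this (a toy model in Python - the class and method names are made up for illustration, not a real driver or USB HID API):

```python
from collections import deque

class ToyTouchDevice:
    """Hypothetical touch device supporting both input flows."""
    def __init__(self):
        self.latest = None      # most recent (x, y) sample
        self.stream = deque()   # device-initiated: everything sent so far

    def push_sample(self, xy):
        """Device-initiated flow: samples arrive continuously (~1 per ms)."""
        self.latest = xy
        self.stream.append(xy)

    def poll_latest(self):
        """Host-initiated flow: host asks once per frame, gets newest data."""
        return self.latest

device = ToyTouchDevice()
for t in range(17):                 # roughly one 60 Hz frame of 1 ms samples
    device.push_sample((t, t))

# Host-initiated: one request at frame time; only the newest coordinate is used.
frame_input = device.poll_latest()

# Device-initiated: the host drains the stream and discards what it doesn't need.
all_samples = list(device.stream)   # a renderer might keep just the last entry
```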
 

Foxi4

Endless Trash
Global Moderator
Joined
Sep 13, 2009
Messages
30,818
Trophies
3
Location
Gaming Grotto
XP
29,789
Country
Poland
Yeah, I'm guessing it just refreshes two values and you save whichever ones you want on some sort of interval... unless Microsoft ships a stupid driver for it that registers way too much info trying to suit everyone's needs.
 
