* Minimal jitter = good desktop.
From: Uwaysi Bin Kareem @ 2012-10-06  0:49 UTC
  To: linux-kernel

Reducing jitter seems central for many things.
First of all, keypresses seem faster (less jitter = less latency).
Doom 3 and similar jitter-sensitive OpenGL applications run smoothly, and
better than on Windows. Doom 3 was also my main app to get running well, and
measuring jitter in the signal path of OpenGL improved the overall
computing experience.

Even YouTube videos that are not synced to refresh, with a refresh rate of
60 Hz and a video frame rate of 30 fps, run quite well.
http://paradoxuncreated.com/Blog/wordpress/?p=3221

The system is responsive, and I have had only one problem: packing does
not seem to work at the moment.

For a fast test in Ubuntu, try  
http://paradoxuncreated.com/Blog/wordpress/?p=2268

Definitely a recommended config on the desktop.

Peace Be With You.


* Re: Minimal jitter = good desktop.
From: david @ 2012-10-06  1:06 UTC
  To: Uwaysi Bin Kareem; +Cc: linux-kernel

less jitter != less latency

you could (in theory) eliminate jitter by delaying every keypress
processed for exactly 1 second, by having the code paths that would
process keypresses faster insert delays before delivering the results.

that would result in zero jitter, but horrific latency.

latency is how long it takes to do something, jitter is how much the 
latency varies.

normally, if you optimize for one you make the other worse.

If you optimize for latency, you try to finish everything as soon as you 
can. Since some things take longer than others, jitter increases.
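To put numbers on the distinction, here is a minimal user-space sketch
(illustrative only; the latency samples are made-up microsecond values):

/* Illustration: latency is the per-event delay, jitter is its spread.
 * The sample values are hypothetical, not measurements. Build with -lm. */
#include <stdio.h>
#include <math.h>

int main(void)
{
	double samples_us[] = { 180.0, 210.0, 195.0, 1200.0, 205.0, 190.0 };
	int i, n = sizeof(samples_us) / sizeof(samples_us[0]);
	double sum = 0.0, min = samples_us[0], max = samples_us[0];
	double mean, var = 0.0;

	for (i = 0; i < n; i++) {
		sum += samples_us[i];
		if (samples_us[i] < min) min = samples_us[i];
		if (samples_us[i] > max) max = samples_us[i];
	}
	mean = sum / n;				/* average latency */

	for (i = 0; i < n; i++)
		var += (samples_us[i] - mean) * (samples_us[i] - mean);

	printf("latency: mean %.1f us; jitter: stddev %.1f us, peak-to-peak %.1f us\n",
	       mean, sqrt(var / n), max - min);
	return 0;
}

Padding every sample up to the worst case would drive the spread to zero
while making the mean worse, which is exactly the trade-off described above.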

David Lang

On Sat, 6 Oct 2012, Uwaysi Bin Kareem wrote:

> Reducing jitter seems central for many things.
> First of all, keypresses seem faster (less jitter = less latency).
> Doom 3 and similar jitter-sensitive OpenGL applications run smoothly, and
> better than on Windows. Doom 3 was also my main app to get running well, and
> measuring jitter in the signal path of OpenGL improved the overall computing
> experience.
>
> Even YouTube videos that are not synced to refresh, with a refresh rate of
> 60 Hz and a video frame rate of 30 fps, run quite well.
> http://paradoxuncreated.com/Blog/wordpress/?p=3221
>
> The system is responsive, and I have had only one problem: packing does
> not seem to work at the moment.
>
> For a fast test in Ubuntu, try
> http://paradoxuncreated.com/Blog/wordpress/?p=2268
>
> Definitely a recommended config on the desktop.
>
> Peace Be With You.


* Re: Minimal jitter = good desktop.
From: Uwaysi Bin Kareem @ 2012-10-06 13:19 UTC
  To: david; +Cc: linux-kernel

In the context of os-jitter, delay/latency is measured as jitter.

Peace Be With You.

On Sat, 06 Oct 2012 03:06:57 +0200, <david@lang.hm> wrote:

> less jitter != less latency
>
> you could (in theory) eliminate jitter by delaying every keypress
> processed for exactly 1 second, by having the code paths that would
> process keypresses faster insert delays before delivering the results.
>
> that would result in zero jitter, but horrific latency.
>
> latency is how long it takes to do something, jitter is how much the  
> latency varies.
>
> normally, if you optimize for one you make the other worse.
>
> If you optimize for latency, you try to finish everything as soon as you  
> can. Since some things take longer than others, jitter increases.
>
> David Lang
>
> On Sat, 6 Oct 2012, Uwaysi Bin Kareem wrote:
>
>> Reducing jitter seems central for many things.
>> First of all, keypresses seem faster (less jitter = less latency).
>> Doom 3 and similar jitter-sensitive OpenGL applications run smoothly,
>> and better than on Windows. Doom 3 was also my main app to get running
>> well, and measuring jitter in the signal path of OpenGL improved the
>> overall computing experience.
>>
>> Even YouTube videos that are not synced to refresh, with a refresh rate
>> of 60 Hz and a video frame rate of 30 fps, run quite well.
>> http://paradoxuncreated.com/Blog/wordpress/?p=3221
>>
>> The system is responsive, and I have had only one problem: packing does
>> not seem to work at the moment.
>>
>> For a fast test in Ubuntu, try
>> http://paradoxuncreated.com/Blog/wordpress/?p=2268
>>
>> Definitely a recommended config on the desktop.
>>
>> Peace Be With You.


* Re: Minimal jitter = good desktop.
From: el_es @ 2012-10-06 14:53 UTC
  To: linux-kernel

Uwaysi Bin Kareem <uwaysi.bin.kareem <at> paradoxuncreated.com> writes:

[sorry for cutting out the context, but it's been top-posted]

But the problem is, we cannot measure 'jitter' directly.
There is no reliable benchmark that produces results matching
whatever someone's definition of 'jitter' is.

At the software level we only have a notion of latency, and that
leads to jitter, as David said; but as the kernel is not real-time,
you cannot guarantee that every OpenGL command/fb transfer will be finished
in time for the next frame to be drawn.

Maybe if someone could obtain the percentage of finished frames
(or dropped frames) within one slice of userspace, that would
be something to build on, but it's still a derivative measure with
an unknown bias.
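As a rough sketch of what such a counter could look like in userspace (the
60 Hz budget and the render_frame() placeholder are assumptions, not part
of any existing interface):

/* Sketch: count frames finished within the refresh budget vs. frames that
 * miss it. FRAME_BUDGET_NS (60 Hz) and render_frame() are placeholders for
 * a real render loop. Build with -lrt on older glibc. */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <time.h>

#define FRAME_BUDGET_NS 16666667LL		/* ~1/60 s */

static long long now_ns(void)
{
	struct timespec ts;

	clock_gettime(CLOCK_MONOTONIC, &ts);
	return (long long)ts.tv_sec * 1000000000LL + ts.tv_nsec;
}

static void render_frame(void)
{
	struct timespec d = { 0, 12000000 };	/* pretend 12 ms of work */

	nanosleep(&d, NULL);			/* stands in for scene update + GL calls */
}

int main(void)
{
	int i, finished = 0, missed = 0;

	for (i = 0; i < 300; i++) {		/* about 5 seconds at 60 Hz */
		long long start = now_ns();

		render_frame();
		if (now_ns() - start <= FRAME_BUDGET_NS)
			finished++;
		else
			missed++;
	}
	printf("finished %d (%.1f%%), missed %d\n",
	       finished, 100.0 * finished / (finished + missed), missed);
	return 0;
}

Even this only observes the symptom after the fact, which is the
'derivative with unknown bias' problem.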

Lukasz



* Re: Minimal jitter = good desktop.
From: Uwaysi Bin Kareem @ 2012-10-06 18:02 UTC
  To: el_es

This is really simple, and I don't care about top posting, down posting,
in-the-middle comments or whatever. Do whatever you like, and have no
other rule than what is in your soul. That is ultimately what drives
society. Look up Aristotle's natural law, which is actually based in
divine nature.

Now, jitter is really easy. Jitter-sensitive OpenGL applications will show
visible jitter. Doom 3 is extremely sensitive. I have tried to make it run
well many times, but it wasn't until I became aware of some settings
behaving unintuitively, contrary to theory, and started reversing them,
that I found 90 Hz to be optimal, giving a perfectly running Doom 3.
Someone actually suggested I try the 10000 Hz BFS patch, because "it would
reduce latency." Which I did. But then I also tried 20 Hz, and there was
little difference on BFS. Ultimately I arrived at 90 Hz with CFS, tweaking
its granularity a bit, and it worked well (better than BFS). So in that
case, JITTER is solved. Also, a lot of low-jitter configs use a low HZ, so
that seems to back it up. And everything on my computer seems to be running
better. Smoother, more responsive. Even the ads in my browser ;(
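For reference, the granularity tweak above refers to the CFS knobs; a
minimal sketch of setting them at runtime, assuming a distro kernel of this
era with CONFIG_SCHED_DEBUG (the values are arbitrary examples, and the
90 Hz timer itself is a compile-time CONFIG_HZ choice, not settable here):

/* Sketch only: adjust the CFS granularity sysctls exposed by 3.x-era
 * kernels under /proc/sys/kernel. Values are examples; run as root. */
#include <stdio.h>

static int write_sysctl(const char *path, const char *val)
{
	FILE *f = fopen(path, "w");

	if (!f) {
		perror(path);
		return -1;
	}
	fprintf(f, "%s\n", val);
	fclose(f);
	return 0;
}

int main(void)
{
	write_sysctl("/proc/sys/kernel/sched_min_granularity_ns", "3000000");
	write_sysctl("/proc/sys/kernel/sched_wakeup_granularity_ns", "4000000");
	return 0;
}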

I also appreciate those who can measure small jitter in the microsecond
range and mitigate it. But I would also like them to check whether a simple
hold logic would be better, for the 10 ms filter I mentioned: say, hold at
0 for 1 ms, and then go to regular peak values. It seems that would be a
better filter. This is just me being a perfectionist, of course.

So yes, according to the general definition of "os-jitter" it seems highly  
reduced.

I don't know at all why you are mentioning OpenGL calls. Obviously games
do run quite well, at least now. It is also going to be great to test new
games as they come, and keep iterating knowledge and tuning. Also, of
course, OpenGL is a great part of Wayland, and I hear more hardware is used
there; hopefully it doesn't hurt performance in games, so one can have an
effect-rich desktop without worrying about game performance. Some of the
GUI in Doom 3, running completely smoothly, shows some great potential for
GUI ideas as well :)

Peace Be With You.

On Sat, 06 Oct 2012 16:53:16 +0200, el_es <el.es.cr@gmail.com> wrote:

> Uwaysi Bin Kareem <uwaysi.bin.kareem <at> paradoxuncreated.com> writes:
>
> [sorry for cutting out the context, but it's been top-posted]
>
> But the problem is, we cannot measure 'jitter' directly.
> There is no reliable benchmark that produces results matching
> whatever someone's definition of 'jitter' is.
>
> At the software level we only have a notion of latency, and that
> leads to jitter, as David said; but as the kernel is not real-time,
> you cannot guarantee that every OpenGL command/fb transfer will be finished
> in time for the next frame to be drawn.
>
> Maybe if someone could obtain the percentage of finished frames
> (or dropped frames) within one slice of userspace, that would
> be something to build on, but it's still a derivative measure with
> an unknown bias.
>
> Lukasz
>


* Re: Minimal jitter = good desktop.
From: el_es @ 2012-10-07  1:15 UTC
  To: linux-kernel

Uwaysi Bin Kareem <uwaysi.bin.kareem <at> paradoxuncreated.com> writes:

> 
> This is really simple, and I don't care about top posting, down posting,
> in-the-middle comments or whatever. Do whatever you like, and have no
> other rule than what is in your soul. That is ultimately what drives
> society. Look up Aristotle's natural law, which is actually based in
> divine nature.

The natural law on most mailing lists is: avoid top-posting and actively
discourage it. That includes LKML.

> 
> Now, jitter is really easy. Jitter-sensitive OpenGL applications will show
> visible jitter.
The question isn't how you 'see' or 'feel' it; the question is how you give
representative hard numbers for it.

> Doom 3 is extremely sensitive. I have tried to make it run
> well many times, but it wasn't until I became aware of some settings
> behaving unintuitively, contrary to theory, and started reversing them,
> that I found 90 Hz to be optimal, giving a perfectly running Doom 3.

> Someone actually suggested I try the 10000 Hz BFS
> patch, because "it would reduce latency." Which I did. But then I also
> tried 20 Hz, and there was little difference on BFS. Ultimately I arrived
> at 90 Hz with CFS, tweaking its granularity a bit, and it worked well
> (better than BFS). So in that case, JITTER is solved. Also, a lot of
> low-jitter configs use a low HZ, so that seems to back it up. And everything
> on my computer seems to be running better. Smoother, more responsive. Even
> the ads in my browser ;(

Can you guarantee the same will apply to all hardware combinations?
What if it doesn't work for, e.g., someone with a 4-core i5 and an ATI GPU?

> 
> I also appreciate those who can measure small jitter in the microsecond
> range and mitigate it. But I would also like them to check whether a simple
> hold logic would be better, for the 10 ms filter I mentioned: say, hold at
> 0 for 1 ms, and then go to regular peak values. It seems that would be a
> better filter. This is just me being a perfectionist, of course.

I know how filters work, thank you very much.
But with the same outcome one could argue that someone could put a
regulator there instead, to auto-tune to a chosen benchmark;
maybe my personal favorite, GPC?
But since you cannot measure jitter, you can't minimize it automatically...
or can you?
Since we're in control theory, what we have accessible as 'measurements'
is either a derivative of jitter, or jitter derives from it, with the bias
unknown. IOW, nice to look at, but worthless.
So the goal, as far as I can tell from reading LKML, is not to minimize the
latency, but rather to make it as precise as possible.

> So yes, according to the general definition of "os-jitter" it seems highly  
> reduced.
> 
> I don't know at all why you are mentioning OpenGL calls. Obviously games
> do run quite well, at least now. It is also going to be great to test new
> games as they come, and keep iterating knowledge and tuning.


My thought experiment is as follows:
Imagine all the layers of an OpenGL-aware application.
A thread or a few calculate the scene (spatial, physics, network
threads), plus one to govern them all. This is scheduled inside the
application itself; the kernel only sees one process.
The scene-composition thread produces new frames as fast as it
can run (or, if synchronized to vsync, produces just one and then
waits).
The application competes for execution time with other applications.
A thread (or a few) receives data from the app itself, works out what
calls it should make, and calls glibc/GL/kernel functions. This is still
in userspace and competes with the application itself (say it's the X
server).
A thread or a few in kernel space receive calls from userspace and
translate them into calls into the hardware (the kernel driver).
This competes for execution time with other kernel-level threads, the
scheduler, and other kernel services.
Now the pipeline in the GPU receives the data and manipulates
GPU memory. Say we can generate 2 to 4 frames in between the GPU's internal
vsyncs (an exaggeration here), so the data in GPU memory changes 2 to 4
times before the video buffer swap occurs and the output(s) start pushing
the buffer content into the monitor bitstream.
Say we have comfortably enough time to push 2 scene updates, but
the third is dubious: not all calls to the GPU arrive before vsync HAS TO
happen. The pipeline will not be reset; and if, in the meantime, the OS
scheduler decides something else should be run instead of the application
or the X server, the application will learn of this only when its next
slice comes. But then it will try to finish the remaining calls, which only
adds insult to injury, because it will push the remaining calls for the
scene that was delayed because the caller was preempted during vsync,
instead of recalculating the whole scene with the most recent timing.

If we could predict that the required calls will NOT all make it into the
GPU in time for the 3rd frame update, we just wouldn't run it; but we can't
predict that. This is what I would call a 'graphics buffer underrun'.
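Purely as a sketch of that "just wouldn't run it" decision, assuming
(contrary to the argument above) that both the time left before vsync and
the cost of the next update were knowable in advance:

/* Sketch of skipping a scene update that would miss vsync anyway. Both
 * numbers are assumptions; the point of the argument above is that in
 * reality neither is knowable in advance. */
#include <stdio.h>

struct frame_budget {
	long long ns_until_vsync;	/* time left before the buffer swap */
	long long est_update_ns;	/* estimated cost of one more scene update */
};

static int should_run_update(const struct frame_budget *fb)
{
	/* start another scene update only if it would land before vsync */
	return fb->est_update_ns <= fb->ns_until_vsync;
}

int main(void)
{
	struct frame_budget fb = { 4000000LL, 6000000LL };

	if (should_run_update(&fb))
		printf("run the third scene update\n");
	else
		printf("skip it and recalculate next frame instead\n");
	return 0;
}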

> 
> Peace Be With You.

Hope it's with you too.

Lukasz



* Re: Minimal jitter = good desktop.
From: Uwaysi Bin Kareem @ 2012-10-07 20:28 UTC
  To: linux-kernel

I also compiled a 3.6.1 kernel with a local shave (only the components I
want) plus my low-jitter config and tweaks (most notably the 90 Hz timer,
which is where many go wrong).
Quake 2, with the software renderer, in Wine, went from 15-30 fps, to 60 fps
with some jitter on the full distro low-jitter kernel, to perfect on the
locally shaved one.

Wine indeed also seems to be extremely jitter-sensitive.

So low jitter seems to be good for many things. Stopping
Ubuntu daemons was also necessary.

Peace Be With You.

