
Re: GtkGlArea render fps is different than monitor frame rate

I extracted a little project to be able to show what I am measuring:


The code is copied together from several places, so apologies for the quality... I hope it compiles on other boxes, too...

It builds two binaries, egl-demo and gtk-demo -- one creates an OpenGL context directly with Xlib and EGL, the other with GtkGLArea.

My monitor has a 60.06Hz screen refresh:

$ xrandr --verbose
Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767
eDP1 connected primary 1920x1080+0+0 (0x6f) normal (normal left inverted right x axis y axis) 344mm x 193mm
  1920x1080 (0x6f) 140.000MHz +HSync -VSync *current +preferred
        h: width  1920 start 1968 end 2068 total 2100 skew    0 clock  66.67KHz
        v: height 1080 start 1083 end 1084 total 1110           clock  60.06Hz

The program prints some timing info; all values are in microseconds. I measure the average, min, and max rendering time, and more importantly the average, min, and max frame time, i.e., the wall-clock time between two consecutive runs of my render callback function.

On my machine, with the egl-demo, I have strict 60.06 FPS:

Full avg compute time: 433
1 sec avg compute time: 451
Min compute time: 388
Max compute time: 496
Full avg FPS: 60.06
1 sec FPS: 60.06
Min frame time: 16460
Max frame time: 16846

With gtk-demo, the numbers are not so nice:

Full avg compute time: 428
1 sec avg compute time: 456
Min compute time: 258
Max compute time: 2107
Full avg FPS: 59.03
1 sec FPS: 58.90
Min frame time: 15116
Max frame time: 19014

Although the jitter is visibly bigger with GTK, the max frame time suggests there are presumably no dropped frames in the classical sense (halved refresh rate, 30 FPS on average); in that case I would expect much larger max frame times, at least above 24000 usecs.

So my conclusion is that on my machine GTK does not sync to vsync but to some other timer.

Again: Ubuntu 16.04 with a more-or-less out-of-the-box X config: libgtk 3.18.9-1ubuntu3.1, unity 7.4.0+16.04.20160906-0ubuntu1, compiz, X.org 1.18.4. I am not familiar with X internals, but maybe these lines from the log are relevant:

[   118.087] (II) intel(0): Using Kernel Mode Setting driver: i915, version 1.6.0 20151010
[   118.092] (--) intel(0): Integrated Graphics Chipset: Intel(R) HD Graphics 4600

I would be happy if this issue turned out to uncover a real bug, so that I've contributed something to the community. :)

Background info:

I am working on a project that needs fluid scrolling of a background, and the GTK version has clearly visible, annoying glitches. This is why I also need the displayed-frame counter: in special cases I can compute the scrolling (shift) offset from it instead of from the wall clock, to get a smoother result. Indeed, it seems GtkGLArea is not suitable for this, although I don't understand why; I don't see why it should be worse than X11+EGL...
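To make the idea concrete, the kind of computation I mean is the following (a sketch; speed_px and wrap_px are illustrative parameters, not names from my actual code):

```c
#include <stdint.h>

/* Sketch: horizontal scroll offset derived from a displayed-frame
 * counter instead of wall-clock time. speed_px is pixels scrolled per
 * displayed frame, wrap_px the width at which the background repeats.
 * As long as the counter advances by exactly one per displayed frame,
 * the scrolling is perfectly even regardless of timer jitter. */
static int64_t scroll_offset_px(int64_t frame_counter,
                                int64_t speed_px, int64_t wrap_px)
{
    return (frame_counter * speed_px) % wrap_px;
}
```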

Emmanuele, I have checked GdkFrameClock, and it does not give the info I need; in particular, if I stop the process and later resume it, there is no jump in the gdk_frame_clock_get_frame_counter() result. :(
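A possible fallback I'm considering (just a sketch, not any GDK API): estimating the displayed-frame index from a monotonic timestamp and the refresh rate reported by xrandr. Unlike gdk_frame_clock_get_frame_counter(), this value does jump forward when the process is stopped and resumed, because the clock keeps running:

```c
#include <stdint.h>

/* Sketch: estimate which display frame we should be on, given a
 * monotonic timestamp (microseconds), the timestamp of the first
 * frame, and the refresh rate (e.g. 60.06 from xrandr). */
static int64_t estimated_frame_index(int64_t now_us, int64_t start_us,
                                     double refresh_hz)
{
    return (int64_t)((double)(now_us - start_us) * refresh_hz / 1e6);
}
```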


On Sun, Feb 5, 2017 at 10:00 AM, pelzflorian (Florian Pelz) <pelzflorian@xxxxxxxxxxxxxx> wrote:
On 02/05/2017 12:32 AM, Emmanuele Bassi wrote:
> If you need to access things like DRM data and deeper timing
> information, then I strongly suspect you should not be using GTK+ at
> all, since it seem you're writing something like a game.

Are you saying GTK+ is not suitable when one needs libdrm access or are
you saying GTK+ is not suitable when writing 3D games? I thought GTK+
could be used for the UI and what it does is sane even for games. Am I
wrong? Is this a bad idea? Well, maybe there are issues if you want
stereoscopy or such stuff…

gtk-devel-list mailing list