Jul 17, 2007

The idea of using OpenGL as a future core rendering architecture for Gtk+ (and variations thereof) has been brought up a couple of times at GUADEC. However, there are good reasons to avoid that, and the suggested approaches have major issues. In particular, the following points need to be considered:

1) For a library as portable and as widely used as Gtk+ is these days, a dependency on OpenGL could only ever be a weak one. That is, OpenGL features may or may not be used, depending on whether the platform a Gtk+ application currently runs on actually has OpenGL available (e.g. widget animations would be carried out only when acceleration support is present).
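
To illustrate what such a weak dependency could look like in code, here is a minimal, self-contained sketch; have_gl_acceleration() and the two unfold helpers are purely hypothetical placeholders, not existing Gdk/Gtk+ API:

    #include <glib.h>

    /* Hypothetical placeholders -- stand-ins for a real platform probe and
     * real animation code; none of these are existing Gdk/Gtk+ functions. */
    static gboolean have_gl_acceleration (void)  { return FALSE; }
    static void     unfold_with_animation (void) { g_print ("animated unfold\n"); }
    static void     unfold_immediately (void)    { g_print ("instant unfold\n"); }

    int
    main (void)
    {
      /* OpenGL-backed behavior is used only if the platform provides it;
       * without acceleration the widget still works, just without eye candy. */
      if (have_gl_acceleration ())
        unfold_with_animation ();
      else
        unfold_immediately ();
      return 0;
    }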

2) The OpenGL 2D drawing API is effectively unusable for Gdk/Gtk+ drawing primitives. The main problem is that OpenGL doesn't provide pixel-perfect 2D drawing operations, which are necessary for accurate input event processing and a coherent visual presentation (nor does its 2D drawing API always provide anti-aliasing). Here is a very good web page with nice screenshots summarizing the problems with OpenGL pixel-perfectness: OpenGL: “not pixel exact”, Hardware AntiAliasing.
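
To make the pixel-exactness requirement concrete, here is a small standalone cairo snippet (just an illustration, not Gdk/Gtk+ code) drawing the kind of deterministic 1-pixel border that toolkit rendering and hit-testing rely on:

    #include <cairo.h>

    int
    main (void)
    {
      cairo_surface_t *surface =
        cairo_image_surface_create (CAIRO_FORMAT_ARGB32, 32, 32);
      cairo_t *cr = cairo_create (surface);

      /* A 1px stroke centered on an integer coordinate would be smeared
       * across two half-covered rows/columns; the 0.5 offsets center it on
       * pixel centers, so the border covers exactly one pixel everywhere. */
      cairo_set_line_width (cr, 1.0);
      cairo_rectangle (cr, 0.5, 0.5, 31.0, 31.0);
      cairo_stroke (cr);

      cairo_surface_write_to_png (surface, "border.png");
      cairo_destroy (cr);
      cairo_surface_destroy (surface);
      return 0;
    }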

3) By using XRENDER and hardware-accelerated X drivers, Cairo is already being performance-optimized to utilize hardware acceleration. Trying to use a portable OpenGL subset instead (pixel shaders / triangle rendering) would be fairly pointless; it'd effectively be using the same portable hardware acceleration facilities through another layer of indirection. So with more and more Gtk+-based platforms/applications moving to Cairo-based drawing, no additional infrastructure or support code is needed to make use of the available hardware acceleration facilities. Essentially, the portably usable hardware acceleration subset is brought to you automatically through Cairo and X.
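
For completeness, this is roughly what Cairo-based drawing looks like from Gtk+ 2.x application code; nothing in it is OpenGL- or driver-specific, and whether the requests end up hardware-accelerated via XRENDER is decided by the cairo backend and the X driver, not by the application:

    #include <gtk/gtk.h>

    /* Draw through cairo; the cairo xlib backend turns this into XRENDER
     * requests, which the X driver may or may not accelerate in hardware. */
    static gboolean
    on_expose (GtkWidget *widget, GdkEventExpose *event, gpointer data)
    {
      cairo_t *cr = gdk_cairo_create (widget->window);

      gdk_cairo_region (cr, event->region);   /* clip to the exposed area */
      cairo_clip (cr);

      cairo_set_source_rgb (cr, 0.2, 0.4, 0.8);
      cairo_rectangle (cr, 10, 10, 100, 60);
      cairo_fill (cr);

      cairo_destroy (cr);
      return FALSE;
    }

    int
    main (int argc, char **argv)
    {
      gtk_init (&argc, &argv);
      GtkWidget *window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
      GtkWidget *area = gtk_drawing_area_new ();
      gtk_container_add (GTK_CONTAINER (window), area);
      g_signal_connect (area, "expose-event", G_CALLBACK (on_expose), NULL);
      g_signal_connect (window, "destroy", G_CALLBACK (gtk_main_quit), NULL);
      gtk_widget_show_all (window);
      gtk_main ();
      return 0;
    }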

For the lazy, here’s a quick overview of the artifacts presented on the OpenGL comparison page:




  10 Responses to “17.07.2007 OpenGL for Gdk/Gtk+”

  1. This is not an OpenGL-only issue; similar antialiasing artifacts are present in different X server implementations that use hardware antialiasing. You're just noticing that the various graphics chip makers don't all implement the identical antialiasing algorithm (patent issues at work). If you want pixel-reproducible antialiasing, it all has to be done in software on any graphics system.

    The bigger question is: do you really need pixel-level, reproducible antialiasing? This is a real question, since LCDs and CRTs don't want the same type of antialiasing.

    Go read Carl Worth's blog posts about how the best XRender (EXA) implementation is still half the speed of using no acceleration at all. Hundreds of man-years have gone into optimizing OpenGL. It will take a lot of effort before EXA implementations are as fast as OpenGL, and on top of that OpenGL is a moving target.

    More important to me is whether the interface scales from VGA to 3K x 2K screens without the drawing messing up. Read this about scalability.

  2. Question: wasn't it the case that Cairo uses hardware renderers only when explicitly set, not automatically?

  3. While I can appreciate the idea of pixel-perfect output, I can't help but feel that if your UI requires pixel-perfect input/output (with the pixel densities of displays these days), you have much bigger problems than just that. Similarly, if the UI design is sane enough, not being pixel-perfect wouldn't really make much of a difference.

  4. The page you linked to seemed to promote OpenGL rendering using Texture AA: http://homepage.mac.com/arekkusu/bugs/invariance/TexAA.html. And I haven’t noticed any quality difference using Quartz 2D Extreme with OS X.

  5. […] really interesting posts on planet gnome (off the record): OpenGL for Gdk/Gtk+ and (in some way his response) Widget skeletons, GPU […]

  6. […] skeletons, GPU theming I agree 100% with Tim that using OpenGL directly as a rendering API for GTK+ makes no sense. The OpenGL 2D API is […]

  7. […] post continues, in some way, a post written by Tim Janik on his blog, about OpenGL for Gdk/Gtk+, that summarizes some of the problems with using OpenGL with […]

  8. The way people usually use GL with 2D APIs is to composite/transform 2D layers. For example the browsers using GL to render HTML do that, and many Clutter apps are effectively that. Whenever you have a non-identity transformation on a subtree of the widget tree, you give that subtree its own buffer to draw to, so you can use GL for the transformation. You composite all those layers in hardware.

    This way you can get fast animation (as long as the animation is in terms of transforms on widgets, which it usually can be).

    Just use GL to push a graph of raster images to the graphics card, composite the images in hardware, and transform those images during animations. Don’t use GL to render the 2D graphics within the layers.
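
    A rough fixed-function GL sketch of this layering approach, assuming a current GL context, power-of-two layer sizes (or NPOT support) and a widget subtree already rasterized into an RGBA buffer; the function names are illustrative, not Gtk+ or Clutter API:

        #include <GL/gl.h>

        /* Upload a pre-rendered layer (e.g. a widget subtree drawn with cairo)
         * into a texture; re-uploading is only needed when its content changes. */
        static GLuint
        upload_layer (const unsigned char *rgba, int width, int height)
        {
          GLuint tex;
          glGenTextures (1, &tex);
          glBindTexture (GL_TEXTURE_2D, tex);
          glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
          glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
          glTexImage2D (GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                        GL_RGBA, GL_UNSIGNED_BYTE, rgba);
          return tex;
        }

        /* Composite the layer each frame; during an animation only the
         * transform changes, so the expensive 2D rendering is not redone. */
        static void
        draw_layer (GLuint tex, float x, float y, float w, float h, float angle)
        {
          glEnable (GL_TEXTURE_2D);
          glBindTexture (GL_TEXTURE_2D, tex);
          glPushMatrix ();
          glTranslatef (x + w / 2, y + h / 2, 0);
          glRotatef (angle, 0, 0, 1);            /* hardware applies the transform */
          glBegin (GL_QUADS);
          glTexCoord2f (0, 0); glVertex2f (-w / 2, -h / 2);
          glTexCoord2f (1, 0); glVertex2f ( w / 2, -h / 2);
          glTexCoord2f (1, 1); glVertex2f ( w / 2,  h / 2);
          glTexCoord2f (0, 1); glVertex2f (-w / 2,  h / 2);
          glEnd ();
          glPopMatrix ();
        }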

  9. But 2D rendering is also very important in HTML5, e.g. for canvas and SVG, and it does become a performance issue there. So using GL for 2D rendering still makes sense.
