WebKit Flatpak SDK and gst-build

This post is an annex of Phil’s Introducing the WebKit Flatpak SDK. Please make sure to read it, if you haven’t already.

To recapitulate: nowadays WebKitGtk/WPE developers (and their CI infrastructure) are moving towards a Flatpak-based environment for their workflow. This Flatpak-based environment, or Flatpak SDK for short, can be seen as a sandboxed software container which bundles all the dependencies required to compile, run and debug WebKitGtk/WPE.

In day-to-day work, this approach avoids recompiling the world while still providing reproducible builds, improving the development and testing workflow.

But what if you are also involved in the development of one dependency?

This is the case of Igalia’s multimedia team where, besides developing the multimedia features for WebKitGtk and WPE, we also participate in the GStreamer development, the framework used for multimedia.

Because of this, in our workflow we usually need to build WebKit with a fix, hack or new feature in GStreamer. Is it possible to add our custom GStreamer build to Flatpak without messing up its own GStreamer setup? Yes, it is.

gst-build is a set of Python scripts which clone the GStreamer repositories, compile them and set up an uninstalled environment. This uninstalled environment allows transient usage of the compiled framework from its build tree, avoiding installation and keeping our system clean.

The WebKit scripts that wrap the Flatpak operations can also drive the gst-build scripts to build GStreamer inside the container and, when running WebKit’s artifacts, enable the mentioned uninstalled environment, overriding Flatpak’s own GStreamer.

How do we unveil all this magic?

First of all, set up a gst-build installation as documented. This installation is where the GStreamer plumbing is done.

Later, gst-build operations through WebKit compilation scripts are enabled when the environment variable GST_BUILD_PATH is exported. This variable should point to the directory where the gst-build tree is placed.
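For instance, the export could look like this (the path is only an example; point it at your own gst-build checkout):

```shell
# Tell the WebKit scripts where the gst-build tree lives (example path)
export GST_BUILD_PATH="$HOME/gst-build"
echo "$GST_BUILD_PATH"
```

Exporting it in your shell profile saves typing it on every command, at the cost of always enabling the custom GStreamer build.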

And that’s all!

But let’s put these words into actual commands. The following workflow assumes that the WebKit repository is cloned in ~/WebKit and the gst-build tree is in ~/gst-build (please excuse my bashisms).

Compiling WebKitGtk with symbols, using LLVM as toolchain (this command will also compile GStreamer):

$ cd ~/WebKit
$ CC=clang CXX=clang++ GST_BUILD_PATH=/home/vjaquez/gst-build Tools/Scripts/build-webkit --gtk --debug

Running the generated minibrowser (remember, GST_BUILD_PATH is required again for correct linking):

$ GST_BUILD_PATH=/home/vjaquez/gst-build Tools/Scripts/run-minibrowser --gtk --debug

Running media layout tests:

$ GST_BUILD_PATH=/home/vjaquez/gst-build ./Tools/Scripts/run-webkit-tests --gtk --debug media

But wait! There’s more...

What if I want to parametrize the GStreamer compilation? Say, I would like to enable a GStreamer module or disable the build of a specific element.

gst-build, like the rest of the GStreamer modules, uses the Meson build system, so it’s possible to pass arguments to Meson through the environment variable GST_BUILD_ARGS.

For example, I would like to enable gstreamer-vaapi 😇

$ cd ~/WebKit
$ CC=clang CXX=clang++ GST_BUILD_PATH=/home/vjaquez/gst-build GST_BUILD_ARGS="-Dvaapi=enabled" Tools/Scripts/build-webkit --gtk --debug

Review of the Igalia Multimedia team Activities (2019/H2)

This blog post is a review of the various activities the Igalia Multimedia team was involved in during the second half of 2019.

Here are the previous 2018/H2 and 2019/H1 reports.


GstWPE

Succinctly, GstWPE is a GStreamer plugin which renders web pages as a video stream whose frames are GL textures.

Phil, its main author, wrote a blog post explaining in detail what GstWPE is and its possible use-cases. He also wrote a demo, which grabs and previews a live stream from a webcam session and blends it with an overlay from wpesrc, which displays HTML content. This composited live stream can be broadcast through YouTube or Twitch.

These concepts are better explained by Phil himself in the following lightning talk, presented at the last GStreamer Conference in Lyon:

Video Editing

After implementing a deep integration of the GStreamer Editing Services (a.k.a GES) into Pixar’s OpenTimelineIO during the first half of 2019, we decided to implement an important missing feature for the professional video editing industry: nested timelines.

Toward that goal, Thibault worked with the GSoC student Swayamjeet Swain to implement a flexible API to support nested timelines in GES. This means that users of GES can now decouple each scene into different projects when editing long videos. This work is going to be released in the upcoming GStreamer 1.18 version.

Henry Wilkes also implemented support for nested timelines in OpenTimelineIO, making the GES integration one of the most advanced ones, as you can see in this table:

Single Track of Clips: W-O
Multiple Video Tracks: W-O
Audio Tracks & Clips: W-O
Markers: N/A
Nesting: W-O
Transitions: W-O
Audio/Video Effects: N/A
Linear Speed Effects: R-O
Fancy Speed Effects:
Color Decision List: N/A

Along these lines, Thibault delivered a 15-minute talk, also at the GStreamer Conference 2019:

After detecting a few regressions and issues in GStreamer related to frame accuracy, we decided to make sure that we can seek in a perfectly frame-accurate way using GStreamer and the GStreamer Editing Services. To ensure that, an extensive integration testsuite has been developed, targeting the most important container formats and codecs (namely MXF, QuickTime, H.264, H.265, ProRes and JPEG), and issues have been fixed in different places. On top of that, new APIs are being added to GES to allow expressing times in frame numbers instead of nanoseconds. This work is still ongoing but should be merged in time for GStreamer 1.18.

GStreamer Validate Flow

GstValidate has been turning into one of the most important GStreamer testing tools, checking that elements behave as they are supposed to within the framework.

Along with our MSE work, we found that another way to specify tests, related to the buffers and events produced through specific pads, was needed. Thus, Alicia developed a new plugin for GstValidate: Validate Flow.

Alicia gave an informative 30-minute talk about GstValidate and the new plugin at the last GStreamer Conference too:

GStreamer VAAPI

Most of the work during the second half of 2019 consisted of maintenance tasks and code reviews.

We worked mainly on memory restrictions per backend driver, and we reviewed a big refactor: internal encoders now use GstObject instead of the custom GstVaapiObject. We also reviewed patches for new features such as video rotation and cropping in vaapipostproc.

Servo multimedia

Last year we worked on integrating media playback in Servo. We finally delivered hardware-accelerated video playback on Linux and Android. We also worked on the Windows and Mac ports, but they were not finished. Naturally, most of the work was in the servo/media crate, pushing code and reviewing contributions. The major tasks were rewriting the media player example and the internal source element, aiming to handle playbin‘s download flag properly.

We also added WebGL integration support with <video> elements, thus webpages can use video frames as WebGL textures.

Finally we explored how to isolate the multimedia processing in a dedicated thread or process, but that task remains pending.

WebKit Media Source Extension

We did a lot of downstream and upstream bug fixing and patch review, both in WebKit and GStreamer, for our MSE GStreamer-based backend.

Along this line we improved WebKitMediaSource to use playbin3, while also adding compatibility with older GStreamer versions.

WebKit WebRTC

Most of the work in this area was maintenance and fixing regressions uncovered by the layout tests. Besides, the support for the Raspberry Pi was improved by handling encoded streams from v4l2 video sources, with some explorations on the Minnowboard on top of that.


GStreamer Conference

Igalia was a Gold sponsor of this last GStreamer Conference, held in Lyon, France.

The whole team attended, and five talks were delivered. Besides the video editing talk already mentioned, Thibault presented two more: one about the GstTranscoder API and the other about the new documentation infrastructure based on Hotdoc:

We also had a productive hackfest, after the conference, where we worked on an AV1 Rust decoder, an HLS Rust demuxer, a hardware decoder flag in playbin, and other stuff.

Linaro Connect

Phil attended the Linaro Connect conference in San Diego, USA. He delivered a talk about WPE/Multimedia which you can enjoy here:


Demuxed

Charlie attended Demuxed, in San Francisco. The conference is heavily focused on streaming and codec engineering and validation. Sadly, there is not much interest in GStreamer there, as the main focus is on FFmpeg.


RustFest

Phil and I attended the last RustFest in Barcelona. Basically, we went to meet the Rust community, and we attended the “WebRTC with GStreamer-rs” workshop presented by Sebastian Dröge.

Review of Igalia’s Multimedia Activities (2018/H2)

This is the first semiyearly report about Igalia’s activities around multimedia, covering the second half of 2018.

A great deal of this report was covered in Phil’s talk surveying multimedia development in WebKitGTK and WPE:

WebKit Media Source Extensions (MSE)

MSE is a specification that allows JavaScript to generate media streams for playback in Web browsers that support HTML5 video and audio.

Last semester we upstreamed support for the WebM format in WebKitGTK, with the related patches in GStreamer, particularly in the qtdemux and matroskademux elements.

WebKit Encrypted Media Extensions (EME)

EME is a specification for enabling playback of encrypted content in Web browsers that support HTML5 video.

In a downstream project for WPE WebKit we managed to have almost full test coverage in the YoutubeTV 2018 test suite.

We merged our contributions upstream, in WebKit and GStreamer (most of what is legal to publish), for example making demuxers aware of encrypted content and making them send protection events with the initialization data and the encrypted caps, in order to select the decryption key later.

We started to coordinate the upstreaming process of a new implementation of the CDM (Content Decryption Module) abstraction, and there will be further changes in that abstraction.

Lightning talk about the EME implementation in WPE/WebKitGTK at the GStreamer Conference 2018.

WebKit WebRTC

WebRTC consists of several interrelated APIs and real-time protocols that enable Web applications and sites to capture audio, or A/V streams, and exchange them between browsers without requiring an intermediary.

We added GStreamer interfaces to LibWebRTC, to use it for the network part, while using GStreamer for the media capture and processing. All that was upstreamed in 2018 H2.

Thibault described thoroughly the tasks done for this achievement.

Talk about the WebRTC implementation in WPE/WebKitGTK at the Web Engines Hackfest 2018.


Servo

Servo is a browser engine written in Rust, designed for high parallelization and high GPU usage.

We added basic support for <video> and <audio> media elements in Servo. Later on, we added the GstreamerGL bindings for Rust in gstreamer-rs to render GL textures from the GStreamer pipeline in Servo.

Lightning talk at the GStreamer Conference 2018.


GstWPE

Taking an idea from the GStreamer Conference, we developed a GStreamer source element that wraps WPE. With this source element, it is possible to blend a web page and a video in a single video stream; that is, the output of a Web browser (say, a rendered web page) is used as a video source of a GStreamer pipeline: GstWPE. The element is already merged in the gst-plugins-bad repository.

Talk about GstWPE in FOSDEM 2019

Demo #1

Demo #2

GStreamer VA-API and gst-MSDK

Last, but not least, we continued helping with the maintenance of GStreamer-VAAPI and gst-msdk, with code reviews and the ongoing migration of the internal library to GObject.

Other activities

The second half of 2018 was also intense in terms of conferences and hackfests for the team:

Thanks for bearing with us through this blog post and for keeping our work on your radar.

GStreamer Hackfest 2015

Last weekend was the GStreamer Hackfest in Staines, UK, at Samsung’s premises; they also sponsored the dinners and the lunches. Special thanks to Luis de Bethencourt, the almighty organizer!

My main purpose was to sip one or two pints with the GStreamer folks and, secondarily, to talk about gstreamer-vaapi, WebKitGTK+ and the new OpenGL/ES support in gst-plugins-bad.


Regarding gstreamer-vaapi, there were a couple of questions about some problems shown downstream (in stable releases of distributions), and I was happy to announce that they are mostly fixed upstream. On the other hand, Sebastian Dröge was worried about the existing support for GStreamer 0.10, and I answered him that its removal is already in the pipeline. He looked pleased.

Related to gstreamer-vaapi and the new GstGL, we tested and merged a patch for GLES2/EGL, so now it is possible to render VA-API decoded video through glimagesink with (nearly) zero-copy. Sadly, this is not currently possible using GLX. Along the way I found a silly bug that came from a previous patch of mine and fixed it; we also fixed another small bug in the gluploader.

In the WebKitGTK+ realm, I worked on a new feature: sharing the browser’s OpenGL context and display with the GStreamer pipeline. With it, we could add GL filters into the pipeline. But honour to whom honour is due: this patch is a split of a previous patch by Philippe Normand. The ultimate goal is to ditch the custom video sink in WebKit and reuse glimagesink, with its new off-screen rendering feature.

Finally, on Sunday afternoon, I walked around Richmond, and it is beautiful.


Thanks to Igalia, Intel and all the sponsors that made the hackfest and my attendance possible.


Last Friday, July 25th, the National Day of Galicia, started very early because I had to travel to Strasbourg, the official seat of the European Parliament, not for any political duty, but for the GNOME Users and Developers European Conference: the GUADEC!

My last GUADEC was in The Hague, in 2010, though in 2012, when it was hosted in Coruña, I attended a couple of talks. Nonetheless, it had been a long time since I had met the community, and it was a pleasure to meet them again.

My biggest impression was the number of attendees. I remember the times in Turkey or in Gran Canaria when hundreds packed the auditoriums and halls. Nowadays the audience is smaller, but that is a good thing, since now you easily get in touch with the core developers who drive and move the project.

We at Igalia, as sponsors, had a banner in the main room and a table in a corridor. Here is a picture of Juan to prove it:

Juan at the Igalia booth.

I also ran across Emmanuele Bassi, setting up a booth to show off the Endless Mobile OS, based on GNOME 3. The people at GUADEC welcomed its user experience and the purpose of the project with enthusiasm. Personally, I love it. If you don’t know the project, you should visit their web site.

The first talk I attended was the classic GStreamer update by Sebastian Dröge and Tim Müller. They talked about the new features in GStreamer 1.4. Neat stuff in there. I like the new pace of GStreamer, rather than the old stagnant evolution of the 0.10 version.

Afterwards, Jim Hall gave us a keynote about usability in GNOME. I really enjoyed that talk. He studied the usability of several GNOME applications, such as Nautilus (aka Files), GEdit and Epiphany (aka Web), as part of his Master’s research. It was a pleasure to hear that Epiphany is regarded as having good usability.

After lunch I was in the main room, listening to Sylvain Le Bon talk about sustainable business models for free software. He talked about crowdfunding, community management and related topics.

The next talk was by Christian Hergert, about his project GOM, an object mapper from GObject to SQLite, which is used in Grilo to prevent SQL injection by plugins that use SQLite.

Later on, Marina Zhurakhinskaya gave us one of the best talks of the GUADEC: How to be an ally to women in tech. I encourage you to download the slides and read them. There I learned about the unicorn law and the impostor syndrome.

The day closed with the GNOME Foundation’s teams reports.

Sunday came, and I arrived at the venue for the second keynote: Should We Teach The Robot To Kill, by Nathan Willis. In his particular style, Nathan presented a general survey of GNU/Linux in the automotive industry.

Next came one of the main talks from Igalia: Web 3.12: a browser to make us proud, presented by Edu. It was fairly good. Edu showed us the latest developments in WebKitGTK+ and Epiphany (aka Web). There were quite a few questions at the end of the talk. Epiphany nowadays is actively used by a lot of people in the community.

Afterwards, Zeeshan presented GNOME Boxes, a user interface for running virtual machines. Later on, Alberto Ruiz showed us Fleet Commander, a web application to handle large desktop deployments.

And we took our classic group photo:

Group photo

That Sunday closed with the interns’ lightning talks. Cool stuff is being cooked by them.

On Monday I was in the venue when Emmanuele Bassi talked about GSK, the GTK+ Scene Graph Kit, his new project, which takes the lessons learned in Clutter as its starting point. Its objective is to have a scene graph library fully integrated into GTK+.

After lunch and the second part of the Foundation’s Annual General Meeting, Benjamin Otte gave an amusing talk about the CSS implementation in GTK+. Later, Jasper St. Pierre talked about the Wayland support in GNOME.

When the coffee break ended, the almighty Žan Doberšek gave the other talk from Igalia: Wayland support in WebKit2GTK+.

On the last day of GUADEC, I attended Bastien Nocera’s talk: Hardware integration, the GNOME way, where he reviewed the history of his contributions to GNOME related to hardware integration and the goal of nicely supporting most hardware in GNOME, like compasses, gyroscopes, et cetera.

Afterwards, Owen Taylor talked about GNOME’s continuous integration performance testing, done in order to know exactly why one release of GNOME is faster or slower than the last.

And the third keynote came: Matthew Garrett talked about his experiences with the GNOME community and his vision of where it should go: enhancing the privacy and security of users, something that many GNOMErs, such as Federico Mena, are excited about.

Later on, David King talked about his plans for Cheese, the webcam application: turning it into a D-Bus service and using the current development of kdbus to sandbox the interaction with the hardware.

Afterwards, Christian Hergert talked about his plans for Builder, a new IDE for GNOME. Promising stuff, but we will see how it goes. Christian said that he is going to spend a full year working on this project.

The GUADEC ended with the lightning talks, where I enjoyed one about the problems with the current encryption and security tools.

Finally, the next GUADEC host was unveiled: the Sweden Conspiracy: Gothenburg!

Boosting WebKitGTK+ compilation for armhf with icecream

Some time ago I needed to jump into the fix-compile-test loop for WebKitGTK+, but on the armhf architecture, speaking in Debian/Ubuntu terms.

For those who don’t know: WebKitGTK+ is huge, humongous even, and it takes a lot of resources to compile. For me, at first glance, it was impossible to even try to compile it natively on my hardware, which, by the way, is an Odroid-X2. So I set up a cross-compilation environment.

And I failed. I could not cross-compile the master branch of WebKitGTK+ using a bootstrapped Debian as the root file system. It is supposed to be the opposite, but the whole multiarch thing turned my old and good cross-compilation setup (based on scratchbox2) into a bloody hell. Long story short, I gave up and took the idea of native builds more seriously. Besides, Ubuntu and Debian do full native builds of their distributions for armhf, not to mention that the Odroid-X2 has enough power to give it a try.

It is worth mentioning that I could not use Yocto/OE or Buildroot, though I would love to, because the target was a distribution based on Debian Sid/Ubuntu, and I could not afford a chroot environment only for WebKitGTK+.

With a lot of patience I was able to compile, on the Odroid, a minimalist configuration of WebKitGTK+ without symbols. As expected, it took ages (less than 3 hours, if I remember correctly).

Quickly an idea popped up in the office: use distcc. I grabbed as many ARMv7-based boards as I could find: another Odroid-X2, a couple of Pandaboards, an Arndale board and an IFC6410, and installed a distcc compilation setup on them.

And yes, the compilation time went down, but not that much, though I don’t remember by how much.

Many colleagues at the office migrated from distcc to icecream. In particular, Juan A. Suárez told me about his experiments with icecc and his Raspberry Pi. I decided to give it a shot.

Icecream allows cross-compilation because the scheduler can deliver the tool-chain required by the requester onto the compilation host.

First, you should have one or several cross tool-chains, one for each compilation tuple. In this case we will have only one: compiling on x86_64, generating code for armhf. Luckily, Emdebian provides it out of the box. Nevertheless, you could obtain it by any other means, such as crosstool.

Second, you need the icecc-create-env script to create the tarball that the scheduler will distribute to the compilation hosts.

    $ /usr/lib/icecc/icecc-create-env \
          --gcc /usr/bin/arm-linux-gnueabihf-gcc-4.7 \

The output of this script is an archive file containing all the files necessary to set up the compiler environment. The file will have a random unique name like “ddaea39ca1a7c88522b185eca04da2d8.tar.bz2” by default. You will need to rename it to something more expressive.
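As a sketch of that renaming step (the hash-named archive is simulated here with touch; in reality icecc-create-env generates it, and the new name is just an example):

```shell
# Simulate the archive that icecc-create-env would have generated
touch ddaea39ca1a7c88522b185eca04da2d8.tar.bz2
# Give it a more expressive name: compiler version and target architecture
mv ddaea39ca1a7c88522b185eca04da2d8.tar.bz2 gcc-4.7-armhf.tar.bz2
```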

Third, copy the generated archive file to the board where your code, in this case WebKitGTK+, will be compiled and linked.

For the purpose of this text, I assume that the board already has the icecc daemon installed and configured. Besides, I use ccache too. Hence my environment variables look more or less like these:

CCACHE_DIR=/mnt/hd/.ccache # /mnt/hd is a mounted hard disk through USB.
PATH=/usr/lib/ccache:..    # where Debian sets the compiler's symbolic links

Finally, the last pour of magic is the environment variable ICECC_VERSION. This variable needs to follow this pattern:

<native_archive_file>(,<platform>:<cross_archive_file>=<target>)

Where <native_archive_file> is the archive file with the native tool-chain, <platform> is the host hardware architecture, <cross_archive_file> is the archive file with the cross tool-chain, and <target> is the target architecture of the cross tool-chain.
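For instance, a hypothetical value following that pattern, for an x86_64 host building for armhf (both file names are made up):

```shell
# <native_archive_file>,<platform>:<cross_archive_file>=<target>
export ICECC_VERSION="$HOME/x86_64-native.tar.gz,x86_64:$HOME/cross-armhf.tar.gz=armhf"
echo "$ICECC_VERSION"
```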

In my case, the cross part is not needed because I’m doing native compilation on armhf. Hence, my ICECC_VERSION environment variable contains only the native archive file.


And that’s it! Now I’m using the big iron available at the office, reducing a clean compilation to less than an hour.

As a final word, I expect this compilation time to be reduced a bit more with the new CMake build infrastructure in WebKitGTK+.

Composited video support in WebKitGTK+

A couple of months ago we started to work on adding support for composited video in WebKitGTK+. The objective is to play video in WebKitGTK+ using the hardware-accelerated path, so we can play videos at high-definition resolutions (1080p).

How does WebKit paint?

Basically we can perceive a browser as an application for retrieving, presenting and traversing information on the Web.

For the composited video support, we are interested in the presentation task of the browser. More particularly, in the graphical presentation.

In WebKit, each HTML element on a web page is stored as a tree of Node objects called the DOM tree.

Then, each Node that produces visual output has a corresponding RenderObject, and they are stored in another tree, called the Render Tree.

Finally, each RenderObject is associated with a RenderLayer. These RenderLayers exist so that the elements of the page are composited in the correct order to properly display overlapping content, semi-transparent elements, etc.

It is worth mentioning that there is not a one-to-one correspondence between RenderObjects and RenderLayers, and that there is a RenderLayer tree as well.

Render Trees in WebKit
Render Trees in WebKit (from GPU Accelerated Compositing in Chrome).

WebKit fundamentally renders a web page by traversing the RenderLayer tree.

What is accelerated compositing?

WebKit has two paths for rendering the contents of a web page: the software path and the hardware accelerated path.

The software path is the traditional model, where all the work is done in the main CPU. In this mode, RenderObjects paint themselves into the final bitmap, compositing a final layer which is presented to the user.

In the hardware accelerated path, some of the RenderLayers get their own backing surface into which they paint. Then, all the backing surfaces are composited onto the destination bitmap, and this task is the responsibility of the compositor.

With the introduction of compositing an additional conceptual tree is added: the GraphicsLayer tree, where each RenderLayer may have its own GraphicsLayer.

In the hardware accelerated path, the GPU is used for compositing some of the RenderLayer contents.

Accelerated Compositing in WebKit
Accelerated Compositing in WebKit (from Hardware Acceleration in WebKit).

As Iago said, accelerated compositing involves offloading the compositing of the GraphicsLayers onto the GPU, since it does the compositing very fast, relieving the CPU of that burden to deliver a better and more responsive user experience.

Although there are other options, typically OpenGL is used to render computer graphics, interacting with the GPU to achieve hardware acceleration. And WebKit provides a cross-platform implementation to render with OpenGL.

How does WebKit paint using OpenGL?

Ideally, we could go from the GraphicsLayer tree directly to OpenGL, traversing it and drawing the texture-backed layers with a common WebKit implementation.

But an abstraction layer was needed because different GPUs may behave differently, they may offer different extensions, and we still want to use the software path if hardware acceleration is not available.

This abstraction layer is known as the Texture Mapper, a light-weight scene-graph implementation specially attuned for efficient usage of the GPU.

It is a combination of a specialized accelerated drawing context (TextureMapper) and a scene-graph (TextureMapperLayer):

The TextureMapper is an abstract class that provides the necessary drawing primitives for the scene-graph. Its purpose is to abstract different implementations of the drawing primitives from the scene-graph.

One of the implementations is the TextureMapperGL, which provides a GPU-accelerated implementation of the drawing primitives, using shaders compatible with GL/ES 2.0.

There is a TextureMapperLayer which may represent a GraphicsLayer node in the GPU-renderable layer tree. The TextureMapperLayer tree is equivalent to the GraphicsLayer tree.

How does WebKitGTK+ play a video?

As we stated earlier, in WebKit each HTML element on a web page is stored as a Node in the DOM tree. And WebKit provides a Node class hierarchy for all the HTML elements. In the case of the video tag there is a parent class called HTMLMediaElement, which aggregates a common, cross-platform media player. The MediaPlayer is a decorator for a platform-specific media player known as MediaPlayerPrivate.

All of the above is shown in the next diagram.

Video in WebKit
Video in WebKit. Three layers from top to bottom

In the GTK+ port the audio and video decoding is done with GStreamer. In the case of video, a special GStreamer video sink injects the decoded buffers into the WebKit process. You can think about it as a special kind of GstAppSink, and it is part of the WebKitGTK+ code-base.

And we come back to the two paths for content rendering in WebKit:

In the software path the decoded video buffers are copied into a Cairo surface.

But in the hardware accelerated path, the decoded video buffers shall be uploaded into an OpenGL texture. When a new video buffer is available to be shown, a message is sent to the GraphicsLayer asking for a redraw.

Uploading video buffers into GL textures

When we are dealing with big enough buffers, such as high-definition video buffers, copying them is a performance killer. That is why zero-copy techniques are mandatory.

Even more, when we are working in a multi-processor environment, such as one with a CPU and a GPU, switching buffers among processors’ contexts is also very expensive.

For these reasons, the video decoding and the OpenGL texture handling should happen only in the GPU, without context switching and without copying memory chunks.

The simplest approach would be for the decoder to deliver an EGLImage, so we could blend the handle into the texture. As far as I know, the gst-omx video decoder on the Raspberry Pi works this way.

GStreamer added a new API, available in version 1.2, to upload video buffers into a texture efficiently: GstVideoGLTextureUploadMeta. This API is exposed through the buffer’s metadata, and ought to be implemented by any downstream element that deals with the decoded video frames, most commonly the video decoder.

For example, in gstreamer-vaapi there are a couple of patches (still a work in progress) in Bugzilla enabling this API. At the low level, calling gst_video_gl_texture_upload_meta_upload() will call vaCopySurfaceGLX(), which will do an efficient copy of the VA-API surface into a texture using a GLX extension.


This is an old demo, from when all the pieces started to fit, and it does not show the current performance. Still, it shows what has been achieved:

Future work

So far, all these bits are already integrated in WebKitGTK+ and GStreamer. Nevertheless there are some open issues.

  • gstreamer-vaapi et al.:
    GStreamer 1.2 is not released yet, and its new API might change. Also, the port of gstreamer-vaapi to GStreamer 1.2 is still a work in progress, where the available patches may have rough areas. There are also many other projects, such as clutter-gst, that need to be updated to this new API and to provide more feedback to the community.
    Another important thing is to have more GStreamer elements implementing these new APIs, such as the texture upload and the caps features.
  • Tearing:
    The composited video task unveiled a major problem in WebKitGTK+: it does not handle the vertical blank interval at all, causing tearing artifacts, clearly observable in high-resolution videos with high motion. WebKitGTK+ composites the scene off-screen, using an X Composite redirected window, and then displays it in an X Damage callback, but currently GTK+ does not take care of the vertical blank interval, causing this tearing artifact in heavy compositions.
    At Igalia, we are currently researching a way to fix this issue.
  • Performance:
    There is always room for performance improvement. And we are always aiming in that direction, improving the frame rate, the CPU, GPU and memory usage, et cetera.
    So, stay tuned, or even better, come and help us.

Igalia (three) week in WebKit: Media controls, Notifications, and Clang build

Hi all,

Three weeks have passed since I wrote the last WebKit report, and they went by so quickly that it scares me. Many great things have happened since then.

Let’s start with my favorite area: multimedia. Phil landed a patch that avoids muting the sound if the audio pitch is preserved. And Calvaris finally landed his great new media controls. Now watching videos in WebKitGTK+ is a pleasure.

Rego keeps working on adding more tests to WebKitGTK+, and he also wrote a fine document on how to build Epiphany with WK2 from git/svn.

Claudio, besides his work on the snapshots API that we already mentioned, resumed the implementation of the notifications API for WebKitGTK+. And, while implementing it, he fixed some crashers in WK2’s implementation. He has also given us an early screencast with the status of the notifications implementation: Check it out! (video).

Carlos García Campos, besides working hard on the stable and development releases of the WebKitGTK+ library, has also landed a couple of fixes. Meanwhile, Dape removed some dependencies, making the code base cleaner.

Žan, tirelessly, has been poking all around the WebKit project, keeping the GTK port kicking and healthy: he fixed code; cleaned up scripts and autogenerated code; enhanced utility scripts; and he also enabled more tests in the development builds. But his most impressive work in progress is enabling the Clang build of the GTK port.

But there’s more! Žan set up a web page where you can visualize the status of WebKit2 on several ports: http://www.iswk2buildbrokenyet.com/

Igalia week in WebKit: Fixes, gardening, resources API and MathML

It is time for another weekly report on what is going on in WebKit and at Igalia.

Sergio reported and helped to debug different crashes in WebKit accessibility. He improved the robustness of build-webkit. Also, he fixed a bug in the defined network buffer size, simplified a bit the code that closes the network connections, and fixed a bug in libsoup that caused a crash in WebKit.

And this week we have a new face too, at least in these reports: Adrián. He enabled the Opus codec in the MIME type list. After that, he decided to give native compilation on an armv5tel board a try. Crazy? A bit, but fun too.

Rego continues his hard work on enabling tests for WebKitGTK+, like testing the text direction setting, or unflagging blocked tests that already pass, and many others that are still in progress. Also, he got his patch landed for the bug found in GtkLauncher.

Claudio, while he waits for the review of his API to retrieve a snapshot, resumed the Notifications API work for WebKitGTK+, particularly for WebKit2. Also, just for the sake of landing something, he fixed a couple of minor compilation glitches.

Philippe is still kicking the fullscreen support in WebKit using GStreamer. And, for relaxing purposes, he updated his patch for AudioSourceProvider.

Žan works tirelessly keeping the bots working as well as possible: disabling the build of WK2 on EWSs while the WK2 storm calms down; cleaning up build configuration options; removing deprecated methods in unit tests; enhancing the webkit-patch script; and much more. Besides this needed work, he also started to hack on completing the WTFURL implementation for WebCore’s KURL interface. WTFURL is pretty much what WebKit (as a project) would like to use in the future. WTFURL is based on the GoogleURL code (which is what the Chromium port currently uses as the backend of KURL).

The great WebKit hacker Martin Robinson is exploring uncharted territory in the project: he is trying to get away from port-specific things by scratching at the core stuff, but serendipity showed up, and he found some pretty serious port-specific bugs that have relatively straightforward fixes.

Martin started to work on MathML and noticed that WebKit’s MathML support for the mathvariant attribute is unfinished. This issue led him to a set of patches to overcome the situation, like fixing the FreeType usage, or adding the logic to append a UChar32 onto a StringBuilder.

In addition to all this work, Martin is working on a patch for mathvariant itself. The mathvariant attribute allows easily using the Mathematical Alphanumeric Symbols in MathML without having to type out XML entities. For instance this:

<mi mathvariant="fraktur">HI GUYZ!</mi>

will be rendered like this:

Carlos García Campos cooked a patch to fix the WebKit2 GTK+ API by implementing the recently removed resources API using the injected bundle. This is another effort to bring WebKit2 back into WebKitGTK+.

Dape is still pushing the Qt/WebKit2 spellcheck support with a new iteration of his patch. He also worked on the removal of the GDK dependency from ImageDiff.

Finally, I finished a first iteration of my patch for pitch preservation, and it got landed too!

Igalia WebKit week: welcome Žan and Rego!

This is a new weekly WebKit report from Igalia. And the last week has been a shaky one.

Let’s start with a warm welcome to Žan Dobersek as a collaborating student, who is working hard on a lot of cool matters: gardening the bots, cleaning up the Coverity run output, modifying WebCoreTestSupport to decrease its dependency on the WebKit2 API, digging into a stack size problem in JSC, and much more fun stuff.

Meanwhile Joanie, after a lot of restless hacking hours, managed to fix a crash in table accessibility, saving us from many other accessibility crashes. She is working to make the world safer for table users everywhere!

But we have more new faces around: our dear colleague Rego is getting his feet wet; he started finding and fixing new bugs, and he is enabling more tests in the GTK+ port.

Calvaris continues his efforts to enhance the user experience of the HTML5 media controls. Here is a recent screenshot of these controls:


Claudio is pushing the accelerator pedal for the snapshot API in WebKit2. With this API, applications can retrieve a snapshot from a web view, as in the Overview mode in Epiphany.

Epiphany's overview mode

Philippe is working on fullscreen video, porting the GStreamerGWorld module to GStreamer 1.0, while he is still hacking on the AudioSourceProvider for WebAudio.

And last but not least, Dape is working really hard on the Qt spell-check support, and he also proposed solutions for Qt WebNotification support.

And that is all for now. See you!