Planet Igalia


24
Jul 14

See you at GUADEC!

Woah! It has been a long time since I attended a GUADEC, besides the one in Coruña, of course. And I am very excited about it!

My plan is to chill with other GNOME developers and talk about the future of GNOME OS.

I’m going to GUADEC 2014


24
Mar 14

Munich

I spent a week in Munich. I went there for two reasons: to attend the Web & TV workshop organized by the W3C, and to hack along with the gst-gang at the GStreamer Hackfest 2014. All this was sponsored by Igalia, my company.

I arrived in Munich on Tuesday evening, and when I reached the Marienplatz metro station, I ran into a crowd of Bayern Munich fans, chanting songs about the glory of their team, huddling and dancing. There were also a lot of police officers surrounding the tracks.

The workshop was organized by the W3C Web and TV Interest Group, and intended to spark discussions around how to integrate and standardize TV technologies and the Web.

The first slide of the workshop

On Wednesday morning, the workshop began. People from Espial and Samsung talked about HbbTV, and Japanese broadcasters talked about their Hybridcast. Both technologies seek to enhance the television experience using Internet protocols, the former in Europe and the latter in Japan. Also, broadcasters from China showed their approach using ad-hoc technologies. I have to say that Hybridcast awed me.


Afterwards, most of the workshop revolved around the problem of the companion device. People showed their solutions and proposals, in particular about device discovery, data sharing, and control. All the solutions relied on WebSockets and WebRTC for the data sharing between devices.


During the panels, I greatly enjoyed the participation of Jon Piesing, in particular his slide summarizing the specifications used by HbbTV V2. It’s like juggling specs!

Specifications used by HbbTV V2

Broadcasters are very interested in Encrypted Media Extensions and Media APIs, for example the specification for a Tuner API. There is also a lot of expectation around metadata handling and defining common TV ontologies.

Finally, there were a couple of talks about miscellaneous technologies surrounding IPTV broadcasting.

The second stage of my visit to Bavaria’s capital was the GStreamer Hackfest. It took place in the Google offices, near Marienplatz.

Christian Schaller has written a very good summary of what happened during the hackfest. For my part, I worked with Nicolas Dufresne on the v4l2 video converter for the Exynos4, which is a piece required for hardware-accelerated decoding on that platform using the v4l2 video decoder.


31
Dec 13

Boosting WebKitGTK+ compilation for armhf with icecream

Some time ago I needed to jump into the fix-compile-test loop for WebKitGTK+, but for the armhf architecture, speaking in Debian/Ubuntu terms.

For those who don’t know, WebKitGTK+ is huge, it is humongous, and it takes a lot of resources to compile. At first glance, it seemed impossible to even try to compile it natively on my hardware, which, by the way, is an Odroid-X2. So I set up a cross-compilation environment.

And I failed. I could not cross-compile the master branch of WebKitGTK+ using a bootstrapped Debian as the root file system. It is supposed to be the easier path, but the whole multiarch thing made my good old cross-compilation setup (based on scratchbox2) a bloody hell. Long story short, I gave up and took the idea of native builds more seriously. Besides, Ubuntu and Debian do full native builds of their distributions for armhf, not to mention that the Odroid-X2 has enough power to give it a try.

It is worth mentioning that I could not use Yocto/OE or Buildroot, though I would have loved to, because the target was a distribution based on Debian Sid/Ubuntu, and I could not afford a chroot environment only for WebKitGTK+.

With a lot of patience I was able to compile, on the Odroid, a minimalist configuration of WebKitGTK+ without symbols. As expected, it took ages (less than 3 hours, if I remember correctly).

An idea quickly popped up in the office: use distcc. I grabbed as many ARMv7-based boards as I could find: another Odroid-X2, a couple of Pandaboards, an Arndale board, and an IFC6410, and installed a distcc compilation setup on them.

And yes, the compilation time went down, but not by much, though I don’t remember exactly how much.

Many of my colleagues at the office migrated from distcc to icecream. In particular, Juan A. Suárez told me about his experiments with icecc and his Raspberry Pi. I decided to give it a shot.

Icecream permits cross-compilation because the scheduler can deliver, to the compilation host, the tool-chain required by the requester.

First, you should have one or several cross tool-chains, one for each compilation tuple. In this case we will have only one: compiling on x86_64, generating code for armhf. Luckily, Emdebian provides it out of the box. Nevertheless, you could obtain it by other means, such as crosstool.
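
For instance, on a Debian-based x86_64 host with the Emdebian repository configured, grabbing the cross tool-chain could look like this (the package names are from memory and may differ between releases):

    $ sudo apt-get install gcc-4.7-arm-linux-gnueabihf \
                           g++-4.7-arm-linux-gnueabihf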

Second, you need the icecc-create-env script to create the tarball that the scheduler will distribute to the compilation host.

    $ /usr/lib/icecc/icecc-create-env \
          --gcc /usr/bin/arm-linux-gnueabihf-gcc-4.7 \
                /usr/bin/arm-linux-gnueabihf-g++-4.7

The output of this script is an archive file containing all the files necessary to set up the compiler environment. By default, the file will have a random unique name like "ddaea39ca1a7c88522b185eca04da2d8.tar.bz2". You will want to rename it to something more expressive.

Third, copy the generated archive file to the board where your code, in this case WebKitGTK+, will be compiled and linked.
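
For example, renaming the archive and copying it over SSH (the board’s hostname is made up; the destination path matches the one used below):

    $ mv ddaea39ca1a7c88522b185eca04da2d8.tar.bz2 arm-x86-compiler.tar.gz
    $ scp arm-x86-compiler.tar.gz odroid:/mnt/hd/tc/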

For the purpose of this text, I assume that the board already has the icecc daemon installed and configured. Besides, I use ccache too. Hence my environment variables are more or less like these:

CCACHE_PREFIX=icecc
CCACHE_DIR=/mnt/hd/.ccache # /mnt/hd is a mounted hard disk through USB.
PATH=/usr/lib/ccache:..    # where Debian sets the compiler's symbolic links

Finally, the last pour of magic is the ICECC_VERSION environment variable. This variable needs to follow this pattern:

<native_archive_file>(,<platform>:<cross_archive_file>=<target>)*.

Where <native_archive_file> is the archive file with the native tool-chain, <platform> is the host hardware architecture, <cross_archive_file> is the archive file with the cross tool-chain, and <target> is the target architecture of the cross tool-chain.

In my case, the target is not needed because I’m doing a native compilation for armhf. Hence, my ICECC_VERSION environment variable looks like this:

ICECC_VERSION=/mnt/hd/tc/native-compiler.tar.gz,x86_64:/mnt/hd/tc/arm-x86-compiler.tar.gz

And that’s it! Now I’m using the big iron available in the office, and a clean compilation takes less than an hour.
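
For the record, kicking off a build looks more or less like this; the job count is arbitrary and depends on how many cores the icecream cluster offers:

    $ Tools/Scripts/build-webkit --gtk --makeargs="-j20"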

As a final word, I expect this compilation time to be reduced a bit more with the new CMake build infrastructure in WebKitGTK+.


26
Jul 13

Composited video support in WebKitGTK+

A couple of months ago we started to work on adding support for composited video in WebKitGTK+. The objective is to play video in WebKitGTK+ using the hardware-accelerated path, so we can play videos at high definition resolutions (1080p).

How does WebKit paint?

Basically, we can perceive a browser as an application for retrieving, presenting and traversing information on the Web.

For the composited video support, we are interested in the presentation task of the browser, more particularly in the graphical presentation.

In WebKit, each HTML element on a web page is stored as a tree of Node objects called the DOM tree.

Then, each Node that produces visual output has a corresponding RenderObject, and they are stored in another tree, called the Render Tree.

Finally, each RenderObject is associated with a RenderLayer. These RenderLayers exist so that the elements of the page are composited in the correct order to properly display overlapping content, semi-transparent elements, etc.

It is worth mentioning that there is not a one-to-one correspondence between RenderObjects and RenderLayers, and that there is a RenderLayer tree as well.

Render Trees in WebKit (from GPU Accelerated Compositing in Chrome).

WebKit fundamentally renders a web page by traversing the RenderLayer tree.

What is accelerated compositing?

WebKit has two paths for rendering the contents of a web page: the software path and the hardware accelerated path.

The software path is the traditional model, where all the work is done by the CPU. In this mode, RenderObjects paint themselves into the final bitmap, compositing a final layer which is presented to the user.

In the hardware accelerated path, some of the RenderLayers get their own backing surface into which they paint. Then, all the backing surfaces are composited onto the destination bitmap; this task is the responsibility of the compositor.

With the introduction of compositing, an additional conceptual tree is added: the GraphicsLayer tree, where each RenderLayer may have its own GraphicsLayer.

In the hardware accelerated path, the GPU is used for compositing some of the RenderLayer contents.

Accelerated Compositing in WebKit (from Hardware Acceleration in WebKit).

As Iago said, accelerated compositing involves offloading the compositing of the GraphicsLayers onto the GPU, since it does the compositing very fast, freeing the CPU from that burden to deliver a better and more responsive user experience.

Although there are other options, typically OpenGL is used to render computer graphics, interacting with the GPU to achieve hardware acceleration. And WebKit provides a cross-platform implementation to render with OpenGL.

How does WebKit paint using OpenGL?

Ideally, we could go from the GraphicsLayer tree directly to OpenGL, traversing it and drawing the texture-backed layers with a common WebKit implementation.

But an abstraction layer was needed because different GPUs may behave differently, they may offer different extensions, and we still want to use the software path if hardware acceleration is not available.

This abstraction layer is known as the Texture Mapper, a light-weight scene-graph implementation specially attuned for efficient usage of the GPU.

It is a combination of a specialized accelerated drawing context (TextureMapper) and a scene-graph (TextureMapperLayer):

The TextureMapper is an abstract class that provides the necessary drawing primitives for the scene-graph. Its purpose is to abstract different implementations of the drawing primitives from the scene-graph.

One of the implementations is the TextureMapperGL, which provides a GPU-accelerated implementation of the drawing primitives, using shaders compatible with GL/ES 2.0.

There is a TextureMapperLayer which may represent a GraphicsLayer node in the GPU-renderable layer tree. The TextureMapperLayer tree is equivalent to the GraphicsLayer tree.

How does WebKitGTK+ play a video?

As we stated earlier, in WebKit each HTML element on a web page is stored as a Node in the DOM tree. And WebKit provides a Node class hierarchy for all the HTML elements. In the case of the video tag, there is a parent class called HTMLMediaElement, which aggregates a common, cross-platform media player. The MediaPlayer is a decorator for a platform-specific media player known as MediaPlayerPrivate.

All of the above is shown in the next diagram.

Video in WebKit. Three layers from top to bottom.

In the GTK+ port the audio and video decoding is done with GStreamer. In the case of video, a special GStreamer video sink injects the decoded buffers into the WebKit process. You can think of it as a special kind of GstAppSink, and it is part of the WebKitGTK+ code-base.

And we come back to the two paths for content rendering in WebKit:

In the software path the decoded video buffers are copied into a Cairo surface.

But in the hardware accelerated path, the decoded video buffers shall be uploaded into an OpenGL texture. When a new video buffer is available to be shown, a message is sent to the GraphicsLayer asking for a redraw.

Uploading video buffers into GL textures

When we are dealing with big enough buffers, such as high definition video buffers, copying them around is a performance killer. That is why zero-copy techniques are mandatory.

Even more, when we are working in a multi-processor environment, such as one where we have a CPU and a GPU, moving buffers between processor contexts is also very expensive.

It is for these reasons that the video decoding and the OpenGL texture handling should happen only in the GPU, without context switching and without copying memory chunks.

The simplest approach would be for the decoder to deliver an EGLImage, so we could bind the handle into the texture. As far as I know, the gst-omx video decoder on the Raspberry Pi works this way.

GStreamer added a new API, available in version 1.2, to upload video buffers into a texture efficiently: GstVideoGLTextureUploadMeta. This API is exposed through the buffer’s metadata, and ought to be implemented by any downstream element that deals with the decoded video frames, most commonly the video decoder.

For example, in gstreamer-vaapi there are a couple of patches (still a work in progress) in Bugzilla enabling this API. At the low level, calling gst_video_gl_texture_upload_meta_upload() will call vaCopySurfaceGLX(), which does an efficient copy of the VA-API surface into a texture using a GLX extension.
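
From the consumer’s point of view, using the meta looks roughly like this. A minimal sketch, assuming GStreamer 1.2: the helper function is hypothetical, a current GL context is expected when calling it, and only the first texture identifier is used:

    #include <gst/gst.h>
    #include <gst/video/gstvideometa.h>

    /* Hypothetical helper: try the efficient upload path offered by the
     * producer of the buffer (e.g. the video decoder). */
    static gboolean
    upload_buffer_to_texture (GstBuffer * buffer, guint texture_id)
    {
      GstVideoGLTextureUploadMeta *meta;
      guint ids[4] = { texture_id, 0, 0, 0 };

      meta = gst_buffer_get_video_gl_texture_upload_meta (buffer);
      if (meta == NULL)
        return FALSE;               /* no meta: fall back to copying */

      /* The element that attached the meta performs the actual upload,
       * e.g. gstreamer-vaapi ends up calling vaCopySurfaceGLX(). */
      return gst_video_gl_texture_upload_meta_upload (meta, ids);
    }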

Demo

This is an old demo, from when all the pieces started to fit together, so it does not reflect the current performance. Still, it shows what has been achieved:

Future work

So far, all these bits are already integrated in WebKitGTK+ and GStreamer. Nevertheless, there are some open issues.

  • gstreamer-vaapi et al.:
    GStreamer 1.2 is not released yet, and its new API might change. Also, the port of gstreamer-vaapi to GStreamer 1.2 is still a work in progress, and the available patches may have rough areas. Many other projects, such as clutter-gst, also need to be updated to this new API, providing more feedback to the community.
    Another important thing is to have more GStreamer elements implementing these new APIs, such as the texture upload and the caps features.
  • Tearing:
    The composited video task unveiled a major problem in WebKitGTK+: it does not handle the vertical blank interval at all, causing tearing artifacts, clearly observable in high resolution videos with high motion. WebKitGTK+ composites the scene off-screen, using an X Composite redirected window, and then displays it in an X Damage callback, but currently GTK+ does not take care of the vertical blank interval, causing this tearing artifact in heavy compositions.
    At Igalia, we are currently researching a way to fix this issue.
  • Performance:
    There is always room for performance improvement. And we are always aiming in that direction, improving the frame rate, the CPU, GPU and memory usage, et cetera.
    So stay tuned, or even better, come and help us.

5
May 13

GPhone v0.2

It has been almost 5 months since I published the first release of GPhone. Though I haven’t abandoned it, I work on it slowly but steadily. And to prove it, I’m releasing a new version, v0.2.

What does it include? Mainly bug fixes. The new features are still a work in progress. You can browse the next branch in git to glance at them.

Change Log:

  • Able to use GSimpleAsyncResult for glib <= 2.35.
  • Added Vala (0.16) backward compatibility, including a gstreamer-1.0.vapi.
  • Added a script to build and update a jhbuild environment: update-jhbuild-env. It handles GPhone’s dependencies.
  • Notify the user if his NAT type is not compatible with a VoIP session.
  • Renamed accounts to registrars.
  • Handle networking signaling.
  • Create a global header file.
  • Many bug fixes.



8
Apr 13

GStreamer Hackfest 2013 – Milan

Last week, from the 28th to the 31st of March, some of us gathered in Milan to hack on some bits of the GStreamer internals. For me it was a great experience to interact with great hackers such as Sebastian Dröge, Wim Taymans, Edward Hervey, Alessandro Decina and many more. We talked about GStreamer and, more particularly, we agreed on new features which I would like to discuss here.

GStreamer Hackers at Milan

For the sake of completeness, let me say that I have been interested in hardware-accelerated multimedia for a while, and just lately I started to wet my feet in VA-API and VDPAU, and their support in our beloved GStreamer.

GstContext

The first feature that reached upstream is GstContext. Historically, in 2011, Nicolas Dufresne added GstVideoContext as an interface to share a video context (such as a display name, an X11 display, a VA-API display, etc.) among the pipeline elements and the applications. But now Sebastian has generalized the interface into a container that stores and shares any kind of context between multiple elements and the application.

The first approach, which still lives in gst-plugins-bad, was merely a wrapper around a custom query to set or request a video context. But now the context sharing is part of the pipeline setup.

An element that needs a shared context must follow these steps:

  1. Check if the element already has a context
  2. Query downstream for the context
  3. Post a message on the bus to see if the application has a context to share.
  4. If there is none, create the context, post a message, and send an event letting the other elements know that it has the context.

You can look at eglglessink as an example of how to use this feature.
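
On the application side, step 3 of the list above usually translates into answering the element’s NEED_CONTEXT message from a synchronous bus handler. A minimal sketch, assuming the GstContext API as it is shaping up for GStreamer 1.2; the context type string and the structure field are illustrative, not the actual gstreamer-vaapi contract:

    #include <gst/gst.h>

    /* Reply to NEED_CONTEXT messages with a context the application
     * already owns (user_data would be, e.g., a VA-API display). */
    static GstBusSyncReply
    bus_sync_handler (GstBus * bus, GstMessage * msg, gpointer user_data)
    {
      if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_NEED_CONTEXT) {
        const gchar *type = NULL;

        gst_message_parse_context_type (msg, &type);
        if (g_strcmp0 (type, "gst.vaapi.Display") == 0) {
          GstContext *ctx = gst_context_new ("gst.vaapi.Display", TRUE);

          gst_structure_set (gst_context_writable_structure (ctx),
              "display", G_TYPE_POINTER, user_data, NULL);
          gst_element_set_context (GST_ELEMENT (GST_MESSAGE_SRC (msg)), ctx);
          gst_context_unref (ctx);
        }
      }
      return GST_BUS_PASS;
    }

The handler would be registered with gst_bus_set_sync_handler(), so the context reaches the element before its state change continues.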

GstVideoGLTextureUploadMeta

Also in 2011, Nicolas Dufresne added a helper class to upload a buffer into a surface (an OpenGL texture, a VA-API surface, a Wayland surface, etc.). This is quite important since the new video players are scene-based, using frameworks such as Clutter or OpenGL directly, where the video display is composed of various actors, such as the multimedia controls widgets.

But still, this interface didn’t fit well in GStreamer 1.0, until now, when it was introduced in the form of a buffer’s meta, though this meta is specific to OpenGL textures. If the buffer provides this new GstVideoGLTextureUploadMeta meta, a new function, gst_video_gl_texture_upload_meta_upload(), is available to upload that buffer into an OpenGL texture specified by its numeric identifier.

Obviously, in order to use this meta, it should be proposed for allocation by the sink. Again, you can look at eglglessink as an example.

Caps Features

The caps features are a new data type for specifying a particular extension or requirement of the handled media.

From a practical point of view, we can say that caps structures with the same name but non-equal sets of caps features are not compatible, and, if a pad supports multiple sets of features, it has to add multiple equal structures with different feature sets to the caps.

Empty GstCapsFeatures are equivalent to the GstCapsFeatures for plain system memory. Other examples would be a specific memory type or the requirement of having a specific meta on the buffer.

Again, we can see the example of the caps features in eglglessink, since gst-inspect now also shows the caps features of the pads:

Pad Templates:
  SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/x-raw(memory:EGLImage)
                 format: { RGBA, BGRA, ARGB, ABGR, RGBx,
                           BGRx, xRGB, xBGR, AYUV, Y444,
                           I420, YV12, NV12, NV21, Y42B,
                           Y41B, RGB, BGR, RGB16 }
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
      video/x-raw(meta:GstVideoGLTextureUploadMeta)
                 format: { RGBA, BGRA, ARGB, ABGR, RGBx,
                           BGRx, xRGB, xBGR, AYUV, Y444,
                           I420, YV12, NV12, NV21, Y42B,
                           Y41B, RGB, BGR, RGB16 }
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
      video/x-raw
                 format: { RGBA, BGRA, ARGB, ABGR, RGBx,
                           BGRx, xRGB, xBGR, AYUV, Y444,
                           I420, YV12, NV12, NV21, Y42B,
                           Y41B, RGB, BGR, RGB16 }
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]

Parsers meta

This is a feature which has been driven by Edward Hervey. The idea is that the video codec parsers (H264, MPEG, VC1) attach a meta to the buffer with a defined structure that carries the new information provided by the codified stream.

This is particularly useful for decoders, which will not have to parse the buffer again in order to extract the information they need to decode the current buffer and the following ones.

For example, here is the H264 parser meta definition.

VDPAU

Another task driven by Edward Hervey, one I feel excited about, is the port of the VDPAU decoding elements to GStreamer 1.0.

Right now only the MPEG decoder is upstreamed, but MPEG4 and H264 are coming.

As a final note, I want to thank Collabora and Fluendo for sponsoring the dinners. A special thank you, as well, to Igalia, which covered my travel expenses and attendance at the hackfest.


26
Mar 13

GStreamer Hackfest 2013

Next Thursday I’ll be flying to Milan to attend the 2013 edition of the GStreamer Hackfest. My main interests are hardware codecs and GL integration, particularly VA-API integrated with GL-based sinks.

Thanks Igalia for sponsoring my trip!



4
Mar 13

Igalia (three) week in WebKit: Media controls, Notifications, and Clang build

Hi all,

Three weeks have passed since I wrote the last WebKit report, and they went by so quickly that it scares me. Many great things have happened since then.

Let’s start with my favorite area: multimedia. Phil landed a patch that avoids muting the sound if the audio pitch is preserved. And Calvaris finally landed his great new media controls. Now watching videos in WebKitGTK+ is a pleasure.

Rego keeps up his work on adding more tests to WebKitGTK+, and he also wrote a fine document on how to build Epiphany with WK2 from git/svn.

Claudio, besides his work on the snapshots API that we already commented on, retook the implementation of the notifications API for WebKitGTK+. And, while implementing it, he fixed some crashers in WK2’s implementation. He has also given us an early screencast with the status of the notifications implementation: check it out! (video)

Carlos García Campos, besides working hard on the stable and development releases of the WebKitGTK+ library, has also landed a couple of fixes. Meanwhile, Dape removed some dependencies, making the code base cleaner.

Žan, tirelessly, has been poking all around the WebKit project, keeping the GTK port kicking and healthy: he fixed code; cleaned up scripts and autogenerated code; enhanced utility scripts; and enabled more tests in the development builds. But his most impressive work in progress is enabling the Clang build of the GTK port.

But there’s more! Žan set up a web page where you can visualize the status of WebKit2 on several ports: http://www.iswk2buildbrokenyet.com/


1
Feb 13

Igalia week in WebKit: Fixes, gardening, resources API and MathML

It is time for another weekly report on what is going on in WebKit and Igalia.

Sergio reported and helped to debug different crashes in WebKit accessibility. He improved the robustness of build-webkit. Also, he fixed a bug in the defined network buffer size, simplified a bit the code that closes the network connections, and fixed a bug in libsoup related to a crash in WebKit.

And this week we have a new face too, at least in these reports: Adrián. He enabled the Opus codec in the MIME type list. After that, he decided to give native compilation a try on an armv5tel board. Crazy? A bit, but fun too.

Rego continues his hard work on enabling tests for WebKitGTK+, like testing the text direction setting, or unflagging blocked tests that already pass, and many others that are still works in progress. Also, he got his patch landed for the bug found in GtkLauncher.

Claudio, while he waits for the review of his API to retrieve a snapshot, retook the Notifications API work for WebKitGTK+, particularly for WebKit2. Also, just for the sake of landing something, he fixed a couple of minor compilation glitches.

Philippe is still kicking the fullscreen support in WebKit using GStreamer. And for relaxing purposes, he updated his patch for AudioSourceProvider.

Žan works tirelessly keeping the bots working as well as possible: disabling the build of WK2 on the EWSs while the WK2 storm abates; cleaning up build configuration options, removing deprecated methods from unit tests, enhancing the webkit-patch script and much more. Besides this needed work, he also started to hack on completing the WTFURL implementation for WebCore’s KURL interface. WTFURL is pretty much what WebKit (as a project) would like to use in the future. WTFURL is based on the GoogleURL code (which is what the Chromium port currently uses as the backend of KURL).

The great WebKit hacker Martin Robinson is exploring uncharted territory in the project: he’s trying to get away from port-specific things by scratching at the core stuff, but serendipity showed up, and he found some pretty serious port-specific bugs that have relatively straightforward fixes.

Martin started to work on MathML and noticed that WebKit’s MathML support for the mathvariant attribute is unfinished. This issue led him to a set of patches to overcome the situation, like fixing the FreeType usage, or adding the logic to append a UChar32 onto StringBuilder.

In addition to all this work, Martin is working on a patch for mathvariant itself. The mathvariant attribute makes it easy to use the Mathematical Alphanumeric Symbols in MathML without having to type out XML entities. For instance this:

<mi mathvariant="fraktur">HI GUYZ!</mi>

will be rendered like this:

Carlos García Campos cooked a patch to fix the WebKit2 GTK+ API by implementing the resources API, which was recently removed, using an injected bundle. This is another effort to bring WebKit2 back into WebKitGTK+.

Dape is still pushing the Qt/WebKit2 spellcheck support with a new iteration of the patch. He also worked on removing the GDK dependency from ImageDiff.

Finally, I finished a first iteration of my patch for pitch preservation, and it also got landed!


28
Jan 13

Announcing GPhone v0.1

Hi folks!

As many of you may know, lately I have been working on Ekiga and Opal. And, as usually happens to me, I started to wonder how I would re-write that piece of software. My main ideas grew clear: craft a GObject library, à la WebKitGTK+, wrapping Opal’s classes, paying attention to gobject-introspection; redirect Opal’s multimedia rendering to a GStreamer player; and, finally, write the application in Vala.

Curiosity itched me hard, so I started to hack on these ideas from time to time. By mid-November last year, I had a functional prototype which could only make phone calls through a disgraceful user interface. But I had my proof of concept. Nonetheless, as I usually do, I didn’t drop the pet project, and continued developing more features.

And today, I am pleased to present to you the v0.1 release of GPhone.

Well, I guess a screencast is mandatory nowadays: