Mar 13

Igalia (three) weeks in WebKit: Media controls, Notifications, and Clang build

Hi all,

Three weeks have passed since I wrote the last WebKit report, and they went by so quickly that it scares me. Many great things have happened since then.

Let’s start with my favorite area: multimedia. Phil landed a patch that avoids muting the sound if the audio pitch is preserved. And Calvaris finally landed his great new media controls. Now watching videos in WebKitGTK+ is a pleasure.

Rego keeps working on adding more tests to WebKitGTK+, and he also wrote a fine document on how to build Epiphany with WK2 from git/svn.

Claudio, besides his work on the snapshots API that we already mentioned, resumed the implementation of the Notifications API for WebKitGTK+. And, while implementing it, he fixed some crashers in WK2’s implementation. He has also given us an early screencast showing the status of the notifications implementation: Check it out! (video).

Carlos García Campos, besides working hard on the stable and development releases of the WebKitGTK+ library, has also landed a couple of fixes. Meanwhile, Dape removed some dependencies, making the code base cleaner.

Žan, tirelessly, has been poking all around the WebKit project, keeping the GTK+ port kicking and healthy: he fixed code, cleaned up scripts and autogenerated code, and enhanced utility scripts; he also enabled more tests in the development builds. But his most impressive work in progress is enabling the Clang build of the GTK+ port.

But there’s more! Žan set up a web page where you can visualize the status of WebKit2 on several ports: http://www.iswk2buildbrokenyet.com/

Feb 13

Igalia week in WebKit: Fixes, gardening, resources API and MathML

It is time for another weekly report on what is going on in WebKit and Igalia.

Sergio reported and helped to debug different crashes in WebKit accessibility, and he improved the robustness of build-webkit. Also, he fixed a bug in the defined network buffer size, simplified a bit the code that closes network connections, and fixed a bug in libsoup behind a crash in WebKit.

And this week we have a new face too, at least in these reports: Adrián. He enabled the Opus codec in the MIME type list. After that, he decided to give native compilation on an armv5tel board a try. Crazy? A bit, but fun too.

Rego continues his hard work on enabling tests for WebKitGTK+, like testing the text direction setting, or unflagging blocked tests that already pass, and many others that are still works in progress. Also, he got his patch landed for the bug found in GtkLauncher.

Claudio, whilst he waits for the review of his API to retrieve a snapshot, resumed the Notifications API work for WebKitGTK+, particularly for WebKit2. Also, just for the sake of landing something, he fixed a couple of minor compilation glitches.

Philippe is still kicking the fullscreen support in WebKit using GStreamer. And for relaxing purposes, he updated his patch for AudioSourceProvider.

Žan works tirelessly keeping the bots working as well as possible: disabling the build of wk2 on the EWSs while the wk2 storm abates, cleaning up build configuration options, removing deprecated methods in unit tests, enhancing the webkit-patch script, and much more. Besides this needed work, he also started to hack on completing the WTFURL implementation for WebCore’s KURL interface. WTFURL is pretty much what WebKit (as a project) would like to use in the future. WTFURL is based on the GoogleURL code (which is what the Chromium port is currently using as the backend of KURL).

The great WebKit hacker Martin Robinson is exploring uncharted territory in the project: he is trying to get away from port-specific things by scratching at the core stuff, but serendipity showed up, and he found some pretty serious port-specific bugs that have relatively straightforward fixes.

Martin started to work on MathML and noticed that WebKit’s MathML support for the mathvariant attribute is unfinished. This issue led him to a set of patches to overcome the situation, like fixing the FreeType usage, or adding the logic to append a UChar32 onto a StringBuilder.

In addition to all this work, Martin is working on a patch for mathvariant itself. The mathvariant attribute makes it easy to use the Mathematical Alphanumeric Symbols in MathML without having to type out XML entities. For instance, this:

<mi mathvariant="fraktur">HI GUYZ!</mi>

will be rendered like this:
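The rendering picks the Fraktur characters from Unicode’s Mathematical Alphanumeric Symbols block. As a rough sketch of the per-character mapping involved (the helper name and exception table below are mine, not WebKit’s): the Fraktur capitals live at U+1D504 onwards, except the five letters that were already encoded in the Letterlike Symbols block and whose Plane-1 slots are reserved holes.

```c
#include <stdint.h>

/* Map an ASCII capital letter to its Mathematical Fraktur code point.
 * A..Z map onto U+1D504..U+1D51D, except C, H, I, R and Z, which were
 * already encoded in the Letterlike Symbols block, so their Plane-1
 * slots are reserved holes. */
static uint32_t fraktur_capital(char c)
{
    switch (c) {
    case 'C': return 0x212D; /* BLACK-LETTER CAPITAL C */
    case 'H': return 0x210C; /* BLACK-LETTER CAPITAL H */
    case 'I': return 0x2111; /* BLACK-LETTER CAPITAL I */
    case 'R': return 0x211C; /* BLACK-LETTER CAPITAL R */
    case 'Z': return 0x2128; /* BLACK-LETTER CAPITAL Z */
    default:  return 0x1D504 + (uint32_t)(c - 'A');
    }
}
```

These holes are exactly why the mathvariant logic cannot be a plain offset and needs exception tables.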

Carlos García Campos cooked a patch that fixes the WebKit2 GTK+ API by re-implementing the recently removed resources API using the injected bundle. This is another effort to bring WebKit2 back into WebKitGTK+.

Dape is still pushing the Qt/WebKit2 spellcheck support with a new iteration of the patch. He also worked on the removal of the GDK dependency from ImageDiff.

Finally, I finished a first iteration of my patch for pitch preservation, and it also got landed!

Jan 13

Announcing GPhone v0.10

Hi folks!

As many of you may know, lately I have been working on Ekiga and Opal. And, as usually happens to me, I started to wonder how I would re-write that piece of software. My main ideas grew clearly: craft a GObject library, à la WebKitGTK+, wrapping Opal’s classes, paying attention to GObject Introspection; redirect Opal’s multimedia rendering to a GStreamer player; and, finally, write the application in Vala.

The curiosity itched me hard, so I started to hack on these ideas from time to time. In mid-November last year, I had a functional prototype, which could only make phone calls through a disgraceful user interface. But I got my proof of concept. Nonetheless, as I usually do, I didn’t drop the pet project, and continued the development of more features.

And today, I am pleased to present to you the release v0.1 of GPhone.

Well, I guess a screencast is mandatory nowadays:


Jan 13

Igalia WebKit week: welcome Žan and Rego!

This is a new weekly WebKit report from Igalia. And the last week has been a shaky one.

Let’s start with a warm welcome to Žan Dobersek as a collaborating student, who is working hard on a lot of cool matters: gardening the bots, cleaning up the Coverity run output, modifying WebCoreTestSupport to decrease its dependency on the WebKit2 API, digging into a stack size problem in JSC, and much more fun stuff.

Meanwhile Joanie, after a lot of restless hacking hours, managed to fix a crash in table accessibility, saving us from many other accessibility crashes. She is working to make the world safer for table users everywhere!

But we have more new faces around: our dear colleague Rego is getting his feet wet; he started finding and fixing new bugs, and he is enabling more tests in the GTK+ port.

Calvaris is continuing his efforts to enhance the user experience with the media controls for HTML5. Here is a recent screenshot of these controls:


Claudio is pushing the accelerator pedal for the snapshot API in WebKit2. With this API, applications can retrieve a snapshot from a web view, as in the Overview mode in Epiphany.

Epiphany's overview mode

Philippe is working on fullscreen video, porting the GStreamerGWorld module to GStreamer 1.0, while he is still hacking on the AudioSourceProvider for WebAudio.

And last, but not least, Dape is working really hard on the Qt spell-check support, and he has also proposed solutions for Qt WebNotification support.

And that is all for now. See you!

Jan 13

Two video streams simultaneously in Ekiga

As we talked about earlier, we added support for H.239 in Ekiga, and we were able to show the main role video and the slides role video, one stream at a time.

But now we have taken a step forward: we wanted to view both videos simultaneously. So we hacked Ekiga again and added that feature: when a second video stream arrives, another window is shown with the secondary role stream.

Here is the mandatory screencast:


Jan 13

Igalia’s weekly report on WebKit

Hi webkitters,

This weekly report project was supposed to start after the last WebKit hackfest, but my holidays got in between and now I’m recovering from them 😉

Here we go:

In summary, in these last three weeks we have had 15 commits and done 23 reviews.

Martin and Carlos have been working on the authentication mechanisms. Now they can be hooked, through the web view API, by applications, which can take control of the dialogues and credentials handling.

Martin has also been dealing with text rendering for complex layouts (such as Arabic). This effort finally led to the removal of Pango in favor of HarfBuzz.

Now let’s talk about Carlos’ baby monster: the injected bundle patch. As you know, in WebKit2 the engine has been split into two isolated processes: the UI process and the Web process. The former is in charge of the user interface, and the latter deals with HTML, CSS and JavaScript handling. While this approach adds more robustness and responsiveness, it also imposes more complexity, because an IPC mechanism is required to interact with the Web process. This is particularly hard for accessing the DOM bindings.

Carlos, since last year, has been working on his injected bundle patch, which offers a means to support loading plugins in the web process using the injected bundle. Hence, through D-Bus, an application can load a plugin to communicate, indirectly, with the Web process. This approach is supposed to be the milestone for the DOM bindings in WK2GTK, and it also provides a means to pre-fetch DNS registries. This patch was happily pushed just recently, in the second week of January.

If this was not enough, Carlos also released the development version of WebKitGTK+ v1.11.4.

Now let us go to the multimedia realm, my favorite land.

Philippe finished the port of his patch for WebAudio support to the GStreamer 1.0 backend. And now he is porting the full-screen support from Gst 0.10 to Gst 1.0 in order to reuse and share the same base code. Aligned with WebAudio, Philippe is developing a new audio source provider that gathers raw audio data from the MediaPlayer and pipes it into the AudioBus when required.

Xabier has been working to deliver nice and neat HTML5 media controls, using stylable GTK+ controls. And myself, I’m still playing with the audio pitch preservation.

Another great landmark for us is a11y, and here Joanie has been working hard to bring the accessibility tests on GTK back to a sane state. She also keeps up her efforts to give Orca access to WebKit.

In other sorts of things, Berto has been fighting against a bug in GtkLauncher, which also showed up in Epiphany when displaying only images. Meanwhile, Dape lurked on spell checking support for Qt WebKit2. And Sergio enabled the WebP image handling by default.

Sep 12

The thrilling story about the H.239 support in Ekiga

The development of Ekiga 4.0 has begun, and three months ago I started a project aimed at supporting H.239 in Ekiga.

H.239 is an ITU recommendation titled “Role management and additional media channels for H.300-series terminals”. Its purpose is to define procedures for using more than one video channel, and for labelling those channels with a role.

A traditional video-conference has an audio channel, a video channel and an optional data channel. The video channel typically carries the camera image of the participants. The H.239 recommendation defines rules and messages for establishing an additional video channel, often to transmit presentation slides, while still transmitting the video of the presenter.

For presentations in multi-point conferencing, H.239 defines token procedures to guarantee that only one endpoint in the conference sends the additional video channel which is then distributed to all conference participants.

Ekiga depends on Opal, a library that implements, among other things, the protocols used to send voice over IP networks (VoIP). And, according to the documentation, Opal had already supported H.239 for a while, so we assumed that enabling it in Ekiga would be straightforward.

After submitting a few patches to Opal and its base library, PTlib, I set up a jhbuild environment to automate the building of Ekiga and its dependencies. Soon, I realized that the task was not going to be as simple as we initially assumed: the support for H.323 in Ekiga is not as mature as the SIP support.

I have to mention that, along the development process, in order to test my scratched code, I used this H.239 sample application, a small MS Windows program which streams two video channels (the main role and the slides role). The good news is that it works fine in Wine.

Well, the truth is that the activation of the client support for H.239 in Ekiga was easy. The real problem began with the video output mechanism. Ekiga has a highly complex design for drawing video frames; too complex, in my opinion. One of the first things I did to understand the video frame display was to sketch a sequence diagram from when a decoded frame is delivered by Opal to when the frame is displayed on the screen. And here is the result:

As I said, it is too complex for me, but the second stream had to be displayed. Fearfully, I started to refactor some code inside Ekiga, adding more parameters to handle another video stream. But finally, I just stepped forward and submitted a bunch of patches that display a second stream.

The current outcome is shown in the next screencast: Screencast Ekiga H.239 (OGV / 7.8M)

Jun 12

A GStreamer Video Sink using KMS

The purpose of this blog post is to show the concepts related to GstKMSSink, a new video sink for GStreamer 1.0, co-developed by Alessandro Decina and myself during my hack-fest time in Igalia’s multimedia team.

One interesting thing to notice is that this element shows that it is possible to write DRI clients without the burden of the X Window System.

Brief introduction to graphics in Linux

If you want to dump images onto your screen, you can simply use the frame buffer device. It provides an abstraction for the graphics hardware and represents the frame buffer of the video hardware. This kernel device allows user applications to access the graphics hardware without knowing the low-level details [1].
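To make this concrete, here is a minimal sketch of the arithmetic a frame buffer client does; the struct below is a stripped-down stand-in of mine, not the kernel’s. A real program opens /dev/fb0, queries the geometry with the FBIOGET_VSCREENINFO and FBIOGET_FSCREENINFO ioctls, mmap()s the device, and then writes pixels at offsets computed like this:

```c
#include <stddef.h>
#include <stdint.h>

/* The fields a frame buffer client needs: the visible offsets and
 * depth come from fb_var_screeninfo, the scanline length (in bytes)
 * from fb_fix_screeninfo. */
struct fb_geometry {
    uint32_t xoffset;
    uint32_t yoffset;
    uint32_t bits_per_pixel;
    uint32_t line_length; /* bytes per scanline */
};

/* Byte offset of pixel (x, y) inside the mmap'ed frame buffer. */
static size_t fb_pixel_offset(const struct fb_geometry *g,
                              uint32_t x, uint32_t y)
{
    return (size_t)(y + g->yoffset) * g->line_length
         + (size_t)(x + g->xoffset) * (g->bits_per_pixel / 8);
}
```

Note that line_length can be larger than width × bytes-per-pixel because of padding, which is why the fixed-info stride must be used instead of the visible width.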

In GStreamer, we have two options for displaying images using the frame buffer device; or three, if we use OMAP3: fbvideosink, fbdevsink and gst-omapfb.

Nevertheless, since the appearance of GPUs, the frame buffer device interface has not been sufficient to fulfill all their capabilities. A new kernel interface had to emerge. And that was the Direct Rendering Manager (DRM).

What in the hell is DRM?

The DRM layer is intended to support the needs of complex graphics devices, usually containing programmable pipelines well suited to 3D graphics acceleration [2]. It deals with [3]:

  1. A DMA queue for graphics buffer transfers [4].
  2. Locks for the graphics hardware, treating it as a shared resource for simultaneous 3D applications [5].
  3. Secure hardware access, preventing clients from escalating privileges [6].

The DRM layer consists of two in-kernel drivers: a generic DRM driver, and another with specific support for the video hardware [7]. This is possible because the DRM engine is extensible, enabling the device-specific driver to hook in those functionalities required by the hardware. For example, in the case of Intel cards, the Linux kernel driver i915 supports the card and couples its capabilities to the DRM driver.

The device-specific driver, in particular, should cover two main kernel interfaces: Kernel Mode Setting (KMS) and the Graphics Execution Manager (GEM). Both elements are also exposed to user space through DRM.

With KMS, the user can ask the kernel to enable the native resolution of the frame buffer, setting a certain display resolution and colour depth mode. One of the benefits of doing this in the kernel is that, since the kernel is in complete control of the hardware, it can switch back in case of failure [8].

In order to allocate command buffers, cursor memory, scanout buffers, etc., the device-specific driver should support a memory manager, and GEM is the manager with the most acceptance these days, because of its simplicity [9].

Besides graphics memory management, GEM ensures conflict-free sharing of data between applications by managing the memory synchronization. This is important because modern graphics hardware is essentially a NUMA environment.
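As an illustration of what such an allocation involves, here is a sketch of the pitch/size arithmetic done when userspace requests a scanout buffer (for instance through the dumb-buffer ioctl): userspace supplies width, height and bits per pixel, and the driver answers with a pitch (stride) and a total size. The 64-byte alignment and the function names below are illustrative assumptions of mine; the real alignment is the driver’s choice.

```c
#include <stdint.h>

/* Round a scanline up to an illustrative 64-byte alignment,
 * as a driver might when allocating a scanout buffer via GEM. */
static uint32_t gem_pitch(uint32_t width, uint32_t bpp)
{
    uint32_t bytes = width * (bpp / 8);
    return (bytes + 63u) & ~63u; /* round up to a multiple of 64 */
}

/* Total buffer size: one aligned scanline per row. */
static uint64_t gem_size(uint32_t width, uint32_t height, uint32_t bpp)
{
    return (uint64_t)gem_pitch(width, bpp) * height;
}
```

The pitch returned by the driver, not width × bytes-per-pixel, is what a client must use to walk the buffer row by row.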

The following diagram shows the components view of the DRM layer:

Direct Rendering Infrastructure

What is the deal with KMS?

KMS is important because GEM and DRM rely on it to allocate frame buffers and to configure the display. And it is important to us because almost all of the ioctls called by the GStreamer element are part of the KMS subset.

Even more, there are some voices saying that KMS is the future replacement for the frame buffer device [10].

To carry out its duties, KMS identifies five main concepts [11,12]:

Frame buffer:
The frame buffer is just a buffer, in the video memory, that has an image encoded in it as an array of pixels. As KMS configures the ring buffer in this video memory, it holds the information of this configuration, such as width, height, color depth, bits per pixel, pixel format, and so on.
CRTC:
Stands for Cathode Ray Tube Controller. It reads the data out of the frame buffer and generates the video mode timing. The CRTC also determines what part of the frame buffer is read; e.g., when multi-head is enabled, each CRTC scans out of a different part of the video memory; in clone mode, each CRTC scans out of the same part of the memory. Hence, from the KMS perspective, the CRTC’s abstraction contains the display mode information, including resolution, depth, polarity, porch, refresh rate, etc. Also, it has the information of the buffer region to display and when to change to the next frame buffer.
Overlay planes:
Overlays are treated a little like CRTCs, but without associated modes or encoder trees hanging off of them: they can be enabled with a specific frame buffer attached at a specific location, but they don’t have to worry about mode setting, though they do need to have an associated CRTC to actually pump their pixels out [13].
Encoder:
The encoder takes the digital bitstream from the CRTC and converts it to the appropriate format across the connector to the monitor.
Connector:
The connector provides the appropriate physical plug for the monitor to connect to, such as HDMI, DVI-D, VGA, S-Video, etc.
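To make the CRTC/connector relationship concrete, here is a sketch of the mode-picking step a KMS client performs before programming a CRTC. The struct is a stripped-down stand-in for the mode information a connector reports (libdrm exposes it as drmModeModeInfo), and pick_mode is my own helper, not a libdrm call.

```c
#include <stddef.h>
#include <stdint.h>

/* Stripped-down mode description, as reported per connector. */
struct mode_info {
    uint16_t hdisplay, vdisplay; /* resolution */
    uint32_t vrefresh;           /* refresh rate, in Hz */
    int      preferred;          /* monitor's preferred mode? */
};

/* Pick the monitor's preferred mode if it advertises one,
 * otherwise fall back to the mode with the largest area. */
static const struct mode_info *pick_mode(const struct mode_info *modes,
                                         size_t count)
{
    const struct mode_info *best = NULL;
    for (size_t i = 0; i < count; i++) {
        if (modes[i].preferred)
            return &modes[i];
        if (!best || (uint32_t)modes[i].hdisplay * modes[i].vdisplay >
                     (uint32_t)best->hdisplay * best->vdisplay)
            best = &modes[i];
    }
    return best;
}
```

After picking a mode, a real client attaches a frame buffer and programs the CRTC with it (via the KMS set-CRTC ioctl), which is essentially what the sink does at start-up.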

And what about this KMSSink?

KMSSink is a first approach towards a video sink as a DRI client. For now it only works on the PandaBoard with a recent kernel (I guess 3.3 would make it).

For now it only uses custom non-tiled buffers and an overlay plane to display them. So, adding support for more hardware is on the to-do list.


[1] http://free-electrons.com/kerneldoc/latest/fb/framebuffer.txt
[2] http://free-electrons.com/kerneldoc/latest/DocBook/drm/drmIntroduction.html
[3] https://www.kernel.org/doc/readme/drivers-gpu-drm-README.drm
[4] http://dri.sourceforge.net/doc/drm_low_level.html
[5] http://dri.sourceforge.net/doc/hardware_locking_low_level.html
[6] http://dri.sourceforge.net/doc/security_low_level.html
[7] https://en.wikipedia.org/wiki/Direct_Rendering_Manager
[8] http://www.bitwiz.org.uk/s/how-dri-and-drm-work.html
[9] https://lwn.net/Articles/283798/
[10] http://phoronix.com/forums/showthread.php?23756-KMS-as-anext-gen-Framebuffer
[11] http://elinux.org/images/7/71/Elce11_dae.pdf
[12] http://www.botchco.com/agd5f/?p=51
[13] https://lwn.net/Articles/440192/

Apr 12


In the last weeks, Miguel, Calvaris and myself developed an application for the N9/N950 mobile phone, and we called it Aura.

Basically, it uses the device’s camera (either the main one or the front one) for video recording, as a normal camera application, but it also exposes a set of effects that can be applied, in real time, to the video stream.

For example, here is a video using the historical effect:

Aura is inspired by the GNOME application Cheese, and it uses many of the effects available in GNOME Video Effects.

The effects that we were able to port to the N9/N950 are: dice, edge, flip, historical, hulk, mauve, noir/blanc, optical illusion, quark, radioactive, waveform, ripple, saturation, shagadelic, kung-fu, vertigo and warp.

Besides these software effects, it is possible to simultaneously add another set of effects that the hardware is capable of, such as sepia colors. These hardware capabilities do not impose extra processing as the software effects do.

Because of the processing cost imposed by the non-hardware video effects, Aura has a fixed video resolution; otherwise the performance would make the application unusable. Also, we had a missing feature: still image capture. But, hey! there is good news: Aura is fully open source; you can check out the code at GitHub, and we happily accept patches.

Honoring Cheese, the name of Aura is taken from a kind of Finnish blue cheese.

We hope you enjoy this application as we enjoyed developing it.

Feb 12

Debian’s mutt with notmuch support

One of my weekend tasks was to reorganize my email environment. For reading email I use mutt, configured to grab the email from an IMAP server. For sending email, I have a minimal exim setup securely relaying to a smart host.

Mutt is a great email browser, but it is very bad at handling IMAP. Besides, I started to need to search through all my email. The solution for the first problem is offlineimap, a program written in Python that “synchronizes emails between two repositories”. It downloads my email from the IMAP server onto my laptop, so I work on my email locally, and if I delete an email locally, offlineimap will delete it in the next sync operation.

The solution for the second problem, the search, is notmuch, an email indexer that enables fast searches over a vast mail collection. So, once new mail arrives (or is deleted) with offlineimap, notmuch (de)indexes it.
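For illustration, a minimal offlineimap configuration that re-indexes with notmuch after every sync might look like this; the account name, folders and server below are placeholders, not my actual setup:

```ini
# ~/.offlineimaprc -- account, paths and server are placeholders
[general]
accounts = Personal

[Account Personal]
localrepository = Local
remoterepository = Remote
# re-index the maildir with notmuch after every sync
postsynchook = notmuch new

[Repository Local]
type = Maildir
localfolders = ~/Mail

[Repository Remote]
type = IMAP
remotehost = imap.example.org
remoteuser = user
ssl = yes
```

The postsynchook line is what glues the two tools together: every sync ends with a `notmuch new` run over the local maildir.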

But another problem appeared: how to query notmuch in a way integrated with my mail reader? One solution is provided by mutt-kz, a fork of mutt with notmuch support tightly integrated.

But I use Debian, and I like its package management. So I needed to craft a Debian package for mutt-kz.

I grabbed the Debian’s repository for mutt and re-based, one by one, the patches from mutt-kz.

The result is stored in this repository.

And now, I can query notmuch in mutt and immediately browse the result set.