05
May 13

GPhone v0.2

It has been almost five months since I published the first release of GPhone. Though I haven’t abandoned it, I work on it slowly but steadily. And to prove it, I’m releasing a new version, v0.2.

What does it include? Mainly bug fixes. The new features are still a work in progress, but you can browse the next branch in git to get a glimpse of them.

Change Log:

  • Use GSimpleAsyncResult with GLib <= 2.35.
  • Added Vala (0.16) backward compatibility, including a gstreamer-1.0.vapi.
  • Added a script to build and update a JHBuild environment: update-jhbuild-env. It handles GPhone’s dependencies.
  • Notify the user if their NAT type is not compatible with a VoIP session.
  • Renamed accounts to registrars.
  • Handle network signaling.
  • Created a global header file.
  • Many bug fixes.



08
Apr 13

GStreamer Hackfest 2013 – Milan

Last week, from the 28th to the 31st of March, some of us gathered in Milan to hack on some bits of the GStreamer internals. For me it was a great experience to interact with great hackers such as Sebastian Dröge, Wim Taymans, Edward Hervey, Alessandro Decina and many more. We talked about GStreamer and, more particularly, we agreed on new features which I would like to discuss here.

GStreamer Hackers at Milan

For the sake of completeness, let me say that I have been interested in hardware-accelerated multimedia for a while, and just lately I started to get my feet wet with VA-API and VDPAU, and their support in our beloved GStreamer.

GstContext

The first feature that reached upstream is GstContext. Historically, in 2011, Nicolas Dufresne added GstVideoContext as an interface to share a video context (such as a display name, X11 display, VA-API display, etc.) among the pipeline elements and the application. But now Sebastian has generalized the interface into a container that stores and shares any kind of context between multiple elements and the application.

The first approach, which still lives in gst-plugins-bad, was merely a wrapper around a custom query to set or request a video context. But now context sharing is part of the pipeline setup.

An element that needs a shared context must follow these steps:

  1. Check if the element already has a context.
  2. Query downstream for the context.
  3. Post a message on the bus to ask the application whether it has one to share.
  4. If there is still none, create the context, post a message and send an event letting the other elements know that this element has the context.
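In GStreamer 1.x API terms, the steps above map to a query/message dance roughly like the following sketch. The context type string and the helper name are made up for illustration; the real thing is in eglglessink:

```c
#include <gst/gst.h>

/* Sketch of how an element obtains a shared context.
 * "gst.example.Display" is a hypothetical context type. */
static void
ensure_context (GstElement * element, GstPad * pad, GstContext ** ctx)
{
  /* 1. nothing to do if we already have a context */
  if (*ctx != NULL)
    return;

  /* 2. query downstream for an existing context */
  GstQuery *query = gst_query_new_context ("gst.example.Display");
  if (gst_pad_peer_query (pad, query)) {
    GstContext *found = NULL;
    gst_query_parse_context (query, &found);
    if (found != NULL)
      *ctx = gst_context_ref (found);
  }
  gst_query_unref (query);

  /* 3. ask the application through the bus; it may answer by
   * calling gst_element_set_context() on us */
  if (*ctx == NULL) {
    gst_element_post_message (element,
        gst_message_new_need_context (GST_OBJECT (element),
            "gst.example.Display"));
  }

  /* 4. nobody had it: create it ourselves and announce it */
  if (*ctx == NULL) {
    *ctx = gst_context_new ("gst.example.Display", TRUE);
    /* ... fill the context structure with the display handle ... */
    gst_element_post_message (element,
        gst_message_new_have_context (GST_OBJECT (element),
            gst_context_ref (*ctx)));
  }
}
```

Note that step 3 is asynchronous in the general case: the application reacts to the need-context message and hands the context back through the element’s set_context vfunc.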

You can look at eglglessink for an example of how to use this feature.

GstVideoGLTextureUploadMeta

Also in 2011, Nicolas Dufresne added a helper class to upload a buffer into a surface (OpenGL texture, VA-API surface, Wayland surface, etc.). This is quite important since the new video players are scene-based, using frameworks such as Clutter or OpenGL directly, where the video display is composed of various actors, such as the multimedia control widgets.

But still, this interface didn’t fit well in GStreamer 1.0, until now, when it was reintroduced in the form of a buffer meta, though this meta is specific to OpenGL textures. If the buffer provides the new GstVideoGLTextureUploadMeta meta, a new function, gst_video_gl_texture_upload_meta_upload(), is available to upload that buffer into an OpenGL texture specified by its numeric identifier.

Obviously, in order to use this meta, it should be proposed for allocation by the sink. Again, you can look at eglglessink as an example.
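A minimal sketch of the consuming side, assuming the buffer arrived with the meta attached and a GL context is current (the helper and texture id are placeholders, not the eglglessink code):

```c
#include <gst/gst.h>
#include <gst/video/gstvideometa.h>

/* Sketch: a GL-based sink uploading a buffer through the new meta.
 * Returns FALSE when the meta is absent, so the caller can fall
 * back to a manual upload path. */
static gboolean
upload_to_texture (GstBuffer * buffer, guint texture)
{
  GstVideoGLTextureUploadMeta *meta;
  guint texture_ids[4] = { texture, 0, 0, 0 };

  meta = gst_buffer_get_video_gl_texture_upload_meta (buffer);
  if (meta == NULL)
    return FALSE;

  /* must be called with the GL context current */
  return gst_video_gl_texture_upload_meta_upload (meta, texture_ids);
}
```

On the allocation side, the sink advertises support for the meta in its propose_allocation vfunc with gst_query_add_allocation_meta() so upstream elements know they may attach it.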

Caps Features

Caps features are a new data type for specifying an extension or requirement for the handled media.

From the practical point of view, we can say that caps structures with the same name but non-equal sets of caps features are not compatible and, if a pad supports multiple sets of features, it has to add multiple otherwise-equal structures with different feature sets to its caps.

An empty GstCapsFeatures is equivalent to the features of plain system memory. Other examples would be a specific memory type or the requirement of having a specific meta on the buffer.
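To make that concrete, here is a sketch of how a sink could build caps with an EGLImage memory feature next to plain system-memory caps (a simplified, hypothetical version of what a real sink would do):

```c
#include <gst/gst.h>

/* Sketch: two structures with the same name ("video/x-raw") but
 * different feature sets are NOT compatible with each other. */
static GstCaps *
build_sink_caps (void)
{
  GstCaps *caps;

  /* structure with a specific memory-type feature */
  caps = gst_caps_new_simple ("video/x-raw",
      "format", G_TYPE_STRING, "RGBA", NULL);
  gst_caps_set_features (caps, 0,
      gst_caps_features_new ("memory:EGLImage", NULL));

  /* same structure with default (empty) features,
   * i.e. plain system memory */
  gst_caps_append (caps,
      gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "RGBA", NULL));

  return caps;
}
```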

Again, we can see eglglessink as an example of caps features, since gst-inspect now also shows the caps features of the pads:

Pad Templates:
  SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/x-raw(memory:EGLImage)
                 format: { RGBA, BGRA, ARGB, ABGR, RGBx,
                           BGRx, xRGB, xBGR, AYUV, Y444,
                           I420, YV12, NV12, NV21, Y42B,
                           Y41B, RGB, BGR, RGB16 }
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
      video/x-raw(meta:GstVideoGLTextureUploadMeta)
                 format: { RGBA, BGRA, ARGB, ABGR, RGBx,
                           BGRx, xRGB, xBGR, AYUV, Y444,
                           I420, YV12, NV12, NV21, Y42B,
                           Y41B, RGB, BGR, RGB16 }
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
      video/x-raw
                 format: { RGBA, BGRA, ARGB, ABGR, RGBx,
                           BGRx, xRGB, xBGR, AYUV, Y444,
                           I420, YV12, NV12, NV21, Y42B,
                           Y41B, RGB, BGR, RGB16 }
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]

Parsers meta

This is a feature which has been merged by Edward Hervey. The idea is that the video codec parsers (H.264, MPEG, VC1) attach a meta to the buffer with a defined structure that carries the new information extracted from the encoded stream.

This is particularly useful for decoders, which will not have to parse the buffer again in order to extract the information they need to decode the current buffer and the following ones.

For example, here is the H264 parser meta definition.
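As a rough sketch of how a decoder would consume such a meta — the meta type name here is purely illustrative; the real type and fields come from the parser meta definition linked above:

```c
#include <gst/gst.h>

/* Illustrative only: "GstExampleParserMetaAPI" stands in for the
 * actual parser meta API type registered by the codec parser. */
static void
decode_buffer (GstBuffer * buffer)
{
  GstMeta *meta = gst_buffer_get_meta (buffer,
      g_type_from_name ("GstExampleParserMetaAPI"));

  if (meta != NULL) {
    /* the parser already extracted the headers, slice info, etc.;
     * reuse them instead of re-parsing the bitstream */
  } else {
    /* no meta attached: parse the buffer ourselves as before */
  }
}
```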

VDPAU

Another task merged by Edward Hervey, and one I feel excited about, is the port of the VDPAU decoding elements to GStreamer 1.0.

Right now only the MPEG decoder is upstreamed, but MPEG4 and H264 are coming.

As a final note, I want to thank Collabora and Fluendo for sponsoring dinners. A special thank you, as well, to Igalia, which covered my travel expenses and attendance to the hackfest.


26
Mar 13

GStreamer Hackfest 2013

Next Thursday I’ll be flying to Milan to attend the 2013 edition of the GStreamer Hackfest. My main interests are hardware codecs and GL integration, particularly VA API integrated with GL-based sinks.

Thanks Igalia for sponsoring my trip!

Igalia


04
Mar 13

Igalia (three) week in WebKit: Media controls, Notifications, and Clang build

Hi all,

Three weeks have passed since I wrote the last WebKit report, and they went by so quickly that it scares me. Many great things have happened since then.

Let’s start with my favorite area: multimedia. Phil landed a patch that avoids muting the sound if the audio pitch is preserved. And Calvaris finally landed his great new media controls. Now watching videos in WebKitGTK+ is a pleasure.

Rego keeps up his work on adding more tests to WebKitGTK+, and he also wrote a fine document on how to build Epiphany with WK2 from git/svn.

Claudio, besides his work on the snapshots API that we already mentioned, retook the implementation of the notifications API for WebKitGTK+. And, while implementing it, he fixed some crashers in WK2’s implementation. He has also given us an early screencast with the status of the notifications implementation: check it out! (video).

Carlos García Campos, besides working hard on the stable and development releases of the WebKitGTK+ library, has also landed a couple of fixes. Meanwhile, Dape removed some dependencies, making the code base cleaner.

Žan, tirelessly, has been poking all around the WebKit project, keeping the GTK port kicking and healthy: he fixed code, cleaned up scripts and autogenerated code, enhanced utility scripts, and enabled more tests in the development builds. But his most impressive work in progress is enabling the Clang build of the GTK port.

But there’s more! Žan set up a web page where you can visualize the status of WebKit2 on several ports: http://www.iswk2buildbrokenyet.com/


01
Feb 13

Igalia week in WebKit: Fixes, gardening, resources API and MathML

It is time for another weekly report on what is going on in WebKit and Igalia.

Sergio reported and helped to debug different crashes in WebKit accessibility. He improved the robustness of build-webkit. Also, he fixed a bug in the defined network buffer size, simplified a bit the code that closes the network connections, and fixed a bug in libsoup behind a crash in WebKit.

And this week we have a new face too, at least in these reports: Adrián. He enabled the Opus codec in the MIME type list. After that, he decided to give native compilation on an armv5tel board a try. Crazy? A bit, but fun too.

Rego continues his hard work on enabling tests for WebKitGTK+, like testing the text direction setting, unflagging blocked tests that already pass, and many others that are still work in progress. Also, he got his patch landed for the bug found in GtkLauncher.

Claudio, whilst he waits for the review of his API to retrieve a snapshot, retook the notifications API work for WebKitGTK+, particularly for WebKit2. Also, and just for the sake of landing something, he fixed a couple of minor compilation glitches.

Philippe is still kicking the fullscreen support in WebKit using GStreamer. And, for relaxing purposes, he updated his patch for AudioSourceProvider.

Žan works tirelessly keeping the bots working as well as possible: disabling the build of WK2 on the EWSs while the WK2 storm calms down, cleaning up build configuration options, removing deprecated methods from unit tests, enhancing the webkit-patch script and much more. Besides this needed work, he also started to hack on completing the WTFURL implementation for WebCore’s KURL interface. WTFURL is pretty much what WebKit (as a project) would like to use in the future. WTFURL is based on the GoogleURL code (which is what the Chromium port is using at the moment as the backend of KURL).

The great WebKit hacker Martin Robinson is exploring uncharted territory in the project: he’s trying to get away from port-specific things by scratching at the core stuff, but serendipity showed up, and he found some pretty serious port-specific bugs that have relatively straightforward fixes.

Martin started to work on MathML and noticed that WebKit’s MathML support for the mathvariant attribute is unfinished. This issue led him to a set of patches to overcome the situation, like fixing the FreeType usage, or adding the logic to append a UChar32 onto a StringBuilder.

In addition to all this work, Martin is working on a patch for mathvariant itself. The mathvariant attribute allows easily using the Mathematical Alphanumeric Symbols in MathML without having to type out XML entities. For instance this:

<mi mathvariant="fraktur">HI GUYZ!</mi>

will be rendered like this:

Carlos García Campos cooked a patch for fixing the WebKit2 GTK+ API by implementing the resources API, removed recently, using injected bundle. This is another effort to bring back WebKit2 into WebKitGTK+.

Dape is still pushing the Qt/WebKit2 spellcheck support with a new iteration of the patch. He also worked on the removal of the GDK dependency from ImageDiff.

Finally, I finished a first iteration of my patch for pitch preservation, and it also got landed!


28
Jan 13

Announcing GPhone v0.1

Hi folks!

As many of you may know, lately I have been working on Ekiga and Opal. And, as usually happens to me, I started to wonder how I would rewrite that piece of software. My main ideas grew clear: craft a GObject library, à la WebKitGTK+, wrapping Opal’s classes, paying attention to GObject introspection; redirect Opal’s multimedia rendering to a GStreamer player; and, finally, write the application in Vala.

The curiosity itched me hard, so I started to hack on these ideas from time to time. In mid-November last year, I had a functional prototype, which could only make phone calls through a disgraceful user interface. But I got my proof of concept. Nonetheless, as I usually do, I didn’t drop the pet project, and continued developing more features.

And today, I am pleased to present to you the v0.1 release of GPhone.

Well, I guess a screencast is mandatory nowadays:



24
Jan 13

Igalia WebKit week: welcome Žan and Rego!

This is a new weekly Igalia WebKit report. And the last week has been a shaky one.

Let’s start with a warm welcome to Žan Dobersek as a collaborating student, who is working hard on a lot of cool matters: gardening the bots, cleaning up the Coverity run output, modifying WebCoreTestSupport to decrease its dependency on the WebKit2 API, digging into a stack size problem in JSC, and much more fun stuff.

Meanwhile Joanie, after a lot of restless hacking hours, fixed a crash in table accessibility, saving us from many other accessibility crashes. She is working to make the world safer for table users everywhere!

But we have more new faces around: our dear colleague Rego is getting his feet wet; he started finding and fixing new bugs and he is enabling more tests in the GTK+ port.

Calvaris is continuing his efforts to enhance the user experience of the HTML5 media controls. Here is a recent screenshot of these controls:

https://bug-83869-attachments.webkit.org/attachment.cgi?id=183575

Claudio is pushing the accelerator pedal for the snapshot API in WebKit2. With this API, applications can retrieve a snapshot from a web view, as in the overview mode in Epiphany.

Epiphany's overview mode

Philippe is working on fullscreen video, porting the GStreamerGWorld module to GStreamer 1.0, while he is still hacking on the AudioSourceProvider for WebAudio.

And last but not least, Dape is working really hard on the Qt spell-check support, and he also proposed solutions for Qt WebNotification support.

And that is all for now. See you!


21
Jan 13

Two video streams simultaneously in Ekiga

As we talked about earlier, we added support for H.239 in Ekiga, and we were able to show the main role video and the slides role video, one stream at a time.

But now we took a step forward: we wanted to view both videos simultaneously. So we hacked Ekiga again and added that feature: when a second video stream arrives, another window is shown with the secondary role stream.

Here is the mandatory screencast:



17
Jan 13

Igalia’s weekly report on WebKit

Hi webkitters,

This weekly report project was supposed to start after the last WebKit hackfest, but my holidays got in between and now I’m recovering from them 😉

Here we go:

In summary, in these last three weeks we have had 15 commits and done 23 reviews.

Martin and Carlos have been working on the authentication mechanisms. Now they can be hooked, through the web view API, by applications, which can take control of the dialogues and credentials handling.

Martin has also been dealing with text rendering for complex layouts (such as Arabic). This effort finally led to the removal of Pango in favor of HarfBuzz.

Now let’s talk about Carlos’ baby monster: the injected bundle patch. As you know, in WebKit2 the engine has been split into two isolated processes: the UI and Web processes. The former is in charge of the user interface, and the latter deals with HTML, CSS and JavaScript handling. While this approach adds more robustness and responsiveness, it also imposes more complexity, because an IPC mechanism is required to interact with the Web process. This is particularly hard for accessing the DOM bindings.

Carlos, since last year, has been working on his injected bundle patch, which offers a means to load plugins in the web process using an injected bundle. Hence, through DBus, an application can load a plugin to communicate, indirectly, with the Web process. This approach is supposed to be the milestone for the DOM bindings in WK2GTK, and it also provides a means to pre-fetch DNS entries. This patch was happily pushed just recently, in the second week of January.

If this was not enough, Carlos also released the development version of WebKitGTK+ v1.11.4.

Now let us go to the multimedia realm, my favorite land.

Philippe finished the port of his patch for WebAudio support to the GStreamer 1.0 backend. And now he is porting the fullscreen support from GStreamer 0.10 to 1.0 in order to reuse and share the same base code. Aligned with WebAudio, Philippe is developing a new audio source provider that gathers raw audio data from the MediaPlayer and pipes it into the AudioBus when required.

Xabier has been working to deliver nice and neat HTML5 media controls, using stylable GTK+ controls. And myself, I’m still playing with audio pitch preservation.

Another great landmark for us is a11y, and here Joanie has been working hard to bring the accessibility tests on GTK back to a sane state. She also keeps up her efforts to enable access to WebKit for Orca.

In other sorts of things, Berto has been fighting against a bug in GtkLauncher, which showed up in Epiphany too when displaying only images. Meanwhile, Dape lurked on spell-checking support for Qt WebKit2. And Sergio enabled WebP image handling by default.


25
Sep 12

The thrilling story about the H.239 support in Ekiga

The development of Ekiga 4.0 has begun, and three months ago I started a project aimed to support H.239 in Ekiga.

H.239 is an ITU recommendation titled “Role management and additional media channels for H.300-series terminals”. Its purpose is the definition of procedures for using more than one video channel, and for labelling those channels with a role.

A traditional video-conference has an audio channel, a video channel and an optional data channel. The video channel typically carries the camera image of the participants. The H.239 recommendation defines rules and messages for establishing an additional video channel, often to transmit presentation slides, while still transmitting the video of the presenter.

For presentations in multi-point conferencing, H.239 defines token procedures to guarantee that only one endpoint in the conference sends the additional video channel which is then distributed to all conference participants.

Ekiga depends on Opal, a library that implements, among other things, the protocols used to send voice over IP networks (VoIP). And, according to the documentation, Opal had already supported H.239 for a while, so we assumed that enabling it in Ekiga would be straightforward.

After submitting a few patches to Opal and its base library, PTLib, I set up a JHBuild environment to automate the building of Ekiga and its dependencies. Soon I realized that the task was not going to be as simple as we initially assumed: the support for H.323 in Ekiga is not as mature as the SIP support.

I have to mention that, along the development process, in order to test my scratch code, I used this H.239 application sample, a small MS Windows program which streams two video channels (the main role and the slides role). The good news is that it works fine in Wine.

Well, the truth is that the activation of the client support for H.239 in Ekiga was easy. The real problem began with the video output mechanism. Ekiga has a highly complex design for drawing video frames. Too complex, in my opinion. One of the first things I did to understand the video frame display was to sketch a sequence diagram from when a decoded frame is delivered by Opal to when the frame is displayed on the screen. And here is the result:

As I said, it is too complex for me, but the second stream had to be displayed. Fearfully, I started to refactor some code inside Ekiga, adding more parameters to handle another video stream. But finally I just stepped forward and submitted a bunch of patches that display a second stream.

The current outcome is shown in the next screencast: Screencast Ekiga H.239 (OGV / 7.8M)