Streams API in WebKit at the Web Engines Hackfest

Yes, I know, I should have written this post before, you know, blah, blah, excuse 1, blah, excuse 2, etc. 😉

First of course I would like to thank Igalia for allowing me to use company time to attend the hackfest and to meet such a group of amazing programmers! It was quite intense and I tried to give my best, though for different reasons (coordination, personal and so on) I missed some sessions.

My purpose at the hackfest was to work with Youenn Fablet from Canon on implementing the Streams API in WebKit. When we began to work together in November, Youenn already had a working prototype with some tests, so the idea was to take that, complete it, polish it and ship it. Easy, huh? Not so…

What is Streams? As you can read in the spec, the idea is to create a way of handling different kinds of streams with a common high level API. Those streams can be a mapping of low level I/O system operations or can be easily created from JavaScript.

Fancy things you can do:

  • Create readable/writable streams mapping different operations
  • Read/write data from/to the streams
  • Pipe data between different streams
  • Handle backpressure (controlling the data flow) automagically
  • Handle chunks as the web application sees fit, including different data types
  • Implement custom loaders to feed different HTML tags (images, multimedia, etc.)
  • Map some existing APIs to Streams. XMLHttpRequest would be a wonderful first step.

The first thing we did after the prototype was to define a roadmap:

  • General ReadableStream that you can create in JavaScript and read from
  • XMLHttpRequest integration
  • Loaders for some HTML tags
  • WritableStream
  • Piping operations

As you can see in Bugzilla, we are close to finishing the first point, which took quite a lot of effort because it required:

  • Code cleaning
  • Making it build in debug
  • Improving the tests
  • Writing the promises based constructor
  • Fixing a lot of bugs

Of course we didn’t do all this at the hackfest; only Chuck Norris would have been able to do that. The hackfest provided the opportunity of meeting Youenn in person, working side by side and discussing different problems and possible strategies to solve them, like, for example, error management, queuing chunks and handling their size, etc., which are not trivial given the complexity created by the flexibility of the API.

After the hackfest we continued working and, as I said before, you can find the result in Bugzilla. We hope to be able to land this soon and continue working on the topic within the current roadmap.

To close the topic of the hackfest, it was a pleasure to work with such an amount of awesome web engine hackers, and I would like to finish by thanking the sponsors, Collabora and Adobe, and especially my employer, Igalia, which was both sponsor and host.

New media controls in WebKitGTK+ (reloaded)

In December we organized, as usual, the WebKitGTK+ hackfest at the Igalia premises in A Coruña, and also as usual it was an awesome opportunity to meet the rest of the team. For more information about the progress made at the hackfest, you can have a look at KaL’s post.

As part of the hackfest I decided to take on a task that would take some time, so that I could focus, and I decided to go for rewriting the WebKitGTK+ multimedia controls once again. People who just read this post will wonder why I say “again”; the reason is that last year we completely redesigned the multimedia controls, which use GStreamer for playback underneath. This time I have not redesigned them (well, a bit) but rewritten them in JavaScript, as the Apple guys had done before.

To get the job done, the first step was bundling the JavaScript code and activating the code path to use those controls. I used the Apple controls as a template, so you can imagine that the first result was a non-working monster that at some point resembled the Safari multimedia controls. At that point I could do two things: fork or inherit. I decided to go with inheritance because it keeps the spirit of WebKit (and almost all Free Software projects) of sharing as much code as possible, and because forking later is easier than merging. Then, step by step, I kept redefining JavaScript methods and tweaking some stuff in the C++ and CSS code to recreate the user experience that we had so far.

Some of the non-aesthetic changes are the following:

  • Focus rings are now managed from CSS instead of C++.
  • Tests got new fixes, rebaselines and more love.
  • CMake support for the new controls.
  • Load the captions icon from the theme.
  • Loading and hiding of elements is now handled with CSS (and JavaScript).

The captions icon problem was interesting because I found out that the one we were using was “user-invisible-symbolic” and it was hardcoded directly in the CSS code. I changed it to be loaded from the theme, but that raised the issue of using the incorrect metaphor, though the current icon looks nice for captions. I filed a GNOME bug (and another WebKit bug to follow this up) so that a new icon can be created for captions/subtitles with the correct metaphor.

And what are the aesthetic changes to the controls?

  • Show a very subtle gradient when the elements are focused or active to improve the accessibility support (which won’t be complete until bug 117857 is fixed).
  • Volume slider rolls up and down with a nice animation.
  • Some other elements are not shown when they are not needed.
  • Captions menu shows up with both click and mouse hover for coherence with the volume slider.
  • Captions menu is also animated the same way as the volume slider.
  • Captions menu was properly centered.
  • Captions menu style was changed to make it more similar to the rest of the controls (fonts, margins…).
  • Volume slider shows below the media element when it is too close to the top of the page and cannot be shown above it. This was a regression that I introduced with the first rewrite, happy to have it fixed now.

As I already said, the aesthetic differences with the former C++ controls are not a big deal unless you compare them with the original controls:

Starting point

To appreciate the new controls I cannot just show a screenshot, because the nicest thing is the animations. Therefore a video is needed (and if you have WebKit compiled you can experience them yourself):

Of course, I thank our hackfest sponsors, as it was possible because of them:

Igalia GNOME Foundation

GUADEC 2013 is over

I departed from A Coruña, flying to Barcelona (where I had time to have a coffee with a friend), then to Vienna and then to Brno by bus. It was a bit tiring, so after having a quick drink at our hotel with the rest of the Igalians, I just went to bed.

Web, the future is now by my Igalian friend Claudio Saavedra (different from Garnacho) was my first talk (apart from the keynote). Even though I already knew the content, I found the way Claudio spoke about the latest changes in the development of Web (a.k.a. Epiphany) quite interesting, especially the way WebKit 2 helps to provide a much better user experience.

I liked Alex Larsson’s talk on High resolution display support in GNOME a lot. His approach of abstract pixels, going all the way down the stack from GTK+, was very interesting.

Matt Dalio and his Endless Mobile project just rock, not only because of the tech and GNOME involved but also because of its social implications. Keep going!

Given the rest of my career and my work in WebKit, especially in the WebKitGTK+ port, I am always interested in GStreamer, as it is the framework we use for multimedia playback (as do other ports, like EFL and Qt on Linux). The What’s cooking in GStreamer talk by Sebastian Dröge and Tim-Philipp Müller was of course mandatory and worth attending.

Jan-Christoph Borchardt’s talk about GNOME and ownCloud: desktop plus web for a holistic experience showed everything that is in ownCloud and everything that is coming. I introduced him to Claudio as he mentioned that he would like to see at least Ephy’s bookmarks synchronized with ownCloud. It looks interesting.

I missed Philip Withnall’s talk about Testing online services because it was at the same time as the ownCloud one, but I was interested because of my wife’s research project on the same subject. Philip, if you read this, leave me a comment or get in touch directly with her, as she was very interested.

It is always refreshing to attend Marina Zhurakhinskaya’s talk about Outreach Program for Women: a lesson in collaboration, because I think it is very important to keep reminding ourselves of the relevance of programs to integrate more women into our community. I think we, and especially Marina and the OPW program, always get awesome results, though we cannot fall asleep and must keep pushing until we reach, at least, 50%.

My Igalian friend Juan Suárez had a talk called Writing multimedia applications with Grilo where he showed the easy but powerful possibilities of the Grilo API and its plugins. There was also a lightning talk by an intern about adding support to build Lua plugins for Grilo.

Juan Pablo Ugarte was showing off his Glade skills in his talk about Rich custom user interfaces with Glade and CSS. His slides were made with Glade. Cool!

More secure with less “security” by Stef Walter had a lot of interesting ideas about how to improve security while at the same time making the user’s life easier.

Just after lunch it was the turn of my Igalian friend Martin Robinson, who gave the talk about WebKit2 and you and explained many things about the WebKit 2 model, like, for example, all its layers and how the processes work.

There was another talk given by other fellow Igalians, Alejandro Piñeiro and Joanmarie Diggs, called Tag, your PDF is it, which I couldn’t attend. In that case I preferred to prioritize other talks, as I can always easily talk to them.

The rest of the talks I attended were:

Of course it would not be a GUADEC without hanging out with friends at the different parties, one of them sponsored by Igalia and Red Hat.

I really enjoyed the tour of the city, which is very beautiful and peaceful, though we had to finish it prematurely because of the impressive storm. On our way back to our hotel from our unofficial GNOME Hispano dinner, which of course was great, we could see some broken tree branches and even what we think were some fallen young trees.

The venue was really a nice place (with funny chairs) and the volunteers did an awesome job (THANKS!). Of course, there are always issues, like the AC problem (these things can always happen) and the internet connection, which is an inherent problem of almost every GUADEC.

Wonderful work also by Ana, who took a lot of cool pictures, even of me, which you can see in her Flickr collections.

Of course, I cannot forget to thank Igalia for sponsoring my trip.

Strasbourg, there we go!

New media controls in WebKitGtk+

So it looks like my patch for the rework of the WebKitGtk+ media controls has finally landed.

First I would like to thank Igalia for giving me some time to complete this task, which took some work and began at the WebKitGtk+ hackfest some time ago with Žan Doberšek and Jon McCann.

The starting point was:

Starting point

As you can see, the controls look like an old Gtk+ application without any theming. Jon suggested that we could begin by mimicking the Chromium controls, as they look closer to a modern themed GNOME application, and adapt them to use the GNOME symbolic icons, keeping some other stuff like the volume bar, but of course making it look nicer.

What was done:

  • Adding the GNOME symbolic icon theme and a method to replace the normal stock icons, though we keep them as fallback.
  • Deep adaptation of Chromium CSS and C++ code to make it suit the GNOME requirements.
  • Some buttons were dropped from the design, like seeking backwards and forward.
  • Aligned the elements with the pixel ruler to make them as close to perfect as possible in all conditions (as some buttons are hidden in certain situations, like fullscreen, volume…).
  • Fixed a bug about the buffering ranges that was in trunk at that point, but was independent of the code I was cooking.
  • Removed as much of the C++ code as possible to delegate the drawing to CSS, which is more maintainable for design purposes. The only things that are still painted with C++ code are the slider tracks, which depend on parameters that cannot be specified in CSS, like the buffering ranges and the volume (the latter was not painted that way before, but I introduced it for design coherence).
  • Removed the focus ring which was making the controls uglier.
  • Removed the dead code.
  • New baselines for the tests, including the pixel ones. Also flagged some tests that are not (and will not be) working in Chromium either.

I had a small issue with a Chromium guy landing a patch that forced me to change the display of some components from -webkit-box to -webkit-flex and, of course, to rebase all related tests. This created a small delay in landing the patch, but it finally landed as 143463.

And the result is the following:
New media controls

I don’t know about you guys, but I like it!

GUADEC 2012: state of calvaris

GUADEC 2012 A Coruña

Executive summary

Tired as hell but with my hacking batteries completely full.

Organization

There was always something to do here and there that I needed to help with, but I think the most important things I did were:

  • Organizing the Lightning Talks session. I guess you all remember the sound of the sea and the seagulls that I chose to indicate to people that they were running out of time.
  • I helped Marina and Christophe in organizing the Interns’ Lightning Talks, which you can also guess, because Marina liked the sea sound and decided to use it for their session too.
  • I helped Juanjo in organizing the BoFs. He planned them and I managed them in Indico.
  • Random stuff
  • Keeping hackers happy. Having Zeenix at my place implied hanging out almost every day with the hackers for dinner and beers, so I needed to do my best to keep them properly fed and with the deserved ‘party level’. I took some people to do some sightseeing and to several restaurants like A Roda, O Galiñeiro… I even remember going to a place with live music at Os Maios, close to a Pulpeira de Arzúa.

The only bad thing I can think of is that I missed some talks because of having a lot of things to do.

Interesting talks

I attended all the talks I could and just tried to avoid the ones by Igalians because for obvious reasons I already knew what they were about, so I’ll mention the ones that I liked most:

  • Jacob Appelbaum’s keynote: Personal data exposure is underestimated by most people but it is something very important that we should care about. It would be interesting to have Tor integrated with GNOME.
  • Every detail matters by Allan: Suddenly you realize that you are in a better mood when using your GNOME 3, and the reason is that there are some guys focusing on fixing the small bugs that were annoying you without you noticing.
  • MinGW-w64 by Marc-André: Quite interesting talk about tools and recipes to cross-compile your programs for the Evil. This is quite important to help us with the task of showing that the GNOME world is as multiplatform as other options (you would be a fool if you think that you can have everything working like a charm on the Evil just out of the box when using some frameworks that claim to be multiplatform).
  • i18n by Gil: Random thought: no native English speaking developer was at that talk. We really need to improve this; then we can talk about tools and other stuff. Though I did not attend the BoF, I talked to Fran Diéguez and my wife Laura about some stuff the students of the Universidade da Coruña could help with to improve translators’ lives, maybe combined with GSoC or OPW.
  • GStreamer talk by Tim Müller: Can’t ever miss it. 1.0 is almost there and it shines!
  • PiTiVi by Jeff: Constant lol. Cannot wait to see the videos uploaded. Really enjoyed it not only because of it being fun, but because of all the new features coming to our favorite video editor.
  • Defensive publications: Patents suck in all senses, and even more in FLOSS. These guys should be keynoters at all FLOSS events all over the world, because something I learned lately is that people can do things when they unite. If we help with bringing things to public knowledge before they are patented by others (because patent office workers are far from perfect, considering a model that I do not agree with anyway), we can do something. I recommend you have a look at the Open Innovation Network.
  • History of GNOME: it is always interesting to look back to see how much we have walked.

Fixed bugs

  • After speaking to Matthias Clasen, I got permission to apply the patches I had submitted for GB#613595. This was quite an old bug that I had worked on since the times of Hildon in Fremantle. Just after pushing I realized that I had caused a small regression, but fortunately I fixed it before it had consequences.
  • Bastien closed FDB#49945 about GeoClue by pushing my patch.
  • Edward promised to have a look at the patches I submitted for GB#663869. This is not closed yet, but I hope he does not forget 😉

Thanks

And of course, I need to thank Igalia for sponsoring my attendance at and help with GUADEC:

Igalia

GUADEC 2012

Wow, something I thought I would never see: GUADEC is happening less than 3 km from home in less than a week!

GUADEC 2012 A Coruña

I have collaborated in organizing the Lightning Talks and BoFs/Hackfests and of course, I’ll help wherever I can. Poor Zeenix will be at my place so he will need to wake up early to come with me and my wife Laura to the faculty.

I also want to thank Igalia for sponsoring me with time to help organize and attend the event, and I might also be helping with the Advisory Board meeting that will happen at the Igalia headquarters on the 25th.

Aura made me Qt Ambassador

As I already said the other day on Twitter, I became a Qt Ambassador because of Aura. The only problem is that it is a project-person program, meaning that it is granted to a person for having worked on a project. Aura was a project developed by three Igalians, Miguel, Víctor and me, and I consider it a bit unfair that it was granted only to me, because they deserve it as much as I do.

The procedure I followed was:

  • Applying with Aura
  • When that was accepted, I submitted the Aura project page.
  • After the publication I was told that I was going to receive the Qt Ambassador Merchandise.

Does anybody know if more people can become ambassadors for the same project and how?

Sending emails in Harmattan with Qt/QML

In the context of a personal ad-hoc app (I will come to that later) that I wrote for my Nokia N9, I needed to send an email to a specific person with an attachment. After a first look at the Harmattan APIs you come to QMessageService.

The first thing I did was write a mixed QML/Qt object that I could instantiate from the QML code, so that I could do something like:

Message {
    id: message
    from: "my@Address"
    to: [ "destination@Address" ]
    subject: "This is not Spam for sure!"
    body: "Trolled! Enlarge...!"
    attachments: [ "/a/path/to/an/attachment" ]
}

Button {
    onClicked: message.compose()
//    onClicked: message.send()
}

There we have an object with two methods, send and compose; three string properties representing the from, subject and body; and two string list properties representing the to and attachments (we leave the CC and BCC as an exercise 😉 ). As I already explained how to create objects with properties in previous posts, I’ll go directly to the compose and send methods.

const QMessage Message::createMessage()
{
    QMessage message;

    message.setFrom(QMessageAddress(QMessageAddress::Email, m_from));
    QMessageAddressList toList;
    foreach (const QString &item, m_to) {
        toList.append(QMessageAddress(QMessageAddress::Email, item));
    }
    message.setTo(toList);
    message.setSubject(m_subject);
    message.setBody(m_body);
    message.appendAttachments(m_attachments);

    return message;
}

bool Message::send()
{
    QMessage message = createMessage();
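    /* "service" is presumably a QMessageService member of the Message
       class; its declaration is not shown in this post */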
    service.send(message);

    return true;
}

bool Message::compose()
{
    QMessage message = createMessage();
    service.compose(message);

    return true;
}

Easy, right?

Dancing Troll

If you want to be completely trolled stop reading here. If not, please continue.

That does not work; at least I could not make it work. I might be missing something, but I could not. I hooked up to the signal and saw the state changes, which were going from sending to OK, and no email composer was showing up. No error was being returned either.

Finally, I got an easier solution by using Qt.openUrlExternally and a mailto: URL directly from QML. You can use obscure parameters to compose the message the way you want. Example:

Button {
    onClicked: Qt.openUrlExternally("mailto:destination@Address" +
                                    "?subject=Subject" + 
                                    "&attach=/path/to/the/attachment" +
                                    "&body=Body")
}

That does the job and it is less complicated than creating a whole class. I leave the whole Message class code in a pastebin in case somebody wants to try it or develop something from there on a platform where QMessageService really works.

Ad-hoc program

My wife tracks all our finances using Home Bank and I needed an easy way to write down cash expenses. I decided to code a simple program making it easy for me to manage expenses by adding them to a list (and deleting them) and exporting them as CSV to be sent through email when she requested them. As the CSV format and the expense types were so ad hoc, I decided not to complicate things and hardcoded them in the code and database.

I don’t think it will be interesting to anybody, but I might end up uploading it to some git repo so that it can be maintained by someone else or even forked.

Painting video with GStreamer and Qt/QML or Gtk+ with overlay

As part of my work at Igalia I have had to work with video and GStreamer for some years. I always used Gtk+ for that, so when I needed to do things with Qt and QML, things were different. In my projects I always used pure GStreamer code instead of the Qt bindings for GStreamer because, at that moment, those bindings were not ready or reliable.

I know two ways of painting video:

  • Overlay way, with a window id and so on
  • Texture streaming

I might write later about texture streaming, but I will focus now on overlay.

Painting

The first way means that you need a window id from your graphical toolkit. That window id is asked for by the video sink element at a very specific moment, and you need to provide it at that moment if you have not provided it before. For example, if you are using playbin2 and you already know the sink you want to use, just instantiate your sink, set the window id at that moment with gst_x_overlay_set_window_handle, and set the sink on the playbin2 element through the video-sink property.
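A rough sketch of that playbin2 case could look like this (GStreamer 0.10 API, as in the rest of this post; the element choices and the xid variable are just placeholders, not code from my project):

#include <gst/gst.h>
#include <gst/interfaces/xoverlay.h>

/* "xid" would be the window id obtained from the toolkit, as shown below */
static GstElement *
create_player(const gchar *uri, guintptr xid)
{
    GstElement *playbin = gst_element_factory_make("playbin2", "player");
    GstElement *sink = gst_element_factory_make("xvimagesink", "videosink");

    /* we already know the sink, so we can hand it the window id right away... */
    gst_x_overlay_set_window_handle(GST_X_OVERLAY(sink), xid);

    /* ...and plug it into playbin2 through the video-sink property */
    g_object_set(playbin, "uri", uri, "video-sink", sink, NULL);

    return playbin;
}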

If you are not using playbin2 and, for example, you are using GStreamer Editing Services, you cannot use a property because currently there is none and you need to use a more complicated method. I already reported the bug with its patches and hope that they are applied as soon as possible to improve compatibility with playbin2, because the way it is now is a bit inconsistent with the rest of the GStreamer code base.

Both Qt and Gtk+ now have client side windows, which means that your program window has only one X window and it is the toolkit that decides which widget receives the events. The main consequence is that if we just set the window id, GStreamer will use the whole window and will paint the video over the rest of our widgets (it does not matter whether QML/Qt or Gtk+) and you’ll get very ugly effects. To solve that, you need to set the render rectangle, which is the set of coordinates (relative to the whole X window) where you want to paint your video. You need to do that with gst_x_overlay_set_render_rectangle, just after setting the window id.

If you do not set your window handle and your render rectangle before the pipeline begins to move, it will ask you for them with the prepare-xwindow-id GstMessage, but this message can happen inside the GStreamer threads and it cannot wait until the main loop runs; it needs the information at that very moment, so you need to connect to the synchronous bus handler. GStreamer has a good example in the GstXOverlay documentation about how to do that. To use the callback in C++, you need to declare a static method and pass this as the user data parameter; then you can behave almost as if you had a normal object method. This is the most common solution used in the GNOME world and it fits perfectly with the Qt framework too.
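A rough sketch of such a synchronous handler could be the following (plain C; a C++ static member method works the same way, with this passed as the user data; the OverlayInfo struct is just a hypothetical container for the data coming from the toolkit):

#include <gst/gst.h>
#include <gst/interfaces/xoverlay.h>

typedef struct {
    guintptr window_handle;   /* filled in beforehand by the toolkit code */
    gint x, y, width, height; /* render rectangle, relative to the X window */
} OverlayInfo;

static GstBusSyncReply
bus_sync_handler(GstBus *bus, GstMessage *message, gpointer user_data)
{
    OverlayInfo *info = (OverlayInfo *) user_data;

    /* we only care about the prepare-xwindow-id element message */
    if (GST_MESSAGE_TYPE(message) != GST_MESSAGE_ELEMENT
        || !gst_structure_has_name(gst_message_get_structure(message),
                                   "prepare-xwindow-id"))
        return GST_BUS_PASS;

    /* the message source is the sink asking for the window */
    gst_x_overlay_set_window_handle(GST_X_OVERLAY(GST_MESSAGE_SRC(message)),
                                    info->window_handle);
    gst_x_overlay_set_render_rectangle(GST_X_OVERLAY(GST_MESSAGE_SRC(message)),
                                       info->x, info->y,
                                       info->width, info->height);

    gst_message_unref(message);
    return GST_BUS_DROP;
}

/* registered before the pipeline starts moving: */
GstBus *bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
gst_bus_set_sync_handler(bus, bus_sync_handler, &info);
gst_object_unref(bus);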

The code to get the window id and set the render rectangle in Gtk+ would be something like:

GdkWindow *gdk_window;
gdk_window = gtk_widget_get_window(your_widget);
/* as sink you can use GST_MESSAGE_SRC() if you are waiting
    for the prepare-xwindow-id message */
gst_x_overlay_set_window_handle(GST_X_OVERLAY(your_sink),
                                GDK_WINDOW_XID(gdk_window));
/* do your maths about your coordinates */
gst_x_overlay_set_render_rectangle(GST_X_OVERLAY(your_sink),
                                   x, y, width, height);

In Qt, if you are using common widgets, you could use something like:

WId winId = QApplication::activeWindow()->effectiveWinId();
gst_x_overlay_set_window_handle(GST_X_OVERLAY(your_sink),
                                winId);
/* do your maths about your coordinates */
gst_x_overlay_set_render_rectangle(GST_X_OVERLAY(your_sink),
                                   x, y, width, height);

If you are using a QGraphicsScene you would do something like:

/* to get the view you could do something like this
    (if you have only one or are willing to mess things up):
QGraphicsView *view = your_scene.views().at(0);
*/
gst_x_overlay_set_window_handle(GST_X_OVERLAY(your_sink),
                                view->viewport()->effectiveWinId());
/* do your maths about your coordinates */
gst_x_overlay_set_render_rectangle(GST_X_OVERLAY(your_sink),
                                   x, y, width, height);

If you are using QML, you would have a very similar approach to the last snippet, because as you should have a QDeclarativeItem, it has a scene() that you can use to get something like QGraphicsView *view = scene()->views().at(0); (of course, assuming that you have only one view, which is the most common case).

Overlaying stuff

Sometimes it is nice to put your controls on top of the video, covering part of the image. It would be like having the video as the background of a canvas where you draw some other widgets. Some GStreamer elements give you the possibility of doing a trick for this, which is using a colorkey for your background and painting whatever you want on top of that, as long as it does not include that colorkey. Some elements like xvimagesink or omapxvsink (used in the Nokia N9 and N950) have the colorkey property that you can read and set. If you are not planning to overlay anything, you can forget about this, but if you do, you need to set a colorkey on the sink and use that color to paint the background of your widget; a good moment to do so is also when setting the window handle:

g_object_set(sink, "autopaint-colorkey", FALSE,
             "colorkey", 0x080810, NULL);

Why do I unset the colorkey autopainting? Because I do not want GStreamer to mess with my widget painting.
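As an illustration (not code from my project), the widget side could then paint that same color itself; a minimal sketch, assuming GTK+ 2 and that your_widget owns its own GdkWindow (e.g. a GtkDrawingArea):

/* make the widget background match the colorkey so the video shows through */
GdkColor colorkey;

gdk_color_parse("#080810", &colorkey);
gtk_widget_modify_bg(your_widget, GTK_STATE_NORMAL, &colorkey);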

And more importantly: why did I use 0x080810? Because it is a dark color, close to black, but it is not black. Pure black can be dangerous, as it is commonly used in themes when painting widgets, so you would be getting ugly artifacts. Some people recommend magenta (0xFF00FF), as it is supposedly a color that does not exist in nature (citation needed). I would not do it, for several reasons:

  • You will need to synchronize your painting very well to avoid seeing the colorkey.
  • If you respect the aspect ratio you will see it for sure, because you (or the sink, if it is automatic) paint the background and the sink draws the image leaving some empty space.
  • It does not behave well with blendings, as you blend from your widget color to the background, which is the colorkey.

Advice: do not mess with the colorkey on omapxvsink. Though it is supposed to be writable, it is not, and it always uses 0x080810.

Aspect ratio

There are two kind of people:

  • The ones that want to use all the pixels of their monitors/TVs and like damaging their brains with distorted images.
  • The ones that like to see a correctly dimensioned image with some bars giving you a better impression of what was recorded.

As you can guess I belong to the second group.

There are some sinks that do that automatically for you by setting the force-aspect-ratio property, like ximagesink and xvimagesink, but there are others that do not, and omapxvsink is an example. It is not a big problem, but it forces you to work a bit more when you select the render rectangle. For that you need to know the video size, which you cannot know until the pipeline is running, which forces you to hook to GST_MESSAGE_ASYNC_DONE; or, in the case of playbin2, you already have the video size when getting the prepare-xwindow-id message. An example to get the video size would be:

GstPad *pad;
GstCaps *caps;
GstStructure *structure;
int width, height;

pad = GST_BASE_SINK_PAD(sink);
caps = GST_PAD_CAPS(pad);
g_return_if_fail(caps && gst_caps_is_fixed(caps));

structure = gst_caps_get_structure(caps, 0);
gst_structure_get_int(structure, "width", &width);
gst_structure_get_int(structure, "height", &height);

/* some videos define a pixel aspect ratio, meaning that the
   video pixel could be like 2x1 compared to a square pixel
   and we need to correct this */
if (gst_structure_has_field(structure, "pixel-aspect-ratio")) {
    int par_n, par_d;
    gst_structure_get_fraction(structure, "pixel-aspect-ratio",
                               &par_n, &par_d);
    width = width * par_n / par_d;
}

/* trick: some sinks perform better with multiple of 2 */
width &= ~1;
height &= ~1;
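
With that corrected size you can then compute the render rectangle yourself so that the aspect ratio is kept. A sketch of the maths (not from my original code; widget_width and widget_height would be the allocated size of the area where the video goes):

int render_x, render_y, render_width, render_height;

if (widget_width * height > widget_height * width) {
    /* the widget is proportionally wider than the video: bars on the sides */
    render_height = widget_height;
    render_width = width * widget_height / height;
} else {
    /* the widget is proportionally taller than the video: bars on top/bottom */
    render_width = widget_width;
    render_height = height * widget_width / width;
}

render_x = (widget_width - render_width) / 2;
render_y = (widget_height - render_height) / 2;

gst_x_overlay_set_render_rectangle(GST_X_OVERLAY(sink),
                                   render_x, render_y,
                                   render_width, render_height);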