New media controls in WebKitGtk+

So it looks like my patch for the rework of the WebKitGtk+ media controls has finally landed.

First I would like to thank Igalia for giving me some time to complete this task, which took some work and began at the WebKitGtk+ hackfest some time ago with Žan Doberšek and Jon McCann.

The starting point was:

Starting point

As you can see, the controls looked like an old Gtk+ application without any theming. Jon suggested that we could begin by mimicking the Chromium controls, as they look closer to a modern themed GNOME application, and adapt them to use the GNOME symbolic icons, keeping some other stuff like the volume bar, but of course making it all look nicer.

What was done:

  • Added the GNOME symbolic icon theme and a method to replace the normal stock icons, though we keep those as fallback.
  • Deeply adapted the Chromium CSS and C++ code to make it suit the GNOME requirements.
  • Dropped some buttons that fell out of the design, like seeking backwards and forwards.
  • Aligned the elements with the pixel ruler to make them as close to perfect as possible in all conditions (as some buttons are hidden in certain situations, like fullscreen, volume…).
  • Fixed a bug about the buffering ranges that was in trunk at that point, but was independent of the code I was cooking.
  • Removed as much of the C++ code as possible and moved the drawing to CSS, which is more maintainable for design purposes. The only things that are still painted with C++ code are the slider tracks, which depend on parameters that cannot be specified in CSS, like the buffering ranges and the volume (the latter was not painted that way before, but I introduced it for design coherence).
  • Removed the focus ring, which was making the controls uglier.
  • Removed dead code.
  • Added new baselines for the tests, including the pixel ones, and flagged some tests that are not (and will not be) working in Chromium either.

I had a small issue with a Chromium guy landing a patch that forced me to change the display of some components from -webkit-box to -webkit-flex and, of course, rebase all related tests. This created a small delay, but the patch finally landed as revision 143463.

And the result is the following:
New media controls

I don’t know about you guys, but I like it!

GUADEC 2012: state of calvaris

GUADEC 2012 A Coruña

Executive summary

Tired as hell but with my hacking batteries completely full.

Organization

There was always something to do here and there that I needed to help with, but I think the most important things I did were:

  • Organizing the Lightning Talks session. I guess you all remember the sound of the sea and the seagulls that I chose to indicate to people that they were running out of time.
  • Helping Marina and Christophe organize the Interns’ Lightning Talks, which you can also guess, because Marina liked the sea sound and decided to use it for their session too.
  • Helping Juanjo organize the BoFs. He planned them and I managed them in Indico.
  • Random stuff.
  • Keeping hackers happy. Having Zeenix at my place meant hanging out almost every day with the hackers for dinner and beers, so I needed to do my best to keep them properly fed and at the deserved ‘party level’. I took some people to do some sightseeing and to several restaurants like A Roda, O Galiñeiro… I even remember going to a place with live music at Os Maios close to a Pulpeira de Arzúa.

The only bad thing I can think of is that I missed some talks because I had so much to do.

Interesting talks

I attended all the talks I could, and just tried to avoid the ones by Igalians because, for obvious reasons, I already knew what they were about, so I’ll mention the ones that I liked most:

  • Jacob Appelbaum’s keynote: Personal data exposure is underestimated by most people, but it is something very important that we should care about. It would be interesting to have Tor integrated with GNOME.
  • Every detail matters by Allan: Suddenly you realize that you are in a better mood when using your GNOME 3, and the reason is that there are some guys focused on fixing the small bugs that were annoying you without you even noticing.
  • MinGW-w64 by Marc-André: Quite an interesting talk about tools and recipes to cross-compile your programs for the Evil. This is quite important to help us show that the GNOME world is as multiplatform as other options (you would be a fool to think that you can have everything working like a charm on the Evil just out of the box when using some frameworks that claim to be multiplatform).
  • i18n by Gil: Random thought: no native English speaking developer was at that talk. We really need to improve this; then we can talk about tools and other stuff. Though I did not attend the BoF, I talked to Fran Diéguez and my wife Laura about some stuff the students of the Universidade da Coruña could help with to improve translators’ lives, maybe combined with GSoC or OPW.
  • GStreamer talk by Tim Müller: Can’t ever miss it. 1.0 is almost there and it shines!
  • PiTiVi by Jeff: Constant lol. Cannot wait to see the videos uploaded. I really enjoyed it, not only because it was fun, but because of all the new features coming to our favorite video editor.
  • Defensive publications: Patents suck in every sense, and even more in FLOSS. These guys should be keynoters at all FLOSS events all over the world, because something I have learned lately is that people can do things when they unite. If we help bring things into public knowledge before they are patented by others (because patent office workers are far from perfect, considering a model I do not agree with anyway), we can do something. I recommend you have a look at the Open Invention Network.
  • History of GNOME: It is always interesting to look back and see how far we have come.

Fixed bugs

  • After speaking to Matthias Clasen, I got permission to apply the patches I had submitted for GB#613595. This was quite an old bug that I had worked on since the times of Hildon in Fremantle. Just after pushing, I realized that I had caused a small regression, but fortunately I fixed it before it had any consequences.
  • Bastien closed FDB#49945 about GeoClue by pushing my patch.
  • Edward promised to have a look at the patches I submitted for GB#663869. It is not closed yet, but I hope he does not forget 😉

Thanks

And of course, I need to thank Igalia for sponsoring my attendance and help at GUADEC:

Igalia

GUADEC 2012

Wow, something I thought I would never see: GUADEC is happening less than 3 km from home, in less than a week!

GUADEC 2012 A Coruña

I have collaborated in organizing the Lightning Talks and BoFs/Hackfests and, of course, I’ll help wherever I can. Poor Zeenix will be staying at my place, so he will need to wake up early to come with me and my wife Laura to the faculty.

I also want to thank Igalia for sponsoring me with time to help organize and attend the event. I might also be helping with the Advisory Board meeting that will happen at the Igalia headquarters on the 25th.

Painting video with GStreamer and Qt/QML or Gtk+ with overlay

As part of my work at Igalia I have worked with video and GStreamer for some years. I had always used Gtk+ for that, so when I needed to do things with Qt and QML, things were different. In my projects I always used pure GStreamer code instead of the Qt bindings for GStreamer, because at the time those bindings were not ready or reliable.

I know two ways of painting video:

  • Overlay way, with a window id and so on
  • Texture streaming

I might write later about texture streaming, but I will focus now on overlay.

Painting

The first way means that you need a window id from your graphical toolkit. The video sink element asks for that window id at a very specific moment, and you need to provide it right then if you have not provided it before. For example, if you are using playbin2 and you already know the sink you want to use, just instantiate your sink, set the window id at that moment with gst_x_overlay_set_window_handle, and set the sink on the playbin2 element through the video-sink property.
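
A minimal sketch of that up-front case could look like this (0.10-era API; window_id stands for the id obtained from your toolkit, as shown later in this post):

GstElement *playbin = gst_element_factory_make("playbin2", NULL);
GstElement *sink = gst_element_factory_make("xvimagesink", NULL);

/* the sink already knows the window before the pipeline starts */
gst_x_overlay_set_window_handle(GST_X_OVERLAY(sink), window_id);
/* and playbin2 will use our sink instead of an automatic one */
g_object_set(playbin, "video-sink", sink, NULL);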

If you are not using playbin2 and, for example, you are using GStreamer Editing Services, you cannot use a property because currently there is none, and you need to use a more complicated method. I already reported the bug with its patches and hope they are applied as soon as possible, because the way it is now is a bit inconsistent both with playbin2 and with the rest of the GStreamer code base.

Both Qt and Gtk+ now have client-side windows, which means that your program window has only one X window and it is the toolkit that decides which widget receives the events. The main consequence is that if we just set the window id, GStreamer will use the whole window and paint the video over the rest of our widgets (no matter whether QML/Qt or Gtk+) and you’ll get very ugly effects. To solve that, you need to set the render rectangle, which is the set of coordinates (relative to the whole X window) where you want to paint your video. You need to do that just after setting the window id, with gst_x_overlay_set_render_rectangle.

If you do not set your window handle and your render rectangle before the pipeline begins to move, it will ask you for them with the prepare-xwindow-id GstMessage. But this message can be posted from inside the GStreamer threads and it cannot wait until the main loop runs; it needs the information at that very moment, so you need to connect to the synchronous bus handler. GStreamer has a good example in the GstXOverlay documentation about how to do that. To use the callback in C++, you need to declare a static method and pass this as the user data parameter; then you can behave almost as if in a normal object method. This is the most common solution used in the GNOME world and fits perfectly with the Qt framework too.
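
For reference, a minimal sketch of such a synchronous handler could look like this (0.10-era API; passing the window id through the user data is just my choice for the example):

static GstBusSyncReply
bus_sync_handler(GstBus *bus, GstMessage *message, gpointer user_data)
{
    /* ignore everything but the prepare-xwindow-id element message */
    if (GST_MESSAGE_TYPE(message) != GST_MESSAGE_ELEMENT ||
        !gst_structure_has_name(gst_message_get_structure(message),
                                "prepare-xwindow-id"))
        return GST_BUS_PASS;

    /* the sink that needs the window id is the source of the message */
    gst_x_overlay_set_window_handle(GST_X_OVERLAY(GST_MESSAGE_SRC(message)),
                                    (guintptr) user_data);
    gst_message_unref(message);
    return GST_BUS_DROP;
}

/* attach it before setting the pipeline to PLAYING */
GstBus *bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
gst_bus_set_sync_handler(bus, bus_sync_handler, (gpointer) window_id);
gst_object_unref(bus);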

The code to set the window id and render rectangle in Gtk+ would be something like:

GdkWindow *gdk_window;
gdk_window = gtk_widget_get_window(your_widget);
/* as your_sink you can use GST_MESSAGE_SRC() if you are waiting
    for the prepare-xwindow-id message */
gst_x_overlay_set_window_handle(GST_X_OVERLAY(your_sink),
                                GDK_WINDOW_XID(gdk_window));
/* do your maths about your coordinates */
gst_x_overlay_set_render_rectangle(GST_X_OVERLAY(your_sink),
                                   x, y, width, height);

In Qt, if you are using common widgets, you could use something like:

WId winId = QApplication::activeWindow()->effectiveWinId();
gst_x_overlay_set_window_handle(GST_X_OVERLAY(your_sink),
                                winId);
/* do your maths about your coordinates */
gst_x_overlay_set_render_rectangle(GST_X_OVERLAY(your_sink),
                                   x, y, width, height);

If you are using a QGraphicsScene you would do something like:

/* to get the view you could do something like this
    (if you have only one view or are willing to mess things up):
QGraphicsView *view = your_scene.views()[0];
*/
gst_x_overlay_set_window_handle(GST_X_OVERLAY(your_sink),
                                view->viewport()->effectiveWinId());
/* do your maths about your coordinates */
gst_x_overlay_set_render_rectangle(GST_X_OVERLAY(your_sink),
                                   x, y, width, height);

If you are using QML, you would use a very similar approach to the last snippet: as you should have a QDeclarativeItem, it has a scene() that you can use to get something like QGraphicsView *view = scene()->views()[0]; (of course, assuming that you have only one view, which is the most common case).

Overlaying stuff

Sometimes it is nice to put your controls on top of the video, covering part of the image. It would be like having the video as the background of a canvas where you draw some other widgets. Some GStreamer elements let you do this with a trick: using a colorkey for your background and painting whatever you want on top of it, as long as it does not include that colorkey. Elements like xvimagesink or omapxvsink (used in the Nokia N9 and N950) have a colorkey property that you can read and set. If you are not planning to overlay anything, you can forget about this, but if you do, you need to set a colorkey on the sink and use that color to paint the background of your widget; a good moment for this is also when setting the window handle:

g_object_set(sink, "autopaint-colorkey", FALSE,
             "colorkey", 0x080810, NULL);

Why do I disable the colorkey autopainting? Because I do not want GStreamer to mess with my widget painting.
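
To give an idea of the widget side, a minimal Gtk+ 2 sketch (my own example, not from any particular project) to paint the widget background with that colorkey would be:

GdkColor color;

/* 0x080810, the same colorkey we set on the sink */
gdk_color_parse("#080810", &color);
gtk_widget_modify_bg(your_widget, GTK_STATE_NORMAL, &color);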

And more importantly: why did I use 0x080810? Because it is a dark color, close to black, but it is not black. Pure black can be dangerous, as it is commonly used in themes when painting widgets, so you would get ugly artifacts. Some people recommend magenta (0xFF00FF), as it is supposedly a color that does not exist in nature (citation needed). I would not do it, for several reasons:

  • You will need to synchronize your painting very well to avoid seeing the colorkey.
  • If you respect the aspect ratio you will see it for sure, because you (or the sink, if it is automatic) paint the background and the sink draws the image leaving some empty space.
  • It does not behave well with blending, as you blend from your widget color to the background, which is the colorkey.

Advice: do not mess with the colorkey on omapxvsink. Though it is supposed to be writable, it is not, and it always uses 0x080810.

Aspect ratio

There are two kinds of people:

  • The ones that want to use all the pixels of their monitors/TVs and like damaging their brains with distorted images.
  • The ones that like to see a correctly dimensioned image with some bars, giving a better impression of what was recorded.

As you can guess I belong to the second group.

There are some sinks that do that automatically for you when you set the force-aspect-ratio property, like ximagesink and xvimagesink, but there are others that do not, and omapxvsink is an example. It is not a big problem, but it forces you to work a bit more when you select the render rectangle. For that you need to know the video size, which you cannot know until the pipeline is running, which forces you to hook to GST_MESSAGE_ASYNC_DONE; in the case of playbin2, you already have the video size when getting the prepare-xwindow-id message. An example to get the video size would be:

GstPad *pad;
GstCaps *caps;
GstStructure *structure;
int width, height;

pad = GST_BASE_SINK_PAD(sink);
caps = GST_PAD_CAPS(pad);
g_return_if_fail(caps && gst_caps_is_fixed(caps));

structure = gst_caps_get_structure(caps, 0);
gst_structure_get_int(structure, "width", &width);
gst_structure_get_int(structure, "height", &height);

/* some videos define a pixel aspect ratio, meaning that a
   video pixel could be like 2x1 compared to a square pixel,
   and we need to correct this */
if (gst_structure_has_field(structure, "pixel-aspect-ratio")) {
    int par_n, par_d;
    gst_structure_get_fraction(structure, "pixel-aspect-ratio",
                               &par_n, &par_d);
    width = width * par_n / par_d;
}

/* trick: some sinks perform better with multiples of 2 */
width &= ~1;
height &= ~1;
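
Once you have the corrected video size, the maths for a render rectangle that keeps the aspect ratio could be sketched like this (widget_x, widget_y, widget_width and widget_height are my placeholders for your widget’s area inside the X window):

int render_x, render_y, render_width, render_height;

if (widget_width * height > widget_height * width) {
    /* widget is wider than the video: bars on the sides */
    render_height = widget_height;
    render_width = width * widget_height / height;
} else {
    /* widget is taller than the video: bars above and below */
    render_width = widget_width;
    render_height = height * widget_width / width;
}
/* center the video inside the widget area */
render_x = widget_x + (widget_width - render_width) / 2;
render_y = widget_y + (widget_height - render_height) / 2;

gst_x_overlay_set_render_rectangle(GST_X_OVERLAY(sink),
                                   render_x, render_y,
                                   render_width, render_height);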

Aura

As my colleague Víctor at Igalia said in his post, Aura was released to the Nokia Store. Miguel, Víctor and I are quite happy with the result achieved with this app, whose intention was to be a kind of port of the Cheese application from the GNOME platform to the N9 and N950 Nokia phones.

The app allows you to use both cameras (front and main) to record videos, applying a lot of funny effects (a subset of the GNOME Video Effects) and changing them during the recording. Nokia being a Finnish company, we decided to name the app after a Finnish cheese, to honor both the GNOME Cheese application and Finland 😉

You can download the app from the Nokia Store, where we already have more than 6000 downloads and 100 reviews, with a quite good average rating.

Here is an example I recorded with my own phone using the Historical effect and uploaded to YouTube:

There are even other videos already uploaded to YouTube talking about how Aura works. This one is from a Brazilian guy (obrigado!) at FaixaMobi and shows more effects:

Of course, it being free software, you can also compile it yourself from the code at GitHub, and do not be afraid of contributing! The technologies we used were the camerabin element of GStreamer and Qt/QML for the interface, where we have the following components:

Aura components UML diagram

  • Main view (aura.qml) with the main interface.
  • Controller, a mixed QML/C++ object that allows controlling the pipeline.
  • Pipeline, a C++ object used by the controller to encapsulate the GStreamer code.
  • PostCapture, also a mixed QML/C++ object that opens the gallery application to show the recorded video and gives you the opportunity of sharing it, deleting it and so on. It uses a C++ controller, loaded as a singleton into the context, to do some stuff that can only be done in C++. Of course, you can open Gallery yourself and the videos will show up there.
  • EffectManager, a C++ class to load and manage the Effects; Effect is another C++ class defining how the effect must be applied.
  • Effects (Effects.qml), a QML component to show the different effects, both software and hardware, that Aura can apply. It uses the EffectManager (through the context) to load them and the Controller to apply them.
  • About view (AboutView.qml), a rework of something done by my colleague Simón Pena, adapted to be used in Aura (kudos!). It also uses a small AboutViewController to open the application’s Nokia Store URL instead of the browser.
  • ResourceManager, a C++ class used by the Controller to request the proper permissions to record the video.

Being taken to Azkaban for use of very dark GStreamer magic

I was writing some tests for a project at Igalia and I needed to mock the convert-frame playbin2 element action. The code to invoke it is something like this:

GstElement *pipeline = /* get pipeline */;
GstCaps *caps = /* create caps to adapt the conversion */;
GstBuffer *buffer = NULL;
g_signal_emit_by_name (pipeline, "convert-frame", caps, &buffer);

When you are writing tests, what you want is to test just your code and not depend on anything external, so in this case the idea would be to provide a fake implementation for that GStreamer element action.

The way you can do this kind of thing is to provide the symbol in your own code, so that the linker, when doing its job, does not look any further and uses it instead of the one in the external library. The natural solution coming to mind would be to rewrite g_signal_emit_by_name. The problem with this is that, though you are not using it anywhere else in your code, it is too general, so it is not a good idea.
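
Just to illustrate the idea (this is the approach I rejected): defining the symbol in one of your test sources is enough for the linker to pick it instead of the library’s, but it would hijack every signal emission in the binary, not only this one:

/* fake implementation overriding the real GLib symbol: the linker
   resolves it here and never reaches the one in the library */
void
g_signal_emit_by_name (gpointer instance, const gchar *detailed_signal, ...)
{
    /* return a canned buffer, record the call for assertions, etc. */
}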

I thought I could replace the convert-frame action in the playbin2 class, so I wrote this code:

typedef struct
{
  GstPipelineClass parent_class;
  void (*about_to_finish) (gpointer playbin);
  void (*video_changed) (gpointer playbin);
  void (*audio_changed) (gpointer playbin);
  void (*text_changed) (gpointer playbin);
  void (*video_tags_changed) (gpointer playbin, gint stream);
  void (*audio_tags_changed) (gpointer playbin, gint stream);
  void (*text_tags_changed) (gpointer playbin, gint stream);
  GstTagList *(*get_video_tags) (gpointer playbin, gint stream);
  GstTagList *(*get_audio_tags) (gpointer playbin, gint stream);
  GstTagList *(*get_text_tags) (gpointer playbin, gint stream);
  GstBuffer *(*convert_frame) (gpointer playbin, GstCaps * caps);
  GstPad *(*get_video_pad) (gpointer playbin, gint stream);
  GstPad *(*get_audio_pad) (gpointer playbin, gint stream);
  GstPad *(*get_text_pad) (gpointer playbin, gint stream);
} GstPlayBinClass;

static gpointer
gst_play_bin_convert_frame (G_GNUC_UNUSED gpointer playbin,
                            G_GNUC_UNUSED gpointer caps)
{
    GstBuffer *buffer = NULL;

    /* Create my own GstBuffer with the data I need */

    return buffer;
}

void
simulator_gst_reset(GstElement *new_pipeline, GstBus *new_bus)
{
    /* ... */

    GstPlayBinClass *klass =
        G_TYPE_INSTANCE_GET_CLASS(new_pipeline, GST_PLAY_BIN_TYPE,
                                  GstPlayBinClass);
    klass->convert_frame = (gpointer) gst_play_bin_convert_frame;

    /* ... */
}

First I declared the GstPlayBinClass, copying it from the GStreamer code. I didn’t change the order of any parameters, just replaced some pointers with gpointer, as we don’t need them; this way you don’t break the ABI. Then you can declare your own element action code, and finally you get the class, assign the method and voilà!

As I said, the solution is far from being the best, but if you know a better way, drop me a comment.

Moving to the next page in MafwGriloSource

Thanks to Aldon Hynes, who sent me an email with some comments about MafwGriloSource behavior on the N900, I could fix a bug and implement a feature. The bug was a limitation when browsing, as you could only see the first 64 results. It was caused by a problem with the indexes when returning the results, as I always sent 0. After fixing this, the interface kept requesting more results and adding them to the treeview, so we ended up with all the results in the treeview.

Then the problem I saw was that loading so many items in the treeview was not just slow, but a never-ending story: we kept asking for more pages until we reached the end, and that took a long time.

I thought of a way of providing my own pagination, and the first step was reverting the fix for the index bug to stop the interface from requesting more pages itself, which is a bit hacky, but it was the only way given that the interface is not open. The next step was implementing a way of showing a new container row with a “More results…” label, to be able to transparently carry on browsing.

Another dirty thing is that I needed to override the count and skip parameters, because now we have to pay attention to the pagination info, and having only 64 results would be a pain in the ass. Can you imagine browsing the thousands and thousands of Jamendo artists in chunks of 64? I thought that 1024 was a much more reasonable number. Anyway, a g_message is printed when the given count is overridden.
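
As an illustration only, a hypothetical sketch of that override (the constant name and the exact condition are mine, not the actual MafwGriloSource code):

/* chunk size I considered reasonable instead of the interface's 64 */
#define BROWSE_PAGE_SIZE 1024

if (count != BROWSE_PAGE_SIZE) {
    g_message ("Overriding requested count %u with %u",
               count, BROWSE_PAGE_SIZE);
    count = BROWSE_PAGE_SIZE;
}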

I talked to Juan about the possibility of implementing this in Grilo itself, as it could be a nice way of providing automatic pagination, but we agreed that it was better to implement the prototype in MafwGriloSource and then, if it was good enough and the model was suitable for Grilo, we could move it there.

How could I implement this? The easiest way was changing the way the MAFW object ids are built from the GrlMedia object. What we had so far was source_uuid::media_object_id. First I thought of having a special uuid indicating that there was pagination, but this would not work, as MAFW really needs to parse a valid source uuid (that is, one corresponding to an existing source) to send the requests to the appropriate source, so I had to discard this option.

Then, if I could not touch the source uuid, the next and only choice was the media object id. As the MAFW media object id is created directly from the GrlMedia id, it is an opaque string for us, so the best solution was prepending the pagination information (the index of the next element to request in the list) and concatenating it with the media object id, using a colon as separator. Result: source_uuid::next_element_index:current_container_media_object_id. For this, I needed to change the functions that serialize and deserialize the object id.
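
To make the format concrete, here is a hypothetical sketch of such serialization (the function names are mine, not the real MAFW helpers):

#include <string.h>
#include <glib.h>

static gchar *
build_object_id (const gchar *source_uuid, guint next_index,
                 const gchar *media_id)
{
    return g_strdup_printf ("%s::%u:%s", source_uuid, next_index, media_id);
}

static gboolean
parse_object_id (const gchar *object_id, gchar **source_uuid,
                 guint *next_index, gchar **media_id)
{
    /* split the source uuid from the rest at the first "::" */
    gchar **halves = g_strsplit (object_id, "::", 2);
    gchar *colon;

    if (g_strv_length (halves) != 2 ||
        (colon = strchr (halves[1], ':')) == NULL) {
        g_strfreev (halves);
        return FALSE;
    }

    *source_uuid = g_strdup (halves[0]);
    *next_index = (guint) g_ascii_strtoull (halves[1], NULL, 10);
    *media_id = g_strdup (colon + 1);

    g_strfreev (halves);
    return TRUE;
}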

Once I had the pagination info, the only thing left was changing the callback that returns the data, so it checks whether there are results left and adds the row that you will see if you, for example, browse the Jamendo artists.

Here you have the result:

Screenshot showing pagination

If you find anything weird, drop me a line.

I have to thank Igalia for letting me use work time to finish this.