Do you remember that I promised not to use a minimalistic window manager? Well, sorry, another broken promise. Since I started to play around with imapfilter, I discovered Lua. Moreover, a comment on a previous post mentioned awesome, a minimalistic window manager configured through Lua. So I installed it, played with it, and was suddenly delighted with all of its features: awesome gets along pretty well with GNOME and its panel, which I didn't want to lose at all. Besides, awesome provides its own panel (called the widget box, a.k.a. wibox), which includes a systray (sadly, awesome's systray steals the icons from the gnome-panel systray). I've found that a tidy desktop, one that spares the user unnecessary mouse interactions, is much more relaxing and helps her focus on the task at hand. We'll see how this experiment ends.

Meanwhile, Joaquin, a colleague from Dextra, told me they were having trouble with the gst-openmax JPEG decoder element, because it needs a JPEG parser upstream, while the gst-goo element mimics the official JPEG decoder provided by GStreamer in gst-plugins-good. In other words, the last two elements actually parse the buffer and validate the presence of a complete image in a single buffer, while the first doesn't; it just assumes it, relying instead on a parser after the data source, which delivers the image metadata through fixed capabilities on the parser's source pad.

Worn out by H-A-M and all the release processes, I thought it would be nice to get my feet wet again in the GStreamer waters. Besides, I need to help Iago with his codec base classes, so this JPEG parser would help me ramp up.

As a historical note, the first element I took charge of when I joined the GStreamer development group at Dextra was, precisely, the JPEG decoder.

As soon as I chatted with Joaquin, I found a bug report about an element for that purpose, but it still missed a couple of features needed to make it useful for us. So I started to hack on it. First, I moved it from gst-plugins-good to gst-plugins-bad, and then made it parse the image header in order to find properties such as the width, the height, whether the image is progressive, the color format, etc., and set these data in a fixed capability. Also, the frame rate is negotiated on the sink pad of the parser, just as in the official JPEG decoder.
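The header parsing itself boils down to walking the JPEG marker segments until a start-of-frame (SOF) marker shows up, which carries the frame dimensions and tells whether the scan is progressive. The actual element is written in C, of course, but the idea can be sketched in Python (a simplified illustration, not the element's code):

```python
import struct

# SOF markers carry frame dimensions; 0xc4, 0xc8 and 0xcc are
# DHT, JPG and DAC, which share the 0xcN range but are not SOFs
SOF_MARKERS = set(range(0xc0, 0xd0)) - {0xc4, 0xc8, 0xcc}
PROGRESSIVE_SOFS = {0xc2, 0xc6, 0xca, 0xce}

def parse_jpeg_header(data):
    """Return (width, height, progressive) from a JPEG byte stream."""
    if data[0:2] != b'\xff\xd8':
        raise ValueError('not a JPEG (missing SOI marker)')
    pos = 2
    while pos + 4 <= len(data):
        if data[pos] != 0xff:           # not at a marker, resync
            pos += 1
            continue
        marker = data[pos + 1]
        if marker == 0xff:              # fill byte, keep scanning
            pos += 1
            continue
        if marker in SOF_MARKERS:
            # segment layout: length(2) precision(1) height(2) width(2) ...
            height, width = struct.unpack('>HH', data[pos + 5:pos + 9])
            return width, height, marker in PROGRESSIVE_SOFS
        # any other segment: skip it using its 2-byte big-endian length
        seglen, = struct.unpack('>H', data[pos + 2:pos + 4])
        pos += 2 + seglen
    raise ValueError('no SOF marker found')
```

For instance, a minimal 32×16 baseline stream yields (32, 16, False), while the same frame behind an SOF2 marker is reported as progressive.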

Finally I got something that seems to work OK and posted it to the same Bugzilla report. I hope to receive feedback soon.

On the other hand, I'm still waiting for the approval of my latest patches to GStreamer's Vala bindings (592346, 592345 and 591979). 591979 may be particularly controversial, given that it changes a typical class method into a static function. I guess I need to ping somebody.

On the Bacon Video Widget port to Vala, some progress has been made, but there's nothing to show yet.

The true IMAP usage: imapfilter

As I said before, I have been using mutt as my e-mail client, and I’m really getting into it. It does what it is supposed to do, cleanly and fast.

Nevertheless, its IMAP support lacks several nice features. The one I miss the most is being notified of the unread messages in all of my IMAP folders. Yes, it does notice new e-mails, but if you ignore the announcement, Mutt will not remind you of those not-yet-read messages later on.

And that was my biggest fear: Have I missed an important email?

Then I found imapfilter, the IMAP Swiss Army knife. Using the Lua programming language, you can manipulate your IMAP resources as you want: you can do complex searches, apply operations to messages (move, copy, delete), put flags on them, etc.

So, my task was to obtain the list of unread messages in my IMAP account. After a bit of Lua exploration, I managed to get this script:

server = IMAP {
    server = 'mail.server.com',
    username = 'vjaquez',
    password = 'myubberpassword',
    ssl = 'tls1',
}

mailboxes, folders = server:list_all ('*')
for i, m in pairs (mailboxes) do
   -- server[m]:check_status ()
   messages = server[m]:is_unseen () -- + server[m]:is_new ()
   subjects = server[m]:fetch_fields ({ 'subject' }, messages)
   if subjects ~= nil then
      print (m)
      for j, s in pairs (subjects) do
         -- strip the trailing newline and indent with a tab
         print (string.format ("\t%s", string.sub (s, 1, s:len () - 1)))
      end
   end
end

It prints out nicely all the messages I haven't read, so I can go to those IMAP folders and read them.

Sadly, there’s no way to integrate (as far as I can see) the imapfilter results with Mutt. But that would be amazing! Imagine an IMAP email client manipulated by commands like these.

And those feelings made me recall Philip's ideas about integrating Tinymail with Tracker.

And I wonder, does imapfilter support pipelining?

The SSU nightmare

In the Maemo domain there's a concept called SSU, Seamless Software Updates, which is the means to update the whole operating system without reflashing the device.

This concept is relatively new in the mobile device domain, where reflashing has been the common upgrade path. But this idea is as old as GNU/Linux distributions.

Maemo uses the Debian package format, and dpkg/APT to handle it.

While in Debian we are used to APT's dist-upgrade command to upgrade to a new OS release, in Maemo that is not the case, because each OS release has a specific hardware target, so there is no need for complex dependency handling.

And that is the historical reason why the Hildon Application Manager does not use the APT algorithms for dependency handling. Instead, it uses custom chicken-like algorithms: if the dependencies cannot be fulfilled without any foreseeable problem, the package installation, upgrade or removal is rejected.

The Maemo OS upgrades are done through a meta-package, which is merely a list of package dependencies that make up the new release. And that is all the magic in the famous SSU.
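To give a feel for it, such a meta-package is little more than a Debian control file whose Depends field enumerates the release. The package names and version numbers below are made up for illustration, not the real Fremantle ones:

```
Package: maemo-os-release
Version: 5.0+2009.41-1
Architecture: armel
Depends: osso-software-version (= 5.0+2009.41),
         hildon-desktop (>= 2.2.0),
         hildon-application-manager (>= 2.2.70)
Description: meta-package pulling in every component of the OS release
```

Installing the meta-package drags in (or upgrades to) every listed version; there is no upgrade logic beyond ordinary dependency resolution.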

But this approach poses a couple of drawbacks which, in Fremantle, have grown disproportionately given the dimensions of the project.

First, the number of packages that make up a new release is so big that the package section in the list file is bigger than the buffer APT allocates to parse it. We already filed a bug in Debian about this matter. It seems the problem is an integer overflow.

Second, it is quite easy to break the SSU process: if you install a third-party application with a hard dependency on an OS package, installing the meta-package of the next release will fail, given the chicken-like nature of H-A-M when solving dependency problems.

The adopted solution is to impose a dependency policy on third-party packages, whose current implementation has triggered a community discussion.

The other proposed solution, using the tiger-like algorithms available in APT, was discarded given the risk of the required changes to the apt-worker versus the available time frame.

No, there is no conclusion yet about the third-party policy issue.


Thanks to Igalia’s OLPI program (One Laptop Per Igalian) I got a new ThinkPad, the X301. I chose this one just because Alex chose it too, and I trust Alex’s judgement. And I don’t regret it: it’s a great machine.

All my computers have received a name. The first computer I bought with my own resources was named Rowena. I still like that name. My first laptop was named Angelina, while my third desktop computer was named Toaster.

So, in order not to break the tradition, I named the new laptop too, and its name is lit 🙂

lit as a noun is the contraction of literature. As an adjective it is the contraction of illuminated or lighted. Also, in French it means bed.

I took a different approach to working with this laptop. First of all, I installed Debian Squeeze on it. Not Ubuntu, nor Gentoo. The reason is to follow the standard Igalia software stack.

The second difference is a reinforcement of my console orientation 🙂 Instead of using Evolution, I’m trying Mutt; instead of XChat, irssi; and right now I’m testing Newsbeuter as my feed reader, all of them in a single terminal using GNU Screen. Besides, I’m continuing along my Emacs dependency curve.

And, if you’re wondering, I’m not going to use RatPoison or any other minimalistic window manager, and maybe never will.

Here’s a screenshot of my current desktop:


10 blast moments at GCDS

  1. When Edu played Gimme the Power at the GNOME party
  2. When we rented a car to spend an afternoon in Maspalomas
  3. When Owen Taylor showed a picture with me and others as if we were the
    authors of GNOME Shell 🙂
  4. The Robert Lefkowitz keynote
  5. That my talk at GUADEC-ES actually had an audience!
  6. The girls sunbathing on the beaches
  7. The hacking sessions in the hotel’s lobby
  8. Meeting Jürg Billeter at Collabora’s party
  9. Gossiping with Marius
  10. The weather… (it didn’t rain!)

GCDS ramblings

For the last two years GUADEC has been pushing the geographical limits of Europe: in 2008 in Istanbul, nearer to Asia, and now, in 2009, in the Canary Islands, nearer to Africa. And that’s OK, I actually like it; it’s better than repeating the old hosts of the conference.

Comparing my impressions of these two last GUADECs, I found something interesting: in Istanbul new ideas were boiling over, bold proposals: the seminal concepts for GNOME Shell, Zeitgeist, GNOME 3.0, and so on. And now, in Gran Canaria, the spirit was an evaluation of the development of those ideas, a review of where the project is and what’s missing to achieve the goals.

In Istanbul, the hackers took off. In Gran Canaria, the hackers landed.

Nokia stepped back from GTK+ in their products and announced official support for their recently acquired Qt framework. And that announcement staggered the GNOME Mobile gang, forcing them to rethink their objectives.

The online desktop died, and from its rotten corpse a new flower appeared: GNOME Shell. The desktop is a canvas driven by JavaScript applets, just like the Web 2.0. Moblin has also adopted that idea, and many others are going down that path.

Federico’s crazy idea about a chronological desktop evolved into Zeitgeist, which is taking over as the Tracker user interface for GNOME.

The furor unleashed in Istanbul with the announcement of the efforts toward GTK+ 3.0 has diminished this time. Nevertheless, Garnacho showed at GUADEC-ES some of his unfinished branches, some of them quite amazing, but not merged into mainstream yet.

Almost all of the attention was hogged by Clutter, and the Moblin guys are using this to push the rest of their framework. I guess everybody is waiting for a miraculous merge between GTK+ and Clutter, but that’s really unlikely to happen.

Vincent is still pushing for GNOME 3.0. And I really wish for that to happen. And GNOME 3 means GNOME Shell, Zeitgeist, and possibly GTK+ 3.0 and GStreamer 1.0. Those new versions don’t necessarily mean new features, but rather clean-ups in the code.

What excited me the most was the discussion towards GStreamer 1.0. There’s plenty of work to do. Hopefully I’ll find a spot to collaborate on it.

I also went to the ConnMan talk, a couple about WebKit and D-Bus, and a couple more.

Hildon Application Manager goes public

Almost since my arrival at Igalia I have worked on the Hildon Application Manager, or H-A-M for friends. The project was developed using SVN in garage, but after the GUADEC in Turkey we moved it to Git, on an internal server.

But after an unexpected and bold move by Marius Vollmer, the latest development version of H-A-M, for Fremantle more specifically, was pushed into the wild. Thanks, mvo!

H-A-M is finally synced with a repository on Gitorious, making its source code available to everyone. So, everyone is invited to submit patches! 😀

The big patch theory

It is well known that the major asset of free/open source software is the sense of community surrounding each project. That idea is a constant in almost all the promotional videos participating in the Linux Foundation contest We’re Linux.

The community-driven project, expounded to the masses by Eric Raymond in his essay The Cathedral and the Bazaar, has been gaining momentum, turning software development from an isolated, almost monastic craft into a collaborative, openly discussed engineering task.

All those ideas sound exciting and appealing, but from the trenches, how is it done? How does a bunch of hackers collaborate all together to construct a coherent and useful piece of software?

We could be tempted to think about complex, full-solution software such as forges, distributed revision control systems, etc. But I won’t go there. That’s buzzword territory. From my point of view, the essence of any collaborative software project is the patch.

According to software engineering, an important activity in the software development process is peer review. ESR even gave a name to it: Linus’s Law. But software peer review is not a task done once or periodically; that would be an almost impossible duty, or at least impractical, because it would imply reading and analysing thousands of lines of code. Instead, peer review is a continuum, a process that must be carried on throughout the writing of the code.

So far we have stated two purposes for a patch file: collaborative software construction and peer review. For the first purpose, the activities implied are, from the developer’s point of view, modifying the code, extracting the diff file and handing it out; from the integrator’s point of view, applying the patch and keeping a record of it. It’s a relatively simple process, and it can be automated to some degree with the help of SCM software. But there’s a hidden trick: the patch must be small and specific enough to survive several changes in the base code, so that it can be applied cleanly even if the surrounding code has changed since it was created.
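That hand-out/apply cycle can be sketched with plain diff and patch, before any SCM enters the picture; the directory and file names here are made up for the example:

```shell
# developer side: modify a copy of the tree and extract a unified diff
mkdir -p /tmp/patch-demo/orig /tmp/patch-demo/work
cd /tmp/patch-demo
printf 'hello\n' > orig/greeting.txt
printf 'hello, world\n' > work/greeting.txt

# the resulting patch file is what travels to the mailing list or bug tracker
diff -u orig/greeting.txt work/greeting.txt > fix-greeting.patch

# integrator side: apply the patch on a pristine copy and keep the file on record
cp orig/greeting.txt pristine.txt
patch pristine.txt < fix-greeting.patch
```

A version control system automates the bookkeeping, but the artifact being reviewed and archived is still this same unified diff.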

Nevertheless, for the second purpose, the process from the developer’s point of view is more complex; crafting a patch for peer review is not a trivial task. Quite the opposite: good patches are the fruit of experience.

A good patch has a set of subjective and objective values that the reviewers and the integrator will take into account when deciding whether to commit it to the official repository. A patch must be small enough to be easily understandable, attack a single issue, be self-contained (it must have everything it needs to solve the issue), be well documented, be complete, honour the project’s code style, and so on.

A perfect patch must transmit, at a glance, the confidence the maintainer needs to apply it immediately. And only experience can give you this skill. There’s no Patch 101 lecture, but maybe that’s an interesting idea.