Do you remember this comment? Well, I took the challenge, and it has been hard to accomplish.
I started with an early implementation by FelipeC, but then I realised that I didn't understand a bit of what was going on.
So, I outlined a strategy with two milestones: first clean up the TI OpenMAX IL implementation, and then add the JPEG decoder to gst-dsp.
In order to clean up TI's OMX IL, FelipeC recommended that I rewrite libdspbridge using dsp_bridge beneath, but I didn't see a real gain in that task. It makes sense for a progressive TI OMX IL update without breaking the ABI, but that was not my purpose, so I went a step further and decided to rewrite the LCML in terms of dsp_bridge instead.
LCML is the acronym for Linux Common Multimedia Layer. It is a shared library, loaded at run time by the OMX components, that provides the communication between the ARM-side application and the multimedia DSP socket node (SN). It is built upon libdspbridge for the interaction with the DSPBridge kernel module.
My task was to rewrite the LCML, removing the libdspbridge linking dependency. You can see the result in my lcml-ng branch.
During the rewrite process I came to understand the communication protocol used by the socket nodes. The cleanup was really painful because the LCML code is very messy and poorly designed. And I have to say this: Hungarian notation must be buried deep down into oblivion.
It was my intention to keep ABI compatibility, but I preferred readability, so I ended up breaking it.
With a clear idea of how the LCML library works, I retook the challenge, with some degree of success: the SN was loaded and allocated correctly, and I also found that the input port admits only one buffer, not two like the rest of the video decoders in gstdspvdec. But when everything looked promising and the input buffer was pushed, the SN threw a critical error event and the output buffer was never received.
I had to do more than merely understand the LCML; I also had to rewrite the JPEG decoder OMX component. But this time the code was even more obfuscated than the LCML's.
And I had an epiphany: developing software in a community implies having clean and readable code, for the sake of peer review by people with heterogeneous backgrounds. Meanwhile, under a closed, internal development approach, QA is based on black-box testing, where the cleanness of the code is not a praised virtue, but rather the opposite.
I rewrote all the JPEG decoder bits, from the component to its test application, but I have not pushed that branch to gitorious yet.
Finally I came across the missing parts: each buffer pushed into the SN must carry metadata, a structure with information about the buffer; in the case of the JPEG decoder, there were also a couple of magic numbers. The output buffer also comes with metadata, which, among other information, expresses whether the buffer was decoded correctly.
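To give a rough idea, the metadata could look something like the sketch below. Every field name and the layout here are my own assumptions for illustration only, not the actual TI definitions:

```c
#include <stdint.h>

/* Hypothetical sketch of the per-buffer metadata exchanged with the
 * socket node. The real layout lives in the TI headers; these fields
 * are illustrative assumptions. */
typedef struct {
    uint32_t size;      /* payload size in bytes */
    uint32_t flags;     /* e.g. end-of-stream */
    uint32_t magic[2];  /* the couple of magic numbers the SN expects */
} sn_input_meta;

typedef struct {
    uint32_t size;       /* decoded payload size */
    uint32_t error_code; /* non-zero when decoding failed */
} sn_output_meta;

/* Did the SN report a correctly decoded buffer? */
static int
sn_frame_ok (const sn_output_meta *meta)
{
    return meta->error_code == 0;
}
```

The key point is that the ARM side must fill and check these little structures on every buffer exchange; forgetting the magic numbers was exactly what made the SN throw its critical error event.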
Yesterday, before meeting my mates for the cinema, I emailed a couple of patches with the initial support for JPEG decoding in gst-dsp.
The next task is to find a strategy for assigning the number of buffers on each port, so it can be defined as late as possible.
For a while now I've been working on an OpenEmbedded overlay called marmita. But this post is not about it. What I want to show now is a nice trick: how to use JHBuild with an OpenEmbedded setup.
In Marmita, just as in Poky, in order to get into the OE environment, the user sources the script marmita-init-build-env.
Then I set up another script, which is thought to be the rcfile of a new bash session: marmita-simple-cross-compiling-env.
So, at this point we have set all the environment variables needed to run a JHBuild session. There is also an alias for jhbuild, which specifies the jhbuildrc file crafted for a cross-compilation environment: marmita.jhbuildrc.
For the moment I’ve only built GStreamer. And as a matter of fact, in the process, I came up with a simple patch for gst-plugins-bad.
By the way, the destination directory is in /opt, under the stage directory; so, if you want to play with the generated output on a device, just copy that directory tree into the device's file system.
Yes, I cannot say that I achieved a full integration between jhbuild and OE, but what I can state is that cooperation is quite possible.
I started to generate the bindings for Vala. I never thought it could be that hard: the heavy use of atypical callbacks in Grilo made me find a bug in Vala's code writer. Eventually I came up with a small patch, which I've just pushed.
Those problems brought up the discussion of using GAsyncResult within Grilo instead of the custom callback mechanism. We'll see where we can go.
Finally I got my small test snippet. Cute, isn’t it?
Today I also pushed other patches I had in my Vala queue. The interesting part is that, after talking with Zeeshan, I understood that the GStreamer VAPI must be generated with the latest release of GStreamer. Something logical, but I had never stopped to think about it.
Back in August 2009 I was chatting with my old peers in Mexico, and they told me that they needed a JPEG parser element in GStreamer for their DSP accelerated JPEG decoder. So, I went to bugzilla and found a bug report about the issue and a proposed patch. But the published patch still missed some features so I took it and worked on it.
After attaching my first try, Arnout, the first author of the patch, came back with some comments to improve the element. Several weeks later I picked the element up again and almost rewrote it. Then I waited for the OK from a GStreamer developer.
Finally, this week, Stefan reviewed it and pushed it. Sadly for me, when I rebased my local commits, squashing my change set into one single commit, I didn't notice that the commit had Arnout as its author, not me 🙁
Yeah, sometimes I’m so absentminded.
Back in London I bought a couple of CDs. Obviously I don't use CD players anymore; I mostly stream all the music I listen to (jamendo, spotify, last.fm). Still, if I want to listen to music on my N900 without any network connection, I have to drop the music files onto it. So the solution is to rip the music from the CDs, encode it, et voilà.
The obvious solution for ripping music is SoundJuicer, and I started to compile it within my JHBuild environment, but I found a huge list of dependencies I didn't want to install, such as brasero. As everybody knows, the next logical thing is “let's code a simple CD ripper”.
Vala was my chosen language (don't ask why). What I wanted was to have metadata in the files (life without metadata is not feasible anymore), to encode the files in AAC/MPEG-4, and to have no user interaction at all: just run the program and get a directory with my music.
The first problem I found was that Vala has no bindings for libmusicbrainz, so I started to cook one for libmusicbrainz v2.x, which turned out terribly hard to port to Vala, and is already deprecated anyway. Then I cooked another for libmusicbrainz3.
With all those issues solved, I finally came up with my mcdripper!
Ah, by the way, it uses async methods, so you'll need a recent Vala (I use the git version).
And finally I’ve been ripping my new CDs and storing the files in my N900.
As Linus Torvalds once explained, keeping a nice linear set of patches on top of an upstream development implies the use of git-rebase; nevertheless, that also implies that I have to force my pushes, and my friends will have trouble keeping their repositories in sync with mine.
So this is a kind of compromise between being nice to the people who pull your changes and easing your daily work.
Back in 2007 I started to work integrating OpenMAX IL components into the GStreamer platform.
OpenMAX is a set of programming interfaces, in C language, for portable multimedia processing. Specifically the Integration Layer (IL) defines the interface to communicate with multimedia codecs implemented by hardware or software.
A quick and rough view of the software architecture implemented to achieve this processing is shown in the next diagram:
+---------------------+
|     OpenMAX IL      |
+---------------------+
|    libdspbridge     |
+---------------------+
| Kernel (DSP Bridge) |
+---------------------+

The DSP Bridge driver provides the following services:
- Messaging: Ability to exchange fixed size control messages with DSP
- Dynamic memory management: Ability to dynamically map files to DSP address space
- Dynamic loading: Ability to dynamically load new nodes on DSP at run time
- Power Management: Static and dynamic power management for DSP
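The messaging service, for instance, revolves around fixed-size control messages. As a sketch (the three-word layout mirrors the dsp_msg used by dsp_bridge, but treat the field names as illustrative assumptions):

```c
#include <stdint.h>

/* Sketch of a fixed-size control message exchanged with the DSP
 * through the bridge. Field names are illustrative; the real client
 * would hand such a message to the kernel driver via an ioctl. */
struct dsp_msg {
    uint32_t cmd;    /* command identifier */
    uint32_t arg_1;  /* first argument; meaning depends on cmd */
    uint32_t arg_2;  /* second argument */
};

static struct dsp_msg
dsp_msg_make (uint32_t cmd, uint32_t arg_1, uint32_t arg_2)
{
    struct dsp_msg msg = { cmd, arg_1, arg_2 };
    return msg;
}
```

Because the message size is fixed, both sides can exchange commands without any negotiation about payload length, which keeps the ARM/DSP protocol simple.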
libdspbridge is part of the user-space utilities of the DSP bridge, and its purpose is to provide a simple programming interface for GPP programs to the driver's services.
On the DSP side, using the C/C++ compiler for the C64x+ and the libraries contained in the user-space utilities, it is possible to compile a DSP program and package it as a DSP node, ready to be controlled by the DSP bridge driver. Right now TI provides a set of out-of-the-box DSP multimedia codecs for non-commercial purposes. These nodes are contained in the tiopenmax package.
So, as I said before, my job was to wrap the OpenMAX IL components delivered by TI as a GStreamer plug-in. That way a lot of available multimedia consumers could use the hardware-accelerated codecs. But our team also tested the delivered OpenMAX components.
After trying several approaches we came to the conclusion that we needed a new software layer that would:
- Facilitate good testing coverage of the components without the burden of the upper framework (GStreamer in this case).
- Improve code reuse.
- Use object-oriented programming through GObject.
- Facilitate the workarounds for each component's bugs and the maintenance of those workarounds.
- Provide a playground for experimenting with features such as (OpenMAX-specific) tunneling and the (TI-specific) DSP Audio Software Framework (DASF).
For those reasons we started to develop an intermediate layer called GOO (GObject OpenMAX).
+---------------------+
| GStreamer / gst-goo |
+---------------------+
|       libgoo        |
+---------------------+
|       OpenMAX       |
+---------------------+
libgoo is a C library that wraps OpenMAX using GObject. The following diagram shows part of its class hierarchy.
                  +--------------+
                  | GooComponent |
                  +--------------+
                          |
+---------------+ +---------------+ +---------------+ +---------------+
| GooTiAudioEnc | | GooTiAudioDec | | GooTiVideoDec | | GooTiVideoEnc |
+---------------+ +---------------+ +---------------+ +---------------+
        |                 |                 |                 |
 +-------------+   +-------------+  +---------------+ +---------------+
 | GooTiAACEnc |   | GooTiAACDec |  | GooTiMpeg4Dec | | GooTiMpeg4Enc |
 +-------------+   +-------------+  +---------------+ +---------------+
At the top there is GooComponent, which represents any OpenMAX component. If the OMX IL implementation were neat and clean, there would be no need to add subclasses beneath it: just parametrize it, and it should be ready to use like any other OMX IL component. But reality, as usual, is quite different: every implementation differs from the others and, to make it worse, each component in the same implementation might behave differently. That was the case of the TI implementation.
Finally, over libgoo there is gst-goo, the set of GStreamer elements that use the libgoo components. GstGoo also sketched some proofs of concept, such as ghost buffers (to be used with the OpenMAX interop profile), and dasfsink and dasfsrc (TI-specific).
In those days, before I moved to the GStreamer team, an old fellow, Felipe Contreras, worked on gomx, the predecessor of libgoo, before he got an opportunity at Nokia and started to code on GstOpenMAX. An interesting twist at this point is that Felipec is pushing boldly for a new set of GStreamer elements which ditch OpenMAX and talk directly to the kernel's DSP bridge: gst-dsp.
What's the future of libgoo and GstGoo? I couldn't say. Since I moved to Igalia, I left their development. I've heard that a couple of companies showed some kind of interest in them; sadly, the current developers are very constrained by the TI workload.
Do you remember that I promised not to use a minimalistic window manager? Well, sorry, another broken promise. Since I started to play around with imapfilter, I discovered Lua. Moreover, a comment on a previous post mentioned awesome, a minimalistic window manager configured through Lua. So I installed it, played with it, and suddenly I was delighted with all of its features. Awesome gets along pretty well with GNOME and its panel, which I didn't want to lose at all. Besides, awesome provides its own panel (called the widget box, a.k.a. wibox), which includes a systray (sadly, awesome's systray steals the icons from the gnome-panel systray). I've found that a tidy desktop, which spares the user unnecessary mouse interactions, is much more relaxing and helps the user focus on her task. We'll see how this experiment ends.
Meanwhile, Joaquin, a colleague from Dextra, told me they were having trouble with the gst-openmax JPEG decoder element, because it needed a JPEG parser, while the gst-goo one mimics the official JPEG decoder provided by GStreamer in gst-plugins-good. In other words, the last two elements actually parse the buffer and validate the presence of a complete image in a single buffer, while the first doesn't: it just assumes it, relying instead on a parser after the data source, which delivers the image metadata through fixed capabilities on the parser's source pad.
Fed up with HAM and all the release processes, I thought it would be nice to get my feet wet again in the GStreamer waters. Besides, I need to help Iago with his codec base classes, so this JPEG parser would help me ramp up.
As a historical note, the first element I took charge of when I joined the GStreamer development group at Dextra was, precisely, the JPEG decoder.
As soon as I chatted with Joaquin, I found a bug report about an element for that purpose, but it still missed a couple of features to make it useful for our needs. So I started to hack on it. First, I moved it from gst-plugins-good to gst-plugins-bad, then parsed the image header in order to find its properties, such as width, height, whether the image is progressive, color format, etc., and set these data in a fixed capability. Also, the frame rate is negotiated on the sink pad of the parser, just as in the official JPEG decoder.
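To give an idea of the kind of work the header scan involves, here is a minimal, self-contained sketch of walking the JPEG marker segments until a start-of-frame (SOF) marker appears, recovering the width, height, and whether the image is progressive. This is an illustration, not the element's actual code, and it is simplified: it assumes every marker after SOI carries a length field, which is not true of all markers in a real stream.

```c
#include <stdint.h>
#include <stddef.h>

typedef struct {
    int width;
    int height;
    int progressive;   /* non-zero for SOF2 (progressive DCT) */
} jpeg_info;

/* Scan the marker segments of a JPEG buffer for a start-of-frame
 * marker and fill in the frame properties. Returns 1 on success. */
static int
jpeg_parse_header (const uint8_t *data, size_t size, jpeg_info *info)
{
    size_t i = 2;      /* skip the SOI marker (0xFFD8) */

    while (i + 4 <= size) {
        if (data[i] != 0xFF)
            return 0;  /* lost marker synchronisation */

        uint8_t marker = data[i + 1];
        uint16_t len = (uint16_t) ((data[i + 2] << 8) | data[i + 3]);

        /* SOF0 (baseline) and SOF2 (progressive) carry the frame size:
         * payload is precision(1), height(2), width(2), components(1). */
        if (marker == 0xC0 || marker == 0xC2) {
            if (i + 9 > size)
                return 0;
            info->progressive = (marker == 0xC2);
            info->height = (data[i + 5] << 8) | data[i + 6];
            info->width  = (data[i + 7] << 8) | data[i + 8];
            return 1;
        }

        i += 2 + len;  /* jump over the marker and its payload */
    }
    return 0;          /* no SOF found: incomplete image */
}
```

In the element itself, the recovered values end up in fixed capabilities on the source pad, which is exactly what a parser-reliant decoder downstream needs.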
Finally I got something that seems to work OK and posted it in the same bugzilla report. I hope to receive feedback soon.
On the other hand, I’m still waiting for the approval of my last patches to the GStreamer’s Vala bindings (592346, 592345 and 591979). 591979 can be particularly controversial, given that it changes a typical class method, into a static function. I guess I need to ping somebody.
As for the Bacon Video Widget port to Vala, some progress has been made, but there's still nothing to show yet.