I’m back

First, hello to planet.gnome.org! Thanks to jdub for making it possible. Briefly: I’m working at Igalia as a member of the Gnome technologies team, and currently I’m devoted to continuous integration and testing tasks under the umbrella of the Build Brigade group. A hackergotchi is coming up, stay tuned.

Back from holidays. Some notes:

  • I spent nearly a week in Portugal (Lisbon, Fatima, Castelo Branco, Alto Douro, Viana do Castelo, …). Very nice country… and food.
  • Nearly a month of relaxing far away from a computer, in Vigo. I did some local tourism there (visited Gondomar, Mondariz, Villasobroso, A Cañiza, and some nice places in Vigo like Monte Alba or the boat restaurant in the harbour).
  • The worst thing was the terrible wave of forest fires in Galicia. In Vigo I couldn’t even see the sun for a week, as the whole sky was covered by a dense cloud of smoke (photos on Flickr: Vigo Harbour, Rande Bridge).

But now I’m back. These days I’ll try to carry on with the efforts around Buildbot and jhbuild integration:

  • I’ve got Buildbot running with jhbuild, as I mentioned in my last post.
  • Buildbot is not designed for this use case, which is a problem because I want to provide better aggregated views (build status of a slave, last builds of a module, information about a specific module, etc.). I’ll try to improve these views a bit.
  • I need better unit test support in Buildbot (specifically, I would like to integrate coverage reports, as Thomas has done in the GStreamer buildbot, and better unit test log reports).

JHBuild + Buildbot

This week I’ve been playing a bit with Buildbot, in order to complete my hack to get the Gnome jhbuild integrated with it. And it’s running now. I’ve uploaded a patch to the Buildbot project webpage (buildbot jhbuild support patch on SF).

The ideas behind the patch are the following:

  • Each module in a jhbuild moduleset gets a factory.
  • The factory can be used to create builders on many slaves, so we get a builder for each factory and slave pair.
  • A scheduler for each builder. In my case the first module uses a Periodic scheduler, and the others are Serial schedulers depending on the previous module (a configuration sketch follows this list).
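To make the idea concrete, here is a minimal sketch of what such a master.cfg could look like. It is not the actual patch: the module and slave names are made up, and since the Serial scheduler comes from my patch, the chaining is approximated here with Buildbot’s stock Dependent scheduler (API roughly as in Buildbot 0.7, so treat it as pseudo-configuration rather than a drop-in file).

from buildbot.process import factory
from buildbot.steps.shell import ShellCommand
from buildbot.scheduler import Periodic, Dependent

modules = ['libxml2', 'glib', 'gtk+']    # hypothetical moduleset slice
slaves = ['slave1']                      # hypothetical slave name

c = BuildmasterConfig = {}
c['bots'] = [('slave1', 'password')]
c['slavePortnum'] = 9989
c['sources'] = []
c['builders'] = []
c['schedulers'] = []

previous = None
for module in modules:
    # one factory per module in the moduleset
    f = factory.BuildFactory()
    f.addStep(ShellCommand, command=['jhbuild', 'buildone', module])
    for slave in slaves:
        # one builder for each factory and slave pair
        name = '%s-%s' % (module, slave)
        c['builders'].append({'name': name, 'slavename': slave,
                              'builddir': name, 'factory': f})
        # one scheduler per builder: Periodic for the first one,
        # the rest depend on the scheduler of the previous module
        if previous is None:
            sched = Periodic(name, [name], 6 * 60 * 60)
        else:
            sched = Dependent(name, previous, [name])
        c['schedulers'].append(sched)
        previous = sched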

I added an example of a configuration file for the Buildbot master to get this working. Now I’m running the Gnome jhbuild on my machine. Sadly, the Buildbot UI does not scale very well for large sets of modules :(.

A screenshot:

jhbuild-buildbot-capture.jpeg

cvs diff of new files with read-only CVS repository access

Today I found a problem while trying to create a patch with CVS. I don’t have write access to the Gnome repository, only read-only access. The problem appears when I try to create a patch that involves new files.

If I run:

$ cvs diff -N

it ignores the new files. If I run:

$ cvs diff -N changedfile1 changedfile2 newfile1 newfile2

it says something like this for every new file:

cvs server: I know nothing about newfile1

An easy solution would be to create two patches:

  • One patch using a standard cvs diff.
  • A second patch for the new files, using the diff command against /dev/null (see the commands below).
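For the second patch, something like this would do (an untested sketch; the file and patch names are just placeholders):

$ cvs diff -uN > changes.patch
$ diff -u /dev/null newfile1 > new-files.patch
$ diff -u /dev/null newfile2 >> new-files.patch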

But there’s another trick to get a single standard patch from CVS. You have to edit the CVS/Entries file of the directory where you want to add a file. For example, for a newfile1 in directory1, you would edit the directory1/CVS/Entries file and add an entry like this one:

/newfile1/0/Initial newfile1//

Then I can run a standard cvs diff command like this one:

$ cvs diff -N

and this will include the new files in the patch. Of course, you can use the -u/-U options to get unified-format patches.

DBus support for JHBuild (2)

Since the last post, I’ve begun implementing all the methods required to provide complete DBus functionality in jhbuild. I added the following DBus methods:

  • dict:string,dict:string,string info (array:string modulelist): returns the information about the modules requested.
  • array:struct:string,string list (array:string modulelist): returns the list of dependencies required to compile the module list.
  • async update (array:string modulelist, array:string skiplist, string start, string datespec): updates the modules in modulelist and their dependencies, starting at the module given in the start parameter and ignoring the modules in skiplist. The checkouts are obtained using the datespec.
  • async updateone (array:string modulelist, string datespec): updates the modules in modulelist. The checkouts are obtained using the datespec.
  • async build (array:string modulelist, array:string skiplist, string start, string datespec, dict:string,string options): builds the modules in modulelist and their dependencies, starting at the module given in the start parameter and ignoring the modules in skiplist. The checkouts are obtained using the datespec. The options parameter is a dictionary of additional options: autogen (forces running the module’s autogen.sh), clean (runs make clean before compiling the modules) and nonetwork (avoids network access, and therefore access to the version control repositories).
  • async buildone (array:string modulelist, string datespec, dict:string,string options): builds the modules in modulelist. The checkouts are obtained using the datespec. The options work the same way as in the build command.
  • void set_autogen (boolean enabled): sets the global autogen parameter value. If set, autogen.sh will always be run before compiling modules.
  • boolean get_autogen(): obtains the current global autogen parameter value.
  • void set_clean (boolean enabled): sets the global makeclean parameter value. If set, make clean will always be run before compiling modules.
  • boolean get_clean (): obtains the current global makeclean parameter value.
  • void set_nonetwork (boolean enabled): sets the global nonetwork parameter value. If set, network access (and therefore checkouts) will always be avoided for the build and buildone commands.
  • boolean get_nonetwork(): obtains the current global nonetwork parameter value.
  • struct:string,string,string get_status(): obtains the current compilation status. The struct has three fields: the current command (build, buildone, updateone, update), the current module, and the status/phase (idle, build, checkout, …).

You can hook to the following signals (a small client sketch follows the list):

  • start_build_signal (): called when a build starts.
  • end_build_signal (array:string failures): called when the build ends. failures contains the list of modules that failed.
  • start_module_signal (string module): called when a module build starts. module contains the name of the module.
  • end_module_signal (string module, boolean failed): called when a module build ends. module contains the name of the module. failed tells if the module build has failed.
  • start_phase_signal (string module, string state): called when a phase in the build of a module begins. module contains the name of the module. state contains the name of the phase.
  • end_phase_signal (string module, string state, boolean failed): called when a phase in the build of a module ends. module contains the name of the module. state contains the name of the phase. failed tells if the build phase has failed.
  • message (string message): all the messages from the compilation logs are sent through this signal. If you want to retrieve the compilation log, you should hook this signal and collect all the messages. It is emitted in each phase, roughly every 5 seconds.
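To give an idea of how this looks from the client side, here is a small sketch of a Python client firing a build and hooking some of these signals. It is not part of the patch, and it assumes the old-style dbus-python bindings; the module names are made up.

import dbus
import dbus.glib    # old-style dbus-python: hooks D-Bus into the glib main loop
import gobject

bus = dbus.SessionBus()
proxy = bus.get_object('org.gnome.JHBuild', '/org/gnome/JHBuildObject')
jhbuild = dbus.Interface(proxy, 'org.gnome.JHBuildIFace')

def on_end_module(module, failed):
    print 'module %s finished (failed: %s)' % (module, failed)

def on_message(message):
    print message    # the compilation log arrives through this signal

jhbuild.connect_to_signal('end_module_signal', on_end_module)
jhbuild.connect_to_signal('message', on_message)

# fire an asynchronous build of two modules and their dependencies
jhbuild.build(['libxml2', 'glib'], [], '', '', {})

gobject.MainLoop().run()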

The biggest problem I’ve faced these days was how to launch the subprocess for the build/update commands. As it’s called from a DBus handler, I couldn’t put the jhbuild script in another thread, and the method should return immediately, without waiting for the end of the command. I had to use the gobject.Idle interface to hook the build into the gobject main loop, this way:

  1. I implemented a gobject.Idle subclass, and added a callback implementation that calls the builder script and returns False (so that it is called only once).
  2. I attach this idle object to the main loop, and then finish the DBus method implementation.

The code for the idle subclass is something like this:

import gobject

class JHBuildBuilderIdle(gobject.Idle):
    def __init__(self, builder):
        gobject.Idle.__init__(self)
        # run the build from an idle callback, outside the DBus handler
        self.set_callback(self.callback, builder)

    def callback(self, builder):
        builder.build()
        return False    # one-shot: returning False removes the idle source

And the call in the DBus handler is this:

@dbus.service.method('org.gnome.JHBuildIFace')
def build(self, modulelist=[], skiplist=[], start=None, datespec=None, options={}):
    [...]

    build = jhbuild.frontends.get_buildscript(ownconfig, module_list)

    # return immediately; the build itself runs later from the main loop
    idle = JHBuildBuilderIdle(build)
    idle.attach()
    return

Tomorrow I’ll add this work to the JHBuild Bugzilla in order to begin the discussion upstream.

DBus support for JHBuild

Today I began the experiment of adding DBus support to JHBuild. To do this, I’ve established these goals:

  • Be able to launch JHBuild as a DBus service, and test how I can set it up with DBus activation.
  • Implement a list method, equivalent to jhbuild list command.
  • Implement one of the build methods.

And guys, I did it! I’ve added a new dbus command to jhbuild that launches it as a DBus service and starts waiting for user calls. I also added a dbus frontend to buildscript. This frontend emits signals reporting the status of a compilation (and offers methods to get the logs and to query the current compilation status).
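The skeleton of the service is roughly like this (a simplified sketch using the old-style dbus-python API, not the actual patch):

import dbus
import dbus.service
import dbus.glib    # old-style dbus-python glib main loop integration
import gobject

class JHBuildObject(dbus.service.Object):
    def __init__(self):
        name = dbus.service.BusName('org.gnome.JHBuild', bus=dbus.SessionBus())
        dbus.service.Object.__init__(self, name, '/org/gnome/JHBuildObject')

    @dbus.service.method('org.gnome.JHBuildIFace')
    def build(self):
        # hand the real work over to the jhbuild buildscript
        pass

    @dbus.service.signal('org.gnome.JHBuildIFace')
    def start_build_signal(self):
        # the dbus frontend emits this when a build starts
        pass

JHBuildObject()
gobject.MainLoop().run()    # wait for user calls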

If I run:

dape@bonus:~$ dbus-send --print-reply --dest='org.gnome.JHBuild' \
    /org/gnome/JHBuildObject org.gnome.JHBuildIFace.build

it launches a complete build. For example, I can use dbus-monitor to view the events, and I get a log like this one:

signal sender=:1.95 -> dest=(null destination) interface=org.gnome.JHBuildIFace; member=end_phase_signal
string "libxml2"    string "checkout"    string "false"
signal sender=:1.95 -> dest=(null destination) interface=org.gnome.JHBuildIFace; member=start_phase_signal
string "libxml2"    string "build"
signal sender=:1.95 -> dest=(null destination) interface=org.gnome.JHBuildIFace; member=message_signal
string "Building libxml2"
signal sender=:1.95 -> dest=(null destination) interface=org.gnome.JHBuildIFace; member=message_signal
string "make  all-recursive make[1]:
Entering directory `/usr/local/devel/dape/cvs/libxml2'
Making all in include

This way we can communicate with the JHBuild compilation loop. It could be interesting for avoiding the launch of jhbuild (and the loading of Python) for each compilation in an integration loop, and it also adds an asynchronous channel to communicate with continuous integration tools. Tomorrow I’ll go on completing the dbus interface for the most used jhbuild commands, and also add customizability (currently it doesn’t accept many kinds of parameters; that should be improved). When it’s more polished, I’ll propose the patch for jhbuild.

Adapting JHBuild to continuous integration

These days I’ve been doing some work on JHBuild to make life easier for those who want to run it from a continuous integration loop. In particular, it should improve the experience of those using Buildbot (as we intend to do in the Gnome Build Brigade).

Checkout modes for JHBuild

In Buildbot, the most used build steps let you decide how the checkouts are done. There are currently four options:

  • Update mode: it checks out into a directory, then runs updates over the same folder where you compile the tree. It’s the way jhbuild currently works. Problem: sometimes there are generated files under CVS/SVN/whatever control, and when you update them, they can raise merge conflicts. This is very typical with gtk-doc templates.
  • Clobber mode: it wipes the build directory before checking out, so every time you compile you need to download the full tree. It avoids conflicts, as there are no merges. If you always use clobber, it’s good to have a compilation cache to avoid recompiling all the trees every time.
  • Export mode: similar to clobber, but it uses export instead of checkout. Basically, the difference is that you don’t get version control information in the build tree (so you can use it directly to generate a source package, for example).
  • Copy mode: works like update mode, but the version control operations are done in a separate directory. After checking out/updating, it copies all the files to the directory it will use to compile. It avoids the merge conflict problem and lets you develop in the version-controlled directory while still being able to create patches easily. Unfortunately, it stores the files twice.

For me, the most interesting options are copy mode and clobber mode. In continuous integration loops we usually don’t modify the compilation trees, but merge conflicts arise frequently. With those options, you get an easy solution.

I’ve added a bug and a patch to the JHBuild Bugzilla to support checkout options. With this patch, you can establish a global checkout mode, and specific checkout modes for each module, in the jhbuildrc file.
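Since jhbuildrc is just Python, the configuration could look like this (a sketch using the variable names from my patch, which may still change during review; the module name is just an example):

# global checkout mode: 'update', 'clobber', 'export' or 'copy'
checkout_mode = 'copy'

# per-module overrides
module_checkout_mode = {
    'gtk-doc': 'clobber',
}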

Running make check from JHBuild without breaking the whole moduleset build every time

Currently, JHBuild lets you run make check for modules that use the autotools module type. If you set the makecheck variable in jhbuildrc, it will run make check before installing the packages. The problem with this is that if the checks fail, JHBuild considers the module broken, and then it will not compile the modules that depend on it.

I’ve written a patch to change this behavior. With it, you can set a unittestfails variable in jhbuildrc (currently defaulting to True, to maintain the current behavior). If this variable is False, make check failures don’t break the compilation of the module: JHBuild goes on compiling, but shows a warning about it in the log.
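In jhbuildrc terms, this would look something like the following sketch (variable names as in the patch):

makecheck = True         # run "make check" before installing each module
unittestfails = False    # failed checks only warn; dependent modules still build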

There was an old bug about this in the JHBuild Bugzilla, so I added the patch to that bug.

Future work

Two ideas for JHBuild integration with other tools:

  • Convert JHBuild into a Python module. Doing this, we could have a version-control-independent library for Python scripts. Buildbot is implemented in Python, so this could be a way to integrate JHBuild with more flexibility.
  • Add DBus support. It would be interesting if applications could fire compilations and register to be notified of the different compilation events (end of each module, stage and so on). Then we could have better control of the compilation process from external tools.

I’ll be working on this next week.

Guadec 2006 experience. Build brigade!

As I wrote last week, I’ve been attending Guadec 2006 in Vilanova i la Geltru. A very positive experience, with lots of meetings, talks and interesting events in general.

Last Thursday there was a BOF about Continuous Integration, hosted by Juan José Sánchez. There I presented the work with Tinderbox here at Igalia, and we discussed the requirements for a Continuous Integration service for the Gnome project. The interest was greater than I expected, and we shared our experiences and decided to create a work group to get the infrastructure up. It’s called the Gnome Build Brigade, and we’ve set up a wiki (yes, everything belongs in the wiki!).
Currently we have the fantastic work from Frederic Peters (JHAutobuild). It has been running for the last few months, and offers developers RSS information about the compilation status.

But Thomas Vander Stichele proposed Buildbot, as they’re using it at Fluendo for GStreamer. The main reasons: well maintained, a big community, and more mature. So it seems the way to go, and I began to test it yesterday, as I’m planning to move the Fisterra continuous integration to Buildbot.

Yes, Buildbot is very easy. It’s stricter about clients (slaves, in its notation), but it lets the server administrator keep better control over them, and the model is more secure. I’ve also been learning a bit about Twisted, the framework it uses to implement an asynchronous event system. I also implemented a prototype of integration with jhbuild that works for small sets of modules, but does not handle dependencies yet.
My commitment to the Build Brigade group is the jhbuild work anyway. These days I’ll be working on JHBuild integration, extending it to fit better with CI systems (I’ve prepared a patch for splitting stages, and I want to add more checkout options). I’ll write about this in the coming days.

Guadec 2006

Yes, I made it to Guadec this Friday. After a long trip (a fast flight, a not-so-fast train ride), we arrived at the Gnome Village. There you can find lots of Gnomers from nearly any project you’re interested in.

Vilanova i la Geltru is a quiet village near Barcelona. Unfortunately, the Gnome Village is not near the Guadec rooms, so there have been lots of problems moving people between the places (dining rooms, the town centre, the bungalows in the village). There also seem to be few taxis, so if you get something wrong with the bus schedules, you’re in serious trouble.

But it’s Guadec. Lots of interesting presentations. Right now I’m writing from the Carpa, listening to an interesting talk by John Laerum about how to give a good presentation. I’ve realized I’m not very good at my presentations, but the tips are interesting, and I hope to learn something for my own presentations in the future.

Meanwhile, I keep improving the Gnome Tinderbox 3 deployment at Igalia. I hope it can be complete enough by this Thursday, so we can talk about it in the Continuous Integration BOF here at Guadec.

Tinderbox 3 building Gnome

Here at Igalia we’re very interested in several aspects of software quality, especially continuous integration. We’re used to having continuous integration services for our projects, as one of the fundamentals for handling a group of programmers accessing the same repository.

For this, we’ve used three different continuous integration servers, depending on the project:

  • Tinderbox2, for projects based on Gnome technologies. For example, you can see our Fisterra tinderbox, running compilations of our middleware every two hours.
  • CruiseControl, for Java/PHP web-based projects.
  • Tinderbox3, the one I’m working on these days.

As a result of the request on the Gnome Love list for a tinderbox, we started an effort to adapt Tinderbox to JHBuild and run modulesets inside it. This led to two parallel works, one with Tinderbox2 and one with Tinderbox3. Meanwhile, BxLUG has done some work in the same direction, which you can see on the Gnome JhAutobuild webpage.

The Tinderbox2 work was done easily. You can check the current experimental status of the portal in our Tinderbox2-based Gnome Tinderbox. We are integrating unit tests and coverage into it, and we’ve also added RSS feeds for the modules (see here).

And now I’m working on a Tinderbox 3 setup. Like the T2 one, it’s experimental. You can check it on our Gnome Tinderbox3 webpage. I’m tweaking and hacking it these days, so it’s changing fast. Some features are:

  • A wonderful show all builds view, containing the last compilations of all modules on a timeline.
  • RSS feeds for every module, plus one for all the trees. With them you can subscribe to the latest compilation failures.
  • SSL-enabled client and server. Communications in Tinderbox3 are done over HTTP/HTTPS, making it easy to set up a secure configuration of clients.
  • The compilation clients detect the modules containing a make check rule in their Makefile, and run the tests. In pure Tinderbox tradition, you can see the modules failing at the unit test stage in orange.

I would like to share this effort with the community. Related to this, a BOF will be held at GUADEC 2006 next week in Vilanova; there’s more information about the continuous integration BOF on the GUADEC webpage. It will be next Thursday, the 29th, at 12:00. See you there!