A bit of reading on music recommendations
Today I have been reading about the work done by the Distributed Computing Group at ETH Zurich. These guys are developing techniques for combining "social audio information" (tags and listening habits, taken from Audioscrobbler) to discover relationships between songs and artists, embedding them in a social audio space that the user can navigate. An opposite (or maybe complementary?) approach would be to use audio features to establish those relationships, but that might ignore some of the semantic and social information that is so relevant to how we understand music. They have developed a set of prototypes for Android (with video) and Amarok. The science looks interesting, and these kinds of approaches could well be the next step in the evolution of music players, now that the average collection is often too big to be dealt with using only long text lists.
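To get an intuition for how tags alone could relate artists, here is a minimal sketch: cosine similarity over sparse tag-count vectors. The artist names and tag counts below are invented for illustration, and this is just one simple way to compare tag profiles, not the group's actual method (which works with listening habits and a learned embedding space).

```python
from collections import Counter
import math

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse tag-count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Hypothetical tag counts, as one might collect from a service like Audioscrobbler
tags = {
    "Radiohead": Counter({"alternative": 90, "rock": 60, "electronic": 30}),
    "Portishead": Counter({"trip-hop": 80, "electronic": 50, "alternative": 40}),
    "Slayer": Counter({"thrash metal": 95, "metal": 70}),
}

print(cosine_similarity(tags["Radiohead"], tags["Portishead"]))  # shares tags, nonzero
print(cosine_similarity(tags["Radiohead"], tags["Slayer"]))      # no shared tags, 0.0
```

Pairwise similarities like these could then feed a layout algorithm (e.g. multidimensional scaling) to place artists in a navigable 2D space.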
OK, back to the paper now ("Social Audio Features for Advanced Music Retrieval Interfaces", if you are interested).