Author Archives: abolte

About abolte

Product keys, DRM, licensing servers, online product activation, adware, spyware, restrictive EULAs... or free software? For me, it wasn't a hard choice. It is all too common for non-free software to come back and bite you on the butt!

Farewell Grip, hello Sound Juicer

For probably over a decade, I’ve been using Grip to rip all my audio CDs to MP3, Ogg Vorbis, and (for the last few years), FLAC. I admit that it’s not the easiest ripper to use these days, but its flexibility is unparalleled.

Unfortunately, it is no longer maintained, and the last stable release was from 2005. Due to this, many major distributions (including Debian) have removed it from the official repositories… which means compiling. What joy. 🙂

I brought home a new CD today. It might have been possible to buy it online, but I’ve become increasingly disappointed with digital download audio purchases. Unexpected censoring/editing, poor audio quality, lack of high-quality/complete album artwork, having to rename tracks purchased from different online music stores to match my preferred naming convention, inconsistent tag naming conventions and tag types between stores, and a price that’s still very similar to the physical disc have all been steering me away from them. If I happen to be down the street near a local CD store, it’s more convenient to duck in and get the real thing rather than purchasing some downloadable hacked imitation later, which will also involve more work in the long run.

As soon as I arrived home, I threw the disc in my laptop’s external USB DVD drive to rip it (because seriously – who uses a CD player to play music these days?) and up popped Sound Juicer. That’s normally my cue to start compiling Grip, but this time I paused – it had been a long time since I had investigated alternatives, and GStreamer is the future (which is apparently what Sound Juicer uses), so I decided I’d spend a few moments to see if I could get Sound Juicer to rip the disc in the exact way I like it.

My music naming convention is as follows:
<lowercase_artist_name>/<lowercase_album_name>/<zero-padded_track_number>-<lowercase_filename>.flac

I don’t use white-space or special characters in file names (I replace them with underscores) – there’s no point, since getting the names exact is what tagging information is for – and this convention is extremely easy to parse. Since track names are always in album order thanks to the padded track number, I can always play an album with mplayer *.flac or gst123 *.flac – which is how I generally play my music. It’s also uncommon in *NIX to see upper-case in file names, so I keep everything lower-case for consistency. Additionally, tags should be in native FLAC format – not id3v2 or something. I only require that the band, album, album release year, track name and track number tags be set, but am not fussy if others are included.
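As a rough sketch, converting an arbitrary artist or album name to this convention is a one-liner. The `normalize` helper below is my own illustration – not something Grip or Sound Juicer provides:

```shell
#!/bin/sh
# Lowercase a name and collapse runs of whitespace/special characters into
# single underscores, per my naming convention. (Illustrative helper only.)
normalize() {
    printf '%s' "$1" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9.-]\{1,\}/_/g'
}

normalize "Alive or Just Breathing"   # -> alive_or_just_breathing
```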

Okay – so what’s Sound Juicer got? Clicking on Edit -> Preferences provides the following screen:

Hmm… doesn’t look too promising. The drop-down boxes don’t provide the exact configuration I want, but I can get pretty close. If only I could create my own custom settings! I quickly fired up gconf-editor to take a look at the behind-the-scenes configurable options, and found three relevant keys:

  • /apps/sound-juicer/path_pattern
  • /apps/sound-juicer/file_pattern
  • /apps/sound-juicer/strip-special

Clicking them provides a complete description of all the available parameters these can take, so I tweaked them as per the following screen shot:
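The same changes can also be made from a terminal with gconftool-2. A sketch only – the %-pattern values below are placeholders illustrating my convention; the exact tokens available are listed in the key descriptions within gconf-editor:

```shell
# Set the rip patterns directly (pattern tokens shown are placeholders – check
# the key descriptions in gconf-editor for the exact parameters available):
gconftool-2 --set --type string /apps/sound-juicer/path_pattern "%aa/%at"
gconftool-2 --set --type string /apps/sound-juicer/file_pattern "%dn-%tt"
gconftool-2 --set --type bool   /apps/sound-juicer/strip-special true
```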

Changes seem to be applied instantly, so I hit the big Sound Juicer Extract button, and lo and behold:

So far so good… but what about tagging information?

~/Music/Flac/killswitch_engage/alive_or_just_breathing$ lltag -S 01-numbered_days.flac
01-numbered_days.flac:
ARTIST=Killswitch Engage
TITLE=Numbered Days
ALBUM=Alive or Just Breathing
NUMBER=1
GENRE=Metalcore
DATE=2002-01-01
TRACKTOTAL=12
DISCID=9a0a860c
MUSICBRAINZ_DISCID=JakXtKdQUlm1n5i3sr9KRBqxIy4-
ARTISTSORT=Killswitch Engage
~/Music/Flac/killswitch_engage/alive_or_just_breathing$

Perfect!

Grip has served me well for a long time, and given me so many fond memories of spending hours ripping my entire collection… but now it seems it’s finally time to say goodbye – and hello Sound Juicer.

AMD Fusion E-350 – Debian Wheezy (testing) with Catalyst 11.5 drivers and VA-API support via XvBA

My new Sony Vaio 11.6″ laptop runs an E-350 processor. I’ve upgraded the RAM to 8Gb and thrown in an OCZ Vertex 2 SSD. Even using LUKS under LVM2, it feels insanely fast.

Everything I’ve tried works perfectly on it out of the box, including the card reader, HDMI port, Gigabit NIC, USB ports, sound card, wireless, Bluetooth, volume/brightness/video-output function keys, etc. Well, almost everything… the graphics are one glaring exception.

For instance, when I go to log out or drop to a virtual terminal, I get all kinds of screen corruption and can’t see a thing. I always need to hit CTRL+ALT+F2 to switch to a virtual terminal, and CTRL+ALT+DEL to restart the machine to get a picture back. Fortunately, the SSD means that’s under a 30-second wait to shut down and boot back into the login screen, but after owning this machine for just over two weeks it was starting to get old.

Further, 3D acceleration via Mesa3D was painful. Games like the Penumbra Collection ran at about 1 FPS. It looks like the Phoronix guys have tested newer versions with Gallium3D, which look slightly more pleasing; however, none of that is in Debian and I don’t want to waste lots of time recompiling everything and potentially causing new updates to break. Also, while I could watch 720p video reasonably well, 1080p video would occasionally cause slowdowns and prevent the viewing experience from being completely enjoyable.

Time to do something I’d rather not… and install the proprietary driver. 11.3 is currently in the testing repos, however it still isn’t efficient enough to allow 1080p video to play properly. Downloading the new 11.5 Catalyst driver also fails to properly install due to a DKMS compilation error.

Fortunately, Mindwerks has instructions on how to fix this issue under Ubuntu 11.04. With a few little tweaks they can be made to work with Debian too.

Instructions:

Debian Wheezy (7.0) users can finally get fglrx playing nicely with X.Org X Server 1.9.5. We can also make the latest driver work well with the 2.6.39 kernel.

The Mindwerks custom build procedure follows, adapted for Debian Wheezy (7.0) users. Note that the commands are for 64-bit, but the only change 32-bit users likely need to make is to download an i386 package instead in step 8. Also, the following commands assume you use the sudo package to gain root.

  1. Install the latest 2.6.39 kernel revision from kernel.org. eg.
    $ wget http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.39-rc7.tar.bz2
    $ wget http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.39-rc7.tar.bz2.sign
    $ gpg --verify linux-2.6.39-rc7.tar.bz2.sign linux-2.6.39-rc7.tar.bz2
    $ tar jxf linux-2.6.39-rc7.tar.bz2
    $ cd linux-2.6.39-rc7
    $ make-kpkg -j2 --rootcmd fakeroot --initrd --bzimage kernel_headers kernel_image
    $ sudo dpkg -i ../*.deb

    and boot it.

  2. Download the AMD Catalyst 11.5 64-bit driver installer.
  3. Install the Debian VA-API packages. Note that some say to get them from the Splitted Desktop website; however, I tried them, didn’t notice any benefit, and so reverted to the following:
    $ sudo apt-get install libva1 libva-x11-1 libva-tpi1 libva-glx1 libva-dev
  4. Extract the files from the package:
    $ sh ./ati-driver-installer-11-5-x86.x86_64.run --extract ati
  5. For 2.6.39 support, download this additional patch: 2.6.39_bkl.patch
    $ wget http://www.mindwerks.net/wp-content/uploads/2011/03/2.6.39_bkl.patch
  6. Check for Big Kernel Lock usage:
    $ cat /lib/modules/$(uname -r)/build/.config | grep -c CONFIG_BKL=y

    If the result of this command is 0, then download no_bkl.patch as well. For stock kernels you should get 0 and will need the patch – which is probably the main reason you are here. 🙂

    $ wget http://www.mindwerks.net/wp-content/uploads/2011/03/no_bkl.patch
  7. Then apply them:
    $ cd ati; for i in ../*.patch; do patch -p1 < $i; done
  8. Build your new ati/fglrx driver:
    $ sudo ./ati-installer.sh 8.85.6 --install
  9. Since we’re using the proprietary drivers, we might as well make the most of them. Grab the latest XvBA backend library for VA-API:
    $ wget http://www.splitted-desktop.com/~gbeauchesne/xvba-video/xvba-video_0.7.8-1_amd64.deb
    $ sudo dpkg -i xvba-video_0.7.8-1_amd64.deb
    $ cd /usr/lib/dri && sudo ln -s ../va/drivers/xvba_drv_video.so fglrx_drv_video.so
  10. If your /etc/X11/xorg.conf is missing you will need to run:
    $ sudo aticonfig --initial

    and then reboot.

That newly created package should work for the entire 2.6.39 series.

These steps are really useful for AMD Fusion users at the moment. Without VA-API, multi-threaded mplayer will occupy 100% of available CPU (both cores) and drop frames when playing a test 1080p MKV file I have (containing H.264+AAC+ASS). It’s not unwatchable, but it’s annoying. With VA-API-patched mplayer, CPU usage never hits 15% when playing the same test video!
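For reference, playback with the patched build looks something like this. The -va/-vo vaapi options come from the VA-API patch set and don’t exist in stock mplayer; the filename is just a stand-in:

```shell
# Decode and display via VA-API – both options are added by the mplayer-vaapi
# patches and are absent from unpatched mplayer builds.
mplayer -va vaapi -vo vaapi test-1080p.mkv
```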

It’s a pity that the Catalyst driver doesn’t currently generate correct deb packages for Debian, but the fglrx packages are not listed as a dependency of xvba-video, so it’s not a major problem.

Update (2011-05-17, 12:41am): Using the new kernel is not necessary. Although there appear to be a few AMD Fusion fixes in there, I haven’t noticed any benefit in using 2.6.39-rc7 – possibly because I’m using the proprietary drivers. In fact, I’m not completely convinced the Catalyst 11.5 drivers are required either as I was having problems getting mplayer to use VA-API when I was testing 11.3. I’d be interested to know if just steps 3, 9 and 10 would be sufficient for hardware decoding using the patched mplayer and the default fglrx-atieventsd fglrx-control fglrx-driver fglrx-glx fglrx-glx-ia32 fglrx-modules-dkms package versions. I might very well have done much more than required.

Update (2011-05-22, 9:04pm): You do not need Catalyst version 11.5 – you can use the version in the repositories. In fact, I recommend it. The reason (aside from having the drivers managed via apt) is that compiling and installing 11.5 via the instructions above won’t enable 32-bit compatibility. I tried running a few games under WINE yesterday, but WINE always complained that OpenGL (and hence DirectX) was not available to it. Presumably I needed whatever the fglrx-glx-ia32 package includes. Possibly I was just missing something as simple as a symlink before, but I didn’t investigate – I’d rather use the standard Debian packages wherever possible if they do the job. Also, you don’t need to manually fetch the Splitted Desktop xvba-video package – xvba-va-driver is basically the same thing, only A) xvba-va-driver is in the official Debian repositories and B) it specifies the fglrx packages as a required dependency.

So in summary:

  1. Don’t touch your kernel.
  2. $ sudo apt-get install libva1 libva-x11-1 libva-tpi1 libva-glx1 libva-dev xvba-va-driver fglrx-atieventsd fglrx-control fglrx-driver fglrx-glx fglrx-glx-ia32 fglrx-modules-dkms
  3. $ sudo aticonfig --initial
  4. Reboot
  5. Use the VA-API-patched mplayer for watching movies.
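To verify the result, a couple of quick checks (vainfo is in Debian’s vainfo package; fglrxinfo ships with the fglrx driver packages):

```shell
# Should list the VA-API backend in use (the XvBA driver) and the supported
# decode profiles (H.264, VC-1, etc.):
vainfo

# Should report AMD's proprietary OpenGL renderer rather than Mesa:
fglrxinfo
```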

It’s easy when you know how. 🙂

Update (2011-07-03, 12:11am): If you run gnome-mplayer (perhaps so you can make use of gecko-mediaplayer for libva-accelerated video in Firefox), be aware that Debian currently ships nothing later than 0.9.9.2 – even in sid. Thus, you will encounter a problem with a filter gnome-mplayer automatically uses which is incompatible with VA-API – causing it to be automatically disabled. I can verify that 1.0.4 avoids this issue, and allows gecko-mediaplayer to work wonders when combined with the ViewTube and YouTube HTML5 forever Greasemonkey scripts.

It’s not you, it’s me.

I’ve used a number of GNU/Linux distributions over the years, some more than others. This weekend, I’ve apparently been overcome by nostalgia and revisited a distribution I haven’t used in years.

Early beginnings

When I first started getting serious about leaving Windows in the late ’90s, I used RedHat 5.2 as my primary OS. Release after release went by, and I became convinced it was continuing to lose its charm. There was a limited number of packages available, and I regularly had to hit rpmfind.net to get the things I wanted installed. First, RedHat 7 shipped with a broken compiler. Next, RedHat 8 shipped with Bluecurve, which seemed like a crippled mash of GNOME and KDE, and getting certain things to compile against either desktop could be quite difficult. Given the small repositories, compilation was frequently necessary. There had to be a better way…

I occasionally glanced Debian’s way, but at the time it was too archaic. I could not (for example) understand how anything with such a primitive installer could be any good. eg. Why would one ever want to manually specify which kernel modules to install? Why not at least have an option to install the lot and just load what’s required, like other distributions did? Then, once you got the thing installed and working, the packages would be quite old. You could switch to testing, but then you don’t have a stable distribution any more. The whole distribution felt very server/CLI-oriented, and at the time (having not so long ago come from a Win9x environment) it was not easy to appreciate. It also lacked a community with a low barrier to entry – eg. IRC and mailing lists instead of forums.

Enter Gentoo. It seemed to always make the headlines on Slashdot, so I had to see what all the fuss was about. Certainly it was no snap to install, but the attention to detail was incredible. The boot-splash graphics (without hiding the important start-up messages, as other distros later tried to do) and automatic detection of sound and network cards were a sight for sore eyes. The LiveCD alone was my go-to CD for system rescues for many years due to its huge hardware support, ease of use, high-resolution console and short boot time (as it didn’t come with X). Further, everything was quite up-to-date – certainly more so than basically anything else at the time. Later I came to realise the beauty of rolling releases and community.

So I compiled, and compiled, and compiled some more. My AMD Duron 900MHz with 512Mb of RAM and 60Gb IDE HDDs seemed pretty sweet at the time, but compiling X or GNOME would still basically require me to leave the machine on overnight. It didn’t help any that I only had a 56k dial-up connection. Sometimes packages would not compile successfully, so I would have to fix them and resume compilation the next morning (although the fix was only ever a Google search away). With the amount of time I invested into learning Gentoo GNU/Linux instead of studying my university courses, it’s amazing I managed to get my degree. I even went so far as to install Gentoo on my Pentium 166MMX laptop with 80Mb of RAM (using distcc and ccache).

Later, during my university work-experience year, I used Gentoo on the first production web-server I ever built. Another “senior” system administrator came along at one point to fix an issue it had with qmail, and he was mighty upset to see Gentoo. Not because it was a rolling release (and I’m pretty sure Hardened Gentoo didn’t exist at the time either), but because he didn’t know the rc-update command (he expected RedHat’s chkconfig) and was only familiar with System V and BSD init scripts. I think he would have had a heart attack if he later saw Ubuntu’s Upstart system (which has also been quite inconsistent across releases over the years)!

During my final university year, I used Gentoo again for a team-based development project. For years it seemed so awesome that I could hardly bear to look at another distro. The forums were always active, and the flexibility the ports-based system provided was unmatched by any other GNU/Linux distribution. However, it was at this time that Ubuntu started making Slashdot headlines – perhaps more so than Gentoo. Whilst I was mildly curious and read the articles, I didn’t immediately bother to try Ubuntu myself. After all, it was just a new Debian fork, and I knew about Debian. The next year I attended the 2005 Linux.conf.au in Canberra, still with my trusty P166 laptop running Gentoo I might add. After arriving at the hallway where everyone was browsing the web on their personal laptops with the free wifi, it was immediately obvious I was the odd one out. Not just because my computer was ancient by comparison, but because I was just about the only person who *wasn’t* running Ubuntu. I think I literally saw just one other laptop not running Ubuntu – I simply could not believe it. I had to see what the fuss was about.

A few days after the conference, I partitioned my hard disk and installed Ubuntu Warty. Hoary had just been released, but I already had a Warty install CD and I was still limited to dial-up – broadband plans required a one-year contract and installation fees, and I was frequently moving around. Unlike Debian, the Warty installation was quick and simple. The community was quite large, and while it seemed smaller than Gentoo it was growing rapidly. Because user-friendly documentation actually existed, I felt more comfortable using the package management system. I didn’t need to use the horrible dselect package tool either. The packages were quite up-to-date; perhaps not on the same level as Gentoo, but close enough that any differences didn’t matter. Ultimately I was obtaining all the notable benefits Gentoo had offered, without the downside of huge downloads and slow compilations. By the end of the year I wasn’t using Gentoo on my personal machines any longer.

Side-note:

Seemingly inspired by Ubuntu, Debian has made significant improvements over the years since – shorter release cycles, more beginner- and user-friendly support pages and wikis (although no forums so far), a much easier and more efficient installer, etc. Additionally, because of my familiarity with the Debian-based Ubuntu, I later found myself far more comfortable using Debian. I might even go so far as to say enjoying Debian. Indeed, my home server is a Debian box running Xen, whereas Ubuntu doesn’t even support Xen in Dom0. Further, unlike Ubuntu, Debian hasn’t changed the init script system every six months. Each release does things the way you would expect, whereas Ubuntu frequently “rocks the boat”, requiring more time going through documentation trying to figure out how new releases work. Being a free software supporter, I also don’t appreciate Ubuntu recommending that I install the proprietary Flash player in my Firefox install, or offering to automatically install nVidia’s proprietary drivers. If I need to use proprietary software, I want it to be a very conscious decision so I know exactly which areas of my OS are tainted. In some ways, Debian is stealing back some of the thunder it lost to Ubuntu – at least in my book.

Life after Ubuntu

The installation I had been using on my personal desktop was an Ubuntu install, upgraded with each release from 7.10 (Gutsy) to the current 10.10 (Maverick), occasionally adding 3rd-party PPAs and whatnot over the years. Whilst I had upgraded my file-systems to use ext4, I knew I would need to reformat them from scratch to get the most performance benefit from them. There was probably a ton of bloat on the system (unused packages, hidden configuration files and directories under $HOME, etc.) and many configuration files probably differed largely from the improved defaults one would get on a fresh modern install. As this weekend was a day longer than usual due to Australia Day, I ultimately decided it was a good time to reinstall.

What to install, however? Ubuntu again? I was seriously getting tired of it. It just didn’t feel interesting any more, although it did the job. Additionally, Ubuntu doesn’t strip the kernel of binary blobs like other distributions do (including Debian, starting with Squeeze) – another cross in my book. No, it was time for a change. Debian might have been the obvious choice – particularly since Squeeze was released recently, making it fresh and new. However, I was already somewhat familiar with Squeeze (I installed it on my Asus EeePC 701 netbook last week) and it still felt a bit dated in some areas compared to Ubuntu. I’m also conscious that the next version probably won’t be released as stable for at least another year or two. Further, I don’t appreciate how Debian considers many of the GNU *-doc packages non-free. I want those installed, but I hate the thought of enabling non-free repositories just to get them. Perhaps I could find something completely free instead?

Completely free GNU/Linux options

With that in mind, I did a little bit of investigation into completely free GNU/Linux distributions (as per the FSF guidelines). Let’s look at the list:

Blag

Based on Fedora. I don’t like Fedora, but I might be willing to look past that. I do however need a distribution that is kept up to date. According to Wikipedia, “the latest stable release, BLAG90001, is based on Fedora 9, and was released 21 July 2008.” In all likelihood, this suggests that the stable release hasn’t seen security updates in a long time (Fedora 9 dropped support some time ago). The Blag website does however list a 140k RC1 version which is based on Fedora 14 (it’s not clear when this was posted), but other parts of the website, such as the Features section, still reference Fedora 9 as the base distribution. It would seem the latest Blag RC has only made an appearance after well over 2 years, and it’s not even marked as stable.

Further, I’m a little skeptical of installing a distribution that is basically the same as another distribution with bits stripped out to make it completely free – there is always the chance that something was missed, or a certain uncommonly-installed package will malfunction with part of a dependency removed. On the face of it, Blag fills me with doubts.

Dragora

Something new, and not based on any other distribution. Sounds intriguing. Just the coolness factor of using something unlike anything I’ve ever used before gives it bonus points, if only because the people behind it must be serious to have done so much work (eg. a new init system, new package management system, installer, documentation, etc.). The Wikipedia page is light on details, and the project website redirects to a broken wiki with the text “Be patient: we are working in the content of the new site. – Thanks”. There is a link to the old wiki at the bottom of the page, but it redirects to a Spanish page by default. Fortunately there is an English version you can select, and it looks mostly as complete as the Spanish version.

It’s nice to see this project has its own artwork and a few translations. There is also a download wiki page which indicates that there is a regular release cycle. Although the wiki doesn’t indicate which architectures are supported, at least two of the mirrors provide x86_64 downloads. SHA1 checksums and GPG signatures of the images are accounted for – always a good sign to see integrity considered. As an aside, apparently Arch GNU/Linux doesn’t verify package signatures downloaded via its package management system which is why I would never consider using it.

Dynebolic

From the FSF’s description, Dynebolic focuses on audio and video editing. I actually do intend to purchase a video camera at some point this year so I can start uploading videos, however for now I have no need for such tools.

gNewSense

The distribution Richard Stallman apparently uses as of late. I suspect that since Stallman now owns a MIPS-based netbook, he found gNewSense was able to support the configuration better than Ututo (which he used previously). Unfortunately I use an Intel i7 with 6Gb of RAM and a GTX480 graphics card with 1.5Gb of RAM – I need a 64-bit distribution to address all that memory, and gNewSense doesn’t support the x86_64 instruction set. The last stable release (v2.3) was 17 months ago too, according to Wikipedia at the time of writing.

Musix

Based on Knoppix. Isn’t Knoppix a blend of packages from various distributions and distribution versions? Last I checked (admittedly many years ago) it didn’t look like something easy to maintain. Musix also apparently has an emphasis on audio production, which would explain the name. As mentioned previously I don’t have a requirement for such tools.

Trisquel

I’ve actually used this at work for Xen Dom0 and DomU servers (although it’s surely not the best distribution for Dom0 – I had to use my own kernel and Xen install), and it works quite well. Basically, the latest version is just a re-badged Ubuntu 10.04 LTS with all the proprietary bits stripped out. Unfortunately, in many ways it’s a step back from what I was already using on my desktop – older packages than Ubuntu 10.10, a far smaller community, and much less support/resources (although 90+% of whatever works for Ubuntu 10.04 will work for the latest Trisquel – 4.0.1 at the time of writing). From what I can see, there is basically no compelling reason for me to use Trisquel on my personal desktop as my primary OS at this time – I would probably be better off installing Debian Squeeze with the non-free repositories disabled.

Ututo

Based on Gentoo. Sounds good, right? However upon further investigation, it’s clear that the documentation is entirely in Spanish! This may not be a problem for Stallman, but I don’t speak (or read) Spanish.

Venenux

Apparently this distribution is built around the KDE desktop. I personally don’t like KDE and don’t run it, so it doesn’t bode well so far. Heading on over to the Venenux website I once again fail to find any English, so scratch Venenux off the list.

So that’s it – the complete list of FSF-endorsed GNU/Linux distributions. Of all of the above, there is only one worthy of further investigation for my purposes – Dragora.

Dragora on trial

I downloaded, verified and burnt the 2.1 stable x86_64 release, booted it, and attempted an installation.

As is the case with the website, during installation I faced quite a bit of broken English – another potential sign that the project doesn’t have much of a community around it. However I wasn’t about to let that get to me. Perhaps it did have a large community, but simply not a large native-English-speaking community? Besides, English corrections aren’t a problem – I can take care of that by submitting patches at any time should upstream accept them.

The next thing I noticed was that my keyboard wasn’t working. I unplugged it and plugged it back in again, and pressed caps-lock a few times, but didn’t see the caps-lock LED light up. Clearly, the kernel wasn’t compiled with USB support (or the required module wasn’t loaded). Luckily, I managed to find a spare PS/2 keyboard, which worked.

Like Gentoo, the first step involved manually partitioning my hard drives. I like to use LVM on RAID, and fortunately all the tools needed to do that were provided on the CD. It wasn’t clear if I needed to format the partitions as well, so I formatted over 2Tb of data anyway. Unfortunately, during the setup program stage, the tool formatted my partitions again even though I checked the box telling it not to, which wasted a lot of time.

While waiting, I looked over some of the install scripts and noted that they were written in bash. All comments looked to be in Spanish, even though all installer prompts were in English. I found this both strange and frustrating. I also elected to install everything, as that was the recommended option. Lastly, I attempted to install Grub… and things didn’t go so well from that point onwards. As far as I could tell, the kernel was not able to support an MDADM+LVM2 root file-system (/boot was on a RAID1-only MDADM setup, which I know normally works). It looked like I would either need to figure out why the kernel wasn’t working, or reformat everything and not use my 4-disk RAID5 array to host an LVM2 layout containing the root file-system. At this point my patience had truly run out and I decided it was time to try something else. I also never managed to figure out how the package management system worked, as that was one section missing from the wiki documentation.

An old friend

So what did I end up with? Gentoo! Not exactly a completely free-software distribution… in fact, I was shocked by just how much proprietary software Gentoo supports installing. The kernel won’t have all the non-free blobs removed either, although I suspect I don’t need to compile any modules that would load them. Gentoo is also feeling very nostalgic to me right now.

Since the last time I used it, there have been some notable changes and improvements. For example, stage1 and stage2 installations are no longer supported. Previously, I only ever used stage1 installs so this felt a bit like cheating. Additionally when compiling multiple programs at once, package notes are now displayed again at the end of compilation (regardless of success or failure of the last package) so you actually have a chance to read them. Perhaps the most important change however is that now source file downloads occur in parallel with compilation in the background (which can be easily seen by tailing the logs during compilation) – this saves a lot of time over the old download, compile, repeat process from years ago.

All USE flags still aren’t listed in the documentation, and I have run into some circular dependency issues by using the doc USE flag (which is apparently common, according to my web searches). I’ve had to add 6 lines to the package.accept_keywords and package.use files to get everything going. However, now I’m all set up. I’ve compiled a complete GNOME desktop with the Gimp, LibreOffice and many others. Unlike the days of my Duron, which had to be left on overnight, my overclocked i7 950 rips through the builds – sometimes as quickly as they can be downloaded from a local mirror over my ADSL2+ connection.
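For anyone who hasn’t touched these files before, the entries look something like the following – the package atoms are illustrative only, not the exact six lines I added:

```shell
# /etc/portage/package.use – per-package USE overrides, eg. dropping 'doc'
# where it causes a circular dependency during the initial build:
dev-libs/glib -doc

# /etc/portage/package.accept_keywords – accept a testing (~amd64) version
# of a specific package:
app-office/libreoffice ~amd64
```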

Although the Gentoo website layout has largely been unchanged over the years, and some of the documentation is ever-so-slightly out of date, I get the sense that Gentoo is still a serious GNU/Linux distribution. I didn’t encounter any compilation issues that couldn’t be quickly worked around and it feels very fast. I still use Ubuntu 10.10 at work and Debian Squeeze on my netbook and home server, but going back to ebuilds still feels very strange and somewhat exciting. If only Gentoo focused more on freedom it would be darn near perfect.

Birth of the FreedomBox Foundation

Eben Moglen’s FreedomBox idea has caught my attention ever since his Silver lining in the cloud speech in August last year. Unfortunately, I hadn’t noticed any visible progress on the project – until today. It looks like things have indeed been going on behind the scenes, as Mr Moglen has created the FreedomBox Foundation.

This inspired me to watch another of Moglen’s talks – Freedom in the Cloud (transcript here) – an older video that inspired the Diaspora project. Whilst it didn’t shed any more light on the subject (it was slightly more vague about how a FreedomBox device would function), Moglen was certainly right that people have been all too happy to sacrifice their privacy for convenience.

This blog runs on my personal home server. If the government wants to know what information I have on it or who has been accessing it, they can get a search warrant. They would have to physically enter my home and take my computer away to get it. The logs are all stored here – not on Facebook, Twitter or anywhere else. Nobody cares more about my data than me, and the government or anyone else who wants my data will have to go through me. That’s privacy.

My wife also has the option of using the home server for hosting her blog – but she refuses. Instead, she decided to post all her blogs and photos on Yahoo Blogs.

When I asked why, she told me that she wanted to know who was visiting her website and asked if I could tell who visited my website.

“Sure I can… kinda. I can give IP addresses. I can look up which countries those IP addresses have been allocated to. Alternatively, I could potentially see the user-names of visitors who logged in – required if somebody wants to post something.”

My wife was not impressed. “I want to see a list of user-names for everyone,” she claimed. “Simple,” I replied – “only allow people to view content when they log in.” In theory they shouldn’t have any privacy concerns, since they obviously already need to be logged in to visit your site at Yahoo.

“Ah – that won’t work. They are already logged in when they visit other blogs. Nobody will create a separate login just for one blog – people are too lazy and nobody will visit.”

And there you have it. Seemingly, many people who use Yahoo Blogs (and presumably Facebook) feel the same way. I personally don’t care who visits my website and don’t see why I should care. If somebody wants me to know they visited, they can drop me an e-mail or post a comment.

OpenID would solve part of the problem my wife describes – it would reduce the burden of creating a new account, but won’t eliminate additional steps. It also requires the reader to already have an OpenID account to see any benefit, and it’s just not popular enough. I just spent a few minutes clicking through my bookmarks, and I could only find one website with OpenID support – SourceForge – and even then they only support a limited number of OpenID providers.

Will the FreedomBox project fix my wife’s use-case? Most probably. One of the primary goals is “safe social networking, in which, without losing touch with any of your friends, you replace Facebook, Flickr, Twitter and other centralized services with privacy-respecting federated services”. Yahoo Blogs is most probably popular enough that it would be included in that list.

How would the transition work though? If my wife had a FreedomBox, she would presumably be able to navigate a web interface to have it suck down the Yahoo Blogs data and host it locally. Next, her Yahoo page would add a link to her FreedomBox URL. When people visit, they would either be granted or denied access based on whether she had previously granted a particular user access. If denied, there would be an option to request said access.

However, say my wife decided to use a FreedomBox before all her Yahoo friends had one – how could she be sure that person X really is Yahoo Blogs person X before granting X access? That’s where things get tricky, and it’s the part of the picture I’m not too clear on.

The only thing I could imagine working would be for person X to have an account on a third-party website that speaks whatever protocol the FreedomBox uses. Obviously this means another account, but as with Yahoo Blogs, the one sign-in would grant access to all the FreedomBox blogs. Further, like an OpenID provider, the third-party website in question could be hosted anywhere. Perhaps OpenID providers themselves will even provide this functionality, thereby eliminating the sign-up process for those who already have an OpenID account.

I imagine it’s going to be a hard battle, but if it picks up it has the potential to be unstoppable.

Ultimate Free Software Web Browsing Experience

I want the web to work the way it was intended, using only 100% free software. Is that so much to ask? Apparently so – and almost exclusively due to Flash.

Flash. I concluded long ago that it’s impossible to have a good web browsing experience with or without it, so you might as well protect your freedom and go without it. As a GNU/Linux user, it presents so many problems. Even if using the proprietary player were acceptable, it is plagued by bugs such as the famous “Linux Flash plugin is unresponsive to mouse clicks” issue that Adobe doesn’t even seem to acknowledge. There are various workarounds, but that’s not the point. Then there’s the issue of 64-bit GNU/Linux distributions typically bundling 64-bit versions of the Firefox web browser. Too bad Adobe dropped support for the 64-bit Flash plugin while it was still in beta, leaving users with an unsupported version with known security vulnerabilities. [Update: It seems Adobe changed their minds again! Most people still hellbent on using Flash have already had to switch away from the unsupported beta by now, so what is the point?]

Want to know what really gets on my nerves? 99% of the Flash content that I am actually interested in is just video – the stuff that Firefox and Chrome have included native support for, for over a year now, via Xiph.Org’s Ogg Theora/Vorbis formats. Heck, even Opera finally included support for these free video formats early this year. Those three browsers alone account for over 40% of web browser usage worldwide. Of course, Microsoft doesn’t want anything to do with free software, and Apple generally tries to pretend it doesn’t exist wherever convenient. Since the majority of web browser usage does not include native video support but does include the Flash plugin, for a lot of websites Flash is the easy fix. This of course forced more people to use Flash, which caused more websites to use it, which caused more people to use Flash… you get the idea. Even though Flash has been responsible for a huge number of system compromises, people feel forced to use it anyway.

The W3C recognized the need for an open standard so that anyone could play video on the web regardless of Adobe’s approval. When HTML5 was being drafted, VP8 was proposed as the video codec. Why VP8, when three of the five major browsers had native Theora support already? The answer, of course, was video quality. Everyone was whining that Theora wasn’t as high in picture quality as H.264, and everyone wanted video quality to be as nice as possible. With H.264 being unusable (encumbered with patents), Google generously purchased On2 Technologies, creators of the wonderful VP8 codec, and released it as a truly free format. As the highest-quality open-spec, free-of-patents codec that anyone can use, this paved the way for the W3C to give it serious consideration.

Unsurprisingly, Microsoft made it clear that they would not support a free format. Period. Microsoft doesn’t need to provide a reason – given their usual attitude towards open standards, or anything that would benefit free software or competition of any kind, rejecting the proposal was a given. Historically, Microsoft has deliberately not followed standards (e.g. IE6, MS Office… anything really), so having the suits at Microsoft disagree was completely expected. Still, if everyone else supported the standard, and with IE’s popularity continuing to fall, this might be enough to either force Microsoft’s hand, or make IE basically irrelevant – eventually.

There’s one other (somewhat) common browser – Safari. Apple’s response to VP8? Utter BS – patent FUD and bogus hardware claims. Apparently a lot of Apple hardware has dedicated H.264 decoder chips (presumably iPods, iPhones and such), which Apple seems to suggest can only be used for H.264. I don’t believe it. Considering how similar H.264 and VP8 actually are, you’d think a company like Apple would be able to make it work. Anyway, Apple comes out with new iWhatevers every year, and provides basically no support for older devices. Last I checked (a few months back – don’t have a link), there were various vulnerabilities in the 1st generation iPod Touch models which Apple has no intention of fixing, and that model was only superseded by the 2nd generation just on 2 years ago. That’s right – if you bought your iPod Touch around 2 years ago, Apple wouldn’t care about you today. Given this forced rapid upgrade cycle, it should be no problem at all for Apple to get hardware support into all its devices relatively quickly – after all, we’re talking about the company that got Intel to redesign a smaller CPU to fit their MacBook Air. If Apple can boss Intel around to get chips working the way they want, they likely can with any hardware company.

As for the patent FUD, Apple claims that H.264 hasn’t been proven in court not to infringe on patents. Steve Jobs famously claims that all video codecs are covered by patents. If this actually were true – that it was impossible to create a video codec without stepping on a patent – the patents in question would surely have to be invalidated for being obvious or by demonstrating prior art. Either way, Apple’s talking trash. The real reason for rejecting VP8 is surely the same as Microsoft’s – to keep themselves off a level playing field with their most direct browser competitors. Mozilla, Google and Opera won’t pay for MPEG-LA patent licenses on a per-user basis, since the browsers can be copied and distributed to anyone without charge – and there would be no way to track the licenses anyway. Even if (for example) the Mozilla Foundation did find a way to overcome these obstacles, what of projects that fork Mozilla? Mozilla is free software. If all derivatives weren’t covered, Firefox wouldn’t be free anymore. If they were covered, no project would ever have to pay the MPEG-LA again, since they could just borrow the Mozilla code – a licensing deal the MPEG-LA would never agree to. Clearly, the future of video on the web cannot possibly depend on paying patent licenses.

So where does this leave us? I predict that if HTML5 does not specify a format for the video tag, we’ll continue to see Flash dominate as website owners’ preferred video delivery option for many years to come. Couldn’t we just dump Flash already and have the Microsoft fanboys install the Xiph.Org DirectShow Filters package (which apparently adds support for IE’s <video> tag)? That could work in a lot of cases; however, if it really took off you could be sure that Microsoft would find a way to “accidentally” break the plugin somehow. It wouldn’t be the first time. I recall that a Microsoft IE 5.5 beta (if I’m remembering my version numbers correctly) would prevent executable files named hotdog.exe from running. That file name was commonly used for the installation program of Sausage Software’s HotDog HTML editor – direct competition to Microsoft FrontPage back in the day. Rename the file to setup.exe and you were in business – not easy to do when the file came on a CD. Microsoft could argue that the incompatibility was only in beta software, but web developers would likely have installed it.

Getting back on track… <cough>… if the future of web video is in Flash, what can we do about it? How can we play our video using 100% free software? We’re not out of options yet. Adobe has announced that upcoming versions of Flash will support VP8! How does that help us? If webmasters want to reach as close to 100% of their audience as possible right now, H.264 is the best option. As much as I hate it, H.264 can be played back via Flash on 90+% of desktops. Encoding in a second format to reach users who don’t have Flash installed might not be cost-effective once time and storage costs are considered. However, when Flash supports VP8, everyone can adopt that format and not need to worry about encoding in H.264 as well. People without Flash but using Firefox, Chrome or Opera can gracefully fall back to watching the video natively. That way, website video will work on all free-software-only desktops. The numbers can be improved still further by updating Cortado, the free software Java applet video player, to add WebM support. That combination would likely get us as close to 100% compatibility as reasonably possible using only a single codec.

There are some reasons why this could fail. Perhaps the percentage of IE users who don’t have Flash, Java or the DirectShow Filters installed (but can play video natively thanks to IE9 or later) will be larger than the number of GNU/Linux desktop users. I expect this to be very unlikely. However, if H.264 remains the only option for iPhone-style devices, that might help tilt the scales in H.264’s favor. Another problem is that a lot of video recording devices, such as webcams and some digital camcorders, record to H.264 natively. It might be more efficient for the website maintainer to keep video in that format (even if heavy edits are required). Fortunately, most web videos are so short that transcoding times probably won’t matter… but it’s a minor concern.
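If a site did decide to transcode its H.264 masters to WebM, the job is roughly one command per file. A sketch, assuming an FFmpeg build with libvpx and libvorbis enabled (the bitrates and filename are placeholders, not recommendations) – the function just prints the command so it can be inspected before running:

```shell
# Build the transcode command for one source file, replacing its
# extension with .webm for the output name.
to_webm_cmd() {
    src=$1
    printf 'ffmpeg -i %s -vcodec libvpx -b 1000k -acodec libvorbis -ab 128k %s\n' \
        "$src" "${src%.*}.webm"
}

# Print rather than execute, so the command can be reviewed first:
to_webm_cmd episode.mp4
```

Pipe the output to `sh` (or drop the `printf` wrapper) once you are happy with the command line.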

But what about playback today, using entirely free software? Flash sucks on GNU/Linux! Enter Gnash and FFmpeg. The latest Gnash version (0.8.8 at the time of writing) works with YouTube 99% as well as Flash on Windows. Other video sites… not so much. In particular, I still have problems with Gnash when I try to play AVGN and Zero Punctuation – but I have a solution for these as well: the gecko-mediaplayer plugin with Greasemonkey. Once those are installed, grab the Greasemonkey scripts Download Escapist Videos and Gametrailers No Flash. You will also want to install rtmpdump. With all of those installed, when you want to check out Zero Punctuation, simply click the Download link that now appears under the video. Gecko MediaPlayer will kick in and give you video that takes up the entire browser window. As for AVGN, I discovered that GameTrailers hosts all the ScrewAttack content, which includes many of the AVGN videos. Simply head on over to the ScrewAttack section – the latest video should be towards the top. Note that you have to sign in for the script to work, but basically it just takes the Download link and streams it to Gecko MediaPlayer, which gets embedded in the area where Flash normally resides. It works perfectly.
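Under the hood, those Greasemonkey scripts boil down to feeding an RTMP stream URL to rtmpdump. A sketch with a made-up stream URL (the real URLs are scraped from the page by the scripts); as above, the function prints the command it would run:

```shell
# Build an rtmpdump invocation: -r is the RTMP stream URL,
# -o is the local file to save the stream to.
rip_cmd() {
    url=$1
    out=$2
    printf 'rtmpdump -r %s -o %s\n' "$url" "$out"
}

# Hypothetical stream and output name, purely for illustration:
rip_cmd rtmp://example.com/videos/episode42 episode42.flv
```

The saved file can then be handed to any local player once the rip completes.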

So there are a lot of hacks involved. Gnash is buggy, and FFmpeg might have patent issues depending on the codec and your country. The AVGN solution involved finding an alternative (but still non-free, possibly patent-infringing) format. Lastly, the Zero Punctuation hack basically involved a stream ripper, Gecko MediaPlayer and (probably) FFmpeg too. This is ugly as hell, but it works – and when it works for the first time, it’s a wonderful feeling. Unfortunately, if you want native WebM in Firefox you need to upgrade to the Firefox 4 beta, and today’s Greasemonkey checkout still has serious compatibility issues with it (although it’s being actively worked on). When Greasemonkey works natively in Firefox 4 and both projects release a stable build (expected this year), things will be looking very nice… and I imagine Gnash will get better in the meantime. YouTube is also testing encoding videos in WebM format, so hopefully they keep that up and encourage other video sites to follow suit. All systems are go!

What an awesome week for freedom

Wednesday, Richard Stallman gave a speech on Free Software in Ethics and in Practise at Melbourne University. Thursday, he gave a talk on Copyright vs Community in the Age of Computer Networks at RMIT. I had the pleasure of attending both, and asking Richard a few quick questions regarding Android phones, IaaS and the FreedomBox project.

Today, I just got back from the Melbourne Software Freedom Day event at the State Library. I was pleasantly surprised by the number of volunteers in orange t-shirts. Whilst many of the talks were focused specifically on open source as opposed to free software (such as the RedHat talk), all talks I elected to attend were very interesting and occasionally even exciting.

Well done to everyone who helped organise and run the event, and a special thanks to the sponsors: the Victorian Government, the State Library of Victoria, LUV and Linux Australia. Very much looking forward to next year.

Being a sysadmin is dangerous work

Okay… so I hit my head on an overhead sprinkler nozzle. Those things are sharp! Despite the small size of the cut, I needed three stitches.

not my first forehead scar

Hard to believe the mess it made. I just happened to be wearing shoes I bought just two weeks ago, and pants that I had recently had professionally patched.

Shoes need a clean

quite a mess

Nothing compared to what Lucas had to clean up. Sorry! 🙂

Also – sorry to all users who had issues with the sign-up page earlier. I believe these are all sorted now. If there are any further problems, feel free to drop me an e-mail using the address in the website footer.

The ethics of gaming with non-free software

You’ve probably already read my rant about how PC gaming looks doomed due to overpriced games, no second-hand market and, most importantly, DRM. I haven’t really bought that many games since then for any platform; I’ve been cutting back, since I was previously purchasing games quicker than I could play them. As such, I’ve been left with an overwhelming number of nice-looking titles on my shelves, many of which had never even been loaded! Time to do something about that…

You might be wondering why, as a free software advocate, I allow myself to run non-free gaming software. The problem is that the FOSS development methodology doesn’t lend itself well to creating surprises. Take any FOSS FPS, for example: these have been gradually developed in public, with project updates reported across various websites throughout development. There is little surprise left.

Games are about more than just software, hardware and reflexes. Generally, the type of single-player game that interests me will have great storytelling, artwork, suspense, puzzles and music, and will essentially provide a great overall experience. Hypothetically, if a FOSS game were under public development until completion, by the time the game was finished many aspects of it would feel old. Puzzles would have been solved in the betas, any surprises would be expected, and suspense would be eliminated because one would know what was coming from all the pre-releases.

There are exceptions. If a game looked sufficiently boring that I didn’t bother to look at it during development, I might look at it for the first time as a finished product. More likely though, it’s probably not my type of game and wouldn’t be played at all.

Another exception would be if the game was developed behind closed doors, and released once completed as a FOSS product. I would imagine this would be more likely to happen from a commercial game company that used an existing FOSS game engine and wanted to make money by licensing copyrights (such as artwork and in-game text) and possibly whatever trademarks they have registered. Although I haven’t researched it, I don’t recall this ever happening. It seems these days, companies are more interested in paying for an engine that gives developers and publishers the ability to redistribute the game in a non-free form, and then slapping DRM on top of that.

Looking at games from the angle of a user mainly interested in multi-player, FOSS has a real chance, as many of the limitations I’ve mentioned don’t exist, or exist to a lesser extent. Unfortunately, I’m more of a single-player gamer; I often enjoy the feel of a totally new experience, and I don’t know a lot of other gamers. I also don’t like being restricted to gaming only when my comrades are interested or available, and don’t find it so fun to always play against strangers. Further, LAN parties aren’t easily accessible, with events seemingly becoming rarer and PC equipment becoming increasingly heavy (and I rely on public transportation). Online gameplay also isn’t so fun when your wife is always downloading via bittorrent and the router doesn’t support prioritization.

Due to the above, I’ve come to depend on proprietary software for much of my gaming. Traditionally, this has meant dual-booting my gaming rig with a Windows OS. Great… more proprietary software. As of around mid last year, I’ve been actively looking for ways to minimize my reliance on it.

By late August, I had completely removed my dual-boot configuration in favor of a single GNU/Linux installation, with WINE.

I’ve installed all my games into their own individual WINEPREFIXes (jails, if you will). When it came time to test the games, to my surprise most of the games in my collection actually worked. My success rate was around 55%, and that was with barely trying! With such a large portion of my collection already working, and WINE steadily improving compatibility further, there was simply no need to reboot into Windows anymore.
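For anyone wanting to try the same setup, the per-game prefix scheme is just a directory per game plus the WINEPREFIX environment variable. A sketch – the prefix root, game name and installer path below are hypothetical:

```shell
# One isolated prefix per game, all kept under a common root.
PREFIX_ROOT="$HOME/wineprefixes"
game="rts_game"   # placeholder name

mkdir -p "$PREFIX_ROOT/$game"

# Every wine command for this game then points at its own prefix,
# so games can't interfere with each other's fake C: drives:
#   WINEPREFIX="$PREFIX_ROOT/$game" wine /media/cdrom/setup.exe
#   WINEPREFIX="$PREFIX_ROOT/$game" wine "C:\\Games\\rts_game\\game.exe"
echo "prefix ready: $PREFIX_ROOT/$game"
```

Deleting a game is then just removing its one directory.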

Now, I know a lot of people will be skeptical about such a claim. I expect some readers would point out that having a game run and having it be playable are two completely different things, and I agree. As such, to prove the maturity of the WINE platform and GNU/Linux in general, I’ve been logging the games I’ve actually finished as I complete them! For games that have multiple campaigns (e.g. 4 campaigns for WarCraft II Battle.net Edition), I finished all of them.

PC:
2009-08-29: BlackSite (WINE + mousepatch)
2009-08-30: F.E.A.R. 2 Project Origin (WINE)
2009-09-11: Quake 4 (native GNU/Linux)
2009-09-13: Unreal (WINE)
2009-09-14: Frontlines: Fuel of War (WINE + mousepatch)
2009-12-12: Unreal II: The Awakening (WINE)
2009-12-13: Wolfenstein (WINE)
2009-12-20: Crysis (WINE + regression patch)
2009-12-25: Crysis Warhead (WINE + regression patch)
2009-12-28: Red Faction (WINE 1.1.33)
2010-01-03: Red Alert 3 (WINE)
2010-01-08: Red Faction II (WINE 1.1.35)
2010-01-30: Half-Life (WINE 1.1.37)
2010-02-02: Half-Life 2 (WINE 1.1.37)
2010-02-02: Half-Life 2: Lost Coast (WINE 1.1.37)
2010-02-05: Half-Life 2: Episode 1 (WINE 1.1.37)
2010-02-06: Half-Life 2: Episode 2 (WINE 1.1.37)
2010-03-14: Portal (WINE 1.1.40)
2010-04-04: WarCraft II Battle.net Edition (WINE 1.1.41)

Playstation 3:
2010-01-10: Turok
2010-01-13: Uncharted: Drake’s Fortune
2010-01-26: Army of Two
2010-02-28: inFamous

Xbox 360:
2010-01-18: BioShock
2010-01-26: Halo 3

By comparing against the games I finished on the PS3 and 360, it is clear that I did most of my gaming under WINE.

The excitement doesn’t stop there. Most of these games contained some form of copy protection, and in almost every case WINE was able to satisfy the copy protection software’s requirements – even for those that used DRM. Some of you will remember that I had previously mentioned on my blog that I wouldn’t be able to purchase Red Alert 3 for the PC, because what could be done with it was restricted by DRM. In the case of this game, I believe it allows you to install it a limited number of times, and each time it connects to EA’s servers to reduce your remaining install count. The great thing about WINE is that it mimics an entire Windows system, but with very little disk usage overhead (currently 39MB for WINE 1.1.42 on my x86-64 system). This means that I can install the game, have it reduce my available install count by one, and then back up the entire WINE prefix. When I restore it, the game will still be activated, so it won’t need to reconnect to the servers. It’s the best solution to this kind of DRM system I’ve seen (aside from simply not using it at all).
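The backup trick is nothing more exotic than tarring up the prefix directory immediately after activation. A sketch with placeholder paths – it simulates the activated prefix with a dummy file so the restore step can be demonstrated without the actual game:

```shell
# Placeholder prefix path for a hypothetical "ra3" install.
PREFIX="$HOME/wineprefixes/ra3"

# Simulate an activated prefix (in reality the installer and the
# activation step create this content).
mkdir -p "$PREFIX/drive_c"
echo activated > "$PREFIX/drive_c/license.dat"

# Snapshot the whole prefix right after activation...
tar -czf "$HOME/ra3-activated.tar.gz" -C "$HOME/wineprefixes" ra3

# ...then restoring the snapshot later brings back the activated
# state without spending another install on EA's servers:
rm -rf "$PREFIX"
tar -xzf "$HOME/ra3-activated.tar.gz" -C "$HOME/wineprefixes"
cat "$PREFIX/drive_c/license.dat"
```

Because the prefix is self-contained, the snapshot captures the registry and fake C: drive in one file.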

Unfortunately, just when I thought I had the DRM problem figured out, Ubisoft and EA went and released even worse DRM still! The latest versions require a constant connection to the Internet, both to verify that you’re not running multiple instances of the game simultaneously and to store all of your save games on the server – effectively making the game Software as a Service. I fear there really is no nice solution for these kinds of games, but they have some of the properties of multi-player gaming that will keep me away from them anyway – e.g. if my wife is running bittorrent, I won’t be able to play my single-player game effectively.

Once again, I am reverting to being very cautious about buying PC games, and may find myself returning to take another look through those second-hand bargain bins. 🙂

Terminator

My terminator config:

# ~/.config/terminator/config
## Colors
use_theme_colors = False
enable_real_transparency = True
background_color = "#000000"
foreground_color = "#FFFFFF"
background_darkness = 0.8
background_type = "transparent"
# The following palette value should not be split,
# but was done so to make it fit on this post.
palette = '#2E3436:#CC0000:#B1C1A3:#C4A000:#4F97F3: \
#75507B:#06989A:#D3D7CF:#555753:#EF2929:#8AE234: \
#FCE94F:#729FCF:#AD7FA8:#34E2E2:#EEEEEC'
# Text
font = "Mono 8"
cursor_blink = True
# Misc
scrollback_lines = 100000
force_no_bell = True
titlebars = True
scroll_on_output = False
focus = sloppy
borderless = False
# end of config

Hasta la vista, baby.

flac2aac

Last week, I wanted to quickly convert part of my music collection (which has been entirely and painstakingly ripped to FLAC) into AAC format so I could play it on my 2nd generation iPod Nano (which unfortunately isn’t supported by Rockbox yet). I couldn’t quickly find a tool that would do the job while preserving the track tags, so I made a quick script to do it myself.

Since there is probably a need for something like this, I decided to add a bit of spit and polish to it and release it here under the GPL. I’ve named the script flac2aac, and it can be downloaded here. Hope this benefits somebody.
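The heart of such a script is just a decode/encode pipeline plus tag copying. This is not the actual flac2aac source – just a minimal sketch assuming the flac, metaflac and faac tools are installed; only the filename mapping is executable here, with the pipeline itself shown in comments:

```shell
# Map a FLAC filename to its AAC output name (pure string work,
# so it can be checked without any audio tools present).
aac_name() {
    printf '%s\n' "${1%.flac}.m4a"
}

# For each file, the conversion would then look roughly like:
#   artist=$(metaflac --show-tag=ARTIST "$f" | cut -d= -f2-)
#   title=$(metaflac --show-tag=TITLE  "$f" | cut -d= -f2-)
#   flac -dc "$f" | faac --artist "$artist" --title "$title" \
#       -o "$(aac_name "$f")" -
# i.e. decode FLAC to stdout, encode with faac, carrying the tags across.
aac_name "01 - Example Track.flac"
```

Wrapping that in a `for f in *.flac` loop is essentially all a batch converter needs.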