fglrx on Debian Wheezy (testing)

The fglrx-driver and fglrx-glx packages were removed from Debian a few weeks ago due to #639875, so I’ve been using the free software Radeon drivers in the meantime. While I appreciate having my virtual terminals display the console at my native screen resolution automatically, I don’t like that I’ve had to put playing Amnesia on hold for a while – these drivers cause the game to segfault.

Today I decided to roll back my xorg version to get the fglrx drivers working again, and as it turns out, it really wasn’t that hard. Here’s how I did it.

  1. Set up a preference control file (e.g. /etc/apt/preferences.d/60xorg_rollback_for_fglrx.pref) as follows – note that apt requires a blank line between each pinning stanza:
    Package: xserver-xorg
    Pin: version 1:7.6+8
    Pin-Priority: 1001

    Package: xserver-xorg-core xserver-xorg-dev
    Pin: version 2:1.10.4-1
    Pin-Priority: 1001

    Package: xserver-xorg-input-evdev
    Pin: version 1:2.6.0-2+b1
    Pin-Priority: 1001

    Package: xserver-xorg-input-kbd
    Pin: version 1:1.6.0-3
    Pin-Priority: 1001

    Package: xserver-xorg-input-mouse
    Pin: version 1:1.7.1-1
    Pin-Priority: 1001
  2. Now add the following repositories to your apt sources configuration (e.g. /etc/apt/sources.list.d/60snapshot-20110911T211512Z.list):
    deb http://snapshot.debian.org/archive/debian/20110911T211512Z/ wheezy main contrib non-free
    deb-src http://snapshot.debian.org/archive/debian/20110911T211512Z/ wheezy main contrib non-free

    These include Xorg package versions that don’t have the ABI change which is incompatible with fglrx.

  3. Normally, apt will spit out the following error for a snapshot repository:
    E: Release file for http://snapshot.debian.org/archive/debian/20110911T211512Z/dists/wheezy/InRelease is expired (invalid since 33d 17h 12min 59s). Updates for this repository will not be applied.

    We fix this by adding an apt configuration file (e.g. /etc/apt/apt.conf.d/60ignore_repo_date_check) like so:

    	Acquire::Check-Valid-Until "false";
  4. We should now be able to resynchronize the package index successfully.
    apt-get update
  5. Log out of your X session (if you haven’t already), and (from a virtual terminal) stop gdm/gdm3/lightdm or anything else that might be keeping an Xorg server process running, e.g.:
    /etc/init.d/gdm3 stop
  6. Revert xorg packages to older versions, as defined in our preferences policy.
    apt-get dist-upgrade
  7. Install the fglrx drivers from the snapshot repository.
    apt-get install fglrx-driver fglrx-glx fglrx-control
  8. Make sure Kernel Mode Setting is not enabled. This should (in theory) be handled automatically due to the /etc/modprobe.d/fglrx-driver.conf file created during the fglrx-driver package installation – or at least it seemed to be for me.
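For reference, the generated file typically just blacklists the free driver so it can never enable KMS in the first place – something along these lines (illustrative only; the file the package actually writes may contain more):

```
# /etc/modprobe.d/fglrx-driver.conf (illustrative contents)
# Keep the free radeon driver (and with it kernel mode setting)
# from loading, since it conflicts with fglrx.
blacklist radeon
```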
  9. Create a new xorg.conf file. Assuming Bash:
    mv /etc/X11/xorg.conf{,.$(date +'%Y%m%d%H%M')}
    aticonfig --initial
  10. Reboot, and you should be presented with your display manager’s login screen. If everything went well, glxinfo should report direct rendering like the following:
    $ glxinfo | grep -E '(^direct\ |\ glx\ |^GLX\ |^OpenGL)' | grep -v '\ extensions:$'
    direct rendering: Yes
    server glx vendor string: ATI
    server glx version string: 1.4
    client glx vendor string: ATI
    client glx version string: 1.4
    GLX version: 1.4
    OpenGL vendor string: ATI Technologies Inc.
    OpenGL renderer string: AMD Radeon HD 6900 Series  
    OpenGL version string: 4.1.11005 Compatibility Profile Context
    OpenGL shading language version string: 4.10

Update (2011-10-23, 10:14pm): In case it wasn’t clear, these changes are temporary. That raises two new questions, though: how will I know when I need to revert these changes, and how do I revert? The first question is easy to answer – simply run the following command:
$ [ "$(apt-cache policy fglrx-driver | grep -E '^(\ |\*){5}[0-9]+' | wc -l)" -ge 2 ] && echo "fglrx-driver is back - time to upgrade xorg"

You can optionally put that in a daily cron job to have it e-mail you when it is time.
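The check above wraps easily into a script for /etc/cron.daily – cron e-mails any output to root by default, so no extra mail plumbing is needed. This is only a sketch: the script name and the sample apt-cache policy output are mine, not taken from a real system.

```shell
#!/bin/sh
# Sketch of /etc/cron.daily/check-fglrx (name assumed).
# `apt-cache policy` prints each known version on a line indented five
# characters (the installed one flagged with '***'); two or more such
# lines mean a newer fglrx-driver is available again.

check_policy() {
    # stdin: output of `apt-cache policy fglrx-driver`
    if [ "$(grep -cE '^(\ |\*){5}[0-9]+')" -ge 2 ]; then
        echo "fglrx-driver is back - time to upgrade xorg"
    fi
}

# In the real cron job, pipe the live output instead:
#   apt-cache policy fglrx-driver | check_policy
# Illustrative sample (version numbers made up):
check_policy <<'EOF'
fglrx-driver:
  Installed: 1:11-6-4
  Candidate: 1:11-8-1
  Version table:
     1:11-8-1 0
 *** 1:11-6-4 0
EOF
```

Anything the script prints lands in root’s mailbox (or wherever MAILTO points), which is exactly the notification we want.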

Reverting the above changes is also very easy:
$ rm -f /etc/apt/preferences.d/60xorg_rollback_for_fglrx.pref /etc/apt/sources.list.d/60snapshot-20110911T211512Z.list /etc/apt/apt.conf.d/60ignore_repo_date_check
$ apt-get update && apt-get dist-upgrade
followed by a reboot.

Battlefield: Bad Company 2 – buggiest game ever?

I’m declaring Bad Company 2 one of the worst games I’ve played to date.

I wanted to like this game. Hawkeye of Atomic fame loves it, as do many others, given its following. The queue at the EB Expo just to play a five-minute Battlefield 3 preview was massive, with reports of people waiting in line for over three hours. I’ve had Bad Company 2 sitting in my Steam account for some time, and with no other games currently occupying my time I figured I’d finally give it a shot.

Now one thing you must know about me – I’m first and foremost a PC gamer. Yes, I have all the major consoles. I have more consoles than anyone I know, but they’re purely for compatibility. If a game is made available for a range of platforms, I’ll just about always pick the PC version. Consoles are primarily great for two things – fighting games, and casual players who just want to relax on a couch and do something because their favourite TV show isn’t on. However, dedicated fans of genres such as FPS and RTS know the PC is the only true option. As a consequence, I own only seven Xbox 360 games, for example – all platform exclusives (at the time of purchase, anyway).

Before I start playing any PC game, I always take a couple of minutes to make sure the key mapping, locking and mouse sensitivity settings are all to my liking. For the first few minutes, I often find myself fine-tuning these further. Playing at the EB Expo (where everything seemed to be demoed on a console), I felt like the setup was barely usable – there was absolutely no accuracy, and aiming often took over a second longer than it should have. It should be no surprise to learn, then, that I failed to find a single RTS game on display anywhere at the event. Beyond the hardware manufacturer stands such as Alienware, Razer and Western Digital, the EB Expo unfortunately did not cater for the professional/serious gamer at all.

After spending a few hours playing Bad Company 2, I have come to the conclusion that EA and DICE are also failing to do so. Had I done my research (and not just bothered to trust Atomic and the general popularity of the game, which both indicated the game would be terrific), I probably would have noticed subtle hints that the game might not even be worthy of an average rating. For example, when I wondered why I had never noticed a Bad Company 1 anywhere, I turned to eBay and Wikipedia only to discover that Bad Company 2 is the sequel to a console-only game. Serious PC gamers would never have seen the first instalment, so already I started to feel alienated.

Anyway, after playing the game for a few minutes there were a few issues that became immediately obvious. First and foremost, this game tries to be a realistic shooter but does not support a prone position. You can crouch, but that’s all there is – and it doesn’t feel like crouching takes you anywhere near as low as a person would really crouch if bullets were flying at them.

Secondly, there is no option to look around corners. I’m pretty sure I saw the computer doing that, so why isn’t there an option? I don’t normally care about such a thing in most cases, but if this game really wants me to believe that it’s at least set in a semi-realistic environment, it needs to be supported. Otherwise, I might as well be playing Doom. I suspect this is related to such options being difficult to implement on a console controller… <sigh>.

Alright… Doom’s still fun. But it should be better than Doom because you get to go outside and drive vehicles, right? Hey, that’s Halo! Oh… that’s the series of games I almost didn’t finish because I just became so bored out of my mind towards the end… but perhaps it’s best I try not to draw comparisons.

Bad Company 2 starts you off trying to rescue a Japanese scientist a few days before the bomb gets dropped. This feels to me like a potentially fun mission, but why did the frame rate have to drop to 1 frame per second every time I entered into close combat? I’m running a 2x Radeon HD 6990 CrossFire setup (overclocked) and a 1920×1080 res LCD, so one would think such a system could handle any game thrown at it, right? Apparently the answer is no. Apparently, selecting the “High” graphics profile at 1920×1080 would require consumer graphics hardware that has not yet been released (or possibly even invented). Yes, my drivers are up to date, I’m trying to play on a standard Windows 7 Ultimate install (not WINE for a change), etc. The game must be unbelievably unoptimised for the PC.

Then there’s the issue of people getting stuck when you kill them. Sometimes they get stuck in walls. Sometimes they just stand there perfectly still – giving the illusion you haven’t killed them yet just to keep you wasting your time and ammo. This is both frustrating and lame. I mean, the game has been out for over 18 months as I write this, so you’d think they’d have gotten this shit sorted. With that in mind, go read this Atomic interview with DICE. Choice quote: “The way we look at making the game as good as possible is that the real work only starts once the game’s released.” Seriously, WTF?

But hell, I’m not only a serious gamer, I’m also dedicated. Surely I can look beyond the above flaws and finish the Bad Company 2 campaign? Heck, I finished Kane & Lynch – and that’s saying something! My answer is that I might, but I really can’t say for sure… Bad Company 2 is seriously that bad.

The most recent level I was on (‘Crack the Sky’) involved driving a truck along a road while being shot at by other vehicles. Unfortunately in the fight, I ran slightly off the road onto a lake of ice which had a crack in it and caused the vehicle to be swallowed. Fortunately I was thrown to dry land and the rest of the team managed to escape uninjured. However it soon hit me that this was not a situation the developers had ever considered. I was forced to continue very slowly along the road by foot, but the entire time my allies kept repeating something like “quick, we need to beat them to the satellite control station” or some darn thing – over and over and over. There were other completely out of place comments made along the way as well, as if we were still driving. But there was no truck.

After walking for what seemed like forever (which actually reminded me a lot of Far Cry 2 – one of the other very few games I found too dull to ever finish), we ended up coming to another truck with a manned gun on top. The distance was too far without cover to be able to do it any real damage, and eventually I died.

A few seconds later, I had respawned to a position not far from where the truck had previously fallen into the frozen lake… but without the truck! What was I supposed to do? Spend 10 minutes walking the long road on foot again just so I could probably die in the same way again? No – the game had other plans for me. Instead, it had sent not one, but two armed trucks with manned guns on top to chase us right off the bat. This was in an area where there was absolutely no cover, in a game where there is no prone option. The trucks were too far away for grenades to hit, so it was darn near impossible to survive just a few seconds without getting killed again!

As if that wasn’t bad enough, the game had spawned me right next to an out-of-battle area. In Bad Company 2, if you walk into an out-of-battle area, a countdown timer appears in the middle of the screen and you have just 10 seconds to return or you are automatically killed. The problem was, the trucks firing upon me were driving directly from the out-of-battle area. If I ran close enough to the trucks to make a grenade throw count (while using my squad mates as shields) I figured I might have a chance, but no… the game will make sure I’m killed for running into the very out-of-battle area that the computer keeps killing me from. Unbelievable.

Eventually, I realised that if I used one of my automatic weapons to aim for the gunner on each truck, I could actually take them out – again, if I used a squad mate as a shield to stay alive. I managed to do this a couple of times, all the while being repeatedly told over the speakers to stay out of the open – despite there being absolutely no cover in the area. The trucks would continue rolling for some distance – seemingly they had no driver. Unfortunately, they would always stop just short of leaving the out-of-battle area, but I figured they were close enough that I could just sprint to the truck and drive out in 10 seconds… which brings me to yet another bug – you cannot enter a vehicle in an out-of-battle area. The screen says “Press E to enter vehicle” or some such, but it simply does not respond. Possibly the developers never intended for there to be a vehicle in an out-of-battle area… who the fuck knows. After wasting *way* too much time on this, I gave up and restarted the level.

I don’t think I’ve ever sworn on my blog before, but that’s just how pissed off this game has made me.

Want to know something else funny? Check out this Atomic review of Dead Island. I finished this game without any problems, and claim it is one of the most stable games I have ever played. No crashes, no glitches (except for the occasional enemy hand sticking through a wall) – it was basically bug-free from my perspective, so much so that I was impressed. And so what does Atomic do? Why, they go and bag it for being buggy, of course! I wonder if we live in parallel universes or something. However, given Atomic had been bagging Dead Island for so long prior to its release, and given how they treat Bad Company 2 as though it’s the best game to ever grace the PC, I’m a little skeptical.

I’ve got many issues of Atomic on my shelves. I’ve been to Atomic events, and regularly scan the website for news. For a while, I even used to be a subscriber. But as of late, Atomic has just been giving me *way* too many of these WTF moments. Here’s another example – Atomic says “you can now buy PDFs of each issue on Zinio”, but then scroll to the comments section. Hawkeye states “It is PDF format, but yes, as special Zinio version, so that we don’t have folks just passing around raw PDFs.” Looking at the Zinio FAQ page, it is clear that you do not get a PDF but a proprietary ZNO file which requires a Zinio Reader program to open (and is not compatible with most of my devices, but that’s another story). Even when this issue was pointed out, Hawkeye refused to correct the article so it continues to mislead readers to this day.

Since Atomic is knowingly and wilfully engaging in false reporting where they are the direct beneficiaries of said error, it is reasonable to assume that the same could be said of any article they have ever written. And if Atomic does this, we can be darn sure Game Informer does as well, given their close ties with EB Games – which explains why I have almost never seen them give a bad rating to a game EB Games sells. There is also strong evidence that GameSpot behaves in this manner.

It’s beginning to look more and more like I need to completely ignore what games the masses buy, and what game reviewers say if I want to maintain any level of interest in serious PC gaming. I do not like where things are heading.

What does everyone reading this think? I’m especially interested in what older gamers who are dedicated to the PC platform have to say. Is this just me, or are the heavyweights of the industry such as EA and EB potentially pushing everyone else (outside of free software and indie game companies) to sacrifice professionalism and soul in an attempt to make PC gamers a smaller segment and (eventually) possible to ignore completely?

Update 2011-11-21: By starting the level from the beginning, I was able to continue the single-player story mode. However, there were many more bugs that awaited. Everything from crashes, to boxes falling through characters *during cut-scenes*, to sub-titles being horribly out-of-sync with audio, to enemy characters appearing out of thin air, to enemy characters being impossible to kill only to eventually vanish, to cut-scenes suddenly deciding to start playing at 1 frame per second… OMG what a train-wreck. This is *by far* the most buggy game I have ever played on any platform ever – and I haven’t even started on multiplayer! To be honest, I’m not sure I want to bother.

Battlefield 3 looks like it’ll be the same deal – a good console game maybe, and a crap PC title. As early evidence of this, EB Games isn’t even bothering to sell a Limited Edition PC version.

Farewell Grip, hello Sound Juicer

For probably over a decade, I’ve been using Grip to rip all my audio CDs to MP3, Ogg Vorbis and (for the last few years) FLAC. I admit that it’s not the easiest ripper to use these days, but its flexibility is unparalleled.

Unfortunately, it is no longer maintained, and the last stable release was from 2005. Due to this, many major distributions (including Debian) have removed it from the official repositories… which means compiling. What joy. 🙂

I brought home a new CD today. It might have been possible to buy it online, but I’m increasingly becoming disappointed with digital download audio purchases. Unexpected censoring/editing, poor audio quality, lack of high-quality/complete album artwork, having to rename tracks when purchasing from different online music stores to meet preferred naming convention requirements, inconsistent tag naming conventions and tag types between stores, and a price that’s still very similar to the physical disc version have been steering me away from them. If I happen to be down the street near a local CD store, it’s more convenient to duck in and get the real thing rather than purchasing some downloadable hacked imitation later which will also involve more work in the long run.

As soon as I arrived home, I threw the disc in my laptop’s external USB DVD drive to rip it (because seriously – who uses a CD player to play music on these days?) and up pops Sound Juicer. That’s normally my cue to start compiling Grip, but this time I paused – it had been a long time since I had investigated alternatives, and GStreamer is the future (which is what Sound Juicer apparently uses), so I decided I’d spend a few moments to see if I could get Sound Juicer to rip the disc in the exact way I like it.

My music naming convention is as follows:

  • No white-space or special characters in file names (I replace them with underscores) – there’s no point since getting the names exact is what tagging information is for – and this convention is extremely easy to parse.
  • Track names always sort in album order thanks to the zero-padded track number, so I can always play an album with mplayer *.flac or gst123 *.flac – which is how I generally play my music.
  • Upper-case is uncommon in *NIX file-names, so I keep everything lower-case for consistency.
  • Tags should be in native FLAC format – not id3v2 or something. I only require the band, album, album release year, track name and track number tags to be set, but am not fussy if others are included.
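None of this comes from Sound Juicer itself, but the rules above are mechanical enough to sketch as a pair of shell functions (the function names are my own):

```shell
# Sketch of the naming rules described above: lower-case everything,
# replace anything that isn't a safe filename character with an
# underscore, and zero-pad the track number to two digits.
normalise() {
    printf '%s' "$1" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9._-]/_/g'
}

track_file() {
    # usage: track_file <track-number> <track-title>
    printf '%02d-%s.flac\n' "$1" "$(normalise "$2")"
}

track_file 1 "Numbered Days"    # -> 01-numbered_days.flac
```

The zero-padded prefix is what makes a plain mplayer *.flac play the album in order.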

Okay – so what’s Sound Juicer got? Clicking on Edit -> Preferences provides the following screen:

Hmm… doesn’t look too promising. The drop-down boxes don’t provide the exact configuration I want, but I can get pretty close. If only I could create my own custom settings! I quickly fired up gconf-editor to take a look at the behind-the-scenes configuration options, and found three relevant keys:

  • /apps/sound-juicer/path_pattern
  • /apps/sound-juicer/file_pattern
  • /apps/sound-juicer/strip-special
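If you prefer the command line, the same keys can be set with gconftool-2. The %-tokens below are from memory and may not be exact – each key’s description in gconf-editor lists the authoritative set:

```
$ gconftool-2 --set --type string /apps/sound-juicer/path_pattern "%aA/%aT"
$ gconftool-2 --set --type string /apps/sound-juicer/file_pattern "%tN-%tT"
$ gconftool-2 --set --type bool /apps/sound-juicer/strip-special true
```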

Clicking them provides a complete description of all the available parameters these can take, so I tweaked them as per the following screen shot:

Changes seem to be applied instantly, so I hit the big Sound Juicer Extract button, and lo and behold:

So far so good… but what about tagging information?

~/Music/Flac/killswitch_engage/alive_or_just_breathing$ lltag -S 01-numbered_days.flac
ARTIST=Killswitch Engage
TITLE=Numbered Days
ALBUM=Alive or Just Breathing
ARTISTSORT=Killswitch Engage


Grip has served me well for a long time, and given me so many fond memories of spending hours ripping my entire collection… but now it seems it’s finally time to say goodbye – and hello Sound Juicer.

AMD Fusion E-350 – Debian Wheezy (testing) with Catalyst 11.5 drivers and VA-API support via XvBA

My new Sony Vaio 11.6″ laptop runs an E-350 processor. I’ve upgraded the RAM to 8GB and thrown in an OCZ Vertex 2 SSD. Even using LUKS under LVM2, it feels insanely fast.

Everything I’ve tried works perfectly on it out of the box, including the card reader, HDMI port, Gigabit NIC, USB ports, sound card, wireless, Bluetooth, volume/brightness/video output function keys, etc. Well, almost everything… graphics is one glaring exception.

For instance, when I go to log out or drop to a virtual terminal, I get all kinds of screen corruption and can’t see a thing. I always need to hit CTRL+ALT+F2 to switch to a virtual terminal, then CTRL+ALT+DEL to restart the machine and get the picture back. Fortunately, the SSD means that’s under a 30-second wait to shut down and boot up again into the login screen, but after owning this machine for just over two weeks now, it was starting to get old.

Further, 3D acceleration via Mesa3D was painful. Games like the Penumbra Collection ran at about 1 FPS. It looks like the Phoronix guys have tested newer versions with Gallium3D which look slightly more pleasing; however, none of that’s in Debian, and I don’t want to waste lots of time recompiling everything and potentially causing new updates to break. Further, while I could watch 720p video reasonably well, 1080p video would occasionally cause slowdowns and prevent the viewing experience from being completely enjoyable.

Time to do something I’d rather not… and install the proprietary driver. 11.3 is currently in the testing repos; however, it still isn’t efficient enough to allow 1080p video to play properly. The new 11.5 Catalyst driver downloaded from AMD also fails to install properly, due to a DKMS compilation error.

Fortunately, Mindwerks has instructions on how to fix this issue under Ubuntu 11.04. With a few little tweaks they can be made to work with Debian too.


Debian Wheezy (7.0) users can finally get fglrx playing nicely together with X.Org X Server 1.9.5. We can also make the latest driver work well with the 2.6.39 kernel.

The Mindwerks custom build procedure follows, adapted for Debian Wheezy (7.0) users. Note that the commands are for 64-bit, but the only change 32-bit users likely need to make is to download an i386 package instead for step 8. Also, the following commands assume you use the sudo package to gain root.

  1. Build and install the latest 2.6.39 release candidate from kernel.org, e.g.:
    $ wget http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.39-rc7.tar.bz2
    $ wget http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.39-rc7.tar.bz2.sign
    $ gpg --verify linux-2.6.39-rc7.tar.bz2.sign linux-2.6.39-rc7.tar.bz2
    $ tar jxf linux-2.6.39-rc7.tar.bz2
    $ cd linux-2.6.39-rc7
    $ make-kpkg -j2 --rootcmd fakeroot --initrd --bzimage kernel_headers kernel_image
    $ sudo dpkg -i ../*.deb

    and boot it.

  2. Download the AMD Catalyst 11.5 64-bit driver installer.
  3. Install the Debian VA-API packages. Note that some say to get them from the Splitted Desktop website; however, I tried them, didn’t notice any benefit in doing so, and reverted to the following:
    $ sudo apt-get install libva1 libva-x11-1 libva-tpi1 libva-glx1 libva-dev
  4. Extract the files from the package:
    $ sh ./ati-driver-installer-11-5-x86.x86_64.run --extract ati
  5. For 2.6.39 support, download this additional patch: 2.6.39_bkl.patch
    $ wget http://www.mindwerks.net/wp-content/uploads/2011/03/2.6.39_bkl.patch
  6. Check for Big Kernel Lock usage:
    $ grep -c CONFIG_BKL=y /lib/modules/$(uname -r)/build/.config

    If the result of this command is 0, then download no_bkl.patch as well. For stock kernels you should get 0 and will need the patch – which is probably the main reason you are here. 🙂

    $ wget http://www.mindwerks.net/wp-content/uploads/2011/03/no_bkl.patch
  7. Then apply them:
    $ cd ati; for i in ../*.patch; do patch -p1 < $i; done
  8. Build your new ati/fglrx driver:
    $ sudo ./ati-installer.sh 8.85.6 --install
  9. Since we’re using the proprietary drivers, we might as well make the most of them. Grab the latest XvBA backend library for VA-API:
    $ wget http://www.splitted-desktop.com/~gbeauchesne/xvba-video/xvba-video_0.7.8-1_amd64.deb
    $ sudo dpkg -i xvba-video_0.7.8-1_amd64.deb
    $ cd /usr/lib/dri && sudo ln -s ../va/drivers/xvba_drv_video.so fglrx_drv_video.so
  10. If your /etc/X11/xorg.conf is missing you will need to run:
    $ sudo aticonfig --initial

    and then reboot.

That newly created package should work for the entire 2.6.39 series.

These steps are really useful for AMD Fusion users at the moment. Without VA-API, multi-threaded mplayer will occupy 100% of available CPU (both cores) and drop frames when playing a test 1080p MKV file I have (containing H.264+AAC+ASS). It’s not unwatchable, but it’s annoying. With VA-API-patched mplayer, CPU usage never hits 15% when playing the same test video!
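For completeness, since I didn’t quote the exact command: the Splitted Desktop patches add vaapi video-output and decoder options to mplayer, so playback looks something like this (the file name is a placeholder; check mplayer -vo help on your build to confirm the flags):

```
$ mplayer -va vaapi -vo vaapi test-1080p.mkv
```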

It’s a pity that the Catalyst driver doesn’t currently generate correct deb packages for Debian, but the fglrx packages are not listed as a dependency for xvba-video, so it’s not a major problem.

Update (2011-05-17, 12:41am): Using the new kernel is not necessary. Although there appear to be a few AMD Fusion fixes in there, I haven’t noticed any benefit in using 2.6.39-rc7 – possibly because I’m using the proprietary drivers. In fact, I’m not completely convinced the Catalyst 11.5 drivers are required either as I was having problems getting mplayer to use VA-API when I was testing 11.3. I’d be interested to know if just steps 3, 9 and 10 would be sufficient for hardware decoding using the patched mplayer and the default fglrx-atieventsd fglrx-control fglrx-driver fglrx-glx fglrx-glx-ia32 fglrx-modules-dkms package versions. I might very well have done much more than required.

Update (2011-05-22, 9:04pm): You do not need Catalyst version 11.5 – you can use the version in the repositories. In fact, I recommend it. The reason (aside from having the drivers managed via apt) is that compiling and installing 11.5 via the instructions above won’t enable 32-bit compatibility. I tried running a few games under WINE yesterday, but WINE always complained that OpenGL (and hence DirectX) was not available to it. Presumably I needed whatever the fglrx-glx-ia32 package included. Possibly I was just missing something as simple as a symlink before, but I didn’t investigate – I’d rather use the standard Debian packages wherever possible if they do the job. Also, you don’t need to manually fetch the Splitted Desktop xvba-video package – xvba-va-driver is basically the same thing, only A) xvba-va-driver is in the official Debian repositories and B) it specifies the fglrx packages as a required dependency.

So in summary:

  1. Don’t touch your kernel.
  2. $ sudo apt-get install libva1 libva-x11-1 libva-tpi1 libva-glx1 libva-dev xvba-va-driver fglrx-atieventsd fglrx-control fglrx-driver fglrx-glx fglrx-glx-ia32 fglrx-modules-dkms
  3. $ sudo aticonfig --initial
  4. Reboot
  5. Use the VA-API-patched mplayer for watching movies.

It’s easy when you know how. 🙂

Update (2011-07-03, 12:11am): If you run gnome-mplayer (perhaps so you can make use of gecko-mediaplayer for libva-accelerated video in Firefox), be aware that Debian currently ships no later than – even in sid. Thus, you will encounter a problem with a filter gnome-mplayer automatically uses which is incompatible with vaapi – and will automatically disable it. I can verify that 1.0.4 avoids this issue, and allows gecko-mediaplayer to work wonders when combined with the ViewTube and YouTube HTML5 forever Greasemonkey scripts.

It’s not you, it’s me.

I’ve used a number of GNU/Linux distributions over the years, some more than others. This weekend, I’ve apparently been overcome by nostalgia and revisited a distribution I haven’t used in years.

Early beginnings

When I first started getting serious about leaving Windows in the late ’90s, I used RedHat 5.2 as my primary OS. Release after release went by, and I became convinced it was continuing to lose its charm. There was a limited number of packages available, and I regularly had to hit rpmfind.net to get things I wanted installed. First, RedHat 7 shipped with a broken compiler. Next, RedHat 8 shipped with Bluecurve, which seemed like a crippled mash of GNOME and KDE, so getting certain things to compile either way could be quite difficult. Given the small repositories, compilation was frequently necessary. There had to be a better way…

I occasionally glanced over Debian’s way, but at the time it was too archaic. I could not (for example) understand how anything with such a primitive installer could be any good. Why would one ever want to manually specify kernel modules to install? Why not at least have an option to install the lot and just load what’s required, like other distributions did? Then once you got the thing installed and working, packages would be quite old. You could switch to testing, but then you don’t have a stable distribution any more. The whole distribution felt very server/CLI-orientated, and at the time (having not so long ago come from a Win9x environment) it was not easy to appreciate. It also lacked a community with a low barrier to entry – it offered IRC and mailing lists instead of forums.

Enter Gentoo. It seemed to always make the headlines on Slashdot, so I had to see what all the fuss was about. Certainly it was no snap to install, but the attention to detail was incredible. The boot-splash graphics (without hiding the important start-up messages, as other distros later tried to do) and automatic detection of sound and network cards were a sight for sore eyes. The LiveCD alone was my go-to CD for system rescues for many years due to its huge hardware support, ease of use, high-resolution console and short boot time (as it didn’t come with X). Further, everything was quite up-to-date – certainly more so than basically anything else at the time. Later I came to realise the beauty of rolling releases and community.

So I compiled, and compiled, and compiled some more. My AMD Duron 900MHz with 512MB of RAM and 60GB IDE HDDs seemed pretty sweet at the time, but compiling X or GNOME would still basically require me to leave the machine on overnight. It didn’t help that I only had a 56k dial-up connection. Sometimes packages would not compile successfully, so I would have to fix them and resume compilation the next morning (although the fix was only ever a Google search away). With the amount of time I invested into learning Gentoo GNU/Linux instead of studying my university courses, it’s amazing I managed to get my degree. I even went so far as to install Gentoo on my Pentium 166MMX laptop with 80MB of RAM (using distcc and ccache).

Later, during my university work-experience year, I used Gentoo on the first production web server I ever built. Another “senior” system administrator came along at one point to fix an issue it had with qmail, and he was mighty upset to see Gentoo. Not because it was a rolling release (and I’m pretty sure Hardened Gentoo didn’t exist at the time either), but because he didn’t know the rc-update command (he expected RedHat’s chkconfig) and was only familiar with System V and BSD init scripts. I think he would have had a heart attack if he later saw Ubuntu’s Upstart system (which has also been quite inconsistent across releases over the years)!

During my final university year, I used Gentoo again for a team-based development project. For years it seemed so awesome that I could hardly bear to look at another distro. The forums were always active, and the flexibility the ports-based system provided was unmatched by any other GNU/Linux distribution. However it was at this time Ubuntu started making Slashdot headlines – perhaps more so than Gentoo. Whilst I was mildly curious and read the articles, I didn’t immediately bother to try Ubuntu myself. After all, it was just a new Debian fork, and I knew about Debian. The next year I attended the 2005 Linux.conf.au in Canberra, still with my trusty P166 laptop running Gentoo I might add. After arriving at the hallway where everyone was browsing the web on their personal laptops with the free wifi, it was immediately obvious I was the odd one out. Not just because my computer was ancient by comparison, but because I was just about the only person that *wasn’t* running Ubuntu. I think I literally saw just one other laptop not running Ubuntu – I simply could not believe it. I had to see what the fuss was about.

A few days after the conference, I partitioned my hard disk and installed Ubuntu Warty. Hoary had just been released, but I already had a Warty install CD and I was still limited to dial-up – broadband plans required a one-year contract and installation fees, and I was frequently moving around. Unlike Debian, the Warty installation was quick and simple. The community was quite large, and while it seemed smaller than Gentoo it was growing rapidly. Because user-friendly documentation actually existed, I felt more comfortable using the package management system. I didn’t need to use the horrible dselect package tool either. The packages were quite up-to-date; perhaps not on the same level as Gentoo, but close enough that any differences didn’t matter. Ultimately I was obtaining all the notable benefits Gentoo had offered, without the downside of huge downloads and slow compilations. By the end of the year I wasn’t using Gentoo on my personal machines any longer.


Seemingly inspired by Ubuntu, Debian has over the years since made significant improvements – shorter release cycles, more beginner-friendly and user-friendly support pages and wikis (although no forums so far), a much easier and more efficient installer, etc. Additionally, because of my familiarity with the Debian-based Ubuntu, I later found myself far more comfortable using Debian. I might even go so far as to say I enjoy Debian. Indeed, my home server is a Debian box running Xen, whereas Ubuntu doesn’t even support Xen in Dom0. Further, unlike Ubuntu, Debian hasn’t changed the init script system every six months. Each release does things the way you would expect, whereas Ubuntu frequently “rocks the boat”, requiring more time going through documentation trying to figure out how new releases work. Being a free software supporter, I also don’t appreciate Ubuntu recommending me to install the proprietary Flash player in my Firefox install, or offering to automatically install nVidia proprietary drivers. If I need to use proprietary software, I want it to be a very conscious decision so I know exactly which areas of my OS are tainted. In some ways, Debian is stealing back some of the thunder it lost to Ubuntu – at least in my book.

Life after Ubuntu

The installation I have been using on my personal desktop was an Ubuntu install, upgraded with each release from 7.10 (Gutsy) to the current 10.10 (Maverick), occasionally installing 3rd party PPAs and what not over the years. Whilst I had upgraded my file-systems to use ext4, I knew I would need to reformat them from scratch to get the most performance benefits from them. There was probably a ton of bloat on the system (unused packages, hidden configuration files and directories under $HOME, etc.) and many configuration files probably differed largely from the improved defaults one would get on a fresh modern install. As this weekend was a day longer than usual due to Australia Day, ultimately I decided it was a good time to reinstall.

What to install however? Ubuntu again? I was seriously getting tired of it. It just didn’t feel interesting any more, although it did the job. Additionally, Ubuntu doesn’t strip the kernel of binary blobs like other distributions do (including Debian starting with Squeeze) – another cross in my book. No, it was time for a change. Debian might have been the obvious choice – particularly since Squeeze was released recently, making it fresh and new. However I was already somewhat familiar with Squeeze (I installed it on my Asus EeePC 701 netbook last week) and it still felt a bit dated in some areas compared to Ubuntu. I’m also conscious that the next version probably won’t be released as stable for at least another year or two. Further, I don’t appreciate how Debian considers lots of the GNU *-doc packages non-free. I want those installed, but I hate the thought of enabling non-free repositories just to get them. Perhaps I could find something completely free instead?

Completely free GNU/Linux options

With that in mind, I did a little bit of investigation into completely free GNU/Linux distributions (as per the FSF guidelines). Let’s look at the list:


BLAG

Based on Fedora. I don’t like Fedora, but I might be willing to look past that aspect. I do however need a distribution that is kept up to date. According to Wikipedia, “the latest stable release, BLAG90001, is based on Fedora 9, and was released 21 July 2008.” In all likelihood, this would suggest that the stable release hasn’t seen security updates in a long time (Fedora 9 was dropped from support some time ago). The BLAG website does however list a 140k RC1 version which is based on Fedora 14 (it’s not clear when this was posted), but other parts of the website such as the Features section still reference Fedora 9 as the base distribution. It would seem the latest BLAG RC version has only made an appearance after well over 2 years, and it’s not even marked as stable.

Further, I’m a little skeptical of installing a distribution that is basically the same as another distribution with bits stripped out to make it completely free – there is always the chance that something was missed, or a certain uncommonly-installed package will malfunction with part of a dependency removed. On the face of it, Blag fills me with doubts.


Dragora

Something new, and not based on any other distribution. Sounds intriguing. Just the coolness factor of using something unlike anything I’ve ever used before gives this bonus points, if only for the fact that the guys doing this must be serious given how much work is involved (eg. new init system, new package management system, installer, documentation, etc.). The Wikipedia page is light on details and the project website redirects to a broken wiki with the text “Be patient: we are working in the content of the new site. – Thanks”. There is a link to the old wiki at the bottom of the page, but it redirects to a Spanish page by default. Fortunately there is an English version there you can select, and it looks mostly as complete as the Spanish version.

It’s nice to see this project has its own artwork and a few translations. There is also a download wiki page which indicates that there is a regular release cycle. Although the wiki doesn’t indicate which architectures are supported, at least two of the mirrors provide x86_64 downloads. SHA1 checksums and GPG signatures of the images are accounted for – always a good sign to see integrity considered. As an aside, apparently Arch GNU/Linux doesn’t verify package signatures downloaded via its package management system which is why I would never consider using it.


Dynebolic

From the FSF’s description, Dynebolic focuses on audio and video editing. I actually do intend to purchase a video camera at some point this year so I can start uploading videos, but for now I have no need for such tools.


gNewSense

The distribution Richard Stallman apparently uses as of late. I suspect that since Stallman now owns a MIPS-based netbook, he found gNewSense was able to support the configuration better than Ututo (which he used previously). Unfortunately I use an Intel i7 with 6GB of RAM and a GTX480 graphics card with 1.5GB of RAM – I need a 64-bit distribution to address all that memory, and gNewSense doesn’t support the x86_64 instruction set. The last stable release (v2.3) was 17 months ago too, according to Wikipedia at the time of writing.


Musix

Based on Knoppix. Isn’t Knoppix a blend of packages from various distributions and distribution versions? Last I checked (admittedly many years ago) it didn’t look like something easy to maintain. Musix also apparently has an emphasis on audio production, which would explain the name. As mentioned previously, I don’t have a requirement for such tools.


Trisquel

I’ve actually used this at work for Xen Dom0 and DomU servers (although it’s surely not the best distribution for Dom0 – I had to use my own kernel and Xen install), and it works quite well. Basically, the latest version is just a re-badged Ubuntu 10.04 LTS with all the proprietary bits stripped out. Unfortunately, in many ways it’s a step back from what I was already using for my desktop – older packages than Ubuntu 10.10, a far smaller community and much less support/resources (although 90+% of whatever works for Ubuntu 10.04 will work for the latest Trisquel, 4.0.1 at the time of writing). There is basically no compelling reason for me to use Trisquel on my personal desktop as my primary OS at this time from what I can see – I would probably be better off installing Debian Squeeze with non-free repositories disabled.


Ututo

Based on Gentoo. Sounds good, right? However upon further investigation, it’s clear that the documentation is entirely in Spanish! This may not be a problem for Stallman, but I don’t speak (or read) Spanish.


Venenux

Apparently this distribution is built around the KDE desktop. I personally don’t like KDE and don’t run it, so it doesn’t bode well so far. Heading on over to the Venenux website I once again fail to find any English, so scratch Venenux off the list.

So that’s it – the complete list of FSF-endorsed GNU/Linux distributions. Of all of the above, there is only one worthy of further investigation for my purposes – Dragora.

Dragora on trial

I downloaded, verified and burnt the 2.1 stable x86_64 release, booted it, and attempted an installation.

As is the case with the website, during installation I faced quite a bit of broken English – another potential sign that the project doesn’t have much of a community around it. However I wasn’t about to let that get to me. Perhaps it did have a large community, but simply not a large native-English-speaking community? Besides, English corrections aren’t a problem – I can take care of that by submitting patches at any time should upstream accept them.

The next thing I noticed was that my keyboard wasn’t working. I unplugged it and plugged it back in again. Pressed caps-lock a few times but didn’t see the caps-lock LED indicator appear. Clearly, the kernel wasn’t compiled with USB support (or the required module wasn’t loaded). Luckily, I managed to find a spare PS/2 keyboard which worked.

Like Gentoo, the first step involved manually partitioning my hard drives. I like to use LVM on RAID, and fortunately all the tools were provided on the CD to do that. It wasn’t clear if I needed to format the partitions as well, so I formatted over 2TB of data anyway. Unfortunately during the setup program stage, the tool formatted my partitions again despite my checking the box telling it not to, which wasted a lot of time.
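For the record, an LVM-on-RAID layout like mine can be built with the standard mdadm and LVM2 tools along these lines – a rough sketch only, with illustrative device names and sizes rather than my exact setup:

```
# /boot on a plain RAID1 mirror; a 4-disk RAID5 array to hold LVM
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1
mdadm --create /dev/md1 --level=5 --raid-devices=4 /dev/sda2 /dev/sdb2 /dev/sdc2 /dev/sdd2

# LVM2 on top of the RAID5 array
pvcreate /dev/md1
vgcreate vg0 /dev/md1
lvcreate -L 20G -n root vg0
lvcreate -L 4G -n swap vg0

# File-systems: /boot directly on the RAID1 device, root on LVM
mkfs.ext2 /dev/md0
mkfs.ext4 /dev/vg0/root
mkswap /dev/vg0/swap
```

The catch, as described below, is that the installed kernel then has to be able to assemble the array and activate LVM before it can mount the root file-system.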

While waiting, I looked over some of the install scripts and noted that they were written in bash. All comments looked to be in Spanish, even though all installer prompts were in English. I found this both strange and frustrating. I also elected to install everything, as that was the recommended option. Lastly, I attempted to install Grub… and things didn’t go so well from that point onwards. As far as I could tell, the kernel was not able to support an MDADM+LVM2 root file-system (/boot was on a plain MDADM RAID1 setup, which I know normally works). It looked like I would either need to figure out why the kernel wasn’t working, or reformat everything and not use my 4-disk RAID5 array to host a LVM2 layout containing the root file-system. At this point my patience had truly run out and I decided it was time to try something else. I also never managed to figure out how the package management system would work, as that was one section missing from the wiki documentation.

An old friend

So what did I end up with? Gentoo! Not exactly a completely free-software distribution… in fact I was shocked by just how much proprietary software Gentoo supports installing. The kernel won’t have all the non-free blobs removed either, however I suspect I don’t need to compile any modules that would load them. Gentoo is also feeling very nostalgic to me right now.

Since the last time I used it, there have been some notable changes and improvements. For example, stage1 and stage2 installations are no longer supported. Previously, I only ever used stage1 installs so this felt a bit like cheating. Additionally when compiling multiple programs at once, package notes are now displayed again at the end of compilation (regardless of success or failure of the last package) so you actually have a chance to read them. Perhaps the most important change however is that now source file downloads occur in parallel with compilation in the background (which can be easily seen by tailing the logs during compilation) – this saves a lot of time over the old download, compile, repeat process from years ago.

Not all USE flags are listed in the documentation yet, and I have run into some circular dependency issues by using the doc USE flag (which is apparently common according to my web searches). I’ve had to add 6 lines to the package.accept_keywords and package.use files to get everything going. However, now I’m all set up. I’ve compiled a complete GNOME desktop with the Gimp, LibreOffice and many others. Unlike the days of my Duron which had to be left on overnight, my overclocked i7 950 rips through the builds – sometimes as quickly as they can be downloaded from a local mirror over my ADSL2+ connection.
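For anyone unfamiliar with these files, they live under /etc/portage/ and contain one package atom per line. The entries below are illustrative examples only, not the exact six lines I added:

```
# /etc/portage/package.accept_keywords
# Accept the testing (~amd64) keyword for one specific package
app-office/libreoffice ~amd64

# /etc/portage/package.use
# Break a doc-flag circular dependency by disabling it for one package
dev-libs/glib -doc
```

Keeping these overrides per-package, rather than flipping global USE flags in make.conf, limits how much of the tree gets rebuilt when you change your mind later.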

Although the Gentoo website layout has largely been unchanged over the years, and some of the documentation is ever-so-slightly out of date, I get the sense that Gentoo is still a serious GNU/Linux distribution. I didn’t encounter any compilation issues that couldn’t be quickly worked around and it feels very fast. I still use Ubuntu 10.10 at work and Debian Squeeze on my netbook and home server, but going back to ebuilds still feels very strange and somewhat exciting. If only Gentoo focused more on freedom it would be darn near perfect.

Birth of the FreedomBox Foundation

Eben Moglen’s FreedomBox idea has caught my attention ever since his Silver lining in the cloud speech in August last year. Unfortunately I hadn’t noticed any visible progress on the project – until today. Looks like things have indeed been going on behind the scenes, as Mr Moglen has created the FreedomBox Foundation.

This inspired me to watch another of Moglen’s talks – Freedom in the Cloud (transcript here) – an older video that inspired the Diaspora project. Whilst it didn’t shed any more light on the subject (it was slightly more vague about how a FreedomBox device would function), Moglen was certainly right that people have been all too happy to sacrifice their privacy for convenience.

This blog runs on my personal home server. If the government wants to know what information I have on it or who has been accessing it, they can get a search warrant. They would have to physically enter my home and take my computer away to get it. The logs are all stored here – not on Facebook, Twitter or anywhere else. Nobody cares more about my data than me, and the government or anyone else who wants my data will have to go through me. That’s privacy.

My wife also has the option of using the home server for hosting her blog – but she refuses. Instead, she decided to post all her blogs and photos on Yahoo Blogs.

When I asked why, she told me that she wanted to know who was visiting her website and asked if I could tell who visited my website.

“Sure I can… kinda. I can give IP addresses. I can look up what countries those IP addresses have been allocated to. Alternatively, I could potentially see people’s user-names who visited my website if somebody logged in – required if somebody wants to post something.”

My wife was not impressed. “I want to see a list of user-names for everyone,” she claimed. “Simple,” I replied – “only allow people to view content when they log in.” In theory they shouldn’t have any privacy concerns, since they obviously already need to be logged in to visit your site at Yahoo.

“Ah – that won’t work. They are already logged in when they visit other blogs. Nobody will create a separate login just for one blog – people are too lazy and nobody will visit.”

And there you have it. Seemingly, many people who use Yahoo Blogs (and presumably Facebook) feel the same way. I personally don’t care who visits my website and don’t see why I should care. If somebody wants me to know they visited, they can drop me an e-mail or post a comment.
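As an aside, pulling that visitor list out of a web server’s logs really is only a few lines of code; a minimal sketch, assuming Apache-style combined log lines (the sample entries below are made up):

```python
import re
from collections import Counter

def visitor_ips(log_lines):
    """Count visits per IP address from Apache-style access log lines."""
    ip_re = re.compile(r"^(\d{1,3}(?:\.\d{1,3}){3})\s")
    counts = Counter()
    for line in log_lines:
        match = ip_re.match(line)
        if match:
            counts[match.group(1)] += 1
    return counts

# Illustrative log lines in Apache "combined" format
sample = [
    '203.0.113.7 - - [27/Jan/2011:10:00:00 +1100] "GET / HTTP/1.1" 200 1234',
    '203.0.113.7 - - [27/Jan/2011:10:00:05 +1100] "GET /feed HTTP/1.1" 200 99',
    '198.51.100.2 - alice [27/Jan/2011:10:01:00 +1100] "GET / HTTP/1.1" 200 512',
]
print(visitor_ips(sample).most_common())
```

From there, mapping the addresses to countries is just a matter of feeding them to a GeoIP lookup.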

OpenID would solve part of the problem my wife describes – it would reduce the burden of creating a new account, but won’t eliminate additional steps. It also requires the reader to already have an OpenID account to see any benefit, and it’s just not popular enough. I just spent a few minutes clicking through my bookmarks, and I could only find one website with OpenID support – SourceForge – and even then they only support a limited number of OpenID providers.

Will the FreedomBox project fix my wife’s use-case scenario? Most probably. One of the primary goals is “safe social networking, in which, without losing touch with any of your friends, you replace Facebook, Flickr, Twitter and other centralized services with privacy-respecting federated services”. Presumably Yahoo Blogs is popular enough that it would be included in that list.

How would the transition work though? If my wife had a FreedomBox, she would presumably be able to navigate a web interface to have it suck down the Yahoo Blogs data and host it locally. Next, her Yahoo page would add a link to her FreedomBox URL. When people visit, they would either be granted or denied access based on whether she had previously granted a particular user access. If denied, there would be an option to request said access.

However, say my wife decided to use a FreedomBox prior to all her Yahoo friends having one – how could she be sure that person X really is Yahoo Blogs person X before granting X access? That’s where things get tricky, and is the part of the picture I’m not too clear on.

The only thing I could imagine working would be for person X to have an account on a third-party website that can speak whatever protocol the FreedomBox uses. Obviously this means another account, but as with Yahoo Blogs, the one account sign-in would grant access to all the FreedomBox blogs. Further, like OpenID providers, the third-party website in question could be hosted anywhere. Perhaps OpenID providers themselves will even provide this functionality, thereby eliminating the sign-up process for those who already have an OpenID account.

I imagine it’s going to be a hard battle, but if it picks up it has the potential to be unstoppable.

Ultimate Free Software Web Browsing Experience

I want the web to work the way it was intended, using only 100% free software. Is that so much to ask? Apparently so – and almost exclusively due to Flash.

Flash. I concluded long ago that it’s impossible to have a good web browsing experience with or without it, so you might as well protect your freedom and go without it. As a GNU/Linux user, it presents so many problems. Even if using the proprietary player were acceptable, it is plagued by bugs such as the famous “Linux Flash plugin is unresponsive to mouse clicks” issue that Adobe doesn’t even seem to acknowledge. There are various workarounds, but that’s not the point. Then there’s the issue of 64-bit GNU/Linux distributions typically bundling 64-bit versions of the Firefox web browser. Too bad Adobe dropped support for the 64-bit Flash plugin while it was still in beta, leaving users with an unsupported version with known security vulnerabilities. [Update: Seems Adobe changed their minds again! Most people still hellbent on using Flash have already had to switch away from the unsupported beta by now, so what is the point?]

Want to know what really gets on my nerves? 99% of Flash content that I am actually interested in is just video. The stuff that Firefox and Chrome have included native support for, via Xiph.Org‘s Ogg Theora/Vorbis formats, for over a year now. Heck, even Opera finally included support for these free video formats early this year. Those three browsers alone account for over 40% of web browser usage worldwide. Of course, Microsoft doesn’t want anything to do with free software, and Apple generally tries to pretend it doesn’t exist wherever convenient. Since the majority of web browser usage does not include native video support but does include the Flash plugin, for a lot of websites Flash is the easy fix. This of course forced more people to use Flash, which caused more websites to use it, which caused more people to use Flash… you get the idea. Even though Flash has been responsible for a huge number of system compromises, people feel forced to use it anyway.

W3C recognized the need for an open standard so that anyone could play video on the web regardless of Adobe’s approval. When HTML 5 was being drafted, VP8 was proposed as the video codec. Why VP8, when three of the five major browsers had native Theora support already? The answer to that was, of course, video quality. Everyone was whining that Theora wasn’t as high in picture quality as H.264, and everyone wanted video quality to be as nice as possible. With H.264 being unusable (encumbered with patents), Google generously purchased On2 Technologies, creators of the wonderful VP8 codec, and released the codec as a truly free format. As the highest quality open-spec free-of-patents codec which anyone can use, this paved the way for W3C to give it serious consideration.

Unsurprisingly, Microsoft made it clear that they would not support a free format. Period. Microsoft doesn’t need to provide a reason – given their normal attitude towards open standards or anything that would benefit free software or competition of any kind, rejecting the proposal was a given. Historically Microsoft deliberately doesn’t follow standards (eg. IE6, MS Office… anything really), so having the suits at Microsoft disagreeing with it was completely expected. Still if everyone else supported the standard, and with IE’s popularity continuing to fall, this might be enough to either force Microsoft’s hand, or make IE basically irrelevant – eventually.

There’s one other (somewhat) common browser – Safari. Apple’s response to VP8? Utter BS – patent FUD, and bogus hardware claims. Apparently a lot of Apple hardware has dedicated H.264 decoder chips (presumably iPods, iPhones and such), which Apple seems to suggest can only be used for H.264. I don’t believe it. Considering how similar H.264 and VP8 actually are, you’d think a company like Apple would be able to make it work. Anyway, Apple comes out with new iWhatevers every year, and Apple provides basically no support for older devices. Last I checked (a few months back – don’t have a link), there were various vulnerabilities in the 1st generation iPod Touch models which Apple has no intention of fixing. It was only superseded by the 2nd generation just on 2 years ago. That’s right – if you bought your iPod Touch around 2 years ago, Apple wouldn’t care about you today. Due to this forced rapid upgrade cycle, it should be no problem at all for Apple to get hardware support into all its devices relatively quickly – after all, we’re talking about the company that got Intel to redesign a smaller CPU to fit the MacBook Air. If Apple can boss Intel around to get chips working the way it wants, it likely can with any other hardware company.

As for the patent FUD, Apple claims that VP8 hasn’t been proven in court not to infringe on patents. Steve Jobs famously claims that all video codecs are covered by patents. If this actually were true – that it was impossible to create a video codec without stepping on a patent – the patents in question would surely have to be invalidated by being obvious or demonstrating prior art. Either way, Apple’s talking trash. The real reason for rejecting VP8 is surely the same as Microsoft’s – so they can keep themselves from being on a level playing field with their most direct browser competitors. Mozilla, Google and Opera won’t pay for MPEG-LA patent licenses on a per-user basis since the browsers can be copied and distributed to anyone without charge – and there would be no way to track the licenses anyway. Even if (for example) the Mozilla Foundation did find a way to overcome these obstacles, what of projects that fork Mozilla? Mozilla is free software. If all derivatives weren’t covered, Firefox wouldn’t be free anymore. If they were covered, no project would ever have to pay the MPEG-LA again since they could just opt to borrow the Mozilla code – it would be a licensing deal that the MPEG-LA would never agree to. Clearly, the future of video on the web cannot possibly depend on paying patent licenses.

So where does this leave us? I predict that if HTML5 does not specify a format to use for the video tag, we’ll continue to see Flash dominate as the preferred video delivery option for website owners for many years into the future. Couldn’t we just dump Flash already and have the Microsoft fanboys install the Xiph.Org Directshow Filters package (which apparently comes with support for IE’s <video> tag)? That could work in a lot of cases, however if it really took off you could be sure that Microsoft would find a way to “accidentally” break the plugin somehow. It wouldn’t be the first time. I recall Microsoft IE 5.5 beta (if I’m remembering my version numbers correctly) would prevent executable files named hotdog.exe from working. This file name was used by the installation program for Sausage Software’s HotDog HTML editor – direct competition to Microsoft FrontPage back in the day. Rename the file to setup.exe and you were in business – not easy to do when the file came on a CD. Microsoft could potentially just argue that the incompatibility was only in its beta software, but web developers would likely have installed it.

Getting back on track… <cough>… if the future of web video is in Flash, what can we do about it? How can we play our video using 100% free software? We’re not out of options yet. Adobe has announced that upcoming versions of Flash will support VP8! How does that help us? If webmasters want to reach as close to 100% of their audience as possible right now, H.264 is the best option. As much as I hate it, H.264 can be played back via Flash on 90+% of desktops. Encoding in a second format to reach users that don’t have Flash installed might not be cost effective when time and storage costs are considered. However when Flash supports VP8, everyone can adopt that format and not need to worry about encoding in H.264 as well. People without Flash but using Firefox, Chrome or Opera can gracefully fall back to watching video natively. That way, the website video will work on all free-software-only desktops. Video numbers can be improved still further by updating the free software Java applet video player Cortado to add WebM support. That combination would likely get us as close to 100% compatibility as reasonably possible using only a single codec.
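That graceful fall-back is just the <video> element’s built-in behaviour: browsers that understand the tag play the file natively, while browsers that don’t render the markup nested inside it instead. A sketch – the player file name and flashvars syntax are illustrative, player-specific assumptions:

```
<!-- Native playback where <video> is supported (Firefox, Chrome, Opera) -->
<video src="clip.webm" controls width="640" height="360">
  <!-- Only browsers WITHOUT <video> support render this fallback -->
  <object type="application/x-shockwave-flash" data="player.swf"
          width="640" height="360">
    <param name="movie" value="player.swf">
    <param name="flashvars" value="file=clip.webm">
    <!-- A Cortado applet could be nested here as a final Java fallback -->
  </object>
</video>
```

The key point is that a single WebM file serves every tier, so nothing needs to be encoded twice.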

There are some reasons why this could fail. Perhaps the percentage of IE users that don’t have Flash, Java or the Directshow Filters plugin installed (and can’t play the video natively either, having a version older than IE9) will be larger than the number of GNU/Linux desktop users. I expect this to be very unlikely. However if H.264 remains the only option for iPhone-style devices, that might help tilt the scales in H.264’s favor. Another problem is that a lot of video recording devices such as webcams and some digital camcorders record to H.264 natively. It might be more efficient for the website maintainer to keep video in that format (even if heavy edits are required). Fortunately most web videos are so short that transcoding times probably won’t matter… but it’s a minor concern.

But what about playback today using entirely free software? Flash sucks on GNU/Linux! Enter Gnash and FFmpeg. The latest Gnash version (0.8.8 at the time of writing) works with YouTube 99% as well as Flash on Windows. Other video sites… not so much. In particular, I still have problems with Gnash when I try to play AVGN and Zero Punctuation – but I have a solution for these as well – the gecko-mediaplayer plugin with Greasemonkey. Once those are installed, grab the Greasemonkey scripts Download Escapist Videos and Gametrailers No Flash. You will also want to install rtmpdump. With those all installed, when you want to check out Zero Punctuation simply click the Download link that now appears under the video. Gecko MediaPlayer will kick in and give you video that takes up the entire browser window. As for AVGN, I discovered that GameTrailers hosts all the ScrewAttack content, which includes many of the AVGN videos. Simply head on over to the ScrewAttack section – the latest video should be towards the top. Note that you have to sign in for the script to work, but basically it just takes the Download link and streams it to Gecko MediaPlayer, which gets embedded in the area that Flash normally resides in. It works perfectly.

So there are a lot of hacks involved. Gnash is buggy, and FFmpeg might have patent issues depending on the codec and your country. The AVGN solution involved finding an alternative (but still non-free, possibly patent-infringing) format. Lastly, the Zero Punctuation hack basically involved a stream ripper, Gecko MediaPlayer and (probably) FFmpeg too. This is ugly as hell, but it works. When it works the first time, it’s a wonderful feeling. Unfortunately if you want native WebM in Firefox you need to upgrade to the Firefox 4 beta, and today’s Greasemonkey checkout still has serious compatibility issues (although it’s being actively worked on). When Greasemonkey works natively in Firefox 4 and both projects release a stable build (expected this year), things will be looking very nice… and I imagine Gnash will get better in the meantime. YouTube is also testing encoding videos to WebM format, so hopefully they keep that up and encourage other video sites to follow suit. All systems are go!

What an awesome week for freedom

Wednesday, Richard Stallman gave a speech on Free Software in Ethics and in Practise at Melbourne University. Thursday, he gave a talk on Copyright vs Community in the Age of Computer Networks at RMIT. I had the pleasure of attending both, and asking Richard a few quick questions regarding Android phones, IaaS and the FreedomBox project.

Today, I just got back from the Melbourne Software Freedom Day event at the State Library. I was pleasantly surprised by the number of volunteers in orange t-shirts. Whilst many of the talks were focused specifically on open source as opposed to free software (such as the RedHat talk), all talks I elected to attend were very interesting and occasionally even exciting.

Well done to everyone who helped organise and run the event, and a special thanks to sponsors the Victorian Government, the State Library of Victoria, LUV and Linux Australia. Looking very forward to next year.

Being a sysadmin is dangerous work

Okay… so I hit my head on an overhead sprinkler nozzle. Those things are sharp! Despite the small size of the cut, I needed three stitches.

not my first forehead scar

Hard to believe the mess it made. I just happened to be wearing shoes I bought just two weeks ago, and pants that I recently had professionally patched.

Shoes need a clean

quite a mess

Nothing compared to what Lucas had to clean up. Sorry! πŸ™‚

Also – sorry to all users who had issues with the sign-up page earlier. I believe these are all sorted now. If there are any further problems, feel free to drop me an e-mail using the address in the website footer.