Category Archives: Software

Giving up fglrx in Debian Wheezy

The title says it all. A recent update has once again broken fglrx direct rendering under Xorg, so I’ve decided to just switch over to the free software Gallium driver entirely. This means no Amnesia, but I’ve since finished that game. It probably goes without saying that CrossFire won’t work now either, so… I would like to say that three of my GPUs are just doing nothing, but there are still power management issues with the radeon driver, so the fans are deafening my wife and me while my cards cook at around 80-90 degrees, heating up my apartment noticeably – an annoyance since we’re heading towards the middle of summer here. It also means no OpenCL support, since the AMD APP SDK depends on fglrx – although fortunately I haven’t been using that lately either.

The uninstallation of fglrx did not go smoothly. Since first performing my current desktop OS install, I have at times manually run the installer downloaded from AMD’s website, which spread files all over the place. These had to be cleaned up. The following two links were the most useful I came across for dealing with this problem:

However, the final issue I had was documented on neither of those. The AMD installer created a file on my system in /etc/profile.d/ati-fglrx.sh which set an environment variable ($LIBGL_DRIVERS_PATH, IIRC) that caused direct rendering to fail. Removing that file and logging out and in again got everything back to normal… well, “normal” as described above. :/

I’m still keeping fglrx on my laptop though (which I haven’t updated in a while)… for now. I don’t want my laptop to run into the same power management issues leading up to Linux.conf.au 2012.

Here’s something I’ll be taking away from this experience. Proprietary software might sometimes be better than free software, but generally there can be no expectation of it becoming any better in the future than it is today. In the future it may become incompatible, may add new restrictions upon you, may not support new formats, may force you to upgrade (sometimes at cost) to continue functioning properly, etc. The issue described in this post was the first of these. With free software however, I can generally expect that the software I have today will never become worse over time – that is, it only gets better. Even in cases where ‘better’ is debatable (eg. GNOME 3), it can be (and often is) forked by anyone. That’s one of the reasons I love it.

To show my support for free software and software freedom, I have finally done something I feel guilty for not having done a long time ago – I became an associate member of the Free Software Foundation.
[FSF Associate Member]

Tough time for Debian Wheezy users running fglrx (and farewell GNOME)

Do you run the fglrx driver on Debian Wheezy? I sure do, and if you’re like me I feel your pain.

About a month ago, the fglrx packages were added back into Debian Testing, so the previous workaround is no longer required. Unfortunately, not long after things appeared to be working, the Debian guys decided it would be a good time to upgrade to GNOME 3, which caused all kinds of graphical corruption and made Debian all but unusable for me. I actually found myself booting into my Windows install for a while.

A newer fglrx driver was then released which fixed most of the graphical glitches, however things were still far from perfect. As an example, tooltip pop-up text in Firefox was rendered incorrectly and was barely readable, but for the most part things were okay… until I tried to play a video. Hello bug #649346 – “fglrx-driver: using xv extension crashes Xorg”. At the time of writing, this still isn’t fixed. Naturally, this is all terribly frustrating.

I generally use mplayer whenever I need to watch a video, so the workaround for me was adding “vo=gl” under the [default] section in my ~/.mplayer/config file, and making extra sure mplayer is the default video player for everything!
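For reference, the workaround amounts to nothing more than this snippet (the guard against adding it twice is my own addition):

```shell
# Append the vo=gl workaround to mplayer's per-user config,
# skipping the write if it is already present.
mkdir -p ~/.mplayer
grep -qs '^vo=gl' ~/.mplayer/config || cat >> ~/.mplayer/config <<'EOF'
[default]
vo=gl
EOF
```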

There’s one other interesting thing that happened to me over the weekend – I ditched GNOME. I’ve been a GNOME fan-boy since the pre-1.0 releases back around 1998, so you might imagine the significance of this. Certainly some of the lead GNOME guys have previously upset me by encouraging some further development to be in Mono, but my real reason for doing this is simply because modern GNOME 3.x versions just don’t seem to cater to me any more. After using it for a few weeks, I just feel too constrained.

For example, I wanted a way to select an appropriate font size. I couldn’t – I could only choose “Small”, “Medium”, “Large”, etc. I know I like font size 8, but there was no way to select it – every option gave something too big or too small for my liking.

Another thing I use all the time is virtual desktops. Right now, I’m using 3, but sometimes I find myself using 10 or more depending on my workload. Because GNOME has always defaulted to two horizontal panels along the top and bottom of the screen, my virtual desktops have also always been aligned horizontally. GNOME 3 changes this – you have to get used to managing them vertically. Further, I can’t assign e-mail to virtual desktop 7 – GNOME only creates them as you need them. This may seem like nit-picking, but it’s too difficult to get used to, and it just feels inefficient.

Then there’s the Alt+Tab functionality. How could they screw that up? Well, if you have 10 Terminator (xterm) windows open, for example, GNOME considers them all to be a single application. So when you Alt+Tab to switch through them, they all appear as a single item. Instead, you must Alt+Tab to Terminator, and then Alt+` to switch between the individual terminals. I’m sure they were aiming for efficiency here (for a change), but it all feels very tedious and breaks conventions everyone is already used to.

Opening a new program is also annoying in GNOME 3. You need to move the mouse over to the top-left corner of the screen, then click an Applications icon that appears a few centimetres away (probably further away on larger screens). A large (huge?) icon for every application appears after a few seconds of loading time – impractical for most people since the list is so big, so you need to narrow down the results by category. So now move the mouse way over to the right side of the screen to what looks a bit like the traditional GNOME 2.x menu options. These limit the giant application icons to only those that fall within the chosen category.

What happens if an application has been installed manually and does not have a GNOME launch icon? Well, you need to create one manually, of course! Fire up your favourite text editor and create one under ~/.local/share/applications/ or some such. What a pain in the ass! Unlike the good old days of creating a custom application launcher through the GUI, in GNOME 3 you need to do it all through text editors.
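To illustrate, here’s the sort of minimal launcher file you end up hand-writing (the application name and path are made up for this example):

```shell
# Hand-craft a launcher for a manually-installed application;
# "myapp" and its install path are placeholders.
mkdir -p ~/.local/share/applications
cat > ~/.local/share/applications/myapp.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=My App
Exec=/opt/myapp/bin/myapp
Terminal=false
EOF
```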

You can add commonly used application launch icons to a dock on the left-hand side of the screen, but if you’re like me and use a bunch of different applications depending on the task at hand, that’s not particularly helpful. In fact, I quite frequently find myself hitting ALT+F2 to just type the name of the application I want to launch. This functionality is still there in GNOME 3; however, it’s far less useful than it used to be – auto-complete functionality seems to be missing. Still, it’s the best option for launching applications when you don’t want to bother creating launch icons.

Some of the GNOME 3 options simply aren’t even implemented. For example, telling GNOME you want your user to automatically log in doesn’t work – you need to edit configuration files. How a major release ever made it out in such a state I’ll never know.
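In my case the configuration file in question was GDM’s. Assuming GDM3 is your display manager, the edit amounts to something like this in /etc/gdm3/daemon.conf (the username is a placeholder):

```
[daemon]
AutomaticLoginEnable = true
AutomaticLogin = yourusername
```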

Another thing I wanted to do was tell GNOME that my default terminal should be Terminator, since it was clearly ignoring my /etc/alternatives/x-terminal-emulator setting. Unfortunately, that’s a matter of firing up gconf-editor and hunting down the option. What used to be a simple drop-down menu in GNOME 2.x no longer exists!

Some of the above issues can be worked around via Shell Extensions and GnomeTweakTool, but it seems stupid to be forced to waste time with hacks just to get basic functionality going. Firefox provides everything needed for efficient web browsing out of the box, and if you want extra uncommon functionality the extensions are there to help you out – but it’s still a perfectly good web browser without them. GNOME 3, on the other hand, just feels useless as a desktop without them. It’s a disaster.

So what have I switched to? I wanted something Debian was likely to have good support for, so I started poking around the available packages:

$ for i in $(apt-cache show task-desktop | grep ^Recommends: | cut -d ' ' -f 2- | tr -d ',|') ; do [[ ${i} = *desktop ]] && echo ${i} ; done
task-gnome-desktop
task-kde-desktop
task-lxde-desktop
task-xfce-desktop
$

I’ve given KDE a number of chances over the years, but have always switched back to GNOME due to KDE’s complexity. When you get frustrated trying to hunt down an option you know should exist, something has to be wrong. However, my N900 does run LXDE in a chroot and it seemed okay, so I gave it a spin. Ouch, was it buggy! Trying to configure options would spit out random errors – errors which had been fixed in newer releases that made it into Ubuntu over a month ago, yet were still an issue in Debian Testing. It seemed to me like the Debian guys haven’t given LXDE much love, so that left me with Xfce. Linus Torvalds switched to it a while ago… how bad can it be? Well, I did try it years ago, and my memories of it were not good… but given the lack of options I thought I’d give it a try anyway. And boy was I impressed!

GNOME 2.x users will feel right at home. Imagine GNOME 2.x… except with more options for configuration out-of-the-box! I was able to make my Xfce desktop look and behave almost identically to GNOME 2.x, and it feels quicker to boot! I don’t know why the Mate project (aiming to fork GNOME 2) is bothering – Xfce just feels so right. :)

I did have one issue with Xfce sound, however. I have two sound cards – an Intel HD Audio controller located on my motherboard, and my Logitech G35 USB headset. Stock Xfce did not seem to provide any option for switching between these on the fly; however, audio was one thing that GNOME (both 2 and 3) got right… which gave me an idea. Under Settings -> ‘Session and Startup’ -> ‘Application Autostart’, I added /usr/bin/gnome-sound-applet (which comes from the gnome-control-center package). Now audio works just as well under Xfce via this applet as it does under GNOME. Beautiful!
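As far as I can tell, that ‘Application Autostart’ entry just writes a standard XDG desktop file under ~/.config/autostart, so the same thing can be done from a shell (the file name here is my own choice):

```shell
# Autostart the GNOME sound applet under Xfce by dropping a desktop
# entry where the session manager looks for autostart items.
mkdir -p ~/.config/autostart
cat > ~/.config/autostart/gnome-sound-applet.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=GNOME Sound Applet
Exec=/usr/bin/gnome-sound-applet
EOF
```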

There are a few other little things I’ve found in Xfce where I’ve thought “wow, that’s a nice touch”. Eg. I regularly Alt+drag windows around, but with Xfce you can actually drag them to neighbouring virtual desktops! It might not sound that amazing, but it feels nice. Also, when you want to move an applet around on the panels, a square appears that makes it very clear what the panel will look like if you left-click to confirm – as opposed to the GNOME way, where by the time you see the result you have already made the change by dragging. Lastly, say I click on a launcher for a program that is already open in another virtual desktop which I forgot about: instead of getting a flashing icon in the task panel and having to click it to jump to a different virtual desktop (as would be the case in GNOME 2.x), the application instantly moves from whichever virtual desktop it was on to the current one. These are all minor details, but they have left me pleasantly surprised.

As for the Xfce panel applets, some are better than those in GNOME 2.x, and some aren’t quite as good. Overall, I didn’t feel any worse off. I did think the Directory Menu applet would be really useful, but I haven’t relied on it much yet (perhaps out of habit of not having it). If you like GNOME 2 and hate GNOME 3, definitely do yourself a favour and give Xfce 4.8 a try for a few days and see what you think.

fglrx on Debian Wheezy (testing)

Packages fglrx-driver and fglrx-glx have been removed from Debian for the last few weeks due to #639875, so I’ve been using the free software Radeon drivers in the meantime. While I appreciate having my virtual terminals display the console at my native screen resolution automatically, I don’t like that I’ve had to put playing Amnesia on hold for a while – these drivers cause the game to segfault.

Today I decided to roll back my xorg version to get the fglrx drivers working again, and as it turns out, it really wasn’t that hard. Here’s how I did it.

  1. Set up a preference control file (eg. /etc/apt/preferences.d/60xorg_rollback_for_fglrx.pref) as follows:
    
    Package: xserver-xorg
    Pin: version 1:7.6+8
    Pin-Priority: 1001
    
    Package: xserver-xorg-core xserver-xorg-dev
    Pin: version 2:1.10.4-1
    Pin-Priority: 1001
    
    Package: xserver-xorg-input-evdev
    Pin: version 1:2.6.0-2+b1
    Pin-Priority: 1001
    
    Package: xserver-xorg-input-kbd
    Pin: version 1:1.6.0-3
    Pin-Priority: 1001
    
    Package: xserver-xorg-input-mouse
    Pin: version 1:1.7.1-1
    Pin-Priority: 1001
    
  2. Now add the following repositories to your apt sources configuration (eg. /etc/apt/sources.list.d/60snapshot-20110911T211512Z.list):
    deb http://snapshot.debian.org/archive/debian/20110911T211512Z/ wheezy main contrib non-free
    deb-src http://snapshot.debian.org/archive/debian/20110911T211512Z/ wheezy main contrib non-free

    These include Xorg package versions that don’t have the ABI change which is incompatible with fglrx.

  3. Normally, Debian will spit out the following error:
    E: Release file for http://snapshot.debian.org/archive/debian/20110911T211512Z/dists/wheezy/InRelease is expired (invalid since 33d 17h 12min 59s). Updates for this repository will not be applied.

    We fix this by adding an apt configuration file (eg. /etc/apt/apt.conf.d/60ignore_repo_date_check) like so:

    Acquire
    {
    	Check-Valid-Until "false";
    }
  4. We should now be able to resynchronize the package index successfully.
    apt-get update
  5. Log out of your X session (if you haven’t already), and (from a virtual terminal) stop gdm/gdm3/lightdm or anything else that might be responsible for an Xorg server process running. eg,
    /etc/init.d/gdm3 stop
  6. Revert xorg packages to older versions, as defined in our preferences policy.
    apt-get dist-upgrade
  7. Install the fglrx drivers from the snapshot repository.
    apt-get install fglrx-driver fglrx-glx fglrx-control
  8. Make sure Kernel Mode Setting is not enabled. This should (in theory) be handled automatically due to the /etc/modprobe.d/fglrx-driver.conf file created during the fglrx-driver package installation – or at least it seemed to be for me.
  9. Create a new xorg.conf file. Assuming Bash:
    mv /etc/X11/xorg.conf{,.$(date +'%Y%m%d%H%M')}
    aticonfig --initial
  10. Reboot, and you should be presented with some kind of X display manager login screen. If everything went well, you should be able to see the following:
    $ glxinfo | grep -E '(^direct\ |\ glx\ |^GLX\ |^OpenGL)' | grep -v '\ extensions:$'
    direct rendering: Yes
    server glx vendor string: ATI
    server glx version string: 1.4
    client glx vendor string: ATI
    client glx version string: 1.4
    GLX version: 1.4
    OpenGL vendor string: ATI Technologies Inc.
    OpenGL renderer string: AMD Radeon HD 6900 Series  
    OpenGL version string: 4.1.11005 Compatibility Profile Context
    OpenGL shading language version string: 4.10
    $ 
    

Update (2011-10-23, 10:14pm): In case it wasn’t clear, these changes are temporary. That brings up new questions, though: how will I know when I need to revert these changes, and how do I revert? The first question is easy to answer – simply run the following command:
$ [ "$(apt-cache policy fglrx-driver | grep -E '^(\ |\*){5}[0-9]+' | wc -l)" -ge 2 ] && echo "fglrx-driver is back - time to upgrade xorg"

You can optionally put that in a daily cron job to have it e-mail you when it is time.
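A cron version might look like the following sketch – written to the current directory here so it can be inspected first; install it as /etc/cron.daily/check-fglrx (the file name is my choice) once you’re happy with it:

```shell
# Daily check: cron e-mails any output, so this script prints only
# when apt-cache reports two or more fglrx-driver versions again.
cat > ./check-fglrx <<'EOF'
#!/bin/sh
count=$(apt-cache policy fglrx-driver 2>/dev/null | grep -cE '^( |\*){5}[0-9]+')
[ "${count:-0}" -ge 2 ] && echo "fglrx-driver is back - time to upgrade xorg"
exit 0
EOF
chmod +x ./check-fglrx
```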

Reverting the above changes is also very easy:
$ rm -f /etc/apt/preferences.d/60xorg_rollback_for_fglrx.pref /etc/apt/sources.list.d/60snapshot-20110911T211512Z.list /etc/apt/apt.conf.d/60ignore_repo_date_check
$ apt-get update && apt-get dist-upgrade
followed by a reboot.

Farewell Grip, hello Sound Juicer

For probably over a decade, I’ve been using Grip to rip all my audio CDs to MP3, Ogg Vorbis, and (for the last few years), FLAC. I admit that it’s not the easiest ripper to use these days, but its flexibility is unparalleled.

Unfortunately, it is no longer maintained, and the last stable release was from 2005. Due to this, many major distributions (including Debian) have removed it from the official repositories… which means compiling. What joy. :)

I brought home a new CD today. It might have been possible to buy it online, but I’m increasingly disappointed with digital download audio purchases. Unexpected censoring/editing, poor audio quality, lack of high-quality/complete album artwork, having to rename tracks purchased from different online music stores to meet my preferred naming convention, inconsistent tag naming conventions and tag types between stores, and a price that’s still very similar to the physical disc version have all been steering me away from them. If I happen to be down the street near a local CD store, it’s more convenient to duck in and get the real thing rather than purchasing some downloadable hacked imitation later, which will also involve more work in the long run.

As soon as I arrived home, I threw the disc in my laptop’s external USB DVD drive to rip it (because seriously – who uses a CD player to play music these days?) and up popped Sound Juicer. That’s normally my cue to start compiling Grip, but this time I paused – it had been a long time since I had investigated alternatives, and GStreamer (which Sound Juicer apparently uses) is the future, so I decided I’d spend a few moments to see if I could get Sound Juicer to rip the disc in the exact way I like.

My music naming convention is as follows:
<lowercase_artist_name>/<lowercase_album_name>/<zero-padded_track_number>-<lowercase_filename>.flac

I don’t use white-space or special characters in file names (I replace them with underscores) – there’s no point since getting the names exact is supposed to be what tagging information is for – and this convention is extremely easy to parse. Since track names are always in album order due to the padded track number, I can always play an album with mplayer *.flac or gst123 *.flac – which is how I generally play my music. It’s also uncommon in *NIX to see upper-case in file-names, so I keep everything lower-case too for consistency. Additionally, tags should be in native FLAC format – not id3v2 or something. I only require the band, album, album release year, track name and track number tags be set, but am not fussy if others are included.
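The convention is easy to script against. As a throwaway illustration (this helper is my own, not part of any ripper), lower-casing a track title and replacing everything else with underscores looks like:

```shell
# Normalise a track title per the naming convention above:
# lower-case everything, squeeze runs of other characters into "_".
normalise() {
  printf '%s' "$1" | tr '[:upper:]' '[:lower:]' | tr -cs 'a-z0-9.' '_'
}
normalise "Numbered Days"   # prints "numbered_days"
```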

Okay – so what’s Sound Juicer got? Clicking on Edit -> Preferences provides the following screen:

Hmm… doesn’t look too promising. The drop-down boxes don’t provide the exact configuration I want, but I can get pretty close. If only I could create my own custom settings! I quickly fired up gconf-editor to take a look at the behind-the-scenes configuration options, and found three relevant keys:

  • /apps/sound-juicer/path_pattern
  • /apps/sound-juicer/file_pattern
  • /apps/sound-juicer/strip-special

Clicking them provides a complete description of all the available parameters these can take, so I tweaked them as per the following screen shot:

Changes seem to be applied instantly, so I hit the big Sound Juicer Extract button, and lo and behold:

So far so good… but what about tagging information?

~/Music/Flac/killswitch_engage/alive_or_just_breathing$ lltag -S 01-numbered_days.flac
01-numbered_days.flac:
ARTIST=Killswitch Engage
TITLE=Numbered Days
ALBUM=Alive or Just Breathing
NUMBER=1
GENRE=Metalcore
DATE=2002-01-01
TRACKTOTAL=12
DISCID=9a0a860c
MUSICBRAINZ_DISCID=JakXtKdQUlm1n5i3sr9KRBqxIy4-
ARTISTSORT=Killswitch Engage
~/Music/Flac/killswitch_engage/alive_or_just_breathing$

Perfect!

Grip has served me well for a long time, and given me so many fond memories of spending hours ripping my entire collection… but now it seems it’s finally time to say goodbye – and hello Sound Juicer.

AMD Fusion E-350 – Debian Wheezy (testing) with Catalyst 11.5 drivers and VA-API support via XvBA

My new Sony Vaio 11.6″ laptop runs an E-350 processor. I’ve upgraded the RAM to 8GB and thrown in an OCZ Vertex 2 SSD. Even using LUKS under LVM2, it feels insanely fast.

Everything I’ve tried works perfectly on it out of the box, including the card reader, HDMI port, Gigabit NIC, USB ports, sound card, wireless, Bluetooth, volume/brightness/video output function keys, etc. Well, almost everything… graphics are one glaring exception.

For instance, when I go to log out or drop to a virtual terminal, I get all kinds of screen corruption and can’t see a thing. I always need to hit CTRL+ALT+F2 to switch to a virtual terminal, and CTRL+ALT+DEL to restart the machine to get the picture back. Fortunately the SSD means that’s under a 30-second wait to shut down and boot back into the login screen, but after owning this machine for just over two weeks, it was starting to get old.

Further, 3D acceleration via Mesa3D was painful. Games like the Penumbra Collection ran at about 1 FPS. It looks like the Phoronix guys have tested newer versions with Gallium3D which look slightly more promising, however none of that is in Debian, and I don’t want to waste lots of time recompiling everything and potentially have new updates break things. Further, while I could watch 720p video reasonably well, 1080p video would occasionally cause slowdowns and prevent the viewing experience from being completely enjoyable.

Time to do something I’d rather not… and install the proprietary driver. 11.3 is currently in the testing repos, however it still isn’t efficient enough to allow 1080p video to play properly. The newer 11.5 Catalyst driver downloaded from AMD also fails to install properly due to a DKMS compilation error.

Fortunately, Mindwerks has instructions on how to fix this issue under Ubuntu 11.04. With a few little tweaks they can be made to work with Debian too.

Instructions:

Debian Wheezy (7.0) users can finally get fglrx playing nicely together with X.Org X Server 1.9.5. We can also make the latest driver work well with the 2.6.39 kernel.

The Mindwerks custom build procedure follows, adapted for Debian Wheezy (7.0) users. Note that the commands are for 64-bit, but the only change 32-bit users likely need to make is to download an i386 package instead in step 8. Also, the following commands assume you use sudo to gain root.

  1. Install the latest 2.6.39 kernel revision from kernel.org. eg.
    $ wget http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.39-rc7.tar.bz2
    $ wget http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.39-rc7.tar.bz2.sign
    $ gpg --verify linux-2.6.39-rc7.tar.bz2.sign linux-2.6.39-rc7.tar.bz2
    $ tar jxf linux-2.6.39-rc7.tar.bz2
    $ cd linux-2.6.39-rc7
    $ make-kpkg -j2 --rootcmd fakeroot --initrd --bzimage kernel_headers kernel_image
    $ sudo dpkg -i ../*.deb

    and boot it.

  2. Download the AMD Catalyst 11.5 64-bit driver installer.
  3. Install the Debian VA-API packages. Note that some say to get them from the Splitted Desktop website; however, I tried them and didn’t notice any benefit in doing so, so I reverted to the following:
    $ sudo apt-get install libva1 libva-x11-1 libva-tpi1 libva-glx1 libva-dev
  4. Extract the files from the package:
    $ sh ./ati-driver-installer-11-5-x86.x86_64.run --extract ati
  5. For 2.6.39 support, download this additional patch: 2.6.39_bkl.patch
    $ wget http://www.mindwerks.net/wp-content/uploads/2011/03/2.6.39_bkl.patch
  6. Check for Big Kernel Lock usage:
    $ grep -c CONFIG_BKL=y /lib/modules/$(uname -r)/build/.config

    If the result of this command is 0, then download no_bkl.patch as well. For stock kernels you should get 0 and will need the patch – which is probably the main reason you are here. :)

    $ wget http://www.mindwerks.net/wp-content/uploads/2011/03/no_bkl.patch
  7. Then apply them:
    $ cd ati; for i in ../*.patch; do patch -p1 < $i; done
  8. Build your new ati/fglrx driver:
    $ sudo ./ati-installer.sh 8.85.6 --install
  9. Since we’re using the proprietary drivers, we might as well make the most of them. Grab the latest XvBA backend library for VA-API:
    $ wget http://www.splitted-desktop.com/~gbeauchesne/xvba-video/xvba-video_0.7.8-1_amd64.deb
    $ sudo dpkg -i xvba-video_0.7.8-1_amd64.deb
    $ cd /usr/lib/dri && sudo ln -s ../va/drivers/xvba_drv_video.so fglrx_drv_video.so
  10. If your /etc/X11/xorg.conf is missing you will need to run:
    $ sudo aticonfig --initial

    and then reboot.

That newly created package should work for the entire 2.6.39 series.

These steps are really useful for AMD Fusion users at the moment. Without VA-API, multi-threaded mplayer will occupy 100% of available CPU (both cores) and drop frames when playing a test 1080p MKV file I have (containing H.264+AAC+ASS). It’s not unwatchable, but it’s annoying. With the VA-API-patched mplayer, CPU usage never hits 15% when playing the same test video!

It’s a pity that the Catalyst driver doesn’t currently generate correct deb packages for Debian, but the fglrx packages are not listed as a dependency for xvba-video, so it’s not a major problem.

Update (2011-05-17, 12:41am): Using the new kernel is not necessary. Although there appear to be a few AMD Fusion fixes in there, I haven’t noticed any benefit in using 2.6.39-rc7 – possibly because I’m using the proprietary drivers. In fact, I’m not completely convinced the Catalyst 11.5 drivers are required either as I was having problems getting mplayer to use VA-API when I was testing 11.3. I’d be interested to know if just steps 3, 9 and 10 would be sufficient for hardware decoding using the patched mplayer and the default fglrx-atieventsd fglrx-control fglrx-driver fglrx-glx fglrx-glx-ia32 fglrx-modules-dkms package versions. I might very well have done much more than required.

Update (2011-05-22, 9:04pm): You do not need Catalyst version 11.5 – you can use the version in the repositories. In fact, I recommend it. The reason (aside from having the drivers managed via apt) is that compiling and installing 11.5 via the instructions above won’t enable 32-bit compatibility. I tried running a few games under WINE yesterday, but WINE always complained that OpenGL (and hence DirectX) was not available to it. Presumably I needed whatever the fglrx-glx-ia32 package includes. Possibly I was just missing something as simple as a symbolic link before, but I didn’t investigate – I’d rather use the standard Debian packages wherever possible if they do the job. Also, you don’t need to manually fetch the Splitted Desktop xvba-video package – xvba-va-driver is basically the same thing, except that A) xvba-va-driver is in the official Debian repositories and B) it specifies the fglrx packages as a required dependency.

So in summary:

  1. Don’t touch your kernel.
  2. $ sudo apt-get install libva1 libva-x11-1 libva-tpi1 libva-glx1 libva-dev xvba-va-driver fglrx-atieventsd fglrx-control fglrx-driver fglrx-glx fglrx-glx-ia32 fglrx-modules-dkms
  3. $ sudo aticonfig --initial
  4. Reboot
  5. Use the VA-API-patched mplayer for watching movies.

It’s easy when you know how. :)

Update (2011-07-03, 12:11am): If you run gnome-mplayer (perhaps so you can make use of gecko-mediaplayer for libva-accelerated video in Firefox), be aware that Debian currently ships nothing later than 0.9.9.2 – even in sid. Thus you will encounter a problem with a filter gnome-mplayer automatically uses which is incompatible with VA-API, causing VA-API to be automatically disabled. I can verify that 1.0.4 avoids this issue, and allows gecko-mediaplayer to work wonders when combined with the ViewTube and YouTube HTML5 forever Greasemonkey scripts.

It’s not you, it’s me.

I’ve used a number of GNU/Linux distributions over the years, some more than others. This weekend, I’ve apparently been overcome by nostalgia and revisited a distribution I haven’t used in years.

Early beginnings

When I first started getting serious about leaving Windows in the late ’90s, I used RedHat 5.2 as my primary OS. Release after release went by, and I became convinced it was continuing to lose its charm. There was a limited number of packages available, and I regularly had to hit rpmfind.net to get the things I wanted installed. First, RedHat 7 shipped with a broken compiler. Next, RedHat 8 shipped with Bluecurve, which seemed like a crippled mash of GNOME and KDE, making certain things quite difficult to compile either way. Given the small repositories, compilation was frequently necessary. There had to be a better way…

I occasionally glanced Debian’s way, but at the time it was too archaic. I could not (for example) understand how anything with such a primitive installer could be any good. eg. Why would one ever want to manually specify kernel modules to install? Why not at least have an option to install the lot and just load what’s required, like other distributions did? Then once you got the thing installed and working, packages would be quite old. You could switch to testing, but then you wouldn’t have a stable distribution any more. The whole distribution felt very server/CLI-orientated, and at the time (having not so long ago come from a Win9x environment) it was not easy to appreciate. It also lacked a community with a low barrier to entry – eg. IRC and mailing lists instead of forums.

Enter Gentoo. It seemed to always make the headlines on Slashdot, so I had to see what all the fuss was about. Certainly it was no snap to install, but the attention to detail was incredible. The boot-splash graphics (without hiding the important start-up messages as other distros later tried to do) and automatic detection of sound and network cards were a sight for sore eyes. The LiveCD alone was my first go-to CD for system rescues for many years due to its huge hardware support, ease of use, high-resolution console and short boot time (as it didn’t come with X). Further, everything was quite up-to-date – certainly more so than basically anything else at the time. Later I came to realise the beauty of rolling releases and community.

So I compiled, and compiled, and compiled some more. My AMD Duron 900MHz with 512MB RAM and 60GB IDE HDDs seemed pretty sweet at the time, but compiling X or GNOME would still basically require me to leave the machine on overnight. It didn’t help that I only had a 56k dial-up connection. Sometimes packages would not compile successfully, so I would have to fix them and resume compilation the next morning (although the fix was only ever a Google search away). With the amount of time I invested into learning Gentoo GNU/Linux instead of studying my university courses, it’s amazing I managed to get my degree. I even went so far as to install Gentoo on my Pentium 166MMX laptop with 80MB RAM (using distcc and ccache).

Later, during my university work-experience year, I used Gentoo on the first production web server I ever built. Another “senior” system administrator came along at one point to fix an issue it had with qmail, and he was mighty upset to see Gentoo. Not because it was a rolling release (and I’m pretty sure Hardened Gentoo didn’t exist at the time either), but because he didn’t know the rc-update command (he expected RedHat’s chkconfig) and was only familiar with System V and BSD init scripts. I think he would have had a heart attack if he later saw Ubuntu’s Upstart system (which has also been quite inconsistent across releases over the years)!

During my final university year, I used Gentoo again for a team-based development project. For years it seemed so awesome that I could hardly bear to look at another distro. The forums were always active, and the flexibility the ports-based system provided was unmatched by any other GNU/Linux distribution. However, it was at this time that Ubuntu started making Slashdot headlines – perhaps more so than Gentoo. Whilst I was mildly curious and read the articles, I didn’t immediately bother to try Ubuntu myself. After all, it was just a new Debian fork, and I knew about Debian. The next year I attended the 2005 Linux.conf.au in Canberra – still with my trusty P166 laptop running Gentoo, I might add. After arriving at the hallway where everyone was browsing the web on their personal laptops with the free wifi, it was immediately obvious I was the odd one out. Not just because my computer was ancient by comparison, but because I was just about the only person who *wasn’t* running Ubuntu. I think I literally saw just one other laptop not running Ubuntu – I simply could not believe it. I had to see what the fuss was about.

A few days after the conference, I partitioned my hard disk and installed Ubuntu Warty. Hoary had just been released, but I already had a Warty install CD and I was still limited to dial-up – broadband plans required a one-year contract and installation fees, and I was frequently moving around. Unlike Debian, the Warty installation was quick and simple. The community was quite large, and while it seemed smaller than Gentoo it was growing rapidly. Because user-friendly documentation actually existed, I felt more comfortable using the package management system. I didn’t need to use the horrible dselect package tool either. The packages were quite up-to-date; perhaps not on the same level as Gentoo, but close enough that any differences didn’t matter. Ultimately I was obtaining all the notable benefits Gentoo had offered, without the downside of huge downloads and slow compilations. By the end of the year I wasn’t using Gentoo on my personal machines any longer.

Side-note:

Seemingly inspired by Ubuntu, Debian has made significant improvements over the years since – shorter release cycles, more beginner- and user-friendly support pages and wikis (although no forums so far), a much easier and more efficient installer, etc. Additionally, because of my familiarity with the Debian-based Ubuntu, I later found myself far more comfortable using Debian. I might even go so far as to say enjoying Debian. Indeed, my home server is a Debian box running Xen, whereas Ubuntu doesn’t even support Xen in Dom0. Further, unlike Ubuntu, Debian hasn’t changed the init script system every six months. Each release does things the way you would expect, whereas Ubuntu frequently “rocks the boat”, requiring more time going through documentation trying to figure out how new releases work. Being a free software supporter, I also don’t appreciate Ubuntu recommending that I install the proprietary Flash player in Firefox, or offering to automatically install nVidia’s proprietary drivers. If I need to use proprietary software, I want it to be a very conscious decision so I know exactly which areas of my OS are tainted. In some ways, Debian is stealing back some of the thunder it lost to Ubuntu – at least in my book.

Life after Ubuntu

The installation I had been using on my personal desktop was an Ubuntu install, upgraded with each release from 7.10 (Gutsy) to the current 10.10 (Maverick), occasionally adding third-party PPAs and whatnot over the years. Whilst I had upgraded my file-systems to ext4, I knew I would need to reformat them from scratch to get the full performance benefits. There was probably a ton of bloat on the system (unused packages, hidden configuration files and directories under $HOME, etc.) and many configuration files probably differed greatly from the improved defaults one would get on a fresh modern install. As this weekend was a day longer than usual thanks to Australia Day, I ultimately decided it was a good time to reinstall.

What to install, however? Ubuntu again? I was seriously getting tired of it. It just didn’t feel interesting any more, although it did the job. Additionally, Ubuntu doesn’t strip the kernel of binary blobs like other distributions do (including Debian, starting with Squeeze) – another cross in my book. No, it was time for a change. Debian might have been the obvious choice – particularly since Squeeze was released recently, making it fresh and new. However, I was already somewhat familiar with Squeeze (I installed it on my Asus EeePC 701 netbook last week) and it still felt a bit dated in some areas compared with Ubuntu. I’m also conscious that the next version probably won’t be released as stable for at least another year or two. Further, I don’t appreciate how Debian considers many of the GNU *-doc packages non-free. I want those installed, but I hate the thought of enabling non-free repositories just to get them. Perhaps I could find something completely free instead?

Completely free GNU/Linux options

With that in mind, I did a little bit of investigation into completely free GNU/Linux distributions (as per the FSF guidelines). Let’s look at the list:

Blag

Based on Fedora. I don’t like Fedora, but I might be willing to look past that. I do however need a distribution that is kept up to date. According to Wikipedia, “the latest stable release, BLAG90001, is based on Fedora 9, and was released 21 July 2008”. In all likelihood, this suggests the stable release hasn’t seen security updates in a long time (Fedora 9 reached end of life some time ago). The Blag website does list a 140k RC1 version based on Fedora 14 (it’s not clear when this was posted), but other parts of the website, such as the Features section, still reference Fedora 9 as the base distribution. It would seem the latest Blag RC has only appeared after well over two years, and it’s not even marked as stable.

Further, I’m a little skeptical of installing a distribution that is basically the same as another distribution with bits stripped out to make it completely free – there is always the chance that something was missed, or a certain uncommonly-installed package will malfunction with part of a dependency removed. On the face of it, Blag fills me with doubts.

Dragora

Something new, and not based on any other distribution. Sounds intriguing. Just the coolness factor of using something unlike anything I’ve ever used before gives this bonus points, if only for the fact that the guys doing this must be serious due to doing so much work (eg. new init system, new package management system, installer, documentation, etc.). The Wikipedia page is light on details and the project website redirects to a broken wiki with the text “Be patient: we are working in the content of the new site. – Thanks”. There is a link to the old wiki at the bottom of the page, but it redirects to a Spanish page by default. Fortunately there is an English version there you can select, and it looks mostly as complete as the Spanish version.

It’s nice to see this project has its own artwork and a few translations. There is also a download wiki page which indicates that there is a regular release cycle. Although the wiki doesn’t indicate which architectures are supported, at least two of the mirrors provide x86_64 downloads. SHA1 checksums and GPG signatures of the images are accounted for – always a good sign to see integrity considered. As an aside, apparently Arch GNU/Linux doesn’t verify package signatures downloaded via its package management system which is why I would never consider using it.
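Verifying a downloaded image is quick enough that there is no excuse to skip it. A sketch of the round trip using a stand-in file (substitute the real image and its published .sha1 and signature files when verifying an actual download):

```shell
# Stand-in file for the ISO:
printf 'pretend this is an ISO' > dragora.iso
sha1sum dragora.iso > dragora.iso.sha1

# This is the check you run against a real download and its
# published .sha1 file – it prints "dragora.iso: OK" on success:
sha1sum -c dragora.iso.sha1

# The GPG half (not run here) looks something like this, after
# importing the project's public key:
#   gpg --verify dragora.iso.asc dragora.iso
```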

Dynebolic

From the FSF’s description, Dynebolic focuses on audio and video editing. I actually do intend to purchase a video camera at some point this year so I can start uploading videos; for now, however, I have no need for such tools.

gNewSense

The distribution Richard Stallman apparently uses as of late. I suspect that since Stallman now owns a MIPS-based netbook, he found gNewSense supported that configuration better than Ututo (which he used previously). Unfortunately I use an Intel i7 with 6GB of RAM and a GTX480 graphics card with 1.5GB of RAM – I need a 64-bit distribution to address all that memory, and gNewSense doesn’t support the x86_64 instruction set. The last stable release (v2.3) was also 17 months ago, according to Wikipedia at the time of writing.

Musix

Based on Knoppix. Isn’t Knoppix a blend of packages from various distributions and distribution versions? Last I checked (admittedly many years ago) it didn’t look like something easy to maintain. Musix also apparently has an emphasis on audio production, which would explain the name. As mentioned previously I don’t have a requirement for such tools.

Trisquel

I’ve actually used this at work for Xen Dom0 and DomU servers (although it’s surely not the best distribution for Dom0 – I had to use my own kernel and Xen install), and it works quite well. Basically, the latest version is just a re-badged Ubuntu 10.04 LTS with all the proprietary bits stripped out. Unfortunately, in many ways it’s a step back from what I was already using on my desktop – older packages than Ubuntu 10.10, a far smaller community and much less support/resources (although 90+% of whatever works for Ubuntu 10.04 will work for the latest Trisquel, 4.0.1 at the time of writing). From what I can see there is basically no compelling reason for me to use Trisquel as the primary OS on my personal desktop at this time – I would probably be better off installing Debian Squeeze with the non-free repositories disabled.

Ututo

Based on Gentoo. Sounds good, right? However upon further investigation, it’s clear that the documentation is entirely in Spanish! This may not be a problem for Stallman, but I don’t speak (or read) Spanish.

Venenux

Apparently this distribution is built around the KDE desktop. I personally don’t like KDE and don’t run it, so it doesn’t bode well so far. Heading on over to the Venenux website I once again fail to find any English, so scratch Venenux off the list.

So that’s it – the complete list of FSF-endorsed GNU/Linux distributions. Of all of the above, there is only one worthy of further investigation for my purposes – Dragora.

Dragora on trial

I downloaded, verified and burnt the 2.1 stable x86_64 release, booted it, and attempted an installation.

As is the case with the website, during installation I faced quite a bit of broken English – another potential sign that the project doesn’t have much of a community around it. However I wasn’t about to let that get to me. Perhaps it did have a large community, but simply not a large native-English-speaking community? Besides, English corrections aren’t a problem – I can take care of that by submitting patches at any time should upstream accept them.

The next thing I noticed was that my keyboard wasn’t working. I unplugged it and plugged it back in again, and pressed caps-lock a few times, but the caps-lock LED never lit up. Clearly, the kernel wasn’t compiled with USB support (or the required module wasn’t loaded). Luckily, I managed to find a spare PS/2 keyboard which worked.

Like Gentoo, the first step involved manually partitioning my hard drives. I like to use LVM on RAID, and fortunately all the tools needed to do that were provided on the CD. It wasn’t clear whether I needed to format the partitions as well, so I formatted over 2TB of data anyway. Unfortunately, during the setup-program stage the tool formatted my partitions again despite my checking the box telling it not to, which wasted a lot of time.
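For reference, the layout I was after is simple enough to build by hand with the tools on the CD. A rough sketch – device names and sizes here are hypothetical, and these commands destroy data:

```
# /boot on a two-disk RAID1 mirror:
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1

# The 4-disk RAID5 array that will hold LVM:
mdadm --create /dev/md1 --level=5 --raid-devices=4 \
      /dev/sda2 /dev/sdb2 /dev/sdc2 /dev/sdd2

# LVM2 on top, with the root file-system inside:
pvcreate /dev/md1
vgcreate vg0 /dev/md1
lvcreate -L 40G -n root vg0
mkfs.ext4 /dev/vg0/root
```

The catch, of course, is that the installed kernel then has to assemble md1 and activate vg0 before it can mount root – which is exactly where things fell apart.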

While waiting, I looked over some of the install scripts and noted that they were written in bash. All the comments looked to be in Spanish, even though all the installer prompts were in English – I found this both strange and frustrating. I also elected to install everything, as that was the recommended option. Lastly, I attempted to install Grub… and things didn’t go so well from that point onwards. As far as I could tell, the kernel was not able to support an MDADM+LVM2 root file-system (/boot was on a plain MDADM RAID1 setup, which I know normally works). It looked like I would either need to figure out why the kernel wasn’t working, or reformat everything and not use my 4-disk RAID5 array to host a LVM2 layout containing the root file-system. At this point my patience had truly run out and I decided it was time to try something else. I also never managed to figure out how the package management system worked, as that was one section missing from the wiki documentation.

An old friend

So what did I end up with? Gentoo! Not exactly a completely free-software distribution… in fact, I was shocked by just how much proprietary software Gentoo supports installing. The kernel won’t have all the non-free blobs removed either, although I suspect I don’t need to compile any modules that would load them. Gentoo is also feeling very nostalgic to me right now.

Since the last time I used it, there have been some notable changes and improvements. For example, stage1 and stage2 installations are no longer supported. Previously I only ever used stage1 installs, so this felt a bit like cheating. Additionally, when compiling multiple programs at once, package notes are now displayed again at the end of compilation (regardless of the success or failure of the last package), so you actually have a chance to read them. Perhaps the most important change, however, is that source file downloads now occur in the background, in parallel with compilation (easily seen by tailing the logs during a build) – this saves a lot of time over the old download, compile, repeat process of years ago.
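If I’m reading the Portage documentation correctly, this is the parallel-fetch feature, and it can be confirmed and observed like so:

```
# /etc/make.conf – on by default in recent Portage versions:
FEATURES="parallel-fetch"

# Watch the background downloads in another terminal while emerge compiles:
#   tail -f /var/log/emerge-fetch.log
```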

Not all USE flags are listed in the documentation yet, and I have run into some circular dependency issues when using the doc USE flag (which is apparently common, according to my web searches). I’ve had to add six lines to the package.accept_keywords and package.use files to get everything going. However, now I’m all set up. I’ve compiled a complete GNOME desktop with the Gimp, LibreOffice and many others. Unlike the days of my Duron, which had to be left on overnight, my overclocked i7 950 rips through the builds – sometimes as quickly as they can be downloaded from a local mirror over my ADSL2+ connection.
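For the record, those files live under /etc/portage and the entries are one-liners. The package names below are illustrative, not my exact six lines:

```
# /etc/portage/package.accept_keywords – accept a ~amd64 (testing) package:
app-office/libreoffice ~amd64

# /etc/portage/package.use – per-package USE flag tweaks:
app-editors/emacs -doc
media-gfx/gimp python
```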

Although the Gentoo website layout has largely been unchanged over the years, and some of the documentation is ever-so-slightly out of date, I get the sense that Gentoo is still a serious GNU/Linux distribution. I didn’t encounter any compilation issues that couldn’t be quickly worked around and it feels very fast. I still use Ubuntu 10.10 at work and Debian Squeeze on my netbook and home server, but going back to ebuilds still feels very strange and somewhat exciting. If only Gentoo focused more on freedom it would be darn near perfect.

Birth of the FreedomBox Foundation

Eben Moglen’s FreedomBox idea has had my attention ever since his Silver lining in the cloud speech in August last year. Unfortunately I hadn’t noticed any visible progress on the project – until today. It looks like things have indeed been going on behind the scenes, as Mr Moglen has created the FreedomBox Foundation.

This inspired me to watch another of Moglen’s talks – Freedom in the Cloud (transcript here) – an older video that inspired the Diaspora project. Whilst it didn’t shed any more light on the subject (it was slightly more vague about how a FreedomBox device would function), Moglen was certainly right that people have been all too happy to sacrifice their privacy for convenience.

This blog runs on my personal home server. If the government wants to know what information I have on it or who has been accessing it, they can get a search warrant. They would have to physically enter my home and take my computer away to get it. The logs are all stored here – not on Facebook, Twitter or anywhere else. Nobody cares more about my data than me, and the government or anyone else who wants my data will have to go through me. That’s privacy.

My wife also has the option of using the home server for hosting her blog – but she refuses. Instead, she decided to post all her blogs and photos on Yahoo Blogs.

When I asked why, she told me that she wanted to know who was visiting her website and asked if I could tell who visited my website.

“Sure I can… kinda. I can give IP addresses. I can look up what countries those IP addresses have been allocated to. Alternatively, I could potentially see people’s user-names who visited my website if somebody logged in – required if somebody wants to post something.”
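In practice that conversation is only a couple of shell commands against the server’s access log. A sketch using sample log lines in place of the real log (the country lookup assumes the GeoIP tools are installed):

```shell
# Two sample combined-format log lines standing in for the real access log:
printf '%s\n' \
  '203.0.113.7 - - [26/Jan/2011:10:00:00 +1100] "GET / HTTP/1.1" 200 512' \
  '198.51.100.2 - - [26/Jan/2011:10:05:00 +1100] "GET / HTTP/1.1" 200 512' \
  > access.log

# Unique visitor addresses, busiest first:
awk '{print $1}' access.log | sort | uniq -c | sort -rn

# Mapping an address to its allocated country needs the GeoIP tools:
#   geoiplookup 203.0.113.7
```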

My wife was not impressed. “I want to see a list of user-names for everyone” she claimed. “Simple” I replied – “only allow people to view content when they log in”. In theory they shouldn’t have any privacy concerns since they obviously already need to be logged in to visit your site at Yahoo.

“Ah – that won’t work. They are already logged in when they visit other blogs. Nobody will create a separate login just for one blog – people are too lazy and nobody will visit.”

And there you have it. Seemingly, many people who use Yahoo Blogs (and presumably Facebook) feel the same way. I personally don’t care who visits my website and don’t see why I should care. If somebody wants me to know they visited, they can drop me an e-mail or post a comment.

OpenID would solve part of the problem my wife describes – it would reduce the burden of creating a new account, but won’t eliminate additional steps. It also requires the reader to already have an OpenID account to see any benefit, and it’s just not popular enough. I just spent a few minutes clicking through my bookmarks, and I could only find one website with OpenID support – SourceForge – and even then they only support a limited number of OpenID providers.

Will the FreedomBox project fix my wife’s use-case? Most probably. One of the primary goals is “safe social networking, in which, without losing touch with any of your friends, you replace Facebook, Flickr, Twitter and other centralized services with privacy-respecting federated services”. Presumably Yahoo Blogs is popular enough that it would be included in that list.

How would the transition work though? If my wife had a FreedomBox, she would presumably be able to navigate a web interface to have it suck down the Yahoo Blogs data and host it locally. Next, her Yahoo page would add a link to her FreedomBox URL. When people visit, they would either be granted or denied access based on whether she had previously granted a particular user access. If denied, there would be an option to request said access.

However, say my wife decided to use a FreedomBox before all her Yahoo friends had one – how could she be sure that person X is Yahoo Blogs person X when granting X access? That’s where things get tricky, and it’s the part of the picture I’m not too clear on.

The only thing I could imagine working would be for person X to have an account on a third-party website that can speak whatever protocol the FreedomBox uses. Obviously this means another account but, as with Yahoo Blogs, the one sign-in would grant access to all FreedomBox blogs. Further, like OpenID providers, the third-party website in question could be hosted anywhere. Perhaps OpenID providers themselves will even provide this functionality, thereby eliminating the sign-up process for those who already have an OpenID account.

I imagine it’s going to be a hard battle, but if it picks up it has the potential to be unstoppable.

Ultimate Free Software Web Browsing Experience

I want the web to work the way it was intended, using only 100% free software. Is that so much to ask? Apparently so – and almost exclusively due to Flash.

Flash. I concluded long ago that it’s impossible to have a good web browsing experience with or without it, so you might as well protect your freedom and go without. As a GNU/Linux user, it presents so many problems. Even if using the proprietary player were acceptable, it is plagued by bugs such as the famous “Linux Flash plugin is unresponsive to mouse clicks” issue that Adobe doesn’t even seem to acknowledge. There are various workarounds, but that’s not the point. Then there’s the issue of 64-bit GNU/Linux distributions typically bundling 64-bit versions of the Firefox web browser. Too bad Adobe dropped the 64-bit Flash plugin while it was still in beta, leaving users with an unsupported version with known security vulnerabilities. [Update: Seems Adobe changed their minds again! Most people still hellbent on using Flash have already had to switch away from the unsupported beta by now, so what is the point?]

Want to know what really gets on my nerves? 99% of the Flash content I am actually interested in is just video – the stuff that Firefox and Chrome have natively supported for over a year now, via Xiph.Org‘s Ogg Theora/Vorbis formats. Heck, even Opera finally included support for these free video formats early this year. Those three browsers alone account for over 40% of web browser usage worldwide. Of course, Microsoft doesn’t want anything to do with free software, and Apple generally tries to pretend it doesn’t exist wherever convenient. Since the majority of web browser usage does not include native video support but does include the Flash plugin, for a lot of websites Flash is the easy fix. This of course forced more people to use Flash, which caused more websites to use it, which caused more people to use Flash… you get the idea. Even though Flash has been responsible for a huge number of system compromises, people feel forced to use it anyway.

The W3C recognized the need for an open standard so that anyone could play video on the web regardless of Adobe’s approval. When HTML5 was being drafted, VP8 was proposed as the video codec. Why VP8, when three of the five major browsers already had native Theora support? The answer, of course, was video quality. Everyone was whining that Theora’s picture quality wasn’t as high as H.264’s, and everyone wanted video quality to be as nice as possible. With H.264 unusable (being encumbered with patents), Google generously purchased On2 Technologies, creator of the wonderful VP8 codec, and released the codec as a truly free format. As the highest-quality open-spec, free-of-patents codec that anyone can use, this paved the way for the W3C to give it serious consideration.

Unsurprisingly, Microsoft made it clear that they would not support a free format. Period. Microsoft doesn’t need to provide a reason – given their usual attitude towards open standards, or anything that would benefit free software or competition of any kind, rejecting the proposal was a given. Historically, Microsoft deliberately doesn’t follow standards (eg. IE6, MS Office… anything really), so having the suits at Microsoft disagree was completely expected. Still, if everyone else supported the standard, and with IE’s popularity continuing to fall, this might be enough to either force Microsoft’s hand or make IE basically irrelevant – eventually.

There’s one other (somewhat) common browser – Safari. Apple’s response to VP8? Utter BS – patent FUD and bogus hardware claims. Apparently a lot of Apple hardware has dedicated H.264 decoder chips (presumably iPods, iPhones and such), which Apple seems to suggest can only ever be used for H.264. I don’t believe it. Considering how similar H.264 and VP8 actually are, you’d think a company like Apple would be able to make it work. Anyway, Apple comes out with new iWhatevers every year and provides basically no support for older devices. Last I checked (a few months back – I don’t have a link), there were various vulnerabilities in the 1st-generation iPod Touch models which Apple has no intention of fixing, and that model was only superseded by the 2nd generation just on two years ago. That’s right – if you bought your iPod Touch around two years ago, Apple wouldn’t care about you today. Given this forced rapid upgrade cycle, it should be no problem at all for Apple to get hardware support into all its devices relatively quickly – after all, we’re talking about the company that got Intel to redesign a smaller CPU to fit their MacBook Air. If Apple can boss Intel around to get chips working the way they want, they likely can with any hardware company.

As for the patent FUD, Apple claims that VP8 hasn’t been proven in court not to infringe on patents. Steve Jobs famously claims that all video codecs are covered by patents. If that actually were true – that it was impossible to create a video codec without stepping on a patent – the patents in question would surely have to be invalidated as obvious or through prior art. Either way, Apple is talking trash. The real reason for rejecting VP8 is surely the same as Microsoft’s – to keep themselves off a level playing field with their most direct browser competitors. Mozilla, Google and Opera won’t pay for MPEG-LA patent licenses on a per-user basis, since the browsers can be copied and distributed to anyone without charge – and there would be no way to track the licenses anyway. Even if (for example) the Mozilla Foundation did find a way to overcome these obstacles, what of the projects that fork Mozilla? Mozilla is free software. If all derivatives weren’t covered, Firefox wouldn’t be free any more. If they were covered, no project would ever have to pay the MPEG-LA again, since they could just borrow the Mozilla code – a licensing deal the MPEG-LA would never agree to. Clearly, the future of video on the web cannot possibly depend on paying patent licenses.

So where does this leave us? I predict that if HTML5 does not specify a format to use for the video tag, we’ll continue to see Flash dominate as the preferred video decoding option by website owners for many years into the future. Couldn’t we just dump Flash already and have the Microsoft fanboys install the Xiph.Org Directshow Filters package (which apparently comes with support for IE’s <video> tag)? That could work in a lot of cases, however if it really took off you could be sure that Microsoft would find a way to “accidentally” break the plugin somehow. It wouldn’t be the first time. I recall Microsoft IE 5.5 beta (if I’m remembering my version numbers correctly) would prevent executable files named hotdog.exe from working. This kind of file name was commonly used for Sausage Software’s HotDog HTML editor installation program – direct competition to Microsoft FrontPage back in the day. Rename the file to setup.exe and you were in business – not easy to do when the file came on a CD. Microsoft could potentially just argue that the incompatibility was only in its beta software, but web developers would likely have installed it.

Getting back on track… <cough>… if the future of web video is in Flash, what can we do about it? How can we play our video using 100% free software? We’re not out of options yet. Adobe has announced that upcoming versions of Flash will support VP8! How does that help us? If webmasters want to reach as close to 100% of their audience as possible right now, H.264 is the best option. As much as I hate it, H.264 can be played back via Flash on 90+% of desktops, and encoding in a second format to reach users without Flash might not be cost-effective once time and storage costs are considered. However, once Flash supports VP8, everyone can adopt that format and not need to worry about encoding in H.264 as well. People without Flash but using Firefox, Chrome or Opera can gracefully fall back to watching the video natively. That way, website video will work on all free-software-only desktops. Coverage can be further improved by updating Cortado, the free software Java applet video player, to add WebM support. That combination would likely get us as close to 100% compatibility as reasonably possible using only a single codec.
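That whole fallback chain can live in a single piece of markup. A sketch – file names and the Flash player path are hypothetical, and it assumes the WebM-capable Cortado wished for above – showing the nesting browsers walk down until something works:

```html
<video controls width="640" height="360">
  <!-- Native playback: Firefox, Chrome and Opera pick the WebM source -->
  <source src="video.webm" type="video/webm" />
  <!-- Browsers without native support fall through to Flash... -->
  <object type="application/x-shockwave-flash" data="player.swf"
          width="640" height="360">
    <param name="flashvars" value="file=video.webm" />
    <!-- ...and finally to the Cortado Java applet -->
    <applet code="com.fluendo.player.Cortado.class" archive="cortado.jar"
            width="640" height="360">
      <param name="url" value="video.webm" />
    </applet>
  </object>
</video>
```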

There are some reasons why this could fail. Perhaps the percentage of IE users without Flash, Java or the Directshow Filters installed (but who can play video natively thanks to IE9 or later) will turn out to be larger than the number of GNU/Linux desktop users – I expect this is very unlikely. However, if H.264 remains the only option for iPhone-style devices, that might help tilt the scales in H.264’s favor. Another problem is that a lot of video recording devices, such as webcams and some digital camcorders, record to H.264 natively, and it might be more efficient for the website maintainer to keep video in that format (even if heavy edits are required). Fortunately most web videos are so short that transcoding times probably won’t matter… but it’s a minor concern.

But what about playback today using entirely free software? Flash sucks on GNU/Linux! Enter Gnash and FFmpeg. The latest Gnash release (0.8.8 at the time of writing) works with YouTube 99% as well as Flash does on Windows. Other video sites… not so much. In particular, I still have problems with Gnash when I try to play AVGN and Zero Punctuation – but I have a solution for these as well: the gecko-mediaplayer plugin with Greasemonkey. Once those are installed, grab the Greasemonkey scripts Download Escapist Videos and Gametrailers No Flash. You’ll also want to install rtmpdump. With all those installed, when you want to check out Zero Punctuation simply click the Download link that now appears under the video. Gecko MediaPlayer will kick in and give you video that takes up the entire browser window. As for AVGN, I discovered that GameTrailers hosts all the ScrewAttack content, which includes many of the AVGN videos. Simply head on over to the ScrewAttack section – the latest video should be towards the top. Note that you have to sign in for the script to work, but basically it just takes the Download link and streams it to Gecko MediaPlayer, which gets embedded in the area where Flash normally resides. It works perfectly.
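Under the hood, those Greasemonkey scripts are essentially handing a stream URL to rtmpdump. Done by hand it looks something like this (the stream URL here is hypothetical):

```
# Rip an RTMP stream to a local file, then play it with free software:
rtmpdump -r "rtmp://media.example.com/video/episode" -o episode.flv
mplayer episode.flv
```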

So there are a lot of hacks involved. Gnash is buggy, and FFmpeg might have patent issues depending on the codec and your country. The AVGN solution involved finding an alternative (but still non-free, possibly patent-infringing) format. Lastly, the Zero Punctuation hack basically involved a stream ripper, Gecko MediaPlayer and (probably) FFmpeg too. This is ugly as hell, but it works – and the first time it does, it’s a wonderful feeling. Unfortunately, if you want native WebM in Firefox you need to upgrade to the Firefox 4 beta, and today’s Greasemonkey checkout still has serious compatibility issues (although it’s being actively worked on). When Greasemonkey works natively in Firefox 4 and both projects release stable builds (expected this year), things will be looking very nice… and I imagine Gnash will get better in the meantime. YouTube is also testing encoding videos in the WebM format, so hopefully they keep that up and encourage other video sites to follow suit. All systems are go!

What an awesome week for freedom

On Wednesday, Richard Stallman gave a speech on Free Software in Ethics and in Practice at Melbourne University. On Thursday, he gave a talk on Copyright vs Community in the Age of Computer Networks at RMIT. I had the pleasure of attending both, and of asking Richard a few quick questions regarding Android phones, IaaS and the FreedomBox project.

Today, I just got back from the Melbourne Software Freedom Day event at the State Library. I was pleasantly surprised by the number of volunteers in orange t-shirts. Whilst many of the talks were focused specifically on open source as opposed to free software (such as the RedHat talk), all talks I elected to attend were very interesting and occasionally even exciting.

Well done to everyone who helped organise and run the event, and special thanks to the sponsors: the Victorian Government, the State Library of Victoria, LUV and Linux Australia. Very much looking forward to next year.