Category Archives: Freedom

Why I will not back FSF’s guidelines for free software distributions

The FSF publishes a document describing guidelines for free software distributions on gnu.org, as well as a list of distributions known to comply with these guidelines. In light of popular distributions that are increasingly including and recommending non-free software, these guidelines and distributions are a breath of fresh air to many – but they too are not without their problems.

From the guidelines, “any nonfree firmware needs to be removed from a free system”. The purpose of such firmware is to allow the target hardware device to function, so essentially distributions like Trisquel GNU/Linux feel it is fine to disable parts of a computer if those parts cannot be used in a completely free way. I have no complaint about this per se, but the way it is implemented in practice makes these distribution maintainers come off as hypocrites, and reduces these distributions to little more than a marketing ploy that misleads users. To understand why, I need to explain a bit more about what exactly the FSF means when they refer to “firmware”, and why in many cases it’s a non-issue.

When the FSF talks about firmware, they are using it in a way that is inclusive of the term “microcode”. This is important, because proprietary microcode is everywhere and difficult to avoid. Even so-called “freedom-compatible” hardware frequently includes it.

If you are running an x86 processor released in the last 10 years or so, your CPU likely supports microcode runtime updates from within the operating system. If you run the Debian Wheezy GNU/Linux distribution on an Intel CPU and have the intel-microcode package (from the non-free archive) installed, the latest proprietary Intel microcode will automatically be loaded into your CPU at boot (if the packaged version is newer than what is already running).

So what happens if you don’t have this package installed? The answer is that your computer’s BIOS already includes CPU microcode, which it injects into your CPU every time you turn your PC on. This happens before your operating system (or even its bootloader) has started to load. If you did not load microcode updates from your operating system, you would need to rely on flashing BIOS updates to deliver your CPU microcode updates. Either way, like it or not, you’re going to run Intel or AMD microcode at boot. It’s just a question of running the latest version with microcode fixes, or an older one.
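As an aside, it is easy to see which microcode revision your CPU is actually running right now. The short Python sketch below simply parses /proc/cpuinfo, which on x86 Linux kernels reports a microcode field per logical CPU regardless of whether the revision came from the BIOS or from a runtime update package; the script and its output format are my own illustration, not anything shipped by Debian or the FSF.

```python
#!/usr/bin/env python3
"""Report the microcode revision currently loaded on each logical CPU.

A minimal sketch: on x86 Linux kernels the 'microcode' field in
/proc/cpuinfo reports the revision the CPU is running, whether it
came from the BIOS or from a runtime update package.
"""

def microcode_revisions(path="/proc/cpuinfo"):
    revisions = {}
    processor = None
    with open(path) as cpuinfo:
        for line in cpuinfo:
            if ":" not in line:
                continue
            key, value = (part.strip() for part in line.split(":", 1))
            if key == "processor":
                processor = value
            elif key == "microcode" and processor is not None:
                revisions[processor] = value
    return revisions

if __name__ == "__main__":
    for cpu, revision in sorted(microcode_revisions().items(), key=lambda x: int(x[0])):
        print(f"CPU {cpu}: microcode revision {revision}")
```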

Here is the beginning of why the argument for fully free software distributions (for the x86 architecture at least) falls flat on its face. These distributions might be 100% free software, and give you the illusion of having a computer that is fully free, but in practice removing this microcode has achieved very little – if anything at all.

CPUs aren’t the only devices you’ll find in modern PCs that require microcode. Enter the subject of graphics cards. This is where my main gripe with these distributions comes in. Modern AMD graphics cards, like the CPUs discussed above, require microcode to function properly. Unlike CPUs however, AMD graphics cards need drivers to load this microcode into the GPU at boot – the BIOS will not do it.

AMD has helped the free software community create some great free software drivers. They have released all the specifications, and assisted in the development of code. Nvidia, by comparison, seldom plays ball with free software developers and (for x86-based graphics card drivers at least) has basically been no help at all. If you’re in the market for a high-end graphics card from one of these vendors, AMD would seem the logical choice – support the guys who support free software the most, right? No! Not according to the FSF!

Generators for Nvidia microcode have been created, but not for Radeon microcode. This is likely just a matter of necessity: Nouveau (the free software project that has reverse engineered Nvidia graphics card drivers) was probably not able to redistribute the existing proprietary microcode due to licensing. AMD, on the other hand, has allowed Radeon microcode to be distributed “as is” (basically do whatever you want with it [Edit: Sadly I was mistaken – you can basically redistribute as you like, but “No reverse engineering, decompilation, or disassembly of this Software is permitted.”]), but did not release the means to recreate the (21K or less in size) microcode file, so there was little incentive for developers to replace it – they would rather work on actually getting the drivers working properly than dedicate time to what appears to amount to (in this case at least) a purely philosophical exercise.

Now I admit, I don’t like that I need to run my AMD graphics hardware with proprietary microcode (even if they do have excellent free software drivers). Distribution maintainers have two options:

1. Allow the user to install microcode (possibly provided by the user, so the project does not need to redistribute it) and end up with a working, otherwise completely free software operating system

or

2. Don’t make it easy for the user to get his/her hardware working, forcing them to install a different distribution that may respect software freedom far less

Although option one would seem more logical at a glance, we have already established that distribution maintainers wishing to comply with the FSF guidelines for free software distributions will need to go with option two.

Now that all the discussion of firmware and microcode is out of the way, I have paved the way to explain what really makes me mad in all of this.

From the above, we can conclude that free software distributions do not want us to run hardware that requires non-free binary blobs of any kind – no matter how small the blob or how important the hardware may be. Now have a look at, say, the download page for Trisquel. Trisquel apparently supports 32-bit or 64-bit PCs (ie. x86 architecture, ie. AMD and Intel CPUs, ie. CPUs that require proprietary microcode to function). Where are the download links for people that have RISC CPUs that don’t require proprietary microcode (eg. MIPS, like the Loongson processors used in the Lemote netbook that RMS uses)? No, Trisquel doesn’t really make any effort or seem to care about you running a 100% free software computer. To do so would mean dropping support for one of their main sponsors, Think Penguin computers, which only ships Intel x86 PCs!

If the free software guidelines were serious about avoiding non-free blobs, they should be blacklisting hardware known to disrespect user freedom by mandating blobs – regardless of how the blobs get installed – and should probably be dropping x86 architecture support. Alternatively they could go the other way and allow non-free blobs, provided they are stripped to the absolute minimum required to get hardware actually working, so end users gain the maximum possible free software experience from their hardware. Of course they won’t do either of these things. Neither providing a completely free software computing experience nor having things work correctly for end users is their primary goal; it’s all about marketing.

StatusNet now a part of System Saviour

Last week, the FSF dented about a MediaGoblin fund-raiser. Shortly after, Ben sent an email out to the FSM mail list indicating that he had used the service in the past and found himself donating. A couple of days later, an FSF e-mail hit my inbox pressuring me some more.

The funny thing is that whilst I’ve heard of the project, I don’t fully understand how it works or why I would use it. After all, if it’s just for sharing images I would either add them within WordPress, or otherwise simply scp them to a directory on my server and link to them as required. This workflow works fine from my N900 as well, although clearly posting images online is not a service I have much demand for. Heck, not a week goes by that I don’t use elinks for something.
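For what it’s worth, the scp-and-link workflow is trivial to script. Here is a rough Python sketch of it; the hostname, remote directory and base URL are hypothetical placeholders rather than my actual server layout.

```python
#!/usr/bin/env python3
"""Copy an image to a web-accessible directory and print the resulting URL.

A rough sketch of the scp-and-link workflow described above. The host,
remote directory and base URL are hypothetical placeholders; adjust them
for your own server.
"""
import subprocess
import sys
from pathlib import Path

REMOTE_HOST = "example.systemsaviour.com"              # hypothetical hostname
REMOTE_DIR = "/var/www/images"                         # hypothetical web root subdirectory
BASE_URL = "http://example.systemsaviour.com/images"   # hypothetical public URL

def publish(image_path):
    name = Path(image_path).name
    # Copy the file to the server, then report where it can be linked from.
    subprocess.run(["scp", image_path, f"{REMOTE_HOST}:{REMOTE_DIR}/{name}"], check=True)
    return f"{BASE_URL}/{name}"

if __name__ == "__main__":
    for image in sys.argv[1:]:
        print(publish(image))
```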

Perhaps I’m not the target audience, but I’m probably also misunderstanding what MediaGoblin is all about. How does it compare to, say, ownCloud? The best way to understand it is to take it for a spin. Let’s take a look at the documentation… they compare it to Identi.ca and Libre.fm right off the bat. Wait a second… I use Identi.ca a lot, but I’m not running it on my own hardware right now. Despite this, I’m going to deploy some Goblin I don’t really understand to my server? Time to change priorities.

What followed was me spending the rest of the day re-organising my DomU machines, web server configurations and finally installing my own StatusNet micro-blog at http://micro.systemsaviour.com/.

So far I haven’t customised my install too much. I haven’t even replaced the Status.Net heading with the site name, but I can do all that in good time. As my usage of Identi.ca was previously almost exclusively limited to other Identi.ca accounts, I had not until now had a good chance to see for myself how well the federation features worked. While not perfect (eg. no direct messaging functionality, documented bugs that sometimes prevent messages to groups from appearing, etc.), I think it will live up to my expectations and be sufficiently useful for me to want to make the switch away from my boltronics@identi.ca account.

As for MediaGoblin, I’ll have to look at that again another weekend to see if I can figure out how it might be useful. As for Libre.FM, I don’t think I’ll be hosting my own GNU FM server any time soon, given it doesn’t currently appear to have federation capabilities, which would pretty much restrict its usefulness to scrobbling (which I don’t really care much for anyway). I have also decided that I want to run my own Gitorious install sooner rather than later. Too much cool tech… arrggh!!

October 28th 2012 update:
As expected, I have since spent some time messing around with MediaGoblin. The results are visible from the Images menu button above. I have yet to create a custom theme, and do not have registrations enabled – with no plans to do so; at least not until the software matures.

Giving up fglrx in Debian Wheezy

The title says it all. A recent update has once again stopped fglrx direct rendering from working with Xorg, so I’ve decided to switch over to the free software Gallium driver entirely. This means no Amnesia, but I’ve since finished that game. It probably goes without saying that CrossFire won’t work now either, so… I would like to say that three of my GPUs are just doing nothing, but there are still power management issues with the radeon driver, so the fans are sending my wife and me deaf while my cards cook at around 80-90 degrees, heating up my apartment noticeably – an annoyance since we’re heading towards the middle of summer here. It also means no OpenCL support, since the AMD APP SDK depends on fglrx, although fortunately I haven’t been using that lately either.

The uninstallation of fglrx did not go smoothly. There have been times since I first performed my current desktop OS install where I manually ran the installer downloaded from AMD’s website, which spread files all over the place. These had to be cleaned up. The following two links were the most useful I came across for dealing with this problem:

However, the final issue I had was documented in neither of those. The AMD installer had created a file on my system, /etc/profile.d/ati-fglrx.sh, which set an environment variable that caused direct rendering to fail ($LIBGL_DRIVERS_PATH IIRC). Removing that file and logging out and in again got everything back to normal… well, “normal” as described above. :/
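If anyone else hits this, here is a minimal Python sketch of the sanity check I effectively did by hand: look for stray LIBGL_* overrides in the environment, then ask glxinfo (from mesa-utils) whether direct rendering is active. It assumes glxinfo is installed and is only an illustration of the check, not a tool from AMD or Debian.

```python
#!/usr/bin/env python3
"""Check for leftover libGL environment overrides and direct rendering status.

A minimal sketch: it looks for libGL-related environment variables (such as
LIBGL_DRIVERS_PATH) that a stray /etc/profile.d script might still be setting,
then asks glxinfo whether direct rendering is active. Assumes glxinfo
(from mesa-utils) is installed.
"""
import os
import subprocess

def leftover_overrides():
    # Any LIBGL_* variable can redirect Mesa away from its normal driver path.
    return {k: v for k, v in os.environ.items() if k.startswith("LIBGL_")}

def direct_rendering():
    output = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
    for line in output.splitlines():
        if line.lower().startswith("direct rendering"):
            return line.strip()
    return "direct rendering status not reported"

if __name__ == "__main__":
    overrides = leftover_overrides()
    if overrides:
        print("Suspicious libGL environment overrides:")
        for key, value in overrides.items():
            print(f"  {key}={value}")
    else:
        print("No libGL environment overrides found.")
    print(direct_rendering())
```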

I’m still keeping fglrx on my laptop though (which I haven’t updated in a while)… for now. I don’t want my laptop to run into the same power management issues leading up to Linux.conf.au 2012.

Here’s something I’ll be taking away from this experience. Proprietary software might sometimes be better than free software, but generally there can be no expectation of it becoming any better in the future than it is today. In the future it may become incompatible, may place new restrictions upon you, may not support new formats, may force you to upgrade (sometimes at cost) to continue functioning properly, etc. The issue I experienced in this post was the first of these. With free software however, I can generally expect that the software I have today will never become worse over time – that is, it only gets better. Even in cases where ‘better’ is debatable (eg. GNOME 3), it can be (and often is) forked by anyone. That’s one of the reasons I love it.

To show my support of free software and software freedom, I have finally done something I feel guilty for not having done a long time ago – I have become an associate member of the Free Software Foundation.

Birth of the FreedomBox Foundation

Eben Moglen’s FreedomBox idea has had my attention ever since his “Silver lining in the cloud” speech in August last year. Unfortunately I hadn’t noticed any visible progress on the project – until today. It looks like things have indeed been going on behind the scenes, as Mr Moglen has created the FreedomBox Foundation.

This inspired me to watch another of Moglen’s talks – Freedom in the Cloud (transcript here) – an older video that inspired the Diaspora project. Whilst it didn’t shed any more light on the subject (it was slightly more vague about how a FreedomBox device would function), Moglen was certainly right that people have been all too happy to sacrifice their privacy for convenience.

This blog runs on my personal home server. If the government wants to know what information I have on it or who has been accessing it, they can get a search warrant. They would have to physically enter my home and take my computer away to get it. The logs are all stored here – not on Facebook, Twitter or anywhere else. Nobody cares more about my data than me, and the government or anyone else who wants my data will have to go through me. That’s privacy.

My wife also has the option of using the home server for hosting her blog – but she refuses. Instead, she decided to post all her blogs and photos on Yahoo Blogs.

When I asked why, she told me that she wanted to know who was visiting her website and asked if I could tell who visited my website.

“Sure I can… kinda. I can give IP addresses. I can look up what countries those IP addresses have been allocated to. Alternatively, I could potentially see the user-names of people who visited my website if they logged in – which is required if they want to post something.”

My wife was not impressed. “I want to see a list of user-names for everyone,” she said. “Simple,” I replied – “only allow people to view content when they log in.” In theory they shouldn’t have any privacy concerns, since they obviously already need to be logged in to visit your site at Yahoo.

“Ah – that won’t work. They are already logged in when they visit other blogs. Nobody will create a separate login just for one blog – people are too lazy and nobody will visit.”

And there you have it. Seemingly, many people who use Yahoo Blogs (and presumably Facebook) feel the same way. I personally don’t care who visits my website and don’t see why I should care. If somebody wants me to know they visited, they can drop me an e-mail or post a comment.

OpenID would solve part of the problem my wife describes – it would reduce the burden of creating a new account, but it wouldn’t eliminate the additional steps. It also requires the reader to already have an OpenID account to see any benefit, and it’s just not popular enough. I just spent a few minutes clicking through my bookmarks, and I could only find one website with OpenID support – SourceForge – and even then they only support a limited number of OpenID providers.

Will the FreedomBox project fix my wife’s use-case scenario? Most probably. One of the primary goals is “safe social networking, in which, without losing touch with any of your friends, you replace Facebook, Flickr, Twitter and other centralized services with privacy-respecting federated services”. Yahoo Blogs is probably popular enough that it would be included in that list.

How would the transition work though? If my wife had a FreedomBox, she would presumably be able to navigate a web interface to have it suck down the Yahoo Blogs data and host it locally. Next, her Yahoo page would add a link to her FreedomBox URL. When people visit, they would either be granted or denied access based on whether she had previously granted a particular user access. If denied, there would be an option to request said access.

However, say my wife decided to use a FreedomBox prior to all her Yahoo friends having one – how could she be sure that person X really is Yahoo Blogs person X before granting X access? That’s where things get tricky, and it’s the part of the picture I’m not too clear on.

The only thing I could imagine working would be for person X to have an account on a third-party website that can talk whatever protocol the FreedomBox uses. Obviously this means another account, but as with Yahoo Blogs, the one sign-in would grant access to all the FreedomBox blogs. Further, like OpenID providers, the third-party website in question could be hosted anywhere. Perhaps OpenID providers themselves will even provide this functionality, thereby eliminating the sign-up process for those who already have an OpenID account.

I imagine it’s going to be a hard battle, but if it picks up it has the potential to be unstoppable.

Ultimate Free Software Web Browsing Experience

I want the web to work the way it was intended, using only 100% free software. Is that so much to ask? Apparently so – and almost exclusively due to Flash.

Flash. I concluded long ago that it’s impossible to have a good web browsing experience with or without it, so you might as well protect your freedom and go without. As a GNU/Linux user, it presents so many problems. Even if using the proprietary player were acceptable, it is plagued by bugs such as the famous “Linux Flash plugin is unresponsive to mouse clicks” issue that Adobe doesn’t even seem to acknowledge. There are various workarounds, but that’s not the point. Then there’s the issue of 64-bit GNU/Linux distributions typically bundling 64-bit versions of the Firefox web browser. Too bad Adobe dropped support for the 64-bit Flash plugin while it was still in beta, leaving users with an unsupported version with known security vulnerabilities. [Update: Seems Adobe changed their minds again! Most people still hellbent on using Flash have already had to switch away from the unsupported beta by now, so what is the point?]

Want to know what really gets on my nerves? 99% of the Flash content that I am actually interested in is just video. The stuff that Firefox and Chrome have included native support for since over a year ago, via Xiph.Org’s Ogg Theora/Vorbis formats. Heck, even Opera finally included support for these free video formats early this year. Those three browsers alone account for over 40% of web browser usage worldwide. Of course, Microsoft doesn’t want anything to do with free software, and Apple generally tries to pretend it doesn’t exist wherever convenient. Since the majority of web browser usage does not include native video support but does include the Flash plugin, for a lot of websites Flash is the easy fix. This of course forced more people to use Flash, which caused more websites to use it, which caused more people to use Flash… you get the idea. Even though Flash has been responsible for a huge number of system compromises, people feel forced to use it anyway.

The W3C recognized the need for an open standard so that anyone could play video on the web regardless of Adobe’s approval. When HTML 5 was being drafted, VP8 was proposed as the video codec. Why VP8, when three of the five major browsers had native Theora support already? The answer, of course, was video quality. Everyone was whining that Theora’s picture quality wasn’t as high as H.264’s, and everyone wanted video quality to be as nice as possible. With H.264 being unusable (encumbered as it is with patents), Google generously purchased On2 Technologies, creators of the wonderful VP8 codec, and released the codec as a truly free format. As the highest quality open-spec, free-of-patents codec that anyone can use, this paved the way for the W3C to give it serious consideration.

Unsurprisingly, Microsoft made it clear that they would not support a free format. Period. Microsoft doesn’t need to provide a reason – given their normal attitude towards open standards or anything that would benefit free software or competition of any kind, rejecting the proposal was a given. Historically Microsoft deliberately doesn’t follow standards (eg. IE6, MS Office… anything really), so having the suits at Microsoft disagree with it was completely expected. Still, if everyone else supported the standard, and with IE’s popularity continuing to fall, this might be enough to either force Microsoft’s hand or make IE basically irrelevant – eventually.

There’s one other (somewhat) common browser – Safari. Apple’s response to VP8? Utter BS – patent FUD and bogus hardware claims. Apparently a lot of Apple hardware has dedicated H.264 decoder chips (presumably iPods, iPhones and such), which Apple seems to suggest can only be used for H.264. I don’t believe it. Considering how similar H.264 and VP8 actually are, you’d think a company like Apple would be able to make it work. Anyway, Apple comes out with new iWhatevers every year, and provides basically no support for older devices. Last I checked (a few months back – I don’t have a link), there were various vulnerabilities in the 1st generation iPod Touch models which Apple has no intention of fixing. It was only superseded by the 2nd generation just on 2 years ago. That’s right – if you bought your iPod Touch around 2 years ago, Apple wouldn’t care about you today. Given this forced rapid upgrade cycle, it should be no problem at all for Apple to get hardware support into all its devices relatively quickly – after all, we’re talking about the company that got Intel to redesign a smaller CPU to fit the MacBook Air. If Apple can boss Intel around to get chips working the way they want, they can likely do the same with any hardware company.

As for the patent FUD, Apple claims that H.264 hasn’t been proven not to infringe on patents in court. Steve Jobs famously claims that all video codecs are covered by patents. If this actually were true – that it was impossible to create a video codec without stepping on a patent – the patents in question would surely have to be invalidated as obvious or through prior art. Either way, Apple is talking trash. The real reason for rejecting VP8 is surely the same as Microsoft’s – to avoid being on a level playing field with their most direct browser competitors. Mozilla, Google and Opera won’t pay for MPEG-LA patent licenses on a per-user basis, since the browsers can be copied and distributed to anyone without charge – and there would be no way to track the licenses anyway. Even if (for example) the Mozilla Foundation did find a way to overcome these obstacles, what of projects that fork Mozilla? Mozilla is free software. If all derivatives weren’t covered, Firefox wouldn’t be free anymore. If they were covered, no project would ever have to pay the MPEG-LA again, since they could just opt to borrow the Mozilla code – a licensing deal the MPEG-LA would never agree to. Clearly, the future of video on the web cannot possibly depend on paying patent licenses.

So where does this leave us? I predict that if HTML5 does not specify a format to use for the video tag, we’ll continue to see Flash dominate as the preferred video decoding option by website owners for many years into the future. Couldn’t we just dump Flash already and have the Microsoft fanboys install the Xiph.Org Directshow Filters package (which apparently comes with support for IE’s <video> tag)? That could work in a lot of cases, however if it really took off you could be sure that Microsoft would find a way to “accidentally” break the plugin somehow. It wouldn’t be the first time. I recall Microsoft IE 5.5 beta (if I’m remembering my version numbers correctly) would prevent executable files named hotdog.exe from working. This kind of file name was commonly used for Sausage Software’s HotDog HTML editor installation program – direct competition to Microsoft FrontPage back in the day. Rename the file to setup.exe and you were in business – not easy to do when the file came on a CD. Microsoft could potentially just argue that the incompatibility was only in its beta software, but web developers would likely have installed it.

Getting back on track… <cough>… if the future of web video is in Flash, what can we do about it? How can we play our video using 100% free software? We’re not out of options yet. Adobe has announced that upcoming versions of Flash will support VP8! How does that help us? If webmasters want to reach as close to 100% of their audience as possible right now, H.264 is the best option. As much as I hate it, H.264 can be played back via Flash on 90+% of desktops. Encoding in a second format to reach users that don’t have Flash installed might not be cost effective once time and storage costs are considered. However, when Flash supports VP8, everyone can adopt that format and not need to worry about encoding in H.264 as well. People without Flash but using Firefox, Chrome or Opera can gracefully fall back to watching the video natively. That way, the website’s video will work on all free-software-only desktops. The numbers can be improved still further by updating Cortado, the free software Java applet video player, to add WebM support. This combination would likely get us as close to 100% compatibility as reasonably possible using only a single codec.
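To make the fallback chain concrete, here is a rough sketch (as a small Python helper that just prints the markup) of how a single WebM file could be offered: native <video> playback first, then the Cortado applet for browsers without it, then a plain download link. The file names are placeholders, and the Cortado applet class and parameter names are written from memory, so check them against the Cortado documentation; note too that the applet only helps here once it actually gains WebM support.

```python
#!/usr/bin/env python3
"""Emit an HTML5 <video> tag with graceful fallbacks, newest to oldest.

A rough sketch of the single-codec fallback chain described above:
native WebM playback first, then the Cortado Java applet, then a plain
download link. File names are placeholders, and the Cortado applet
class/parameter names are from memory; verify them before relying on this.
"""

def video_markup(webm_url, width=640, height=360):
    return f"""\
<video src="{webm_url}" width="{width}" height="{height}" controls>
  <!-- Browsers without <video> support fall through to the applet. -->
  <applet code="com.fluendo.player.Cortado.class" archive="cortado.jar"
          width="{width}" height="{height}">
    <param name="url" value="{webm_url}"/>
    <!-- No <video> and no Java: offer a plain download link. -->
    <a href="{webm_url}">Download the video</a>
  </applet>
</video>"""

if __name__ == "__main__":
    print(video_markup("http://example.com/videos/clip.webm"))
```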

There are some reasons why this could fail. Perhaps the percentage of IE users who don’t have Flash, Java or the Directshow Filters package installed (but can play video natively thanks to having IE9 or later) will turn out to be larger than the number of GNU/Linux desktop users. I expect this to be very unlikely. However, if H.264 remains the only option for iPhone-style devices, that might help tilt the scales in H.264’s favor. Another problem is that a lot of video recording devices, such as webcams and some digital camcorders, record to H.264 natively. It might be more efficient for the website maintainer to keep video in that format (even if heavy edits are required). Fortunately most web videos are so short that transcoding times probably won’t matter… but it’s a minor concern.

But what about playback today using entirely free software? Flash sucks on GNU/Linux! Enter Gnash and FFmpeg. The latest Gnash version (0.8.8 at the time of writing) works with YouTube 99% as well as Flash does on Windows. Other video sites… not so much. In particular, I still have problems with Gnash when I try to play AVGN and Zero Punctuation – but I have a solution for these as well: the gecko-mediaplayer plugin with Greasemonkey. Once those are installed, grab the Greasemonkey scripts Download Escapist Videos and Gametrailers No Flash. You will also want to install rtmpdump. With all of those installed, when you want to check out Zero Punctuation simply click the Download link that now appears under the video. Gecko MediaPlayer will kick in and give you video that takes up the entire browser window. As for AVGN, I discovered that GameTrailers hosts all the ScrewAttack content, which includes many of the AVGN videos. Simply head on over to the ScrewAttack section – the latest video should be towards the top. Note that you have to sign in for the script to work, but basically it just takes the Download link and streams it to Gecko MediaPlayer, which gets embedded in the area where Flash normally resides. It works perfectly.
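Under the hood, the trick those Greasemonkey scripts enable boils down to ripping the RTMP stream and handing it to a real media player instead of Flash. The Python sketch below illustrates that idea with rtmpdump piped into mplayer; the stream URL is a made-up placeholder (the scripts are what extract the real one from the page), and it assumes rtmpdump and mplayer are installed.

```python
#!/usr/bin/env python3
"""Rip an RTMP stream and pipe it straight into a local player.

A rough illustration of what the Greasemonkey + rtmpdump hack boils down
to: fetch the stream with rtmpdump and hand it to a media player instead
of the Flash plugin. The RTMP URL below is a hypothetical placeholder.
Assumes rtmpdump and mplayer are installed.
"""
import subprocess

STREAM_URL = "rtmp://example.gametrailers.com/videos/episode.flv"  # placeholder

def play_stream(url):
    # rtmpdump writes the ripped stream to stdout (-o -); mplayer reads it from stdin.
    ripper = subprocess.Popen(["rtmpdump", "-r", url, "-o", "-"],
                              stdout=subprocess.PIPE)
    player = subprocess.Popen(["mplayer", "-"], stdin=ripper.stdout)
    ripper.stdout.close()  # let rtmpdump receive SIGPIPE if mplayer exits early
    player.wait()

if __name__ == "__main__":
    play_stream(STREAM_URL)
```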

So there are a lot of hacks involved. Gnash is buggy, and FFmpeg might have patent issues depending on the codec and your country. The AVGN solution involved finding an alternative (but still non-free, possibly patent-infringing) format. Lastly, the Zero Punctuation hack basically involved a stream ripper, Gecko MediaPlayer and (probably) FFmpeg too. This is ugly as hell, but it works. When it all works for the first time, it’s a wonderful feeling. Unfortunately, if you want native WebM in Firefox you need to upgrade to the Firefox 4 beta, and today’s Greasemonkey checkout still has serious compatibility issues (although it’s being actively worked on). When Greasemonkey works natively in Firefox 4 and both projects release a stable build (expected this year), things will be looking very nice… and I imagine Gnash will get better in the meantime. YouTube is also testing encoding videos to WebM format, so hopefully they keep that up and encourage other video sites to follow suit. All systems are go!