23 October 2006

GPLv3: What Linus, Alan, Greg, Andrew and Dave Said

Few subjects in the world of free software have provoked as much discussion as the new GNU GPLv3 licence. Mostly it's outsiders (like me) sounding off about this, but what do the people involved really think?

I decided to ask them, and write up the result for the Guardian. I was expecting a couple of lines back to my emailed questions if I was lucky, but badly underestimated hacker generosity. Linus and his mates sent me back long and typically thoughtful replies, while RMS wanted to talk about it at length.

Since I was only able to use a tiny fraction of this material in the Guardian article that resulted, I thought it might be a useful contribution to the GPLv3 debate to post it here (with the permission of those concerned).

For length reasons, I've split it up into two postings. Below are the replies of the kernel coders, which I received on 3 October, 2006 (placed in the order in which their authors appear in the recent GPLv3 poll). The interview with RMS can be found above.

Linus Torvalds

I don't think there will necessarily be a lot of _practical_ fallout from it, so in that sense it probably doesn't matter all that much. It's not like we haven't had license "discussions" before (the whole BSD vs GPL flame-war seemed to go on for years back in the early nineties). And in many ways, it's not like the actual split between the "Open Source" and the "Free Software" mentality is in any way new, or even brought about by the GPLv3 license.

So while I think there is still an (admittedly pretty remote) chance of some kind of agreement, I don't think that it's a disaster if we end up with a GPLv2 and a new and incompatible GPLv3. It's not like we haven't had licenses before either, and most of them haven't been compatible.

In some ways, I can even hope that it clears the air for all the stupid tensions to just admit that there are differences of opinion, and that the FSF might even just stop using the name "GNU/Linux", finally admitting that Linux never was a GNU project in the first place.

The real downside, I suspect, is just the confusion caused by yet another incompatible license - and one that shares the same name (licenses such as the OSL and the GPL were both open source licenses and incompatible with each other, but at least they had clear differentiation in their names).

And there's bound to be some productivity loss from all the inevitable arguments, although in all honesty, it's not like open source developers don't spend a lot of time arguing _anyway_, so maybe that won't be all that big of a factor - just a shift of area rather than any actual new lost time ;)

One of the reasons the thing gets so heated is that people (very much me included) feel very strongly about their licenses. It's more than just a legal paper; it's deeply associated with what people have been working on for, in some cases, decades. So logically I don't think the disagreement really matters a whole lot, but a lot of it is about being very personally attached to some license choice.

Alan Cox

On Tue, 2006-10-03 at 13:57 +0100, glyn moody wrote:
> Since it seems likely that the kernel will remain under v2, while the
> rest of GNU goes for v3, I was wondering whether you think this is
> going to cause you and others practical problems in your work on the
> kernel. What about companies and end-users of GNU/Linux: will there


There is no such thing as GNU/Linux. For an article like this it's really important to understand and clarify that (and from the US view also as a trademark matter).

I mean there is not even an abstract entity that is properly called "GNU/Linux". It's a bit of spin-doctoring by the FSF to try and link themselves to Linux. Normally it's just one of those things they do and people sigh about, but when you look at the licensing debate the distinction is vital. (It's also increasingly true that FSF-owned code is a minority part of Linux.)

Linux is not and never has been an FSF project. I would say the majority of the kernel developers don't buy the FSF political agenda. Linus likewise chose the license for the pragmatic reason it was a good license for the OS, not because he supported the GNU manifesto.

Thus this isn't about the Linux people splitting from the FSF; it's a separate project that happens to have been consulted as to whether it would like to use a new, allegedly better, variant of the license it chose.

Linux does use FSF tools but that doesn't make it a GNU project any more than this article will be an IBM project because it was typed on a PC, or a BT project because it used an ADSL line.

The Linux kernel being GPLv2 isn't a problem we can see for the future. It is a distinct work from the applications that run on it, just as the Windows kernel is from Windows applications. The more awkward corner cases will be LGPL and similar licenses where you want the benefits and flexibility. The FSF have indicated they understand that and will ensure it works out. The licenses are about having barriers to abuse, not barriers to use.

> be negative consequences for them, or do you think that life will
> just go on as before?


I'm not sure what will happen with the rest of the GPL licensed software world. It really is too early to say because the license is a draft at this point and various areas around patents and optional clauses remain open to correction and improvement.

Most GPL-licensed code is not controlled by the FSF and probably has too many contributors to relicense. Stuff that is new or has a few owners might change license if the new license is good. However, given that most of the work on the FSF-owned projects is done by non-FSF people, if the license is bad I imagine all the developers will continue the GPLv2 branch and the FSF will be left out in the cold. The FSF know this too, and that's why it takes time to build a new license and consensus.

It may well be that the new license is mostly used with new code.

> What's the main problem you have with GPLv3?

For the kernel there are a few; the big one, which is hard to fix, is the DRM clause. Right now the GPLv2 covers things like DRM keys in generic language, and it means the law can interpret that sanely. It's vague but flexible, which lawyers don't like, of course. There isn't any case law, but out-of-court settlements support the fact that this is enforceable.

The GPLv3 variant is much stronger and it appears to cover things like keys to rented devices where the DRM logic is less clear.

The big one though for the kernel is not a legal matter or even a specifically GPLv3 matter. Many people contributed to the kernel under a set of understood terms. Not only would all those people have to agree to a change in those terms but those terms changing would prevent some of the existing users from continuing to use it in the manner they do now.

You can't really make an agreement like that and then change the rules on people who've contributed time, money and code to the Linux project. I support Linus' assertion that legal issues aside he doesn't have the moral right to change the rules this way.

Greg Kroah-Hartman

> My question concerns the timing of the recent white paper: why was it
> released now, and not at the beginning of the GPLv3 consultation process
> when it might have been able to influence things?


The process is not over, and we still hope to influence things. We would not have written that letter otherwise. The main reason it was not done earlier is that we just did not think it was going to be a problem, as the kernel was not going to change licenses. But the more we realized that this was going to be a problem beyond just the kernel, affecting the whole community, the more we felt that we should at least voice our opinions.

Also, please note that the DRM issues have changed over time from being very broad (which was at least admirable) to being explicitly targeted at only the Linux kernel. Now the license is worded to try to stop the "tivoization" issue.

This is where a bootloader or BIOS determines whether the cryptographic signature of the kernel is acceptable before deciding to run it. This means that only "approved" kernels that come from the company will run properly on the hardware.

Now this kind of restriction pretty much _only_ affects the kernel, not any other type of program. This is because only if you can control the kernel can you ensure that the system is "secure".
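To make the mechanism concrete, here is a minimal sketch in Python of the kind of check a locked-down loader performs. It is purely illustrative: Tivo's actual bootloader is not public, and every name here is invented for the example. The point is simply that only the vendor's public key lives on the device, so only the vendor can produce kernels it will boot.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Stand-in for the vendor's key pair; on real hardware only the
# public half is present, burnt into the boot ROM.
_vendor_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
VENDOR_PUBLIC_KEY = _vendor_key.public_key()

def vendor_sign(kernel_image: bytes) -> bytes:
    # Done once, at the factory, for each approved kernel build.
    return _vendor_key.sign(kernel_image, padding.PKCS1v15(), hashes.SHA256())

def bootloader_accepts(kernel_image: bytes, signature: bytes) -> bool:
    # Done at every power-on, before control is handed to the kernel.
    try:
        VENDOR_PUBLIC_KEY.verify(signature, kernel_image,
                                 padding.PKCS1v15(), hashes.SHA256())
        return True   # "approved" kernel: boot it
    except InvalidSignature:
        return False  # anything else, including your own build: refuse

approved = b"vendor-built kernel image"
sig = vendor_sign(approved)
assert bootloader_accepts(approved, sig)
assert not bootloader_accepts(b"your freshly compiled kernel", sig)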

So it seems that the FSF is targeting only the Tivo issue, even though we kernel developers have explicitly stated in public that it is acceptable to use _our_ code in this manner. So they are now trying to tell another group (us) what we should do with our code.

As the FSF has contributed nothing to the Linux kernel, and has nothing to do with it in general, we kernel developers are now a bit upset that someone else is trying to tell us that something we explicitly stated was an acceptable use of our code is suddenly bad and wrong.

> Given that the FSF is unlikely to throw away all the work it has done, or
> even modify it substantially/substantively, do you have any thoughts on
> what's going to happen?


I really have no idea, but I would hope that things change for the better. We are already hearing rumors from the people on the different GPLv3 committees that our statement has had an effect, but we will not know for sure until the next draft comes out.

Andrew Morton

Well gee. We're programmers and we spend our time programming, not swanning around at meetings talking about legal matters and playing politics. We find things like licensing to be rather a distraction, and dull. So most people largely ignored it all.

It was only later in the process when the thing started to take shape, when we saw where it was headed and when we began to hear the concerns of various affected parties that there was sufficient motivation to get involved.

In fact this points at a broad problem with the existing process: I'm sure that a large majority of the people who actually write this code haven't made their opinions known to the FSF. Yet the FSF presumes to speak for them, and proposes to use their work as ammunition in the FSF's campaigns.

And why haven't these programmers made their opinions known? Some are busy. Many work for overlawyered companies and are afraid that they might be seen to be speaking for their companies. Some don't speak English very well. Almost all of them find it to be rather dull and a distraction.

Dave Miller

For the kernel I'm pretty sure things will go on as they have before.

The problems are most likely for the projects under the GNU Project umbrella. The copyrights to those projects, such as GCC, Binutils and so on, are all assigned to the GNU Project. So the FSF could, and almost certainly will, make all of those projects use the GPL v3.

As an aside, I will note that originally the FSF used to say that they wanted copyright assigned to them "to make it easier to enforce the GPL in court for software projects under the GNU Project umbrella." But as is clear today, it's also a power thing, in that having all the copyrights assigned to them allows the FSF to choose the licensing of the code as they see fit since they are the copyright holder of the complete work.

At the point of a relicense to GPL v3 for these GNU Project source trees, one of two things could happen. Either the developers are OK with this, even if they simply "grin and bear it", and things go on under GPL v3. Or the developers are unhappy with this, and fork off a GPL v2 copy of the tree and do development there.

In the end, even though they've assigned their copyrights to the FSF, the developers do control the ultimate licensing of these GNU projects. If they don't like GPL v3 and work on the GPL v2 fork instead, the FSF is at a loss because, while they can mandate whatever they like, such mandates are useless if the developers don't want to contribute to the GPL v3 variant.

So being the ones who do the development work is actually a kind of power which permeates through all of the politics. If the political folks do something stupid, the developers can just take their talent and efforts elsewhere.

I'm more than familiar with this process, since I was part of the group that forked the GCC compiler project many years ago because the majority of the GCC developers found the head maintainer (Richard Kenner) impossible to work with. Although he was quite upset about it, there wasn't much that Richard Stallman and the FSF could do about it. In the end the fork became the "real GCC" under the GNU Project umbrella once more.

So the opinion of the developers matters a lot, especially when it comes to licensing. It could get messy if a lot of these projects fork, but the GPL v3 isn't a done deal yet, so the FSF still has time to fix things up and make it more palatable to people.

> Alone among those polled for their views on the v2 and v3 you chose
>
> 0 I don't really care at all
>
> why is this when everybody else seems to hold such extreme views on the
> subject? Do you think they're getting worked up over nothing?


First, I think the poll was pretty useless.

The poll asked what people think of the GPL v3 draft, which is by definition in draft state and therefore not ready for final consumption. Of course the GPL v3 still needs some fixing. So asking about actually using it in a major software project right now is totally pointless. What would have been more interesting would have been to ask what the developers think about the "core issues" rather than the specific implementation of those issues in the current GPL v3 draft.

For example, polling on what the kernel developers thought about "keying of the Linux kernel" in the way that Tivo does would have been much more interesting. In my opinion, you would have seen about an even split down the middle on this one. But even the people who are against keying think that the DRM language in the GPL v3 meant to combat this is not done correctly.

Several kernel developers believe that the GPL v2 already has enough language to disallow restrictions such as keying.

Personally, I'm against keying and I'm quite unhappy with what Tivo did with the Linux kernel. This kind of keying sets a very bad precedent. For example, in the future vendors could get away with some questionable things using keying. Say a vendor sells a piece of hardware, and provides the GPL source for the kernel drivers of the new components in that hardware. Then they only allow kernel binaries signed with a special key to load. This makes the publishing of their drivers effectively useless. The vendor still controls everything and nobody gains from the code they've submitted. Nobody can recompile a kernel with their changes and actually test it on their hardware, since they have no way to produce a signed kernel that the device will actually allow to boot. So the value of this "contribution" is absolutely zero. This is, in my opinion, totally against the spirit of the GPL v2.
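Dave's scenario is easy to see in miniature. The sketch below is illustrative only: a hash allowlist is just one simple way such a lockdown can be built, and no particular vendor's scheme is being described. It shows why the published driver source gains you nothing in practice; your rebuilt kernel differs by a single byte, so the device will never run it.

import hashlib

# Digests of "approved" kernel builds, fixed in the loader ROM at the factory.
APPROVED_DIGESTS = {
    hashlib.sha256(b"stock kernel including vendor's GPL driver").hexdigest(),
}

def loader_will_boot(kernel_image: bytes) -> bool:
    # The device boots only byte-identical copies of approved builds.
    return hashlib.sha256(kernel_image).hexdigest() in APPROVED_DIGESTS

# You have the driver source, so you can rebuild the kernel with a fix...
my_build = b"stock kernel including vendor's GPL driver + my one-line fix"

assert loader_will_boot(b"stock kernel including vendor's GPL driver")
assert not loader_will_boot(my_build)  # ...but the device refuses to run it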

I would have been perfectly fine with Tivo using another OS for their product. All the world does not have to be Linux, and if Linux's license doesn't suit someone, they are free to not use it.

In most examples I've ever been shown where this kind of lockdown is supposedly "legitimate", the owner of the device is effectively making the lockdown decision. For example, I'm OK with electronic voting machines used in real elections having their software locked down. The government owns those machines, and is well within its rights to lock down that hardware in order to ensure a proper and fair election for the public by preventing software tampering.

But if I purchase an electronic voting machine of my own, and it uses the Linux kernel, I very much want to tinker with it, build my own kernels, and try to poke holes in the device. How else could we validate that electronic voting machines are safe if the public has no way to test these claims out for themselves, in particular when such devices use open technologies such as the Linux kernel?

Web 2.0 Start-ups as Haiku

Web 2.0:
Surely we've all had enough?
Maybe not, like this.

A Map of the Commons

Commons are things that are held, well, in common, for the benefit of all. The traditional commons is common land, of which many examples still exist in England - Clapham Common, for instance. But commons can be anything. For example, free software is a commons, as is open content. The air we breathe is clearly a commons, as are the world's oceans.

Less obvious, perhaps, is the commons of tranquillity. Like other commons, it can be destroyed for all by the selfish actions of a few. But what exactly is the state of this commons today? Here in England, we now know, thanks to a neat map of this commons put together by the Campaign to Protect Rural England.

This is actually quite useful because, as they say, if you can't measure it, you can't manage it: if you don't know where the commons is most threatened, you can't take action to protect it. Well done CPRE. (Via BBC News.)

22 October 2006

A Request to the Icelandic Nation

On the occasion of its breach of a 21-year-old international ban on commercial whaling, just a quick request to the Icelandic nation: could you please close the door on your way out of the civilised world.

21 October 2006

Innovating all the Way to Openness

Innovate - the "journal of online education" - has an issue exploring

the potential of open source software and related trends to transform educational practice.

Nothing hugely new there for readers of this blog, but there are some articles with interesting case studies from the educational world. There's also a typically thoughtful and well-written piece by David Wiley, who invented the term "open content" back in 1998. You'll need to register (email address required), but it's worth that minor effort.

Intellectual Property is not "Up for Grabs"...

...as this interesting post suggests; it is simply broken. Time to wheel it away and consign it to Time's winged dustbin.

20 October 2006

Open Source Disaster Management System

No, really: it's called Sahana. (Via Bob Sutor's Open Blog.)

OA Book on OA from OA Publisher

As the Digg lot say, "title says it all" - or nearly: worth noting too that this open access book on open access comes from a publisher that provides open access to all its titles. (Via Open Access News.)

Copyright as a Metaphor for "Rip-off"

There's a piece on C|net which I can only hope was written with the express intent of provoking a reaction, since its basic idea is so batty:

A European court last month agreed with a group of regional publishers in Belgium that accused Google of ripping off their content. The court ordered Google to remove text summaries of the newspapers' articles, along with Web links to the publishers' sites.

As world and dog have pointed out, what Google News does is provide free - yes, free - publicity for news sites, leading to free - yes, free - extra traffic, which can then be converted to what we in the trade call dosh. The idea that Google is somehow "ripping off" the poor old media conglomerates is risible. But luckily, they seem intent on slitting their own throats, so let 'em, says I.

More serious is the implicit assumption in the C|net piece that there is something sacred about copyrighted material. Maybe there would be, if copyright did what it was originally intended to do: to provide an incentive to the creator to create. But now that copyright typically runs for 50 or even 70 years after the creator's death, it's hard to see how new works are going to be conjured up except with a Ouija board.

Copyright has broken the original social compact, which is that people aren't allowed to copy a work for 14 years - yes, 14 - in return for being allowed to do what they like with it afterwards. As copyright is extended time and time again, it is becoming impossible ever to access the content it covers: there is no quid for the quo.

So copyright has become the "rip-off", demanding without giving. If media companies really want to stop people using their materials, they should go back to a balanced copyright that gives something to both parties. The current system is so inequitable that it is no wonder most people feel morally justified in ignoring it.

On Sharing and Fake Sharing

Larry Lessig has some wise words on what Flickr gets right and YouTube gets wrong.

IE7: Mossberg and the Fox

I've not really been following the IE7 saga, since it seems to be a case of too little too late. I was pleased to find my prejudices confirmed by the Grand Old Man of populist tech journalism, Walter Mossberg:

The new Internet Explorer is a solid upgrade, but it's disappointing that after five years, the best Microsoft could do was to mostly catch up to smaller competitors.

Quite.

Kudos to Kocoras and ICANN

Looks like I was overly pessimistic about the Spamhaus case:

On 19 October 2006, United States District Court Judge Charles P. Kocoras, presiding over the e360Insight v. The Spamhaus Project matter in the Northern District of Illinois, issued an order denying e360Insight's ("e360") motion asking the Court to, among other things, suspend www.spamhaus.org. The Court explained that the relief e360 sought was too broad to be warranted under the circumstances. First, the Court noted that since there is no indication that ICANN or Tucows acted in concert with Spamhaus, the Court could not conclude that either party could be brought within the ambit of Federal Rule of Civil Procedure 65(d), which states that an order granting an injunction is "binding only upon the parties to the action, their officers, agents, servants, employees, and attorneys, and upon those persons in active concert or participation with them." Second, the Court stated that a suspension of www.spamhaus.org would cut off all lawful online activities of Spamhaus, not just those that are in contravention of the injunction the Court previously issued against Spamhaus.

Kudos to Kocoras for his intelligence, and to ICANN for not rolling over as I feared they would.

19 October 2006

It's the Big 1 - 0 - 0 - 0

This is my thousandth post.

Just thought I'd mention it.

Er, that's all, really.

Thanks for your attention.

Plone Goes Pltwo...

...as in Second Life:

The Plone Foundation announced today the broadcasting of the Plone Conference 2006 into the virtual world Second Life. It is the first big open source conference being colocated inside a virtual world. The event will be held from Oct 25-27.

"We'll broadcast selected talks and tutorials each day into a virtual conference building inside Second Life. Residents who could not make it to the by-now sold-out conference can participate in virtual form. A back channel for their questions to the actual speaker will be provided, too"

And so the boundaries between RL (real life) and SL (Second Life) became just that tiny bit more friable....

Sun Supports OpenOffice.org - No, Really

It might seem strange to talk about Sun supporting OpenOffice.org - after all, it was the original donor of the code to the open source community. But what's changed is that it is now offering service plans for the software: this is news, because in the past it has only supported its own variant, StarSuite.

This is also important, because it provides a safety net to companies and government departments who want to use OpenOffice.org. In the past, they have been forced to opt for StarSuite if they wanted support; no longer.

Well done, my Sun.

All Hail the ODF Alliance

The ODF Alliance has been going for a while now, but even so this list of 300+ members is a forceful reminder that this is a standard that is getting stronger day by day. (Via Erwin's StarOffice Tango.)

The Evolution of Academic Knowledge

The complete works of Charles Darwin are now online. This is certainly an important moment in the evolution of academic knowledge, since it points the way to a future where everything will be accessible in this way - call it the Googleisation of academia.

A pity, though, that the terms of use are so restrictive: not a CC licence in sight. Obviously we're still at the Neanderthal stage as far as copyright evolution is concerned.

18 October 2006

The Integrated Open Source Stack Meme

I noted previously that Red Hat has blessed the idea of the integrated open source stack; now Novell is doing the same, with the support of IBM.

And the meme marched on.

Casing Citizendium

Citizendium, Larry Sanger's Wikipedia fork, is opening its doors, albeit in a very controlled sort of way, as a private alpha. At least the press release - characteristically lengthy - sketches in some of the details as to who is doing what with this interesting project. I'll be writing more about this in due course.

Will Lack of Open Access Wipe Out the World?

A few months ago, I asked whether lack of open access to avian 'flu data might hinder our ability to head off a pandemic; now it looks like lack of open access could lead to the destruction of civilisation as we know it. If that sounds a little far-fetched, consider the facts.

The US is the largest single polluter in terms of carbon dioxide: according to the US Environmental Protection Agency, "In 1997, the United States emitted about one-fifth of total global greenhouse gases."

The EPA plays a key role in determining the US's environmental actions: "the Agency works to assess environmental conditions and to identify, understand, and solve current and future environmental problems; integrate the work of scientific partners such as nations, private sector organizations, academia and other agencies; and provide leadership in addressing emerging environmental issues and in advancing the science and technology of risk assessment and risk management."

To "assess environmental conditions and to identify, understand, and solve current and future environmental problems; integrate the work of scientific partners such as nations, private sector organizations, academia and other agencies" clearly requires information. Much of that information comes from scientific journals published around the world. Unfortunately, the EPA is in the process of cutting back on journal subscriptions:

The U.S. Environmental Protection Agency is sharply reducing the number of technical journals and environmental publications to which its employees will have online access, according to agency e-mails released today by Public Employees for Environmental Responsibility (PEER). This loss of online access compounds the effect of agency library closures, meaning that affected employees may not have access to either a hard copy or an electronic version of publications.

...

In addition to technical journals, EPA is also canceling its subscriptions to widely-read environmental news reports, such as Greenwire, The Clean Air Report and The Superfund Report, which summarize and synthesize breaking events and trends inside industry, government and academia. Greenwire, for example, recorded more than 125,000 hits from EPA staff last year.

As a result of these cuts, agency scientists and other technical specialists will no longer have ready access to materials that keep them abreast of developments within their fields. Moreover, enforcement staff, investigators and other professionals will have a harder time tracking new developments affecting their cases and projects.

So, we have the organisation whose job is to help determine the actions of the world's worst polluter cut off from much of the most recent and relevant research, in part because much of it is not open access.

No OA, no tomorrow, no comment. (Via Open Access News.)

Open Source Intelligence

Technocrat pointed me to this story on Village Voice, a title I used to read assiduously in my younger days. It's about how a network of planespotters have put together many of the pieces that go to make up the shameful jigsaw puzzle of the CIA's "torture taxi" operation, used for moving people around the world to be held and tortured without judicial oversight.

What's fascinating is the way that tiny, apparently meaningless contributions - a photo here, a Yahoo search of a plane number there - when put together, can help create something really big and important, just as open source projects pool the work of hundreds or thousands to create vast and astonishing achievements like GNU/Linux or Wikipedia.

An Ode to Unicode 5.0

Andy Updegrove has a short but justified paean to the wonder that is Unicode, one of the unsung heroes/heroines of the computer revolution. Apparently version 5.0 is now available. Don't all rush to buy a copy at once.

17 October 2006

Gotcha!

This story from Cory Doctorow on Boing Boing about someone allegedly trying to copyright a fabric seems to be fading away, but its life has not been in vain: it's brought us this wonderful parting shot:

Thanks Cory, you really got us! We were really putting one over on everybody - and you totally busted us! Saving the world from evil fabric stores, you are, one post at a time...

Ha!

KOffice 1.6 - No Mere Point Upgrade

Well, not if you look at what's on offer:

# Krita Becomes Usable for Professional Image Work
Krita and its maintainer Boudewijn Rempt won the aKademy Award for "Best Application" at this year's KDE conference in Dublin. With features such as magnetic selection, effect layers, colour model independence and full scriptability, it has risen to become what is probably the best free image editing program today.

# Lots of New Features in Kexi
Kexi, the desktop database application competing with MS Access, is the other application in KOffice that is already the best of its kind. Kexi has received over 270 improvements since KOffice 1.5. With this release, Kexi gains such features as the ability to handle images, compact the database, automatic datatype recognition and Kross scripting tools.

# KFormula Implements OpenDocument and MathML
The formula editor of KOffice now supports OpenDocument and MathML and uses it as its default file format. It also surpasses the equivalent component in OpenOffice.org, scoring 70% on the W3C MathML test suite compared to 22% for OpenOffice.org Formula. We see this as one example where the work to provide a very well-structured codebase of KOffice pays off to create a superior support for the existing standard.

KOffice is clearly storming away. I can't wait for the Windows port to introduce more people to the free software way....

And Now, the Community's MySQL

MySQL's success is impressive, and provides a handy example of pervasive corporate open source that isn't Apache. Although I'd read about its new Enterprise offering earlier today, I must confess I hadn't picked up on the complementary Community product until I read this post by Matt Asay. It's a shrewd and necessary move that will doubtless be imitated by others.