23 October 2006

Law is a Real Problem in the Virtual World

If you think laws are a problem in the real world, wait until you start thinking about the virtual one. Here's what one person has decided:

I think that the entire range of common law rights needs to be viewed as applicable to virtual worlds -- property included.

Heavy stuff - but not something that we can avoid confronting as our second lives start to take on ever more importance alongside our first ones.

Firefox: the Breakthrough

This is it. Just look at these dynamics:

The number of businesses allowing employees to download the Firefox Web browser soared this year, and at least one analyst believes the recently released Internet Explorer 7 could boost use of Firefox in companies.

Fully 44 percent of businesses with 250 employees or more allow workers to download Mozilla Corp.'s open-source browser at the office, according to a survey conducted this year by JupiterResearch. Last year, only 26 percent of such businesses were willing to do the same.

So we're through the crucial stage, where Firefox was downloaded only by enthusiasts, and into corporate acceptance. That's good, but even better is the timing:

For many businesses, the move to Vista could take a year and a half or more, analysts say.

As a result, many people who get IE 7 at home through Microsoft's automatic update service will likely find IE6 lacking. Without the option of installing IE 7 at work, they are likely to turn to Firefox, Wilcox said.

Yee-ha, as they say.

GPLv3: What Richard Stallman Said

More than anyone else, Richard Stallman is driving the GPLv3 debate (although Eben Moglen is clearly another crucially important figure). What follows is a transcript of a short interview that took place on 6 October, 2006. In it, RMS talked about the issues that lie behind the GPLv3, and gave his thoughts on the concerns expressed by the Linux coders, some of which were raised in the posting below.

Could you give a little background to the drafting of the GNU GPLv3?

The purpose of the GNU GPL is to defend for all users the freedoms that define free software. It doesn't make sense in terms of open source. It's the result of implementing the philosophy of free software in the strongest way that we can. So all the versions of the GPL have prevented middlemen from restricting subsequent users by changing the licence. Some free software licences permit that, for example the X11 licence permits that. The various BSD licences permit that. But the GPL was specifically designed not to permit that - you cannot add restrictions making the program non-free.

Now, what we didn't have 15 years ago was the threat of making the program effectively non free by technical restrictions placed around it. That's what Tivoisation is. Tivoisation means taking a free program and distributing a binary of it, and also providing the source, because the GPL requires that. But when the user changes the source code and compiles it and then tries to install the changed program he discovers that that's impossible because the machine is designed not to let him.

The result of this is that freedom number 1, the freedom to study the source code and change it so the program does what you want, has become a sham. Tivoisation is essentially a way to formally comply with the requirement, but not in substance.

So we've come to the conclusion that this is more than just a minor issue. That this will be common, probably the usual case, if we don't do something to stop it. And therefore we've decided to do what is necessary so that our software will not be Tivoised. Our purpose is to deliver freedom to the user.
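
To make the lockdown RMS describes concrete, here is a minimal sketch, in Python, of the kind of boot-time check a Tivoised device performs: the firmware refuses to run any kernel image that is not on a vendor-approved list. The function names are invented for illustration, and a plain SHA-256 allow-list stands in for the RSA-style signature check against a key baked into the hardware that a real device would use - this is not TiVo's actual code.

import hashlib

# Hypothetical allow-list of vendor-approved kernel image hashes. A real
# Tivoised device would verify a cryptographic signature against a key
# burned into the hardware; a bare SHA-256 allow-list keeps the sketch short.
APPROVED_KERNEL_HASHES = {
    hashlib.sha256(b"vendor-built kernel 1.0").hexdigest(),
}

def kernel_is_approved(image: bytes) -> bool:
    """Return True only if the kernel image is on the vendor's approved list."""
    return hashlib.sha256(image).hexdigest() in APPROVED_KERNEL_HASHES

def boot(image: bytes) -> None:
    if kernel_is_approved(image):
        print("Booting vendor kernel")
    else:
        # The user can rebuild the kernel from the GPL'd source, but the
        # modified image never passes this check, so it never runs.
        print("Refusing to boot unapproved kernel")

if __name__ == "__main__":
    boot(b"vendor-built kernel 1.0")                     # accepted: hash matches
    boot(b"kernel rebuilt by the user from the source")  # refused: hash differs

The source is published and the user can change it, so freedom 1 survives on paper; but the changed build never runs on the device, which is the gap the GPLv3's anti-Tivoisation language is meant to close.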

Why do you think there has been such an outcry in some quarters recently?

I don't know. A few people are upset.

A few people including most of the key kernel coders...

Their business. That's their program and they can decide whether to use this licence.

Seems clear they will stick with GPLv2?

I hope not, but if they do it's up to them.

If that happens, is that going to cause any problems for GNU?

It won't cause any problems for us, only for the public. The problem it will cause is Tivoisation. It will cause the problem that users don't have the freedoms that they should have. And that's a very big problem, but it's not a problem specifically for us, it's a problem for everyone. The problem is that many people will get machines in which Linux has been Tivoised. Which means that for practical purposes it won't be free for them.

If that happens, would you put more effort into the Hurd?

I don't think so, and the reason is that wouldn't achieve much unless we convinced everyone to switch to the Hurd from Linux, and that isn't too likely. The Hurd still has some technical problems, and who knows if it would ever become a competitor. But suppose somebody wanted to Tivoise, and he had available the Hurd and Linux to choose from, and Linux permits Tivoisation and the Hurd doesn't: the solution would be to use Linux.

Some people make the argument that if GPLv3 is applied to Linux, companies might simply adopt a different operating system for their products.


I don't think so.

You don't think they might use BSD or Windows?

They might, who knows? I don't think it's very likely, but the main point is it's no use giving up on a fight because you might lose, not when the fight is for something very important like freedom.

Is there anything you can do to assuage concerns of the kernel coders without giving up your principles?

I don't know. If they would just speak with us, we can explore that possibility.

Are they not doing that?

Basically no. Just recently we have had a couple of communications with them, not yet reaching the stage of being entirely civil in tone, but at least it's a start. We've been inviting them to talk with us since before we started publishing drafts, but they have not for the most part taken up that offer. In general they've made statements to the public instead of to us. And some of them are based on misunderstandings of the draft and of our intention. They're talking to each other not to us. But it's not too late for them to start if they wish to talk to us.

Is there scope to rephrase the clause that deals with Tivoisation?

We can rephrase it in a lot of different ways. We just recently decided on a change, which is that the requirement for keys would no longer work by calling them part of the corresponding source. This is a change in the details, but the substance is the same, the aim is the same - to change that would be giving up.

The two philosophies of free software and open source in some cases lead to similar conduct - in fact, in many cases. That's why it was so easy for the people who support open source to apply their label to what we're doing. Because if you're participating in a free software project it usually doesn't matter whether your goal is to give users freedom and to establish freedom in cyberspace or just have powerful and reliable software, because either way you could do the same things. And there's no need for people to ask each other: What's your philosophy, why do you want to contribute to this project? - they just start contributing, and they work on making the software better, and they focus on that.

But there are cases where these two different philosophies lead to different results. For instance, some people have proposed what they call “open source DRM” - DRM meaning “digital restrictions management”. This is a plan to develop software to put in machines that will restrict users, and then publish the source code of this. The idea is that programmers around the world will work together making that software do its job better, that is, restrict the user more inescapably, more reliably, more powerfully. Although the source code of this software will be published, they plan to use Tivoisation to make sure that the users can't escape from their power.

Now, if your goal is to give the users freedom, restricting the users through open source is no more tolerable than restricting the users any other way, because the users have to have the freedom.

Have you tried talking to TiVo about this?

No.

You don't think it might be useful?

No, not really. And the reason is they're just the first example. If it were only that one company that were the problem, we probably wouldn't pay attention because it would be a small problem. But the idea is floating around, and there are many different plans to use it.

Couldn't you help TiVo do what they want to do with free software?

They initially did. This Tivoisation was not in the first TiVo box. The point is, it's pressure from Hollywood. And the best way to have a chance of negotiating something with those who are under the pressure is first to set up counter pressure.

The problem being that a hacked version of TiVo could circumvent any DRM?


Exactly. And the point is, DRM itself is evil. Restricting the user's freedom in other ways so that the user cannot change the software and get rid of DRM makes the software effectively not free for that user. So we have these two philosophies, and here they make a big difference. You can imagine open source DRM, and if all you care about are the philosophical values of open source, you might think it's great. If you only want software to be powerful and reliable, you might tend to apply that to software whose purpose is to go in somebody's machine and restrict it, and you might think, “Sure I'll help you make that powerful and reliable.” But if you believe in free software, and you think that the user whose machine it is should be in control of what that machine does and not somebody else, then the aim of that project becomes wrong in itself. Free software DRM makes no sense - it's a contradiction in terms.

Are you worried about the prospect of GPL projects forking?

It can happen. But again, there's no use not fighting, there's no use surrendering to this threat. It's too dangerous.

Are there any other points you'd like to make?

There are people who seem to imagine that some disaster will happen because some programs in the GNU/Linux system are using GPLv3 and some are using GPLv2, but in fact there are many programs with other licences in the system as well, and there's no problem there at all.

There are many people who would like to come across some disastrous flaw in GPLv3. If one person says he's found it, the others repeat it without stopping to make sure it is for real, because they consider it the answer to their prayers.

But you think they'll work together without problems?

I know they will, because these programs are separate programs, and the licence of one has no effect on the licence of another.

Now, I wish that everyone would switch to GPLv3 because that would give the strongest possible front to resist Tivoisation and ensure the freedom of the users. But I know that not everybody will participate, nonetheless we have to try to defend the freedom.

Happy hacking.

Update

Richard Stallman has sent me a comment on Alan Cox's reply:

While I addressed the topic you proposed--version 3 of the GNU General Public License--Alan Cox chose instead to present a misleading picture of the history of GNU and Linux.

The GNU/Linux system comes out of the effort that I began in 1983 to develop a complete free Unix-like system called GNU. GNU is the only operating system that was developed specifically to respect computer users' freedom. Since our goal was to achieve freedom as soon as possible, we utilized the scattered existing free software packages that would fit. That still left most of the components for us to write. In those years, we of the GNU Project systematically developed the essential components of this system, plus many other desirable components, ranging from libraries to text editors to games.

In 1991, Linus Torvalds developed a kernel called Linux--initially not free software, but he freed it in 1992. At that time, the GNU system was complete except for a kernel. The combination of Linux and the GNU system was the first complete free operating system. That combination is GNU/Linux.

Cox says that Linux is not part of the GNU Project. That is true--of the kernel, Linux, that he and Torvalds have worked on. But the combined system that Cox calls "Linux" is more our work than his.

When Cox says that "FSF-copyrighted code is a minority in [GNU/Linux]", that too is misleading; he knows that just a fraction of the GNU packages' code is copyright FSF. What part do GNU packages compose in the whole system? Many are just as essential as Linux is.

In 1995, GNU packages were 28% of the system, while Linux was 3%. 28% is less than half, so that was a minority; but it is a lot more than 3%. Nowadays, after thousands of other groups have added to the system, both the GNU and Linux percentages are smaller than before; but no other project has contributed as much as the GNU Project.

Calling the combined system GNU/Linux is right because it gives the GNU Project credit for its work, but there are things more important than credit -- your freedom, for example. It is no accident that the GNU GPL existed before Linux was begun. We wrote the GPL to protect the freedom of the users of GNU, and we are revising it today so that it will protect against newer technical methods of denying that freedom. When you think about GPL issues, this is the background for them.

If the developers of Linux disagree with that goal, they are entitled to their views. They are entitled to cite their important work--Linux, the kernel--to be listened to more, but they should respect our right to cite the GNU system in the same way.

See http://www.gnu.org/gnu/gnu-linux-faq.html for more explanation.

GPLv3: What Linus, Alan, Greg, Andrew and Dave Said

Few subjects in the world of free software have provoked as much discussion as the new GNU GPLv3 licence. Mostly it's outsiders (like me) sounding off about this, but what do the people involved really think?

I decided to ask them, and write up the result for the Guardian. I was expecting a couple of lines back to my emailed questions if I was lucky, but badly underestimated hacker generosity. Linus and his mates sent me back long and typically thoughtful replies, while RMS wanted to talk about it at length.

Since I was only able to use a tiny fraction of this material in the Guardian article that resulted, I thought it might be a useful contribution to the GPLv3 debate to post it here (with the permission of those concerned).

For length reasons, I've split it up into two postings. Below are the replies of the kernel coders, which I received on 3 October, 2006 (placed in the order in which their authors appear in the recent GPLv3 poll). The interview with RMS can be found above.

Linus Torvalds

I don't think there will necessarily be a lot of _practical_ fallout from it, so in that sense it probably doesn't matter all that much. It's not like we haven't had license "discussions" before (the whole BSD vs GPL flame-war seemed to go on for years back in the early nineties). And in many ways, it's not like the actual split between the "Open Source" and the "Free Software" mentality is in any way new, or even brought about by the GPLv3 license.

So while I think there is still a (admittedly pretty remote) chance of some kind of agreement, I don't think that it's a disaster if we end up with a GPLv2 and a new and incompatible GPLv3. It's not like we haven't had licenses before either, and most of them haven't been compatible.

In some ways, I can even hope that it clears the air for all the stupid tensions to just admit that there are differences of opinion, and that the FSF might even just stop using the name "GNU/Linux", finally admitting that Linux never was a GNU project in the first place.

The real downside, I suspect, is just the confusion by yet another incompatible license - and one that shares the same name (licenses such as OSL and GPL were both open source licenses and they were incompatible with each other, but at least they had clear differentiation in their names).

And there's bound to be some productivity loss from all the inevitable arguments, although in all honesty, it's not like open source developers don't spend a lot of time arguing _anyway_, so maybe that won't be all that big of a factor - just a shift of area rather than any actual new lost time ;)

One of the reasons the thing gets so heated is that people (very much me included) feel very strongly about their licenses. It's more than just a legal paper, it's deeply associated with what people have been working on for, in some cases, decades. So logically I don't think the disagreement really matters a whole lot, but a lot of it is about being very personally attached to some license choice.

Alan Cox

On Tue, 2006-10-03 at 13:57 +0100, glyn moody wrote:
> Since it seems likely that the kernel will remain under v2, while the
> rest of GNU goes for v3, I was wondering whether you think this is
> going to cause you and others practical problems in your work on the
> kernel. What about companies and end-users of GNU/Linux: will there


There is no such thing as GNU/Linux. For an article like this it's really important to understand and clarify that (and from the US view also as a trademark matter).

I mean there is no abstract entity even that is properly called "GNU/Linux". It's a bit of spin-doctoring by the FSF to try and link themselves to Linux. Normally it's just one of those things they do and people sigh about, but when you look at the licensing debate the distinction is vital. (It's also increasingly true that FSF-owned code is a minority part of Linux.)

Linux is not and never has been an FSF project. I would say the majority of the kernel developers don't buy the FSF political agenda. Linus likewise chose the license for the pragmatic reason it was a good license for the OS, not because he supported the GNU manifesto.

Thus this isn't about the Linux people splitting from the FSF; it's a separate project that happens to have been consulted as to whether it would like to use a new, allegedly better, variant of the license it chose.

Linux does use FSF tools but that doesn't make it a GNU project any more than this article will be an IBM project because it was typed on a PC, or a BT project because it used an ADSL line.

The Linux kernel being GPLv2 isn't a problem we can see for the future. It is a distinct work to the applications that run on it, just as Windows kernel is to Windows applications. The more awkward corner cases will be LGPL and similar licenses where you want the benefits and flexibility. The FSF have indicated they understand that and will ensure it works out. The licenses are about having barriers to abuse, not barriers to use.

> be negative consequences for them, or do you think that life will
> just go on as before?


I'm not sure what will happen with the rest of the GPL licensed software world. It really is too early to say because the license is a draft at this point and various areas around patents and optional clauses remain open to correction and improvement.

Most GPL licensed code is not controlled by the FSF and probably has too many contributors to relicense. Stuff that is new or has a few owners might change license if the new license is good. However, given that most of the work on the FSF-owned projects is done by non-FSF people, if the license is bad I imagine all the developers will continue the GPLv2 branch and the FSF will be left out in the cold. The FSF know this too and that's why it takes time to build a new license and consensus.

It may well be the new license is mostly used with new code.

> What's the main problem you have with GPLv3?

For the kernel there are a few; the big one that is hard to fix is the DRM clause. Right now the GPLv2 covers things like DRM keys in generic language and it means the law can interpret that sanely. It's vague but flexible, which lawyers don't like of course. There isn't any caselaw, but out-of-court settlements support the fact that this is enforceable.

The GPLv3 variant is much stronger and it appears to cover things like keys to rented devices where the DRM logic is less clear.

The big one though for the kernel is not a legal matter or even a specifically GPLv3 matter. Many people contributed to the kernel under a set of understood terms. Not only would all those people have to agree to a change in those terms but those terms changing would prevent some of the existing users from continuing to use it in the manner they do now.

You can't really make an agreement like that and then change the rules on people who've contributed time, money and code to the Linux project. I support Linus' assertion that legal issues aside he doesn't have the moral right to change the rules this way.

Greg Kroah-Hartman

> My question concerns the timing of the recent white paper: why was it
> released now, and not at the beginning of the GPLv3 consultation process
> when it might have been able to influence things?


The process is not over, and we still hope to influence things. We would not have written that letter otherwise. The main reason it was not done earlier is that we just did not think it was going to be a problem, as the kernel was not going to change licenses. But once we realized this was going to be a problem beyond just the kernel, and affect the whole community, we felt that we should at least voice our opinions.

Also, please note that the DRM issues have changed over time from being very broad (which was at least admirable), to being explicitly targeted at only the Linux kernel. Now the license is worded to try to stop the "tivoization" issue.

This is where a bootloader or BIOS determines whether the crypto signature of the kernel is acceptable before it decides to run it. This means that only "approved" kernels that come from the company will run properly on the hardware.

Now this kind of restriction pretty much _only_ affects the kernel, not any other type of program. This is because only if you can control the kernel can you ensure that the system is "secure".

So it seems that the FSF is only targeting the Tivo issue, which we kernel developers have explicitly stated in public is an acceptable use of _our_ code. So they are now trying to tell another group (us) what we should do to our code.

As the FSF has no contribution in the Linux kernel, and has nothing to do with it in general, we kernel developers are now a bit upset that someone else is trying to tell us that something we explicitly stated was an acceptable use of our code is suddenly bad and wrong.

> Given that the FSF is unlikely to throw away all the work it has done, or
> even modify it substantially/substantively, do you have any thoughts on
> what's going to happen?


I really have no idea, but I would hope that things change for the better. We are already hearing rumors from the people on the different GPLv3 committees that our statement has had an effect, but we will not know for sure until the next draft comes out.

Andrew Morton

Well gee. We're programmers and we spend our time programming, not swanning around at meetings talking about legal matters and playing politics. We find things like licensing to be rather a distraction, and dull. So most people largely ignored it all.

It was only later in the process when the thing started to take shape, when we saw where it was headed and when we began to hear the concerns of various affected parties that there was sufficient motivation to get involved.

In fact this points at a broad problem with the existing process: I'm sure that a large majority of the people who actually write this code haven't made their opinions felt to the FSF. Yet the FSF presumes to speak for them, and proposes to use their work as ammunition in the FSF's campaigns.

And why haven't these programmers made their opinions known? Some are busy. Many work for overlawyered companies and are afraid that they might be seen to be speaking for their companies. Some don't speak English very well. Almost all of them find it to be rather dull and a distraction.

Dave Miller

For the kernel I'm pretty sure things will go on as they have before.

The problems are most likely for the projects under the GNU Project umbrella. The copyrights to those projects, such as GCC, Binutils, etc., are all assigned to the GNU Project. So the FSF could, and almost certainly will, make all of those projects use the GPL v3.

As an aside, I will note that originally the FSF used to say that they wanted copyright assigned to them "to make it easier to enforce the GPL in court for software projects under the GNU Project umbrella." But as is clear today, it's also a power thing, in that having all the copyrights assigned to them allows the FSF to choose the licensing of the code as they see fit since they are the copyright holder of the complete work.

At the point of a relicense to GPL v3 for these GNU Project source trees one of two things could happen. Either the developers are OK with this, even if to simply "grin and bear it" and things go on under GPL v3. Or, the developers are unhappy with this, and fork off a GPL v2 copy of the tree and do development there.

In the end, even though they've assigned their copyrights to the FSF, the developers do control the ultimate licensing of these GNU projects. If they don't like GPL v3 and work on the GPL v2 fork instead, the FSF is at a loss because while they can mandate whatever they like such mandates are useless if the developers don't want to contribute to the GPL v3 variant.

So being the ones who do the development work is actually a kind of power which permeates through all of the politics. If the political folks do something stupid, the developers can just take their talent and efforts elsewhere.

I'm more than familiar with this process, since I was part of the group that forked the GCC compiler project many years ago because the majority of the GCC developers found the head maintainer (Richard Kenner) impossible to work with. Although he was quite upset about it, there wasn't much that Richard Stallman and the FSF could do about it. In the end the fork became the "real GCC" under GNU Project umbrella once more.

So the opinion of the developers matters a lot, especially when it comes to licensing. It could get messy if a lot of these projects fork, but the GPL v3 isn't a done deal yet so the FSF still has time to fix things up and make it more palatable to people.

> Alone among those polled for their views on the v2 and v3 you choose
>
> 0 I don't really care at all
>
> why is this when everybody else seems to hold such extreme views on the
> subject? Do you think they're getting worked up over nothing?


First, I think the poll was pretty useless.

The poll asked what people think of the GPL v3 draft, which is by definition in draft state and therefore not ready for final consumption. Of course the GPL v3 still needs some fixing. So asking about actually using it in a major software project right now is totally pointless. What would have been more interesting would have been to ask what the developers think about the "core issues" rather than the specific implementation of those issues in the current GPL v3 draft.

For example, polling on what the kernel developers thought about "keying of the Linux kernel" in the way that Tivo does would have been much more interesting. In my opinion, I believe you would have seen about an even split down the middle on this one. But even the people who are against keying think that the DRM language in the GPL v3 meant to combat this is not done correctly.

Several kernel developers believe that GPL v2 already has enough language to make restrictions such as keying be not allowed.

Personally, I'm against keying and I'm quite unhappy with what Tivo did with the Linux kernel. This kind of keying sets a very bad precedent. For example, in the future vendors could get away with some questionable things using keying. Say a vendor sells a piece of hardware, and provides the GPL source to the kernel drivers of the new things in that piece of hardware. Then, they only allow kernel binaries signed with a special key to load. This makes the publishing of their drivers effectively useless. The vendor still controls everything and nobody gains from the code they've submitted. Nobody can recompile a kernel with their changes and actually test it on their hardware, since they have no way to produce a signed kernel that the device will actually allow to boot. So the value of this "contribution" is absolutely zero. This is, in my opinion, totally against the spirit of the GPL v2.

I would have been perfectly fine with Tivo using another OS for their product. All the world does not have to be Linux, and if Linux's license doesn't suit someone, they are free to not use it.

In most examples I've ever been shown where this kind of lockdown is supposedly "legitimate", the owner of the device is effectively making this lockdown decision. For example I'm OK with electronic voting machines used in real elections having their software locked down. The government owns those machines, and is well within their rights to lock down that hardware in order to ensure a proper and fair election to the public by preventing software tampering.

But if I purchase an electronic voting machine of my own, and it uses the Linux kernel, I very much want to tinker with it, build my own kernels, and try to poke holes in the device. How else could we validate that electronic voting machines are safe if the public has no way to test these claims out for themselves, in particular when such devices use open technologies such as the Linux kernel?

Web 2.0 Start-ups as Haiku

Web 2.0:
Surely we've all had enough?
Maybe not, like this.

A Map of the Commons

Commons are things that are held, well, in common, for the benefit of all. The traditional commons is common land, much of which still exists in England - Clapham Common, for example. But commons can be anything. For example, free software is a commons, as is open content. The air we breathe is clearly a commons, as are the world's oceans.

Less obvious, perhaps, is the commons of tranquillity. Like other commons, it can be destroyed for all by the selfish actions of a few. But what exactly is the state of this commons today? Here in England, we now know, thanks to a neat map of this commons put together by the Campaign to Protect Rural England.

This is actually quite useful because, as they say, if you can't measure it, you can't manage it: if you don't know where the commons is most threatened, you can't take action to protect it. Well done, CPRE. (Via BBC News.)

22 October 2006

A Request to the Icelandic Nation

On the occasion of its breach of a 21-year-old international ban on commercial whaling, just a quick request to the Icelandic nation: could you please close the door on your way out of the civilised world.

21 October 2006

Innovating all the Way to Openness

Innovate - the "journal of online education" - has an issue exploring

the potential of open source software and related trends to transform educational practice.

Nothing hugely new there for readers of this blog, but there are some articles with interesting case studies from the educational world. There's also a typically thoughtful and well-written piece by David Wiley, who invented the term "open content" back in 1998. You'll need to register (email address required), but it's worth that minor effort.

Intellectual Property is not "Up for Grabs"...

...as this interesting post suggests; it is simply broken. Time to wheel it away and consign it to Time's winged dustbin.

20 October 2006

Open Source Disaster Management System

No, really: it's called Sahana. (Via Bob Sutor's Open Blog.)

OA Book on OA from OA Publisher

As the Digg lot say, "title says it all" - or nearly: worth noting too that this open access book on open access comes from a publisher that provides open access to all its titles. (Via Open Access News.)

Copyright as a Metaphor for "Rip-off"

There's a piece on C|net which I can only hope was written with the express intent of provoking a reaction, since its basic idea is so batty:

A European court last month agreed with a group of regional publishers in Belgium that accused Google of ripping off their content. The court ordered Google to remove text summaries of the newspapers' articles, along with Web links to the publishers' sites.

As world and dog have pointed out, what Google News does is provide free - yes, free - publicity for news sites, leading to free - yes, free - extra traffic, which can then be converted to what we in the trade call dosh. The idea that Google is somehow "ripping off" the poor old media conglomerates is risible. But luckily, they seem intent on slitting their own throats, so let 'em, says I.

More serious is the implicit assumption in the C|net piece that there is something sacred about copyrighted material. Maybe there would be, if copyright did what it was originally intended to do: to provide an incentive to the creator to create. But now that copyright typically runs for 50 or even 70 years after the creator's death, it's hard to see how new works are going to be conjured up except with a Ouija board.

Copyright has broken the original social compact, which is that people aren't allowed to copy a work for 14 years - yes, 14 - in return for being allowed to do what they like with it afterwards. As copyright is extended time and time again, it is becoming impossible ever to access the content it covers: there is no quid for the quo.

So copyright has become the "rip-off", demanding without giving. If media companies really want to stop people using their materials, they should go back to a balanced copyright that gives to both parties. The current system is so inequitable that it is no wonder most people feel morally justified in ignoring it.

On Sharing and Fake Sharing

Larry Lessig has some wise words on what Flickr gets right and YouTube gets wrong.

IE7: Mossberg and the Fox

I've not really been following the IE7 saga, since it seems to be a case of too little too late. I was pleased to find my prejudices confirmed by the Grand Old Man of populist tech journalism, Walter Mossberg:

The new Internet Explorer is a solid upgrade, but it's disappointing that after five years, the best Microsoft could do was to mostly catch up to smaller competitors.

Quite.

Kudos to Kocoras and ICANN

Looks like I was overly pessimistic about the Spamhaus case:

On 19 October 2006, United States District Court Judge Charles P. Kocoras, presiding over the e360Insight v. The Spamhaus Project matter in the Northern District of Illinois, issued an order denying e360Insight's ("e360") motion asking the Court to, among other things, suspend www.spamhaus.org. The Court explained that the relief e360 sought was too broad to be warranted under the circumstances. First, the Court noted that since there is no indication that ICANN or Tucows acted in concert with Spamhaus, the Court could not conclude that either party could be brought within the ambit of Federal Rule of Civil Procedure 65(d), which states that an order granting an injunction is "binding only upon the parties to the action, their officers, agents, servants, employees, and attorneys, and upon those persons in active concert or participation with them." Second, the Court stated that a suspension of www.spamhaus.org would cut off all lawful online activities of Spamhaus, not just those that are in contravention of the injunction the Court previously issued against Spamhaus.

Kudos to Kocoras for his intelligence, and to ICANN for not rolling over as I feared they would.

19 October 2006

It's the Big 1 - 0 - 0 - 0

This is my thousandth post.

Just thought I'd mention it.

Er, that's all, really.

Thanks for your attention.

Plone Goes Pltwo...

...as in Second Life:

The Plone Foundation announced today the broadcasting of the Plone Conference 2006 into the virtual world Second Life. It is the first big open source conference being colocated inside a virtual world. The event will be held from Oct 25-27.

"We'll broadcast selected talks and tutorials each day into a virtual conference building inside Second Life. Residents who could not make it to the by-now sold-out conference can participate in virtual form. A back channel for their questions to the actual speaker will be provided, too"

And so the boundaries between RL (real life) and SL (Second Life) became just that tiny bit more friable....

Sun Supports OpenOffice.org - No, Really

It might seem strange to talk about Sun supporting OpenOffice.org - after all, it was the original donor of the code to the open source community. But what's changed is that it is now offering service plans for the software: this is news, because in the past it has only supported its own variant, StarSuite.

This is also important, because it provides a safety net to companies and government departments who want to use OpenOffice.org. In the past, they have been forced to opt for StarSuite if they wanted support; no longer.

Well done, my Sun.

All Hail the ODF Alliance

The ODF Alliance has been going for a while now, but even so this list of 300+ members is a forceful reminder that this is a standard that is getting stronger day by day. (Via Erwin's StarOffice Tango.)

The Evolution of Academic Knowledge

The complete works of Charles Darwin are now online. This is certainly an important moment in the evolution of academic knowledge, since it points the way to a future where everything will be accessible in this way - call it the Googleisation of academia.

A pity, though, that the terms of use are so restrictive: not a CC licence in sight. Obviously we're still at the Neanderthal stage as far as copyright evolution is concerned.

18 October 2006

The Integrated Open Source Stack Meme

I noted previously that Red Hat has blessed the idea of the integrated open source stack; now Novell is doing the same, with the support of IBM.

And the meme marched on.

Casing Citizendium

Citizendium, Larry Sanger's Wikipedia fork, is opening its doors, albeit in a very controlled sort of way, as a private alpha. At least the press release - characteristically lengthy - sketches in some of the details as to who is doing what with this interesting project. I'll be writing more about this in due course.

Will Lack of Open Access Wipe Out the World?

A few months ago, I asked whether lack of open access to avian 'flu data might hinder our ability to head off a pandemic; now it looks like lack of open access could lead to the destruction of civilisation as we know it. If that sounds a little far fetched, consider the facts.

The US is the largest single polluter in terms of carbon dioxide: according to the US Environmental Protection Agency, "In 1997, the United States emitted about one-fifth of total global greenhouse gases."

The EPA plays a key role in determining the US's environmental actions: "the Agency works to assess environmental conditions and to identify, understand, and solve current and future environmental problems; integrate the work of scientific partners such as nations, private sector organizations, academia and other agencies; and provide leadership in addressing emerging environmental issues and in advancing the science and technology of risk assessment and risk management."

To "assess environmental conditions and to identify, understand, and solve current and future environmental problems; integrate the work of scientific partners such as nations, private sector organizations, academia and other agencies" clearly requires information. Much of that information comes from scientific journals published around the world. Unfortunately, the EPA is in the process of cutting back on journal subscriptions:

The U.S. Environmental Protection Agency is sharply reducing the number of technical journals and environmental publications to which its employees will have online access, according to agency e-mails released today by Public Employees for Environmental Responsibility (PEER). This loss of online access compounds the effect of agency library closures, meaning that affected employees may not have access to either a hard copy or an electronic version of publications.

...

In addition to technical journals, EPA is also canceling its subscriptions to widely-read environmental news reports, such as Greenwire, The Clean Air Report and The Superfund Report, which summarize and synthesize breaking events and trends inside industry, government and academia. Greenwire, for example, recorded more than 125,000 hits from EPA staff last year.

As a result of these cuts, agency scientists and other technical specialists will no longer have ready access to materials that keep them abreast of developments within their fields. Moreover, enforcement staff, investigators and other professionals will have a harder time tracking new developments affecting their cases and projects.

So, we have the organisation whose job is to help determine the actions of the world's worst polluter cut off from much of the most recent and relevant research, in part because much of it is not open access.

No OA, no tomorrow, no comment. (Via Open Access News.)

Open Source Intelligence

Technocrat pointed me to this story on Village Voice, a title I used to read assiduously in my younger days. It's about how a network of planespotters have put together many of the pieces that go to make up the shameful jigsaw puzzle of the CIA's "torture taxi" operation, used for moving people around the world to be held and tortured without judicial oversight.

What's fascinating is the way that tiny, apparently meaningless contributions - a photo here, a Yahoo search of a plane number there - when put together, can help create something really big and important, just as open source projects pool the work of hundreds or thousands to create vast and astonishing achievements like GNU/Linux or Wikipedia.

An Ode to Unicode 5.0

Andy Updegrove has a short but justified paean to the wonder that is Unicode, one of the unsung heroes/heroines of the computer revolution. Apparently version 5.0 is now available. Don't all rush to buy a copy at once.