10 November 2010

Xanadu and the Digital Pleasure-Dome

I consider myself fortunate to have been around at the time of the birth of the Internet as a mass medium, which I date to the appearance of version 0.9 of Netscape Navigator in October 1994.

This gives me a certain perspective on things that happen online, since I can often find parallels from earlier times, but there are obviously many people who have been following things even longer, and whose perspective is even deeper. One such is Mark Pesce, who also happens to be an extremely good writer, which makes his recent blog posting about the "early days" even more worth reading:

Back in the 1980s, when personal computers mostly meant IBM PCs running Lotus 1-2-3 and, perhaps, if you were a bit off-center, an Apple Macintosh running Aldus Pagemaker, the idea of a coherent and interconnected set of documents spanning the known human universe seemed fanciful. But there have always been dreamers, among them such luminaries as Douglas Engelbart, who gave us the computer mouse, and Ted Nelson, who coined the word ‘hypertext’. Engelbart demonstrated a fully-functional hypertext system in December 1968, the famous ‘Mother of all Demos’, which framed computing for the rest of the 20th century. Before man had walked on the Moon, before there was an Internet, we had a prototype for the World Wide Web. Nelson took this idea and ran with it, envisaging a globally interconnected hypertext system, which he named ‘Xanadu’ – after the poem by Coleridge – and which attracted a crowd of enthusiasts intent on making it real. I was one of them. From my garret in Providence, Rhode Island, I wrote a front end – a ‘browser’ if you will – to the soon-to-be-released Xanadu. This was back in 1986, nearly five years before Tim Berners-Lee wrote a short paper outlining a universal protocol for hypermedia, the basis for the World Wide Web.

Fascinating stuff, but it was the next paragraph that really made me stop and think:

Xanadu was never released, but we got the Web. It wasn’t as functional as Xanadu – copyright management was a solved problem with Xanadu, whereas on the Web it continues to bedevil us – and links were two-way affairs; you could follow the destination of a link back to its source. But the Web was out there and working for thousands of people by the middle of 1993, while Xanadu, shuffled from benefactor to benefactor, faded and finally died. The Web was good enough to get out there, to play with, to begin improving, while Xanadu – which had been in beta since the late 1980s – was never quite good enough to be released. ‘The Perfect is the Enemy of the Good’, and nowhere is it clearer than in the sad story of Xanadu.

The reason copyright management was a "solved problem with Xanadu" was something called "transclusion": when you quoted or copied a piece of text from elsewhere, it wasn't actually a copy, but the real thing *embedded* in your Xanadu document. That made it easy to track who was doing what with your work - which is what made copyright management a "solved problem", as Pesce says.
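To make the idea concrete, here's a toy sketch in Python - purely my own illustration of the principle, not Xanadu's actual design, and the document structure is invented for the example:

```python
# A toy sketch of transclusion - my own illustration, NOT Xanadu's
# actual design. A quotation is stored as a *reference* to a span of
# the source document, never as a copy.

class Transclusion:
    def __init__(self, source, start, end):
        self.source, self.start, self.end = source, start, end

class Document:
    def __init__(self, doc_id, text):
        self.doc_id = doc_id
        self.pieces = [text]         # plain strings or Transclusions
        self.transcluded_by = []     # every document that quotes this one

    def transclude(self, source, start, end):
        """Quote source's text[start:end] by reference, not by copy."""
        self.pieces.append(Transclusion(source, start, end))
        source.transcluded_by.append(self)   # links are two-way

    def render(self):
        out = []
        for piece in self.pieces:
            if isinstance(piece, Transclusion):
                # Resolved at reading time from the canonical original
                # (for this toy, the source's own text is its first piece)
                out.append(piece.source.pieces[0][piece.start:piece.end])
            else:
                out.append(piece)
        return "".join(out)

poem = Document("coleridge", "In Xanadu did Kubla Khan a stately pleasure-dome decree")
essay = Document("moody", "As Coleridge put it: ")
essay.transclude(poem, 0, 24)

print(essay.render())
# -> As Coleridge put it: In Xanadu did Kubla Khan
print([d.doc_id for d in poem.transcluded_by])
# -> ['moody'] - the source author can enumerate every use
```

Because every quotation resolves back to the one canonical original, the rights holder can enumerate - and, crucially, control - every use of their text.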

I already knew this, but Pesce's juxtaposition of it with the sloppy Web made me realise what a narrow escape we had. If Xanadu had been good enough to release, and if it had caught on sufficiently to establish itself before the Web had arrived, we would probably be living in a very different world.

There would be little of the creative sharing that underlies so much of the Internet - in blogs, Twitter, Facebook, YouTube. Instead, Xanadu's all-knowing transclusion would allow copyright holders to track down every single use of their content - and to block it just as easily.

I've always regarded Xanadu's failure as something of a pity - a brilliant idea before its time. But I realise now that it was actually a bad idea precisely of its time - and as such, completely inappropriate for the amazing future that the Web has created for us instead. If we remember Xanadu, it must be as a warning of how we nearly lost the stately pleasure-dome of digital sharing before it even began.

Follow me @glynmoody on Twitter or identi.ca.

Microsoft Demonstrates why FRAND Licensing is a Sham

A little while back I was pointing out how free software licences aren't generally compatible with Fair, Reasonable and Non-Discriminatory (FRAND) licensing, and why it would be hugely discriminatory if the imminent European Interoperability Framework v 2 were to opt for FRAND when it came to open standards, rather than insisting on restriction-free (RF) licensing.

I noted how FRAND conditions are impossible for licences like the GNU GPL, since it is impossible to pay per-copy licensing fees on software that may be copied freely. As I commented there, some have suggested that there are ways around this - for example, if a big open source company like Red Hat pays a one-off charge. But that presupposes that licence holders would want to accommodate free software in this way: if they simply refuse to make this option available, then once again licences like the GNU GPL are simply locked out from using that technology - something that would be ridiculous for a European open standard.

Now, some may say: “ah well, this won't happen, because the licensing must be fair and reasonable”: but that then raises the question of what counts as fair and reasonable. It also assumes that licensors will always want to act fairly and reasonably themselves - that they won't simply ignore that condition. As it happens, we now have some pretty stunning evidence that this can't be taken for granted.

On Open Enterprise blog.

09 November 2010

A Patent No-brainer, Mr Willetts

There has been understandable excitement over David Cameron's announcement - out of the blue - that the UK government would be looking at copyright law:

On Open Enterprise blog.

Who's Lobbying Whom?

One of the frustrating things about being on the side of right, justice, logic and the rest is that all of these are trumped by naked insider power - just look at ACTA, which is a monument to closed-door deals that include rich and powerful industry groups, but expressly exclude the little people like you and me.

Against that background, it becomes easy to understand why Larry Lessig decided to move on from promoting copyright reform to tackling corruption in the US political machine. The rise of great sites like the Sunlight Foundation, whose tagline is "Making Government Transparent and Accountable", is further evidence of how much effort is going into this in the US.

The UK is lagging somewhat, despite the fact that in terms of pure open data from government sources, we're probably "ahead". But it's clear that more and more people are turning their thoughts to this area - not least because they have made the same mental journey as Lessig: we've got to do this if we are to counter the efforts of big business to get what they want regardless of whether it's right, fair or even sensible.

Here's a further sign of progress on this side of the pond:


We are excited to announce the Who’s Lobbying site launches today! The site opens with an analysis of ministerial meetings with outside interests, based on the reports released by UK government departments in October.

That analysis consists of treemaps - zoomable representations of how much time is spent with various organisations and their lobbyists:

For example, the treemap shows about a quarter of the Department of Energy and Climate Change meetings are with power companies. Only a small fraction are with environmental or climate change organisations.
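As an aside, the basic treemap layout is simple enough to sketch in a few lines. Here's a minimal "slice" version in Python - my own illustration, with made-up meeting counts, not the code or data behind Who's Lobbying:

```python
# A minimal "slice" treemap layout - a sketch of the technique, not the
# Who's Lobbying implementation. Each organisation's meeting count
# becomes a rectangle whose area is proportional to its share.

def treemap(items, x, y, w, h, vertical=True):
    """Lay out (label, count) pairs inside the rectangle (x, y, w, h)."""
    total = sum(count for _, count in items)
    rects = []
    for label, count in items:
        frac = count / total
        if vertical:
            # Slice the rectangle from top to bottom...
            rects.append((label, x, y, w, h * frac))
            y += h * frac
        else:
            # ...or from left to right
            rects.append((label, x, y, w * frac, h))
            x += w * frac
    return rects

# Hypothetical counts, loosely echoing the DECC example quoted above
meetings = [("power companies", 25), ("other industry", 60),
            ("environmental groups", 5), ("everyone else", 10)]

for label, x, y, w, h in treemap(meetings, 0, 0, 100, 100):
    print(f"{label:22s} {w * h / 100:4.0f}% of the map's area")
```

A real treemap then recurses into each rectangle (department, organisation, individual meetings) and alternates the slicing direction so the tiles stay closer to square - which is what makes the zoomable maps on the site readable.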

The site is still a little clunky at the moment, but it gives a glimpse of what might be possible: a quick and effortless consolidated picture of who's getting chummy with whom. As the cliché has it, knowledge is power, and that's certainly the case here: the more we can point to facts about disproportionate time spent with those backing one side of an argument, the easier it will be to insist on an equal hearing. And once that happens, we will be halfway there; after all, we *do* have right on our side...

Follow me @glynmoody on Twitter or identi.ca.

Is it Time for Free Software to Move on?

A remarkable continuity underlies free software, going all the way back to Richard Stallman's first programs for his new GNU project. And yet within that continuity, there have been major shifts: are we due for another such leap?

On The H Open.

08 November 2010

A Tale of Two Conferences

I was invited to give a talk at two recent conferences, the Berlin Commons Conference, and FSCONS 2010. It's generally a pleasure to accept these invitations, although I must confess that I found two major conferences with only two days between them a trifle demanding in terms of mental and physical stamina.

Indeed, both conferences were extremely stimulating, and I met many interesting people at each. However, more than anything, I was struck by the huge and fundamental differences between them.

The Berlin Commons Conference was essentially the first of its kind, and a bold attempt to put the concept of the commons on the map. Of course, readers of this blog will already know exactly where to locate it, but even for many specialists whose disciplines include commons, the idea is still strange. The conference wisely sought to propel the commons into the foreground by finding, er, common ground between the various kinds of commons, and using that joint strength to muscle into the debate.

That sounded eminently sensible to me, and is something I have been advocating in my own small way (not least on this blog) for some time. But on the ground, achieving this common purpose proved much harder than expected.

In my view, at least, this was down largely to the gulf of incomprehension that we discovered between those working with traditional commons - forests, water, fish, etc. - and the digital commons - free software, open content, etc. Basically, it seemed to come down to this: some of the former viewed the latter as part of the problem. That is, they were pretty hostile to technology, and saw it as antithetical to their traditional commons.

By contrast, I and others working in the area of the digital commons offered it as a way to preserve the traditional, analogue commons. In particular, as I mentioned after my talk at the conference (embedded below), the Internet offers one of the most powerful tools for fighting against those - typically big, rich global corporations - that seek to enclose physical commons.


I must say I came away from the Berlin conference a little despondent, because it was evident that forming a commons coalition would be much harder than I had expected. This contrasted completely with the energising effect of attending FSCONS 2010 in Gothenburg.

It's not hard to see why. At the Swedish conference, which has been running successfully for some years, and now attracts hundreds of participants, I was surrounded by extremely positive, energetic and like-minded people. When I gave my talk (posted below), I was conscious that, intentionally provocative as I was, my argument was essentially pushing against an open door: the audience, though highly critical in the best sense, were in broad agreement with my general logic.


Of course, that can make things too easy, which is dangerous if it becomes routine; but the major benefit of being confirmed in your prejudices in this way is that it encourages you to continue, and perhaps even to explore yet further. It has even inspired me to start posting a little more prolifically. You have been warned....

Follow me @glynmoody on Twitter or identi.ca.

30 October 2010

An Uncommon Conference on the Commons

Regular readers of this blog will know that the commons has been a major theme here for some years, since it offers an extremely fruitful way of looking at free software, open content and the other "opens". Recognition of the importance of the commons has been slow in coming, but an important moment was when the doyenne of commons studies, Elinor Ostrom, won the Nobel Prize for Economics last year:

Elinor Ostrom has challenged the conventional wisdom that common property is poorly managed and should be either regulated by central authorities or privatized. Based on numerous studies of user-managed fish stocks, pastures, woods, lakes, and groundwater basins, Ostrom concludes that the outcomes are, more often than not, better than predicted by standard theories. She observes that resource users frequently develop sophisticated mechanisms for decision-making and rule enforcement to handle conflicts of interest, and she characterizes the rules that promote successful outcomes.

And now, building on that momentum, we have the Berlin Commons Conference:

The conference seeks to bring together a diverse group of about 150 international and Germany- and European-based commoners, intellectuals, activists and policy makers. It also aims to enhance participation and self-organization; stewardship, cooperation and networking; and open, non-linear ways to search for solutions.

Over the course of two days, the conference will assess the range of existing and potential commons-based policy approaches; develop the fundamentals of a policy framework that supports the commons; and identify and explore specific strategic opportunities to advance commons-based approaches.

The conference announcement elaborates: “The simple yet powerful and complex question to be explored throughout the conference is: What does a commons-based policy framework look like? What already exists and what do we still need to develop to nurture and protect diverse sorts of commons?”

As you can see from the list of participants, yours truly will also be attending. Apparently, there will be a live video stream of some of the sessions: not sure whether mine will be one of them. If it is, you can see me spouting my common commons nonsense around 11am CEST, 10am GMT.

Follow me @glynmoody on Twitter or identi.ca.

28 October 2010

The Limits of Openness?

I've been a long-time fan of the 3D modelling program Blender. No surprise, then, that I've also been delighted to see the Blender Foundation moving into content production to show what the software can do.

Specifically, it has produced a game (Yo Frankie!) and three animated films: Elephants Dream; Big Buck Bunny; and most recently, Sintel. Aside from their aesthetic value, what's interesting about these films is that the content is released under a CC licence.

Here's a fascinating interview with Ton Roosendaal, head of the Blender Institute, leader of Blender development, and producer of Sintel. It's well worth reading, but there was one section that really caught my eye:

we keep most of our content closed until release. I’m a firm believer in establishing protective creative processes. In contrast to developers — who can function well individually online — an artist really needs daily and in-person feedback and stimulation.

We’ve done this now four times (three films and one game) and it’s amazing how teams grow in due time. But during this process they’re very vulnerable too. If you followed the blog you may have seen that we had quite harsh criticism on posting our progress work. If you’re in the middle of a process, you see the improvements. Online you only see the failures.

The cool thing is that a lot of tests and progress can be followed now perfectly and it suddenly makes more sense I think. Another complex factor for opening up a creative process is that people are also quite inexperienced when they join a project. You want to give them a learning curve and not hear all the time from our audience that it sucks. Not that it was that bad! But one bad criticism can ruin a day.

Those are reasonable, if not killer, arguments. But his last point is pretty inarguable:

One last thing on the “open svn” point: in theory it could work, if we would open up everything 100% from scratch. That then will give an audience a better picture of progress and growth. We did that for our game project and it was suited quite well for it. For film… most of our audience wants to get surprised more, not know the script, the dialogs, the twists. Film is more ‘art’ than games, in that respect.

That's fair: there's no real element of suspense for code, or even games, as he points out. So this suggests that for certain projects, like these free content films, openness may be something that needs limiting in this way, purely for the end-users' benefit.

Follow me @glynmoody on Twitter or identi.ca.

The British Library's Future: Shiny, Locked-Down Knowledge?

Yesterday, Computerworld UK carried an interesting report headed “British Library explores research technologies of the future”. Here's what it hopes to achieve:

On Open Enterprise blog.

27 October 2010

In Praise of Open Source Diversity

One of the great strengths of open source is that it offers users choice. You don't like one solution? Choose another. You don't like any solution? Write your own (or pay someone else to do it). Thus it is that many categories have several alternative offerings, all of them admirable applications, and all of them with passionate supporters.

On Open Enterprise blog.

Linux Embeds Itself Even Deeper

Because anyone can take Linux and use it as they wish without needing to ask permission (provided they comply with the licence), it ends up being used in lots of places that we rarely hear about. This contrasts with proprietary operating systems, which only get used if they are licensed directly - meaning the licensor always knows exactly what is going on, and can issue yet another boring press release accordingly.

On Open Enterprise blog.

25 October 2010

How is OpenStack Stacking Up?

You may have noticed there's a fair bit of interest in this cloud computing thing. You've probably also come across various articles suggesting this is the end of free software - and the world - as we know it, since cloud-based platforms render operating systems on servers and desktops largely moot.

On Open Enterprise blog.

22 October 2010

Jamie Love on What EU Must Do About ACTA

Jamie Love has been one of the key people writing about and fighting the worst aspects of ACTA. He's just posed a good question on Twitter:

Why haven't the EU civil society groups done more on patents in ACTA? It is quite possible to get patents out at this point.

He then followed up with this suggestion:

Some type of a focused letter on footnote 2 would be very helpful, right now. This could go either way.

That footnote, by the way, says the following:

For the purpose of this Agreement, Parties agree that patents do not fall within the scope of this Section.

"This section" is Section 2: Civil Enforcement, and so the suggestion seems to be that we push for that footnote to be accepted by all parties. So, what do people think, can we do something along these lines?

Follow me @glynmoody on Twitter or identi.ca.

20 October 2010

A (Final) Few Words on FRAND Licensing

The issue of Fair, Reasonable and Non-Discriminatory (FRAND) licensing has cropped up quite a few times in these pages. The last time I wrote about the subject was just last week, when I noted that the Business Software Alliance was worried that the imminent European Interoperability Framework (EIF) might actually require truly open standards, and so was pushing for FRAND instead.

On Open Enterprise blog.

19 October 2010

Want to Change German Copyright Law?

Of course you do - and here's your big chance. Dirk Riehle is not only the Professor for Open Source Software at the Friedrich-Alexander-University of Erlangen-Nürnberg (Germany orders these things better than we do), but he is also part of an expert council advising a “multilateral commission instituted by the German parliament to discuss and make recommendations on, well, Internet and digital society.”

On Open Enterprise blog.

15 October 2010

Why We've Learnt to Love the Labs

You may recall the global excitement a year ago, when Gmail finally came out of beta - after an astonishing five years:

On Open Enterprise blog.

14 October 2010

Microsoft Gives its Blessing to OpenOffice.org

On 13 April 1999, a press release appeared headed “Mindcraft study shows Windows NT server outperforms Linux.” The summary read: “Microsoft Windows NT server 2.5 times faster than Linux as a file server and 3.7 times faster as Web server.” One thing the press release failed to mention was the following, found in the study itself: “Mindcraft Inc. conducted the performance tests described in this report between March 10 and March 13, 1999. Microsoft Corporation sponsored the testing reported herein.”

On Open Enterprise blog.

13 October 2010

Is GCHQ Frighteningly Clueless or Fiendishly Cunning?

I'm very sceptical about the concept of “cyber attacks”. Not that I doubt that computer systems and infrastructure are attacked: it's just their packaging as some super-duper new “threat” that I find suspicious. It smacks of bandwagon-jumping at best, and at worst looks like an attempt by greedy security companies to drum up yet more business.

On Open Enterprise blog.

11 October 2010

Whatever the BSA Says, FRAND is no Friend of Europe

I see that my old mates the Business Software Alliance are a tad concerned that the European Commission might do something sensible with the imminent European Interoperability Framework (EIF):

On Open Enterprise blog.

07 October 2010

Is Microsoft running out of steam?

Most people have heard about the 18th-century inventor James Watt and his steam engine; not so many know about the way he used patents to stifle competition and throttle further development of the technology:

Watt’s patent was very broad in scope (covering all engines making use of the separate condenser and all engines using steam as a "working substance"). In other words, the patent had a very large blocking power. The enforcement of almost absolute control on the evolution of steam technology, using the wide scope of the patent, became a crucial component of Boulton and Watt’s business strategy.

On The H Open.

Back to the Future Again: 2020 FLOSS 3.0

Yesterday I wrote about my experiences last week at the Open World Forum. As I noted, the two-day event closed with the presentation of the latest edition of the 2020 FLOSS Roadmap. Even though I'd not been to the Open World Forum before, I have written about the two previous versions of the Roadmap (still available).

On Open Enterprise blog.

06 October 2010

Sharing: Crossing the Digital-Analogue Divide

I've been writing about all kinds of openness and sharing on this blog for nearly five years now. Before that, I had been covering free software for a further ten years. Although I touch on open hardware examples here, this has all largely been about *digital* sharing.

A key concern of mine has been how this will translate into the "real", aka analogue world. For digital sharing is relatively easy, and it's possible that without such low barriers to sharing, the kinds of behaviours that are becoming common online might not translate into the offline realm.

But it seems like my fears were misplaced:

The results of Latitude Research and Shareable Magazine's The New Sharing Economy study released today indicate that online sharing does indeed seem to encourage people to share offline resources such as cars and bikes, largely because they are learning to trust each other online. And they're not just sharing to save money - an equal number of people say they share to make the world a better place.

More specifically:

* Sharing online content is a good predictor that someone is likely to share offline too. 78% of participants felt that experiences they've had interacting with people online have made them more open to the idea of sharing with strangers. In fact, every study participant who shared content online also shared various things offline. Sharing entrepreneurs are already taking advantage of this by seeding their services in contextually relevant online communities. For instance, online kids clothing exchange thredUP build relationships with prominent mommy bloggers to speed their launch.

* 75% of participants predicted that their offline sharing will increase in the next 5 years. While fast growing, this new sector has lots of unmet demand. More than half of all participants either shared vehicles casually or expressed interest in doing so. Similarly, 62% of participants either share household items casually or expressed interest in doing so. There's also high interest in sharing of physical spaces for travel, storage, and work - even with complete strangers.

If confirmed by other research, this is really important. It says that global projects like free software and Wikipedia are not just isolated, geeky instances of collaboration, sharing and altruism: they feed into large-scale, personal and local activities that are inspired by them and their digital cousins (remember social networking is one of these).

I'm obviously not surprised, since I have been working on that assumption. I also have a rough sketch of a theory as to why this digital sharing might spill over into the analogue world.

As those of us deeply immersed in the cultures of openness and sharing know, engaging in these activities is almost literally effortless: it takes probably a few seconds to share a link, a thought or a picture. It might take a few minutes for a blog post, and a few hours for a Wikipedia article, but the barriers are still low.

And the rewards are high. Even simple "thank yous" from complete strangers (on Twitter or identi.ca, say) are immensely gratifying. Indeed, I'd be willing to bet that there are some serious hormonal consequences of getting this kind of feedback. For they are sufficiently pleasant that you tend to carry on sharing, and probably more intensely, in part to get that special buzz they engender.

At this point, your brain is positively wired for the benefits of sharing. In which case, you are maybe more willing to overcome the necessarily greater obstacles to sharing in the analogue world. Perhaps the benefits of sharing there are even greater; but even if they are only the same as for the digital realm, they are probably enough for us sharing addicts to carry on. (I'm sure there's a PhD or two in all this stuff.)

Whether or not that is a correct analysis of what's happening at the deepest level within us, this latest research is really good news for sharing, and for humanity's future, which surely will depend on us learning how to share everything - not least the planet and its resources - better. In fact, it was such good news, I felt I really had to share it with you...

Follow me @glynmoody on Twitter or identi.ca.

Dr Microsoft: Time to Be Struck Off

A Microsoft researcher offers an interesting medical metaphor:

Just as when an individual who is not vaccinated puts others’ health at risk, computers that are not protected or have been compromised with a bot put others at risk and pose a greater threat to society. In the physical world, international, national, and local health organizations identify, track and control the spread of disease which can include, where necessary, quarantining people to avoid the infection of others. Simply put, we need to improve and maintain the health of consumer devices connected to the Internet in order to avoid greater societal risk. To realize this vision, there are steps that can be taken by governments, the IT industry, Internet access providers, users and others to evaluate the health of consumer devices before granting them unfettered access to the Internet or other critical resources.

So, we're talking about computers "compromised with a bot": now, which ones might they be? Oh look, that would be almost exclusively Windows machines. And why would that be? Because no matter how diligent users are in installing endless security updates to the Swiss-cheese-like applications known as Windows, Internet Explorer and Microsoft Office, there are always more critical bugs that pop out of the proverbial digital woodwork to lay them open to attack and subversion.

So, where does that leave us when it comes to "improving" and "maintaining" the "health of consumer devices connected to the Internet"? Well, it means that by Microsoft's own logic, the solution is for everyone to junk a system that is still insecure, despite promise after promise after promise that this just was some minor technical detail that Microsoft would fix in the next release.

For Windows has manifestly not been fixed; moreover, Windows will *not* be fixed, because it's just not a priority; Windows may even be *unfixable*. The only sane solution is for people to move to inherently safer (although certainly not perfect or impregnable) alternatives like GNU/Linux.

For a researcher at Microsoft to attempt to avoid this inevitable conclusion by pushing the blame for this endless series of security lapses onto end users this way, and to suggest they, rather than Microsoft, should be thrown into the outer darkness, is beyond pathetic. (Via @rlancefield.)

Follow me @glynmoody on Twitter or identi.ca.

The World of the Open World Forum

Last week I went along to the Open World Forum in Paris. By that, I don't mean to imply I just bowled along there on the off-chance it might be a groovy place to be. I went there because I had been asked to chair a round-table discussion on the subject of “Open Democracy”, about which more anon (disclosure: the conference organisers paid the majority of my travel and hotel costs as a result).

On Open Enterprise blog.

04 October 2010

(Finally) Meeting Mr. Open Source Business

Few people's careers have been as intertwined with the history of open source as Larry Augustin's. He was even present when the term “open source” was coined, at a meeting at the offices of his GNU/Linux hardware company VA Linux, on 3 February 1998. Also present were Eric Raymond, John “maddog” Hall, Sam Ockman (from the Silicon Valley Linux User Group) and Christine Peterson, president of the Foresight Institute - and the person who actually came up with the name.

On Open Enterprise blog.