15 November 2010

German Court: Links Can Infringe on Copyright

Here's one of those tedious court decisions that show the judges don't really get this new-fangled Internet thing:

Pünktlich zum einem der vielen 20. Geburtstage des Word Wide Webs wurde jetzt ein Urteil des Bundesgerichtshof veröffentlicht, in dem festgestellt wird dass ein Link eine Urheberrechtsverletzung sein kann (Urteil, .pdf). In behandelten Rechtsstreit hatte der Kläger eine Website mit Stadtplänen betrieben, die so gestaltet war, dass man immer nur über ein Suchformular auf der Startseite zur gewünschten Unterseite kommen sollte.

[Translation: Just in time for one of the many 20th birthdays of the World Wide Web, a ruling by the Bundesgerichtshof has now been published which finds that a link can constitute a copyright infringement (ruling, .pdf). In the case at issue, the claimant operated a website of city maps, designed so that users were only ever supposed to reach the desired sub-page via a search form on the home page.]

I mean, come on: this isn't about copyright - the content is freely available; it's about how you get to that copyright material.

Thus the real issue here seems to be that a site owner is worried about losing advertising revenue if people can skip over the home page. But the solution is simple: just put ads on the inner pages of the site, too. That way, you get the best of both worlds: directly-addressable content that also generates revenue. Is that so hard?

Follow me @glynmoody on Twitter or identi.ca.

Microsoft: Super - But Not Quite Super Enough

Once upon a time, the Netcraft Web server market-share survey was reported on eagerly every month, because it showed open source soundly trouncing its proprietary rivals. We don't hear much about that survey these days - not because things have changed, but precisely because they haven't: it has simply become a boring fact of life that Apache has always been the top Web server, still is, and probably will be for the foreseeable future. I think we're fast approaching that situation with the TOP500 supercomputing table.

On Open Enterprise blog.

12 November 2010

Opening up Knowledge

I know you probably didn't notice, but I posted very little on this blog last week - nothing, in fact. This was not down to me going “meh” for a few days, but rather because I was over-eager in accepting offers to talk at conferences that were almost back to back, with the result that I had little time for much else during that period.

On Open Enterprise blog.

Time for a "Turing/Berners-Lee" Day?

On this day, in 1937:

Alan Turing’s paper entitled "On Computable Numbers, with an Application to the Entscheidungsproblem" appeared on November 12, 1937, somewhat contemporaneously with Konrad Zuse’s work on the first of the Z machines in Germany, John Vincent Atanasoff’s work on the ABC, George Stibitz’s work on the Bell Telephone relay machine, and Howard Aiken’s on the Automatic Sequence Controlled Calculator.

Later renamed the Turing Machine, this abstract engine provided the fundamental concepts of computers that the other inventors would realise independently. So Turing provided the abstraction that would form the basic theory of computability for several decades, while others provided the pragmatic means of computation.

And on this day a little later, in 1990:

The attached document describes in more detail a Hypertext project.

HyperText is a way to link and access information of various kinds as a web of nodes in which the user can browse at will. It provides a single user-interface to large classes of information (reports, notes, data-bases, computer documentation and on-line help). We propose a simple scheme incorporating servers already available at CERN.

Maybe we should declare this date the Turing-Berners-Lee Day?

Follow me @glynmoody on Twitter or identi.ca.

Google Bowls a Googly

One of the most shocking aspects of Oracle's lawsuit against Google alleging patent and copyright infringement was its unexpected nature. The assumption had been that Google was a big company with lots of lawyers and engineers, and had presumably checked out everything before proceeding with the Android project. And then suddenly it looked as if it had made the kind of elementary mistakes a newbie startup might commit.

On Open Enterprise blog.

11 November 2010

A (Digital) Hymn to Eric Whitacre

Eric Whitacre is that remarkable thing: a composer able to write classical music that is at once completely contemporary and totally approachable even at the first hearing.

Just as, er, noteworthy is his total ease with modern technology. His website is undoubtedly one of the most attractive ever created for a composer, and uses the full panoply of the latest Internet technologies to support his music and to interact with his audience, including a blog with embedded YouTube videos, and links to Twitter and Facebook accounts.

Perhaps the best place to get a feel for his music and his amazing facility with technology is the performance of his piece "Lux Aurumque" by a "virtual choir" that he put together on YouTube (there's another video where the composer explains some of the details and how this came about.)

Against that background, it should perhaps be no surprise that on his website he has links to pages about most (maybe all?) of his compositions that include not only fascinating background material but complete embedded recordings of the pieces.

Clearly, Whitacre has no qualms about people being able to hear his music for free, since he knows that this is by far the best way to get the message out about it and to encourage people to perform it for themselves. The countless comments on these pages are testimony to the success of that approach: time and again people speak of being entranced when they heard the music on his web site - and then badgering local choirs to sing the pieces themselves.

It's really good to see a contemporary composer who really gets what digital music is about - seeding live performances - and understands that making it available online can only increase his audience, not diminish it. And so against that background, the story behind one of his very best pieces, and probably my current favourite, "Sleep", is truly dispiriting.

Originally, it was to have been a setting of Robert Frost’s "Stopping By Woods on a Snowy Evening". The composition went well:

I took my time with the piece, crafting it note by note until I felt that it was exactly the way I wanted it. The poem is perfect, truly a gem, and my general approach was to try to get out of the way of the words and let them work their magic.

But then something terrible happened:

And here was my tragic mistake: I never secured permission to use the poem. Robert Frost’s poetry has been under tight control from his estate since his death, and until a few years ago only Randall Thompson (Frostiana) had been given permission to set his poetry. In 1997, out of the blue, the estate released a number of titles, and at least twenty composers set and published Stopping By Woods on a Snowy Evening for chorus. When I looked online and saw all of these new and different settings, I naturally (and naively) assumed that it was open to anyone. Little did I know that the Robert Frost Estate had shut down ANY use of the poem just months before, ostensibly because of this plethora of new settings.

Thanks to copyright law, this is the prospect that Whitacre faced:

the estate of Robert Frost and their publisher, Henry Holt Inc., sternly and formally forbid me from using the poem for publication or performance until the poem became public domain in 2038.

I was crushed. The piece was dead, and would sit under my bed for the next 37 years because of some ridiculous ruling by heirs and lawyers.

Fortunately for him - and for us - he came up with an ingenious way of rescuing his work:

After many discussions with my wife, I decided that I would ask my friend and brilliant poet Charles Anthony Silvestri (Leonardo Dreams of His Flying Machine, Lux Aurumque, Nox Aurumque, Her Sacred Spirit Soars) to set new words to the music I had already written. This was an enormous task, because I was asking him to not only write a poem that had the exact structure of the Frost, but that would even incorporate key words from “Stopping”, like ‘sleep’. Tony wrote an absolutely exquisite poem, finding a completely different (but equally beautiful) message in the music I had already written. I actually prefer Tony’s poem now…

Not only that:

My setting of Robert Frost’s Stopping By Woods on a Snowy Evening no longer exists. And I won’t use that poem ever again, not even when it becomes public domain in 2038.

So, thanks to a disproportionate copyright term, a fine poem will never be married with sublime music that was originally written specially for it. This is the modern-day reality of copyright, originally devised for "the encouragement of learning", but now a real obstacle to the creation of new masterpieces.

Follow me @glynmoody on Twitter or identi.ca.

10 November 2010

Xanadu and the Digital Pleasure-Dome

I consider myself fortunate to have been around at the time of the birth of the Internet as a mass medium, which I date to the appearance of version 0.9 of Netscape Navigator in October 1994.

This gives me a certain perspective on things that happen online, since I can often find parallels from earlier times, but there are obviously many people who have been following things even longer, and whose perspective is even deeper. One such is Mark Pesce, who also happens to be an extremely good writer, which makes his recent blog posting about the "early days" even more worth reading:

Back in the 1980s, when personal computers mostly meant IBM PCs running Lotus 1-2-3 and, perhaps, if you were a bit off-center, an Apple Macintosh running Aldus Pagemaker, the idea of a coherent and interconnected set of documents spanning the known human universe seemed fanciful. But there have always been dreamers, among them such luminaries as Douglas Engelbart, who gave us the computer mouse, and Ted Nelson, who coined the word ‘hypertext’. Engelbart demonstrated a fully-functional hypertext system in December 1968, the famous ‘Mother of all Demos’, which framed computing for the rest of the 20th century. Before man had walked on the Moon, before there was an Internet, we had a prototype for the World Wide Web. Nelson took this idea and ran with it, envisaging a globally interconnected hypertext system, which he named ‘Xanadu’ – after the poem by Coleridge – and which attracted a crowd of enthusiasts intent on making it real. I was one of them. From my garret in Providence, Rhode Island, I wrote a front end – a ‘browser’ if you will – to the soon-to-be-released Xanadu. This was back in 1986, nearly five years before Tim Berners-Lee wrote a short paper outlining a universal protocol for hypermedia, the basis for the World Wide Web.

Fascinating stuff, but it was the next paragraph that really made me stop and think:

Xanadu was never released, but we got the Web. It wasn’t as functional as Xanadu – copyright management was a solved problem with Xanadu, whereas on the Web it continues to bedevil us – and links were two-way affairs; you could follow the destination of a link back to its source. But the Web was out there and working for thousands of people by the middle of 1993, while Xanadu, shuffled from benefactor to benefactor, faded and finally died. The Web was good enough to get out there, to play with, to begin improving, while Xanadu – which had been in beta since the late 1980s – was never quite good enough to be released. ‘The Perfect is the Enemy of the Good’, and nowhere is it clearer than in the sad story of Xanadu.

The reason copyright management was a "solved problem with Xanadu" was something called "transclusion", which basically meant that when you quoted or copied a piece of text from elsewhere, it wasn't actually a copy, but the real thing *embedded* in your Xanadu document. This meant that it was easy to track who was doing what with your work - which made copyright management a "solved problem", as Pesce says.
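
To make the contrast concrete, here is a minimal sketch in Python - a toy model, not Xanadu's actual design - of the difference between Web-style copying and transclusion: the transcluded quote stores only a pointer into the source document, so every re-use stays visible to the source's owner. The class and variable names are purely illustrative.

    # Toy model of copying vs. transclusion (nothing like Xanadu's real data model).
    # A Transclusion keeps a live reference to a span of the source document and
    # reads it at render time, so the source always "knows" where it is re-used.

    usage_log = {}  # source document id -> list of (start, end) spans re-used elsewhere


    class Document:
        def __init__(self, doc_id, text):
            self.doc_id = doc_id
            self.text = text


    class Transclusion:
        def __init__(self, source, start, end):
            self.source = source                  # a reference, not a copy
            self.start, self.end = start, end
            usage_log.setdefault(source.doc_id, []).append((start, end))

        def render(self):
            # Always reads the source's *current* text.
            return self.source.text[self.start:self.end]


    original = Document("essay-1", "But there have always been dreamers.")

    # Web-style quoting: a plain copy, untraceable once made.
    copied = original.text[:26]

    # Xanadu-style transclusion: a reference the source can see - and, in
    # principle, meter and charge for.
    quoted = Transclusion(original, 0, 26)
    print(quoted.render())   # -> But there have always been
    print(usage_log)         # -> {'essay-1': [(0, 26)]}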

I already knew this, but Pesce's juxtaposition of it with the sloppy Web made me realise what a narrow escape we had. If Xanadu had been good enough to release, and if it had caught on sufficiently to establish itself before the Web had arrived, we would probably be living in a very different world.

There would be little of the creative sharing that underlies so much of the Internet - in blogs, Twitter, Facebook, YouTube. Instead, Xanadu's all-knowing transclusion would allow copyright holders to track down every single use of their content - and to block it just as easily.

I've always regarded Xanadu's failure as something of a pity - a brilliant idea before its time. But I realise now that in fact it was actually a bad idea precisely of its time - and as such, completely inappropriate for the amazing future that the Web has created for us instead. If we remember Xanadu, it must be as a warning of how we nearly lost the stately pleasure-dome of digital sharing before it even began.

Follow me @glynmoody on Twitter or identi.ca.

Microsoft Demonstrates why FRAND Licensing is a Sham

A little while back I was pointing out how free software licences aren't generally compatible with Fair, Reasonable and Non-Discriminatory (FRAND) licensing, and why it would be hugely discriminatory if the imminent European Interoperability Framework v 2 were to opt for FRAND when it came to open standards, rather than insisting on restriction-free (RF) licensing.

I noted how FRAND conditions are impossible for licences like the GNU GPL, since per-copy licensing fees cannot be levied on software that may be copied freely. As I commented there, some have suggested that there are ways around this - for example, a big open source company like Red Hat could pay a one-off charge. But that presupposes that licensors would want to accommodate free software in this way: if they simply refuse to make this option available, then once again licences like the GNU GPL are simply locked out from using that technology - something that would be ridiculous for a European open standard.

Now, some may say: “ah well, this won't happen, because the licensing must be fair and reasonable”: but that then raises the question of what counts as fair and reasonable. It also assumes that licensors will always want to act fairly and reasonably themselves - that they won't simply ignore that condition. As it happens, we now have some pretty stunning evidence that this can't be taken for granted.

On Open Enterprise blog.

09 November 2010

A Patent No-brainer, Mr Willetts

There has been understandable excitement over David Cameron's announcement - out of the blue - that the UK government would be looking at copyright law:

On Open Enterprise blog.

Who's Lobbying Whom?

One of the frustrating things about being on the side of right, justice, logic and the rest is that all of these are trumped by naked insider power - just look at ACTA, which is a monument to closed-door deals that include rich and powerful industry groups, but expressly exclude the little people like you and me.

Against that background, it becomes easy to understand why Larry Lessig decided to move on from promoting copyright reform to tackling corruption in the US political machine. The rise of great sites like the Sunlight Foundation, whose tagline is "Making Government Transparent and Accountable", is further evidence of how much effort is going into this in the US.

The UK is lagging somewhat, despite the fact that in terms of pure open data from government sources, we're probably "ahead". But it's clear that more and more people are turning their thoughts to this area - not least because they have made the same mental journey as Lessig: we've got to do this if we are to counter the efforts of big business to get what they want regardless of whether it's right, fair or even sensible.

Here's a further sign of progress on this side of the pond:


We are excited to announce the Who’s Lobbying site launches today! The site opens with an analysis of ministerial meetings with outside interests, based on the reports released by UK government departments in October.

That analysis consists of treemaps - zoomable representations of how much time is spent with various organisations and their lobbyists:

For example, the treemap shows about a quarter of the Department of Energy and Climate Change meetings are with power companies. Only a small fraction are with environmental or climate change organisations.
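
Just to make the idea concrete, here is a small, hypothetical sketch of how such a treemap might be generated - the meeting counts are invented, and it assumes the Python squarify and matplotlib libraries; the real Who's Lobbying site no doubt does something more sophisticated.

    # Illustrative only: invented meeting counts, drawn as a treemap in which
    # each rectangle's area is proportional to the number of ministerial meetings.
    import matplotlib.pyplot as plt
    import squarify  # third-party treemap helper for matplotlib

    meetings = {                      # hypothetical data, not taken from the site
        "Power companies": 24,
        "Industry bodies": 18,
        "Consultancies": 9,
        "Environmental groups": 4,
        "Other": 11,
    }

    sizes = list(meetings.values())
    labels = [f"{name}\n{count} meetings" for name, count in meetings.items()]

    squarify.plot(sizes=sizes, label=labels, alpha=0.8)
    plt.axis("off")
    plt.title("Who met the department? (illustrative data)")
    plt.show()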

The site is still a little clunky at the moment, but it gives a glimpse of what might be possible: a quick and effortless consolidated picture of who's getting chummy with whom. As the cliché has it, knowledge is power, and that's certainly the case here: the more we can point to facts about disproportionate time spent with those backing one side of an argument, the easier it will be to insist on an equal hearing. And once that happens, we will be halfway there; after all, we *do* have right on our side...

Follow me @glynmoody on Twitter or identi.ca.

Is it Time for Free Software to Move on?

A remarkable continuity underlies free software, going all the way back to Richard Stallman's first programs for his new GNU project. And yet within that continuity, there have been major shifts: are we due for another such leap?

On The H Open.

08 November 2010

A Tale of Two Conferences

I was invited to give a talk at two recent conferences, the Berlin Commons Conference, and FSCONS 2010. It's generally a pleasure to accept these invitations, although I must confess that I found two major conferences with only two days between them a trifle demanding in terms of mental and physical stamina.

Indeed, both conferences were extremely stimulating, and I met many interesting people at both. However, more than anything, I was struck by huge and fundamental differences between them.

The Berlin Commons Conference was essentially the first of its kind, and a bold attempt to put the concept of the commons on the map. Of course, readers of this blog will already know exactly where to locate it, but even for many specialists whose disciplines include commons, the idea is still strange. The conference wisely sought to propel the commons into the foreground by finding, er, common ground between the various kinds of commons, and using that joint strength to muscle into the debate.

That sounded eminently sensible to me, and is something I have been advocating in my own small way (not least on this blog) for some time. But on the ground, achieving this common purpose proved much harder than expected.

In my view, at least, this was down largely to the gulf of incomprehension that we discovered between those working with traditional commons - forests, water, fish etc. - and the digital commons - free software, open content, etc. Basically it seemed to come down to this: some of the former viewed the latter as part of the problem. That is, they were pretty hostile to technology, and saw their traditional commons as antithetical to that.

By contrast, I and others working in the area of the digital commons offered this as a way to preserve the traditional, analogue commons. In particular, as I mentioned after my talk at the conference (embedded below), the Internet offers one of the most powerful tools for fighting against those - typically big, rich global corporations - that seek to enclose physical commons.


I must say I came away from the Berlin conference a little despondent, because it was evident that forming a commons coalition would be much harder than I had expected. This contrasted completely with the energising effect of attending FSCONS 2010 in Gothenburg.

It's not hard to see why. At the Swedish conference, which has been running successfully for some years, and now attracts hundreds of participants, I was surrounded by extremely positive, energetic and like-minded people. When I gave my talk (posted below), I was conscious that, intentionally provocative as I was, my argument was essentially pushing against an open door: the audience, though highly critical in the best sense, were in broad agreement with my general logic.


Of course, that can make things too easy, which is dangerous if it becomes routine; but the major benefit of being confirmed in your prejudices in this way is that it encourages you to continue, and perhaps even to explore yet further. It has even inspired me to start posting a little more prolifically. You have been warned....

Follow me @glynmoody on Twitter or identi.ca.

30 October 2010

An Uncommon Conference on the Commons

Regular readers of this blog will know that the commons has been a major theme here for some years, since it offers an extremely fruitful way of looking at free software, open content and the other "opens". Recognition of the importance of the commons has been slow in coming, but an important moment came when the doyenne of commons studies, Elinor Ostrom, won the Nobel Prize for Economics last year:

Elinor Ostrom has challenged the conventional wisdom that common property is poorly managed and should be either regulated by central authorities or privatized. Based on numerous studies of user-managed fish stocks, pastures, woods, lakes, and groundwater basins, Ostrom concludes that the outcomes are, more often than not, better than predicted by standard theories. She observes that resource users frequently develop sophisticated mechanisms for decision-making and rule enforcement to handle conflicts of interest, and she characterizes the rules that promote successful outcomes.

And now, building on that momentum, we have the Berlin Commons Conference:

The conference seeks to bring together a diverse group of about 150 international and Germany- and European-based commoners, intellectuals, activists and policy makers. It also aims to enhance participation and self-organization; stewardship, cooperation and networking; and open, non-linear ways to search for solutions.

Over the course of two days, the conference will assess the range of existing and potential commons-based policy approaches; develop the fundamentals of a policy framework that supports the commons; and identify and explore specific strategic opportunities to advance commons-based approaches.

The conference announcement elaborates: “The simple yet powerful and complex question to be explored throughout the conference is: What does a commons-based policy framework look like? What already exists and what do we still need to develop to nurture and protect diverse sorts of commons?”

As you can see from the list of participants, yours truly will also be attending. Apparently, there will be a live video stream of some of the sessions: not sure whether mine will be one of them. If it is, you can see me spouting my common commons nonsense around 11am CET, 10am GMT.

Follow me @glynmoody on Twitter or identi.ca.

28 October 2010

The Limits of Openness?

I've been a long-time fan of the 3D modelling program Blender. No surprise, then, that I've also been delighted to see the Blender Foundation moving into content production to show what the software can do.

Specifically, it has produced a game (Yo Frankie!) and three animated films: Elephants Dream; Big Buck Bunny; and, most recently, Sintel. Aside from their aesthetic value, what's interesting about these films is that the content is released under a CC licence.

Here's a fascinating interview with Ton Roosendaal, head of the Blender Institute, leader of Blender development, and producer of Sintel. It's well worth reading, but there was one section that really caught my eye:

we keep most of our content closed until release. I’m a firm believer in establishing protective creative processes. In contrast to developers — who can function well individually online — an artist really needs daily and in-person feedback and stimulation.

We’ve done this now four times (three films and one game) and it’s amazing how teams grow in due time. But during this process they’re very vulnerable too. If you followed the blog you may have seen that we had quite harsh criticism on posting our progress work. If you’re in the middle of a process, you see the improvements. Online you only see the failures.

The cool thing is that a lot of tests and progress can be followed now perfectly and it suddenly makes more sense I think. Another complex factor for opening up a creative process is that people are also quite inexperienced when they join a project. You want to give them a learning curve and not hear all the time from our audience that it sucks. Not that it was that bad! But one bad criticism can ruin a day.

Those are reasonable, if not killer, arguments. But his last point is pretty inarguable:

One last thing on the “open svn” point: in theory it could work, if we would open up everything 100% from scratch. That then will give an audience a better picture of progress and growth. We did that for our game project and it was suited quite well for it. For film… most of our audience wants to get surprised more, not know the script, the dialogs, the twists. Film is more ‘art’ than games, in that respect.

That's fair: there's no real element of suspense for code, or even for games, as he points out. So this suggests that, for certain projects like these free content films, openness may be something that needs limiting in this way, purely for the end users' benefit.

Follow me @glynmoody on Twitter or identi.ca.

The British Library's Future: Shiny, Locked-Down Knowledge?

Yesterday, Computerworld UK carried an interesting report headed “British Library explores research technologies of the future”. Here's what it hopes to achieve:

On Open Enterprise blog.

27 October 2010

In Praise of Open Source Diversity

One of the great strengths of open source is that it offers users choice. You don't like one solution? Choose another. You don't like any solution? Write your own (or pay someone else to do it). Thus it is that many categories have several alternative offerings, all of them admirable applications, and all of them with passionate supporters.

On Open Enterprise blog.

Linux Embeds Itself Even Deeper

Because anyone can take Linux and use it as they wish without needing to ask permission (provided they comply with the licence), it ends up being used in lots of places that we rarely hear about. This contrasts with proprietary operating systems, which only get used if they are licensed directly, which means that the licensor always knows exactly what is going on - and can issue yet another boring press release accordingly.

On Open Enterprise blog.

25 October 2010

How is OpenStack Stacking Up?

You may have noticed there's a fair bit of interest in this cloud computing thing. You've probably also come across various articles suggesting this is the end of free software - and the world - as we know it, since cloud-based platforms render operating systems on servers and desktops largely moot.

On Open Enterprise blog.

22 October 2010

Jamie Love on What EU Must Do About ACTA

Jamie Love has been one of the key people writing about and fighting the worst aspects of ACTA. He's just posed a good question on Twitter:

Why haven't the EU civil society groups done more on patents in ACTA? It is quite possible to get patents out at this point.

He then followed up with this suggestion:

Some type of a focused letter on footnote 2 would be very helpful, right now. This could go either way.

That footnote, by the way, says the following:

For the purpose of this Agreement, Parties agree that patents do not fall within the scope of this Section.

"This section" is Section 2: Civil Enforcement, and so the suggestion seems to be that we push for that footnote to be accepted by all parties. So, what do people think, can we do something along these lines?

Follow me @glynmoody on Twitter or identi.ca.

20 October 2010

A (Final) Few Words on FRAND Licensing

The issue of Fair, Reasonable and Non-Discriminatory (FRAND) licensing has cropped up quite a few times in these pages. The last time I wrote about the subject was just last week, when I noted that the Business Software Alliance was worried that the imminent European Interoperability Framework (EIF) might actually require truly open standards, and so was pushing for FRAND instead.

On Open Enterprise blog.

19 October 2010

Want to Change German Copyright Law?

Of course you do - and here's your big chance. Dirk Riehle is not only the Professor for Open Source Software at the Friedrich-Alexander-University of Erlangen-Nürnberg (Germany orders these things better than we do), but he is also part of an expert council advising a “multilateral commission instituted by the German parliament to discuss and make recommendations on, well, Internet and digital society.”

On Open Enterprise blog.

15 October 2010

Why We've Learnt to Love the Labs

You may recall the global excitement a year ago, when Gmail finally came out of beta - after an astonishing five years:

On Open Enterprise blog.

14 October 2010

Microsoft Gives its Blessing to OpenOffice.org

On 13 April 1999, a press release appeared headed “Mindcraft study shows Windows NT server outperforms Linux.” The summary read: “Microsoft Windows NT server 2.5 times faster than Linux as a file server and 3.7 times faster as Web server.” One thing the press release failed to mention was the following, found in the study itself: “Mindcraft Inc. conducted the performance tests described in this report between March 10 and March 13, 1999. Microsoft Corporation sponsored the testing reported herein.”

On Open Enterprise blog.

13 October 2010

Is GCHQ Frighteningly Clueless or Fiendishly Cunning?

I'm very sceptical about the concept of “cyber attacks”. Not that I doubt that computer systems and infrastructure are attacked: it's just their packaging as some super-duper new “threat” that I find suspicious. It smacks of bandwagon-jumping at best, and at worst looks like an attempt by greedy security companies to drum up yet more business.

On Open Enterprise blog.

11 October 2010

Whatever the BSA Says, FRAND is no Friend of Europe

I see that my old mates the Business Software Alliance are a tad concerned that the European Commission might do something sensible with the imminent European Interoperability Framework (EIF):

On Open Enterprise blog.