22 November 2010

Jauchzet: Bach's Organ Music Free Online

A few months ago, Musopen ran a fundraiser on Kickstarter:

Musopen is a non-profit dedicated to providing copyright free music content: music recordings, sheet music and a music textbook. This project will use your donations to purchase and release music to the public domain. Right now, if you were to buy a CD of Beethoven's 9th symphony, you would not be legally allowed to do anything but listen to it. You wouldn't be able to share it, upload it, or use it as a soundtrack to your indie film - yet Beethoven has been dead for 183 years and his music is no longer copyrighted. There is a lifetime of music out there, legally in the public domain, but it has yet to be recorded and released to the public.

This is such an eminently sensible idea: releasing music into the public domain so that all can use it as they wish. I don't think this other project is public domain (anyone know?), but it's still a nice gesture:

Free downloads of the complete organ works of Johann Sebastian Bach, recorded by Dr. James Kibbie on original baroque organs in Germany, are offered on this site.

Here's where the money's coming from:

This project is sponsored by the University of Michigan School of Music, Theatre & Dance with generous support from Dr. Barbara Furin Sloat in honor of J. Barry Sloat. Additional support has been provided by the Office of Vice-President for Research, the University of Michigan.

It's another model that would be good to see utilised elsewhere, ideally with the results being put into the public domain. (Via @ulyssestone, @alexrossmusic)

Follow me @glynmoody on Twitter or identi.ca.

21 November 2010

No Art Please, You're Not British

I thought we had got beyond this daftness:

A cellist was held at Heathrow Airport and questioned for 8 hours this week. A terrorist suspect? False passport? Drug smuggling? If only it was so dramatic and spectacular. Her crime was coming to the UK with her cello, to participate in a musicology conference organised by the School of Music at the University of Leeds, and it was for this reason that Kristin Ostling was deported back to Chicago. What was the UK Borders Agency (UKBA) thinking? That she would sell her cello to earn some cash, or do a spot of moonlighting at some secretive classical music gig, while she was here?

The Conference organiser, Professor Derek Scott informed the Manifesto Club that “She was not being paid a penny for this, but these zealous officers decided that playing a cello is work and, paid or unpaid, she could not be allowed in.”

Lovely logic here: if you are a professional cellist it follows that putting bow to string is work, and therefore not permitted according to the terms of your visa. And as the article explains, it's the same for painters and photographers: if you dare to create a masterpiece here in the UK, you might end up being deported, and blacklisted.

Now, call me old fashioned, but it seems to me that we should actually be *begging* artists to come here and create: it not only enriches the UK's cultural ecosystem, it also makes it more likely that other artists and non-artists will want to come to the country to see where these works were spawned, bringing with them all that valuable touristic dosh that everyone seems to be scrabbling after these days.

But the problem really runs deeper than this simple loss of earnings. What is really disturbing is the crass way the UK Borders Agency equates artistic creation with work: if you act as an artist - even if you are not paid - you are theoretically doing something that should have a price on it. This is really part and parcel of the thinking that everything should be copyrighted and patented - that you can't do stuff for free, or simply give away your intellectual creations.

It's a sick viewpoint that leads to the kind of shaming situations described above. And of course, in the usual way, the people imposing these absurd practices haven't thought things through. After all, if musicians can't play, or artists paint, when they come to the UK, surely that must mean by the same token that visiting foreign mathematicians can't manipulate formulae, and philosophers are forbidden to think here...?

Follow me @glynmoody on Twitter or identi.ca.

Digital Society vs. Digital Economy Act

Here's an interesting move:

Britons will be forced to apply online for government services such as student loans, driving licences, passports and benefits under cost-cutting plans to be unveiled this week.

Officials say getting rid of all paper applications could save billions of pounds. They insist that vulnerable groups will be able to fill in forms digitally at their local post offices.

The plans are likely to infuriate millions of people. Around 27% of households still have no internet connection at home and six million people aged over 65 have never used the web.

Lord Oakeshott, a Liberal Democrat Treasury spokesman, said: "We must cut costs and boost post offices as much as we possibly can, but many millions of people – not just pensioners – are not online and never will be. They must never be made to feel the state treats them as second-class citizens."

As an out-and-out technophile, I have a lot of sympathy with this move. After all, it's really akin to moving everyone to electricity. But it does mean that strenuous efforts must be made to ensure that everyone really has ready access to the Internet.

And that, of course, is a bit of a problem when the ultimate sanction of the Digital Economy Act is to block people's access (even if the government tries to deny that it will "disconnect" people - it amounts to the same thing, whatever the words). If, as this suggests and as I think is right, the Internet becomes an absolutely indispensable means of exercising key rights - like being able to communicate with the government - then taking those rights away becomes even more problematic.

So I predict that the more the present coalition pushes in this direction, the more difficulties it will have down the line with courts unimpressed with people being disadvantaged so seriously for allegedly infringing on a government-granted monopoly: this makes a response that was never proportionate to begin with even more disproportionate.

Follow me @glynmoody on Twitter or identi.ca.

20 November 2010

Tim BL: Open Standards Must be Royalty-Free

Yesterday I went along to the launch of the next stage of the UK government's open data initiative, which involved releasing information about all expenditure greater than £25,000 (I'll be writing more about this next week). I realised that this was a rather more important event than I had initially thought when I found myself sitting one seat away from Sir Tim Berners-Lee (and the intervening seat was occupied by Francis Maude, Minister for the Cabinet Office and Paymaster General.)

Sir Tim came across as a rather archetypal professor in his short presentation: knowledgeable and passionate, but slightly unworldly. I get the impression that even after 20 years he's still not really reconciled to his fame, or to the routine expectation that he will stand up and talk in front of big crowds of people.

He seems much happier with the written word, as evidenced by his excellent recent essay in Scientific American, called "Long Live the Web". It's a powerful defence of the centrality of the Web to our modern way of life, and of the key elements that make it work so well. Indeed, I think it rates as one of the best such pieces I've read, written by someone uniquely well-placed to make the case.

But I want to focus on just one aspect here, because I think it's significant that Berners-Lee spends so much time on it. It's also timely, because it concerns an area that is under great pressure currently: truly open standards. Here's what Berners-Lee writes on the subject:

The basic Web technologies that individuals and companies need to develop powerful services must be available for free, with no royalties. Amazon.com, for example, grew into a huge online bookstore, then music store, then store for all kinds of goods because it had open, free access to the technical standards on which the Web operates. Amazon, like any other Web user, could use HTML, URI and HTTP without asking anyone’s permission and without having to pay. It could also use improvements to those standards developed by the World Wide Web Consortium, allowing customers to fill out a virtual order form, pay online, rate the goods they had purchased, and so on.

By “open standards” I mean standards that can have any committed expert involved in the design, that have been widely reviewed as acceptable, that are available for free on the Web, and that are royalty-free (no need to pay) for developers and users. Open, royalty-free standards that are easy to use create the diverse richness of Web sites, from the big names such as Amazon, Craigslist and Wikipedia to obscure blogs written by adult hobbyists and to homegrown videos posted by teenagers.

Openness also means you can build your own Web site or company without anyone’s approval. When the Web began, I did not have to obtain permission or pay royalties to use the Internet’s own open standards, such as the well-known transmission control protocol (TCP) and Internet protocol (IP). Similarly, the Web Consortium’s royalty-free patent policy says that the companies, universities and individuals who contribute to the development of a standard must agree they will not charge royalties to anyone who may use the standard.

There's nothing radical or new there: after all, as he says, the W3C specifies that all its standards must be royalty-free. But it's a useful re-statement of that policy - and especially important at a time when many are trying to paint royalty-free standards as hopelessly unrealistic for open standards. The Web's continuing success is the best counter-example we have to that view, and Berners-Lee's essay is a splendid reminder of that fact. Do read it.
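To make Berners-Lee's point concrete, consider just how little is needed to use those standards: anyone can open a TCP connection and speak HTTP to a public web server, without asking anyone's permission or paying anyone a penny. Here's a minimal sketch in Python (my illustration, not anything from the essay; example.org is just a placeholder host):

import socket

# Fetch a page using nothing but the open, royalty-free standards
# Berners-Lee describes: TCP to carry the bytes, HTTP to frame them.
host = "example.org"
with socket.create_connection((host, 80)) as sock:
    request = (
        "GET / HTTP/1.1\r\n"
        "Host: " + host + "\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    sock.sendall(request.encode("ascii"))

    # Read until the server closes the connection.
    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk

# First line of the reply, e.g. "HTTP/1.1 200 OK"
print(response.decode("utf-8", errors="replace").splitlines()[0])

No gatekeeper, no royalty clearing-house: that permissionless quality is exactly what the essay argues must be preserved.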

Follow me @glynmoody on Twitter or identi.ca.

18 November 2010

Microsoft: "Linux at the End of its Life Cycle"

Regular readers of this blog will know that I've tracked the rather painful history of attempts to increase the deployment of free software in Russia, notably in its schools. Well, that saga continues, it seems, with doubts being expressed about the creation of a Russian national operating system based on GNU/Linux:

Иногда приходится слышать, что идея национальной программной платформы содержит в себе логическое противоречие. Ведь если такая платформа действительно будет создаваться на базе СПО, то такое программное обеспечение будет более чем на 90% произведено не в России, а за рубежом. Соответственно, и НПП у нас получится, скорее, какая-нибудь американо-германо-индийская, а не российская.

[Translation: Sometimes one hears that the idea of a national software platform contains a logical contradiction. After all, if such a platform really is created on the basis of free software, then more than 90% of that software will have been produced not in Russia but abroad. Accordingly, our national software platform will turn out to be some kind of American-German-Indian affair, rather than a Russian one.]

That story will doubtless run and run. But what interested me was the accompanying quote from Nikolai Pryanishnikov, president of Microsoft in Russia; it's a corker:

"Компания Microsoft выступает за технологическую нейтральность и считает, что выбор ОС должен быть обусловлен исключительно качествами самой ОС, ее экономической эффективностью, стоящими практическими задачами, безопасностью, а не идеологическими соображениями.

С нашей точки зрения, наиболее эффективным для развития инновационной экономики в стране представляется не создание аналога существующих ОС, на что уйдут огромные средства и много времени, а взяв за основу наиболее распространенную ОС, проверенную российскими спецслужбами, создавать собственные приложения и решения, вкладывая при этом средства в перспективные научные российские разработки. Нужно иметь в виду, что Linux не является российской ОС и, кроме того, находится в конце своего жизненного цикла".

[Google Translate: "Microsoft supports technological neutrality and considers that the choice of OS should be caused solely as the greatest operating system, its economic efficiency, standing practical problems, safety, rather than ideological considerations.

From our point of view, the most effective for the development of an innovative economy in the country seems not to create an analogue of the existing OS, which will take huge amounts of money and time, and taking as basis the most popular operating systems, proven by Russian security services, to create custom applications and solutions, investing in this funds in promising scientific Russian developments. We must bear in mind that Linux is not a Russian OS and, moreover, is at the end of its life cycle."]

The idea that "Linux is at the end of its life cycle" is rather rich coming from the vendor of a platform that is increasingly losing market share, both at the top and bottom end of the market, while Linux just gets stronger. I'd wager that variants of Linux will be around rather longer than Windows.

Update: the Russian publication CNews Open, from which the story above was taken, points out that Russia is aiming to create a national software platform, not a national operating system. Quite what this means seems to be somewhat unclear:

даже российским участникам сообщества сегодня по-прежнему трудно понять, что конкретно представляет собой российская национальная программная платформа

[Translation: even the Russian members of the community still find it hard to understand today what exactly a Russian national software platform would actually be.]

Let's hope things become a little clearer in due course: with its wealth of top-class programmers, Russia has the potential to become a key player in the free software world.

Follow me @glynmoody on Twitter or identi.ca.

A Peek Inside the EU's Digital Inner Circle

The European Commission looms large in these pages. But despite that importance, it remains - to me, at least - an opaque beast. Hugely important decisions are emitted by it, as the result of some long and deeply complex process, but the details remain pretty mysterious.

On Open Enterprise blog.

17 November 2010

Can You Feel the Tension?

There's an important conference taking place in Brussels next week: "Tensions between Intellectual Property Rights and the ICT standardisation process: reasons and remedies - 22 November 2010". It's important because it has a clear bearing on key documents like the forthcoming European Interoperability Framework v2.

It all sounds jolly reasonable:

Key ICT standards are perceived by many as critical technology platforms with a strong public interest dimension. However, concerns are voiced that Intellectual Property Rights (IPRs), and their exclusivity potential, may hinder or prevent standardisation.

The European Commission and the European Patent Office (EPO) are organising a conference to address some specific issues on patents and ICT standards: are today’s IPR features still compatible with fast moving markets and the very complex requirements of ICT standardisation in a global knowledge economy environment? Where are the problems that we can fix?

Unfortunately, I can't go - actually, better make that *fortunately* I can't go, because upon closer inspection the agenda [.pdf] shows that this is a conference with a clear, er, agenda: that is, the outcome has already been decided.

You can tell just by its framing: this is "a conference to address some specific issues on patents and ICT standards". ICT is mostly about software, and yet software cannot be patented "as such". So, in a sense, this ought to be a trivial conference lasting about five minutes. The fact that it isn't shows where things are going to head: towards accepting and promoting patents in European standards, including those for software.

That's not really surprising, given who is organising it - the European Commission and the European Patent Office (EPO). The European Commission has always been a big fan of software patents; and the EPO is hardly likely to be involved with a conference that says: "you know, we *really* don't need all these patents in our standards."

Of course, the opposite result - that patents are so indescribably yummy that we need to have as many as possible in our European ICT standards - must emerge naturally and organically. And so to ensure that natural and organic result, we have a few randomly-selected companies taking part.

For example, there's a trio of well-known European companies: Nokia, Ericsson and Microsoft. By an amazing coincidence - as an old BBC story reminds us - all of them were fervent supporters of the European legislation to make software patentable:

Big technology firms, such as Philips, Nokia, Microsoft, Siemens, and telecoms firm Ericsson, continued to voice their support for the original bill.

So, no possible bias there, then.

Then there are a couple of outfits you may have heard of - IBM and Oracle, both noted for loving software patents slightly more than life itself. So maybe a teensy bit of bias there.

But wait, you will say: you are being totally unfair. After all, is there not an *entire* massive one-hour session entitled "Open source, freely available software and standardisation"? (although I do wonder what on earth this "freely available software" could be - obviously nothing so subversive as free-as-in-freedom software.)

And it's true, that session does indeed exist; here's part of the description:

This session will explore potential issues around standardisation and the topic of open source software and free licences. We will look at examples of how standards are successfully implemented in open source. We will also consider licensing issues that may exist regarding the requirement to pay royalties for patents present in standards, as well as other licensing terms and conditions in relation to the community approach common in open source and free software technology development.

But what's the betting that those "examples of how standards are successfully implemented in open source" will include rare and atypical cases where FRAND licences have been crafted into a free software compatible form, and which will then be used to demonstrate that FRAND is the perfect solution for ICT licensing in Europe?

Luckily, we have Karsten Gerloff from the FSFE to fight against the software patent fan club, and tell it as it is. Pity he's on his own on this though - and no, poor Erwin Tenhumberg does not count. He may be "Open Source Programme Manager, SAP", but SAP is one of the fiercest proponents of patenting software in Europe, as I've discussed a couple of times.

So this leaves us with Karsten against the collective might of the European Commission, EPO, Microsoft, Nokia, Ericsson, IBM, Oracle and SAP: clearly there'll be some of that "tension", as the conference title promises, but a fair fight conducted on a level playing-field? Not so much....

Follow me @glynmoody on Twitter or identi.ca.

16 November 2010

Will Mark Zuckerberg Prove He's Open Source's BFF?

Although I don't use it much myself, I've heard that Facebook is quite popular in some quarters. This makes its technological moves important, especially when they impact free software. Yesterday, we had what most have seen as a pretty big announcement from the company that does precisely that:

On Open Enterprise blog.

15 November 2010

A Great Indian Takeaway

As you may have noticed, I've been writing quite a lot about the imminent European Interoperability Framework (EIF), and the extent to which it supports true open standards that can be implemented by all. Of course, that's not just a European question: many governments around the world are grappling with exactly the same issue. Here's a fascinating result from India that has important lessons for the European Commission as they finalise EIF v2.

On Open Enterprise blog.

Beyond a Joke: On the Road to China

By now, you will have read all about the #twitterjoketrial. But you may not have come across this story:


On 17 October, Wang Yi retweeted a post by Hua Chunhui who satirically challenged the anti-Japanese angry youths in China by inviting them to destroy the Japan pavilion in Shanghai Expo. She added a comment, “Angry youth, come forward and break the pavilion!” in her retweet.

The police interpreted her satire as a public order disturbance and asked the Labour Re-education committee to sentence her to a year in a labour camp, from November 15 2010 to November 9 2011, in the Zhenzhou Shibali river labour re-education camp.

People will point out one year in a labour camp is very different from the few thousand quid fine meted out to Paul Chambers, and I of course would agree: the UK is not China.

But the *attitude* - that humour or satire is a "threat" of some kind, and must be punished in the courts - is shockingly similar. And that is what is most disturbing for me here in the UK about the #twitterjoketrial case: the authorities here are now *thinking* like their Chinese counterparts (who must be delighted to have this high-profile backing for their approach from those hypocritical Westerners). We are on the road to China.

Is this really the journey we want to take? Weren't we trying to get China to come to us?

Follow me @glynmoody on Twitter or identi.ca.

German Court: Links Can Infringe on Copyright

Here's one of those tedious court decisions that show the judges don't really get this new-fangled Internet thing:

Pünktlich zum einem der vielen 20. Geburtstage des Word Wide Webs wurde jetzt ein Urteil des Bundesgerichtshof veröffentlicht, in dem festgestellt wird dass ein Link eine Urheberrechtsverletzung sein kann (Urteil, .pdf). In behandelten Rechtsstreit hatte der Kläger eine Website mit Stadtplänen betrieben, die so gestaltet war, dass man immer nur über ein Suchformular auf der Startseite zur gewünschten Unterseite kommen sollte.

[Translation: Just in time for one of the World Wide Web's many 20th birthdays, a ruling by the Bundesgerichtshof has been published which finds that a link can constitute a copyright infringement (ruling, .pdf). In the dispute in question, the plaintiff operated a website with city maps, designed so that the desired subpage could only ever be reached via a search form on the home page.]

I mean, come on: this isn't about copyright - the content is freely available; it's about how you get to that copyrighted material.

Thus the real issue here seems to be that a site owner is worried about losing advertising revenue if people can skip over the home page. But the solution is simple: just put ads on the inner pages of the site, too. That way, you get the best of both worlds: directly-addressable content that also generates revenue. Is that so hard?
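To show how little work that fix would involve, here's a minimal sketch (my illustration, nothing to do with the site in the ruling): with one shared page template, every directly-addressable inner page carries the same ad slot as the home page.

from flask import Flask, render_template_string

app = Flask(__name__)

# One shared template: the ad block appears on every page, so deep
# links cost the site nothing in advertising exposure. Routes and
# markup are invented, purely for illustration.
BASE = """
<html><body>
  <div class="ad">[advertising block, shown on every page]</div>
  <h1>{{ title }}</h1>
  <p>{{ body }}</p>
</body></html>
"""

@app.route("/")
def home():
    return render_template_string(BASE, title="City maps", body="Search for a map.")

@app.route("/maps/<city>")
def city_map(city):
    # A deep link straight to the content still shows the ad.
    return render_template_string(BASE, title="Map of " + city, body="The map itself.")

if __name__ == "__main__":
    app.run()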

Follow me @glynmoody on Twitter or identi.ca.

Microsoft: Super - But Not Quite Super Enough

Once upon a time, the Netcraft Web server market share survey was eagerly reported every month because it showed open source soundly trouncing its proprietary rivals. We don't hear much about that survey these days - not because things have changed, but for that very reason: it's now just become a boring fact of life that Apache has always been the top Web server, still is, and probably will be for the foreseeable future. I think we're fast approaching that situation with the top500 supercomputing table.

On Open Enterprise blog.

12 November 2010

Opening up Knowledge

I know you probably didn't notice, but I posted very little on this blog last week - nothing, in fact. This was not down to me going “meh” for a few days, but rather because I was over-eager in accepting offers to talk at conferences that were almost back to back, with the result that I had little time for much else during that period.

On Open Enterprise blog.

Time for a "Turing/Berners-Lee" Day?

On this day, in 1937:

Alan Turing’s paper entitled "On Computable Numbers with an Application to the Entscheidungs-problem" appeared on November 12, 1937, somewhat contemporaneously with Konrad Zuse’s work on the first of the Z machines in Germany, John Vincent Atanasoff ‘s work on the ABC, George Stibitz’s work on the Bell Telephony relay machine, and Howard Aiken’s on the Automatic Sequence Controlled Calculator.

Later renamed the Turing Machine, this abstract engine provided the fundamental concepts of computers that the other inventors would realise independently. So Turing provided the abstraction that would form the basic theory of computability for several decades, while others provided the pragmatic means of computation.
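For anyone who hasn't met the abstraction, a Turing machine is remarkably simple to sketch: a tape, a read/write head, and a table mapping (state, symbol) to (new symbol, move, new state). Here's a toy simulator in Python (my illustration, of course, not anything from the paper), running a trivial machine that inverts a string of bits and halts:

# A minimal Turing machine: sparse tape, head position, and a
# transition table of (state, symbol) -> (new symbol, move, new state).
def run(tape, transitions, state="start", blank=" "):
    cells = dict(enumerate(tape))  # tape is unbounded in both directions
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        symbol, move, state = transitions[(state, symbol)]
        cells[head] = symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip()

# Example machine: flip every bit, halt on the first blank cell.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run("1101", flip))  # prints "0010"

The abstraction's power is that, given a big enough table and tape, a machine like this can express any computation - which is why it could serve as the basic theory of computability for decades.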

And on this day a little later, in 1990:

The attached document describes in more detail a Hypertext project.

HyperText is a way to link and access information of various kinds as a web of nodes in which the user can browse at will. It provides a single user-interface to large classes of information (reports, notes, data-bases, computer documentation and on-line help). We propose a simple scheme incorporating servers already available at CERN.

Maybe we should declare this date the Turing-Berners-Lee Day?

Follow me @glynmoody on Twitter or identi.ca.

Google Bowls a Googly

One of the most shocking aspects of Oracle's lawsuit against Google alleging patent and copyright infringement was its unexpected nature. The assumption had been that Google was a big company with lots of lawyers and engineers, and had presumably checked out everything before proceeding with the Android project. And then suddenly it looked as if it had made the kind of elementary mistakes a newbie startup might commit.

On Open Enterprise blog.

11 November 2010

A (Digital) Hymn to Eric Whitacre

Eric Whitacre is that remarkable thing: a composer able to write classical music that is at once completely contemporary and totally approachable even at the first hearing.

Just as, er, noteworthy is his total ease with modern technology. His website is undoubtedly one of the most attractive ever created for a composer, and uses the full panoply of the latest Internet technologies to support his music and to interact with his audience, including a blog with embedded YouTube videos, and links to Twitter and Facebook accounts.

Perhaps the best place to get a feel for his music and his amazing facility with technology is the performance of his piece "Lux Aurumque" by a "virtual choir" that he put together on YouTube (there's another video where the composer explains some of the details and how this came about.)

Against that background, it should perhaps be no surprise that on his website he has links to pages about most (maybe all?) of his compositions that include not only fascinating background material but complete embedded recordings of the pieces.

Clearly, Whitacre has no qualms about people being able to hear his music for free, since he knows that this is by far the best way to get the message out about it and to encourage people to perform it for themselves. The countless comments on these pages are testimony to the success of that approach: time and again people speak of being entranced when they heard the music on his web site - and then badgering local choirs to sing the pieces themselves.

It's really good to see a contemporary composer who gets what digital music is about - seeding live performances - and understands that making it available online can only increase his audience, not diminish it. And so against that background, the story behind one of his very best pieces, and probably my current favourite, "Sleep", is truly dispiriting.

Originally, it was to have been a setting of Robert Frost’s "Stopping By Woods on a Snowy Evening". The composition went well:

I took my time with the piece, crafting it note by note until I felt that it was exactly the way I wanted it. The poem is perfect, truly a gem, and my general approach was to try to get out of the way of the words and let them work their magic.

But then something terrible happened:

And here was my tragic mistake: I never secured permission to use the poem. Robert Frost’s poetry has been under tight control from his estate since his death, and until a few years ago only Randall Thompson (Frostiana) had been given permission to set his poetry. In 1997, out of the blue, the estate released a number of titles, and at least twenty composers set and published Stopping By Woods on a Snowy Evening for chorus. When I looked online and saw all of these new and different settings, I naturally (and naively) assumed that it was open to anyone. Little did I know that the Robert Frost Estate had shut down ANY use of the poem just months before, ostensibly because of this plethora of new settings.

Thanks to copyright law, this is the prospect that Whitacre faced:

the estate of Robert Frost and their publisher, Henry Holt Inc., sternly and formally forbid me from using the poem for publication or performance until the poem became public domain in 2038.

I was crushed. The piece was dead, and would sit under my bed for the next 37 years because of some ridiculous ruling by heirs and lawyers.

Fortunately for him - and for us - he came up with an ingenious way of rescuing his work:

After many discussions with my wife, I decided that I would ask my friend and brilliant poet Charles Anthony Silvestri (Leonardo Dreams of His Flying Machine, Lux Aurumque, Nox Aurumque, Her Sacred Spirit Soars) to set new words to the music I had already written. This was an enormous task, because I was asking him to not only write a poem that had the exact structure of the Frost, but that would even incorporate key words from “Stopping”, like ‘sleep’. Tony wrote an absolutely exquisite poem, finding a completely different (but equally beautiful) message in the music I had already written. I actually prefer Tony’s poem now…

Not only that:

My setting of Robert Frost’s Stopping By Woods on a Snowy Evening no longer exists. And I won’t use that poem ever again, not even when it becomes public domain in 2038.

So, thanks to a disproportionate copyright term, a fine poem will never be married with sublime music that was originally written specially for it. This is the modern-day reality of copyright, originally devised for "the encouragement of learning", but now a real obstacle to the creation of new masterpieces.

Follow me @glynmoody on Twitter or identi.ca.

10 November 2010

Xanadu and the Digital Pleasure-Dome

I consider myself fortunate to have been around at the time of the birth of the Internet as a mass medium, which I date to the appearance of version 0.9 of Netscape Navigator in October 1994.

This gives me a certain perspective on things that happen online, since I can often find parallels from earlier times, but there are obviously many people who have been following things even longer, and whose perspective is even deeper. One such is Mark Pesce, who also happens to be an extremely good writer, which makes his recent blog posting about the "early days" even more worth reading:

Back in the 1980s, when personal computers mostly meant IBM PCs running Lotus 1-2-3 and, perhaps, if you were a bit off-center, an Apple Macintosh running Aldus Pagemaker, the idea of a coherent and interconnected set of documents spanning the known human universe seemed fanciful. But there have always been dreamers, among them such luminaries as Douglas Engelbart, who gave us the computer mouse, and Ted Nelson, who coined the word ‘hypertext’. Engelbart demonstrated a fully-functional hypertext system in December 1968, the famous ‘Mother of all Demos’, which framed computing for the rest of the 20th century. Before man had walked on the Moon, before there was an Internet, we had a prototype for the World Wide Web. Nelson took this idea and ran with it, envisaging a globally interconnected hypertext system, which he named ‘Xanadu’ – after the poem by Coleridge – and which attracted a crowd of enthusiasts intent on making it real. I was one of them. From my garret in Providence, Rhode Island, I wrote a front end – a ‘browser’ if you will – to the soon-to-be-released Xanadu. This was back in 1986, nearly five years before Tim Berners-Lee wrote a short paper outlining a universal protocol for hypermedia, the basis for the World Wide Web.

Fascinating stuff, but it was the next paragraph that really made me stop and think:

Xanadu was never released, but we got the Web. It wasn’t as functional as Xanadu – copyright management was a solved problem with Xanadu, whereas on the Web it continues to bedevil us – and links were two-way affairs; you could follow the destination of a link back to its source. But the Web was out there and working for thousands of people by the middle of 1993, while Xanadu, shuffled from benefactor to benefactor, faded and finally died. The Web was good enough to get out there, to play with, to begin improving, while Xanadu – which had been in beta since the late 1980s – was never quite good enough to be released. ‘The Perfect is the Enemy of the Good’, and nowhere is it clearer than in the sad story of Xanadu.

The reason copyright management was a "solved problem" with Xanadu was something called "transclusion", which basically meant that when you quoted or copied a piece of text from elsewhere, it wasn't actually a copy, but the real thing *embedded* in your Xanadu document. This meant that it was easy to track who was doing what with your work - which made copyright management a "solved problem", as Pesce says.
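Here's a toy sketch of the difference, in Python (my own illustration - not Xanadu's actual design): a copy duplicates the bytes and forgets where they came from, while a transclusion stores only a pointer to a span of the source document, so the system always knows exactly where every reuse lives.

from dataclasses import dataclass

# A stand-in for the global document store; contents are invented.
DOCS = {"some-essay": "The woods are lovely, dark and deep..."}

@dataclass
class Copy:
    text: str  # duplicated bytes; the origin is forgotten

@dataclass
class Transclusion:
    doc_id: str  # pointer back to the source document
    start: int
    end: int

    def resolve(self):
        # The "quote" is the original text itself, fetched on demand -
        # which is why every use of a work remains trackable.
        return DOCS[self.doc_id][self.start:self.end]

quote = Transclusion("some-essay", 0, 20)
print(quote.resolve())  # prints "The woods are lovely"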

I already knew this, but Pesce's juxtaposition with the sloppy Web made me realise what a narrow escape we had. If Xanadu had been good enough to release, and if it had caught on sufficiently to establish itself before the Web had arrived, we would probably be living in a very different world.

There would be little of the creative sharing that underlies so much of the Internet - in blogs, Twitter, Facebook, YouTube. Instead, Xanadu's all-knowing transclusion would allow copyright holders to track down every single use of their content - and to block it just as easily.

I've always regarded Xanadu's failure as something of a pity - a brilliant idea before its time. But I realise now that in fact it was actually a bad idea precisely of its time - and as such, completely inappropriate for the amazing future that the Web has created for us instead. If we remember Xanadu, it must be as a warning of how we nearly lost the stately pleasure-dome of digital sharing before it even began.

Follow me @glynmoody on Twitter or identi.ca.

Microsoft Demonstrates why FRAND Licensing is a Sham

A little while back I was pointing out how free software licences aren't generally compatible with Fair, Reasonable and Non-Discriminatory (FRAND) licensing, and why it would be hugely discriminatory if the imminent European Interoperability Framework v2 were to opt for FRAND when it came to open standards, rather than insisting on restriction-free (RF) licensing.

I noted how FRAND conditions are impossible for licences like the GNU GPL, since the latter cannot accommodate per-copy licensing fees on software that may be copied freely. As I commented there, some have suggested that there are ways around this - for example, if a big open source company like Red Hat pays a one-off charge. But that pre-supposes that licence holders would want to accommodate free software in this way: if they simply refuse to make this option available, then once again licences like the GNU GPL are simply locked out from using that technology - something that would be ridiculous for a European open standard.

Now, some may say: “ah well, this won't happen, because the licensing must be fair and reasonable”: but that then raises the question of what counts as fair and reasonable. It also assumes that licensors will always want to act fairly and reasonably themselves - that they won't simply ignore that condition. As it happens, we now have some pretty stunning evidence that this can't be taken for granted.

On Open Enterprise blog.

09 November 2010

A Patent No-brainer, Mr Willetts

There has been understandable excitement over David Cameron's announcement - out of the blue - that the UK government would be looking at copyright law:

On Open Enterprise blog.

Who's Lobbying Whom?

One of the frustrating things about being on the side of right, justice, logic and the rest is that all of these are trumped by naked insider power - just look at ACTA, which is a monument to closed-door deals that include rich and powerful industry groups, but expressly exclude the little people like you and me.

Against that background, it becomes easy to understand why Larry Lessig decided to move on from promoting copyright reform to tackling corruption in the US political machine. The rise of great sites like the Sunlight Foundation, whose tagline is "Making Government Transparent and Accountable", is further evidence of how much effort is going into this in the US.

The UK is lagging somewhat, despite the fact that in terms of pure open data from government sources, we're probably "ahead". But it's clear that more and more people are turning their thoughts to this area - not least because they have made the same mental journey as Lessig: we've got to do this if we are to counter the efforts of big business to get what they want regardless of whether it's right, fair or even sensible.

Here's a further sign of progress on this side of the pond:


We are excited to announce the Who’s Lobbying site launches today! The site opens with an analysis of ministerial meetings with outside interests, based on the reports released by UK government departments in October.

That analysis consists of treemaps - zoomable representations of how much time is spent with various organisations and their lobbyists:

For example, the treemap shows about a quarter of the Department of Energy and Climate Change meetings are with power companies. Only a small fraction are with environmental or climate change organisations.
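For anyone curious about the mechanics, a treemap simply divides a rectangle into tiles whose areas are proportional to each item's value. Here's a minimal single-level "slice" layout in Python (my sketch with invented meeting counts - not Who's Lobbying's actual code; real treemaps recurse into each tile and alternate the slicing direction):

# Slice a rectangle into strips, one per item, with each strip's
# area proportional to that item's share of the total.
def treemap(items, x, y, w, h, vertical=True):
    total = float(sum(value for _, value in items))
    rects, offset = [], 0.0
    for name, value in items:
        frac = value / total
        if vertical:  # slice left-to-right
            rects.append((name, x + offset, y, w * frac, h))
            offset += w * frac
        else:         # slice top-to-bottom
            rects.append((name, x, y + offset, w, h * frac))
            offset += h * frac
    return rects

# Invented numbers, for illustration only.
meetings = [("power companies", 25), ("finance", 18),
            ("environmental groups", 4), ("everyone else", 53)]

for name, rx, ry, rw, rh in treemap(meetings, 0, 0, 100, 60):
    print("%-20s at (%5.1f, %4.1f) size %5.1f x %4.1f" % (name, rx, ry, rw, rh))

A quarter of the meetings means a quarter of the area, which is exactly why the power companies' dominance of the Department of Energy and Climate Change chart leaps out at a glance.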

It's still a little clunky at the moment, but it gives a glimpse of what might be possible: a quick and effortless consolidated picture of who's getting chummy with whom. As the cliché has it, knowledge is power, and that's certainly the case here: the more we can point to facts about disproportionate time spent with those backing one side of arguments, the easier it will be to insist on an equal hearing. And once that happens, we will be halfway there; after all, we *do* have right on our side...

Follow me @glynmoody on Twitter or identi.ca.

Is it Time for Free Software to Move on?

A remarkable continuity underlies free software, going all the way back to Richard Stallman's first programs for his new GNU project. And yet within that continuity, there have been major shifts: are we due for another such leap?

On The H Open.

08 November 2010

A Tale of Two Conferences

I was invited to give a talk at two recent conferences, the Berlin Commons Conference, and FSCONS 2010. It's generally a pleasure to accept these invitations, although I must confess that I found two major conferences with only two days between them a trifle demanding in terms of mental and physical stamina.

Indeed, both conferences were extremely stimulating, and I met many interesting people at each. However, more than anything, I was struck by the huge and fundamental differences between them.

The Berlin Commons Conference was essentially the first of its kind, and a bold attempt to put the concept of the commons on the map. Of course, readers of this blog will already know exactly where to locate it, but even for many specialists whose disciplines include commons, the idea is still strange. The conference wisely sought to propel the commons into the foreground by finding, er, common ground between the various kinds of commons, and using that joint strength to muscle into the debate.

That sounded eminently sensible to me, and is something I have been advocating in my own small way (not least on this blog) for some time. But on the ground, achieving this common purpose proved much harder than expected.

In my view, at least, this was down largely to the gulf of incomprehension that we discovered between those working with traditional commons - forests, water, fish etc. - and the digital commons - free software, open content, etc. Basically it seemed to come down to this: some of the former viewed the latter as part of the problem. That is, they were pretty hostile to technology, and saw their traditional commons as antithetical to that.

By contrast, I and others working in the area of the digital commons offered this as a way to preserve the traditional, analogue commons. In particular, as I mentioned after my talk at the conference (embedded below), the Internet offers one of the most powerful tools for fighting against those - typically big, rich global corporations - that seek to enclose physical commons.


I must say I came away from the Berlin conference a little despondent, because it was evident that forming a commons coalition would be much harder than I had expected. This contrasted completely with the energising effect of attending FSCONS 2010 in Gothenburg.

It's not hard to see why. At the Swedish conference, which has been running successfully for some years, and now attracts hundreds of participants, I was surrounded by extremely positive, energetic and like-minded people. When I gave my talk (posted below), I was conscious that intentionally provocative as I was, my argument was essentially pushing against an open door: the audience, though highly critical in the best sense, were in broad agreement with my general logic.


Of course, that can make things too easy, which is dangerous if it becomes routine; but the major benefit of being confirmed in your prejudices in this way is that it encourages you to continue, and perhaps even to explore yet further. It has even inspired me to start posting a little more prolifically. You have been warned....

Follow me @glynmoody on Twitter or identi.ca.

30 October 2010

An Uncommon Conference on the Commons

Regular readers of this blog will know that the commons has been a major theme here for some years, since it offers an extremely fruitful way of looking at free software, open content and the other "opens". Recognition of the importance of the commons has been slow coming, but an important moment was when the doyenne of commons studies, Elinor Ostrom, won the Nobel Prize for Economics last year:

Elinor Ostrom has challenged the conventional wisdom that common property is poorly managed and should be either regulated by central authorities or privatized. Based on numerous studies of user-managed fish stocks, pastures, woods, lakes, and groundwater basins, Ostrom concludes that the outcomes are, more often than not, better than predicted by standard theories. She observes that resource users frequently develop sophisticated mechanisms for decision-making and rule enforcement to handle conflicts of interest, and she characterizes the rules that promote successful outcomes.

And now, building on that momentum, we have the Berlin Commons Conference:

The conference seeks to bring together a diverse group of about 150 international and Germany- and European-based commoners, intellectuals, activists and policy makers. It also aims to enhance participation and self-organization; stewardship, cooperation and networking; and open, non-linear ways to search for solutions.

Over the course of two days, the conference will assess the range of existing and potential commons-based policy approaches; develop the fundamentals of a policy framework that supports the commons; and identify and explore specific strategic opportunities to advance commons-based approaches.

The conference announcement elaborates: “The simple yet powerful and complex question to be explored throughout the conference is: What does a commons-based policy framework look like? What already exists and what do we still need to develop to nurture and protect diverse sorts of commons?”

As you can see from the list of participants, yours truly will also be attending. Apparently, there will be a live video stream of some of the sessions: not sure whether mine will be one of them. If it is, you can see me spouting my common commons nonsense around 11am CEST, 10am GMT.

Follow me @glynmoody on Twitter or identi.ca.

28 October 2010

The Limits of Openness?

I've been a long-time fan of the 3D modelling program Blender. No surprise, then, that I've also been delighted to see the Blender Foundation moving into content production to show what the software can do.

Specifically, it has produced a game (Yo! Frankie) and three animated films: Elephants Dream; Big Buck Bunny; and most recently, Sintel. Aside from their aesthetic value, what's interesting about these films is that the content is released under a cc licence.

Here's a fascinating interview with Ton Roosendaal, head of the Blender Institute, leader of Blender development, and producer of Sintel. It's well worth reading, but there was one section that really caught my eye:

we keep most of our content closed until release. I’m a firm believer in establishing protective creative processes. In contrast to developers — who can function well individually online — an artist really needs daily and in-person feedback and stimulation.

We’ve done this now four times (three films and one game) and it’s amazing how teams grow in due time. But during this process they’re very vulnerable too. If you followed the blog you may have seen that we had quite harsh criticism on posting our progress work. If you’re in the middle of a process, you see the improvements. Online you only see the failures.

The cool thing is that a lot of tests and progress can be followed now perfectly and it suddenly makes more sense I think. Another complex factor for opening up a creative process is that people are also quite inexperienced when they join a project. You want to give them a learning curve and not hear all the time from our audience that it sucks. Not that it was that bad! But one bad criticism can ruin a day.

Those are reasonable, if not killer, arguments. But his last point is pretty inarguable:

One last thing on the “open svn” point: in theory it could work, if we would open up everything 100% from scratch. That then will give an audience a better picture of progress and growth. We did that for our game project and it was suited quite well for it. For film… most of our audience wants to get surprised more, not know the script, the dialogs, the twists. Film is more ‘art’ than games, in that respect.

That's fair: there's no real element of suspense for code, or even games, as he points out. So this suggests that for certain projects, like these free content films, openness may be something that needs limiting in this way, purely for the end-users' benefit.

Follow me @glynmoody on Twitter or identi.ca.

The British Library's Future: Shiny, Locked-Down Knowledge?

Yesterday, Computerworld UK carried an interesting report headed “British Library explores research technologies of the future”. Here's what it hopes to achieve:

On Open Enterprise blog.