17 October 2009

The Commons Meme Becomes More Common

One of the great knock-on benefits of Elinor Ostrom sharing the Nobel prize for Economics is that the concept of the commons is getting the best airing it's ever had. Here's another useful meditation from someone who knows what he's talking about, having written a book on the subject:

Old fables die hard. That's surely been the history of the so-called "tragedy of the commons," one of the most durable myths of the past generation. In a famous 1968 essay, biologist Garrett Hardin alleged that it is nearly impossible for people to manage shared resources as a commons. Invariably someone will let his sheep over-graze a shared pasture, and the commons will collapse. Or so goes the fable.

In fact, as Professor Elinor Ostrom's pioneering scholarship over the past three decades has demonstrated, self-organized communities of "commoners" are quite capable of managing forests, fisheries and other finite resources without destroying them. On Monday, Ostrom won a Nobel Prize in Economics for explaining how real-life commons work, especially in managing natural resources.

As he notes:

Although Ostrom has not written extensively about the Internet and online commons, her work clearly speaks to the ways that people can self-organize themselves to take care of resources that they care about. The power of digital commons can be seen in the runaway success of Linux and other open-source software. It is evident, too, in the explosive growth of Wikipedia, Craigslist (classified ads), Flickr (photo-sharing), the Internet Archive (historical Web artifacts) and Public.Resource.org (government information). Each commons acts as a conscientious steward of its collective wealth.

And this is an acute observation:

A key reason that all these Internet commons flourish is because the commoners do not have to get permission from, or make payments to, a corporate middleman. They can build what they want directly, and manage their work as they wish. The cable and telephone companies that provide access to the Internet are not allowed to favor large corporate users with superior service while leaving the rest of us--including upstart competitors and non-market players--with slower, poorer-quality service.

In an earlier time, this principle was known as "common carriage"--the idea that everyone shall have roughly equivalent access and service, without discrimination. Today, in the Internet context, it is known as "net neutrality."

Neat: another reason we need to preserve Net neutrality is to preserve all the commons - past, present and future - it enables.

Follow me @glynmoody on Twitter or identi.ca.

15 October 2009

Open Sourcing America's Operating System

And how do you do that? By making all of the laws freely available - and, presumably, searchable and mashable:

Public.Resource.Org is very pleased to announce that we're going to be working with a distinguished group of colleagues from across the country to create a solid business plan, technical specs, and enabling legislation for the federal government to create Law.Gov. We envision Law.Gov as a distributed, open source, authenticated registry and repository of all primary legal materials in the United States.

This is great news, because Carl Malamud - the force behind this initiative - has been urging it for years: now it looks like it's beginning to take a more concrete form:

The process we're going through to create the case for Law.Gov is a series of workshops hosted by our co-conveners. At the end of the process, we're submitting a report to policy makers in Washington. The process will be an open one, so that in addition to the main report which I'll be authoring, anybody who wishes to submit their own materials may do so. There is no one answer as to how the raw materials of our democracy should be provided on the Internet, but we're hopeful we're going to be able to bring together a group from both the legal and the open source worlds to help crack this nut.

I particularly liked the following comment:

Law.Gov is a big challenge for the legal world, and some of the best thinkers in that world have joined us as co-conveners. But, this is also a challenge for the open source world. We'd like to submit such a convincing set of technical specs that there is no doubt in anybody's mind that it is possible to do this. There are some technical challenges and missing pieces as well, such as the pressing need for an open source redaction toolkit to sit on top of OCR packages such as Tesseract. There are challenges for librarians as well, such as compiling a full listing of all materials that should be in the repository.

What's interesting is that this recognises that open source is not just an inspiration, but a key part of the solution, because - like the open maths movement I wrote about below - it needs new kinds of tools, and free software is the best way to provide them.
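Since the quoted passage flags the pressing need for an open source redaction toolkit to sit on top of OCR engines such as Tesseract, here is a minimal sketch of the text-layer half of that job - purely illustrative, with patterns and names of my own invention, not part of any actual Law.Gov specification (a real toolkit would also have to black out the matching regions in the page image, using the bounding boxes the OCR engine reports):

```python
import re

# Toy redaction pass over OCR-extracted text. The patterns are illustrative
# examples of the personal data that court documents must strip before bulk
# publication; a production toolkit would need a much richer set.
PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),    # US Social Security numbers
    re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),    # dates of birth, DD/MM/YYYY
]

def redact(text: str, mask: str = "[REDACTED]") -> str:
    """Replace every match of the sensitive patterns with a mask."""
    for pattern in PATTERNS:
        text = pattern.sub(mask, text)
    return text

sample = "Filed by John Doe, SSN 123-45-6789, born 01/02/1934."
print(redact(sample))
# -> Filed by John Doe, SSN [REDACTED], born [REDACTED].
```

Even this toy version shows why the missing piece matters: redaction has to happen before publication, and doing it by hand across millions of pages of primary legal materials is simply not feasible.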

Now, if only someone could do something similar in the UK....

Open Source Mathematics

This is incredibly important:

On 27 January 2009, one of us — Gowers — used his blog to announce an unusual experiment. The Polymath Project had a conventional scientific goal: to attack an unsolved problem in mathematics. But it also had the more ambitious goal of doing mathematical research in a new way. Inspired by open-source enterprises such as Linux and Wikipedia, it used blogs and a wiki to mediate a fully open collaboration. Anyone in the world could follow along and, if they wished, make a contribution. The blogs and wiki functioned as a collective short-term working memory, a conversational commons for the rapid-fire exchange and improvement of ideas.

The collaboration achieved far more than Gowers expected, and showcases what we think will be a powerful force in scientific discovery — the collaboration of many minds through the Internet.

You can read the details of what happened - and it's inspiring stuff - in the article. But as well as flagging up this important achievement, I wanted to highlight some interesting points it makes:

The process raises questions about authorship: it is difficult to set a hard-and-fast bar for authorship without causing contention or discouraging participation. What credit should be given to contributors with just a single insightful contribution, or to a contributor who is prolific but not insightful? As a provisional solution, the project is signing papers with a group pseudonym, 'DHJ Polymath', and a link to the full working record. One advantage of Polymath-style collaborations is that because all contributions are out in the open, it is transparent what any given person contributed. If it is necessary to assess the achievements of a Polymath contributor, then this may be done primarily through letters of recommendation, as is done already in particle physics, where papers can have hundreds of authors.

The project also raises questions about preservation. The main working record of the Polymath Project is spread across two blogs and a wiki, leaving it vulnerable should any of those sites disappear. In 2007, the US Library of Congress implemented a programme to preserve blogs by people in the legal profession; a similar but broader programme is needed to preserve research blogs and wikis.

These two points are also relevant to free software and other open endeavours. So far, attribution hasn't really been a problem, since everyone who contributes is acknowledged - for example through the discussions around the code. Similarly, preservation is dealt with through the tools for source code management and the discussion lists. But there are crucial questions of long-term preservation - not least for historical purposes - which are not really being addressed, even by the longest-established open projects like GNU.

For example, when I wrote Rebel Code, I often found it hard to track down the original sources for early discussions. Some of them have probably gone for ever, which is tragic. Maybe more thought needs to be given - not least by central repositories and libraries - to how important intellectual moments that have been achieved collaboratively are preserved for posterity to look at and learn from.

Talking of which, the article quoted above has this to say on that subject:

The Polymath process could potentially be applied to even the biggest open problems, such as the million-dollar prize problems of the Clay Mathematics Institute in Cambridge, Massachusetts. Although the collaborative model might deter some people who hope to keep all the credit for themselves, others could see it as their best chance of being involved in the solution of a famous problem.

Outside mathematics, open-source approaches have only slowly been adopted by scientists. One area in which they are being used is synthetic biology. DNA for the design of living organisms is specified digitally and uploaded to an online repository such as the Massachusetts Institute of Technology Registry of Standard Biological Parts. Other groups may use those designs in their laboratories and, if they wish, contribute improved designs back to the registry. The registry contains more than 3,200 parts, deposited by more than 100 groups. Discoveries have led to many scientific papers, including a 2008 study showing that most parts are not primitive but rather build on simpler parts (J. Peccoud et al. PLoS ONE 3, e2671; 2008). Open-source biology and open-source mathematics thus both show how science can be done using a gradual aggregation of insights from people with diverse expertise.

Similar open-source techniques could be applied in fields such as theoretical physics and computer science, where the raw materials are informational and can be freely shared online. The application of open-source techniques to experimental work is more constrained, because control of experimental equipment is often difficult to share. But open sharing of experimental data does at least allow open data analysis. The widespread adoption of such open-source techniques will require significant cultural changes in science, as well as the development of new online tools. We believe that this will lead to the widespread use of mass collaboration in many fields of science, and that mass collaboration will extend the limits of human problem-solving ability.

What's exciting about this - aside from the prospect of openness spreading to all these other areas - is that there's a huge opportunity for the open source community to start, er, collaborating with the scientific one in producing these new kinds of tools that currently don't exist and are unlikely to be produced by conventional software houses (since spontaneously collaborative communities can't actually pay for anything). I can't wait.

Follow me @glynmoody on Twitter or identi.ca.

Gates Gives $300 million - but with a Catch

It's becoming increasingly evident that Bill Gates' philanthropy is not simple and disinterested, but has woven into it a complex agenda that has to do with his love of intellectual monopolies - and power. Here's the latest instalment:


The Bill and Melinda Gates Foundation, which is donating another $120 million to boosting agriculture in the developing world, will focus on self-help aid for poor farmers to sustain and grow production, a top adviser to the world's leading charitable foundation said.

Sounds good, no? Here are more details:

The Gates Foundation, with a $30 billion endowment to improve health and reduce poverty in developing countries, began investing in agricultural projects three years ago. The latest grants bring its farm sector awards to $1.4 billion.

One of its first investments was in African seeds through the Alliance for a Green Revolution in Africa (AGRA). The group is expected to introduce more than 1,000 new seed varieties of at least 10 crops to improve African production by 2016.

"Alliance for a Green Revolution in Africa" also sounds good; here's a little background on that organisation:

It has not gone unnoticed that AGRA falls under the direct supervision of the Global Development Program, whose senior programme officer is Dr. Robert Horsch, who worked for Monsanto for 25 years before he joined the Gates Foundation. Horsch was part of the scientific team in the company that developed Monsanto’s YieldGard, BollGard and RoundUp Ready technologies. Horsch’s task at the Gates Foundation is to apply biotechnology toward improving crop yields in regions including sub-Saharan Africa. Lutz Goedde, another senior program officer of the Global Development Program, is also a recruit from the biotech industry, as he used to head Alta Genetics, the world's largest privately owned cattle genetics improvement and artificial insemination company, worth US$100 million.

That is, AGRA not only has close links with the Gates Foundation, but also with Monsanto - the Microsoft of the seed world.

If you read the rest of the document from which the above information was taken, you'll see that the AGRA programme is essentially promoting approaches using seeds that are genetically modified and patented. Here's the conclusion:

Sub-Saharan Africa represents an extremely lucrative market for seed companies. The development interventions by AGRA appear, on the face of it, to be benevolent. However, not only will AGRA facilitate the change to a market-based agricultural sector in Africa, replacing traditional agriculture, but it will also go a long way towards laying the groundwork for the entry of private fertilizer and agrochemical companies and seed companies, and more particularly, GM seed companies.

So Gates' donations are ultimately promoting an agriculture based on intellectual monopolies - just as Microsoft does in the software field. The latest $300 million doesn't sound quite so generous now, does it?

Follow me @glynmoody on Twitter or identi.ca.

14 October 2009

Who is La Rochefoucauld of Twitter?

Mozilla's Tristan Nitot has come up with a rather fine aphorism:

Twitter, c'est la version XXI°S des salons mondains, mais limitée à 140 caractères, et à l'échelle du globe.

[Twitter is the 21st-century version of the society salons, but limited to 140 characters, and on a global scale.]

So come on people, start polishing those tweets: somewhere out there is La Rochefoucauld of Twitter....

Follow me @glynmoody on Twitter or identi.ca.

12 October 2009

Windows Does Not Scale

Who's afraid of the data deluge?


Researchers and workers in fields as diverse as bio-technology, astronomy and computer science will soon find themselves overwhelmed with information. Better telescopes and genome sequencers are as much to blame for this data glut as are faster computers and bigger hard drives.

While consumers are just starting to comprehend the idea of buying external hard drives for the home capable of storing a terabyte of data, computer scientists need to grapple with data sets thousands of times as large and growing ever larger. (A single terabyte equals 1,000 gigabytes and could store about 1,000 copies of the Encyclopedia Britannica.)

The next generation of computer scientists has to think in terms of what could be described as Internet scale. Facebook, for example, uses more than 1 petabyte of storage space to manage its users’ 40 billion photos. (A petabyte is about 1,000 times as large as a terabyte, and could store about 500 billion pages of text.)

Certainly not GNU/Linux: the latest Top500 supercomputer rankings show that the GNU/Linux family has an 88.60% share. Windows? Glad you asked: 1%.

So, forget about whether there will ever be a Year of the GNU/Linux Desktop: the future is about massive data-crunchers, and there GNU/Linux already reigns supreme, and has done for years. It's Windows that's got problems....
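The scale arithmetic in the quoted passage can be checked in a few lines - a toy illustration using the article's decimal units (1 TB = 1,000 GB, 1 PB = 1,000 TB); the per-photo figure simply spreads one petabyte over 40 billion photos, so it is a lower bound, since Facebook's actual storage was "more than" a petabyte:

```python
# Decimal storage units, as used in the quoted article.
GB = 10 ** 9                 # bytes in a gigabyte
TB = 1_000 * GB              # bytes in a terabyte
PB = 1_000 * TB              # bytes in a petabyte

photos = 40_000_000_000      # Facebook's photo count, per the article

print(TB // GB)              # -> 1000 (gigabytes per terabyte)
print(PB // TB)              # -> 1000 (terabytes per petabyte)
print(PB // photos)          # -> 25000 bytes, i.e. ~25 KB per stored photo
```

That last figure - roughly 25 KB per photo if the whole collection fitted in exactly one petabyte - is a reminder that "Internet scale" is as much about the sheer number of objects as about the size of each one.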

Follow me @glynmoody on Twitter or identi.ca.

09 October 2009

Why Creativity Needs Shorter Copyright Terms

In response to a tweet of mine about shortening copyright to stimulate creativity, someone questioned the logic. It's an important point, so it seems useful to do some thinking out loud on the subject.

First, I should probably address the question of whether *longer* copyright stimulates creativity. The basic argument seems to be that longer copyright terms mean greater incentives, which means greater creativity. But does anyone seriously think about the fact that their creations will still be in copyright 69 years after their death? It won't do them any good, and probably won't do their descendants much good either, since the income at this point is generally close to zero.

Indeed, speaking as an author, I know that practically all my income from writing comes within the first couple of years; after that, it's dribs and drabs. If my copyright were cut down to even five years, it would make only a marginal difference to my total remuneration.

Now, clearly I'm not JK Rowling, but the point is, neither are 99.99999% of authors: I know from talking to other run-of-the-mill writers that the same holds for them, too. So in practical terms, reducing the copyright term would have little effect on the money that most creators earned as a result.

But let's look at the main part of my claim: that reducing copyright's term would encourage creativity. This is based on the rough syllogism that all artists draw on their predecessors in some way; making more prior creativity available would allow more artists to draw on it in more ways; and so this would increase overall creativity.

For the first assertion, look at history. Painters once began by mixing paints in another artist's studio, then drawing unimportant bits in his works, learning how to emulate his style. Then they gradually painted more important bits in the style of that artist, often doing the low-cost jobs or rush jobs that he didn't have time or inclination to execute. Then, one day, that apprentice would set up on his or her (usually his) own, building on all the tricks and techniques learned from the master, but gradually evolving a personal style.

Today, would-be artists tend not to become apprentices in the same way. Instead, they typically go to art school, where they learn to *copy* the masters in order to learn their techniques. Often you see them doing this in art galleries, as they strive to reproduce the exact same effect in their own copy. It teaches them the basics of painting that they can then build on in their own work.

In music, something very similar happens: journeyman composers write pieces in the style of the acknowledged masters, often copying their themes and structure very closely. This is true even for extreme geniuses. For example, in order to learn how to write in the new early classical style, the eight-year-old Mozart arranged three piano sonatas from J C Bach's Op. 5 as keyboard concertos.

Mozart also "borrowed" entire themes - most famously in the overture to The Magic Flute, where he takes a simple tune from a piano sonata by Clementi, and transforms it. Some composers did this on a regular basis. Handel, in particular, was quite unscrupulous in taking themes from fellow composers, and turning them into other, rather better, works. Moreover, the widely-used form of musical variations is based generally on taking a well-known theme and subjecting it to various transformations.

That was in the past, when art was an analogue artefact. Copying took place through trying to reproduce an artistic effect, or by borrowing musical themes etc. Today, in the digital age, copying is not such an incidental act, but central to how we use computers. When we access something online, we copy it to our computers (even audio streaming has to be assembled into copies of small chunks of sound before we can hear it).

Digital plasticity - the ability to compute with any content - makes the clumsy copying and learning processes of the past trivially easy. A child can take a digital image of a work of art and cut and paste elements of it into his or her own work; anyone can sample music, distort it and mix it with their own; texts can be excerpted and juxtaposed with others drawn from very diverse backgrounds to create mosaics of meaning.

All these amazingly rich and innovative things are now very easy to do practically, but the possibilities of doing so are stymied by laws that were drawn up for an analogue age. Those laws were not designed to forbid artists from learning from existing creations, but to stop booksellers producing unauthorised copies - a totally different issue. The idea of using just part of a work was not really a concern. But it is today, when the cut-and-paste metaphor is central to the digital world. That is why we need to reduce copyright to the bare minimum, so that the legal obstacles to creating in this new, inherently digital way are removed.

If we don't, one of two things will happen. Either we will fail to realise the full creative potential of computing, or else the younger generation of artists will simply ignore the law. Either is clearly unsatisfactory. What is needed is a copyright regime that is balanced. That is far from being the case today. As the media industry (sic) ratchets up copyright terms again and again, creation has become subservient to the corporation, and the creators are cut off from their past - and hence future.

Follow me @glynmoody on Twitter or identi.ca.

07 October 2009

EU Consultation on Post-i2010 - Please Do It

Stupidly, I thought this EU consultation would be the usual clueless nonsense, unredeemable even by witty comments from people like me. I was wrong. It's actually an incredibly wide-ranging questionnaire about very important topics. Indeed, it's not even obvious to me what my "correct" answers should be - it actually makes you think.

Here's a small sample of the deep questions it wants us to consider:

The future of the sustained internet services growth - internet to drive innovation

Challenges and issues here include:

- Design and development of the future internet - semantic web, Internet of Things, scalability, mobility, security etc.

- Keeping the internet open to competition, innovation and user choice - issues here include: interoperability, keeping the internet and internet-based services open and a level playing field for innovation (end-to-end connectivity, service level agreements, cross-platform services, net neutrality and open business models), open standards, low barriers to entry, etc.

...

Promoting access to creativity at all levels

In terms of expectations, Internet users and the creative content-providing sector have never been as much at odds as they are today. Creative industry players are struggling to find new viable business models that are able to ensure sufficient revenues for creators and to meet consumer expectations. The market for digital content is still fragmented, and broadcasters and other content providers, together with end-users, are prevented from benefiting from a true digital Single Market.

Participative platforms have grown as passive users (readers, viewers, consumers etc.) have become active producers (or "prosumers"). These users tend to ignore their statutory rights and their obligations towards rights holders for the content they transform and/or simply share in web 2.0 communities. Moreover, intermediaries generally impose take-it-or-leave-it complex standard terms of use on their users. Against this background, users currently do not enjoy a clear set of rights balancing the conditions set by rights holders (with DRMs [Digital Rights Management] and/or license agreements) and internet services or platforms imposing restrictive standard terms of use.

...

Openness as a global issue

The challenge is to keep the internet open, based on open platforms and open standards. Many issues can only be resolved through international cooperation. The ICT strategies in the EU have often been inward-looking, which is difficult to justify, given the globalisation of modern ICT and the internet.

...

Challenges of participatory web

The growth of the participatory web is adding new challenges and pressures on public administrations, as well as opportunities. Web 2.0 enables citizens to shift their relationship with government. There is increasing demand on administrations to become ever more transparent and open to citizen involvement both in the delivery of services and in the design of public policies. If managed correctly, these demands may lead to delivery of better, more personalised services at lower cost as well as more trust in the public administration. This also applies to key services such as health care and education, where practitioners and beneficiaries of the service alike can benefit from mutually enriching communities of interest.

This is all really important stuff; so if you are an EU citizen, please take part - you have until this Friday, 9 October. The good news is that you don't need to fill in the whole thing - you can just pick and choose the bits that matter to you. Usefully, you can download the questionnaire in a variety of languages before you fill it in online - I highly recommend doing so.

Follow me @glynmoody on Twitter or identi.ca.

Browser Ballot Screen: Time to Prepare

It looks like it's happening:


The European Commission will on 9 October 2009 formally invite comments from consumers, software companies, computer manufacturers and other interested parties on an improved proposal by Microsoft to give present and future users of the Windows PC operating system a greater choice of web browsers. The commitments have been offered by Microsoft after the Commission expressed specific concerns that Microsoft may have infringed EC Treaty rules on abuse of a dominant position (Article 82) by tying its web browser (Internet Explorer) to its client PC operating system Windows, and are an improved version of the proposals made by Microsoft in July 2009 (see MEMO/09/352 ). The improvements concern greater information to consumers about web browsers, the features of each browser, an improved user experience as well as a review by the Commission to ensure the proposals genuinely work to benefit consumers. Interested parties can submit comments within one month. The Commission welcomes Microsoft’s proposal as it has the potential to give European consumers real choice over how they access and use the internet. Following the market test, the Commission could decide to adopt a decision under Article 9 (1) of Regulation 1/2003, which would make the commitments legally binding on Microsoft.

It's hard to comment on this until we see what form the ballot screen will take, but I'm prepared to accept that this may be done in a fair manner. Assuming it is, what might the implications be?

Perhaps the most important one is that Firefox needs to be prepared for a massive onslaught when this goes live. I have heard the slightly tongue-in-cheek suggestion that Microsoft is hoping to bring Firefox's servers to their collective digital knees by allowing such a ballot screen; even assuming that's not the case, it's certainly true that Mozilla must start planning for the sudden peak in interest that is likely to follow the implementation of the ballot screen idea. It would be a terrible shame if people tried to download Firefox and failed because the Mozilla servers keel over.

Follow me @glynmoody on Twitter or identi.ca.

Meet Microsoft, the Delusional

This is hilarious:

Jean Philippe Courtois, president of Microsoft Europe, speaking in Paris today, described the company as an underdog.

He said Bing had between three and five percent market share in search and could only grow - although he admitted it could take a long time.

...

Despite Microsoft having to live with open source software for 10 years, it had retained its share in the market place, he said.

Er, what, like the browser sector, where Firefox now has nearly 24% market share worldwide, and Microsoft's share is decreasing? Or Apache's 54% in the Web server world, where Microsoft's share is decreasing? Or GNU/Linux's 88% market share of the top 500 supercomputers in the world, where Microsoft's share is static?

Microsoft the underdog? Or just a dog?

Follow me @glynmoody on Twitter or identi.ca.

Becta Says: Teach Us a Lesson...

...which is surely an offer we can't refuse.

For many years, Becta was one of the main obstacles to getting open source used within UK schools: it simply refused to budge from an almost pathological dependence on Microsoft and its products. Today, the situation is slowly improving, but it will take years to undo the harm caused by Becta's insistence on propagating the Microsoft monoculture in education.

At least Teach Us a Lesson seems to be starting off on the right foot:


Becta’s Teach us a Lesson competition launches today, Wednesday 7 October, following the speech that Kevin Brennan, the Minister for Further Education, made at the Learning Revolution Expo yesterday.

The competition seeks to find the brightest and best ideas for developing online resources for people to find informal learning opportunities that interest them. This will happen by having entries submitted to the competition website, where they will be commented on and rated by other site users.

This, then, is about opening up in terms of drawing on ideas outside Becta. More specifically:

There are some things we are trying to avoid:

* Using proprietary products which will not permit open sharing or which run counter to Government policy on open standards

At long last, Becta seems to have learned its lesson...

Follow me @glynmoody on Twitter or identi.ca.

06 October 2009

Postcodes: Royal Fail

Here's a perfect example of why intellectual commons should not be enclosed.

The UK Postcode data set is obviously crucial information for businesses and ordinary citizens - something that is clearly vital to the smooth running of everyday life. But more than that, it is geographic information that allows all kinds of innovative services to be provided by people with clever ideas and some skill.

That's exactly what happened when the Postcode database was leaked on to the Internet recently. People used that information to do all sorts of things that hadn't been done before, presumably because the company that claims to own this information, Royal Mail, was charging an exorbitant amount for access to it.

And then guess what happened? Yup, the nasties started arriving:

On Friday the 2nd October we received correspondence from the Royal Mail demanding that we close this site down (see below). One of the directors of Ernest Marples Postcodes Ltd has also been threatened personally.

We are not in a position to mount an effective legal challenge against the Royal Mail’s demands and therefore have closed the ErnestMarples.com API effective immediately.

We understand that this will cause harm and considerable inconvenience to the many people who are using or intend to use the API to power socially useful tools, such as HealthWhere, JobcentreProPlus.com and PlanningAlerts.com. For this, we apologise unreservedly.

Specifically, intellectual monopolies of a particularly stupid kind are involved:

Our client is the proprietor of extensive intellectual property rights in the Database, including copyright in both the Database and the software, and database rights.

Here's what Wikipedia has to say about these "database rights":

The Directive 96/9/EC of the European Parliament and of the Council of 11 March 1996 on the legal protection of databases is a European Union directive in the field of copyright law, made under the internal market provisions of the Treaty of Rome. It harmonizes the treatment of databases under copyright law, and creates a new sui generis right for the creators of databases which do not qualify for copyright.

Before 1996, these sui generis "database rights" did not exist; they were created in the EU because lobbyists argued that they would provide an incentive to create more databases than the Americans - whose database publishers strangely didn't seem to need this new "right" to thrive - and so make the mighty EU even mightier, at least as far as those jolly exciting databases were concerned.

Rather wisely, the EU afterwards decided to do some research in this area, comparing database creation before and after the new sui generis right was brought in, to see just how great that incentive proved to be - a unique opportunity to test the theory that underpins intellectual monopolies. Here are the results of that research:

Introduced to stimulate the production of databases in Europe, the “sui generis” protection has had no proven impact on the production of databases.

According to the Gale Directory of Databases, the number of EU-based database “entries” was 3095 in 2004 as compared to 3092 in 1998 when the first Member States had implemented the “sui generis” protection into national laws.

It is noteworthy that the number of database “entries” dropped just as most of the EU-15 had implemented the Directive into national laws in 2001. In 2001, there were 4085 EU-based “entries” while in 2004 there were only 3095.

While the evidence taken from the GDD relies on the number of database “entries” and not on the overall turnover achieved or the information supplied by means of databases, they remain the only empirical data available.

So, the official EU study finds that the sui generis protection has had no proven impact on the production of databases; in fact, the number of databases went *down* after it was introduced.

Thus these "database rights" have been shown to stifle the production of databases - negating the whole claimed point of their introduction. Moreover, the Royal Mail's bullying of a couple of people who are trying to offer useful services that would not otherwise exist shows the danger of entrusting such a critical data commons to commercial entities who then enclose it by claiming "database rights" in it: they will always be tempted to maximise their own profit, rather than the value to society as a whole.

Giving the Royal Mail a monopoly on this critical dataset - one that for all practical purposes can never be created again - is like giving a genetics company a monopoly on the human genome. That was attempted (hello, Celera) but, fortunately for us, thwarted, thanks largely to free software. Today, the human genome is an intellectual commons (well, most of it), and the Postcode data should be, too.

Follow me @glynmoody on Twitter or identi.ca.

Blogger's Massive Fail

I can't believe this. When posting the previous entry, I got this message:


Blogger currently allows a maximum of 10 labels per post, and 2000 labels per blog. To get rid of this message, you will have to correct the appropriate label counts.

It seems that I have exceeded my quota of 2000 labels per blog: how insane is that? How can I limit myself to a set number of labels, given that the world moves on and new ideas come along that need new labels?

Time to explore those export options....

Open Source and the Fear of Failure

Yesterday I took part in an interesting event organised by BT called "Accelerating Enterprise adoption of Open Source Software" (disclaimer: filthy lucre was involved.) One topic that elicited much comment was why the public sector has singularly failed to deploy open source. As well as political issues (Tony Blair was and presumably still is manifestly in awe of (Sir) Bill Gates), there's another important issue to do with a fear of failure.

Nobody in government wants to take a chance on something new, so they stick with the old suppliers and the old solutions. When those (almost inevitably) fail, this causes people to be even more cautious, and so the vicious circle continues.

That's clearly bad news for open source, but here's a particularly good articulation of why the fear of failure is bad for governments more generally:

When I’ve spoken with government people, they confess a phobia of failure. Yet without the opportunity to fail, government – like industry and media – cannot experiment and thus innovate. We must give government the license to fail. That is difficult, especially because it is the citizenry that must grant that permission. I think government must begin to recast its relationship by opening up pilot projects to input and discussion, to smart ideas and improvements. I’m not suggesting for a second that every decision be turned into a vote, that law become a wiki. Government still exercises its responsibility. But it needs to use the new mechanisms of the web to hear those ideas. I would look for examples to Dell’s Ideastorm, Starbucks’ My Starbucks Idea, and Best Buy’s Idea Exchange.

Follow me @glynmoody on Twitter or identi.ca.

01 October 2009

Korea Cottons on to the Microsoft Monoculture

I've written several times about the extraordinary situation in South Korea - otherwise one of the most technologically advanced nations - which maintains an almost total dependence on Microsoft's ActiveX technology for banking and government connections. Now it seems that the Koreans themselves are finally waking up to the disadvantages - and dangers - of that situation:

The bizarre coexistence of advanced hardware and an outdated user environment is a result of the country's overreliance on the technology of Microsoft, the U.S. software giant that owns the Korean computing experience like a fat kid does a cookie jar.

It is estimated that around 99 percent of Korean computers run on Microsoft's Windows operating system, and a similar rate of Internet users rely on the company's Internet Explorer (IE) Web browser to connect to cyberspace.

The article points out the obvious security issues with this approach:

This is a risky arrangement, since Active-X controls require full access to the Windows operating system and are often abused by cyber criminals who spread malicious programs to direct the browser to download files that compromise the user's control of the computer.

But it seems that the problem goes *much* deeper:

Even Microsoft seems ready to bail on Active-X, looking to phase out the program over security concerns and compatibility issues. However, in Korea, where most Web sites rely on Active-X to enable a variety of functions from online transactions to simple flash features, the program is abundant and critical as air.

This leads to awkwardness whenever Microsoft introduces a new product here. The release of Windows Vista caused massive disruption when Active-X used by banks and online shopping sites didn't function properly.

And the Korean Internet users sweated over Microsoft's initial plans to reduce its support for Active-X in IE8, the latest version of the company's Web browser. Although IE8 did end up backing Active-X, strengthened security features have made its use more complicated.

The reliance on Active-X has locked Korean computer users into a depressing cycle where they are prevented from venturing off to other operating systems and browsers, and stuck with outdated technologies their creator can't wait to dispel.

That is, by instituting a monoculture, and becoming completely dependent not just on one manufacturer, but on one particular - and very unsatisfactory - technology used by that manufacturer, the Koreans find themselves trapped, left behind even by Microsoft, which wants to move on.

There could be no better demonstration of why mandating one proprietary technology in this way, rather than choosing an open standard with multiple implementations and scope for future development, is folly.

Unfortunately, the article quoted above doesn't seem very optimistic on the chances of openness breaking out in South Korea any time soon, so it may well be that all its superb Internet infrastructure will go to waste as it remains locked into aging and increasingly obsolete technology on the software side. (Via Mozilla in Asia.)

Follow me @glynmoody on Twitter or identi.ca.

30 September 2009

What Light on Yonder gov.uk Site Breaks?

The first glint of hope for openness in the UK government begins to sparkle:


From today we are inviting developers to show government how to get the future public data site right - how to find and use public sector information.

The developer community through initiatives such as Show Us a Better Way, the Power of Information Taskforce, MySociety and Rewired State have consistently demonstrated their eagerness and abilities to "Code a Better Country". You have given us evidence and examples to help drive this forward within government.

We have an early preview of what the site could look like; we are now inviting interaction and comment from the developer community. With over 1000 existing data sets, from 7 departments (brought together in re-useable form for the first time) and community resources, we want developers to work with us to use the data to create great applications; give us feedback on the early operational community; and tell us how to develop what we have into a single point of access for government-held public data.

We know it is still work in progress, and there’s still a lot to do. That’s why we need you to help us get this right. Let us know what features or changes would make the site better for you, and what other data sources you would like to see here.

Now there's an offer you can't refuse...get stuck in, people. (Via Glyn Wintle.)

Follow me @glynmoody on Twitter or identi.ca.

29 September 2009

Thanks for Keeping us in the Picture

Although e-petitions don't often accomplish much (the apology for Alan Turing being a notable exception), they do have the virtue of forcing the UK government to say something. In response to this:

“We the undersigned petition the Prime Minister to remove new restrictions on photography in public places.”

we got this:

It is a statutory defence for a person to prove that they had a reasonable excuse for eliciting, publishing or communicating the relevant information. Legitimate journalistic activity (such as covering a demonstration for a newspaper) is likely to constitute such an excuse. Similarly, an innocent tourist or other sight-seer taking a photograph of a police officer is likely to have a reasonable excuse.

Since most people can't *prove* they had a reasonable excuse for taking a photo - is "because it was a nice shot" *reasonable*? And how do you *prove* it was reasonable at the time? - this very high legal bar obviously implies that non-journalistic Brits had better not take any snaps of Plod, because otherwise: you're nicked.

Follow me @glynmoody on Twitter or identi.ca.

26 September 2009

Freedom is Slavery, Slavery is Freedom

The Competitive Enterprise Institute is always good for a laugh thanks to its transparent agenda (the use of the weasel word "competitive" gives it away), and it doesn't disappoint in the following, which is about the evils of net neutrality and openness:

Consider the Apple iPhone. The remarkably successful smartphone has arguably been a game-changer in the wireless world, having sold tens of millions of handsets since its 2007 launch and spurring dozens of would-be “iPhone killers” in the process. If you listen to net neutrality advocates’ mantra, you would assume the iPhone must be a wide open device with next to no restrictions. You would be mistaken. In fact, the iPhone is a prototypical “walled garden.” Apple vets every single iPhone app, and Apple reserves the right to reject iPhone apps if they “duplicate [iPhone] functionality” or “create significant network congestion.”

Why, then, has the iPhone enjoyed such popularity? It’s because consumer preferences are diverse and constantly evolving. Most users, it seems, do not place openness on the same pedestal that net neutrality advocates do. Proprietary platforms like the iPhone have advantages of their own – a cohesive, centrally-managed user experience, for one – but have disadvantages as well.

Which is fair enough. But it then goes on to say:

But under the FCC’s proposed neutrality rules, the iPhone and similar devices that place limits on the content and applications that users can access would likely be against the law.

Net neutrality has nothing to do with the edges - which is where the iPhone resides - and everything to do with the wiring that connects those edges. It is about preventing those who control the networks from blocking innovative services - like the iPhone - from being offered across them. It would only apply if Apple owned the network and refused to allow third parties to offer rival services to its iPhone - clearly not the case. It does not forbid Apple from choosing which apps to run on the iPhone, any more than it forces Microsoft to go open source.

Painting the freedom of net neutrality as a kind of slavery in this way is really a tour-de-force of topsy-turvism, even by the high standards of the Competitive Enterprise Institute.

Follow me @glynmoody on Twitter or identi.ca.

25 September 2009

Won't Someone Please Think of the, er, Plants?

I've tweeted this, but it's so good, I just have to blog it too:

CO2 is not a pollutant. CO2 makes Earth green because it supports all plant life. It is Earth's greatest airborne fertilizer. Even man-made CO2 contributes to plant growth that in turn sustains humanity and ecosystems.

CO2 Is Green is working to insure that all federal laws or regulations are founded upon science and not politics or scientific myths. No one wants the plant and animal kingdoms, including humanity, to be harmed if atmospheric CO2 is reduced. The current dialog in Washington needs to reflect these inalterable facts of nature. We cannot afford to make mistakes that would actually harm both the plant and animal kingdoms.

Oh lordy, those poor little plants and animals - deprived of the life-giving CO2. How could mankind be so cruel and insensate? How could we have overlooked such an obvious thing until now?

Update: Don't miss Adam Pope's super-sleuthing in the comments that suggests this site just might have something to do with the gas and oil industries...

Follow me @glynmoody on Twitter or identi.ca.

24 September 2009

More Evil from the Intellectual Monopolies Mob

One of the best windows into the otherwise dark and murky world of backroom deals among proponents of intellectual monopolies can be found in the reports on the U.S.-EU IPR Enforcement Working Group (doesn't that word "enforcement" really say it all?). Here are a couple of the highlights of the latest one:

The U.S. and EU both expressed a desire to engage labor movements in delivering a “positive and constructive message” about IPR protection and enforcement. The RIAA (Recording Industry Association of America) and IIPA (International Intellectual Property Alliance) were both very enthusiastic about this proposal.

Basically, the IM mob are desperately trying to con unions into doing their dirty work by pushing out propaganda on intellectual monopolies. I just love the line "The RIAA (Recording Industry Association of America) and IIPA (International Intellectual Property Alliance) were both very enthusiastic about this proposal": you bet they are. Their own ham-fisted efforts have backfired so spectacularly that they are desperate for someone else - someone not tainted by their inept, consumer-punishing approach - to have a go.

The following is also significant:

The discussion on future work mostly focused on climate change. General Electric and Microsoft were particularly outspoken in highlighting their fear that some current negotiations over green technology and IPR would weaken IPR. They also denounced the inclusion of proposals that limit patentable subject matter and recommend compulsory licenses or licenses of rights.

As well as Microsoft's usual bleating about not being allowed to patent software in some jurisdictions, it's interesting to note that both it and General Electric seem to rate the preservation of intellectual monopolies rather higher than the preservation of our planet. Pure evil. (Via Ray Corrigan.)

Follow me @glynmoody on Twitter or identi.ca.

Cracks in the ACTA Wall of Secrecy

I've lamented many times the totally unjustified secrecy of the ACTA negotiations: these affect billions of people who have a right to know what their elected representatives are up to before this stuff is simply imposed on us. Hitherto, there's been no suggestion of any dissension within the ACTA ranks; so this comment in a blog post from Jamie Love about a lunch meeting of civil society NGOs held by the UK's Intellectual Property Office during the WIPO meeting is intriguing:


The UK IP office said it had complained frequently of the secrecy of the ACTA negotiations.

Perhaps if we can get a few more of the insiders moaning about this unnecessary lack of transparency, things will finally start moving.

Follow me @glynmoody on Twitter or identi.ca.

23 September 2009

Big Win for GNU GPL in France

One of the fallback positions for purveyors of FUD is that the GNU GPL may not be valid, because it hasn't been properly tested in court. That's an increasingly implausible stance. The licence has already been upheld in Germany a few times, and now here's a big decision in its favour in France:

In a landmark ruling that will set legal precedent, the Paris Court of Appeals decided last week that the company Edu4 violated the terms of the GNU General Public License (GPL) when it distributed binary copies of the remote desktop access software VNC but denied users access to its corresponding source code. The suit was filed by Association pour la formation professionnelle des adultes (AFPA), a French education organization.

...

The events of the case go back to early 2000, when Edu4 was hired to provide new computer equipment in AFPA's classrooms. Shortly thereafter, AFPA discovered that VNC was distributed with this equipment. Despite repeated requests, with mediation from the Free Software Foundation France, Edu4 refused to provide AFPA with the source code to this version of VNC. Furthermore, FSF France later discovered that Edu4 had removed copyright and license notices in the software. All of these activities violate the terms of the GNU GPL. AFPA filed suit in 2002 to protect its rights and obtain the source code.

There are a couple of important points about this decision. The first is noted in the post quoted above:

"what makes this ruling unique is the fact that the suit was filed by a user of the software, instead of a copyright holder. It's a commonly held belief that only the copyright holder of a work can enforce the license's terms - but that's not true in France. People who received software under the GNU GPL can also request compliance, since the license grants them rights from the authors."

The other point flows from this. The French legal system has many novel aspects, so it's important that the GNU GPL was upheld here, just as it was in Germany. It means that the GPL's approach is being upheld not just by courts, but by courts that look at things from quite different legal perspectives. That augurs well for future rulings in other jurisdictions.

Follow me @glynmoody on Twitter or identi.ca.

21 September 2009

Microsoft, Monsanto and Intellectual Monopolies

Here's a brilliant, must-read feature exposing some of the hidden agendas of the Green Revolution and the dark side of the Gates Foundation's work in Africa. In particular, it makes explicit the symmetry of Microsoft and Monsanto in their use of intellectual monopolies to make their users increasingly powerless:

The preference for private sector contributions to agriculture shapes the Gates Foundation's funding priorities. In a number of grants, for instance, one corporation appears repeatedly--Monsanto. To some extent, this simply reflects Monsanto's domination of industrial agricultural research. There are, however, notable synergies between Gates and Monsanto: both are corporate titans that have made millions through technology, in particular through the aggressive defense of proprietary intellectual property. Both organizations are suffused by a culture of expertise, and there's some overlap between them. Robert Horsch, a former senior vice president at Monsanto, is, for instance, now interim director of Gates's agricultural development program and head of the science and technology team. Travis English and Paige Miller, researchers with the Seattle-based Community Alliance for Global Justice, have uncovered some striking trends in Gates Foundation funding. By following the money, English told us that "AGRA used funds from the Bill and Melinda Gates Foundation to write twenty-three grants for projects in Kenya. Twelve of those recipients are involved in research in genetically modified agriculture, development or advocacy. About 79 percent of funding in Kenya involves biotech in one way or another." And, English says, "so far, we have found over $100 million in grants to organizations connected to Monsanto."

This isn't surprising in light of the fact that Monsanto and Gates both embrace a model of agriculture that sees farmers suffering a deficit of knowledge--in which seeds, like little tiny beads of software, can be programmed to transmit that knowledge for commercial purposes. This assumes that Green Revolution technologies--including those that substitute for farmers' knowledge--are not only desirable but neutral. Knowledge is never neutral, however: it inevitably carries and influences relations of power.

I fear that, with hindsight, we will see that - contrary to the almost universal view that Gates is redeeming his bad boy years at Microsoft with the good boy promises of his Foundation - Gates will actually do even more damage in the realm of agriculture than he has in the world of computing. (Via Roy Schestowitz.)

Follow me @glynmoody on Twitter or identi.ca.

On the Road to Mendeley

Vic Keegan had an interesting article in the Guardian last week about a new site, mendeley.com:


The music radio site Last.fm is one of the great ideas from the UK during the first dotcom boom. Users can listen to their own songs and other tracks recommended by Last.fm's algorithms based on their tastes, including iTunes, and those of friends. It could easily have been a one-trick pony. But now a few academics have applied its serendipity to scientific research. Why can't researchers, instead of waiting anywhere up to three years for their papers to jump all the hurdles, be part of a real-time market place – a fusion of iTunes and Last.fm for science? They pitched the idea, among others, to two of Last.fm's investors: Spencer Hyman and Stefan Glaenzer, newly enriched by the sale of Last.fm to CBS. They bought into the idea of using the site's principles to aggregate users' data (anonymously) while building up a databank of articles. Now the show is on the road and expanding fast. It is free, but a premium version will be added soon.

What's particularly fascinating is to see the cross-over of ideas from arts to science, and that both are driven by the insight that sharing with others brings huge benefits to them and to you.

Even though it's not open source, it's good to see that from the start there's a GNU/Linux version of the Mendeley client. Since the power of the site comes from the network effects of sharing, not the secret sauce hidden in the code, there doesn't seem to be any reason why that code shouldn't be opened up, and plenty of benefits in doing so. Now that Mendeley has started on its journey of sharing, let's hope they go the whole way.

Follow me @glynmoody on Twitter or identi.ca.

17 September 2009

Analogue or Digital? - Both, Please

Recently, I bought the complete works of Brahms. Of course, I was faced with the by-now common problem of whether to buy nostalgic CDs or evanescent MP3s. The price was about the same, so there was no guidance there. Ecologically, I should have gone for the downloads, but in the end I chose the CDs - partly for the liner stuff you never get with an MP3, and partly because I have the option of degrading the CD bits to lossy MP3, which doesn't work so well the other way round.

So imagine my surprise - and delight - when I discovered after paying for said CDs that the company - Deutsche Grammophon - had also given me access to most of the CDs as streams from its Web site, for no extra cost (I imagine the same would have been true of the MP3s). This was a shrewd move because (a) it made me feel good about the company, even though it cost them very little, and (b) I'm now telling people about this fact, which is great publicity for them.

But maybe my delight is actually a symptom of something deeper: that having access to both analogue and digital instantiations of information means getting the best of both worlds.

This struck me when I read the following story:

Google will make some 2 million out-of-copyright books that it has digitally scanned available for on-demand printing in a deal announced Thursday. The deal with On Demand Books, a private New York-based company, lets consumers print a book in about 10 minutes, and any title will cost around $8.

The books are part of a 10 million title corpus of texts that Google has scanned from libraries in the U.S. and Europe. The books were published before 1923, and therefore do not fall under the copyright dispute that pits Google against interests in technology, publishing and the public sector that oppose the company's plans to allow access to the full corpus.

That, in itself, is intriguing: Google getting into analogue goods? But the real importance of this move is hinted at in the following:

On Demand already has 1.6 million titles available for print, but the Google books are likely to be more popular, as they can be searched for and examined through Google's popular engine.

That's true, but not really the key point, which is that as well as being able to search *for* books, you can search *through* them. That is, Google is giving you an online search capability for the physical books you buy from them.

This is a huge breakthrough. At the moment, you have to choose between the pleasure of reading an analogue artefact, and the convenience of its digital equivalent. With this new scheme, Google will let you find a particular phrase - or even word - in the book you have in your hands, because the latter is a physical embodiment of the one you use on the screen to search through its text.

The trouble is, of course, that this amazing facility is only available for those books out of copyright that Google has scanned. Which gives us yet another reason for repealing the extraordinarily stupid copyright laws that stop this kind of powerful service being offered for *all* text.

Follow me @glynmoody on Twitter or identi.ca.