16 August 2007

Paying the Price of Intellectual Monopolies

Oh look: here's some unnecessary, draconian intellectual monopoly regulation that has major negative consequences:


An unexpected implication in the legislating procedure of the proposed EU Directive on criminal measures aimed at ensuring the enforcement of intellectual property rights (IPRED2) puts legitimate businesses under clear threat of criminal sanctions.

Now, why am I not surprised by this?

The Triumph of Free (as in Beer)

With The New York Times and The Wall Street Journal said to be looking at removing the “pay wall” around their online content, and others – including CNN, Google and AOL – having already done so, one question springs to mind: Are we seeing the death of paid content online, and the return of free as a business model?

Yup - at least, free as in beer: now we need to work on the free as in freedom part.

Google Health...

...is coming. And you thought the privacy issues of using Google were bad now. (Via John Battelle.)

Not So Au Courant

This piece from The Courant is like the coelacanth: not very pretty, but fascinating for its atavistic traits:

Unlike copyright-protected software, such as Microsoft's Windows, open source software is available either as a free public-domain offering or under a nominal licensing fee.

Well, no. To be strictly open source, software must be released under an OSI-approved licence. Such licences generally (always?) depend on copyright law for their enforcement. So, in practice, open source software relies on copyright just as much as Microsoft's Windows does - it merely uses it for different ends.

This was a common confusion when free software started appearing in the mainstream, but it's quite surprising to see it popping up nowadays.

Of Open and Closed Geography

Talking of the price we pay for idiotically closed geographical data:

The United States has benefitted in many ways from having public data sets that are freely used by scholars, commercial firms, consultants, and the public. An example of this is the TIGER system (Topologically Integrated Geographic Encoding and Referencing system, http://www.census.gov/geo/www/tiger/). Many countries do not, and one British geospatial expert estimated that the closed nature of their system has cost them one billion pounds in lost business.

(Via Open Access News.)

The Idiots of OS (Ordnance Survey)

This really makes my blood boil.

After a year of negotiations, academic geographers have conceded defeat in their attempt to find a way to make a pioneering 3D representation of the capital, Virtual London, available to all comers via the Google Earth online map.

Followers of Technology Guardian's Free Our Data campaign will have guessed the reason: Virtual London is partly derived from proprietary data owned by the government through its state-owned mapping agency, Ordnance Survey (OS). What makes the situation bizarre is that Virtual London's development was funded by another arm of the government, the office of the mayor of London.

In other words, I helped pay for this information, twice - as taxpayer, and as London ratepayer - and yet I am not allowed to access it.

The Ordnance Survey's excuse is pathetic:

OS said granting Google special terms for Virtual London would be unfair on other licensees. "We provide an open, fair and transparent set of terms for providers seeking to operate in the same commercial space as each other. We cannot therefore license Google in a different way to other providers. We are completely supportive of anyone putting our data on the web as long as they have a licence to do so." Google would not comment.

Commercial space - and what about the public space, you know, those tiresome little people who pay your salaries?

Thanks for nothing.

Open Source's Best-Kept Secret Redux

About 18 months ago, I wrote a post called "Open Source's Best-Kept Secret" about Eclipse, how wonderful it was, and yet how few knew about it. Now what do I find?

Eclipse may be the most important open-source "project" that people outside the industry, and even some within it, have never heard of.

Yup, Matt and I agree again. His piece is an excellent interview with the head of Eclipse, Mike Milinkovich. I also interviewed him recently, for my feature about the open source ecosystem in Redmond Magazine. Matt's piece ranges more widely, and is probably the best intro to what Eclipse is up to, how it functions, and why it is so important.

Indeed, I wonder whether it will actually prove to be the most important open source project of all in the long term. As Matt points out:

In late June, Eclipse made available the largest-ever simultaneous release of open-source software, called Europa: 17 million lines of code, representing the contributions of 310 open-source developers in 19 countries. Twenty-one new tools were included in the "Europa" release, all free to download.

Think about that. The Linux kernel has around 6 million lines of code.... The Java Development Kit that Sun open sourced has 6.5 million.... Sun's StarOffice release in 2000 (which was believed to be the largest open-source release to that point) had 9 million.... Firefox has 2.5 million.

Yeah, think about it....

15 August 2007

They Gave Me of the Fruit...

...and I did eat:

But now an international scientific counterculture is emerging. Often referred to as “open science,” this growing movement proposes that we err on the side of collaboration and sharing. That’s especially true when it comes to creating and using the basic scientific tools needed both for downstream innovation and for solving broader human problems.

Open science proposes changing the culture without destroying the creative tension between the two ends of the science-for-innovation rope. And it predicts that the payoff – to human knowledge and to the economies of knowledge-intensive countries like Canada – will be much greater than any loss, by leveraging knowledge to everyone’s benefit.

"Sharing the fruits of science", it's called. Nothing new, but interesting for the outsider's viewpoint. (Via Open Access News.)

Welcome to the Era of Personal Genomics

I've been wittering on about personal genomics for some time: well, it's here, people. If you don't believe me, take a look at this site (note: it's one of those old-fashioned FTP thingies, but Firefox should cope just fine).

Not much to see, you say? Just a couple of boring old directories - one called "Venter", the other "Watson". And inside those directories, lots of pretty massive files - some 35 Mbytes, some double that. And inside those files? Oh, just some boring letters; you know the kind of thing - AAGTGGTACCATTGACGCACAGGACACAGTG etc.

Nothing much: just the essence of the first two people to have their entire genomes (or nearly) sequenced - and all made freely available.... (Via Discovering Biology in a Digital World.)
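
Should you download one of those files, a few lines of scripting are enough to start exploring the raw sequence. Here is a minimal sketch - assuming a locally saved, plain-text file of bases, possibly with FASTA-style ">" header lines; the filename is just a placeholder:

from collections import Counter

def base_counts(path):
    """Tally the letters (A, C, G, T and anything else) in a raw sequence file."""
    counts = Counter()
    with open(path) as handle:
        for line in handle:
            if line.startswith(">"):  # skip FASTA-style header lines, if present
                continue
            counts.update(line.strip().upper())
    return counts

counts = base_counts("watson_chr1.txt")  # placeholder for one of the downloaded files
total = sum(counts.values()) or 1
for base in "ACGT":
    print(f"{base}: {counts[base]:,} ({counts[base] / total:.1%})")

Even something this crude reveals the genome's overall base composition (roughly 60% A+T in humans) - not bad for a dozen lines of code run against a freely downloadable human being.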

O'Reilly? I Think Not

Once again, Matt gets it, and Tim doesn't:

"I will predict that virtually every open source company (including Red Hat) will eventually be acquired by a big proprietary software company."

Thus spake Tim O'Reilly in the comments to one of his other posts. Tim believes that open source, at least as defined by open-source licensing, has a short shelf-life that will be consumed by Web 2.0 (i.e., web companies hijacking open-source software to deliver proprietary web services) or by traditional proprietary software vendors.

In other words, why don't I just give up, sell out, and go home? I guess I would if I thought that Tim were right. He's not, not in this instance.

There's something more fundamental going on here than "Proprietary software meets open source. Proprietary software decides to commandeer open source. Open source proves to be a nice lapdog to proprietary software." I actually believe that open source, not proprietary software, is the natural state of the industry, and that Tim's proprietary world is anomalous.

I particularly liked this distinction between the service aspects of software, and the attempts to view it as an instantiation of various intellectual monopolies:

Suddenly, the license matters more, not less, because it is the license that ensures the conversation focuses on the right topic - service - rather than on inane jabberings that only vendors care about. You know, like intellectual property.

And there's another crucial reason why proprietary software companies can't just open their chequebooks and acquire those pesky open source upstarts. Unlike companies that seem to think they are co-extensive with the intellectual monopolies they foist on customers, open source outfits know they are defined by the high-quality people - both employees and those out in the community - who write the code for their customers.

For example, one reason people take out subscriptions to Red Hat's offerings is that they get to stand in line for the use of Alan Cox's brain. Imagine, now, that proprietary company X "buys" Red Hat: well, what exactly does it buy? Certainly not Alan Cox's brain, which will leave with him (one hopes) when he moves immediately to another open source company (or just hacks away in Wales for pleasure). Sure, the purchaser will have all kinds of impressive legal documents spelling out what it "owns" - but precious little to offer customers anymore, who are likely to follow wherever Alan Cox and his ilk go.

The (Uncommon) Fedora Commons

When I first heard about Fedora Commons I naively assumed it had something to do with the Linux distro Fedora, but I was wrong:

Fedora Commons is a non-profit organization providing sustainable technologies to create, manage, publish, share and preserve digital content as a basis for intellectual, organizational, scientific and cultural heritage by bringing two communities together.

Communities of practice that include scholars, artists, educators, Web innovators, publishers, scientists, librarians, archivists, publishers, records managers, museum curators or anyone who presents, accesses, or preserves digital content.

Software developers who work on the cutting edge of open source Web and enterprise content technologies to ensure that collaboratively created knowledge is available now and in the future.

Fedora Commons is the home of the unique Fedora open source software, a robust integrated repository-centered platform that enables the storage, access and management of virtually any kind of digital content.

So not only is Fedora an organisation - recently funded to the tune of $4.9 million by the Gordon and Betty Moore Foundation - aiming to create a commons of "intellectual, organizational, scientific and cultural heritage", but it is also a piece of code:

Institutions and organizations face increasing demands to deliver rich digital content. A scan of the web reveals complex multi-media content that combines text, images, audio, and video. Much of this content is produced dynamically through the use of servlet technology and distributed web services.

Delivery of rich content is possible through a variety of technologies. But, delivery is only one aspect of a suite of content management tasks. Content needs to be created, ingested, and stored. It needs to be aggregated and organized in collections. It must be described with metadata. It must be available for reuse and refactoring. And, finally, it must be preserved.

Without some form of standardization, the costs of such management tasks become prohibitive. Content managers find themselves jury-rigging tasks onto each new form of digital content. In the end, they are faced with a maze of specialized tools, repositories, formats, and services that must be upgraded and integrated over time.

Content managers need a flexible content repository system that allows them to uniformly store, manage, and deliver all their existing content and that will accommodate new forms that will inevitably arise in the future.

Fedora is an open source digital repository system that meets these challenges.

In fact, Fedora is nothing less than "Flexible Extensible Digital Object Repository Architecture". So the name is logical - pity it's so confusing in the context of open source.
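
To make the "repository-centred" idea a bit more concrete, here is a minimal sketch of the sort of thing such a system manages: a digital object with a persistent identifier, some descriptive metadata, and one or more datastreams holding the actual content. This is purely illustrative - the class and field names are mine, not Fedora's actual API:

from dataclasses import dataclass, field

@dataclass
class Datastream:
    """One piece of content belonging to a digital object (an image, a PDF, some XML...)."""
    label: str
    mime_type: str
    location: str  # file path or URL where the bytes actually live

@dataclass
class DigitalObject:
    """A repository item: persistent identifier, descriptive metadata, datastreams."""
    pid: str  # persistent identifier, e.g. "demo:123"
    metadata: dict = field(default_factory=dict)  # Dublin Core-style key/value pairs
    datastreams: list = field(default_factory=list)

# A toy object: a scanned manuscript page plus its transcription.
page = DigitalObject(
    pid="demo:123",
    metadata={"title": "Sample manuscript, page 1", "creator": "Unknown"},
    datastreams=[
        Datastream("Scanned image", "image/tiff", "/data/demo123/page1.tif"),
        Datastream("Transcription", "text/xml", "/data/demo123/page1.xml"),
    ],
)
print(page.pid, [d.label for d in page.datastreams])

The point of standardising on a structure like this is that ingest, search, preservation and re-use tools can then be written once, rather than being jury-rigged onto every new content format.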

Linux Weather Forecast

Get your umbrellas out for the Linux Weather Forecast:

The need for a Linux Weather Forecast arises out of Linux’s unique development model. With proprietary software, product managers define a “roadmap” they deliver to engineers to implement, based on their assessments of what users want, generally gleaned from interactions with a few customers. While these roadmaps are publicly available, they are frequently not what actually gets technically implemented and are often delivered far later than the optimistic timeframes promised by proprietary companies.

Conversely, in Linux and open source software, users contribute directly to the software, setting the direction with their contributions. These changes can quickly get added to the mainline kernel and other critical packages, depending on quality and usefulness. This quick feedback and development cycle results in fast software iterations and rapid feature innovation. A new kernel version is generally released every three months, new desktop distributions every six months, and new enterprise distributions every 18 months.

While the forecast is not a roadmap or centralized planning tool, the Linux Weather Forecast gives users, ISVs, partners and developers a chance to track major developments in Linux and adjust their business accordingly, without having to comb through mailing lists of the thousands of developers currently contributing to Linux. Through the Linux Weather Forecast, users and ecosystem members can track the amazing innovation occurring in the Linux community. This pace of software innovation is unmatched in the history of operating systems. The Linux Weather Forecast will help disseminate the right information to the ever growing audience of Linux developers and users in the server, desktop and mobile areas of computing, and will complement existing information available from distributions in those areas.

Good to see Jonathan Corbet, editor of LWN.net, for whom I write occasionally, spreading some of his deep kernelly knowledge in this way.

14 August 2007

RSS as the Lubricant of Openness

Facebook creaks open a little more - and RSS is the lubricant. (Via TechCrunch.)

GiveMeaning? - Give Me a Break

I wrote recently about the plight of the Tibetan people. One of the problems is that it is hard for the average non-Tibetan to do much to help the situation. So I was pleased that Boing Boing pointed me to what sounded like a worthy cause that might, even if only in a small way, help preserve Tibetan culture:

The Tibetan Endangered Music Project has so far recorded about 400 endangered traditional Tibetan songs. We now have the opportunity to make these songs available online, at a leading Tibetan language website (www.tibettl.com). However, this volunteer-run website is unable to fund hosting for our material. The cost of hosting space is 1.5 RMB (less than 20 US cents) for every MB. One song in mp3 format is approximately 1.5 MB. 1900 USD would allow us to buy 10 GB of hosting space, which will take care of all our needs for the foreseeable future (allowing 6700 1.5 MB songs to be uploaded). It would also allow us to expand to video hosting in the future, or to provide high quality (.wav) formats instead of only compressed mp3 format.
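
As an aside, the arithmetic in the appeal more or less checks out - here is a quick back-of-the-envelope sketch of my own, taking 10 GB as 10,000 MB, which appears to be the assumption behind their 6,700-song figure:

# Rough check of the hosting figures quoted above.
rmb_per_mb = 1.5
mb_total = 10_000                      # 10 GB of hosting, taken as 10,000 MB
rmb_total = mb_total * rmb_per_mb      # 15,000 RMB
usd_total = 1900                       # the sum being raised
implied_rate = rmb_total / usd_total   # ~7.9 RMB per US dollar, in line with 2007 exchange rates
songs = int(mb_total / 1.5)            # ~6,666 songs of 1.5 MB each - their "6700"
print(rmb_total, round(implied_rate, 2), songs)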

Wow - preserving the Tibetan musical commons for the Tibetans: sign me up, I thought.

So I did sign up. But that's where the problem began.

Despite being signed up and logged in, I could not - cannot - find anywhere to give money to this lot. Now, naively, I would have thought that a site called GiveMeaning, expressly designed to help people give money to worthy causes, would, er, you know, help people give money, maybe with a big button saying "GIVE NOW". But what do I know? I've only been using the Web for about 14 years, so maybe I'm still a little wet behind the ears.

On the other hand, it could just be that this is one of the most stupid sites in the known universe, designed to drive altruists mad as a punishment for wanting to help others. Either way, it looks like the Tibetan musical commons is going to have to do without my support, which is a pity.

Why Openness Matters - Doubly

Here's a great demonstration of why openness is so important.

Wikipedia is famously open, so in general anyone can edit stuff. But this editing is also done in the open, in that all changes are tracked. Now, some people edit anonymously, but their IP addresses are logged. This information too is freely available, so here's an idea that some bright chap had:

Griffith thus downloaded the entire encyclopedia, isolating the XML-based records of anonymous changes and IP addresses. He then correlated those IP addresses with public net-address lookup services such as ARIN, as well as private domain-name data provided by IP2Location.com.

The result: A database of 5.3 million edits, performed by 2.6 million organizations or individuals ranging from the CIA to Microsoft to Congressional offices, now linked to the edits they or someone at their organization's net address has made.
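
The core of the trick is simple enough to sketch in a few lines. This is not Griffith's actual code, just a rough illustration of the first step - streaming through a Wikipedia XML dump and counting edits per anonymous contributor IP (the dump filename is a placeholder; mapping IPs to organisations via whois or IP2Location would be a separate step):

import xml.etree.ElementTree as ET
from collections import Counter

def anonymous_edit_counts(dump_path):
    """Count edits per contributor IP in a MediaWiki XML dump (anonymous edits only)."""
    counts = Counter()
    # iterparse streams the file, so even a multi-gigabyte dump can be processed
    for _, elem in ET.iterparse(dump_path, events=("end",)):
        tag = elem.tag.rsplit("}", 1)[-1]  # drop the XML namespace prefix
        if tag == "ip":                    # anonymous contributors are recorded by IP address
            counts[elem.text] += 1
        elif tag == "page":
            elem.clear()                   # free memory used by pages we have finished with
    return counts

for ip, edits in anonymous_edit_counts("enwiki-dump.xml").most_common(10):
    print(ip, edits)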

As a result, dedicated crowd-sourcers are poring over Wikipedia, digging out those embarrassing self-edits. For example:

On Christmas Eve 2004, a Disney user deleted a citation on the "digital rights management" page to DRM critic Cory Doctorow along with a link to a speech he gave to Microsoft's Research Group on the subject. Later, a Disney user altered the "opponents" discussion of the entry, arguing that consumers embrace DRM: "In general, consumers knowingly enter into the arrangement where they are granted limited use of the content."

or:

"Removed ECHELON link, irrelevant to article," reads the comment explaining this cut. The contributor's IP address belongs to the National Security Agency.

or even:

Microsoft's MSN Search is now "a major competitor to Google". Take it from this anonymous contributor, whose IP address belongs to Waggener Edstrom, Microsoft's PR firm.

Now that's what I call openness.

Amazon Goes Lulu

I'm a big fan of Lulu.com, the self-publishing company, not least because the man behind it, Bob Young, also co-founded Red Hat, and is one of the most passionate defenders of the open source way I have come across.

So the news that CreateSpace is going into the on-demand publishing business is interesting - especially since the company is a subsidiary of Amazon, which means that self-published authors will be able to hitch a ride on the Amazon behemoth. As far as I can tell, Lulu still offers a more thorough vision, with its global reach and finer-grained publishing options. But if nothing else, Amazon's entry into this space will serve to validate the whole idea in the eyes of doubters.

Google Books: A Cautionary Tale

Google Books is important:

the Google Project has, however unintentionally, made not only conventional libraries themselves, but other projects digitizing cultural artifacts appear inept or inadequate. Project Gutenberg and its 17,000 books in ascii appear insignificant and superfluous beside the millions of books that Google is contemplating. So do most scanning projects by conventional libraries. As a consequence of the assumed superiority of Google’s approach, therefore, it is highly unlikely that either the funds or the energies for an alternative project of similar magnitude will become available, nor are the libraries who are lending their books (at significant costs to their funds, their books, and their users) likely to undertake such an effort a second time. With each scanned page, Google Books’ Library Project, by its quantity if not necessarily by its quality, makes the possibility of a better alternative unlikely. The Project may then become the library of the future, whatever its quality, by default. So it does seem important to probe what kind of quality Google Book Project might present to an ordinary user that Google envisages wanting to find a book.

But also unsatisfactory:

The Google Books Project is no doubt an important, in many ways invaluable, project. It is also, on the brief evidence given here, a highly problematic one. Relying on the power of its search tools, Google has ignored elemental metadata, such as volume numbers. The quality of its scanning (and so we may presume its searching) is at times completely inadequate. The editions offered (by search or by sale) are, at best, regrettable.

Rather worrying. (Via O'Reilly Radar.)

Microsoft Bends its Knee to the OSI

So, Microsoft has finally done it, and submitted two of its licences to the OSI for approval. Here's my earlier analysis of what's going on.

A Public Enquiry into the Public Domain

The public domain is a vastly underappreciated resource - which probably explains why there have been so many successful assaults on it in recent years through copyright, patent and trademark extensions. But now, it seems, people are starting to wake up to its central importance for the digital world:

The new tools of the information society make that public domain material has a considerable potential for re-use - by citizens or for new creative expressions (e.g. documentaries, services for tourism, learning material). It contains published works, such as literary or artistic works, music and audiovisual material for which copyright has expired, material that has been assigned to the public domain by the right holders or by law, mathematical methods, algorithms, methods of presenting information and raw data, such as facts and numbers. A rich public domain has, logically, the potential to stimulate the further development of the information society. It would provide creators – e.g. documentary makers, musicians, multimedia producers, but also schoolchildren doing a Web project – with raw material that they can build on and experiment with, without high transaction or other costs. This is particularly important in the digital context, where the integration of existing material has become much easier.

Although there is some evidence of its importance, there has been no systematic attempt to map or measure its social and economic impact. This is a problem when addressing policy issues that build on public domain material (e.g. digital libraries) or that have an impact on the public domain (e.g. discussions on intellectual property instruments) in the digital age.

The European Union aims to remedy this lack with a study:

Call for tender: "Assessment of the Economic and Social impact of the Public Domain in the Information Society" was published today in the Supplement to the Official Journal of the European Union 2007/S 151-187363. The envisaged purpose of the assessment is to analyse the economic and social impact of the public domain and to gauge its potential to contribute for the benefit of the citizens and the economy.

Portuguese Ministry of Education Goes Free

The Portuguese Ministry of Education is doing the sensible thing and giving away a CD full of free (Windows) software to 1.6 million students, saving itself (and the taxpayers) around 300 million Euros. Nothing amazing about that, perhaps - except that not everyone does it.

What's more interesting, for me, at least, is the set of software included on the CD:

* OpenOffice.org
* Firefox
* Thunderbird
* NVU
* Inkscape
* GIMP

These are pretty much the cream of the free software world, and they show the increasing depth of free software on the desktop. Also interesting are the specifically educational programs included:

* Freemind and CmapTools
* Celestia
* Geogebra
* JMOL
* Modellus

Some of these were new to me, notably Geogebra:

GeoGebra is a free and multi-platform dynamic mathematics software for schools that joins geometry, algebra and calculus.

and Modellus (which isn't actually free software, just free):

Modellus enables students and teachers (high school and college) to use mathematics to create or explore models interactively.

It's always surprised me that more use isn't made of free software in education, since the benefits are obvious: by pooling efforts, duplication is eliminated, and the quality of tools improved. (Via Erwin Tenhumberg.)

13 August 2007

Red Hat Meets Eclipse

Here's an interesting example of major open source projects meeting to produce a highly targeted commercial product:

Red Hat Developer Studio is a set of eclipse-based development tools that are pre-configured for JBoss Enterprise Middleware Platforms and Red Hat Enterprise Linux. Developers are not required to use Red Hat Developer Studio to develop on JBoss Enterprise Middleware and/or Red Hat Linux. But, many find these pre-configured tools offer significant time-savings and value, making them more productive and speeding time to deployment.

Google's Gift of Taking

Absolutely:

It's not often that Google kills off one of its services, especially one which was announced with much fanfare at a big mainstream event like CES 2006. Yet Google Video's commercial aspirations have indeed been terminated: the company has announced that it will no longer be selling video content on the site. The news isn't all that surprising, given that Google's commercial video efforts were launched in rather poor shape and never managed to take off. The service seemed to only make the news when embarrassing things happened.

Yet now Google Video has given us a gift—a "proof of concept" in the form of yet another argument against DRM—and an argument for more reasonable laws governing copyright controls. How could Google's failure be our gain? Simple. By picking up its marbles and going home, Google just demonstrated how completely bizarre and anti-consumer DRM technology can be. Most importantly, by pulling the plug on the service, Google proved why consumers have to be allowed to circumvent copy controls.

12 August 2007

The Real Spectrum Commons

I have referred to radio spectrum as a commons several times in this blog. But there's a problem: since spectrum seems to be rivalrous - if I'm using a frequency, you can't - the threat of a tragedy of the commons has to be met by regulation. And that, as we have seen, is often unsatisfactory, not least because powerful companies usually get the lion's share.

But it seems - luckily - I was wrong about spectrum necessarily being rivalrous:

Software defined radio that is beginning to emerge from the labs into actual tests has the ability to render all spectrum management moot. Small wonder that the legal mandarins there have begun to sneer that open source SDR cannot be trusted.

In other words, when you make radio truly digital, it can be intelligent enough simply to avoid the problem of commons over-use, sensing what is already being transmitted and working around it.
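
A toy sketch of that idea - not real software-defined radio code, just an illustration of "listen before you transmit", with made-up occupancy probabilities standing in for genuine spectrum sensing:

import random

CHANNELS = range(1, 12)  # pretend we have 11 usable channels

def channel_busy(channel):
    """Stand-in for real spectrum sensing: is someone already using this channel?"""
    return random.random() < 0.4  # 40% chance of being busy - an invented figure

def pick_free_channel():
    """Listen before transmitting: only use a channel nobody else currently occupies."""
    free = [ch for ch in CHANNELS if not channel_busy(ch)]
    return random.choice(free) if free else None

channel = pick_free_channel()
if channel is None:
    print("All channels busy - back off and try again later")
else:
    print(f"Transmitting on channel {channel}")

The crucial point is that the radio itself, rather than a regulator, does the work of avoiding collisions - which is what makes the rivalry, and hence the tragedy-of-the-commons argument, largely disappear.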

11 August 2007

Irony in the Blood

Well spotted:


To recap:

1. In all likelihood, fossil fuel emissions are one of the primary causes of global warming;

2. global warming has melted the Arctic ice cap faster than any time on record; so

3. Russia, Denmark, Canada, and the United States are racing to make a no-more-land grab in the Arctic; in order to

4. claim fossil fuel drilling rights for the Arctic seabed.

Middle Kingdom Patently on the Way to the Top

This could have interesting repercussions:

China has seen a sharp increase in requests for patents, according to the UN's intellectual property agency.

The number of requests for patents in China grew by 33% in 2005 compared with the previous year.

That gives it the world's third highest number behind Japan and the United States, the agency said.

Why is this important? Well, currently, patents are being pushed largely by the US as a way of asserting itself economically, notably against that naughty China, which, it is frequently claimed, just rips off the West's ideas. But as China becomes one of the world's leading holders of patents, we can expect to see it start asserting those against everyone else - including the US. Which might suddenly find that it is not quite so keen on those unfair intellectual monopolies after all....