15 July 2009

Bill Gates Gets Sharing...Almost

Yesterday I wrote about Microsoft's attempt to persuade scientists to adopt its unloved Windows HPC platform by throwing in a few free (as in beer) programs. Here's another poisoned chalice that's being offered:

In between trying to eradicate polio, tame malaria, and fix the broken U.S. education system, Gates has managed to fulfill a dream of taking some classic physics lectures and making them available free over the Web. The lectures, done in 1964 by noted scientist (and Manhattan Project collaborator) Richard Feynman, take notions such as gravity and explains how they work and the broad implications they have in understanding the ways of the universe.

Gates first saw the series of lectures 20 years ago on vacation and dreamed of being able to make them broadly available. After spending years tracking down the rights--and spending some of his personal fortune--Gates has done just that. Tapping his colleagues in Redmond to create interactive software to accompany the videos, Gates is making the collection available free from the Microsoft Research Web site.

What a kind bloke - spending his *own* personal fortune of uncountable billions, just to make this stuff freely available.

But wait: what do we find when we go to that "free" site:

Clicking will install Microsoft Silverlight.

So it seems that this particular free has its own non-free (as in freedom) payload: what a surprise.

That's a disappointment - but hardly unexpected; Microsoft's mantra is that you don't get something for nothing. But elsewhere in the interview with Gates, there's some rather interesting stuff:

Education, particularly if you've got motivated students, the idea of specializing in the brilliant lecture and text being done in a very high-quality way, and shared by everyone, and then the sort of lab and discussion piece that's a different thing that you pick people who are very good at that.

Technology brings more to the lecture availability, in terms of sharing best practices and letting somebody have more resources to do amazing lectures. So, you'd hope that some schools would be open minded to this fitting in, and making them more effective.

What's interesting is that his new-found work in the field of education is bringing him constantly face-to-face with the fact that sharing is actually rather a good thing, and that the more the sharing of knowledge can be facilitated, the better the results.

Of course, he's still trapped by the old Microsoft mindset, and constantly thinking how he can exploit that sharing, in this case by freighting it with all kinds of Microsoft gunk. But at least he's started on the journey, albeit unknowingly.

Follow me @glynmoody on Twitter or identi.ca.

14 July 2009

Hamburg Declaration = Humbug Declaration

You may have noticed that in the 10 years since Napster, the music industry has succeeded in almost completely ruining its biggest opportunity to make huge quantities of money, alienating just about anyone under 30 along the way (and a fair number of us old fogies, too).

Alas, it seems that some parts of the newspaper industry have been doing their job of reporting so badly that they missed that particular news item. For what do they want to do? Follow the music industry's lemming-like plunge off the cliff of "new intellectual property rights protection":

On the day that Commissioner Viviane Reding unveils her strategy for a Digital Europe during the Lisbon Council, and as the European Commission's consultation on the Content Online Report draws to a close this week, senior members of the publishing world are presenting to Information Society Commissioner Viviane Reding and Internal Market Commissioner Charlie McCreevy, a landmark declaration adopted on intellectual property rights in the digital world in a bid to ensure that opportunities for a diverse, free press and quality journalism thrive online into the future.

This is the first press communiqué on a significant meeting convened on 26th June in Berlin by news group Chief Executives from both the EPC and the World Association of Newspapers where the 'Hamburg Declaration' was signed, calling for online copyright to be respected, to allow innovation to thrive and consumers to be better served.

This comes from an extraordinary press release, combining arrogant self-satisfaction with total ignorance about how the Internet works:

A fundamental safeguard of democratic society is a free, diverse and independent press. Without control over our intellectual property rights, the future of quality journalism is at stake and with it our ability to provide our consumers with quality and varied information, education and entertainment on the many platforms they enjoy.

What a load of codswallop. What makes them think they are the sole guardians of that "free, diverse and independent press"? In case they hadn't noticed, the Internet is rather full of "quality and varied information, education and entertainment on the many platforms", most of it quite independent of anything so dull as a newspaper. As many others have pointed out, quality journalism is quite separate from old-style press empires, even if the latter have managed to produce the former from time to time.

Then there's this:

We continue to attract ever greater audiences for our content but, unlike in the print or TV business models, we are not the ones making the money out of our content. This is unsustainable.

Well, at least they got the last bit. But if they are attracting "ever greater audiences" for their content, but are not making money, does this not suggest that they are doing something fundamentally wrong? In a former incarnation, I too was a publisher. When things went badly, I did not immediately call for new laws: I tried again with something different. How about if newspaper publishers did the same?

This kind of self-pitying bleating would be extraordinary enough were it coming out of a vacuum; but given the decade of exemplary failure by the music industry taking *exactly* the same approach, it suggests a wilful refusal to look reality in the face that is quite extraordinary.

Speaking personally, the sooner all supporters of the Humbug Declaration are simply omitted from every search engine on Earth, the better: I'm sure we won't miss them, but they sure will miss the Internet...

Follow me @glynmoody on Twitter or identi.ca.

I Fear Microsoft Geeks Bearing Gifts...

Look, those nice people at Microsoft Research are saving science from its data deluge:

Addressing an audience of prominent academic researchers today at the 10th annual Microsoft Research Faculty Summit, Microsoft External Research Corporate Vice President Tony Hey announced that Microsoft Corp. has developed new software tools with the potential to transform the way much scientific research is done. Project Trident: A Scientific Workflow Workbench allows scientists to easily work with large volumes of data, and the specialized new programs Dryad and DryadLINQ facilitate the use of high-performance computing.

Created as part of the company’s ongoing efforts to advance the state of the art in science and help address world-scale challenges, the new tools are designed to make it easier for scientists to ingest and make sense of data, get answers to questions at a rate not previously possible, and ultimately accelerate the pace of achieving critical breakthrough discoveries. Scientists in data-intensive fields such as oceanography, astronomy, environmental science and medical research can now use these tools to manage, integrate and visualize volumes of information. The tools are available as no-cost downloads to academic researchers and scientists at http://research.microsoft.com/en-us/collaboration/tools.

Aw, shucks, isn't that just *so* kind? Doing all this out of the goodness of their hearts? Or maybe not:

Project Trident was developed by Microsoft Research’s External Research Division specifically to support the scientific community. Project Trident is implemented on top of Microsoft’s Windows Workflow Foundation, using the existing functionality of a commercial workflow engine based on Microsoft SQL Server and Windows HPC Server cluster technologies. DryadLINQ is a combination of the Dryad infrastructure for running parallel systems, developed in the Microsoft Research Silicon Valley lab, and the Language-Integrated Query (LINQ) extensions to the C# programming language.

So basically Project Trident is more Project Trojan Horse - an attempt to get Microsoft HPC Server cluster technologies into the scientific community without anyone noticing. And why might Microsoft be so keen to do that? Maybe something to do with the fact that Windows currently runs just 1% of the top 500 supercomputing sites, while GNU/Linux has over 88% share.

Microsoft's approach here can be summed up as: accept our free dog biscuit, and be lumbered with a dog.

Follow me @glynmoody on Twitter or identi.ca.

Batik-Makers Say "Tidak" to Copyright

Yesterday I was talking about how patents are used to propagate Western ideas and power; here's a complementary story about local artists understanding that copyright just ain't right for them:


Joko, speaking at this year’s Solo Batik Fashion Festival over the weekend, said that the ancient royal city was one of the principal batik cities in Indonesia, with no fewer than 500 unique motifs created here that are not found in any other region. The inventory process, however, was hampered by the reluctance of the batik makers to claim ownership over pieces.

The head of the Solo trade and industry office, Joko Pangarso, said copyright registration work had begun last year, but was constantly held up when it was found a particular batik only had a motif name because the creator declined to attach their own.

“So far only 10 motifs have been successfully included in the list,” he said. “The creators acknowledged their creations but asked for minimal exposure.

Interestingly, this is very close to the situation for software. The batik motifs correspond to subroutines: both are part of the commons that everyone draws upon; copyrighting those patterns is as counter-productive as patenting subroutines, since it makes further creation almost impossible without "infringement". This reduces overall creativity - precisely the opposite of the effect that intellectual monopolists claim. (Via Boing Boing.)

Follow me @glynmoody on Twitter or identi.ca.

13 July 2009

National Portrait Gallery: Nuts

This is so wrong:

Below is a letter I received from legal representatives of the National Portrait Gallery, London, on Friday, July 10, regarding images of public domain paintings in Category:National Portrait Gallery, London and threatening direct legal action under UK law. The letter is reproduced here to enable public discourse on the issue. For a list of sites discussing this event see User:Dcoetzee/NPG legal threat/Coverage. I am consulting legal representation and have not yet taken action.

Look, NPG, your job is to get people to look at your pix. Here's some news: unless they're in London, they can't do that. Put those pix online, and (a) they get to see the pix and (b) when they're in London, they're more likely to come and visit, no?

So you should be *encouraging* people to upload your pix to places like Wikipedia; you should be thanking them. The fact that you are threatening them with legal action shows that you don't have even an inkling of what you are employed to do.

Remind me not to pay the part of my UK taxes that goes towards your salary....

Are Patents Intellectual Monopolies? You Decide

Talking of intellectual monopolies, you may wonder why I use this term (well, if you've been reading this blog for long, you probably don't). But in any case, here's an excellent exposition as to why, yes, patents are indeed monopolies:

On occasion you get some defender of patents who is upset when we use the m-word to describe these artificial state-granted monopoly rights. For example here one Dale Halling, a patent attorney (surprise!) posts about "The Myth that Patents are a Monopoly" and writes, " People who suggest a patent is a monopoly are not being intellectually honest and perpetuating a myth to advance a political agenda."

Well, let's see.

Indeed, do read the rest of yet another great post from the Against Monopoly site.

Follow me @glynmoody on Twitter or identi.ca.

What Are Intellectual Monopolies For?

If you still doubted that intellectual monopolies are in part a neo-colonialist plot to ensure the continuing dominance of Western nations, you could read this utterly extraordinary post, which begins:

The fourteenth session of the WIPO Intergovernmental Committee on Genetic Resources, Traditional Knowledge and Folklore (IGC), convened in Geneva from June 29, 2009 to July 3, 2009, collapsed at the 11th hour on Friday evening as the culmination of nine years of work over fourteen sessions resulted in the following language; “[t]he Committee did not reach a decision on this agenda item” on future work. The WIPO General Assembly (September 2009) will have to untangle the intractable Gordian knot regarding the future direction of the Committee.

At the heart of the discussion lay a proposal by the African Group which called for the IGC to submit a text to the 2011 General Assembly containing “a/(n) international legally binding instrument/instruments” to protect traditional cultural expressions (folklore), traditional knowledge and genetic resources. Inextricably linked to the legally binding instruments were the African Group’s demands for “text-based negotiations” with clear “timeframes” for the proposed program of work. This proposal garnered broad support among a group of developing countries including Malaysia, Thailand, Fiji, Bolivia, Brazil, Ecuador, Philippines, Sri Lanka, Cuba, Yemen India, Peru, Guatemala, China, Nepal and Azerbaijan. Indonesia, Iran and Pakistan co-sponsored the African Group proposal.

The European Union, South Korea and the United States could not accept the two principles of “text-based negotiations” and “internationally legally binding instruments”.

Australia, Canada and New Zealand accepted the idea of “text-based negotiations” but had reservations about “legally binding instruments” granting sui generis protection for genetic resources, traditional knowledge and folklore.

We can't possibly have developing countries protecting their traditional medicine and national lore - "genetic resources, traditional knowledge and folklore" - from being taken and patented by the Western world. After all, companies in the latter have an inalienable right to turn a profit by licensing that same traditional knowledge back to the countries it was stolen from (this has already happened). That's what intellectual monopolies are for.

Follow me @glynmoody on Twitter or identi.ca.

10 July 2009

This Could Save Many Lives: Let's Patent It

Bill Gates is amazing; just look at this brilliant idea he's come up with:

using large fleets of vessels to suppress hurricanes through various methods of mixing warm water from the surface of the ocean with colder water at greater depths. The idea is to decrease the surface temperature, reducing or eliminating the heat-driven condensation that fuels the giant storms.

Against the background of climate change and increased heating of the ocean's surface in areas where hurricanes emerge, just imagine how many lives this could save - a real boon for mankind. Fantastic.

Just one problemette: he's decided to patent the idea, along with his clever old chum Nathan Myhrvold.

The filings were made by Searete LLC, an entity tied to Intellectual Ventures, the Bellevue-based patent and invention house run by Nathan Myhrvold, the former Microsoft chief technology officer. Myhrvold and several others are listed along with Gates as inventors.

After all, can't have people just going out there and saving thousands of lives without paying for the privilege, can we?

Follow me @glynmoody on Twitter or identi.ca.

Do We Need Open Access Journals?

One of the key forerunners of the open access idea was arxiv.org, set up by Paul Ginsparg. Here's what I wrote a few years back about that event:

At the beginning of the 1990s, Ginsparg wanted a quick and dirty solution to the problem of putting high-energy physics preprints (early versions of papers) online. As it turns out, he set up what became the arXiv.org preprint repository on 16 August, 1991 – nine days before Linus made his fateful “I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones” posting. But Ginsparg's links with the free software world go back much further.

Ginsparg was already familiar with the GNU manifesto in 1985, and, through his brother, an MIT undergraduate, even knew of Stallman in the 1970s. Although arXiv.org only switched to GNU/Linux in 1997, it has been using Perl since 1994, and Apache since it came into existence. One of Apache's founders, Rob Hartill, worked for Ginsparg at the Los Alamos National Laboratory, where arXiv.org was first set up (as an FTP/email server at xxx.lanl.org). Other open source programs crucial to arXiv.org include TeX, GhostScript and MySQL.

arxiv.org was and is a huge success, and that paved the way for what became the open access movement. But here's an interesting paper - hosted on arxiv.org:

Contemporary scholarly discourse follows many alternative routes in addition to the three-century old tradition of publication in peer-reviewed journals. The field of High-Energy Physics (HEP) has explored alternative communication strategies for decades, initially via the mass mailing of paper copies of preliminary manuscripts, then via the inception of the first online repositories and digital libraries.

This field is uniquely placed to answer recurrent questions raised by the current trends in scholarly communication: is there an advantage for scientists to make their work available through repositories, often in preliminary form? Is there an advantage to publishing in Open Access journals? Do scientists still read journals or do they use digital repositories?

The analysis of citation data demonstrates that free and immediate online dissemination of preprints creates an immense citation advantage in HEP, whereas publication in Open Access journals presents no discernible advantage. In addition, the analysis of clickstreams in the leading digital library of the field shows that HEP scientists seldom read journals, preferring preprints instead.

Here are the article's conclusions:

Scholarly communication is at a cross road of new technologies and publishing models. The analysis of almost two decades of use of preprints and repositories in the HEP community provides unique evidence to inform the Open Access debate, through four main findings:

1. Submission of articles to an Open Access subject repository, arXiv, yields a citation advantage of a factor five.

2. The citation advantage of articles appearing in a repository is connected to their dissemination prior to publication, 20% of citations of HEP articles over a two-year period occur before publication.

3. There is no discernable citation advantage added by publishing articles in “gold” Open Access journals.

4. HEP scientists are between four and eight times more likely to download an article in its preprint form from arXiv rather than its final published version on a journal web site.

On the one hand, it would be ironic if the very field that acted as a midwife to open access journals should also be the one that begins to undermine them through a move to repository-based open publishing of preprints. On the other, it doesn't really matter; what's important is open access to the papers. Whether these are in preprint form or appear as fully-fledged articles in peer-reviewed open access journals is a detail, for the users at least; it's more of a challenge for publishers, of course... (Via @JuliuzBeezer.)

Follow me @glynmoody on Twitter or identi.ca.

08 July 2009

Not Kissing the Rod, Oh My Word, No

Becta today [6 July 2009] welcomes Microsoft's launch of the new Subscription Enrolment Schools Pilot (SESP) for UK schools, which provides greater flexibility and choice for schools who wish to use a Microsoft subscription agreement.

Great, and what might that mean, exactly?

The new licensing scheme removes the requirement that schools using subscription agreements pay Microsoft to licence systems that are using their competitor's technologies. So for the first time schools using Microsoft's subscription licensing agreements can decide for themselves how much of their ICT estate to licence.

So BECTA is celebrating the fact that schools - that is, we taxpayers - *no longer* have to "pay Microsoft to licence systems that are using their competitor's technologies"? They can now use GNU/Linux, for example, *without* having to pay Microsoft for the privilege?

O frabjous day! Callooh! Callay!

Follow me @glynmoody on Twitter or identi.ca.

Policing the Function Creep...

Remember how the poor darlings in the UK government absolutely *had to* allow interception of all our online activities so that those plucky PC Plods could maintain their current stunning success rate in their Whirr on Terruh and stuff like that? Well, it seems that things have changed somewhat:

Detectives will be required to consider accessing telephone and internet records during every investigation under new plans to increase police use of communications data.

The policy is likely to significantly increase the number of requests for data received by ISPs and telephone operators.

Just as every investigation currently has to include a strategy to make use of its subjects' financial records, soon CID officers will be trained to always draw up a plan to probe their communications.

The plans have been developed by senior officers in anticipation of the implementation of the Interception Modernisation Programme (IMP), the government's multibillion pound scheme to massively increase surveillance of the internet by storing details of who contacts whom online.

Er, come again? "CID officers will be trained to always draw up a plan to probe their communications"? How does that square with this being a special tool for those exceptional cases when those scary terrorists and real hard naughty criminals are using tricky high-tech stuff like email? Doesn't it imply that we are all terrorist suspects and hard 'uns now?

Police moves to prepare for the glut of newly accessible data were revealed today by Deputy Assistant Commissioner Janet Williams. She predicted always considering communications data will lead to a 20 per cent increase in the productivity of CID teams.

She told The Register IMP had "informed thinking" about use of communications data, but denied the plans gave the lie to the government line that massively increased data retention will "maintain capability" of law enforcement to investigate crime.

Well, Mandy Rice-Davies applies, m'lud...

Follow me @glynmoody on Twitter or identi.ca.

07 July 2009

Are Microsoft's Promises For Ever?

This sounds good:

I have some good news to announce: Microsoft will be applying the Community Promise to the ECMA 334 and ECMA 335 specs.

ECMA 334 specifies the form and establishes the interpretation of programs written in the C# programming language, while the ECMA 335 standard defines the Common Language Infrastructure (CLI) in which applications written in multiple high-level languages can be executed in different system environments without the need to rewrite those applications to take into consideration the unique characteristics of those environments.

"The Community Promise is an excellent vehicle and, in this situation, ensures the best balance of interoperability and flexibility for developers," Scott Guthrie, the Corporate Vice President for the .Net Developer Platform, told me July 6.

It is important to note that, under the Community Promise, anyone can freely implement these specifications with their technology, code, and solutions.

You do not need to sign a license agreement, or otherwise communicate to Microsoft how you will implement the specifications.

The Promise applies to developers, distributors, and users of Covered Implementations without regard to the development model that created the implementations, the type of copyright licenses under which it is distributed, or the associated business model.

Under the Community Promise, Microsoft provides assurance that it will not assert its Necessary Claims against anyone who makes, uses, sells, offers for sale, imports, or distributes any Covered Implementation under any type of development or distribution model, including open-source licensing models such as the LGPL or GPL.

But boring old sceptic that I am, I have memories of this:

The Software Freedom Law Center (SFLC), provider of pro-bono legal services to protect and advance free and open source software, today published a paper that considers the legal implications of Microsoft's Open Specification Promise (OSP) and explains why it should not be relied upon by developers concerned about patent risk.

SFLC published the paper in response to questions from its clients and the community about the OSP and its compatibility with the GNU General Public License (GPL). The paper says that the promise should not be relied upon because of Microsoft's ability to revoke the promise for future versions of specifications, the promise's limited scope, and its incompatibility with free software licenses, including the GPL.

That was then, of course; what about now? Well, here's what the FAQ says on the subject:

Q: Does this CP apply to all versions of the specification, including future revisions?

A: The Community Promise applies to all existing versions of the specifications designated on the public list posted at /interop/cp/, unless otherwise noted with respect to a particular specification.


Now, is it just me, or does Microsoft conspicuously fail to answer its own question? The question was: does it apply to all versions, *including* future revisions? And Microsoft's answer is about *existing* versions: so doesn't that mean it could simply not apply the promise to a future version? Isn't this the same problem as with the Open Specification Promise? Just asking.

03 July 2009

The Engine of Scientific Progress: Sharing

Here's a post saying pretty much what I've been saying, but in a rather different way:


Here we present a simple model of one of the most basic uses of results, namely as the engine of scientific progress. Research results are more than just accumulated knowledge. Research results make possible new questions, which in turn lead to even more knowledge. The resulting pattern of exponential growth in knowledge is called an issue tree. It shows how individual results can have a value far beyond themselves, because they are shared and lead to research by others.
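The exponential pattern that quote describes is easy to see in a toy model. Here's my own illustrative sketch (not from the post itself - the branching factor and generation count are made-up parameters): each shared result opens up a fixed number of new questions, and each answered question becomes a shared result in its turn.

```python
# Toy "issue tree": every shared result spawns `branching` new questions,
# each of which, once answered, becomes a shared result itself.
# Purely illustrative; the parameters are arbitrary.

def issue_tree_sizes(branching, generations):
    """Return the number of new results produced in each generation."""
    sizes = [1]  # generation 0: a single initial result
    for _ in range(generations):
        # each result in the previous generation opens `branching` questions
        sizes.append(sizes[-1] * branching)
    return sizes

sizes = issue_tree_sizes(branching=2, generations=5)
print(sizes)       # [1, 2, 4, 8, 16, 32]
print(sum(sizes))  # 63 results in total after five generations
```

The point the model makes is that hoarding any single result prunes not just that node but the entire subtree of follow-on research beneath it - which is why the value of a shared result can be far greater than the result itself.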


Follow me @glynmoody on Twitter or identi.ca.

02 July 2009

Patents Don't Promote Innovation: Study

It's extraordinary how the myth that patents somehow promote innovation is still propagated and widely accepted; and yet there is practically *no* empirical evidence that it's true. All the studies that have looked at this area rigorously come to quite a different conclusion. Here's yet another nail in that coffin, using a very novel approach:

Patent systems are often justified by an assumption that innovation will be spurred by the prospect of patent protection, leading to the accrual of greater societal benefits than would be possible under non-patent systems. However, little empirical evidence exists to support this assumption. One way to test the hypothesis that a patent system promotes innovation is to simulate the behavior of inventors and competitors experimentally under conditions approximating patent and non-patent systems.

Employing a multi-user interactive simulation of patent and non-patent (commons and open source) systems ("PatentSim"), this study compares rates of innovation, productivity, and societal utility. PatentSim uses an abstracted and cumulative model of the invention process, a database of potential innovations, an interactive interface that allows users to invent, patent, or open source these innovations, and a network over which users may interact with one another to license, assign, buy, infringe, and enforce patents.

Data generated thus far using PatentSim suggest that a system combining patent and open source protection for inventions (that is, similar to modern patent systems) generates significantly lower rates of innovation (p<0.05), productivity (p<0.001), and societal utility (p<0.002) than does a commons system. These data also indicate that there is no statistical difference in innovation, productivity, or societal utility between a pure patent system and a system combining patent and open source protection.

The results of this study are inconsistent with the orthodox justification for patent systems. However, they do accord well with evidence from the increasingly important field of user and open innovation. Simulation games of the patent system could even provide a more effective means of fulfilling the Constitutional mandate ― "to promote the Progress of . . . useful Art" than does the orthodox assumption that technological innovation can be encouraged through the prospect of patent protection.

When will people get the message and start sharing for mutual benefit?

Follow me @glynmoody on Twitter or identi.ca.

01 July 2009

Help Me Go Mano a Mano with Microsoft

Next week, I'm taking part in a debate with a Microsoft representative about the passage of the OOXML file format through the ISO process last year. Since said Microsoftie can draw on the not inconsiderable resources of his organisation to provide him with a little back-up, I thought I'd try to even the odds by putting out a call for help to the unmatched resource that is the Linux Journal community. Here's the background to the meeting, and the kind of info I hope people might be able to provide....

On Linux Journal.

30 June 2009

Why Scientific Publishing Will Never be the Same

For those of us tracking open access and its wider import, it's pretty clear that scientific publishing has changed for ever. But for some within the industry, there remains the desperate hope that all this new-fangled open, collaborative stuff will just blow over.

Forget about it: as this magisterial analysis shows, not only is there no way for traditional publishing to get from there to here - it's a terrifying leap across a minimum to the next maximum - but the most exciting stuff is *already* happening in other forms of publishing:

What’s new today is the flourishing of an ecosystem of startups that are experimenting with new ways of communicating research, some radically different to conventional journals. Consider Chemspider, the excellent online database of more than 20 million molecules, recently acquired by the Royal Society of Chemistry. Consider Mendeley, a platform for managing, filtering and searching scientific papers, with backing from some of the people involved in Last.fm and Skype. Or consider startups like SciVee (YouTube for scientists), the Public Library of Science, the Journal of Visualized Experiments, vibrant community sites like OpenWetWare and the Alzheimer Research Forum, and dozens more. And then there are companies like Wordpress, Friendfeed, and Wikimedia, that weren’t started with science in mind, but which are increasingly helping scientists communicate their research. This flourishing ecosystem is not too dissimilar from the sudden flourishing of online news services we saw over the period 2000 to 2005.

It’s easy to miss the impact of blogs on research, because most science blogs focus on outreach. But more and more blogs contain high quality research content. Look at Terry Tao’s wonderful series of posts explaining one of the biggest breakthroughs in recent mathematical history, the proof of the Poincare conjecture. Or Tim Gowers recent experiment in “massively collaborative mathematics”, using open source principles to successfully attack a significant mathematical problem. Or Richard Lipton’s excellent series of posts exploring his ideas for solving a major problem in computer science, namely, finding a fast algorithm for factoring large numbers. Scientific publishers should be terrified that some of the world’s best scientists, people at or near their research peak, people whose time is at a premium, are spending hundreds of hours each year creating original research content for their blogs, content that in many cases would be difficult or impossible to publish in a conventional journal. What we’re seeing here is a spectacular expansion in the range of the blog medium. By comparison, the journals are standing still.

What's even better about this piece is that it's not content to point out why traditional publishing has big problems: it also offers some practical suggestions of what people *should* be looking at:

These opportunities can still be grasped by scientific publishers who are willing to let go and become technology-driven, even when that threatens to extinguish their old way of doing things. And, as we’ve seen, these opportunities are and will be grasped by bold entrepreneurs. Here’s a list of services I expect to see developed over the next few years. A few of these ideas are already under development, mostly by startups, but have yet to reach the quality level needed to become ubiquitous. The list could easily be continued ad nauseam - these are just a few of the more obvious things to do.

Fantastic stuff - do read it all if you have time.

Follow me @glynmoody on Twitter or identi.ca.

Squishing the Media Bugs

Belatedly, I'm writing about this very cool idea:


It’s called MediaBugs.org. And the idea is to create a web site, a web service, that people in a community, in this case the San Francisco Bay Area, can bring problems and errors that they find in media coverage and post them and try to get them fixed. [...]

The inspiration of the project is from what’s called a bug tracker in an open-source project. So if you’re developing open-source software, you have this project, and you put up a public website that anyone can bring these — file these bugs. If you’re using Firefox and something breaks, you go to their web site, and you tell them about it.

I fear the practice might be harder than the theory, but I certainly wish it well.

Follow me @glynmoody on Twitter or identi.ca.

Winning the Open Web

It seems an unfair fight. On the one hand, you have some of the biggest, most powerful multinationals, intent on defending their turf and extending their power and profits. On the other, you have a tiny number of ragtag idealists who believe that knowledge belongs to everyone, and that no one should have disproportionately long monopolies on its supply.

And yet: in the last few years a remarkable series of victories has been won by the latter against the former, to the extent that representatives of the big media industries have warned that they are losing the "battle".

Against that background of uneven forces - but not quite in the way the media companies mean it - sharing information about past successes so as to drive future ones is crucially important. And yet it is rarely done, probably because the practitioners are too busy fighting the battles to write about it.

Enter Becky Hogge, former Executive Director of the Open Rights Group, who happily has had some time on her hands to prepare a handy report entitled "Winning the Web":


Winning the Web is a 2009 report funded by the Open Society Institute and written by Becky Hogge, former Executive Director of the Open Rights Group. It examined 6 successful campaigns for intellectual property reform, in Brazil, Canada, the US, France, New Zealand, and the UK.

Lessons drawn from the study of the campaigns include the importance of coalition-forming, the best way to conduct online mobilisation campaigns, and the need for a more unified critique of current intellectual property regimes.

The introduction fleshes out the idea:

The global intellectual property regime is no longer fit for purpose. As the networked, digital age matures, it puts into the hands of millions of citizens the tools to access, create and share “content”: text, pictures, music and video; data, news, analysis and art. Against this, the intellectual property regime falters. It presents citizens with a choice: stop using the technology – stop communicating, stop creating – or break the law.

Legal reform is presented with two separate challenges. The first is a small but vocal minority of entrenched corporate interests – the rightsholder lobby. Wedded to business models that pre-date the age of networked digital technology, they exploit their position as incumbents to influence legislators. Often representing the world’s biggest multinational corporations, they hijack a narrative that belongs to poor artists struggling in garrets and use the considerable profits they have made from exploiting these artists in the twentieth century to access the corridors of power and make their case.

That legislators listen is related to a second, geopolitical, challenge. Since the 1970s, the developed world has sought to use the global intellectual property regime to ensure its continued prosperity. Motivated by the ability of developing countries to undercut it on the global manufacturing market, it has sought to augment the financial privilege afforded to “knowledge workers”. The self-interest behind this practice is masked by a flawed orthodoxy that is rarely backed up by evidence – that more intellectual property provision is always good for economic growth.

Against this backdrop, a global IP reform movement (also called the access to knowledge movement) is emerging. Motivated by a range of concerns – from global justice, to the narrowing spectrum of permitted speech, to the broadening of surveillance power – these individuals and organisations approach their campaigning work with combined levels of ingenuity and intellectual rigour that make them stand out in the history of fledgling civil rights movements. Recently, these pockets of activism have taken IP reform issues to a wide audience, triggering sweeping civic action in the general population.

For me, the best bits are the detailed case studies of successful campaigns around the world. I knew the bare outlines, but the report really fleshes these out, and then uses them to provide concrete suggestions of what lessons can be learned for the future.

There's one other point that I've long thought absolutely crucial:

In the UK, citizens can engage with their elected representatives (including MEPs) using a one-click service called WriteToThem.com. Jim Killock is keen to stress that it is vital that such a tool be developed for all EU member states:

Writetothem.eu is absolutely critical if we want to run these campaigns in the next four years. It shows the contempt in which we seem to hold our European institutions and the irrelevance that they are felt to have across Europe.

This really must be a priority, or else all future campaigns in Europe will suffer as a result.

Follow me @glynmoody on Twitter or identi.ca.

29 June 2009

Watching the Watchers

I read with interest this morning the following:

“Snitchtown” is an essay by Cory Doctorow that first appeared in Forbes.com in June 2007. This SoFoBoMo project is an attempt to illustrate that essay with photographs of some of the 4.2 million CCTV cameras currently estimated to be active in Britain.

It got me thinking: how about setting up a database - a surveillance of surveillance database - that has pictures and locations of CCTVs in the UK? It could be crowd sourced, and anonymous, solving problems of scaling and legal issues. If nothing else, it would put the watchers on notice that they are being watched....

26 June 2009

The World Wins South Korea for Firefox

I've written before about the curious case of South Korea, where the use of Internet Explorer and ActiveX is almost mandatory. I rather despaired of anything changing this situation, since there didn't seem to be any way to get around it from outside. And yet, remarkably, change is coming:

Korea's multifaceted e-government services will be made available for those logging on from FireFox or Safari, web browsers that are gaining more popularity worldwide as an alternative to Internet Explorer.

According to the government Sunday, users of these "non-traditional" browsers will be able to file for year-end tax returns, sign up for a new passport or look for job openings and do much more at various service Web sites operated by the state.

The Ministry of Public Administration and Security, which is in charge of directing e-government initiatives, said that it will invest 11.5 billion won this year for technical projects to increase the browser compatibility of 49 e-government service Web sites.

Starting 2011, all of the 150 e-government Web sites are expected to be accessible from any browser.

That's remarkable, as is the reason for the change:

The development is expected to be useful for overseas Koreans or foreigners logging on to Web sites such as www.hikorea.go.kr from abroad through alternate browsers. Operated by the Ministry of Justice, the Web site is a comprehensive online repository of information for overseas Koreans, immigrants and foreign nationals.

Some civic groups have consistently raised the need to consider expanding e-government services to users of non-traditional browsers.

While the percentage of Korean users of alternative browsers is still minimal, more netizens worldwide are increasingly surfing the net on browsers other than Internet Explorer.

A ministry report showed that 21.7 percent of Web users worldwide are browsing on FireFox, and 8 percent on Safari, a browser developed by Apple. IE users make up 67.4 percent of the total Web population.

"We believe that enabling minor browsers to host our e-government services will help overseas Koreans to access the assistance they need and increase Korea's status as a leader in e-government initiatives," a ministry official said.

This shows that what's happening outside a country can still have considerable influence on the internal market - provided there is a big enough expatriate community that still "calls home". Given the increasingly globalised nature of computing, that offers hope for other parts of the world that may be lagging in their uptake of open source.

And if you're wondering why it matters anyway that the South Koreans should be able to use Firefox and other "non-standard" browsers - don't you just love that description? - it's because the country's users have some of the fastest broadband connections in the world; that means that new applications based on such connectivity may well emerge there first, so it's important that open source be available and viable for all kinds of uses. (Via Asa Dotzler.)

Follow me @glynmoody on Twitter or identi.ca.

Show Your Ardour for Ardour

Ardour is a fine open source music program; but like many fine open source programs, it has a problem: money - lack of it. In order to continue to improve the code, the Ardour team ideally needs dosh to pay for programmers and other such handy things; but it's not really happening:


As of now, June 25th, the financial side of things is not looking so good. Last month (May) didn't quite make the $4500 goal, and this month looks certain to fall short by quite a significant margin. There are currently 5-1/2 days left this month, and 28% of the target is still not met. There are no companies backing this project at this time, so it's totally incumbent on those of you who use the program and have not yet helped pay to support it to step up and do the right thing. Thanks to everyone who has paid for their contributions and support.

Ardour will continue in some sense even if I find other work, and I believe that Carl, Dave, Hans and others will likely keep up some of their efforts anyway. Since the new download system started, there have been about 9000 OS X downloads and 6000 source code downloads. Less than 3% of the OS X downloads and only three source code downloads were associated with up-front payment, though it seems likely that many users donated after the fact. With a user-base like that, it seems to me that it should be possible to pay one full-time North American developer and to offer occasional payments to others for their outstanding work. What do you think?

Ardour is hardly the only project with these problems, which means that the open source world faces a larger issue: how to raise funds to pay for work that isn't being carried out mostly in bedrooms. It's not something many are thinking about (Matt Asay is an honourable exception), so it's not likely to get solved any time soon - which leaves Ardour in a bit of a pickle. Suggestions and contributions welcome.... (Via Leslie P. Polzer.)

Follow me @glynmoody on Twitter or identi.ca.

Eee, Look: A Useful E-petition Response

Even though I keep signing the wretched things, e-petitions have not generated much action from the UK government. Which makes the following case rather interesting.

The following e-petition:

“We the undersigned petition the Prime Minister to ask the Communities Secretary to require that all software produced by councils under the Timely Information to Citizens project be released under an open source licence.”

Produced this response:

The Government supports the principle that, where new software is being developed by the Timely Information to Citizens pilots, this should wherever possible be released under open source licence and available for use by other local authorities.

For many of the Timely Information to Citizens pilots, the focus is not on new software, but on how existing tools and techniques can be used to bring information together and present it in more useful and accessible ways. Several of the projects will utilise existing open source software to create new information sources and channels, and will share their experiences of doing so with other authorities.

Where the pilots will result in new software tools, ownership and intellectual property rights will usually remain with the individual local authorities. However, most of the authorities concerned have already made a commitment to make these tools available as open source software, or for use by their partner organisations, and we are working to secure the commitment of the remaining.

What impresses me is (a) the reasonableness of the response and (b) the acceptance that government-developed software should "in principle" be released as open source. I do believe we're getting there....

Next, Linux Revolutionises...Printers

Here's a new printer from HP:


Last June 22, HP announced its new all-in-one printer, the Photosmart Premium with TouchSmart Web. Aside from cramming a fax machine, copier, scanner, and a printer into one device, run of the mill technology by today's standards, this new printer can actually print straight from the Web using on-device applications fashioned specifically for this purpose.

As the headline to that story makes clear, that's a Linux-based printer: indeed, it's pretty much unthinkable that these innovative approaches could use anything else. Linux's small footprint, speed, customisability and low cost make it ideal - uniquely so. Where would we be without it?

25 June 2009

Authoring Beautiful HTML...

...ain't easy in the open source world, as David Ascher points out in this post:

However, for regular folks, life is not rosy yet in the Open Web world. Authoring beautiful HTML is, even with design and graphics talent, still way, way too hard. I’m writing this using Wordpress 2.8, which has probably some of the best user experience for simple HTML authoring. As Matt Mullenweg (the founder of Wordpress) says, it’s still not good enough. As far as I can tell, there are currently no truly modern, easy to use, open source HTML composition tools that we could use in Thunderbird for example to give people who want to design wholly original, designed email messages. That’s a minor problem in the world of email, which is primarily about function, not form, and I think we’ll be able to go pretty far with templates, but it’s a big problem for making design on the web more approachable.

There are some valiant efforts to clean up the old, crufty, scary composer codebase that Mozilla has relied on for years. There are simple blog-style editors like FCKEditor and its successor CKEditor. There are in-the-browser composition tools like Google Pages or Google Docs, but those are only for use by Google apps, and only work well when they limit the scope of the design space substantially (again, a rational choice). None of these can provide the flexibility that Ventura Publisher or PageMaker had in the dark ages; none of them can compete from a learnability point of view with the authoring tools that rely on closed stacks; none of them allow the essential polish that hand-crafted code can yield. That’s a gap, and an opportunity.

Let's hope people in the free software world seize it.

Follow me @glynmoody on Twitter or identi.ca.

Crowdsourcing Evil

This was inevitable:

“A friend in Iran that I have been in touch with via Skype (which seems to work very well)” told him that a specific Web site, Gerdab.ir, is being used by the Iranian government to identify protesters by crowd-sourcing.

It's a tool: like all tools, it can be used for good...or not.... (Via @cshirky.)

Your Number is Up...

...and it's £2.2 trillion:

The gap between what the Government expects to spend and what it actually brings in has risen five-fold, from £120 billion to £608 billion in the space of six months.

At that rate, according to the Institute for Fiscal Studies, it will take 23 years to return government borrowing to anything like normal levels – Gordon Brown’s famous “golden rule”.

And of course, every year you borrow keeps adding to what you owe. Right now, the Government calculates that it owes a total of £2.2 trillion – about £144,000 per household. The figure has trebled since the bank bail-outs. Some traders are beginning to wonder if Britain can actually pay its debts. If they start pulling out, then we really are bust.

Tell me again why we can afford to spend £19 billion on ID cards and associated super-duper databases...?

24 June 2009

Pillars of Open Government

As you may have noticed, I'm writing more about open government these days, simply because there's more to write about - and that's great. Here's some clueful stuff from the other side of the globe. It's by Kate Lundy, a member of the Australian Senate, who really seems to get this openness thing:


For the Australian government, an opportunity to construct what I see as the three pillars of Open Government is presented. Each of these pillars assumes the basic principle of citizen engagement at every possible opportunity to both empower people, and to ensure the results are actually appropriate and useful.

The three pillars of open government:

* Citizen-centric services
* Open and transparent government
* Innovation facilitation

I particularly liked this one:

The second pillar is open and transparent government. This pillar builds on the principles that citizens have a right to the information they need to inform themselves about public and political affairs, and to participate in the democratic processes in an informed way. This second pillar is to ensure genuine means of engagement between citizens and the government in policy and decision-making. This is always harder than it sounds but it is essential to garner the wisdom of the crowd. It is vital that government engage with the broader community not just for a conversation, but in genuine partnership between political leaders and the people so we can as a society respond most effectively to the specific social and economic challenges communities confront. This localisation of policy solutions is essential to ensure relevance of government solutions to real situations, and essential to ensure a reasonable response time to new issues and emergencies. Open and transparent government will grow citizen trust and ultimately participation in policy development and government directions.

Great to see people all around the world working on this stuff. Pillars of open government, indeed.

Follow me @glynmoody on Twitter or identi.ca.

Sugar on a Stick v1 Strawberry is Out

Although I've been sceptical of the OLPC project, not least because of its ridiculous decision to offer a Windows XP version - putting them in thrall to a US monopolist is a really good way to "help" the developing world, people - I'm a big fan of the GNU/Linux-based Sugar Learning platform. I'm also a big fan of using USB sticks as a way of providing complete software solutions based on GNU/Linux. So it should come as no surprise that I think this is fab:

Sugar Labs, nonprofit provider of the Sugar Learning Platform to over one-million children worldwide, announces the immediate availability of Sugar on a Stick v1 Strawberry. Available free for download at www.sugarlabs.org, Sugar on a Stick can be loaded onto an ordinary 1GB USB flash drive and used to reboot any PC or netbook directly into the award-winning Sugar environment. It runs on recent Macs with a helper CD and in Windows using virtualization. Sugar on a Stick is designed to work with a School Server that can provide content distribution, homework collection, backup services, Moodle integration, and filtered access to the Internet. Today’s Strawberry release is meant for classroom testing; feedback will be incorporated into the next version, available towards the end of 2009.

...

Learning Activities are at the heart of Sugar. Sugar on a Stick includes 40 Activities to interest young learners such as Read, Write, Paint, and Etoys. Hundreds more Activities are available free for download at the Sugar Activity Library. Most “Sugarized” Activities have student collaboration built-in; students and teachers work, play, and learn on the same Activities together. The Sugar Learning Platform is open, so by leveraging the work of other open source projects, existing software for children can be integrated; for example, the acclaimed GCompris suite of 100 Activities developed over the past five years by Bruno Coudoin was recently added to Sugar, including Activities such as Chess, Geography, and Sudoku. Teachers and parents interested in Sugar’s Activities and its modern interface for children can watch short videos on the recently opened Sugar Labs Dailymotion channel.

Note that this is a great way to (a) use old PCs, (b) provide educational materials in a single package for free, and (c) avoid security issues associated with Windows. What's not to like?

Follow me @glynmoody on Twitter or identi.ca.

23 June 2009

Why Open Source, Clouds and Crowds Rule

The Guardian's crowd-sourcing of the initial analysis of hundreds of thousands of PDFs of MPs' expenses is fast becoming mythic. If you want some more technical details, this post is a good place to start. I was particularly struck by the following:

As well as the Guardian’s first Django joint, this was its first project with EC2, the Amazon contract-hosting service beloved by startups for its low capital costs.

Willison’s team knew they would get a huge burst of attention followed by a long, fading tail, so it wouldn’t make sense to prepare the Guardian’s own servers for the task. In any case, there wasn’t time.

“The Guardian has lead time of several weeks to get new hardware bought and so forth,” Willison said. “The project was only approved to go ahead less than a week before it launched.”

With EC2, the Guardian could order server time as needed, rapidly scaling it up for the launch date and down again afterward. Thanks to EC2, Willison guessed the Guardian’s full out-of-pocket cost for the whole project will be around £50.

As for the software, it was all open-source, freely available to the Guardian — and to anyone else who might want to imitate them. Willison hopes to organize his work in the next few weeks.

None of this happens without open source to allow zero-cost hacks; nothing happens without clouds, that allow immediate and low-cost scale-up. (Fifty quid? Blimey.) Bottom line: increasingly popular crowdsourcing efforts won't be happening without either.

Sarkozy Will Go "All the Way" for Monopolies

Curious stuff coming out of France:

“By defending copyright I do not just defend artistic creation, I also defend my idea of a free society where everyone’s freedom is based on respect for the rights of others. I am also defending the future of our culture. It is the future of creation.”

Er, sorry, mon brave, you seem to have forgotten that copyright is a monopoly: as such, it's antithetical to freedom. Indeed, it *takes away* the freedom from all those it is imposed upon, which is practically the entire population of the world. Artists create irrespective of copyright - they have to, because of an inner urge, not because copyright says they get a monopoly.

And as for the "future of our culture", you obviously don't understand that it is inextricably bound up with *past* culture. If it can't build on what everyone has created before, just as they did - and copyright makes this increasingly difficult - your culture won't have any future.

Follow me @glynmoody on Twitter or identi.ca.

Big Victory for FoI and UK Transparency

Kudos to Computer Weekly:


The information commissioner has ordered the opening of confidential files on a wide range of high-risk IT projects, including the ID cards scheme, joined up police intelligence systems and the NHS National Programme for IT (NPfIT).

It is the most far-reaching decision under the Freedom of Information Act for government IT.

It is also a victory for Computer Weekly’s campaign for the release of the results of Gateway reviews on the progress of major IT-based projects.

MPs have complained that the first they knew of problems on projects such as the IT to support tax credit and child support payments was when constituents contacted them.

Our campaign has been aimed at persuading government to release information about projects in time for MPs and others to ask informed questions, and possibly avert a failure.

I particularly liked the list of feeble excuses used for not giving out the information, especially the last one, which is extraordinary in its arrogance:

# It would prevent policy formulation or development taking place in the self-contained space needed to ensure it was done well.

# It would make policy development less effective because departments’ attention would be focused on obtaining a “green light”.

# It would cause reports to become bland and anodyne, defeating their purpose.

# It would make interviewees, senior responsible owners and the private sector less willing to participate in reviews or co-operate with interviewers.

# It would cause delays in the completion of reports as words and phrases would be argued over.

# It is unnecessary. The public interest is already met by the information about the programme in the public domain combined with parliamentary scrutiny.

and the list of responses from the Information Commissioner:

* It would allow the public a better understanding of the development of the programmes which are the subject of Gateway reviews.
* It would allow project risks and concerns to be identified.

* It would not damage the Gateway process in the way the OGC has suggested.

* The public scrutiny of projects by the National Audit Office and Public Accounts Committee involves largely historical and retrospective analyses. Gateway reviews “would provide a level of public scrutiny of current projects”.

* It would inform the debate as to the merits of the schemes, the practicalities involved and the feasibility.

* It would ensure that “schemes as complex as these are properly scrutinised and implemented”.

* It is unrealistic to imagine that civil servants will not participate if reviews are to be published. In accordance with the Civil Service Code, “civil servants must fulfil their duties and obligations responsibly.”

Those are crucially important points, because they apply to everything else, past, present and future.

Well done, Computer Weekly for waging and winning this battle: now let's all take it forward to make UK government even more transparent.

Follow me @glynmoody on Twitter or identi.ca.

GNU/Linux Tops TOP500 Supercomputers Again

The fact that GNU/Linux totally dominates the top 500 supercomputing list is hardly news, but the fact that it has managed to *increase* its market share yet further is.

Here are the results for June 2009:


GNU/Linux 443 (88.6%)
Windows 5 (1.0%)
Unix 22 (4.4%)

and here are the figures for six months ago:


GNU/Linux 439 (87.8%)
Windows 5 (1.0%)
Unix 23 (4.6%)

Notice that plucky little Windows, from that small and hopelessly out-gunned company up in Seattle has bravely managed to increase its share by precisely 0%: an impressive result considering the millions of dollars it has spent trying to break into this market.

Snarky? Moi?

Update: More details about the top 20, and GNU/Linux's dominance here.

Follow me @glynmoody on Twitter or identi.ca.

22 June 2009

Open Source Dendrochronology

How could I resist this story? Aside from the great headline, it's about old-style closed-source science being challenged by open science, with open data - the only kind, if you think about it:


Dendrochronology is the study of tree-rings to determine when and where a tree has grown. Everybody knows that trees produce one ring every year. But the rings also vary in width according to each year's local weather conditions. If you've got enough rings in a wood sample, then their widths form a unique "bar code". Collect enough samples of various ages from buildings and bog wood, and you can join the bar codes up to a reference curve covering thousands of years.

Dendrochronology has a serious organisational problem that impedes its development as a scientific discipline and tends to compromise its results. This is the problem of proprietary data. When a person or organisation has made a reference curve, then in many cases they will not publish it. They will keep it as an in-house trade secret and offer their paid services as dendrochronologists. This means that dendrochronology becomes a black box into which customers stick samples, and out of which dates come, but only the owner of the black box can evaluate the process going on inside. This is of course a deeply unscientific state of things. And regardless of the scientific issue, I am one of those who feel that if dendro reference curves are produced with public funding, then they should be published on-line as a public resource.

But there is a resistance movement: amateur dendrochronologists such as my buddies Torbjörn Axelsson and Åke Larsson. They practice open source data transparency on the net, which means that arguably amateur dendrochronology is at this time more scientific than the professional variety.

I think it's also interesting because the story shows that, even in specialised areas like dendrochronology, openness makes a big difference to how the science is conducted - and how reliable its results are. (Via @BoraZ.)
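The "bar code" matching the quoted passage describes is, at heart, cross-correlation: slide the sample's ring-width sequence along the reference curve and find the offset where the two agree best. Here's a minimal sketch in Python, using synthetic (hypothetical) ring widths and plain Pearson correlation; real dendro software adds detrending, standardisation and significance tests:

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def crossdate(sample, reference):
    """Slide the sample along the reference curve; return the offset
    (in years from the start of the reference) with the best match."""
    best_offset, best_r = None, -2.0
    for offset in range(len(reference) - len(sample) + 1):
        r = pearson(sample, reference[offset:offset + len(sample)])
        if r > best_r:
            best_offset, best_r = offset, r
    return best_offset, best_r

# Demo: a 30-year sample cut from a synthetic 200-year reference curve
# should date back to exactly the offset it was cut from.
random.seed(1)
reference = [random.random() for _ in range(200)]
sample = reference[120:150]
offset, score = crossdate(sample, reference)
```

In practice you would demand both a high correlation and a decent sample length before trusting a date - which is precisely why access to long, well-replicated reference curves matters so much.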

Follow me @glynmoody on Twitter or identi.ca.

MPs Plot Against Transparency - and Lose the Plot

They just don't get it, do they?


Parliament is planning to block the future release of expenses receipts after the humiliation endured by MPs this week, The Times has learnt.

Senior MPs have drawn up plans to replace the publication of every receipt with a spreadsheet detailing individual claims.

The changes would make less information available for public scrutiny, despite the anger caused this week by the way in which details were blacked out from the official files.

Look chaps, open means open, as in o-p-e-n: we're not going to settle for less. Get used to it, because we're going to keep coming back and coming back until we get audit trail clarity from our money in your pockets to every last expense.

Follow me @glynmoody on Twitter or identi.ca.

21 June 2009

The Saga of Ogg the Great

Well, with a name like "Ogg" that's what it should be called; instead someone has put together what they term more prosaically "The history of Ogg on the Web":

In the year 2000, while working at CSIRO as a research scientist, I had the idea that video (and audio) should be hyperlinked content on the Web just like any Web page. Conrad Parker and I developed the vision of a “Continuous Media Web” and called the technology that was necessary to develop “Annodex” for “annotated and indexed media”.

Not many people now know that this was really the beginning of Ogg on the Web. Until then, Ogg Vorbis and the emerging Ogg Theora were only targeted at desktop applications in competition to MP3 and MPEG-2.

Despite the modest name, this is important stuff. As I wrote elsewhere recently, I believe that the arrival of Firefox 3.5, with its support for Ogg's formats, will mark a turning point in open video and audio. It's good to have background information on how it all started.

19 June 2009

Managing Identity Without ID Cards

I've always been slightly conflicted about Jerry Fishenden. He obviously knew what he was talking about, but he was, you know, one of *them* - a Microsoftie. Or rather, *was* a Microsoftie, since he's a free man now. And you sense a new freedom in his writing, too, which means that I can start recommending his stuff unreservedly.

Here, for instance, is nothing less than a core idea of how to manage identity in the 21st century without ID cards or any of the associated stupidities:


In the work of leading identity, security and privacy thinkers such as Stefan Brands and Kim Cameron,* it is possible to see the art of the possible (Cameron's laws of identity can be found here). Stefan’s work on minimal disclosure, for example, makes it possible to prove information about ourselves ("I am over 18", "I am over 65", "I am a UK citizen", etc) without disclosing any personal information, such as our full name, place and date of birth, age or address. Neither would the technology leave an audit trail of where we have been and whom we have interacted with. It would leave our private lives private. Indeed, it would enable us to have better privacy in our private lives than we do today, when we are often forced to disclose personal information to a whole host of people and organisations.

Got that? We can prove anything about ourselves that we need to, without giving up *all* information as the Labour government wants, and without leaving audit trails. Effectively, this is the public key cryptography of identity, where mathematical magic lets you do apparently impossible things.
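To make the idea concrete, here is a toy sketch of minimal disclosure. This is emphatically *not* Brands' actual scheme - real minimal-disclosure credentials use blind signatures and zero-knowledge proofs so that presentations are unlinkable and the verifier never needs the issuer's secret - but it shows the core shift: an issuer attests to a bare predicate ("over 18"), and the underlying birthdate is never presented at all. All names, and the shared-key shortcut, are illustrative assumptions:

```python
import hashlib
import hmac
import secrets

# Toy shortcut: issuer and verifier share this key. Real schemes avoid
# exactly this, using public-key and zero-knowledge techniques instead.
ISSUER_KEY = secrets.token_bytes(32)

def issue_claim(claim: str) -> tuple[str, bytes]:
    """Issuer attests to a single predicate, e.g. 'over_18'.
    The user's birthdate is checked at issuance time and then discarded -
    it never appears in the credential itself."""
    tag = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).digest()
    return claim, tag

def verify_claim(claim: str, tag: bytes) -> bool:
    """Verifier learns only that the issuer vouched for this predicate."""
    expected = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# The user holds and presents only the predicate, never the birthdate:
claim, tag = issue_claim("over_18")
assert verify_claim(claim, tag)          # the attested predicate verifies
assert not verify_claim("over_65", tag)  # a different predicate does not
```

The point of the real cryptography is to get the same effect without the toy's weaknesses: no shared secrets, and no way for verifiers to correlate two presentations of the same credential into an audit trail.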

This is so obviously exactly what we should be doing for identity management in a world that clearly requires it, and so exactly meets the needs of those of us concerned about profound issues of civil liberties, that you really have to wonder what bunch of utterly witless morons at the Home Office are stopping this eminently sensible thing from happening, and pursuing instead the worst of all possible worlds with an expensive, insecure, intrusive and unworkable system.

Follow me @glynmoody on Twitter or identi.ca.

Elsevier Does a Microsoft with Open Access

Nice one, Elsevier:

A multinational journal giant is understood to be courting vice-chancellors in an effort to win their support for an alternative to open-access institutional research repositories.

Elsevier is thought to be mooting a new idea that could undermine universities' own open-access repositories. It would see Elsevier take over the job of archiving papers and making them available more widely as PDF files.

If successful, it would represent a new tactic by publishers in their battle to secure their future against the threat posed by the open-access publishing movement.

Most UK universities operate open-access repositories, where scholars can voluntarily deposit final drafts of their pay-to-access journal publications online. Small but growing numbers are also making such depositions mandatory.

I've seen these kinds of stories so many times in the world of open source, with Microsoft as the main protagonist, that it warms the cockles of my heart to see them popping up in other areas like open access. Why? Because if a multi-billion pound company like Elsevier is starting to stoop to this kind of tactic, it demonstrates just how profoundly worried it is - and how close open access is to widespread acceptance.

Follow me @glynmoody on Twitter or identi.ca.

Water, Water, Everywhere - Linked by Open Source

Is there no domain in which open source is not storming ahead? What about this:

OpenMI stands for Open Modelling Interface and Environment, a standard for model linkage in the water domain.

Integrated catchment management asks for integrated analysis that can be supported by integrated modelling systems. These modelling systems can only be developed and maintained if they are based on a collection of interlinked models. OpenMI has been designed to provide a widely accepted unified method to link models, both legacy code and new ones.

To support those adopting the OpenMI, a set of tools has been developed to aid conversion, and simplify the configuration and running of linked models. These utilities make up the Open Modelling Environment. The commercial implications of the OpenMI have been considered both from the points of view of the vendors/suppliers of existing systems and the developers of new models and related tools. OpenMI avoids the need to abandon or rewrite current applications, thus protecting the huge investment in model development. Making a new component OpenMI compliant simplifies the process of bringing it to the market place and ensures it will be interoperable with many other systems.
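The linkage idea described above can be sketched as a pull-based exchange: a downstream model asks its upstream neighbour for a quantity at a given time, and values are computed on demand rather than batch-exported between incompatible file formats. The class and method names below are simplified stand-ins of my own, not the actual OpenMI interface (which is specified for platforms such as C# and Java):

```python
# Sketch of the pull-based model-linkage pattern: each model exposes a
# uniform get_values(quantity, time) call, so legacy and new models can
# be chained without rewriting either one. Names are illustrative only.

class RainfallModel:
    """Stands in for an existing (possibly legacy) rainfall model."""
    def get_values(self, quantity: str, time: float) -> float:
        if quantity != "rainfall_mm":
            raise ValueError(f"unknown quantity: {quantity}")
        return 2.0 + 0.5 * time  # stub: mm of rain at this time step

class RunoffModel:
    """A downstream model that pulls its input from a linked model."""
    def __init__(self, upstream: RainfallModel, runoff_coefficient: float = 0.3):
        self.upstream = upstream
        self.c = runoff_coefficient

    def get_values(self, quantity: str, time: float) -> float:
        # Request exactly the input needed, at exactly the time needed.
        rain = self.upstream.get_values("rainfall_mm", time)
        return self.c * rain

runoff = RunoffModel(RainfallModel())
result = runoff.get_values("runoff_mm", 0.0)  # 0.3 * 2.0 mm of runoff
```

The design point is that wrapping an existing model behind this kind of uniform interface is far cheaper than rewriting it - which is exactly the "protecting the huge investment in model development" argument the OpenMI text makes.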

And in case you were wondering:

The OpenMI source code is available under the GNU Library or Lesser General Public License (LGPL)

Follow me @glynmoody on Twitter or identi.ca.

Opening up: New York Senate's Doing It *Now*

Vancouver may have promised that it will do it, but the New York Senate is actually opening up completely now:

Welcome to the Open NYSenate

To pursue its commitment to transparency and openness the New York State Senate is undertaking a cutting-edge program to not only release data, but help empower citizens and give back to the community. Under this program the New York Senate will, for the first time ever, give developers and other users direct access to its data through APIs and release its original software to the public. By placing the data and technological developments generated by the Senate in the public domain, the New York Senate hopes to invigorate, empower and engage citizens in policy creation and dialogue.

...

Original Software

As a user of Open-Source software the New York Senate wants to help give back to the community that has given it so much - including this website. To meet its needs the Senate is constantly developing new code and fixing existing bugs. Not only does the Senate recognize that it has a responsibility to give back to the Open Source community, but public developments, made with public money, should be public.

...

Data Sets

The New York Senate's Open Data page is the official repository of all government data. There you can browse through data produced by and considered by the Senate in their original forms as well as various other file types created for your convenience; including but not limited to: Excel spreadsheets, .csv, text files and PDFs. To supplement the source data it is making available, the Senate has also created the Plain Language Initiative designed to help explain complex data sets and legal terms in plain language.

...

Open-Source Software & Software Licenses

In order to make the Senate's information and software as public as possible, it has adopted a unique system using two types of licenses - the GNU General Public License as well as the BSD License. This system is meant to ensure the most public license is used in each specific case such that:

(i) Any Software released containing components with preexisting GPL copyrights must be released pursuant to a GPL v3 copyright restriction.

(ii) Any Software created independently by the Senate without any preexisting licensing restrictions on any of its components shall be released under dual licensing and take one of two forms: (a) a BSD license, or (b) a GPL v3 license. The ultimate user of such Software shall choose which form of licensing makes the most sense for his or her project.

This is getting too easy: I want more of a challenge in opening up government.

Anyway, kudos to all involved - great move.

Follow me @glynmoody on Twitter or identi.ca.

Open Source Sent to Siberia

Russia is emerging as a real open source power-house, especially in the education sector. Here's some more good news, this time from Siberia:

За 2009 год школы Сибирского федерального округа должны перейти на "Пакет свободного программного обеспечения для образовательных учреждений", в основе которого лежит операционная система Linux. Об этом сообщил корр. "ТАСС-Сибирь" президент ассоциации "Информатизация образования Сибири" Виктор Корнеев. Более того, от популярной операционной системы Windows будут отказываться и бюджетные учреждения, однако в них процесс перехода на новое программное обеспечение затянется на ближайшие 5 лет.

Особо Виктор Дмитриевич отметил, что среди трех регионов России, в которых проводилась апробация этого программного обеспечения, был один регион СФО – Томская область. Именно здесь, наряду с подобными мероприятиями в Татарстане и Пермском крае, Областной центр развития образования проводил мероприятия по внедрению программного обеспечения на базе Linux во все школы Томской области. "А сегодня мы готовы перевести на эти программы все школы Сибирского федерального округа, причем сделать это в кратчайшие сроки – не более чем за один год. Единственное отличие от пилотного проекта в том, что упаковка будет несколько скромнее", - отмечает Виктор Корнеев, демонстрируя массивную запечатанную коробку, в которой находилось 4 вида программного обеспечения для разных типов компьютеров.

[Translation: During 2009 the schools of the Siberian Federal District are to switch to the "Free Software Package for Educational Institutions", which is based on the Linux operating system. This was announced to a TASS-Siberia correspondent by Viktor Korneev, president of the "Informatisation of Education in Siberia" association. Moreover, budget-funded institutions will also be giving up the popular Windows operating system, although for them the move to the new software will stretch over the next five years.

Viktor Dmitrievich particularly noted that among the three regions of Russia where this software was trialled was one region of the Siberian Federal District - Tomsk Oblast. It was here, alongside similar measures in Tatarstan and Perm Krai, that the Regional Centre for the Development of Education carried out the roll-out of Linux-based software to all the schools of the Tomsk region. "And today we are ready to move all the schools of the Siberian Federal District over to these programs, and to do it in the shortest possible time - no more than one year. The only difference from the pilot project is that the packaging will be somewhat more modest," said Viktor Korneev, showing off a massive sealed box containing four kinds of software for different types of computers.]

Follow me @glynmoody on Twitter or identi.ca.

Reclaim The Commons: A Manifesto

As long-suffering readers of this blog will have noticed, I rather like the concept of the commons. As well as being good in itself, it also provides a way of linking many disparate fields - software, content, data, knowledge, fisheries, forests, oceans, the atmosphere. That's not really surprising, since the thing these all have in, er, common is that we share them, and the commons offers a model for sharing without destroying.

It's a viewpoint that's becoming increasingly widely shared (sorry, these words just keep popping up), and now we have this splendid manifesto that is specifically about all the commons I mentioned above, and how we need to change our attitudes to them:

Humankind is suffering from an unprecedented campaign of privatization and commodification of the most basic elements of life: nature, culture, human work and knowledge itself. In countless arenas, businesses are claiming our shared inheritance - sciences, creative works, water, the atmosphere, health, education, genetic diversity, even living creatures - as private property. A compulsive quest for short-term financial gain is sacrificing the prosperity of all and the stability of the Earth itself.

The dismal consequences of market enclosures can be seen in our declining ecosystems: the erosion of soil and biodiversity, global climate change, reduction of food sovereignty. Aggressive intellectual property politics harness those suffering from neglected diseases or who can't purchase patented medicines, reduce cultural diversity, limit access to knowledge and education, and promote a global consumerist culture.

...

a new vision of society is arising - one that honors human rights, democratic participation, inclusion and cooperation. People are discovering that alternatives and commons-based approaches offer practical solutions for protecting water and rivers, agricultural soils, seeds, knowledge, sciences, forests, oceans, wind, money, communication and online collaborations, culture, music and other arts, open technologies, free software, public services of education, health or sanitation, biodiversity and the wisdom of traditional knowledges.

The manifesto has a very concrete, practical aim alongside the more general one of raising awareness of the commons:

The signers of this Manifesto, launched at the World Social Forum of 2009, call upon all citizens and organizations to commit themselves to recovering the Earth and humanity's shared inheritance and future creations. Let us demonstrate how commons-based management - participatory, collaborative and transparent - offers the best hope for building a world that is sustainable, fair and life-giving.

This Manifesto calls upon all citizens of the world to deepen the notion of the commons and to share the diverse approaches and experiences that it honors. In our many different ways, let us mobilize to reclaim the commons, organize their de-privatization and get them off markets, and strengthen our individual initiatives by joining together in this urgent, shared mission.

I particularly liked the framing of commons-based management as "participatory, collaborative and transparent", since this applies perfectly to open source, open content, and all the other things this blog has been covering.

I've signed the manifesto, and I urge you to do so and spread - no, share - the news about this important initiative.

Follow me @glynmoody on Twitter or identi.ca.

ODF and the Art of Interoperability

It's hard to believe that there was such sound and fury when OOXML was being pushed through the ISO process. At the time, it seemed like the end of the world, since it looked like Microsoft had succeeded in obtaining a nominal parity with ODF, which had been approved earlier.

My, what a difference a year makes....

On Open Enterprise blog.

18 June 2009

TACD Fights ACTA on "IPRs"

One of the frustrating aspects about the Anti-Counterfeiting Trade Agreement (ACTA) is that it is a cosy club of rich and powerful nations plus a few of their equally rich and powerful chums in select industry. Meanwhile, hoi polloi - that's you and me - don't get a look in, even though we are the most affected.

So I was delighted to find that a group of like-minded consumer organisations are not only getting together, but starting to stand up for us on this important issue. Behold the Transatlantic Consumer Dialogue:


is a forum of US and EU consumer organisations which develops and agrees joint consumer policy recommendations to the US government and European Union to promote the consumer interest in EU and US policy making.

...

The TACD working group on intellectual property was created in 2001. The European Co-Chair of the working group is Jill Johnstone of Consumer Focus in the UK. The US co-chair is James Love of Knowledge Ecology International. The TACD IP working group staff expert is Anne-Catherine Lorrain.

It has now put out a Resolution on the enforcement of copyright, trademarks, patents and other intellectual property rights. Here's what the TACD says about it:

The TACD Resolution comes at a time when governments in Europe and North America are considering a wide range of new global standards for IP enforcement. Among those new norms are the proposed “Anti-Counterfeiting” Trade Agreement (ACTA) [On April 6, 2009, USTR released a detailed summary of the current state of the Anti-Counterfeiting Trade Agreement (ACTA) negotiations: http://www.ustr.gov/sites/default/files/uploads/factsheets/2009/asset_upload_file917_15546.pdf], new customs procedures through the World Customs Organization (WCO), anti-Counterfeiting measures at the World Health Organization (WHO), WTO disputes over enforcement, proposals in Europe for “graduated response” and other Internet filtering solutions, several European Union Directives and bills pending before the U.S. Congress and other countries on the topic of IP enforcement, bilateral trade agreements, and unilateral trade sanctions by Europe and the United States.

The 2,000 word TACD Resolution touches on a wide range of topics relating to IP enforcement policies and practices, ranging from transparency, evidence and process, to both general and detailed recommendations on substantive policies. TACD first discussed the Resolution with representatives from the European Union and the U.S. Government on June 9, 2009, during the TACD 10th Annual Meeting in Brussels.

The resolution itself is pretty sensible, requesting "Transparency and Openness", "Evidence and Analysis" amongst other things. As for ACTA, here's how it starts:

There should be no further meetings on the Anti-Counterfeiting Treaty until the EU and the US publish the full text of all negotiating documents, and agree to additional transparency measures, including the accreditation of consumers and/or their representatives as observers.

The term "counterfeit" should not be used to describe activities relating to the mere infringement of copyrights or trademarks where there is no intent to deceive or any likelihood of confusion as to the authorized manufacturer, distributor or provider of the service. Possible patent infringements should not be referred to as counterfeits.

It would be too much to expect the ACTA participants to pay much attention, but it takes things up yet another notch; one day, the pressure will be too much, and something will have to give.

Firefox 3.5: What's in a Number?

I had an interesting chat this morning with Mike Shaver, VP, Engineering at Mozilla, about the imminent Firefox 3.5. Its launch takes place against a background where Firefox continues to make gains in the browser market, passing 50% market share in some European countries, and where it has created an unparalleled ecosystem of addons that places it at the forefront of the browser world in terms of capability and customisability....

On Open Enterprise blog.

The Green Intellectual Property Project

The Green Intellectual Property (GIP) Project aims at greening our society through two activities:

* implementing the Green Intellectual Property (GIP) System, and
* promoting patent applications of green technologies.

Interesting approach. The GIP:

The GIP System was first proposed in 2003 by Itaru Nitta, the founder of the GIP Project. Simply, the GIP System would divert a part of the patent-related monetary flow toward a trust fund, called the GIP Trust Fund. This Fund would provide subsidies and royalty assumptions for introducing and developing patent-protected green technologies. The green technologies encompass eco-friendly apparatuses, nursing-care for the elderly, welfare services for disabled people, organic agriculture, essential medicines, and all technologies advancing social welfare.

I'd still like to be shot of the whole caboodle, whatever colour it is.

Follow me @glynmoody on Twitter or identi.ca.

17 June 2009

Digital Britain, Analogue Thinking

Clocking in at 238 pages, the final Digital Britain report is an impressive piece of work. It provides a comprehensive survey of how many aspects of British life are being transformed by the transition from the old world in which information is largely stored and transmitted in an analogue format, to one that is inherently digital. Moreover, to its credit, the report is suffused with a sense that this is an epochal and exciting change, not just a minor change of emphasis.

That's the good news.

The bad news is that the report is riddled with old, analogue thinking that vitiates most of its proposals....

On Open Enterprise blog.

The Doctor Who Model of Open Source

I often write of the way in which other domains are learning from open source and its successes. But that's not to say the traffic is all one way: increasingly, the other opens have much to *teach* open source, too.

For example, Peter Murray-Rust is one of the leading exponents of open data and open chemistry, notably through the Blue Obelisk group:


The Internet has brought together a group of chemists/programmers/informaticians who are driven by wanting to do things better, but are frustrated with the Closed systems that chemists currently have to work with. They share a belief in the concepts of Open Data, Open Standards and Open Source (ODOSOS) (but not necessarily Open Access). And they express this in code, data, algorithms, specifications, tutorials, demonstrations, articles and anything that helps get the message across.

Here's an interesting point he raised recently:

How do we sustain Open Source in a distributed world? We are facing this challenge with several of our chemical software creations/packages. People move, institutions change. Open Source does not, of itself, grow and flourish – it needs nurturing. Many packages require a lot of work before they are in a state to be usefully enhanced by the community - “throw it over the wall and it will flourish” does not work.

Many OS projects have clear governance and (at least implicitly) funded management. Examples are Apache, Eclipse, etc. Many others have the “BDFL” - Benevolent Dictator For Life with characters such as R[M]S, Linus, Guido Python, Larry Perl, etc. These command worldwide respect and they have income models which are similar to literary giants. These models don’t (yet?) work for chemistry.

Instead the Blue Obelisk community seems to have evolved a “Doctor Who” model. You’ll recall that every few years something fatal happens to the Doctor and you think he is going to die and there will never be another series. Then he regenerates. The new Doctor has a different personality, a different philosophy (though always on the side of good). It is never clear how long any Doctor will remain unregenerated or who will come after him. And this is a common theme in the Blue Obelisk.

The rest of the post fleshes out this analogy - well worth reading.

Follow me @glynmoody on Twitter or identi.ca.