09 April 2009

Should an Open Source Licence Ever Be Patent-Agnostic?

Sharing lies at the heart of free software, and drives much of its incredible efficiency as a development methodology. It means that coders do not have to re-invent the wheel, but can borrow from pre-existing programs. Software patents, despite their name, are about locking down knowledge so that it cannot be shared without permission (and usually payment). But are there ever circumstances when software patents that require payment might be permitted by an open source licence? That's the question posed by a new licence that is being submitted to the Open Source Initiative (OSI) for review.

On Linux Journal.

Follow me on Twitter @glynmoody

08 April 2009

Time to Get Rid of ICANN

ICANN has always been something of a disaster area: it shows scant understanding of what the Internet really is, is contemptuous of its users, and is largely indifferent to its own responsibilities as guardian of a key part of the Internet's infrastructure. Here's the latest proof that ICANN is not fit for purpose:


The familiar .com, .net, .org and 18 other suffixes — officially "generic top-level domains" — could be joined by a seemingly endless stream of new ones next year under a landmark change approved last summer by the Internet Corp. for Assigned Names and Numbers, the entity that oversees the Web's address system.

Tourists might find information about the Liberty Bell, for example, at a site ending in .philly. A rapper might apply for a Web address ending in .hiphop.

"Whatever is open to the imagination can be applied for," says Paul Levins, ICANN's vice president of corporate affairs. "It could translate into one of the largest marketing and branding opportunities in history."

Got that? This change is purely about "marketing and branding opportunities". The fact that it will fragment the Internet, sow confusion among hundreds of millions of users everywhere, and lead to the biggest explosion of speculative domain squatting and hoarding by parasites who see the Internet purely as a system to be gamed is apparently a matter of supreme indifference to those behind ICANN: the main thing is that it's a juicy business opportunity.

Time to sack the lot, and put control of the domain name system where it belongs: in the hands of engineers who care.

Follow me on Twitter @glynmoody

Second Life + Moodle = Sloodle

Moodle is one of open source's greatest success stories. It's variously described as an e-learning or course management system. Now, given that education is also one of the most popular applications of Second Life, melding Moodle and Second Life somehow seems a natural fit. Enter Sloodle, whose latest version has just been released:

Version 0.4 integrates Second Life 3D classrooms with Moodle, the world’s most popular open source e-learning system with over 30 million users (http://www.moodle.org). This latest release allows teachers and students to prepare materials in an easy-to-use, web-based environment and then log into Second Life to put on lectures and student presentations using their avatars.

The new tools also let students send images from inside Second Life directly to their classroom blog. Students are finding this very useful during scavenger hunt exercises where teachers send them to find interesting content and bring it back to report to their classmates.

Tools that cross the web/3D divide are becoming more popular as institutions want to focus on the learning content rather than the technical overhead involved in orienting students into 3D settings and avatars.

As an open-source platform, SLOODLE is both freely available and easily enhanced and adapted to suit the needs of diverse student populations. And web hosts are lining up to support the platform. A number of third-party web hosts now offer Moodle hosting with SLOODLE installed either on request or as standard, making it easier than ever to get started with SLOODLE.

SLOODLE is funded and supported by Eduserv - http://www.eduserv.ac.uk/ and is completely free for use under the GNU GPL license.

The project was founded by Jeremy Kemp of San José State University, California, and Dr. Daniel Livingstone of the University of the West of Scotland, UK.

Follow me on Twitter @glynmoody

Forward to the Past with Forrester

Looking at Forrester's latest report on open source, I came across the following:

The bottom line is that in most application development shops, the use of open source software has been a low-level tactic instead of a strategic decision made by informed executives. As a result, while there’s executive awareness of the capital expenditure advantages of adopting OSS, other benefits, potential risks, and the structural changes required to take full advantage of OSS are still poorly understood.

On Open Enterprise blog.

Follow me on Twitter @glynmoody

Open Access as "Middleware"

Interesting simile:

A "legacy system" in the world of computing provides a useful analogy for understanding the precarious state of contemporary academic publishing. This comparison might also keep us from stepping backward in the very act of stepping forward in promoting Open Access publishing and Institutional Repositories. I will argue that, vital as it is, the Open Access movement should really be seen in its current manifestation as academic "middleware" servicing the "legacy system" of old-school scholarship.

(Via Open Access News.)

Follow me on Twitter @glynmoody

07 April 2009

OpenStreetMap Navigates to Wikipedia

One of the powerful features of open source is re-use: you don't have to re-invent the wheel, but can build on the work of others. That's straightforward enough for software, but it can also be applied to other fields of openness. Here's a fantastic example: embedding OpenStreetMap in Wikipedia entries:


For some time, there have been efforts to bring OpenStreetMap (OSM) and Wikipedia closer together. Both projects have the mission to produce free knowledge and information through a collaborative community process. Because of the similarities, there are many users active in both projects – however mutual integration is still lacking.

For this reason, Wikimedia Deutschland (WM-DE, the German Chapter of Wikimedia) are providing funds of 15.000 Euro (almost $20k) and starting a corresponding pilot project. A group of interested Wikipedians and OSM users have partnered up to reach two goals: the integration of OSM maps in Wikipedia and the installation of a map toolserver. The map toolserver will serve to prototype new mapping-related projects and prepare them for deployment on the main Wikimedia cluster.

Here's how it will work:

Maps are an important part of the information in encyclopaedic articles - however, currently mostly static maps are used. With free interactive maps and a marker system, a new way of presenting information can be created.

For some time there have been MediaWiki Extensions available for embedding OpenStreetMap maps into MediaWiki. That's a great start, but it isn't enough. If these extensions were deployed on Wikipedia without any kind of proxy set-up, the OpenStreetMap tile servers would struggle to handle the traffic.

One of our aims is to build an infrastructure in the Wikimedia projects that allows us to keep the OSM data, or at least the tile images, ready locally in the Wikimedia network. We still have to gain some experience with this, but we are optimistic. On one side, we have a number of Wikipedians in the team who are versed in MediaWiki and scaling software systems, and on the other side we have OSM users who can set up the necessary geo database.

We learned much from the use of the Wikimedia Toolservers – for example, that a platform for experimentation produces far more useful tools than anyone predicted. Interested developers have a good starting position to develop new tools with new possibilities.

We expect similar results from the map toolserver. As soon as it is online, anyone who is interested can apply for an account by presenting their ideas for development projects and their level of experience. We want to allow as many users as possible to implement their ideas without having to worry about the basic setup. We hope that, in the spirit of the creation and distribution of free content, many new maps and visualisations will emerge.
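Just to make the tile-caching idea concrete - this is purely my own sketch, not anything from the Wikimedia/OSM plan - a minimal local tile cache might look something like the following, assuming the standard OSM {zoom}/{x}/{y}.png tile naming (the server URL, cache directory and user-agent string are all placeholders):

# Illustrative sketch only (not the Wikimedia design): fetch an OSM tile,
# keeping a local copy so the upstream tile servers are hit at most once per tile.
import os
import urllib.request

TILE_SERVER = "https://tile.openstreetmap.org"   # standard public OSM tile server
CACHE_DIR = "tile-cache"                         # placeholder for a local, Wikimedia-side cache

def get_tile(zoom, x, y):
    """Return the PNG bytes for one tile, fetching upstream only on a cache miss."""
    path = os.path.join(CACHE_DIR, str(zoom), str(x), "%d.png" % y)
    if not os.path.exists(path):                 # cache miss: fetch once from OSM
        os.makedirs(os.path.dirname(path), exist_ok=True)
        url = "%s/%d/%d/%d.png" % (TILE_SERVER, zoom, x, y)
        request = urllib.request.Request(url, headers={"User-Agent": "tile-cache-sketch"})
        with urllib.request.urlopen(request) as response, open(path, "wb") as out:
            out.write(response.read())
    with open(path, "rb") as f:                  # cache hit (or freshly cached): serve the local copy
        return f.read()

The real infrastructure would also need expiry, rate limiting and its own rendering from the geo database, but the basic idea - serve tiles locally, go upstream only when you must - is the one described above.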

Now, it's happening:

There has been rapid progress on the subject of adding OpenStreetMap maps to Wikimedia projects (e.g. Wikipedia) during the MediaWiki Developer Meet-Up taking place right now in Berlin.

Maps linked to Wikipedia content *within* Wikipedia: I can't wait.

Follow me on Twitter @glynmoody

Transparency and Open Government

Not my words, but those of that nice Mr Obama:


My Administration is committed to creating an unprecedented level of openness in Government. We will work together to ensure the public trust and establish a system of transparency, public participation, and collaboration. Openness will strengthen our democracy and promote efficiency and effectiveness in Government.

Wow.

Specifically:

Government should be transparent. Transparency promotes accountability and provides information for citizens about what their Government is doing.

...

Government should be participatory. Public engagement enhances the Government's effectiveness and improves the quality of its decisions.

...

Government should be collaborative. Collaboration actively engages Americans in the work of their Government.

Read the whole thing - and weep for poor old, locked-up UK....

Follow me on Twitter @glynmoody

Google and Microsoft Agree: This is Serious

You know things are bad when a coalition includes Google and Microsoft agreeing on something...

On Open Enterprise blog.

Follow me on Twitter @glynmoody

RFCs: Request for Openness

There's a fascinating history of the RFCs in the New York Times, written by a person who was there at the beginning:

Our intent was only to encourage others to chime in, but I worried we might sound as though we were making official decisions or asserting authority. In my mind, I was inciting the wrath of some prestigious professor at some phantom East Coast establishment. I was actually losing sleep over the whole thing, and when I finally tackled my first memo, which dealt with basic communication between two computers, it was in the wee hours of the morning. I had to work in a bathroom so as not to disturb the friends I was staying with, who were all asleep.

Still fearful of sounding presumptuous, I labeled the note a “Request for Comments.” R.F.C. 1, written 40 years ago today, left many questions unanswered, and soon became obsolete. But the R.F.C.’s themselves took root and flourished. They became the formal method of publishing Internet protocol standards, and today there are more than 5,000, all readily available online.

For me, the most interesting comments are the following:

The early R.F.C.’s ranged from grand visions to mundane details, although the latter quickly became the most common. Less important than the content of those first documents was that they were available free of charge and anyone could write one. Instead of authority-based decision-making, we relied on a process we called “rough consensus and running code.” Everyone was welcome to propose ideas, and if enough people liked it and used it, the design became a standard.

After all, everyone understood there was a practical value in choosing to do the same task in the same way. For example, if we wanted to move a file from one machine to another, and if you were to design the process one way, and I was to design it another, then anyone who wanted to talk to both of us would have to employ two distinct ways of doing the same thing. So there was plenty of natural pressure to avoid such hassles. It probably helped that in those days we avoided patents and other restrictions; without any financial incentive to control the protocols, it was much easier to reach agreement.

This was the ultimate in openness in technical design and that culture of open processes was essential in enabling the Internet to grow and evolve as spectacularly as it has. In fact, we probably wouldn’t have the Web without it. When CERN physicists wanted to publish a lot of information in a way that people could easily get to it and add to it, they simply built and tested their ideas. Because of the groundwork we’d laid in the R.F.C.’s, they did not have to ask permission, or make any changes to the core operations of the Internet. Others soon copied them — hundreds of thousands of computer users, then hundreds of millions, creating and sharing content and technology. That’s the Web.

I think this is right: the RFCs are predicated on complete openness, where anyone can make suggestions and comments. The Web built on that basis, extending the possibility of openness to everyone on the Internet. In the face of attempts to kill net neutrality in Europe, it's something we should be fighting for.

Follow me on Twitter @glynmoody

06 April 2009

The Latest Act in the ACTA Farce

I think the Anti-Counterfeiting Trade Agreement (ACTA) will prove something of a watershed in the negotiation of treaties. We have already gone from a situation where governments around the world all but denied the thing existed, to one where the same people are now scrambling to create some semblance of openness without actually revealing too much.

Here's the latest attempt, which comes from the US team:

A variety of groups have shown their interest in getting more information on the substance of the negotiations and have requested that the draft text be disclosed. However, it is accepted practice during trade negotiations among sovereign states to not share negotiating texts with the public at large, particularly at earlier stages of the negotiation. This allows delegations to exchange views in confidence facilitating the negotiation and compromise that are necessary in order to reach agreement on complex issues. At this point in time, ACTA delegations are still discussing various proposals for the different elements that may ultimately be included in the agreement. A comprehensive set of proposals for the text of the agreement does not yet exist.

This is rather amusing. On the one hand, the negotiators have to pretend that "a comprehensive set of proposals for the text of the agreement does not yet exist", so that we can't find out the details; on the other, they want to finish off negotiations as quickly as possible, so as to prevent too many leaks. Of course, they can't really have it both ways, which is leading to this rather grotesque dance of the seven veils, whereby bits and pieces are revealed in an attempt to keep us quiet in the meantime.

The latest summary does contain some interesting background details that I'd not come across before:

In 2006, Japan and the United States launched the idea of a new plurilateral treaty to help in the fight against counterfeiting and piracy, the so-called Anti-Counterfeiting Trade Agreement (ACTA). The aim of the initiative was to bring together those countries, both developed and developing, that are interested in fighting counterfeiting and piracy, and to negotiate an agreement that enhances international co-operation and contains effective international standards for enforcing intellectual property rights.

Preliminary talks about such an anti-counterfeiting trade agreement took place throughout 2006 and 2007 among an initial group of interested parties (Canada, the European Commission, Japan, Switzerland and the United States). Negotiations started in June 2008 with the participation of a broader group of participants (Australia, Canada, the European Union and its 27 member states, Japan, Mexico, Morocco, New Zealand, Republic of Korea, Singapore, Switzerland and the United States).

The rest, unfortunately, is the usual mixture of half-truths and outright fibs. But this constant trickle of such documents shows that they are taking notice of us, and that we must up the pressure for full disclosure of what exactly is being negotiated in our name.

Follow me on Twitter @glynmoody

A Different Kind of Wörterbuch

Linguee seems to offer an interesting twist on a boring area - bilingual dictionaries:

With Linguee, you can search for words and expressions in many millions of bilingual texts in English and German. Every expression is accompanied by useful additional information and suitable example sentences.

...

When you translate texts to a foreign language, you usually look for common phrases rather than translations of single words. With its intelligent search and the significantly larger amount of stored text content, Linguee is the right tool for this task. You find:

* In what context a translation is used
* How frequent a particular translation is
* Example sentences: How have other people translated an expression?

By searching not only for a single word, but for that word in its context, you can easily find a translation that fits the context well. With its large number of entries, Linguee often retrieves translations of rare terms that you don't find anywhere else.

There are two other points of interest. The source of the texts:

Our most important source is the bilingual web. Other valuable sources include EU documents and patent specifications.

And the fact that a "GPL version of the Linguee dictionary" is available.

Follow me on Twitter @glynmoody

Google's Perpetual Monopoly on Orphan Works

Here's an interesting analysis of the Google Book Search settlement. This, you will recall, resolved the suit that authors and publishers brought against Google for scanning books without permission - something Google maintained it was entitled to do, since it only wanted to index the books' contents, not display them in their entirety.

At first this looked like an expensive and unnecessary way out for Google: many hoped that it would fight in the courts to determine what was permitted under fair use. But as people have had time to digest its implications, the settlement is beginning to look like a very clever move:


Thanks to the magic of the class action mechanism, the settlement will confer on Google a kind of legal immunity that cannot be obtained at any price through a purely private negotiation. It confers on Google immunity not only against suits brought by the actual members of the organizations that sued Google, but also against suits brought by anyone who doesn’t explicitly opt out. That means that Google will be free to mine the vast body of orphan works without fear of liability.

Any competitor that wants to get the same legal immunity Google is getting will have to take the same steps Google did: start scanning books without the publishers’ and authors’ permission, get sued by authors and publishers as a class, and then negotiate a settlement. The problem is that they’ll have no guarantee that the authors and publishers will play along. The authors and publishers may like the cozy cartel they’ve created, and so they may have no particular interest in organizing themselves into a class for the benefit of the new entrant. Moreover, because Google has established the precedent that “search rights” are something that need to be paid for, it’s going to be that much harder for competitors to make the (correct, in my view) argument that indexing books is fair use.

It seems to me that, in effect, Google has secured for itself a perpetual monopoly over the commercial exploitation of orphan works. Google’s a relatively good company, so I’d rather they have this monopoly than the other likely candidates. But I certainly think it’s a reason to be concerned.

Cunning.

Follow me on Twitter @glynmoody

All Tatarstan Schools Moving to Free Software

Tatarstan is the place to be:

До конца текущего года все школы Татарстана планируется перевести на свободное программное обеспечение на базе операционной системы «Linux».

...

По словам замминистра, в каждой школе республики на уровне кружков планируется открыть курсы по обучению работе в «Linux» учащихся. Но до этого предстоит еще подготовить специалистов, которые будут руководить этими кружками.

Людмила Нугуманова заявила, что Татарстан полностью перейдет на программное обеспечение с открытым кодом на основе операционной системы «Linux». Ведь в 2010 году закончится подписка на лицензию базового пакета программного обеспечения для школ на платформе «Microsoft». «За продолжение подписки придется платить немалые деньги, либо остаться на нашем отечественном продукте «Linux», - отметила она.

Как сообщила начальник отдела развития информационных технологий в образовании Министерства образования и науки РТ Надежда Сулимова, в прошлом году новый софт установлен в 612 школах республики (всего в Татарстане функционируют почти 2,4 тысячи общеобразовательных учреждений).

[Translation: By the end of this year, all schools in Tatarstan are due to be moved to free software based on the Linux operating system.

...

According to the deputy minister, courses teaching pupils to work with Linux are planned in every school in the republic, run as after-school clubs. But before that, the specialists who will lead these clubs still have to be trained.

Lyudmila Nugumanova said that Tatarstan will switch completely to open source software based on the Linux operating system, since the subscription to the licence for the basic Microsoft software package for schools runs out in 2010. "To continue the subscription we would have to pay a lot of money, or else stay with our domestic Linux product," she noted.

As Nadezhda Sulimova, head of the department for the development of information technology in education at the Ministry of Education and Science of the Republic of Tatarstan, reported, the new software was installed in 612 of the republic's schools last year (in all, Tatarstan has almost 2,400 general education schools).]

Follow me on Twitter @glynmoody

How Can We Save Thunderbird Now Email is Dying?

I like Thunderbird. I've been using it for years, albeit now more as a backup for my Gmail account than as my primary email client. But it's always been the Cinderella of the Mozilla family, rather neglected compared to its more glamorous sister Firefox. The creation of the Mozilla Messaging subsidiary of the Mozilla Foundation means that efforts are already underway to remedy that. But there's a deeper problem that Thunderbird needs to face, too....

On Open Enterprise blog.

Follow me on Twitter @glynmoody

05 April 2009

Top 10 Measurements for Open Government

One of the most exciting applications of openness in recent months has been to government. A year ago, open government was sporadic and pretty forlorn as a field; today it is flourishing, notably under the alternative name of "transparency". At the forefront of that drive is the Sunlight Foundation, which has just published a suggested top 10 measurements of just how open a government really is:


1. Open data: The federal government should make all data searchable, findable and accessible.

2. Disclose spending data: The government should disclose how it is spending taxpayer dollars, who is spending it and how it’s being spent.

3. Procurement data: How does the government decide where the money gets spent and who gets it, how is it being spent, and how can we measure success?

4. Open portal for public requests for information: There should be a central repository for all Freedom of Information Act requests that are public, so that people can see in real time when requests come in and how fast the government responds to them.

5. Distributed data: The government should make sure it builds redundancy into its systems so that data is not held in just one location, but in multiple places, in case of a disaster, terrorist attack or other event in which data is damaged. Redundancy would guarantee that government could rebuild the data for future use.

6. Open meetings: Government meetings should be open to the public so that citizens can tell who is trying to influence government. All schedules should be published as soon as they happen so that people can see who is meeting with whom and who is trying to influence whom.

7. Open government research: Currently, when government conducts research, it usually does not report the data it collects until the project is finished. Government should report its research data in beta form while it is being collected. This would be a measure of transparency and would change the relationship that people have to government research as it is being collected.

8. Collection transparency: Government should disclose how it is collecting information, for whom it is collecting the data, and why the data is relevant. The public should have the ability to judge whether or not it is valuable to them, and should be given the ability to comment on it.

9. Allowing the public to speak directly to the president: Recently, we saw the president participate in something called “Open for Questions,” where the public was given the chance to ask him questions directly. Allowing him to burst his bubble and be in touch with the American public in this way is another measure of transparency.

10. Searchable, crawlable and accessible data: If the government were to make all data searchable, crawlable and accessible, we would go a long way towards realizing all the goals presented at the Gov 2.0 Camp.

Great stuff, exciting times. Now, if only the UK government could measure up to these....

Follow me on Twitter @glynmoody

Who Can Put the "Open" in Open Science?

One of the great pleasures of blogging is that your mediocre post tossed off in a couple of minutes can provoke a rather fine one that obviously took some time to craft. Here's a case in point.

The other day I wrote "Open Science Requires Open Source". This drew an interesting comment from Stevan Harnad, pretty much the Richard Stallman of open access, as well as some tweets from Cameron Neylon, one of the leading thinkers on and practitioners of open science. He also wrote a long and thoughtful reply to my post (including links to all our tweets, rigorous chap that he is). Most of it was devoted to pondering the extent to which scientists should be using open source:

It is easy to lose sight of the fact that for most researchers software is a means to an end. For the Open Researcher what is important is the ability to reproduce results, to criticize and to examine. Ideally this would include every step of the process, including the software. But for most issues you don’t need, or even want, to be replicating the work right down to the metal. You wouldn’t after all expect a researcher to be forced to run their software on an open source computer, with an open source chipset. You aren’t necessarily worried what operating system they are running. What you are worried about is whether it is possible read their data files and reproduce their analysis. If I take this just one step further, it doesn’t matter if the analysis is done in MatLab or Excel, as long as the files are readable in Open Office and the analysis is described in sufficient detail that it can be reproduced or re-implemented.

...

Open Data is crucial to Open Research. If we don’t have the data we have nothing to discuss. Open Process is crucial to Open Research. If we don’t understand how something has been produced, or we can’t reproduce it, then it is worthless. Open Source is not necessary, but, if it is done properly, it can come close to being sufficient to satisfy the other two requirements. However it can’t do that without Open Standards supporting it for documenting both file types and the software that uses them.

The point that came out of the conversation with Glyn Moody for me was that it may be more productive to focus on our ability to re-implement rather than to simply replicate. Re-implementability, while an awful word, is closer to what we mean by replication in the experimental world anyway. Open Source is probably the best way to do this in the long term, and in a perfect world the software and support would be there to make this possible, but until we get there, for many researchers, it is a better use of their time, and the taxpayer’s money that pays for that time, to do that line fitting in Excel. And the damage is minimal as long as source data and parameters for the fit are made public. If we push forward on all three fronts, Open Data, Open Process, and Open Source then I think we will get there eventually because it is a more effective way of doing research, but in the meantime, sometimes, in the bigger picture, I think a shortcut should be acceptable.

I think these are fair points. Science needs reproducibility in terms of the results, but that doesn't imply that the protocols must be copied exactly. As Neylon says, the key is "re-implementability" - the fact that you *can* reproduce the results with the given information. Using Excel instead of OpenOffice.org Calc is not a big problem so long as enough detail is given.
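To make the re-implementability point concrete, here is a minimal sketch - my own, not Neylon's - of redoing a published straight-line fit with open tools, assuming the authors released their raw data as a plain two-column CSV and reported the fitted slope and intercept (the file name is hypothetical):

# Minimal sketch of re-implementing a published straight-line fit with open tools.
# Assumes a hypothetical supplementary file "published_data.csv" with two columns: x,y.
import numpy as np

data = np.loadtxt("published_data.csv", delimiter=",")
x, y = data[:, 0], data[:, 1]

slope, intercept = np.polyfit(x, y, 1)          # least-squares fit of y = slope*x + intercept
print("re-implemented fit: y = %.3f * x + %.3f" % (slope, intercept))
# Compare these values with the parameters quoted in the paper; agreement is the practical
# test of reproducibility, regardless of whether the original fit was done in Excel.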

However, it's easy to think of circumstances where *new* code is being written to run on proprietary engines where it is simply not possible to check the logic hidden in the black boxes. In these circumstances, it is critical that open source be used at all levels so that others can see what was done and how.

But another interesting point emerged from this anecdote from the same post:

Sometimes the problems are imposed from outside. I spent a good part of yesterday battling with an appalling, password protected, macroed-to-the-eyeballs Excel document that was the required format for me to fill in a form for an application. The file crashed Open Office and only barely functioned in Mac Excel at all. Yet it was required, in that format, before I could complete the application.

Now, this is a social issue: the fact that scientists are being forced by institutions to use proprietary software in order to apply for grants or whatever. Again, it might be unreasonable to expect young scientists to sacrifice their careers for the sake of principle (although Richard Stallman would disagree). But this is not a new situation. It's exactly the problem that open access faced in the early days, when scientists just starting out in their career were understandably reluctant to jeopardise it by publishing in new, untested journals with low impact factors.

The solution in that case was for established scientists to take the lead by moving their work across to open access journals, allowing the latter to gain in prestige until they reached the point where younger colleagues could take the plunge too.

So, I'd like to suggest something similar for the use of open source in science. When established scientists with some clout come across unreasonable requirements - like the need to use Excel - they should refuse. If enough of them put their foot down, the organisations that lazily adopt these practices will be forced to change. It might require a certain courage to begin with, but so did open access; and look where *that* is now...

Follow me on Twitter @glynmoody

03 April 2009

Why We Should Teach Maths with Open Source

Recently, I was writing about science and open source (of which more anon); here are some thoughts on why maths ought to be taught using free software:

I personally feel it is terrible to *train* students mainly to use closed source commercial mathematics software. This is analogous to teaching students some weird version of linear algebra or calculus where they have to pay a license fee each time they use the fundamental theorem of calculus or compute a determinant. Using closed software is also analogous to teaching those enthusiastic students who want to learn the proofs behind theorems that it is illegal to do so (just as it is literally illegal to learn *exactly* how Maple and Mathematica work!). From a purely practical perspective, getting access to commercial math software is very frustrating for many students. It should be clear that I am against teaching mathematics using closed source commercial software.
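By way of contrast - this is my illustration, not the quoted author's - with an open source system such as SymPy you can not only compute a determinant for free, you can read exactly how it is computed:

# Illustration (mine, not the quoted author's): open source maths software lets you
# inspect the implementation itself, not just use it.
import inspect
import sympy

M = sympy.Matrix([[1, 2], [3, 4]])
print(M.det())                                   # prints -2

# No licence forbids reading how det() works:
print(inspect.getsource(sympy.Matrix.det))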

Follow me on Twitter @glynmoody

User-Generated Content: Microsoft vs. Google

Back in November I was urging you to submit your views on a consultation document on the role of copyright in the knowledge economy, put out by the European Commission. The submissions have now been published online, and I'm deeply disappointed to see that not many of you took a blind bit of notice of my suggestion...

On Open Enterprise blog.

Follow me on Twitter @glynmoody

HADOPI Law Passed - by 12 Votes to 4

What a travesty of democracy:

Alors que le vote n'était pas prévu avant la semaine prochaine, les quelques députés présents à l'hémicycle à la fin de la discussion sur la loi Création et Internet ont été priés de passer immédiatement au vote, contrairement à l'usage. La loi a été adoptée, en attendant son passage en CMP puis au Conseil Constitutionnel.

On peine à en croire la démocratie dans laquelle on prétend vivre et écrire. Après 41 heures et 40 minutes d'une discussion passionnée sur le texte, il ne restait qu'une poignée de courageux députés autour de 22H45 jeudi soir lorsque l'Assemblée Nationale a décidé, sur instruction du secrétaire d'Etat Roger Karoutchi, de passer immédiatement au vote de la loi Création et Internet, qui n'était pas attendu avant la semaine prochaine. Un fait exceptionnel, qui permet de masquer le nombre important de députés UMP qui se seraient abstenus si le vote s'était fait, comme le veut la tradition, après les questions au gouvernment mardi soir. Ainsi l'a voulu Nicolas Sarkozy.

...

Quatre députés ont voté non (Martine Billard, Patrick Bloche et deux députés non identifiés), et une dizaine de mains se sont levées sur les bancs de la majorité pour voter oui. En tout, 16 députés étaient dans l'hémicycle au moment du vote.

[Translation: Although the vote was not expected until next week, the few deputies present in the chamber at the end of the debate on the Création et Internet law were asked to proceed immediately to the vote, contrary to usual practice. The law has been adopted, pending its passage through the joint committee (CMP) and then the Constitutional Council.

It is hard to believe this is the democracy we claim to live and write in. After 41 hours and 40 minutes of impassioned debate on the text, only a handful of courageous deputies remained at around 22:45 on Thursday evening when the National Assembly decided, on the instructions of Secretary of State Roger Karoutchi, to move immediately to the vote on the Création et Internet law, which had not been expected before next week. An exceptional step, which conveniently masks the large number of UMP deputies who would have abstained had the vote been held, as tradition dictates, after questions to the government on Tuesday evening. Such was the will of Nicolas Sarkozy.

...

Four deputies voted no (Martine Billard, Patrick Bloche and two unidentified deputies), and around ten hands went up on the majority benches to vote yes. In all, 16 deputies were in the chamber at the time of the vote.]

So one of the most important and contentious pieces of legislation in recent years has been passed by trickery. In this way, those pushing this law have shown their true colours and their contempt for the democratic process.

Follow me on Twitter @glynmoody

02 April 2009

"Piracy Law" Cuts *Traffic* not "Piracy"

This story is everywhere today:


Internet traffic in Sweden fell by 33% as the country's new anti-piracy law came into effect, reports suggest.

Sweden's new policy - the Local IPRED law - allows copyright holders to force internet service providers (ISP) to reveal details of users sharing files.

According to figures released by the government statistics agency - Statistics Sweden - 8% of the entire population use peer-to-peer sharing.

The implication in these stories is that this kind of law is "working", in the sense that it "obviously" cuts down copyright infringement, because it's cutting down traffic.

In your dreams.

All this means is that people aren't sharing so much stuff online. But now that you can pick up a 1 Terabyte external hard drive for less than a hundred quid - which can store about a quarter of a million songs - guess what people are going to turn to in order to swap files in the future?
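The back-of-the-envelope arithmetic behind that figure, assuming an average MP3 of roughly 4 MB (my illustrative number, nothing official):

# Rough check of the "quarter of a million songs" claim (the 4 MB average is an assumption).
terabyte_in_mb = 1000000      # 1 TB, using the decimal convention drive makers advertise
average_song_mb = 4           # a typical ~4-minute MP3 at around 128 kbps
print(terabyte_in_mb // average_song_mb)   # -> 250000 songs, i.e. about a quarter of a million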

Follow me on Twitter @glynmoody

Open Science Requires Open Source

As Peter Suber rightly points out, this paper offers a reversal of the usual argument, where open access is justified by analogy with open source:


Astronomical software is now a fact of daily life for all hands-on members of our community. Purpose-built software for data reduction and modeling tasks becomes ever more critical as we handle larger amounts of data and simulations. However, the writing of astronomical software is unglamorous, the rewards are not always clear, and there are structural disincentives to releasing software publicly and to embedding it in the scientific literature, which can lead to significant duplication of effort and an incomplete scientific record. We identify some of these structural disincentives and suggest a variety of approaches to address them, with the goals of raising the quality of astronomical software, improving the lot of scientist-authors, and providing benefits to the entire community, analogous to the benefits provided by open access to large survey and simulation datasets. Our aim is to open a conversation on how to move forward.

The central argument is important: that you can't do science with closed source software, because you can't examine its assumptions or logic (that "incomplete scientific record"). Open science demands open source.

Follow me on Twitter @glynmoody

Second Chance at Life

Two years ago, the virtual world Second Life was everywhere, as pundits and press alike rushed to proclaim it as the Next Big Digital Thing. Inevitably, the backlash began soon afterwards. The company behind it, Linden Lab, lost focus and fans; key staff left. Finally, last March, Second Life's CEO, creator and visionary, Philip Rosedale, announced that he was taking on the role of chairman of the board, and bringing in fresh leadership. But against an increasingly dismal background, who would want to step into his shoes?

From the Guardian.

Follow me on Twitter @glynmoody

31 March 2009

Trailing Clouds of Openness

As you may have heard, there's been a bit of a to-do over a new “Open Cloud Manifesto.” Here's the central idea:

The industry needs an objective, straightforward conversation about how this new computing paradigm will impact organizations, how it can be used with existing technologies, and the potential pitfalls of proprietary technologies that can lead to lock-in and limited choice....

On Open Enterprise blog.

Follow me on Twitter @glynmoody

Do Open Source Companies *Really* Support Free Software?

Asterisk, a PBX, telephony engine, and telephony applications toolkit, is one of open source's best-kept secrets. As with many open source projects, a company has been set up to provide support: Digium. Here's its latest press release....

On Open Enterprise blog.

Follow me on Twitter @glynmoody

30 March 2009

Bad News: Microsoft Gets its Way with TomTom

Well, the question as to how the great Microsoft vs. TomTom suit would finish has been answered:

Microsoft and TomTom announced on Monday that they have reached a settlement in their respective patent suits....

On Open Enterprise blog.

Follow me on Twitter @glynmoody

Open Source Social Documentation for Museums

Again, open source reaches into ever-new areas:


The new MAA Documentation System combines open-source technologies with deep social computing principles to create a truly innovative approach to museum documentation. The new MAA Documentation System shifts the age-old documentation principles of standardized description and information accumulation to multi-vocal and multi-source accounts and distributed documentation.

For the past few years, the MAA has been developing an open-source Documentation System. With over 20 years' experience of developing its own Documentation Systems and Collections Management Systems, the MAA is just about to finish one of the most ambitious upgrades in its history. In fact, this system is the result of a complete re-think of its documentation practices. Though the new system takes account of documentation standards, such as SPECTRUM, and newer developments such as CollectionSpace, it differs from the traditional approaches in several key respects.

And if that isn't wonderful enough, this new project comes from Cambridge's Museum of Archaeology & Anthropology - known to its friends as Arch and Anth. Its old, Victorian(?) building was one of the most atmospheric places in Cambridge.

Follow me on Twitter @glynmoody

Save the European Internet – Write to Your MEPs (Again)

Last week I was urging you to write to a particular set of MEPs about proposed changes to the Telecoms Package, which is wending its slow way through the European Union's legislative system. Now it's time to write to *all* your MEPs, since a crucially important vote in a couple of committees is to take place tomorrow. You can read more about what's been happening and why that's a problem on the La Quadrature du Net site, which also offers a detailed analysis of the Telecoms Package and the proposed amendments.

Here's what I've just sent to all my MEPs using WriteToThem:

I am writing to ask you, as my representative, to contact your colleagues on the IMCO and ITRE committees about crucial votes on the Telecoms Package, taking place on 31 March. At stake is nothing less than the future of the Internet in Europe. If amendments being supported by AT&T and others go through, the main driver of the Internet – and with it, online innovation – will be nullified.

This would be deeply ironic, since it was in Europe that the most important online innovation of all – the Web – was invented. In fact, no less a person than Sir Tim Berners-Lee, its inventor, has warned (at http://dig.csail.mit.edu/breadcrumbs/node/144) that the loss of net neutrality – which is what some of the proposed amendments would lead to – would have made it impossible for him to have carried out his revolutionary work. If we wish Europe to remain in the forefront of digital innovation, it is vital that the net neutrality of the Internet be preserved.

This is a complex issue – I personally find it very difficult to navigate through the many conflicting options before the committees. Fortunately, others have already done the hard work, and boiled down the recommendations to the following.

For your colleagues on the IMCO committee, please urge them to:

Vote against the amendments authorizing “net discrimination” and guarantee it is not put in place, by:

rejecting amendments 136=137=138 pushed by AT&T (and the related recitals 116, 117=118)

voting for amendment 135 bringing protection against “net discrimination”

as a fallback, if the first ones are all rejected, vote for amendments 139+141

Vote for positive protection of EU citizens' fundamental rights in amendments 72=146

Vote for protecting EU citizens' privacy by rejecting amendment 85 and voting for am. 150.

Similarly, for those on the ITRE committee, please ask them to:

Protect EU citizens' fundamental rights and freedoms by voting for amendment 46=135 (first reading amendment 138).

Reject the notion of “lawful content” in amendment 45, for it is a major breach of the technical neutrality of the network, would turn operators into private judges, and would open the door to “graduated response” (or “three strikes”) schemes of corporate policing.

If you or your colleagues are interested in seeing the detailed analysis of all the amendments, it can be found here:

http://www.laquadrature.net/wiki/Telecoms_Package_2nd_Reading_ITRE_IMCO_Voting_List.

This is a critical series of votes for the Internet in Europe. At a time of great economic turmoil, the last thing we can afford is to throttle Europe's entrepreneurial spirit; for this reason, I hope that you will be able to convince your colleagues on the committees to vote as suggested above.

Sadly, this is really important and really urgent. Please add your voice if you can, or the Internet as we know it may cease to exist in Europe soon, to be replaced with something closer to a cable TV service. You have been warned.

Follow me on Twitter @glynmoody

29 March 2009

Building on Richard Stallman's Greatest Achievement

What was Richard Stallman's greatest achievement? Some might say it's Emacs, one of the most powerful and adaptable pieces of software ever written. Others might plump for gcc, an indispensable tool used by probably millions of hackers to write yet more free software. And then there is the entire GNU project, astonishing in its ambition to create a Unix-like operating system from scratch. But for me, his single most important hack was the creation of the GNU General Public Licence....

On Linux Journal.

Follow me on Twitter @glynmoody

28 March 2009

Phished by Visa

This is utterly scandalous:

Not content with destroying the world’s economies, the banking industry is also bent on ruining us individually, it seems. Take a look at Verified By Visa. Allegedly this protects cardholders - by training them to expect a process in which there’s absolutely no way to know whether you are being phished or not. Even more astonishing is that this is seen as a benefit!

...

Craziness. But it gets better - obviously not everyone is pre-enrolled in this stupid scheme, so they also allow for enrolment using the same inline scheme. Now the phishers have the opportunity to also get information that will allow them to identify themselves to the bank as you. Yes, Visa have provided a very nicely tailored and packaged identity theft scheme. But, best of all, rather like Chip and PIN, they push all blame for their failures on to the customer.

I've instinctively hated this "Verified by Visa" scheme ever since it came out, and have tried not to use it. The fact that Visa is not just running something inherently insecure, but encouraging merchants to use it in the most insecure way possible, is astonishing even for an industry as rank and rotten as banking.

The one consolation has to be that Verified by Visa is so demonstrably insecure that it should be easy to challenge in court any attempts to make customers pay for the banks' own stupidity.

Follow me on Twitter @glynmoody

27 March 2009

Why Everyone Hates the PRS

Another classic post from Mike Masnick about the absurdities our current copyright regime visits upon us:

PRS has now threatened a woman who plays classical music to her horses in her stable to keep them calm. She had been turning on the local classical music station, saying that it helped keep the horse calm -- but PRS is demanding £99 if she wants to keep providing such a "public performance." And it's not just a one-off. Apparently a bunch of stables have been receiving such calls.

That's pathetic enough, but it's Masnick's parting shot that really struck me:

The group seems to believe that playing music in almost any situation now constitutes a public performance and requires a licensing fee. You just know they're salivating over the opportunity to go after people playing music in their cars with the windows down.

Because you know what? I bet the PRS is really considering how to do this.

Follow me on Twitter @glynmoody

26 March 2009

"Three Strikes" Struck Down for Third Time

As I wrote earlier today, things are looking bad for the Internet in Europe. But the European Parliament continues to do its bit protecting you and me. Here's the latest from the excellent Quadrature du Net site:

The European Parliament, endorsing the Lambrinidis report and turning its back on all the amendments supported by the French government and defended by Jacques Toubon and Jean-Marie Cavada, has just rejected "graduated response" for the third time. France is definitely alone in the world with its kafkaesque administrative machinery, an expensive mechanism for arbitrary punishment.

The report of Eurodeputy Stavros Lambrinidis concerning the protection of individual liberties on the Internet has just been confirmed by the European parliament by an overwhelming vote of 481 to 252.

It stands in clear opposition to the French HADOPI law in "holding that illiteracy with computers will be the illiteracy of the 21st century; holding that guaranteeing Internet access to all citizens is the same as guaranteeing all citizens access to education and holding that such access must not be refused in punishment by governments or private organizations; holding that this access should not be used abusively for illegal activities; holding that attention must be paid to emerging questions such as network neutrality, interoperability, the global accessibility of all Internet nodes, and the use of open formats and standards."

The approval of the Lambrinidis report and the rejection of the French amendments is the third consecutive time that the European Parliament has rejected the French "graduated response", since the approval of the Bono amendment to the report on cultural industries and the well-known Bono/Cohn-Bendit/Roithova Amendment 138.

Furthermore, all the amendments supported by the French government, notably those proposed by Eurodeputies Jacques Toubon and Jean-Marie Cavada, have been rejected. They were trying specifically to prevent measures related to graduated response, showing that the French government realizes that Europe is about to render the HADOPI law obsolete before it even comes to a vote.

Alas, this is by no means the end. The same wretched clause will come bounding back, along with all kinds of other stupidities. The fight goes on....

Follow me on Twitter @glynmoody

Patents Fail to Make Patent = Patent Failure

WIPO has just published a study entitled Dissemination of Patent Information. I've not read it, but here's someone who has, with an interesting observation:


In the first 71 paragraphs of the study, theoretical availability of patent information is confused with dissemination of patent information. Indeed, the study itself, belatedly, recognises the distinction between the theory of patent law and disclosure and the reality of accessing useful patent information in paragraph 72. Here the study states that availability of information does not always mean it is accessible in practical terms. Based on the figures provided in the study, in practical terms, accessibility of patent information is quite poor.

In other words, the one thing that patents *must* do - to disclose and make patent - they generally do badly. The net effect is that patents take away from the knowledge commons, without giving back even the paltry payment they owe. Add it to the (long) list of why patents fail. (Via Open Access News.)

Follow me on Twitter @glynmoody

Open Knowledge Conference (OKCon) 2009

If open knowledge is your thing, London is the place, Saturday the time:

The Open Knowledge Conference (OKCon) is back for its fourth installment, bringing together individuals and groups from across the open knowledge spectrum for a day of discussions and workshops.

This year the event will feature dedicated sessions on open knowledge and development and on the semantic web and open data. Plus there's the usual substantial allocation of 'Open Space' -- sessions, workshops and discussions proposed either via the CFP or on the day.

Follow me on Twitter @glynmoody

Save the European Internet – Write to Your MEPs

Things seem to be going from bad to worse with the EU's Telecoms Package. Now, not only do we have to contend with France's attempts to push through its “three strikes and you're out” approach again, which the European Parliament threw out, but there are several other amendments being proposed that would effectively gut the Internet in Europe.

The Open Rights Group has a good summary of two of the main threats (also available from its Blackout Europe Facebook group):

One of the most controversial issues is that of the three-strikes approach strongly and continuously pushed by France in the EU Council. Although most of the provisions introducing the graduated response system were rejected at the first reading of the Telecoms Package, there are still some alarming ones persisting. France is trying hard to get rid of Amendment 138, which seeks to protect users’ rights against the three-strikes sanctions and which, until now, has stopped the EU from applying the three-strikes policy. Also, some new amendments reintroduce the notion of lawful content, which would impose an obligation on ISPs to monitor content going through their networks.

The UK government is pushing for the “wikipedia amendments” (so-called because one of them has been created by cutting and pasting a text out of the wikipedia) in order to allow ISPs to make limited content offers. The UK amendments eliminate the text that gives users rights to access and distribute content, services and applications, replacing it with a text that says “there should be transparency of conditions under which services are provided, including information on the conditions to and/or use of applications and services, and of any traffic management policies.”

To these, we must now add at least one more, which the indispensable IPtegrity site has spotted:

Six MEPs have taken text supplied by the American telecoms multi-national, AT&T, and pasted it directly into amendments tabled to the Universal Services directive in the Telecoms Package. The six are Syed Kamall, Erika Mann, Edit Herczog, Zita Pleštinská, Andreas Schwab, and Jacques Toubon.

AT&T and its partner Verizon want the regulators in Europe to keep their hands off new network technologies which will provide the capability for broadband providers to restrict or limit users' access to the Internet. They have got together with a group of other telecoms companies to lobby on this issue. Their demands pose a threat to the neutrality of the network and, at another level, to millions of web businesses in Europe.

As you can read, this is a grave danger for the Internet in Europe, because it would allow telecoms companies to impose restrictions on the services they provide. That is, at will, they could discriminate against new services that threaten their existing offerings – and hence throttle online innovation. The Internet has grown so quickly, and become so useful, precisely because it is an end-to-end service: it does not take note of or discriminate between packets; it simply delivers them.

What is particularly surprising is that one of the MEPs putting forward this amendment is the UK's Syed Kamall, who has a technical background, and in the past has shown himself aware of the larger technological issues. I'm really not sure why he is involved in this blatant attempt by the telecoms companies to subvert the Internet in Europe.

Since he is one of my MEPs (he represents London), I've used the WriteToThem service to send him the following letter:

I was surprised and greatly disappointed to learn that you are proposing an amendment to the Telecoms Package that would have the consequence of destroying the network neutrality of the Internet – in many ways, its defining feature.

Your amendment 105, which requires network providers to inform users of restrictions and/or limitations on their communications services, will allow companies to impose arbitrary blocks on Internet services; instead, we need to ensure that no such arbitrary restrictions are possible.

As the inventor of the Web, Sir Tim Berners-Lee, has pointed out when net neutrality was being debated in the US (http://dig.csail.mit.edu/breadcrumbs/node/144):

“When I invented the Web, I didn't have to ask anyone's permission. Now, hundreds of millions of people are using it freely. I am worried that that is going end in the USA.

I blogged on net neutrality before, and so did a lot of other people. ... Since then, some telecommunications companies spent a lot of money on public relations and TV ads, and the US House seems to have wavered from the path of preserving net neutrality. There has been some misinformation spread about. So here are some clarifications.

Net neutrality is this:

If I pay to connect to the Net with a certain quality of service, and you pay to connect with that or greater quality of service, then we can communicate at that level.

That's all. Its up to the ISPs to make sure they interoperate so that that happens.

Net Neutrality is NOT asking for the internet for free.

Net Neutrality is NOT saying that one shouldn't pay more money for high quality of service. We always have, and we always will

There have been suggestions that we don't need legislation because we haven't had it. These are nonsense, because in fact we have had net neutrality in the past -- it is only recently that real explicit threats have occurred.”

He concludes:

“Yes, regulation to keep the Internet open is regulation. And mostly, the Internet thrives on lack of regulation. But some basic values have to be preserved. For example, the market system depends on the rule that you can't photocopy money. Democracy depends on freedom of speech. Freedom of connection, with any application, to any party, is the fundamental social basis of the Internet, and, now, the society based on it.”

I'm afraid that what your amendment will do is to destroy that freedom. I am therefore asking you to withdraw your amendment, to preserve the freedom of the connection that allows new services to evolve, and innovations to be made without needing to ask permission of the companies providing the connection. Instead, the Internet needs net neutrality to be enshrined in law, and if possible, I would further request you and your colleagues to work towards this end.

If you are also based in London – or in a constituency represented by one of the five other MEPs mentioned in the IPtegrity story – I urge you to write a similar (but *not* identical) letter to them. It is vitally important these amendments be withdrawn, since most MEPs will be unaware of the damage they can do, and might well wave them through. Further letters to all MEPs will also be needed in due course, but I think it's best to concentrate on these particular amendments for the moment, since they are a new and disturbing development.

Follow me on Twitter @glynmoody

25 March 2009

A Question Red Hat Must Answer

With apologies for returning to the theme of patents, I'd like to direct your attention to a long and interesting piece that has appeared on the Digital Majority site asking a very important question: “Did Red Hat lobby for, or against software patents in Europe?”

On Open Enterprise blog.

Follow me on Twitter @glynmoody

Copyright: An Open Letter for Closed Minds

Another impressive line-up of mega-academics denouncing the lack of logic for the proposed copyright extension currently being considered in the EU (I'll be writing about this again soon). Here's Rufus Pollock's intro, setting this open letter in a historical context:

The letter, of which I was a signatory, is focused on the change in the UK government’s position (from one of opposition to a term extension to, it appears, one of allowing an extension “perhaps to 70 years”). However, it is noteworthy that this is only one in a long line of well-nigh universal opposition among scholars to this proposal to extend copyright term.

For example, last April a joint letter was sent to the Commission signed by more than 30 of the most eminent European (and a few US) economists who have worked on intellectual property issues (including several Nobel prize winners, the Presidents of the EEA and RES, etc). The letter made very clear that term extension was considered to be a serious mistake (you can find a cached copy of this letter online here). More recently — only two weeks ago — the main European centres of IP law issued a statement (addendum) reiterating their concerns and calling for a rejection of the current proposal.

Despite this well-nigh universal opposition from IP experts the Commission put forward a proposal last July to extend term from 50 to 95 years (retrospectively as well as prospectively). That proposal is now in the final stages of its consideration by the European Parliament and Council. We can only hope that they will understand the basic point that an extension of the form proposed must inevitably do more harm than good to the welfare of the EU and should therefore be opposed.

Do read the letter too: the intellectual anger at this stupidity is palpable.

Follow me on Twitter @glynmoody

UK Pupils to Learn How to Be Spied On

Here are two interesting stories:

First, the government wants children to use social networking sites like Twitter:


Children will no longer have to study the Victorians or the second world war under proposals to overhaul the primary school curriculum, the Guardian has learned.

However, the draft plans will require children to master Twitter and Wikipedia and give teachers far more freedom to decide what youngsters should be concentrating on in classes.

Second, the government wants to monitor social networking sites like Twitter:

Social networking sites like Facebook could be monitored by the UK government under proposals to make them keep details of users' contacts.

Putting these together, we can deduce that the UK government has decided that passively monitoring people isn't enough: now it's time actively to train future generations in the fine art of being spied upon.

Follow me on Twitter @glynmoody

24 March 2009

And RMS Spake, and it Was Good

As well as being a great coder, RMS is a fine writer (he made a number of excellent suggestions when I sent him rough drafts of the relevant chapter of Rebel Code). So it's a pity that he doesn't write much these days.

And it's also a red-letter day when he does, as with his latest missive: "The Javascript Trap". This describes a problem he has spotted: non-free Javascript.

It is possible to release a Javascript program as free software, by distributing the source code under a free software license. But even if the program's source is available, there is no easy way to run your modified version instead of the original. Current free browsers do not offer a facility to run your own modified version instead of the one delivered in the page. The effect is comparable to tivoization, although not quite so hard to overcome.

He comes up with some interesting solutions:

we need to change free browsers to support freedom for users of pages with Javascript. First of all, browsers should be able to tell the user about nontrivial non-free Javascript programs, rather than running them. Perhaps NoScript could be adapted to do this.

Browser users also need a convenient facility to specify Javascript code to use instead of the Javascript in a certain page.
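
To make the first of those suggestions a little more concrete, here is a rough sketch in Python of the sort of heuristic a NoScript-style tool might apply before running a page's scripts. The signals and thresholds are my own guesses for illustration – they are not RMS's criteria, nor anything NoScript actually implements.

# Crude heuristic for flagging "nontrivial" JavaScript and checking for a
# free licence notice. Illustrative only: the patterns and size threshold
# below are guesses, not RMS's definition or NoScript's real behaviour.
import re

LICENSE_HINTS = re.compile(r"GPL|MIT License|Apache License|BSD", re.IGNORECASE)
NONTRIVIAL_SIGNS = [
    re.compile(r"\beval\s*\("),             # dynamically constructed code
    re.compile(r"\bXMLHttpRequest\b"),      # makes AJAX requests
    re.compile(r"\bdocument\.write\s*\("),  # rewrites the page on the fly
]

def classify_script(source, size_threshold=2000):
    """Crudely label a script as trivial/nontrivial and licensed/unlicensed."""
    nontrivial = len(source) > size_threshold or any(
        pattern.search(source) for pattern in NONTRIVIAL_SIGNS
    )
    licensed = bool(LICENSE_HINTS.search(source))
    return (
        "nontrivial" if nontrivial else "trivial",
        "free licence found" if licensed else "no free licence found",
    )

# Hypothetical inline script lifted from a page:
snippet = "function track() { var x = new XMLHttpRequest(); x.open('GET', '/t'); x.send(); }"
print(classify_script(snippet))   # -> ('nontrivial', 'no free licence found')

A real browser add-on would obviously have to hook into the page-loading machinery and ask the user what to do; the point of the sketch is simply that the detection step RMS describes is not hard in principle.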

RMS: where would we be without him?

Follow me on Twitter @glynmoody

Why Software Should not be Patentable

As I've written elsewhere today, there's a lot of activity happening around software patents at the moment. One forum where they're being considered is WIPO.

The FSFE has put together a suitably diplomatic submission to one of its committees about why software should not be patentable; here's the key section:


the economic rationale for patents is based on providing incentives in cases of market failure, disclosure of knowledge in the public domain, as well as technology transfer, commercialisation, and diffusion of knowledge. The “three step test for inclusion in the patent system” should therefore be based on demonstrated market failure to provide innovation, demonstrated positive disclosure from patenting, and effectiveness of the patent system in the area to disseminate knowledge. Software fails all three tests, for instance, as innovation in the IT industry has been dramatic before the introduction of patents, there is no disclosure value in software patents, and patents play no role in the diffusion of knowledge about software development.

I think this is one of the best summaries on the subject. One to cut out and keep.

Follow me on Twitter @glynmoody

Are You Using R?

It seems appropriate on Ada Lovelace Day to note the move of one of the best-known female champions of open source from Intel to the startup REvolution Computing....

On Open Enterprise blog.

Patently, There's Something in the Air

Yesterday I was writing about the latest moves in the TomTom saga, and its involvement with the Open Invention Network patent commons. But beyond that specific case, patents – particularly software patents – really seem to be in the air at the moment....

On Open Enterprise blog.

We Have a Choice

as civilization collapses, we're going to see horrific scarcities, creating massive personal and collective stresses that will break both individuals (to the point of suicide, terrorism and murder) and nations (to the point of insurrection, civil war, and anarchy -- a hundred Afghanistans). We're going to see dreadful pandemic diseases and poverty and famine that will be utterly shattering, like the abject horror the world witnessed during the Irish potato famine where millions simply sat around, hopeless and increasingly gaunt, until they died an agonizing death alongside those they loved and couldn't save. We're going to see the kind of spiritual vacuum and decay that is eating Russia and the former Soviet republics alive today, with population and life expectancy plummeting, drug addiction at epidemic levels, and crime and gang violence out of control. It is nature's last and most reluctant way of restoring to sustainable populations species whose numbers and voraciousness have run amok.

Or, as an alternative, we could be sensible and tackle the problems facing us - climate change, deforestation, overfishing, overpopulation, peak oil, peak water, poverty - seriously, not with political posturing and soundbites, and maybe come out the other side.

Where are the Alpha *Female* Hackers?

Today is Ada Lovelace Day:

Ada Lovelace Day is an international day of blogging to draw attention to women excelling in technology.

Women’s contributions often go unacknowledged, their innovations seldom mentioned, their faces rarely recognised. We want you to tell the world about these unsung heroines. Entrepreneurs, innovators, sysadmins, programmers, designers, games developers, hardware experts, tech journalists, tech consultants. The list of tech-related careers is endless.

Recent research by psychologist Penelope Lockwood discovered that women need to see female role models more than men need to see male ones. That’s a relatively simple problem to begin to address. If women need female role models, let’s come together to highlight the women in technology that we look up to. Let’s create new role models and make sure that whenever the question “Who are the leading women in tech?” is asked, that we all have a list of candidates on the tips of our tongues.

Not surprisingly, my first thought was: who have we got in the world of free software? There are certainly some big names like Mitchell Baker, Chief Lizard Wrangler of Mozilla, and Stormy Peters, Executive Director of the GNOME Foundation.

But notice that both of these occupy executive positions: they hack business/legal/social systems. And while there are plenty of female coders contributing to free software projects, I can't think of any high-profile ones that might stand alongside the obvious alpha males in the coding world.

Now, this is probably due to my ignorance as much as anything. So I'd like to put out a call for names that I ought to know in this context - women who code at a high level, and whose names I should be mentioning more often. And as a pendant I'd also be interested in people's thoughts as to how we can nurture more top-flight female hackers.

Update: Just come across this great List of women in Open Source.

23 March 2009

The New Crowdsourcing: Pubsourcing

Interesting development here:

The United States has unveiled an unlikely weapon in its battle against drugs gangs and illegal immigrants at the Texas-Mexico border - pub-goers in Australia.

The drinkers are the most far-flung of a sizeable army of hi-tech foot soldiers recruited to assist the border protection effort.

Anyone with an internet connection can now help to patrol the 1,254-mile frontier through a network of webcams set up to allow the public to monitor suspicious activity. Once logged in, the volunteers spend hours studying the landscape and are encouraged to email authorities when they see anyone on foot, in vehicles or aboard boats heading towards US territory from Mexico.

But the important point here is not just the quaint locale: it is the fact that the observers are completely disconnected from the observed. There is no human connection, so there would be no compunction in reporting anything required.

This is the perfect surveillance system: not where your neighbours keep an eye on you, but where total strangers the other side of the world do. (Via The Reg.)

Have I Got News for *Them*

This is just incredible:

Major media companies are increasingly lobbying Google to elevate their expensive professional content within the search engine's undifferentiated slush of results.

Many publishers resent the criteria Google uses to pick top results, starting with the original PageRank formula that depended on how many links a page got. But crumbling ad revenue is lending their push more urgency; this is no time to show up on the third page of Google search results. And as publishers renew efforts to sell some content online, moreover, they're newly upset that Google's algorithm penalizes paid content.

Let's just get this right. The publishers resent the fact that the stuff other than "professional content" is rising to the top of Google searches, because of the PageRank algorithm. But wait, doesn't the algorithm pick out the stuff that has most links - that is, those sources that people for some reason find, you know, more relevant?

So doesn't this mean that the "professional content" isn't, well, so relevant? Which means that the publishers are essentially getting what they deserve because their "professional content" isn't actually good enough to attract people's attention and link love?
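
For anyone who hasn't seen it spelled out, here is a toy sketch of that link-counting idea – a simplified power iteration over a made-up link graph, emphatically not Google's actual algorithm or ranking code:

# Toy illustration of the link-counting idea behind PageRank.
# Simplified power iteration over an invented link graph.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages   # dangling pages share rank with all
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# Made-up mini-web: several pages link to the blog post; nothing links to
# the paywalled article, so it ends up at the bottom of the ranking.
web = {
    "blog_post": ["wiki_page"],
    "forum_thread": ["blog_post"],
    "wiki_page": ["blog_post"],
    "paywalled_article": ["blog_post"],
}

for page, score in sorted(pagerank(web).items(), key=lambda item: -item[1]):
    print(f"{page}: {score:.3f}")

The pages nobody links to get nothing beyond the baseline share, which is precisely the "penalty" the publishers are complaining about.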

And the idea that Google's PageRank is somehow "penalising" paid content by not ignoring the fact that people are reading it less than other stuff, is just priceless. Maybe publishers might want to consider *why* their "professional content" is sinking like a stone, and why people aren't linking to it? You know, little things like the fact it tends to regard itself as above the law - or the algorithm, in this case? (Via MicroPersuasion.)

Patent Commons: Uncommon but Patently Good

News that TomTom is joining the Open Invention Network (OIN) reminded me that the latter is an example of a patent commons, where patents are shared on a like-for-like basis:

OIN grants patent license to licensee

– All OIN patents and applications for all products

Licensee grants patent license to OIN

– All licensee patents and applications for the Linux System

Licensee grants license to other current and future licensees

– All licensee patents and applications for the Linux System

It's an interesting approach, and one that's gradually gaining adherents. For example, IBM set up something called the Eco-Patents Commons:

The Eco-Patent Commons is an initiative to create a collection of patents on technology that directly or indirectly protects the environment. The patents will be pledged by companies and other intellectual property rights holders and made available to anyone free of charge.

What's interesting here is that one commons - that of eco-patents - is being used to protect another - the environment. There's more information about the idea in this post by someone who works for IBM and is involved in the project.

Why TomTom is the new SCO (in the nicest possible way)

The 2003 SCO lawsuit, for those of you too young to recall, began as a modest request for $1 billion from IBM for allegedly “misusing and misappropriating SCO’s proprietary software” amongst other things....

On Open Enterprise blog.

The State of the Database State

A recurrent theme in these posts – and throughout Computerworld UK – has been the rise of vast, unnecessary and ultimately doomed databases in the UK.

But those stories have been largely sporadic and anecdotal; what has been lacking has been a consolidated, coherent and compelling analysis of what is going on in this area – what is wrong, and how we can fix it.

That analysis has just arrived in the form of the Database State report, commissioned by the Joseph Rowntree Foundation from the Foundation for Information Policy Research (FIPR).

On Open Enterprise blog.

22 March 2009

Why Barclays Are Barking

The little brouhaha concerning the Guardian and Barclays Bank is a wonderful object lesson in how the Internet changes everything. Once those super-secret documents were put up for even a few seconds, the game was over: taking them down from the Guardian afterwards really is the proverbial closing of the stable door after the horse has bolted.

Inevitably, a copy has made its way to Wikileaks; inevitably that link is being exposed all over the place, which has led to the site being overloaded (do make a donation if you can: I've given my widow's mite). Barclays Bank can apply for as many injunctions as they like, the judge can - and probably will - huff and puff as much as he/she likes, but the game's over: this stuff is out.

And quite right too: these documents either show the bank engaged in something dodgy, in which case they should be published, or they don't, in which case there's no problem in them being public anyway, since the bank is asking for serious scads of public dosh, and is effectively being part-nationalised.

But even if it weren't, it would be folly to try to keep them secret now: it would only ensure that even more people write about them, and point to them, and maybe even read them. The rules have changed.

Taking the War against Terror to a New Level...

..of utter, inane stupidity. Here's the grand summing-up of Brown's "new level":

Terrorism threatens the rights that all in this country should hold dear, including the most fundamental human right of all - the right to life. We know that terrorists will keep on trying to strike and that protecting Britain against this threat remains our most important job.

That tired old Blairite trope: the "right to life" as the "the most fundamental human right of all". Except that it's not a *right*: do I have a right to life when I'm suffering from a terminal disease? Do I have a right to life when I'm 123 years old? Do I have a right to life when the Sun explodes? "Right to life": an idiotic meme, which certainly has no "right to life".

What he should have said is this:

This government threatens the rights that all in this country should hold dear, including the most fundamental human right of all - freedom. We know that this government will keep on trying to strike and that protecting Britain against this threat remains your most important job.