15 April 2009

The Value of Sharing

Yesterday I wrote about how the media industries abuse language in order to justify their broken business models; today I'd like to complement this by looking at their misuse of numbers.

On Open Enterprise blog.

RMS on Amazon's "Swindle"

As you've probably seen, there is concern over Amazon's plans to pull the text-to-speech capability of the Kindle e-book reader, because of misguided pressure from authors' groups in the US. There's been a lot of discussion about this, and how to react to it, on the A2k mailing list, including the following characteristic submission from a certain Richard M Stallman:


I sympathize with the feeling behind these protests, but they are directed at the wrong target.

The protestors rightly condemn the Authors Guild for demanding the removal of the screen reader feature, but the way they are doing it makes Amazon look like a victim. Actually it is the main perpetrator.

The reason that Amazon can turn off the screen reader capability is that the machines use non-free software, controlled by Amazon rather than by the user. If Amazon can turn this off retroactively (does anyone know for certain if they did?), it implies the product has a dangerous back door.

In addition, the Amazon Swindle is designed with Digital Restrictions Management to stop people from sharing. It is a nasty product with an evil goal.

I hope there will be protests against Amazon's role in these events.

Well, at least he's consistent.

Follow me on Twitter @glynmoody

14 April 2009

Up Next for UK: Ban on Photos of CCTVs?

This is too rich:

The man probing police conduct over the death of a newspaper seller during the G20 protests was wrong to claim there were no CCTV cameras in the area near the Bank of England, it was revealed today.

Several cameras could have captured footage of the incident two weeks ago, contradicting comments made by Nick Hardwick, the chairman of the Independent Police Complaints Commission.

Mr Hardwick made the claim in response to the IPCC being accused of sweeping away evidence of police brutality.

I don't know which is more breathtaking: the fact that he said it, or the fact that he thought it wouldn't be checked and found to be inconsistent with reality, as the Daily Mail pictures prove.

Which brings up an interesting possibility. Having banned photos of the police - so as to stop members of the public gathering evidence of police brutality - the next logical step would be to forbid people from taking photos of CCTV cameras or talking about their location - because that would "help terrorists" - so that the police can then claim the cameras don't exist in an area where police brutality has taken place.

And if you think that's utterly impossible, you haven't been paying attention.... (Via @stevepurkiss.)

Update: What a surprise, the IPCC has suddenly found those errant CCTV cameras. Amazing how a picture can change one's perception.

Follow me on Twitter @glynmoody

Channelling the Power of Open Source

This blog tends to concentrate on two broad aspects of open source: the issues that affect enterprise users, and the companies based around creating free software. But this misses out a crucial player, that of the “channel”, also known by the equally unhelpful name of “value-added resellers”, or VARs....

On Open Enterprise blog.

Follow me on Twitter @glynmoody

Let's Drop this Insulting “Digital Piracy” Meme

Until recently, piracy referred to the lawless times and characters of the 17th and 18th centuries – or, if closer to the present, to artful/humorous representations of them in books, theatre and film. This has allowed the media industries to appropriate the historical term, and re-fashion it for their own purposes. And they have been highly successful: even normally sane journalists now write about “software piracy”, or “music piracy”....

On Open Enterprise blog.

Follow me on Twitter @glynmoody

13 April 2009

Of Bruce's Law and Derek's Corollary

Much will be written about the events of the last few days concerning the leaked Labour emails, and the plans to create a scurrilous blog. The focus will rightly be on the rise of blogs as a powerful force within the world of journalism, fully capable of bringing down politicians. But here I'd like to examine an aspect that I suspect will receive far less attention.

At the centre of the storm are the emails: what they say, who sent them and who received them. One suggestion was that they were stolen from a cracked account, but that version seems increasingly discounted in favour of the idea that someone who disapproved of the emails' contents simply leaked them. What's interesting for me is how easy this has become.

Once upon a time – say, ten years ago – you would have needed to break into an office somewhere to steal a document in order to leak it. Now, though, the almost universal use of computers means that all this stuff is handily stored in digital form. As a result, sending it to other people is as simple as writing their name (or just the first few letters of their name, given the intelligence built into email clients these days.) This means that multiple copies probably exist in different physical locations.

Moreover, making a further copy leaves *no* trace whatsoever; indeed, the whole of the Internet is based on copies, so creating them is nothing special. Trying to stop copies being made of a digital document, once sent out, is an exercise in futility, because that implies being in control of multiple pre-existing copies at multiple locations – possibly widely separated.

Bruce Schneier has memorably written that "trying to make digital files uncopyable is like trying to make water not wet." I'd like to call this Bruce's Law. What has happened recently to the Labour emails is an inevitable consequence of Bruce's Law – the fact that digital documents, once circulated, can and will be copied. Tender and thoughtful alike, perhaps we should dub this fact Derek's Corollary, in honour of one of the people who has done so much to bring its effects to our attention.

Follow me on Twitter @glynmoody

Why, Actually, Are They Hiding ACTA?

One of the curious aspects of articles and posts about the Anti-Counterfeiting Trade Agreement (ACTA) is that it's all a kind of journalistic shadow-boxing. In the absence of the treaty text, everybody has been relying on leaks, and nudges and winks in the form of official FAQs and “summaries” to give them some clue as to its content....

On Open Enterprise blog.

Follow me on Twitter @glynmoody

Mikeyy Update

OK, I seem to have regained control of my Twitter homepage, and cleared out the infection. But you might want to (a) be sceptical about it and other Twitter follow requests (b) use a non-Web Twitter client and (c) install the NoScript add-on for Firefox, which blocks the operation of Mikeyy and its ilk.

Apologies again.

Urgent: Do *Not* Visit My Twitter Page

My Twitter account has become infected with Mikeyy - ironically because I was checking out whether to block a new follower. Please ignore all my Twitter posts for the moment, especially the last one, which is fake and infected. And apologies to anyone who may already have been infected in this way.

It's slightly annoying that this is not the first but the second wave of such infections: I wish Twitter would get this vulnerability sorted out, or it will make Twitter unusable.

10 April 2009

Tesla Model S Sedan Runs on GNU/Linux...

...well, its "Haptic Entertainment And Navigation System" does:


It’s a 17-inch LCD touch computer screen that has 3G or wireless connectivity. When we were in the car, the screen featured Google Maps. Tesla’s website verifies that the screen will be able to feature sites like Google Maps and Pandora Music. From what we saw yesterday, the screen is divided vertically into three separate areas: the maps/navigation screen, radio/entertainment area, and climate controls. The navigation screen has several tabs: “internet,” “navigation,” “car,” “backup,” and “phone.” The entertainment section has several tabs, including “audio,” “media,” “streaming,” “playlists,” “artists” and “songs.” The climate controls seem pretty standard. Our driver (see video) says that the computer is going to be run on some kind of Google Maps software and will feature a “full browser.” It’s not surprising that Google Maps is integrated into the interface: Google co-founders Sergey Brin and Larry Page are investors in Tesla. The dashboard is also an LCD touch screen. Tesla has also confirmed to us that the computer/entertainment center will be Linux-based.

GNU/Linux: it's the future.

Follow me on Twitter @glynmoody

How Apt: Apt-urls Arrive

One of the unsung virtues of open source is the ease with which you can add, remove and upgrade programs. In part, this comes down to the fact that all software is freely available, so you don't need to worry about cost: if you want it, you can have it. This makes installation pretty much a one-click operation using managers like Synaptic.

Now things have become even easier:


As of this morning, apt-urls are enabled on the Ubuntu Wiki. What does this mean? In simple terms, this feature provides a simple, wiki-based interface for apt, the base of our software management system. It means that we can now insert clickable links on the wiki that can prompt users to install software from the Ubuntu repositories.

That's pretty cool, but even more amazing is the fact that when I click on the link in the example on the above page, it *already* works:

If you are a Firefox user on Ubuntu, you will also note that the link I’ve provided here works, too. This is because Firefox also allows apt-urls to work in regular web pages.
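
Under the hood there's no magic: an apt-url is just a link using the apt: scheme with a package name as its payload, and a small desktop handler passes that name on to the package manager. Purely as an illustration - the package name and the shell command below are my own assumptions, not anything taken from the wiki page - a toy handler might look like this:

```python
import subprocess
from urllib.parse import urlparse

def handle_apt_url(url):
    """Toy handler for an apt-url such as "apt:vlc".

    Ubuntu's real handler (apturl) passes the package name to a
    privileged installation backend with a confirmation dialog;
    here we simply shell out to apt-get to show the idea.
    """
    parsed = urlparse(url)
    if parsed.scheme != "apt":
        raise ValueError("not an apt-url: %s" % url)
    package = parsed.netloc or parsed.path  # accepts apt://vlc and apt:vlc
    subprocess.check_call(["sudo", "apt-get", "install", "--yes", package])

# A wiki or web page link like <a href="apt:vlc">Install VLC</a> would,
# once clicked, end up doing roughly this:
# handle_apt_url("apt:vlc")
```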

Free software is just *so* far ahead of the closed stuff: how could anyone seriously claim that it doesn't innovate?

Follow me on Twitter @glynmoody

Open Sourcing 3D Printer Materials

I've written a fair amount about open source fabbers, but here's someone addressing another important aspect: open sourcing how to make the basic material used by 3D printers:

About five years ago, Mark Ganter, a UW mechanical engineering professor and longtime practitioner of 3-D printing, became frustrated with the high cost of commercial materials and began experimenting with his own formulas. He and his students gradually developed a home-brew approach, replacing a proprietary mix with artists' ceramic powder blended with sugar and maltodextrin, a nutritional supplement. The results are printed in a recent issue of Ceramics Monthly. Co-authors are Duane Storti, UW associate professor of mechanical engineering, and Ben Utela, a former UW doctoral student.

"Normally these supplies cost $30 to $50 a pound. Our materials cost less than a dollar a pound," said Ganter. He said he wants to distribute the free recipes in order to democratize 3-D printing and expand the range of printable objects.

(Via Boing Boing.)

Follow me on Twitter @glynmoody

09 April 2009

OpenSecrets Moves To 'Open Data' Model

More welcome transparency moves in the US:

Campaign finance clearinghouse OpenSecrets.org, which is run by the nonpartisan Center for Responsive Politics, is going "open data" next week, according to an e-mail circulated by the center on Thursday.

...

CRP is expecting all sorts of data mash-ups, maps and other cool projects to result from the new capability. Transparency group the Sunlight Foundation helped fund OpenSecrets.org's OpenData initiative to make millions of records available under a Creative Commons "Attribution-Noncommercial-Share Alike" license. CRP will continue to offer its data to commercial users for a fee.

Follow me on Twitter @glynmoody

*Truly* Open Education

Here's some brilliant out-of-the-box thinking by Tony Hirst on how higher education should really be done:

imagine this: when you start your degree, you sign up to the 100 quid a year subscription plan (maybe with subscription waiver while you’re still an undergrad). When you leave, you get a monthly degree top-up. Nothing too onerous, just a current awareness news bundle made up from content related to the undergrad courses you took. This content could be produced as a side effect of keeping currently taught courses current: as a lecturer updates their notes from one year to the next, the new slide becomes the basis for the top-up news item. Or you just tag each course, and then pass on a news story or two discovered using that tag (Martin, you wanted a use for the Guardian API?!;-)

Having the subscription in place means you get 100 quid a year per alumni, without having to do too much at all…and as I suspect we all know, and maybe most of us bear testament to, once the direct debit is in place, there can be quite a lot of inertia involved in stopping it…

But there’s more - because you also have an agreement with the alumni to send them stuff once a month (and as a result maybe keep the alumni contacts database up to date a little more reliably?). Like the top-up content that is keeping their degree current (err….? yeah, right…)…

…and adverts… adverts for proper top-up/CPD courses, maybe, that they can pay to take…

…or maybe they can get these CPD courses for “free” with the 1000 quid a year, all you can learn from, top-up your degree content plan (access to subscription content and library services extra…)

Or how about premium “perpetual degree” plans, that get you a monthly degree top-up and the right to attend one workshop a year “for free” (with extra workshops available at cost, plus overheads;-)

Quick: put this man in charge of the music industry....

Follow me on Twitter @glynmoody

Proof That Some Lawyers *Do* Get It

Not just right, but very well put:

I often explain to clients and prospective clients that the main reward for a great, original product is a successful business based on that product. Intellectual property notwithstanding, the best way to protect most great ideas is by consistently excellent execution, high quality, responsive customer service, continued innovation and overall staying ahead of the competition by delivering more value. Absent the rare circumstance of an entire industry dedicated to counterfeits, à la Vuitton, if an enterprise can’t fathom protecting its value proposition without some kind of gaudy trademark protection, ultimately something has to give.

Fender, according to the record in this opinion, understood the truth well for decades. It warned consumers to stick with its quality “originals” and not to be fooled by “cheap imitations,” and it flourished. But for all these years, Fender never claimed to think the sincerest form of flattery was against the law. Only in the feverish IP-crazy atmosphere of our current century did the company deem it “necessary” to spend a fortune that could have been used on product development, marketing or any darned thing on a quixotic quest for a trademark it never believed in itself. That is more than an impossible dream — it’s a crying shame.

(Via Luis Villa's blog.)

Follow me on Twitter @glynmoody

French "Three Strikes" Law Unexpectedly Thrown Out

In an incredible turn of events, the French HADOPI legislation, which seemed certain to become law, has been thrown out:

French lawmakers have unexpectedly rejected a bill that would have cut off the Internet connections of people who illegally download music or films.

On Open Enterprise blog.

Follow me on Twitter @glynmoody

Should an Open Source Licence Ever Be Patent-Agnostic?

Sharing lies at the heart of free software, and drives much of its incredible efficiency as a development methodology. It means that coders do not have to re-invent the wheel, but can borrow from pre-existing programs. Software patents, despite their name, are about locking down knowledge so that it cannot be shared without permission (and usually payment). But are there ever circumstances when software patents that require payment might be permitted by an open source licence? That's the question posed by a new licence that is being submitted to the Open Source Initiative (OSI) for review.

On Linux Journal.

Follow me on Twitter @glynmoody

08 April 2009

Time to Get Rid of ICANN

ICANN has always been something of a disaster area: it shows scant understanding of what the Internet really is, is contemptuous of its users, and is largely indifferent to its responsibilities as guardian of a key part of the Internet's infrastructure. Here's the latest proof that ICANN is not fit for purpose:


The familiar .com, .net, .org and 18 other suffixes — officially "generic top-level domains" — could be joined by a seemingly endless stream of new ones next year under a landmark change approved last summer by the Internet Corp. for Assigned Names and Numbers, the entity that oversees the Web's address system.

Tourists might find information about the Liberty Bell, for example, at a site ending in .philly. A rapper might apply for a Web address ending in .hiphop.

"Whatever is open to the imagination can be applied for," says Paul Levins, ICANN's vice president of corporate affairs. "It could translate into one of the largest marketing and branding opportunities in history."

Got that? This change is purely about "marketing and branding opportunities". The fact that it will fragment the Internet, sow confusion among hundreds of millions of users everywhere, and lead to the biggest explosion yet of speculative domain squatting and hoarding by parasites who see the Internet purely as a system to be gamed is apparently a matter of supreme indifference to those behind ICANN: the main thing is that it's a juicy business opportunity.

Time to sack the lot, and put control of the domain name system where it belongs: in the hands of engineers who care.

Follow me on Twitter @glynmoody

Second Life + Moodle = Sloodle

Moodle is one of open source's greatest success stories. It's variously described as an e-learning or course management system. Now, given that education is also one of the most popular applications of Second Life, it would be a natural fit somehow to meld Moodle and Second Life. Enter Sloodle, whose latest version has just been released:

Version 0.4 integrates Second Life 3D classrooms with Moodle, the world’s most popular open source e-learning system with over 30 million users (http://www.moodle.org). This latest release allows teachers and students to prepare materials in an easy-to-use, web-based environment and then log into Second Life to put on lectures and student presentations using their avatars.

The new tools also let students send images from inside Second Life directly to their classroom blog. Students are finding this very useful during scavenger hunt exercises where teachers send them to find interesting content and bring it back to report to their classmates.

Tools that cross the web/3D divide are becoming more popular as institutions want to focus on the learning content rather than the technical overhead involved in orienting students into 3D settings and avatars.

As an open-source platform SLOODLE is both freely available and easily enhanced and adapted to suit the needs of diverse student populations. And web hosts are lining up to support the platform. A number of third-party web hosts now offer Moodle hosting with SLOODLE installed either on request or as standard, making it easier than ever to get started with SLOODLE.

SLOODLE is funded and supported by Eduserv - http://www.eduserv.ac.uk/ and is completely free for use under the GNU GPL license.

The project was founded by Jeremy Kemp of San José State University, California, and Dr. Daniel Livingstone of the University of the West of Scotland, UK.

Follow me on Twitter @glynmoody

Forward to the Past with Forrester

Looking at Forrester's latest report on open source, I came across the following:

The bottom line is that in most application development shops, the use of open source software has been a low-level tactic instead of a strategic decision made by informed executives. As a result, while there’s executive awareness of the capital expenditure advantages of adopting OSS, other benefits, potential risks, and the structural changes required to take full advantage of OSS are still poorly understood.

On Open Enterprise blog.

Follow me on Twitter @glynmoody

Open Access as "Middleware"

Interesting simile:

A "legacy system" in the world of computing provides a useful analogy for understanding the precarious state of contemporary academic publishing. This comparison might also keep us from stepping backward in the very act of stepping forward in promoting Open Access publishing and Institutional Repositories. I will argue that, vital as it is, the Open Access movement should really be seen in its current manifestation as academic "middleware" servicing the "legacy system" of old-school scholarship.

(Via Open Access News.)

Follow me on Twitter @glynmoody

07 April 2009

OpenStreetMap Navigates to Wikipedia

One of the powerful features of open source is re-use: you don't have to re-invent the wheel, but can build on the work of others. That's straightforward enough for software, but it can also be applied to other fields of openness. Here's a fantastic example: embedding OpenStreetMap in Wikipedia entries:


For some time, there have been efforts to bring OpenStreetMap (OSM) and Wikipedia closer together. Both projects have the mission to produce free knowledge and information through a collaborative community process. Because of the similarities, there are many users active in both projects – however mutual integration is still lacking.

For this reason, Wikimedia Deutschland (WM-DE, the German Chapter of Wikimedia) are providing funds of 15.000 Euro (almost $20k) and starting a corresponding pilot project. A group of interested Wikipedians and OSM users have partnered up to reach two goals: The integration of OSM-maps in Wikipedia and the installation of a map toolserver. The map toolserver will serve to prototype new mapping-related projects and preparing them for deployment on the main Wikimedia cluster.

Here's how it will work:

Maps are an important part of the information in encyclopaedic articles - however, currently mostly static maps are used. With free interactive maps and a marking system, a new way of presenting information can be created.

For some time there have been MediaWiki Extensions available for embedding OpenStreetMap maps into MediaWiki. That's a great start, but it isn't enough. If these extensions were deployed on Wikipedia without any kind of proxy set-up, the OpenStreetMap tile servers would struggle to handle the traffic.

One of our aims is to build an infrastructure in the Wikimedia projects that allows us to keep the OSM data, or at least the tile images, available locally within the Wikimedia network. We still have to gain some experience here, but we are optimistic. On one side, we have a number of Wikipedians on the team who are versed in MediaWiki and in scaling software systems; on the other, we have OSM users who can set up the necessary geo database.

We learned much from the Wikimedia Toolservers – for example, that a platform for experimentation produces far more useful tools than anyone predicted. Interested developers have a good starting point for developing new tools with new possibilities.

We expect similar results from the map toolserver. As soon as it is online, anyone who is interested can apply for an account by presenting their development ideas and their level of experience. We want to allow as many users as possible to implement their ideas without having to worry about the basic setup. We hope that, in the spirit of creating and distributing free content, many new maps and visualisations will emerge.
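
The tile-caching point is worth unpacking a little. A slippy map is served as small square images addressed by zoom level and x/y index, so a cache inside the Wikimedia network only has to map each (zoom, x, y) triple to a stored image and fall back to the OSM servers on a miss. Here is a rough sketch of the standard tile addressing - illustrative only, and nothing here is taken from the project's actual plans:

```python
import math

def deg2tile(lat_deg, lon_deg, zoom):
    """Convert latitude/longitude to slippy-map tile indices."""
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom
    xtile = int((lon_deg + 180.0) / 360.0 * n)
    ytile = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return xtile, ytile

# Berlin (where the developer meet-up is happening), at zoom level 12:
x, y = deg2tile(52.52, 13.40, 12)
print("http://tile.openstreetmap.org/12/%d/%d.png" % (x, y))
# A caching proxy in front of Wikipedia would store that PNG locally
# and only hit the OpenStreetMap tile servers when a tile is missing.
```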

Now, it's happening:

There has been rapid progress on the subject of adding OpenStreetMap maps to Wikimedia projects (e.g. Wikipedia) during the MediaWiki Developer Meet-Up taking place right now in Berlin.

Maps linked to Wikipedia content *within* Wikipedia: I can't wait.

Follow me on Twitter @glynmoody

Transparency and Open Government

Not my words, but those of that nice Mr Obama:


My Administration is committed to creating an unprecedented level of openness in Government. We will work together to ensure the public trust and establish a system of transparency, public participation, and collaboration. Openness will strengthen our democracy and promote efficiency and effectiveness in Government.

Wow.

Specifically:

Government should be transparent. Transparency promotes accountability and provides information for citizens about what their Government is doing.

...

Government should be participatory. Public engagement enhances the Government's effectiveness and improves the quality of its decisions.

...

Government should be collaborative. Collaboration actively engages Americans in the work of their Government.

Read the whole thing - and weep for poor old, locked-up UK....

Follow me on Twitter @glynmoody

Google and Microsoft Agree: This is Serious

You know things are bad when a coalition includes Google and Microsoft agreeing on something...

On Open Enterprise blog.

Follow me on Twitter @glynmoody

RFCs: Request for Openness

There's a fascinating history of the RFCs in the New York Times, written by a person who was there at the beginning:

Our intent was only to encourage others to chime in, but I worried we might sound as though we were making official decisions or asserting authority. In my mind, I was inciting the wrath of some prestigious professor at some phantom East Coast establishment. I was actually losing sleep over the whole thing, and when I finally tackled my first memo, which dealt with basic communication between two computers, it was in the wee hours of the morning. I had to work in a bathroom so as not to disturb the friends I was staying with, who were all asleep.

Still fearful of sounding presumptuous, I labeled the note a “Request for Comments.” R.F.C. 1, written 40 years ago today, left many questions unanswered, and soon became obsolete. But the R.F.C.’s themselves took root and flourished. They became the formal method of publishing Internet protocol standards, and today there are more than 5,000, all readily available online.

For me, the most interesting comments are the following:

The early R.F.C.’s ranged from grand visions to mundane details, although the latter quickly became the most common. Less important than the content of those first documents was that they were available free of charge and anyone could write one. Instead of authority-based decision-making, we relied on a process we called “rough consensus and running code.” Everyone was welcome to propose ideas, and if enough people liked it and used it, the design became a standard.

After all, everyone understood there was a practical value in choosing to do the same task in the same way. For example, if we wanted to move a file from one machine to another, and if you were to design the process one way, and I was to design it another, then anyone who wanted to talk to both of us would have to employ two distinct ways of doing the same thing. So there was plenty of natural pressure to avoid such hassles. It probably helped that in those days we avoided patents and other restrictions; without any financial incentive to control the protocols, it was much easier to reach agreement.

This was the ultimate in openness in technical design and that culture of open processes was essential in enabling the Internet to grow and evolve as spectacularly as it has. In fact, we probably wouldn’t have the Web without it. When CERN physicists wanted to publish a lot of information in a way that people could easily get to it and add to it, they simply built and tested their ideas. Because of the groundwork we’d laid in the R.F.C.’s, they did not have to ask permission, or make any changes to the core operations of the Internet. Others soon copied them — hundreds of thousands of computer users, then hundreds of millions, creating and sharing content and technology. That’s the Web.

I think this is right: the RFCs are predicated on complete openness, where anyone can make suggestions and comments. The Web built on that basis, extending the possibility of openness to everyone on the Internet. In the face of attempts to kill net neutrality in Europe, it's something we should be fighting for.

Follow me on Twitter @glynmoody

06 April 2009

The Latest Act in the ACTA Farce

I think the Anti-Counterfeiting Trade Agreement (ACTA) will prove something of a watershed in the negotiation of treaties. We have already gone from a situation where governments around the world all but denied the thing existed, to the point where the same people are now scrambling to create some semblance of openness without actually revealing too much.

Here's the latest attempt, which comes from the US team:

A variety of groups have shown their interest in getting more information on the substance of the negotiations and have requested that the draft text be disclosed. However, it is accepted practice during trade negotiations among sovereign states to not share negotiating texts with the public at large, particularly at earlier stages of the negotiation. This allows delegations to exchange views in confidence facilitating the negotiation and compromise that are necessary in order to reach agreement on complex issues. At this point in time, ACTA delegations are still discussing various proposals for the different elements that may ultimately be included in the agreement. A comprehensive set of proposals for the text of the agreement does not yet exist.

This is rather amusing. On the one hand, the negotiators have to pretend that "a comprehensive set of proposals for the text of the agreement does not yet exist", so that we can't find out the details; on the other, they want to finish off negotiations as quickly as possible, so as to prevent too many leaks. Of course, they can't really have it both ways, which is leading to this rather grotesque dance of the seven veils, whereby bits and pieces are revealed in an attempt to keep us quiet in the meantime.

The latest summary does contain some interesting background details that I'd not come across before:

In 2006, Japan and the United States launched the idea of a new plurilateral treaty to help in the fight against counterfeiting and piracy, the so-called Anti-Counterfeiting Trade Agreement (ACTA). The aim of the initiative was to bring together those countries, both developed and developing, that are interested in fighting counterfeiting and piracy, and to negotiate an agreement that enhances international co-operation and contains effective international standards for enforcing intellectual property rights.

Preliminary talks about such an anti-counterfeiting trade agreement took place throughout 2006 and 2007 among an initial group of interested parties (Canada, the European Commission, Japan, Switzerland and the United States). Negotiations started in June 2008 with the participation of a broader group of participants (Australia, Canada, the European Union and its 27 member states, Japan, Mexico, Morocco, New Zealand, Republic of Korea, Singapore, Switzerland and the United States).

The rest, unfortunately, is the usual mixture of half-truths and outright fibs. But this constant trickle of such documents shows that they are taking notice of us, and that we must up the pressure for full disclosure of what exactly is being negotiated in our name.

Follow me on Twitter @glynmoody

A Different Kind of Wörterbuch

Linguee seems to offer an interesting twist on a boring area - bilingual dictionaries:

With Linguee, you can search for words and expressions in many millions of bilingual texts in English and German. Every expression is accompanied by useful additional information and suitable example sentences.

...

When you translate texts to a foreign language, you usually look for common phrases rather than translations of single words. With its intelligent search and the significantly larger amount of stored text content, Linguee is the right tool for this task. You find:

* In what context a translation is used
* How frequent a particular translation is
* Example sentences: How have other people translated an expression?

By searching not just for a single word, but for a word in its context, you can easily find a translation that fits the context. With its large number of entries, Linguee often retrieves translations of rare terms that you won't find anywhere else.

There are two other points of interest. The first is the source of the texts:

Our most important source is the bilingual web. Other valuable sources include EU documents and patent specifications.

And the fact that a "GPL version of the Linguee dictionary" is available.

Follow me on Twitter @glynmoody

Google's Perpetual Monopoly on Orphan Works

Here's an interesting analysis of the Google Book Search settlement. This, you will recall, resolved the suit that authors and publishers brought against Google for scanning books without permission - something Google maintained it was entitled to do, since it only wanted to index the books' contents, not display them in their entirety.

At first this looked like an expensive and unnecessary way out for Google: many hoped that it would fight in the courts to determine what was permitted under fair use. But as people have had time to digest its implications, the settlement is beginning to look like a very clever move:


Thanks to the magic of the class action mechanism, the settlement will confer on Google a kind of legal immunity that cannot be obtained at any price through a purely private negotiation. It confers on Google immunity not only against suits brought by the actual members of the organizations that sued Google, but also against suits brought by anyone who doesn’t explicitly opt out. That means that Google will be free to mine the vast body of orphan works without fear of liability.

Any competitor that wants to get the same legal immunity Google is getting will have to take the same steps Google did: start scanning books without the publishers’ and authors’ permission, get sued by authors and publishers as a class, and then negotiate a settlement. The problem is that they’ll have no guarantee that the authors and publishers will play along. The authors and publishers may like the cozy cartel they’ve created, and so they may have no particular interest in organizing themselves into a class for the benefit of the new entrant. Moreover, because Google has established the precedent that “search rights” are something that need to be paid for, it’s going to be that much harder for competitors to make the (correct, in my view) argument that indexing books is fair use.

It seems to me that, in effect, Google has secured for itself a perpetual monopoly over the commercial exploitation of orphan works. Google’s a relatively good company, so I’d rather they have this monopoly than the other likely candidates. But I certainly think it’s a reason to be concerned.

Cunning.

Follow me on Twitter @glynmoody

All Tatarstan Schools Moving to Free Software

Tatarstan is the place to be:

До конца текущего года все школы Татарстана планируется перевести на свободное программное обеспечение на базе операционной системы «Linux».

...

По словам замминистра, в каждой школе республики на уровне кружков планируется открыть курсы по обучению работе в «Linux» учащихся. Но до этого предстоит еще подготовить специалистов, которые будут руководить этими кружками.

Людмила Нугуманова заявила, что Татарстан полностью перейдет на программное обеспечение с открытым кодом на основе операционной системы «Linux». Ведь в 2010 году закончится подписка на лицензию базового пакета программного обеспечения для школ на платформе «Microsoft». «За продолжение подписки придется платить немалые деньги, либо остаться на нашем отечественном продукте «Linux», - отметила она.

Как сообщила начальник отдела развития информационных технологий в образовании Министерства образования и науки РТ Надежда Сулимова, в прошлом году новый софт установлен в 612 школах республики (всего в Татарстане функционируют почти 2,4 тысячи общеобразовательных учреждений).

[Translation: By the end of this year, all schools in Tatarstan are to be moved to free software based on the Linux operating system.

...

According to the deputy minister, every school in the republic plans to open club-level courses teaching pupils to work with Linux. But before that, the specialists who will run these clubs still have to be trained.

Lyudmila Nugumanova said that Tatarstan will switch entirely to open source software based on the Linux operating system. After all, the licence subscription for the basic Microsoft software package for schools runs out in 2010. "To continue the subscription we would have to pay a good deal of money, or else stay with our domestic Linux product," she noted.

According to Nadezhda Sulimova, head of the department for the development of information technology in education at the Ministry of Education and Science of the Republic of Tatarstan, the new software was installed in 612 of the republic's schools last year (in total, almost 2,400 general education institutions operate in Tatarstan).]

Follow me on Twitter @glynmoody

How Can We Save Thunderbird Now Email is Dying?

I like Thunderbird. I've been using it for years, albeit now more as a backup for my Gmail account than as my primary email client. But it's always been the Cinderella of the Mozilla family, rather neglected compared to its more glamorous sister Firefox. The creation of the Mozilla Messaging subsidiary of the Mozilla Foundation means that efforts are already underway to remedy that. But there's a deeper problem that Thunderbird needs to face, too....

On Open Enterprise blog.

Follow me on Twitter @glynmoody

05 April 2009

Top 10 Measurements for Open Government

One of the most exciting applications of openness in recent months has been to government. A year ago, open government was sporadic and pretty forlorn as a field; today it is flourishing, notably under the alternative name of "transparency". At the forefront of that drive is the Sunlight Foundation, which has just published a suggested top 10 measurements of just how open a government really is:


1. Open data: The federal government should make all data searchable, findable and accessible.

2. Disclose spending data: The government should disclose how it is spending taxpayer dollars, who is spending it and how it’s being spent.

3. Procurement data: How does the government decide where the money is spent, who gets it and how they spend it, and how can we measure success?

4. Open portal for public requests for information: There should be a central repository for all Freedom of Information Act requests that are public, so that people can see in real time when the requests come in and how fast the government responds to them.

5. Distributed data: The government should make sure it builds redundancy into its systems so that data is not held in just one location but in multiple places, in case of a disaster, terrorist attack or any other event in which the data is damaged. Redundancy would guarantee that government could rebuild the data for future use.

6. Open meetings: Government meetings should be open to the public so that citizens can tell who is trying to influence government. All schedules should be published as soon as they happen so that people can see who is meeting with whom and who is trying to influence whom.

7. Open government research: Currently, when government conducts research, it usually does not report the data it collects until the project is finished. Government should report its research data in beta form while it is being collected. This would be a measure of transparency and would change the relationship that people have to government research as it is being collected.

8. Collection transparency: Government should disclose how it is collecting information, for whom it is collecting the data, and why the data is relevant. The public should have the ability to judge whether or not it is valuable to them, and should be given the ability to comment on it.

9. Allowing the public to speak directly to the president: Recently, we saw the president participate in something called “Open for Questions,” where he gave the public access to ask questions. Allowing him to burst his bubble and be in touch with the American public directly is another measure of transparency.

10. Searchable, crawlable and accessible data: If the government were to make all data searchable, crawlable and accessible, we would go a long way towards realizing all the goals presented at the Gov 2.0 Camp.

Great stuff, exciting times. Now, if only the UK government could measure up to these....

Follow me on Twitter @glynmoody

Who Can Put the "Open" in Open Science?

One of the great pleasures of blogging is that your mediocre post tossed off in a couple of minutes can provoke a rather fine one that obviously took some time to craft. Here's a case in point.

The other day I wrote "Open Science Requires Open Source". This drew an interesting comment from Stevan Harnad, pretty much the Richard Stallman of open access, as well as some tweets from Cameron Neylon, one of the leading thinkers on and practitioners of open science. He also wrote a long and thoughtful reply to my post (including links to all our tweets, rigorous chap that he is). Most of it was devoted to pondering the extent to which scientists should be using open source:

It is easy to lose sight of the fact that for most researchers software is a means to an end. For the Open Researcher what is important is the ability to reproduce results, to criticize and to examine. Ideally this would include every step of the process, including the software. But for most issues you don’t need, or even want, to be replicating the work right down to the metal. You wouldn’t after all expect a researcher to be forced to run their software on an open source computer, with an open source chipset. You aren’t necessarily worried what operating system they are running. What you are worried about is whether it is possible read their data files and reproduce their analysis. If I take this just one step further, it doesn’t matter if the analysis is done in MatLab or Excel, as long as the files are readable in Open Office and the analysis is described in sufficient detail that it can be reproduced or re-implemented.

...

Open Data is crucial to Open Research. If we don’t have the data we have nothing to discuss. Open Process is crucial to Open Research. If we don’t understand how something has been produced, or we can’t reproduce it, then it is worthless. Open Source is not necessary, but, if it is done properly, it can come close to being sufficient to satisfy the other two requirements. However it can’t do that without Open Standards supporting it for documenting both file types and the software that uses them.

The point that came out of the conversation with Glyn Moody for me was that it may be more productive to focus on our ability to re-implement rather than to simply replicate. Re-implementability, while an awful word, is closer to what we mean by replication in the experimental world anyway. Open Source is probably the best way to do this in the long term, and in a perfect world the software and support would be there to make this possible, but until we get there, for many researchers, it is a better use of their time, and the taxpayer’s money that pays for that time, to do that line fitting in Excel. And the damage is minimal as long as source data and parameters for the fit are made public. If we push forward on all three fronts, Open Data, Open Process, and Open Source then I think we will get there eventually because it is a more effective way of doing research, but in the meantime, sometimes, in the bigger picture, I think a shortcut should be acceptable.

I think these are fair points. Science needs reproducibility in terms of the results, but that doesn't imply that the protocols must be copied exactly. As Neylon says, the key is "re-implementability" - the fact that you *can* reproduce the results with the given information. Using Excel instead of OpenOffice.org Calc is not a big problem provided enough details are given.
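
To make that concrete, here is the sort of minimal disclosure that makes a simple analysis re-implementable: the raw data points plus the fitted parameters. (The numbers are invented purely for illustration; the point is that anyone could redo the fit in Excel, R or anything else and check the slope and intercept.)

```python
import numpy as np

# Raw data as it might appear in a paper's supplementary material
# (invented values, for illustration only).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.1, 2.1, 3.9, 6.2, 7.8, 10.1])

# Ordinary least-squares straight-line fit.
slope, intercept = np.polyfit(x, y, 1)
print("slope = %.3f, intercept = %.3f" % (slope, intercept))

# Publishing x, y and these two numbers is enough for anyone to
# reproduce - or re-implement - the analysis with whatever tool they like.
```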

However, it's easy to think of circumstances where *new* code is being written to run on proprietary engines, in which case it is simply not possible to check the logic hidden in the black boxes. In these circumstances, it is critical that open source be used at all levels so that others can see what was done and how.

But another interesting point emerged from this anecdote from the same post:

Sometimes the problems are imposed from outside. I spent a good part of yesterday battling with an appalling, password protected, macroed-to-the-eyeballs Excel document that was the required format for me to fill in a form for an application. The file crashed Open Office and only barely functioned in Mac Excel at all. Yet it was required, in that format, before I could complete the application.

Now, this is a social issue: the fact that scientists are being forced by institutions to use proprietary software in order to apply for grants or whatever. Again, it might be unreasonable to expect young scientists to sacrifice their careers for the sake of principle (although Richard Stallman would disagree). But this is not a new situation. It's exactly the problem that open access faced in the early days, when scientists just starting out in their career were understandably reluctant to jeopardise it by publishing in new, untested journals with low impact factors.

The solution in that case was for established scientists to take the lead by moving their work across to open access journals, allowing the latter to gain in prestige until they reached the point where younger colleagues could take the plunge too.

So, I'd like to suggest something similar for the use of open source in science. When established scientists with some clout come across unreasonable requirements - like the need to use Excel - they should refuse. If enough of them put their foot down, the organisations that lazily adopt these practices will be forced to change. It might require a certain courage to begin with, but so did open access; and look where *that* is now...

Follow me on Twitter @glynmoody

03 April 2009

Why We Should Teach Maths with Open Source

Recently, I was writing about science and open source (of which more anon); here are some thoughts on why maths ought to be taught using free software:

I personally feel it is terrible to *train* students mainly to use closed source commercial mathematics software. This is analogous to teaching students some weird version of linear algebra or calculus where they have to pay a license fee each time they use the fundamental theorem of calculus or compute a determinant. Using closed software is also analogous to teaching those enthusiastic students who want to learn the proofs behind theorems that it is illegal to do so (just as it is literally illegal to learn *exactly* how Maple and Mathematica work!). From a purely practical perspective, getting access to commercial math software is very frustrating for many students. It should be clear that I am against teaching mathematics using closed source commercial software.
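
For what it's worth, the free tools are already up to the job: open source systems such as Sage or SymPy will happily compute that determinant, and anyone who wants to know *exactly* how they do it can read the source. A minimal sketch using SymPy (my example, not the author's):

```python
from sympy import Matrix

# A small matrix whose determinant a student might also work out by hand.
A = Matrix([[1, 2, 3],
            [4, 5, 6],
            [7, 8, 10]])

print(A.det())      # -3, computed in exact rational arithmetic
print(A.inv() * A)  # the identity matrix - and det(), inv() are open code
```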

Follow me on Twitter @glynmoody

User-Generated Content: Microsoft vs. Google

Back in November I was urging you to submit your views on a consultation document on the role of copyright in the knowledge economy, put out by the European Commission. The submissions have now been published online, and I'm deeply disappointed to see that not many of you took a blind bit of notice of my suggestion...

On Open Enterprise blog.

Follow me on Twitter @glynmoody

HADOPI Law Passed - by 12 Votes to 4

What a travesty of democracy:

Alors que le vote n'était pas prévu avant la semaine prochaine, les quelques députés présents à l'hémicycle à la fin de la discussion sur la loi Création et Internet ont été priés de passer immédiatement au vote, contrairement à l'usage. La loi a été adoptée, en attendant son passage en CMP puis au Conseil Constitutionnel.

On peine à en croire la démocratie dans laquelle on prétend vivre et écrire. Après 41 heures et 40 minutes d'une discussion passionnée sur le texte, il ne restait qu'une poignée de courageux députés autour de 22H45 jeudi soir lorsque l'Assemblée Nationale a décidé, sur instruction du secrétaire d'Etat Roger Karoutchi, de passer immédiatement au vote de la loi Création et Internet, qui n'était pas attendu avant la semaine prochaine. Un fait exceptionnel, qui permet de masquer le nombre important de députés UMP qui se seraient abstenus si le vote s'était fait, comme le veut la tradition, après les questions au gouvernment mardi soir. Ainsi l'a voulu Nicolas Sarkozy.

...

Quatre députés ont voté non (Martine Billard, Patrick Bloche et deux députés non identifiés), et une dizaine de mains se sont levées sur les bancs de la majorité pour voter oui. En tout, 16 députés étaient dans l'hémicycle au moment du vote.

[Translation: Although the vote was not scheduled until next week, the few deputies present in the chamber at the end of the debate on the Création et Internet law were asked to proceed immediately to a vote, contrary to usual practice. The law was adopted, pending its passage through the joint committee (CMP) and then the Constitutional Council.

It is hard to believe in the democracy in which we claim to live and write. After 41 hours and 40 minutes of impassioned debate on the text, only a handful of courageous deputies remained at around 22:45 on Thursday evening when the National Assembly decided, on the instructions of Secretary of State Roger Karoutchi, to proceed immediately to the vote on the Création et Internet law, which had not been expected before next week. An exceptional move, which conveniently masks the large number of UMP deputies who would have abstained if the vote had taken place, as tradition dictates, after questions to the government on Tuesday evening. That is how Nicolas Sarkozy wanted it.

...

Four deputies voted no (Martine Billard, Patrick Bloche and two unidentified deputies), and a dozen or so hands were raised on the majority benches to vote yes. In all, 16 deputies were in the chamber at the time of the vote.]

So one of the most important and contentious pieces of legislation in recent years has been passed by trickery. In this way, those pushing this law have shown their true colours and their contempt for the democratic process.

Follow me on Twitter @glynmoody

02 April 2009

"Piracy Law" Cuts *Traffic* not "Piracy"

This story is everywhere today:


Internet traffic in Sweden fell by 33% as the country's new anti-piracy law came into effect, reports suggest.

Sweden's new policy - the Local IPRED law - allows copyright holders to force internet service providers (ISP) to reveal details of users sharing files.

According to figures released by the government statistics agency - Statistics Sweden - 8% of the entire population use peer-to-peer sharing.

The implication in these stories is that this kind of law is "working", in the sense that it "obviously" cuts down copyright infringement, because it's cutting down traffic.

In your dreams.

All this means is that people aren't sharing so much stuff online. But now that you can pick up a 1 Terabyte external hard drive for less than a hundred quid - which can store about a quarter of a million songs - guess what people are going to turn to in order to swap files in the future?
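
The back-of-the-envelope sum, assuming an average MP3 of around 4 MB (my figure, not an official one):

```python
drive_bytes = 1 * 10**12          # 1 terabyte, decimal, as drives are sold
song_bytes = 4 * 10**6            # ~4 MB for a typical MP3 (assumption)

print(drive_bytes // song_bytes)  # 250000 - a quarter of a million songs
```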

Follow me on Twitter @glynmoody

Open Science Requires Open Source

As Peter Suber rightly points out, this paper offers a reversal of the usual argument, where open access is justified by analogy with open source:


Astronomical software is now a fact of daily life for all hands-on members of our community. Purpose-built software for data reduction and modeling tasks becomes ever more critical as we handle larger amounts of data and simulations. However, the writing of astronomical software is unglamorous, the rewards are not always clear, and there are structural disincentives to releasing software publicly and to embedding it in the scientific literature, which can lead to significant duplication of effort and an incomplete scientific record. We identify some of these structural disincentives and suggest a variety of approaches to address them, with the goals of raising the quality of astronomical software, improving the lot of scientist-authors, and providing benefits to the entire community, analogous to the benefits provided by open access to large survey and simulation datasets. Our aim is to open a conversation on how to move forward.

The central argument is important: that you can't do science with closed source software, because you can't examine its assumptions or logic (that "incomplete scientific record"). Open science demands open source.

Follow me on Twitter @glynmoody

Second Chance at Life

Two years ago, the virtual world Second Life was everywhere, as pundits and press alike rushed to proclaim it as the Next Big Digital Thing. Inevitably, the backlash began soon afterwards. The company behind it, Linden Lab, lost focus and fans; key staff left. Finally, last March, Second Life's CEO, creator and visionary, Philip Rosedale, announced that he was taking on the role of chairman of the board, and bringing in fresh leadership. But against an increasingly dismal background, who would want to step into his shoes?

From the Guardian.

Follow me on Twitter @glynmoody

31 March 2009

Trailing Clouds of Openness

As you may have heard, there's been a bit of a to-do over a new “Open Cloud Manifesto.” Here's the central idea:

The industry needs an objective, straightforward conversation about how this new computing paradigm will impact organizations, how it can be used with existing technologies, and the potential pitfalls of proprietary technologies that can lead to lock-in and limited choice....

On Open Enterprise blog.

Follow me on Twitter @glynmoody

Do Open Source Companies *Really* Support Free Software?

Asterisk, a PBX, telephony engine and telephony applications toolkit, is one of open source's best-kept secrets. As with many open source projects, there is a company that has been set up to provide support: Digium. Here's its latest press release....

On Open Enterprise blog.

Follow me on Twitter @glynmoody

30 March 2009

Bad News: Microsoft Gets its Way with TomTom

Well, the question as to how the great Microsoft vs. TomTom suit would finish has been answered:

Microsoft and TomTom announced on Monday that they have reached a settlement in their respective patent suits....

On Open Enterprise blog.

Follow me on Twitter @glynmoody

Open Source Social Documentation for Museums

Again, open source reaches ever-new bits:


The new MAA Documentation System combines open-source technologies with deep social computing principles to create a truly innovative approach to museum documentation. The new MAA Documentation System shifts the age-old documentation principles of standardized description and information accumulation to multi-vocal and multi-source accounts and distributed documentation.

For the past few years, the MAA has been developing an open-source Documentation System. With over 20 years' experience of developing its own Documentation Systems and Collections Management Systems, the MAA is just about to finish one of the most ambitious upgrades in its history. In fact, this system is the result of a complete re-think of its documentation practices. Though the new system takes account of documentation standards, such as SPECTRUM, and newer developments such as CollectionSpace, it differs from the traditional approaches in several key respects.

And if that isn't wonderful enough, this new project comes from Cambridge's Museum of Archaeology & Anthropology - known to its friends as Arch and Anth. Its old, Victorian(?) building was one of the most atmospheric places in Cambridge.

Follow me on Twitter @glynmoody

Save the European Internet – Write to Your MEPs (Again)

Last week I was urging you to write to a particular set of MEPs about proposed changes to the Telecoms Package, which is wending its slow way through the European Union's legislative system. Now it's time to write to *all* your MEPs, since a crucially important vote in a couple of committees is to take place tomorrow. You can read more about what's been happening and why that's a problem on the La Quadrature du Net site, which also offers a detailed analysis of the Telecoms Package and the proposed amendments.

Here's what I've just sent to all my MEPs using WriteToThem:

I am writing to ask you, as my representative, to contact your colleagues on the IMCO and ITRE committees about crucial votes on the Telecoms Package taking place on 31 March. At stake is nothing less than the future of the Internet in Europe. If the amendments supported by AT&T and others go through, net neutrality – the main driver of the Internet and of online innovation – will be nullified.

This would be deeply ironic, since it was in Europe that the most important online innovation of all – the Web – was invented. In fact, no less a person than Sir Tim Berners-Lee, its inventor, has warned (at http://dig.csail.mit.edu/breadcrumbs/node/144) that the loss of net neutrality – which is what some of the proposed amendments would lead to – would have made it impossible for him to have carried out his revolutionary work. If we wish Europe to remain in the forefront of digital innovation, it is vital that the net neutrality of the Internet be preserved.

This is a complex issue – I personally find it very difficult to navigate through the many conflicting options before the committees. Fortunately, others have already done the hard work, and boiled down the recommendations to the following.

For your colleagues on the IMCO committee, please urge them to:

Vote against the amendments authorizing “net discrimination” and guarantee it is not put in place, by:

rejecting amendments 136=137=138 pushed by AT&T (and the related recitals 116, 117=118)

voting for amendment 135 bringing protection against “net discrimination”

as a fallback, if the first ones are all rejected, voting for amendments 139+141

Vote for positive protection of EU citizens' fundamental rights in amendments 72=146

Vote for protecting EU citizens' privacy by rejecting amendment 85 and voting for amendment 150.

Similarly, for those on the ITRE committee, please ask them to:

Protect EU citizens' fundamental rights and freedoms by voting for amendment 46=135 (first reading amendment 138).

Reject the notion of “lawful content” in amendment 45, for it is a major breach of the technical neutrality of the network, would turn operators into private judges, and would open the door to “graduated response” (or “three strikes”) schemes of corporate policing.

If you or your colleagues are interested in seeing the detailed analysis of all the amendments, it can be found here:

http://www.laquadrature.net/wiki/Telecoms_Package_2nd_Reading_ITRE_IMCO_Voting_List.

This is a critical series of votes for the Internet in Europe. At a time of great economic turmoil, the last thing we can afford is to throttle Europe's entrepreneurial spirit; for this reason, I hope that you will be able to convince your colleagues on the committees to vote as suggested above.

Sadly, this is really important and really urgent. Please add your voice if you can, or the Internet as we know it may soon cease to exist in Europe, to be replaced with something closer to a cable TV service. You have been warned.

Follow me on Twitter @glynmoody

29 March 2009

Building on Richard Stallman's Greatest Achievement

What was Richard Stallman's greatest achievement? Some might say it's Emacs, one of the most powerful and adaptable pieces of software ever written. Others might plump for gcc, an indispensable tool used by probably millions of hackers to write yet more free software. And then there is the entire GNU project, astonishing in its ambition to create a Unix-like operating system from scratch. But for me, his single most important hack was the creation of the GNU General Public Licence....

On Linux Journal.

Follow me on Twitter @glynmoody

28 March 2009

Phished by Visa

This is utterly scandalous:

Not content with destroying the world’s economies, the banking industry is also bent on ruining us individually, it seems. Take a look at Verified By Visa. Allegedly this protects cardholders - by training them to expect a process in which there’s absolutely no way to know whether you are being phished or not. Even more astonishing is that this is seen as a benefit!

...

Craziness. But it gets better - obviously not everyone is pre-enrolled in this stupid scheme, so they also allow for enrolment using the same inline scheme. Now the phishers have the opportunity to also get information that will allow them to identify themselves to the bank as you. Yes, Visa have provided a very nicely tailored and packaged identity theft scheme. But, best of all, rather like Chip and PIN, they push all blame for their failures on to the customer.

I've instinctively hated these "Verified by Visa" schemes ever since they came out, and have tried to avoid using them. The fact that they are not just inherently insecure, but actively encourage merchants to deploy them in the most insecure way possible, is astonishing even for an industry as rank and rotten as banking.

The one consolation has to be that Verified by Visa is so demonstrably insecure that it should be easy to challenge in court any attempts to make customers pay for the banks' own stupidity.
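To see why the inline approach is so hopeless, here is a minimal, purely illustrative Python sketch (all the names are my own invention, not Visa's): the only origin the user can inspect is the merchant page shown in the address bar, while the embedded frame that actually collects the password can point anywhere at all.

from dataclasses import dataclass

@dataclass
class EmbeddedFrame:
    origin: str              # where the password form is really submitted
    asks_for_password: bool

@dataclass
class CheckoutPage:
    address_bar_origin: str  # the only thing the user can actually check
    frame: EmbeddedFrame

def visible_to_user(page: CheckoutPage) -> str:
    # Browser chrome shows only the top-level origin; the frame's real
    # origin never appears anywhere the user can verify.
    return page.address_bar_origin

genuine = CheckoutPage("https://shop.example",
                       EmbeddedFrame("https://acs.bank.example", True))
phished = CheckoutPage("https://shop.example",
                       EmbeddedFrame("https://evil.example", True))

# From the user's point of view the two cases are indistinguishable.
assert visible_to_user(genuine) == visible_to_user(phished)

In other words, the scheme asks people to type a banking password into a box whose provenance they have no way of checking – exactly the behaviour that every piece of anti-phishing advice tells them never to indulge in.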

Follow me on Twitter @glynmoody

27 March 2009

Why Everyone Hates the PRS

Another classic post from Mike Masnick about the absurdities our current copyright regime visits upon us:

PRS has now threatened a woman who plays classical music to her horses in her stable to keep them calm. She had been turning on the local classical music station, saying that it helped keep the horse calm -- but PRS is demanding £99 if she wants to keep providing such a "public performance." And it's not just a one-off. Apparently a bunch of stables have been receiving such calls.

That's pathetic enough, but it's Masnick's parting shot that really struck me:

The group seems to believe that playing music in almost any situation now constitutes a public performance and requires a licensing fee. You just know they're salivating over the opportunity to go after people playing music in their cars with the windows down.

Because you know what? I bet the PRS is really considering how to do this.

Follow me on Twitter @glynmoody

26 March 2009

"Three Strikes" Struck Down for Third Time

As I wrote earlier today, things are looking bad for the Internet in Europe. But the European Parliament continues to do its bit protecting you and me. Here's the latest from the excellent Quadrature du Net site:

The European Parliament, endorsing the Lambrinidis report and turning its back on all the amendments supported by the French government and defended by Jacques Toubon and Jean-Marie Cavada, has just rejected "graduated response" for the third time. France is definitely alone in the world with its kafkaesque administrative machinery, an expensive mechanism for arbitrary punishment.

The report of Eurodeputy Stavros Lambrinidis concerning the protection of individual liberties on the Internet has just been confirmed by the European parliament by an overwhelming vote of 481 to 252.

It stands in clear opposition to the French HADOPI law in "holding that illiteracy with computers will be the illiteracy of the 21st century; holding that guaranteeing Internet access to all citizens is the same as guaranteeing all citizens access to education and holding that such access must not be refused in punishment by governments or private organizations; holding that this access should not be used abusively for illegal activities; holding that attention must be paid to emerging questions such as network neutrality, interoperability, the global accessibility of all Internet nodes, and the use of open formats and standards."

The approval of the Lambrinidis report and the rejection of the French amendments is the third consecutive time that the European Parliament has rejected the French "graduated response", since the approval of the Bono amendment to the report on cultural industries and the well-known Bono/Cohn-Bendit/Roithova Amendment 138.

Furthermore, all the amendments supported by the French government, notably those proposed by Eurodeputies Jacques Toubon and Jean-Marie Cavada, have been rejected. They were trying specifically to prevent measures related to graduated response, showing that the French government realizes that Europe is about to render the HADOPI law obsolete before it even comes to a vote.

Alas, this is by no means the end. The same wretched clause will come bounding back, along with all kinds of other stupidities. The fight goes on....

Follow me on Twitter @glynmoody

Patents Fail to Make Patent = Patent Failure

WIPO has just published a study entitled Dissemination of Patent Information. I've not read it, but here's someone who has, with an interesting observation:


In the first 71 paragraphs of the study, theoretical availability of patent information is confused with dissemination of patent information. Indeed, the study itself, belatedly, recognises the distinction between the theory of patent law and disclosure and the reality of accessing useful patent information in paragraph 72. Here the study states that availability of information does not always mean it is accessible in practical terms. Based on the figures provided in the study, in practical terms, accessibility of patent information is quite poor.

In other words, the one thing that patents *must* do - to disclose and make patent - they generally do badly. The net effect is that patents take away from the knowledge commons, without giving back even the paltry payment they owe. Add it to the (long) list of why patents fail. (Via Open Access News.)

Follow me on Twitter @glynmoody

Open Knowledge Conference (OKCon) 2009

If open knowledge is your thing, London is the place, Saturday the time:

The Open Knowledge Conference (OKCon) is back for its fourth installment, bringing together individuals and groups from across the open knowledge spectrum for a day of discussions and workshops.

This year the event will feature dedicated sessions on open knowledge and development and on the semantic web and open data. Plus there's the usual substantial allocation of 'Open Space' -- sessions, workshops and discussions proposed either via the CFP or on the day.
Follow me on Twitter @glynmoody

Save the European Internet – Write to Your MEPs

Things seem to be going from bad to worse with the EU's Telecoms Package. Not only do we have to contend with renewed French attempts to push through the “three strikes and you're out” approach, which the European Parliament has already thrown out, but several other amendments are being proposed that would effectively gut the Internet in Europe.

The Open Rights Group has a good summary of two of the main threats (also available from its Blackout Europe Facebook group):

One of the most controversial issues is that of the three-strikes approach, strongly and continuously pushed by France in the EU Council. Although most of the provisions introducing the graduated response system were rejected in the first reading of the Telecoms Package, some alarming ones persist. France is trying hard to get rid of Amendment 138, which seeks to protect users’ rights against the three-strikes sanctions and which, until now, has stopped the EU from applying the three-strikes policy. Also, some new amendments reintroduce the notion of lawful content, which would impose an obligation on ISPs to monitor content going through their networks.

The UK government is pushing for the “wikipedia amendments” (so-called because one of them has been created by cutting and pasting a text out of the wikipedia) in order to allow ISPs to make limited content offers. The UK amendments eliminate the text that gives users rights to access and distribute content, services and applications, replacing it with a text that says “there should be transparency of conditions under which services are provided, including information on the conditions to and/or use of applications and services, and of any traffic management policies.”

To these, we must now add at least one more, which the indispensable IPtegrity site has spotted:

Six MEPs have taken text supplied by the American telecoms multi-national, AT&T, and pasted it directly into amendments tabled to the Universal Services directive in the Telecoms Package. The six are Syed Kamall, Erika Mann, Edit Herczog, Zita Pleštinská, Andreas Schwab, and Jacques Toubon.

AT&T and its partner Verizon want the regulators in Europe to keep their hands off new network technologies which will provide the capability for broadband providers to restrict or limit users' access to the Internet. They have got together with a group of other telecoms companies to lobby on this issue. Their demands pose a threat to the neutrality of the network and, at another level, to millions of web businesses in Europe.

As you can read, this poses a grave danger to the Internet in Europe, because it would allow telecoms companies to impose restrictions on the services they provide. That is, they could discriminate at will against new services that threaten their existing offerings – and hence throttle online innovation. The Internet has grown so quickly, and become so useful, precisely because it is an end-to-end service: it does not inspect or discriminate between packets, it simply delivers them.
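To make the end-to-end point concrete, here is a minimal, purely illustrative Python sketch (the names and services are my own invention, not anything from a real router): a neutral network forwards every packet the same way, whereas a "managed" one inspects what kind of service a packet belongs to and can degrade anything that competes with the operator's own offerings.

from dataclasses import dataclass

@dataclass
class Packet:
    src: str
    dst: str
    service: str   # e.g. "voip", "video", "web"
    payload: bytes

def neutral_forward(packet: Packet) -> str:
    # End-to-end model: the network looks only at the destination,
    # never at what the packet is for.
    return f"deliver to {packet.dst}"

def discriminating_forward(packet: Packet, operator_services: set) -> str:
    # The kind of "traffic management" the amendments would permit:
    # traffic that competes with the operator's own services is degraded.
    if packet.service in operator_services:
        return f"throttle or block ({packet.service} competes with the operator)"
    return f"deliver to {packet.dst}"

p = Packet("alice", "startup-voip.example", "voip", b"hello")
print(neutral_forward(p))                           # deliver to startup-voip.example
print(discriminating_forward(p, {"voip", "video"})) # throttle or block (voip competes ...)

The first function is all the end-to-end Internet has ever needed; the second is the sort of behaviour the AT&T-drafted amendments would make far easier to justify.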

What is particularly surprising is that one of the MEPs putting forward this amendment is the UK's Syed Kamall, who has a technical background, and in the past has shown himself aware of the larger technological issues. I'm really not sure why he is involved in this blatant attempt by the telecoms companies to subvert the Internet in Europe.

Since he is one of my MEPs (he represents London), I've used the WriteToThem service to send him the following letter:

I was surprised and greatly disappointed to learn that you are proposing an amendment to the Telecoms Package that would have the consequence of destroying the network neutrality of the Internet – in many ways, its defining feature.

Your amendment 105, which requires network providers to inform users of restrictions and/or limitations on their communications services, will allow companies to impose arbitrary blocks on Internet services; instead, we need to ensure that no such arbitrary restrictions are possible.

As the inventor of the Web, Sir Tim Berners-Lee, has pointed out when net neutrality was being debated in the US (http://dig.csail.mit.edu/breadcrumbs/node/144):

“When I invented the Web, I didn't have to ask anyone's permission. Now, hundreds of millions of people are using it freely. I am worried that that is going to end in the USA.

I blogged on net neutrality before, and so did a lot of other people. ... Since then, some telecommunications companies spent a lot of money on public relations and TV ads, and the US House seems to have wavered from the path of preserving net neutrality. There has been some misinformation spread about. So here are some clarifications.

Net neutrality is this:

If I pay to connect to the Net with a certain quality of service, and you pay to connect with that or greater quality of service, then we can communicate at that level.

That's all. It's up to the ISPs to make sure they interoperate so that that happens.

Net Neutrality is NOT asking for the internet for free.

Net Neutrality is NOT saying that one shouldn't pay more money for high quality of service. We always have, and we always will

There have been suggestions that we don't need legislation because we haven't had it. These are nonsense, because in fact we have had net neutrality in the past -- it is only recently that real explicit threats have occurred.”

He concludes:

“Yes, regulation to keep the Internet open is regulation. And mostly, the Internet thrives on lack of regulation. But some basic values have to be preserved. For example, the market system depends on the rule that you can't photocopy money. Democracy depends on freedom of speech. Freedom of connection, with any application, to any party, is the fundamental social basis of the Internet, and, now, the society based on it.”

I'm afraid that what your amendment will do is to destroy that freedom. I am therefore asking you to withdraw your amendment, to preserve the freedom of the connection that allows new services to evolve, and innovations to be made without needing to ask permission of the companies providing the connection. Instead, the Internet needs net neutrality to be enshrined in law, and if possible, I would further request you and your colleagues to work towards this end.

If you are also based in London – or in a constituency represented by one of the five other MEPs mentioned in the IPtegrity story – I urge you to write a similar (but *not* identical) letter to them. It is vitally important that these amendments be withdrawn, since most MEPs will be unaware of the damage they can do, and might well wave them through. Further letters to all MEPs will also be needed in due course, but I think it's best to concentrate on these particular amendments for the moment, since they are a new and disturbing development.

Follow me on Twitter @glynmoody