
11 December 2009

Uncommon Meditations on the Commons

It's significant that books about the commons are starting to appear more frequently now. Here's one that came out six months ago:


Who Owns the World? The Rediscovery of the Commons has now been published by oekom Verlag in Berlin. (The German title is Wem gehört die Welt – Zur Wiederentdeckung der Gemeingüter.) The book is an anthology of essays by a wide range of international authors, including Elinor Ostrom, Richard Stallman, Sunita Narain, Ulrich Steinvorth, Peter Barnes, Oliver Moldenhauer, Pat Mooney and David Bollier.

Unfortunately, its text no longer seems to be available in English (please correct me if I'm wrong), although there is a version in Spanish [.pdf]. For those of you a little rusty in that tongue, there's a handy review and summary of the book that actually turns into a meditation on some unusual aspects of the commons in its own right. The original of that review, in French, is also available.

Here's the conclusion:

Those who love the commons and reciprocity rightly highlight the risks entailed by their necessary relationships with politics and the State, with money and the market. This caution should not lead them to isolate the commons from the rest of the world, however, or from the reign of the State and market. State and market are not cadavers which can be nailed into a coffin and thrown into the sea. For a very, very long time, they will continue to contaminate or threaten the reciprocal relationships that lie at the heart of the commons, with their cold logic. We can only try to reduce their importance. We must hope that reciprocal relationships will grow in importance with respect to relationships of exchange and of authority.

Worth reading.

Follow me @glynmoody on Twitter or identi.ca.

19 November 2008

And the Firefoxiest Country is...Indonesia

While some people (like me) have been fixated on the jolly good work being done in Europe in terms of boosting Firefox's market share, it seems that they (I) have overlooked an even bigger success:


One aspect of our global expansion is in our user base. By the end of 2007, nearly fifty percent of Firefox users chose a language other than English. In a fast forward, the first country in which Firefox usage appears to have crossed the 50% mark is Indonesia, surpassing 50% in July 2008. A set of European countries (Slovenia, Poland, and Finland) see Firefox usage above 40%.

And let's not forget that Indonesia is (a) big and (b) getting bigger fast. Indonesian will arguably be one of the major world languages of the future (along with Mandarin, English, Hindustani, Spanish, Arabic, Russian and French).

09 January 2007

Lost in Translation

I wrote recently about the goings-on at the Council of the European Union, and their strange reason for not supporting GNU/Linux users. But now, it seems, everything has been explained:

The European Union has blamed a translation mistake for its claim that it cannot legally support Linux.

Oh, that's OK, then. But, er, what exactly happened?

A spokesman for the Council of the EU, the Union's representative body, told ZDNet UK: "It was originally written in French, and the French version has no such statement. So it is a mistake."

Hm: the statement didn't exist, and then a "translation error" made it come into existence? How odd. But wait, there's more:

The spokesman explained that the service was only fully launched in September, and there was a need to get the service up and running, even if that meant not supporting all operating systems. He also said there was a cost, and complexity, of supporting additional operating systems such as Linux. And he added: "If we change, it is not only for Linux, we would have to open up to all open sources."

Now, hang on a minute: supporting GNU/Linux just means making RealAudio feeds available, since these can be played on open source systems as well as on proprietary ones. That's one more format, not an infinitude of "open sources" - and it's something many Web sites already provide.

This is beginning to get fishier than the EU's fisheries policy....

07 July 2006

The Other Kind of Open Source Languages

I am constantly delighted by the wit and wisdom of TechDirt. The latest example: a nice little meditation on the virtues of "open source" languages like English, where anyone can make up their own words and there is no standards body à la Académie française telling people what is and isn't allowed.

It's true that French isn't exactly closed source (I believe you're allowed to write words down across the Channel), but it's a nice conceit.

28 February 2006

Wanted: a Rosetta for the MegaWikipedia

As I write, Wikipedia has 997,131 articles - close to the magic, if totally arbitrary, one million (if we had eleven fingers, we'd barely be halfway to the equally magic 1,771,561): the MegaWikipedia.
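For anyone checking the eleven-fingered aside, the arithmetic is simply that the "round" milestone in base eleven is

$$11^6 = 1{,}771{,}561, \qquad \frac{997{,}131}{1{,}771{,}561} \approx 0.56,$$

so today's count would be only just past the halfway mark on that reckoning.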

Except that it's not there, really. The English-language Wikipedia may be approaching that number, but there are just eight other languages with more than 100,000 articles (German, French, Italian, Japanese, Dutch, Polish, Portuguese and Swedish), and 28 with more than 10,000. Most have fewer than 10,000.

Viewed globally, then, Wikipedia is nowhere near a million articles on average, across the languages.

The disparity between the holdings in different languages is striking; it is also understandable, given the way Wikipedia - and the Internet - arose. But the question is not so much "Where are we?" as "Where do we go from here?" How do we bring most of the other Wikipedias - not just five or six obvious ones - up to the same level of coverage as the English one?

Because if we don't, Wikipedia will never be that grand, freely-available summation of knowledge that so many hope for: instead, it will be a grand, freely-available summation of knowledge for precisely those who already have access to much of it. And the ones who actually need that knowledge most - and who currently have no means of accessing it short of learning another language (for which they probably have neither the time nor the money) - will be excluded once more.

Clearly, this global Wikipedia cannot be achieved simply by hoping that there will be enough volunteers to write all the million articles in all the languages. In any case, this goes against everything that free software has taught us - that the trick is to build on the work of others, rather than re-invent everything each time (as proprietary software is forced to do). This means that most of the articles in non-English tongues should be based on those in English. Not because English is "better" as a language, or even because its articles are "better": but simply because they are there, and they provide the largest foundation on which to build.

A first step towards this is to use machine translations, and the new Wikipedia search engine Qwika shows the advantages and limitations of taking this approach. Qwika lets you search in several languages through the main Wikipedias and through machine-translations of the English version. In effect, it provides a pseudo-conversion of the English Wikipedia to other tongues.

But what is needed is something more thoroughgoing, something formal - a complete system for expediting the translation of all of the English-language articles into other languages. And not just a few: the system needs to be such that any translator can use it to create new content based on the English items. The company behind Ubuntu, Canonical, already has a system that does something similar for people who are translating open source software into other languages. It's called, appropriately enough, Rosetta.
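Purely as an illustration, here is a minimal sketch in Python of the kind of bookkeeping such a system would need. The names and structure are invented for this example - it bears no relation to how Canonical's Rosetta is actually built - but the essential idea is that every translation is anchored to a specific English article and revision:

```python
from dataclasses import dataclass
from enum import Enum, auto


class Status(Enum):
    """Lifecycle of a single article translation."""
    UNTRANSLATED = auto()
    MACHINE_DRAFT = auto()    # rough draft from a Qwika-style engine, say
    HUMAN_REVIEWED = auto()
    STALE = auto()            # the English source has changed since translation


@dataclass
class TranslationTask:
    """One English article queued for translation into one target language."""
    title: str                # English article title, the shared anchor
    target_lang: str          # e.g. "id", "pl", "sw"
    source_revision: int      # revision of the English article worked from
    draft: str = ""           # machine or human draft text
    status: Status = Status.UNTRANSLATED


class TranslationQueue:
    """In-memory registry of which articles still need translators' attention."""

    def __init__(self) -> None:
        self._tasks: dict[tuple[str, str], TranslationTask] = {}

    def enqueue(self, title: str, target_lang: str, source_revision: int) -> TranslationTask:
        task = TranslationTask(title, target_lang, source_revision)
        self._tasks[(title, target_lang)] = task
        return task

    def add_machine_draft(self, title: str, target_lang: str, draft: str) -> None:
        task = self._tasks[(title, target_lang)]
        task.draft, task.status = draft, Status.MACHINE_DRAFT

    def submit_review(self, title: str, target_lang: str, text: str) -> None:
        task = self._tasks[(title, target_lang)]
        task.draft, task.status = text, Status.HUMAN_REVIEWED

    def mark_source_updated(self, title: str, new_revision: int) -> None:
        """Flag every translation of an article whose English source has changed."""
        for (t, _), task in self._tasks.items():
            if t == title and task.source_revision < new_revision:
                task.status = Status.STALE

    def pending(self, target_lang: str) -> list[TranslationTask]:
        """What a translator for one language would see as their work list."""
        return [task for (_, lang), task in self._tasks.items()
                if lang == target_lang and task.status is not Status.HUMAN_REVIEWED]


if __name__ == "__main__":
    queue = TranslationQueue()
    queue.enqueue("Commons", "id", source_revision=41)
    queue.add_machine_draft("Commons", "id", "...machine draft goes here...")
    queue.mark_source_updated("Commons", new_revision=42)
    for task in queue.pending("id"):
        print(task.title, task.target_lang, task.status.name)
```

The point of the source_revision field is that the system can then tell translators not only what has never been translated, but also which existing translations have gone stale because the English article has moved on - which is exactly the sort of building-on-others' work that free software has taught us to expect.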

Now that the MegaWikipedia is in sight - for Anglophones, at least - it would be the perfect time to move beyond the successful but rather ad hoc approach currently taken to creating multilingual Wikipedia content, and to put the Net's great minds to work on the creation of something better - something scalable: a Rosetta for the MegaWikipedia.

What better way to celebrate what is, for all the qualifications above, truly a milestone in the history of open content, than by extending it massively to all the peoples of the world, and not just to those who understand English?