02 January 2010

This Reminds Me of Something...

Interesting piece about the problems of remembering as we grow older:

if you are primed with sounds that are close to those you’re trying to remember — say someone talks about cherry pits as you try to recall Brad Pitt’s name — suddenly the lost name will pop into mind. The similarity in sounds can jump-start a limp brain connection. (It also sometimes works to silently run through the alphabet until landing on the first letter of the wayward word.)

This is exactly the method that I have developed in my old age: when I can't remember a name or word, I start saying apparently random sounds to myself, gradually focussing on those that *feel* close to the one I'm looking for. It sometimes takes a while, but I can generally find the word, and it usually has some connection with the ones that I pronounce on my journey towards it.

I found that this resonated with my experience too:

continued brain development and a richer form of learning may require that you “bump up against people and ideas” that are different. In a history class, that might mean reading multiple viewpoints, and then prying open brain networks by reflecting on how what was learned has changed your view of the world.

I find working in the field of computing useful here, since there are always new things to try. As the article says, it seems particularly helpful to try out things you are *not* particularly sympathetic to. It's the reason that I started twittering on 1 January last year: to force myself to do something new and something challenging. Well, that seemed to work out. Question is, what should I be doing this year?

Follow me @glynmoody on Twitter or identi.ca.

31 December 2009

What Lies at the Heart of "Avatar"?

If nothing else, "Avatar" is a computational tour-de-force. Here are some details of the kit they used:

It takes a lot of data center horsepower to create the stunning visual effects behind blockbuster movies such as King Kong, X-Men, the Lord of the Rings trilogy and most recently, James Cameron’s $230 million Avatar. Tucked away in Wellington, New Zealand are the facilities where visual effects company Weta Digital renders the imaginary landscapes of Middle Earth and Pandora at a campus of studios, production facilities, soundstages and a purpose-built data center.

...

The Weta data center got a major hardware refresh and redesign in 2008 and now uses more than 4,000 HP BL2x220c blades (new BL2x220c G6 blades announced last month), 10 Gigabit Ethernet networking gear from Foundry and storage from BluArc and NetApp. The system now occupies spots 193 through 197 in the Top 500 list of the most powerful supercomputers.

Here's info about Weta from the Top500 site:

Site WETA Digital
System Family HP Cluster Platform 3000BL
System Model Cluster Platform 3000 BL 2x220
Computer Cluster Platform 3000 BL2x220, L54xx 2.5 Ghz, GigE
Vendor Hewlett-Packard
Application area Media
Installation Year 2009

Operating System Linux

Oh, look: Linux. Why am I not surprised...?

Follow me @glynmoody on Twitter or identi.ca.

30 December 2009

The Wisdom of the Conservatives

I don't have much time for either of the main UK political parties (or many of the others, come to that), but I must give some kudos to the Tories for latching onto an ironic weakness of Labour: its authoritarian hatred of openness. And here the former are at it again, showing the UK government how it should be done:


The Conservatives are today announcing a competition, with a £1million prize, for the best new technology platform that helps people come together to solve the problems that matter to them – whether that’s tackling government waste, designing a local planning strategy, finding the best school or avoiding roadworks.

This online platform will then be used by a future Conservative government to throw open the policy making process to the public, and harness the wisdom of the crowd so that the public can collaborate to improve government policy. For example, a Conservative government would publish all government Green Papers on this platform, so that everyone can have their say on government policies, and feed in their ideas to make them better.

This is in addition to our existing radical commitment to introduce a Public Reading Stage for legislation so that the public can comment on draft bills, and highlight drafting errors or potential improvements.

That said, the following is a bit cheeky:

Harnessing the wisdom of the crowd in this way is a fundamentally Conservative approach, based on the insight that using dispersed information, such as that contained within a market, often leads to better outcomes than centralised and closed systems.

Tories as bastions of the bottom-up approach? Stalin would have been proud of that bit of historical revisionism.

The only remaining question (other than whether the Conservatives will win the forthcoming UK General Election) is whether the software thus produced will be released under an open source licence. I presume so, since this would also be "a fundamentally Conservative approach"....

Follow me @glynmoody on Twitter or identi.ca.

What Took Wired So Loongson?

I've been writing about the Loongson chip for three years now. As I've noted several times, this chip is important because (a) it's a home-grown Chinese chip (albeit based on one from MIPS) and (b) Windows doesn't run on it, but GNU/Linux does.

It looks like Wired magazine has finally woken up to the story (better late than never):


Because the Loongson eschews the standard x86 chip architecture, it can’t run the full version of Microsoft Windows without software emulation. To encourage adoption of the processor, the Institute of Computing Technology is adapting everything from Java to OpenOffice for the Loongson chip and releasing it all under a free software license. Lemote positions its netbook as the only computer in the world with nothing but free software, right down to the BIOS burned into the motherboard chip that tells it how to boot up. It’s for this last reason that Richard “GNU/Linux” Stallman, granddaddy of the free software movement, uses a laptop with a Loongson chip.

Because GNU/Linux distros have already been ported to the Loongson chip, neither Java nor OpenOffice.org needs "adapting" so much as recompiling - hardly a challenging task. As for "releasing it all under a free software license", they had no choice.

But at least Wired got it right about the potential impact of the chip:

Loongson could also reshape the global PC business. “Compared to Intel and IBM, we are still in the cradle,” concedes Weiwu Hu, chief architect of the Loongson. But he also notes that China’s enormous domestic demand isn’t the only potential market for his CPU. “I think many other poor countries, such as those in Africa, need low-cost solutions,” he says. Cheap Chinese processors could corner emerging markets in the developing world (and be a perk for the nation’s allies and trade partners).

And that’s just the beginning. “These chips have implications for space exploration, intelligence gathering, industrialization, encryption, and international commerce,” says Tom Halfhill, a senior analyst for Microprocessor Report.

Yup.

Follow me @glynmoody on Twitter or identi.ca.

29 December 2009

The Lost Decades of the UK Web

This is a national disgrace:

New legal powers to allow the British Library to archive millions of websites are to be fast-tracked by ministers after the Guardian exposed long delays in introducing the measures.

The culture minister, Margaret Hodge, is pressing for the faster introduction of powers to allow six major libraries to copy every free website based in the UK as part of their efforts to record Britain's cultural, scientific and political history.

The Guardian reported in October that senior executives at the British Library and National Library of Scotland (NLS) were dismayed at the government's failure to implement the powers in the six years since they were established by an act of parliament in 2003.

The libraries warned that they had now lost millions of pages recording events such as the MPs' expenses scandal, the release of the Lockerbie bomber and the Iraq war, and would lose millions more, because they were not legally empowered to "harvest" these sites.

So, 20 years after Sir Tim Berners-Lee invented the technology, and well over a decade after the Web became a mass medium, the British Library *still* isn't archiving every UK website?

History - assuming we have one - will judge us harshly for this extraordinary UK failure to preserve the key decades of the quintessential technology of our age. It's like burning down a local digital version of the Library of Alexandria, all over again.

Follow me @glynmoody on Twitter or identi.ca.

Copyright Infringement: A Modest Proposal

The UK government's Canute-like efforts to stem the tide of online copyright infringement have plumbed new depths, it seems:


Proposals to suspend the internet connections of those who repeatedly share music and films online will leave consumers with a bill for £500 million, ministers have admitted.

The Digital Economy Bill would force internet service providers (ISPs) to send warning letters to anyone caught swapping copyright material illegally, and to suspend or slow the connections of those who refused to stop. ISPs say that such interference with their customers’ connections would add £25 a year to a broadband subscription.

As Mike Masnick points out:

Note, of course, that the music industry itself claims that £200 million worth of music is downloaded in the UK per year (and, of course, that's only "losses" if you use the ridiculous and obviously incorrect calculation that each download is a "lost sale").

So this absurd approach will actually cost far more than it will save, even accepting the grossly-inflated and self-serving figures from the music industry.

Against that background, I have a suggestion.

Given that the UK government seems happy for huge sums of money to be spent on this fool's errand, why not spend it more effectively, in a way that sustains businesses, rather than penalising them, and which actually encourages people not to download copyrighted material from unauthorised sources?

This can be done quite simply: by giving everyone who wants it a free Spotify Premium subscription. These normally cost £120 per year, but buying a national licence for the 10 million families or so who are online would presumably garner a generous discount - say, of 50% - bringing the total price of the scheme to around £600 million, pretty much the expected cost of the current plans.
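
As a back-of-envelope check, here is the arithmetic behind that figure (a minimal sketch in Python; the 10 million online households, £120 list price and 50% discount are simply the assumptions above, not official numbers):

```python
# Rough arithmetic for the national Spotify Premium licence idea.
# All inputs are the assumptions from the paragraph above.
households = 10_000_000   # UK families online (rough estimate)
list_price = 120          # Spotify Premium, GBP per year
discount = 0.50           # assumed bulk discount on a national licence

annual_cost = households * list_price * (1 - discount)
print(f"Estimated cost of the scheme: £{annual_cost:,.0f}")  # £600,000,000
```

That puts the scheme in the same ballpark as the £500 million bill quoted above for the Digital Economy Bill's proposals.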

As I can attest, once you get the Spotify Premium habit, you really don't want to bother with downloading files and managing them: having everything there, in the cloud, nicely organised, is just *so* convenient (well, provided you don't lose your connection). I'm sure that my scheme would lead to falls in the levels of file sharing that the government is looking for; and anyway, it could hardly be worse than the proposals in the Digital Economy bill.

Update: On Twitter, Barbara Cookson suggested a clever tweak to this idea: "absolution for ISPs who include #spotify as part of package". Nice.

Follow me @glynmoody on Twitter or identi.ca.

28 December 2009

Making Money by Giving Stuff Away

Open source software is obviously extremely interesting to companies from a utilitarian viewpoint: it means they can reduce costs and – more significantly – decrease their dependence on single suppliers. But there's another reason why businesses should be following the evolution of this field: it offers important lessons about how the economics of a certain class of products is changing.

On Open Enterprise blog.

24 December 2009

ACTA as the (Fool's) "Gold Standard"

I've noted before that at the heart of the ACTA negotiations there is a con-trick being played upon the world: insofar as the mighty ones deign to pass down any crumbs of information to us little people, it is framed in terms of the dangers of counterfeit medicines and the like, and how we are being "protected". But then, strangely, those counterfeit medicines morph into digital copies of songs - where there is obviously no danger whatsoever - and yet the same extreme measures are called for.

Unfortunately, the European Union has now joined in parroting this lie, and is pushing even harder for ACTA to be implemented:


The European Union appears to be preparing for adoption of the “gold standard” of enforcement, the Anti-Counterfeiting Trade Agreement (ACTA), as intellectual property law expert Annette Kur from the Max Planck Institute of Intellectual Property, Competition and Tax Law said it is now called.

At a conference of the Swedish EU Presidency on “Enforcement of Intellectual Property with a Special Focus on Trademarks and Patents” on 15-16 December in Stockholm, representatives from EU bodies, member states and industry supported a quick enforcement of ACTA, according to participants. A representative of the Justice, Freedom and Security Directorate General of the European Commission presented a plan for a quick restart of a legislative process in the EU to harmonise criminal law sanctions in the Community.

Worryingly:

Only two members of Parliament attended the conference in Stockholm, which despite its high-level panels was not much publicised by the Swedish presidency. Not even an agenda had been published beforehand

That is, the inner circle of the EU, represented by the EU Presidency, was clearly trying to minimise scrutiny by the European Parliament, which has historically taken a more balanced view of intellectual monopolies and their enforcement. That matters, because:

Under the Lisbon Treaty, the European Parliament would be kept informed of the negotiation process in a manner similar to the Council, a Commission expert said. Furthermore, the ACTA text would be approved both by the Parliament and the Council.

In other words, the European Parliament now has powers that allow it to block things like ACTA, should it so desire. That's obviously a problem for those European politicians used to getting their way without such tiresome democratic obstacles.

Despite this shameful attempt to keep everything behind closed doors, the presentations show that even among those with access to the inner circle there are doubts about ACTA's "gold standard". Here's what the academic Annette Kur said in her presentation [.pdf]:

Using the public concern about serious crimes like fabrication of fake and noxious medicaments as an argument pushing for stronger legislation on IP infringement in general is inappropriate and dangerous

It is dangerous because it obscures the fact that to combat risks for public health is not primarily an IP issue

It is inappropriate because it will typically tend to encourage imbalanced legislation

Similarly Kostas Rossoglou from BEUC, the European Consumers’ Organisation, was deeply worried by the following aspects [.pdf]:

Counterfeiting used as a general term to describe all IPR Infringements and beyond!!!

Broad scope of IPRED Directive – all IPR infringements are presumed to be equally serious!!!

No distinction between commercial piracy and unauthorised use of copyright-protected content by individuals

No clear definition of the notion of “commercial scale”

Things are moving fast on the ACTA front in Europe, with a clear attempt to steamroller this through without scrutiny. This makes it even more vital that we call out those European politicians who try to justify their actions by equating counterfeiting and copyright infringement, and that we continue to demand a more reasoned and balanced approach that takes into account end-users as well as the holders of intellectual monopolies.

Follow me @glynmoody on Twitter or identi.ca.

23 December 2009

Coming up with a Copyright Assignment Strategy

One of the deep ironies of the free software world, which is predicated on freedom, is that it effectively requires people to become experts in copyright, an intellectual monopoly that is concerned with restricting freedom. That's because the GNU GPL, and licences that have followed its lead, all use copyright to achieve their aims. At times, though, that clever legal hack can come back to bite you, and nowhere more painfully than in the field of copyright assignment.

On Open Enterprise blog.

Google Opens up – about Google's Openness

Google could not exist without open source software: licensing costs would be prohibitive if it had based its business on proprietary applications. Moreover, free software gives it the possibility to customise and optimise its code – crucially important in terms of becoming and staying top dog in the highly-competitive search market.

On Open Enterprise blog.

All Hail the Mighty Algorithm

As long-suffering readers of this blog will know, one of the reasons I regard software patents as dangerous is because software consists of algorithms, and algorithms are simply maths. So allowing software patents is essentially allowing patents on pure knowledge.

Against that background, this looks pretty significant:

Industries, particularly high tech, may be waiting for the U.S. Supreme Court decision, expected this coming spring, in the Bilski case to decide some fundamental questions of when you can patent business methods. But in the meantime, there’s a newly published decision from the Board of Patent Appeals and Interferences that establishes a new test to determine whether a machine or manufactured article that depends on a mathematical algorithm is patentable. The ruling is a big deal because it’s one of the few precedential decisions that the BPAI issues in a given year, and it will have a direct impact on patents involving computers and software.

For a claimed machine (or article of manufacture) involving a mathematical algorithm,

1. Is the claim limited to a tangible practical application, in which the mathematical algorithm is applied, that results in a real-world use (e.g., “not a mere field-of-use label having no significance”)?
2. Is the claim limited so as to not encompass substantially all practical applications of the mathematical algorithm either “in all fields” of use of the algorithm or even in “only one field?”

If the machine (or article of manufacture) claim fails either prong of the two-part inquiry, then the claim is not directed to patent eligible subject matter.
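
The structure of that inquiry is simple enough to write out as a decision procedure (a minimal sketch in Python; the function and parameter names are mine, not the Board's, and in real life the inputs are legal judgements rather than booleans):

```python
# Sketch of the two-prong BPAI inquiry quoted above, for machine/article
# claims involving a mathematical algorithm. Names are illustrative only.

def claim_is_patent_eligible(has_tangible_practical_application: bool,
                             preempts_substantially_all_uses: bool) -> bool:
    """A claim must pass BOTH prongs to be directed to eligible subject matter."""
    # Prong 1: the algorithm must be applied in a tangible, real-world use,
    # not merely given a field-of-use label.
    if not has_tangible_practical_application:
        return False
    # Prong 2: the claim must not cover substantially all practical
    # applications of the algorithm, whether in all fields or in just one.
    if preempts_substantially_all_uses:
        return False
    return True

# A claim that merely labels a field of use fails prong 1:
print(claim_is_patent_eligible(False, False))  # False
# A narrowly tied, non-preemptive application passes both prongs:
print(claim_is_patent_eligible(True, False))   # True
```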

Now, the devil is in the details, and what impact this has will depend upon its interpretation. But what I find significant is that algorithms are foregrounded: the more people concentrate on this aspect, the harder it will be to justify software patents.

Follow me @glynmoody on Twitter or identi.ca.

16 December 2009

Hypocrisy, Thy Name is MPAA

I do love it when copyright maximalist organisations like the MPAA put out statements, because they invariably put their foot in it too. This "Testimony of Dan Glickman Chairman and CEO Motion Picture Association of America" is no exception. Here's a plum [.pdf]:

While not a Free Trade Agreement, the US motion picture industry – producers, studios and guilds -- has a keen interest in the Anti-Counterfeiting Trade Agreement (ACTA), in particular the provisions to address Internet piracy. We firmly believe that for the ACTA to address the enforcement challenges our industry confronts today, it MUST include robust protections for intellectual property online. Practical secondary liability regimes for online infringement are essential to motivate stakeholders to cooperate in implementing the reasonable practices that promote legitimate consumer options and make the online marketplace less hospitable for infringers. ACTA parties should refine their secondary liability regimes to reflect current realities and adopt modern, flexible systems where they do not exist.

What the MPAA wants is for ISPs, for example, to change their businesses "to reflect current realities and adopt modern, flexible systems where they do not exist": how strange, then, that the MPAA is not prepared to do the same by working according to the new digital rules instead of clinging to the old analogue ones...

Follow me @glynmoody on Twitter or identi.ca.

EC Says OK to MS IE Deal: How Much of a Win?

Neelie Kroes, European Commissioner for Competition Policy, had some news this morning:

Today is an important day for internet users in Europe. Today, the Commission has resolved a serious competition concern in a key market for the development of the internet, namely the market for web browsers. Now - for the first time in over a decade - Internet users in Europe will have an effective and unbiased choice between Microsoft’s Internet Explorer and competing web browsers, such as Mozilla Firefox, Google Chrome, Apple Safari and Opera....

On Open Enterprise blog.

15 December 2009

SFLC Gets Busy Around BusyBox

Contrary to some public perceptions, the Free Software Foundation is not keen on litigating against those who fail to respect the terms of the GNU GPL. Here's what Eben Moglen, very much the legal brains behind the organisation, told me a decade ago....

On Open Enterprise blog.

Australia Edges Us Towards the Digital Dark Ages

Last week, on my opendotdotdot blog, I was praising the Australian government for its moves to open up its data. I was rapidly – and rightly – taken to task in the comments for failing to mention that government's efforts to impose direct, low-level censorship on the country's Internet feed.

Although I was aware of these moves, I wasn't quite up to date on their progress. It seems that things have moved far and fast...

On Open Enterprise blog.

14 December 2009

Canadians *Do* Have a Sense of Humour

Want a good laugh?


One hour ago, a spoof press release targeted Canada in order to generate hurtful rumors and mislead the Conference of Parties on Canada's positions on climate change, and to damage Canada's standing with the international business community.

The release, from "press@enviro-canada.ca," alleges Canada's acceptance of unrealistic emissions-reduction targets, as well as a so-called "Climate Debt Mechanism," a bilateral agreement between Canada and Africa to furnish that continent with enormous sums in "reparation" for climate damage and to "offset" adaptation.

Of course, everyone should have known that Canada wouldn't do anything like accept massive emission reduction targets, or agree to reparations. No, this is what it *really* has in mind:

Today as always, Canada's binding responsibility is to supply the world - including its burgeoning developing portion - with those means of transport, health, and sustenance that prosperous markets require. Stopping short of these dictates would violate the very principles upon which our nations were founded, and endanger our very development.

As you will note, there's nothing here about that tiresome need to minimise climate change; it's all about "prosperous markets", yeah. Indeed:

Canada's current energy policy represents an elegant synthesis of the most advanced science, while remaining faithful to Canada's tradition of political pragmatism. Experts note, for example, that the much-decried oil sands of Alberta, contrary to environmentalists' dire assertions, are enabling Canada to meet ambitious emissions goals by providing her, as well as her neighbors, with the energy resources needed to transition to a cleaner energy future.

Cunning, no? Canada notes how using energy from one of the dirtiest sources, the "much-decried oil sands of Alberta", is in fact absolutely fine because it will allow a transition to a "cleaner energy future". Which means that we can justify *any* kind of energy source, no matter how dirty, provided it makes things better at some ill-specified time in the future.

If we have one, of course. (Via Tristan Nitot.)

Follow me @glynmoody on Twitter or identi.ca.

Monsoft or Microsanto?

I and others (notably Roy Schestowitz) have noted the interesting similarities between Microsoft and Monsanto at various levels; but a major new story from the Associated Press makes the parallels even more evident.

For example:

One contract gave an independent seed company deep discounts if the company ensured that Monsanto's products would make up 70 percent of its total corn seed inventory. In its 2004 lawsuit, Syngenta called the discounts part of Monsanto's "scorched earth campaign" to keep Syngenta's new traits out of the market.

This is identical to the approach adopted by Microsoft in offering discounts to PC manufacturers that only offered its products.

Monsanto has followed Microsoft in placing increasing emphasis on patents:

Monsanto was only a niche player in the seed business just 12 years ago. It rose to the top thanks to innovation by its scientists and aggressive use of patent law by its attorneys.

First came the science, when Monsanto in 1996 introduced the world's first commercial strain of genetically engineered soybeans. The Roundup Ready plants were resistant to the herbicide, allowing farmers to spray Roundup whenever they wanted rather than wait until the soybeans had grown enough to withstand the chemical.

The company soon released other genetically altered crops, such as corn plants that produced a natural pesticide to ward off bugs. While Monsanto had blockbuster products, it didn't yet have a big foothold in a seed industry made up of hundreds of companies that supplied farmers.

That's where the legal innovations came in, as Monsanto became among the first to widely patent its genes and gain the right to strictly control how they were used. That control let it spread its technology through licensing agreements, while shaping the marketplace around them.

Monsanto also blocks the use of "open source" genetically-modified organisms:

Back in the 1970s, public universities developed new traits for corn and soybean seeds that made them grow hardy and resist pests. Small seed companies got the traits cheaply and could blend them to breed superior crops without restriction. But the agreements give Monsanto control over mixing multiple biotech traits into crops.

The restrictions even apply to taxpayer-funded researchers.

Roger Boerma, a research professor at the University of Georgia, is developing specialized strains of soybeans that grow well in southeastern states, but his current research is tangled up in such restrictions from Monsanto and its competitors.

"It's made one level of our life incredibly challenging and difficult," Boerma said.

The rules also can restrict research. Boerma halted research on a line of new soybean plants that contain a trait from a Monsanto competitor when he learned that the trait was ineffective unless it could be mixed with Monsanto's Roundup Ready gene.

The result is yet another monoculture:

"We now believe that Monsanto has control over as much as 90 percent of (seed genetics). This level of control is almost unbelievable," said Neil Harl, agricultural economist at Iowa State University who has studied the seed industry for decades.

The key difference here, of course, is that this is no metaphor, but a *real* monoculture, with all the dangers that this implies.

Fortunately, things seem to be evolving for Monsanto just as they did for Microsoft, with a major anti-trust investigation in the offing:

Monsanto's business strategies and licensing agreements are being investigated by the U.S. Department of Justice and at least two state attorneys general, who are trying to determine if the practices violate U.S. antitrust laws.

Amazingly, David Boies, the lawyer who led the attack on Microsoft during that investigation, is also involved: he is representing Du Pont, one of Monsanto's rivals concerned about the latter's monopoly power.

Let's just hope that Monsanto becomes the subject of a full anti-trust action, and that the result is more effective than the one applied to Microsoft. After all, we're not talking about software here, but the world's food supply, and monopolies - both intellectual and otherwise - are simply morally indefensible when billions of lives are at stake.

Follow me @glynmoody on Twitter or identi.ca.

13 December 2009

Of Access to Copyright Materials and Blindness

In a way, I suppose we should be grateful that the content industries have decided to dig their heels in over the question of providing more access to copyright materials for the visually impaired. For it leads to revelatory posts like this, which offer an extraordinary glimpse into the twisted, crimped souls of those fighting tooth and nail against the needs of the blind and visually impaired:

the treaty now being proposed would not be compatible with US copyright laws and norms, and would undermine the goal of expanded access that we all share. This overreaching treaty would also harm the rights of authors and other artists, and the incentives necessary for them to create and commercialize their works. We strongly believe improving access for one community should not mean that another loses its rights in the process.

Let's just look at that.

First, in what sense is providing more access to the visually impaired not compatible with US copyright laws? The proponents of this change have gone out of their way to make sure that the access given is within current copyright regimes, which are not serving this huge, disadvantaged constituency properly. And how would it undermine expanded access? It would, manifestly, provide access that is not available now; the publishers have proposed nothing that would address the problem other than saying the system's fine, we don't want to change it.

But the most telling - and frankly, sickening - aspect of this post is the way its author sets up the rights of authors against the rights of those with visual disabilities, as if the latter are little better than those scurvy "pirates" that "steal" copyright material from those poor authors.

In fact, *nothing* is being taken, it's simply that these people wish to enjoy their rights to read as others do - something that has been denied to them by an industry indifferent to their plight. And which author would not be happy to extend the pleasure of reading their works to those cut off from it by virtue of physical disabilities?

If Mark Esper thinks that is an unreasonable, outrageous goal for the visually impaired, and that maximalist copyright trumps all other humanitarian considerations, he is a truly sad human being, and I pity him. He should try looking in the mirror sometime - and be glad that he can, unlike those whose rights he so despises. (Via Jamie Love.)

Follow me @glynmoody on Twitter or identi.ca.

11 December 2009

The Future Impact of Openness

The European Commission has released a report [.pdf] with the rather unpromising title "Trends in connectivity technologies and their socio-economic impacts". Despite this, and a rather stodgy academic style, there are a number of interesting points made.

One of the best chapters is called "Projecting the future: Scenarios for tech trend development and impact assessment", which sets out three possible future worlds: Borderless, Connecting and Scattered. What's interesting is that Connecting essentially describes a world where openness of all kinds is a major feature. The implications of these kinds of worlds are then examined in detail.

I wouldn't describe it as a gripping read, but there's a huge amount of detail that may be of interest to those pondering on what may be, especially the role of openness there.

Uncommon Meditations on the Commons

It's significant that books about the commons are starting to appear more frequently now. Here's one that came out six months ago:


Who Owns the World? The Rediscovery of the Commons, has now been published by oekom Verlag in Berlin. (The German title is Wem gehört die Welt – Zur Wiederentdeckung der Gemeingüter.) The book is an anthology of essays by a wide range of international authors, including Elinor Ostrom, Richard Stallman, Sunita Narain, Ulrich Steinvorth, Peter Barnes, Oliver Moldenhauer, Pat Mooney and David Bollier.

Unfortunately, its text no longer seems available in English (please correct me if I'm wrong), although there is a version in Spanish [.pdf]. For those of you a little rusty in that tongue, there's a handy review and summary of the book that actually turns into a meditation on some unusual aspects of the commons in its own right. The original, in French, is also available.

Here's the conclusion:

Those who love the commons and reciprocity rightly highlight the risks entailed by their necessary relationships with politics and the State, with money and the market. This caution should not lead them to isolate the commons from the rest of the world, however, or from the reign of the State and market. State and market are not cadavers which can be nailed into a coffin and thrown into the sea. For a very, very long time, they will continue to contaminate or threaten the reciprocal relationships that lie at the heart of the commons, with their cold logic. We can only try to reduce their importance. We must hope that reciprocal relationships will grow in importance with respect to relationships of exchange and of authority.

Worth reading.

Follow me @glynmoody on Twitter or identi.ca.

Preserving Patents Before the Planet

I don't think this needs much comment:

The Chamber's Global Intellectual Property Center (GIPC) has been front and center in this debate, and our position is clear: if governments are serious about addressing climate change, and all agree that new technologies are a vital part of the answer, then IP laws and rights need to be protected in any Copenhagen agreement. Indeed, in our view, a Copenhagen Summit with NO mention of IP at all is a successful conclusion. Current international laws and norms are working, and need to be preserved.

Got that? Stuff the environment, we've got to protect the *important* things in life, like intellectual monopolies...

Mandelson's Power to Censor the Net

I and many others have already noted that the proposed Digital Economy Bill gives far too many sweeping powers to the government. According to this detailed analysis, it looks like there's one more clause to worry about:

What is the problem with clause 11 that I am getting so alarmed about it? It amends the Communications Act 2003 to insert a new section 124H which would, if passed, give sweeping powers to the Secretary of State. It begins:

(1) The Secretary of State may at any time by order impose a technical obligation on internet service providers if the Secretary of State considers it appropriate in view of—

Pausing there. Note that this says nothing at all about copyright infringement. For example the power could be used to:

* order ISPs to block any web page found on the Internet Watch Foundation's list
* block specific undesirable sites (such as wikileaks)
* block specific kinds of traffic or protocols, such as any form of peer-to-peer
* throttle the bandwidth for particular kinds of service or to or from particular websites.

In short, pretty much anything.

And how might that be used?

The definition of a "technical obligation" and "technical measure" are inserted by clause 10:

A "technical obligation", in relation to an internet service provider, is an obligation for the provider to take a technical measure against particular subscribers to its service.

A "technical measure" is a measure that— (a) limits the speed or other capacity of the service provided to a subscriber; (b) prevents a subscriber from using the service to gain access to particular material, or limits such use; (c) suspends the service provided to a subscriber; or (d) limits the service provided to a subscriber in another way.

As you can see blocking wikileaks is simply a matter of applying a technical measure against all subscribers of any ISP.

Hidden away inside the Bill, there's unlimited - and arbitrary - censorship of any site the Secretary of State takes against:

Surely something must limit this power you ask? It seems not. The Secretary of State may make an order if "he considers it appropriate" in view of:

(a) an assessment carried out or steps taken by OFCOM under section 124G; or (b) any other consideration.

Where "any other consideration" could be anything. To their credit the Tories do seem to have realised that this particular alternative is overly permissive. Lord Howard of Rising and Lord de Mauley have proposed (in the first tranche of amendments proposed that the "or" be replaced by an "and".

What astonishes me is that there is no obligation for the Secretary of State to even publish such an order, let alone subject it to the scrutiny of Parliament, yet he could fundamentally change the way the internet operates using it. Other orders made under other parts of the Bill will have to be made by statutory instrument and most will require Parliamentary approval. Not this one.

If this goes through, we are in deep trouble, people....

Follow me @glynmoody on Twitter or identi.ca.

Visualising Open Data

One of the heartening trends in openness recently has been the increasing, if belated, release of non-personal government data around the world. Even the UK is waking up to the fact that transparency is not just good democracy, but is good economics too, since it can stimulate all kinds of innovation based on mashups of the underlying data.

That's the good news. The bad news is that the more such data we have, the harder it is to understand what it means. Fortunately, there is a well-developed branch of computing that tries to deal with this problem: visualisation. That is, turning the reams of ungraspable numbers into striking images that can be taken in at a glance.

Of course, the problem here is that someone has to spend time and effort taking the numbers and turning them into useful visualisations. Enter the Open Knowledge Foundation, which today launches the self-explanatory site “Where Does My Money Go? - Analysing and Visualising UK Public Spending” (disclaimer: I have recently joined the OKFN's Advisory Board, but had nothing to do with this latest project).

Here's what the press release has to say about the new site:

Now more than ever, UK taxpayers will be wondering where public funds are being spent - not least because of the long shadow cast by the financial crisis and last week’s announcements of an estimated £850 billion price tag for bailing out UK banks. Yesterday’s pre-budget report also raises questions about spending cutbacks and how public money is being allocated across different key areas.

However, closing the loop between ordinary citizens and the paper-trail of government receipts is no mean feat. Relevant documents and datasets are scattered around numerous government websites - and, once located, spending figures often require background knowledge to interpret and can be hard to put into context. In the UK there is no equivalent to the US Federal Funding Accountability and Transparency Act, which requires official bodies to publish figures on spending in a single place. There were proposals for similar legislation in 2007, but these were never approved.

On Friday 11th December the Open Knowledge Foundation will launch a free interactive online tool for showing where UK public spending goes. The Where Does My Money Go? project allows the public to explore data on UK public spending over the past 6 years in an intuitive way using an array of maps, timelines and graphs. By means of the tool, anyone can make sense of information on public spending in ways which were not previously possible.

There's currently a prototype, and a list of the datasets currently analysed available as a Google Docs spreadsheet. There are some really cool interactive visualisations, but I can't point you to any of them because they are hidden within a Flash-based black box – one of the big problems with this benighted technology. Once HTML5 is finalised it will presumably be possible to move everything to this open format, which would be rather more appropriate for a site dedicated to open data.

That notwithstanding, it's great to see the flood of information being tamed in this way; I hope it's the forerunner of many more like it (other than its dependence on Flash, of course) as governments around the world continue to release more of their data hoards. Meanwhile, do take it for a spin and pass on any suggestions you have that might improve it.

Follow me @glynmoody on Twitter or identi.ca.

10 December 2009

Why Does Amazon Want to Be Evil?

I like Amazon's services. Indeed, judging by the amount I spend with the company, I'm probably a suitable case for treatment for Amazon addiction (whatever you do, don't sign up for Amazon Prime, which makes getting stuff *far* too easy).

And yet despite the fact that it offers an incredible service, Amazon seems hell-bent on proving that it is not a cuddly new-style company, but just as rapacious and obsessed with "owning" commonplace ideas as all the bad old ones.

Specifically, it is *still* trying to get a European patent on things that are both obvious and manifestly just business methods, neither of which can be patented in Europe:

The Board of Appeal of the European Patent Office (EPO) has recently heard an appeal against revocation of one of Amazon's "one-click" patents following opposition proceedings. The Board of Appeal found that the decision to revoke the patent should be set aside and that the patent should be returned to the opposition division for further consideration of an alternative set of claims.

Here's that brilliant "invention" that Amazon is so keen to claim as its very own:

The particular patent in issue is concerned with allowing a first individual to send a gift to a second individual when the first individual knows only the second individual's email address but not their postal address.

Wow, you can tell that Jeff Bezos and his crew are geniuses of Newtonian proportions from the fact that they were able to conceive such a stunningly original idea as that.

Undeterred by its rejection, Amazon is now trying an even more pathetic tack:

The Appeal Board decided that revocation of the patent as granted was correct, but that more limited claims relating to details of technical implementation of the invention should be considered further.

That is, having failed to patent the idea itself, it is now trying to claim that a "computer implementation" of the idea is patentable - as if implementing an obvious, trivial idea in a computer stops it from being obvious and trivial.

*Shame* on you, Amazon.

Follow me @glynmoody on Twitter or identi.ca.

UK Data Retention Double Standards

As we know, the UK government intends to force UK ISPs to store vast amounts of data about our online activities. The idea that this might be an undue burden is dismissed out of hand. But what do we now read about using intercept evidence in court?

Lord Carlile said the long-term aim was to introduce intercept evidence, but the circumstances were not yet right.

"Before intercept evidence can be useful in court it has to be able to satisfy two broad tests," he said. "It has to be legally viable and it has to be practically viable.

"I suspect that [the government] may well say that neither of those broad tests have been met."

He said that under European human rights law all material intercepted during the course of an inquiry would have to be available at trial, possibly several years later.

The practical means to electronically store that much data did not currently exist, he said.

Obviously the government's intercept data is special heavy *pixie* data that can't be stored on ordinary technology in the same way that the terabytes of *ordinary* ISP data can...

Follow me @glynmoody on Twitter or identi.ca.