12 January 2010

Stop the "Stop and Search" Shame

How can a government minister be so shameless as this?


Policing and Security Minister David Hanson MP said: "Stop and search under section 44 of the Terrorism Act 2000 is an important tool in a package of measures in the ongoing fight against terrorism."

How has it done one single thing to "fight against terrorism"? How many "terrorists" have they caught as a result of using "stop and search"? Zero, I'll be bound. All it seems to be used for is to oppress opponents of the government. Words fail me.

11 January 2010

What "Nothing to Hide" is Hiding

As governments around the world - but particularly in the UK - increase the surveillance of their hapless citizens, one argument above all is made in favour of doing so: "if you have nothing to hide, you have nothing to fear."

Of course, the rebuttal is that, indeed, we have nothing to hide, but we do value our privacy, and we should not be asked to sacrifice that for dubious government convenience. But as this excellent paper entitled "I've got nothing to hide, and other misunderstandings of privacy" points out, there is a particularly dangerous "strong" form of this argument that is harder to brush off so easily:

Grappling with the nothing to hide argument is important, because the argument reflects the sentiments of a wide percentage of the population. In popular discourse, the nothing to hide argument’s superficial incantations can readily be refuted. But when the argument is made in its strongest form, it is far more formidable.

...


The NSA surveillance, data mining, or other government information-gathering programs will result in the disclosure of particular pieces of information to a few government officials, or perhaps only to government computers. This very limited disclosure of the particular information involved is not likely to be threatening to the privacy of law-abiding citizens. Only those who are engaged in illegal activities have a reason to hide this information. Although there may be some cases in which the information might be sensitive or embarrassing to law-abiding citizens, the limited disclosure lessens the threat to privacy. Moreover, the security interest in detecting, investigating, and preventing terrorist attacks is very high and outweighs whatever minimal or moderate privacy interests law-abiding citizens may have in these particular pieces of information. Cast in this manner, the nothing to hide argument is a formidable one. It balances the degree to which an individual’s privacy is compromised by the limited disclosure of certain information against potent national security interests. Under such a balancing scheme, it is quite difficult for privacy to prevail.


One of the key arguments of the paper revolves around data aggregation (not surprisingly):

Aggregation...means that by combining pieces of information we might not care to conceal, the government can glean information about us that we might really want to conceal. Part of the allure of data mining for the government is its ability to reveal a lot about our personalities and activities by sophisticated means of analyzing data. Therefore, without greater transparency in data mining, it is hard to claim that programs like the NSA data mining program will not reveal information people might want to hide, as we do not know precisely what is revealed. Moreover, data mining aims to be predictive of behavior, striving to prognosticate about our future actions. People who match certain profiles are deemed likely to engage in a similar pattern of behavior. It is quite difficult to refute actions that one has not yet done. Having nothing to hide will not always dispel predictions of future activity.


Moreover:

Another problem in the taxonomy, which is implicated by the NSA program, is the problem I refer to as exclusion. Exclusion is the problem caused when people are prevented from having knowledge about how their information is being used, as well as barred from being able to access and correct errors in that data. The NSA program involves a massive database of information that individuals cannot access. Indeed, the very existence of the program was kept secret for years. This kind of information processing, which forbids people’s knowledge or involvement, resembles in some ways a kind of due process problem. It is a structural problem involving the way people are treated by government institutions. Moreover, it creates a power imbalance between individuals and the government. To what extent should the Executive Branch and an agency such as the NSA, which is relatively insulated from the political process and public accountability, have a significant power over citizens? This issue is not about whether the information gathered is something people want to hide, but rather about the power and the structure of government.


Finally:

A related problem involves “secondary use.” Secondary use is the use of data obtained for one purpose for a different unrelated purpose without the person’s consent. The Administration has said little about how long the data will be stored, how it will be used, and what it could be used for in the future. The potential future uses of any piece of personal information are vast, and without limits or accountability on how that information is used, it is hard for people to assess the dangers of the data being in the government’s control.

None of these will come as any surprise to people thinking about privacy and computers, but it's interesting to read a lawyer's more rigorous take on the same ideas.

Follow me @glynmoody on Twitter or identi.ca.

Is Richard Stallman Mellowing?

Richard Stallman is sometimes presented as a kind of Old Testament prophet, hurling anathemas hither and thither (indeed, I've been guilty of this characterisation myself - well, he does *look* like one.) But just recently we've had a fascinating document that suggests that this is wrong – or that RMS is mellowing....

On Open Enterprise blog.

Mozilla Starts to Follow a New Drumbeat

I've written often enough about Firefox and its continuing steady gains of browser market share. Here's another nice stat:

Roughly keeping pace with previous years, Firefox grew 40% worldwide. Two regions in particular continued adopting Firefox at a breakneck pace — South America (64%) and Asia (73%).

Most of the 40% growth occurred recently. In the 4 months leading up to the holiday season, Firefox added 22.8 million active daily users. During that same period last year, Firefox added 16.4 million users.

That's all well and good, but it raises the question: what should Mozilla be doing *after* it conquers the browser world – that is, once it has 50% market share?

On Open Enterprise blog.

In Praise of TinyOgg

The Ogg sound and video formats finally seem to be gaining some traction; the only problem is that few of the major sites employ it - Flash is pretty much ubiquitous these days. That's where TinyOgg comes in:

This service allows you to watch and listen to Flash-based videos without the need for Flash technology. This gives you speed, safety, control, freedom and openness.

And if you want more detail on the latter advantages:

Choosing Ogg over Flash is both an ethical and technical question. We all want our computers to do better, so here is how our computers are better with Free Formats:

1. You will be able to enjoy the media "natively"; there is no need to install any plug-in or add-on to your standard-friendly browser.

2. You will not need to load heavy and possibly unsafe scripts, which helps with speed and stability.

3. You will support a free, open Web where no corporation monopolizes an important technology, such as video.

4. You will enjoy more control over your digital life with Free Formats and Open Standards because no corporation decides what you can and cannot do with the files you have.

If you're wondering how they can afford to host all the stuff, the answer is, they don't:

All videos on TinyOgg are hosted for 48 hours. We believe that most users are going to watch the video only once, so there is no need for the "for-ever hosting" feature, which is going to be costly.

However, you can download the converted files and view them at your leisure. Recommended.

Follow me @glynmoody on Twitter or identi.ca.

10 January 2010

Personal Luggage *Will* be Subject to ACTA

One of the fairy tales being told about the oppressive ACTA is that it's only going to apply to large-scale criminal offenders, and that the Little People like you and me don't need to worry our pretty heads. But that's a lie, as this fascinating blog post has discovered:

It was very interesting to talk to Mr. Velasco. He said the negotiations could be understood, in a very very simplified way, as you basically could get cheap cars in exchange for IPR enforcement laws.

Interestingly enough, his materials published on the internet also provided some kind of explanation of why people are afraid of having their iPods searched. Under "What is new" in a presentation about Enforcement of IPR Mr. Velasco says:

[it] "No longer excludes from the scope of the regulation counterfeit or pirated goods in a traveler's personal baggage where such goods are suspected to be part of a larger-scale traffic."

But don't put any great hopes in that fig-leaf "where such goods are suspected to be part of a larger-scale traffic": you can bet that once customs officials have the power to search through your laptop or MP3 player, they damn well will.

After all, potentially a *single* unauthorised copy can be used to spawn thousands of copies that would certainly constitute "larger-scale traffic"; so surely that means that all it takes is for a sufficiently suspicious customs official to "suspect" that single copies on an MP3 player might be part of larger-scale traffic - and then Robert is your father's Brother.

Make no mistake: if ACTA is agreed in its current form, it will impact every one of us directly - and direly.

Follow me @glynmoody on Twitter or identi.ca.

08 January 2010

Help Stop EU Software Patents – Again

A few years back, there was a fierce battle between those wishing to lock down software with patents, and those who wanted to keep copyright as the main protection for computer code. Thankfully, the latter won. Here's what the BBC wrote at the time....

On Open Enterprise blog.

04 January 2010

E-book Industry Gets E-diotic

Learning nothing from the decade-long series of missteps by the music industry, publishers want to repeat that history in all its stupidity:


Digital piracy, long confined to music and movies, is spreading to books. And as electronic reading devices such as Amazon's Kindle, the Sony Reader, Barnes & Noble's Nook, smartphones and Apple's much-anticipated "tablet" boost demand for e-books, experts say the problem may only get worse.

Gosh, the sky is falling.

"Textbooks are frequently pirated, but so are many other categories," said Ed McCoyd, director of digital policy at AAP. "We see piracy of professional content, such as medical books and technical guides; we see a lot of general fiction and non-fiction. So it really runs the gamut."

Er, you don't think that might be because the students are being price-gouged by academic publishers that know they have a captive audience?

And how's this for a solution?

Some publishers may try to minimize theft by delaying releases of e-books for several weeks after physical copies go on sale. Simon & Schuster recently did just that with Stephen King's novel, "Under the Dome," although the publisher says the decision was made to prevent cheaper e-versions from cannibalizing hardcover sales.

In other words, they are *forcing* people who might pay for a digital edition to turn to unauthorised copies: smart move.

And it seems old JK doesn't get it either:

Some authors have even gone as far as to shrug off e-book technology altogether. J.K. Rowling has thus far refused to make any of her Harry Potter books available digitally because of piracy fears and a desire to see readers experience her books in print.

Well, I'm a big fan of analogue books too - indeed, I firmly believe it is how publishers will survive. But I wonder if JK has ever considered the point that reading digital versions is rather less pleasant than snuggling down with a physical book, and so once you've got people interested in the content - through digital versions - they might then go out and buy a dead tree version?

But no, instead we are going to get all the inane reasoning that we heard from the music publishers, all the stupid attempts to "lock down" texts, and the same flourishing of publishers despite all that.

Follow me @glynmoody on Twitter or identi.ca.

03 January 2010

Why Extending the DNA Database is Dangerous

Part of the problem with extending the DNA database is that doing so increases the likelihood of this happening:

After a seven-day trial, Jama had been convicted of raping a 40-year-old woman in the toilets at a suburban nightclub.

The only evidence linking him to the crime was a DNA sample taken from the woman's rape kit.

...

Jama had steadfastly denied the charge of rape and said he had never been to that nightclub, not on that cold Melbourne night, not ever. He repeatedly stated he was with his critically ill father on the other side of Melbourne, reading him passages from the Koran.

But the judge and the jury did not buy his alibi, despite supporting evidence from his father, brother and friend. Instead, they believed the forensic scientist who testified there was a one in 800 billion chance that the DNA belonged to someone other than the accused man.

This week Jama gave the lie to that absurdly remote statistic. After prosecutors admitted human error in the DNA testing on which the case against Jama was built, his conviction was overturned.

Prosecutors said they could not rule out contamination of the DNA sample after it emerged the same forensic medical officer who used the rape kit had taken an earlier sample from Jama in an unrelated matter. They admitted a "serious miscarriage of justice".

DNA is an important forensic tool - when used properly. But it is not foolproof, not least because contamination can lead to false positives.

The more DNA profiles that are stored on a database, the more likely there will be a match found due to such false positives. And such is the belief in the infallibility of DNA testing - thanks to the impressive-sounding "one in 800 billion chance that the DNA belonged to someone other than the accused man" - that it is likely to lead to more *innocent* people being convicted. The best solution is to keep the DNA database small, tight and useful.
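The point about database size can be made concrete with a little probability. The sketch below uses purely illustrative numbers - the per-comparison error rate and the database size are my assumptions, not figures from the case - but it shows why even a tiny chance of a spurious match (through contamination, lab error or coincidence) becomes a near-certainty once a sample is compared against millions of stored profiles:

```python
# How database size amplifies false matches. Both figures are
# illustrative assumptions, not real forensic statistics.
p_error = 1e-6          # assumed per-comparison chance of a spurious match
db_size = 5_000_000     # assumed number of profiles on the database

# Probability that at least one of the db_size comparisons
# produces a false match: 1 - P(no false match anywhere).
p_at_least_one = 1 - (1 - p_error) ** db_size
print(f"P(at least one false match) = {p_at_least_one:.3f}")
# -> roughly 0.993: a false hit is all but guaranteed.
```

The headline "one in 800 billion" figure describes a single comparison under ideal conditions; it says nothing about what happens when you trawl an entire database, let alone when the same lab handles the suspect's sample twice.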

Follow me @glynmoody on Twitter or identi.ca.

02 January 2010

This Reminds Me of Something...

Interesting piece about the problems of remembering as we grow older:

if you are primed with sounds that are close to those you’re trying to remember — say someone talks about cherry pits as you try to recall Brad Pitt’s name — suddenly the lost name will pop into mind. The similarity in sounds can jump-start a limp brain connection. (It also sometimes works to silently run through the alphabet until landing on the first letter of the wayward word.)

This is exactly the method that I have developed in my old age: when I can't remember a name or word, I start saying apparently random sounds to myself, gradually focussing on those that *feel* close to the one I'm looking for. It sometimes takes a while, but I can generally find the word, and it usually has some connection with the ones that I pronounce in my journey towards it.

I also found that this resonated with my experience too:

continued brain development and a richer form of learning may require that you “bump up against people and ideas” that are different. In a history class, that might mean reading multiple viewpoints, and then prying open brain networks by reflecting on how what was learned has changed your view of the world.

I find working in the field of computing useful here, since there are always new things to try. As the article says, it seems particularly helpful to try out things you are *not* particularly sympathetic to. It's the reason that I started twittering on 1 January last year: to force myself to do something new and something challenging. Well, that seemed to work out. Question is, what should I be doing this year?

Follow me @glynmoody on Twitter or identi.ca.

31 December 2009

What Lies at the Heart of "Avatar"?

If nothing else, "Avatar" is a computational tour-de-force. Here are some details of the kit they used:

It takes a lot of data center horsepower to create the stunning visual effects behind blockbuster movies such as King Kong, X-Men, the Lord of the Rings trilogy and most recently, James Cameron’s $230 million Avatar. Tucked away in Wellington, New Zealand are the facilities where visual effects company Weta Digital renders the imaginary landscapes of Middle Earth and Pandora at a campus of studios, production facilities, soundstages and a purpose-built data center.

...

The Weta data center got a major hardware refresh and redesign in 2008 and now uses more than 4,000 HP BL2×220c blades (new BL2×220c G6 blades announced last month), 10 Gigabit Ethernet networking gear from Foundry and storage from BluArc and NetApp. The system now occupies spot 193 through 197 in the Top 500 list of the most powerful supercomputers.

Here's info about Weta from the Top500 site:

Site: WETA Digital
System Family: HP Cluster Platform 3000BL
System Model: Cluster Platform 3000 BL 2x220
Computer: Cluster Platform 3000 BL2x220, L54xx 2.5 GHz, GigE
Vendor: Hewlett-Packard
Application area: Media
Installation Year: 2009
Operating System: Linux

Oh, look: Linux. Why am I not surprised...?

Follow me @glynmoody on Twitter or identi.ca.

30 December 2009

The Wisdom of the Conservatives

I don't have much time for either of the main UK political parties (or many of the others, come to that), but I must give some kudos to the Tories for latching onto an ironic weakness of Labour: its authoritarian hatred of openness. And here the former are at it again, showing the UK government how it should be done:


The Conservatives are today announcing a competition, with a £1million prize, for the best new technology platform that helps people come together to solve the problems that matter to them – whether that’s tackling government waste, designing a local planning strategy, finding the best school or avoiding roadworks.

This online platform will then be used by a future Conservative government to throw open the policy making process to the public, and harness the wisdom of the crowd so that the public can collaborate to improve government policy. For example, a Conservative government would publish all government Green Papers on this platform, so that everyone can have their say on government policies, and feed in their ideas to make them better.

This is in addition to our existing radical commitment to introduce a Public Reading Stage for legislation so that the public can comment on draft bills, and highlight drafting errors or potential improvements.

That said, the following is a bit cheeky:

Harnessing the wisdom of the crowd in this way is a fundamentally Conservative approach, based on the insight that using dispersed information, such as that contained within a market, often leads to better outcomes than centralised and closed systems.

Tories as bastions of the bottom-up approach? Stalin would have been proud of that bit of historical revisionism.

The only remaining question (other than whether the Conservatives will win the forthcoming UK General Election) is whether the software thus produced will be released under an open source licence. I presume so, since this would also be "a fundamentally Conservative approach"....

Follow me @glynmoody on Twitter or identi.ca.

What Took Wired So Loongson?

I've been writing about the Loongson chip for three years now. As I've noted several times, this chip is important because (a) it's a home-grown Chinese chip (albeit based on one from MIPS) and (b) Windows doesn't run on it, but GNU/Linux does.

It looks like Wired magazine has finally woken up to the story (better late than never):


Because the Loongson eschews the standard x86 chip architecture, it can’t run the full version of Microsoft Windows without software emulation. To encourage adoption of the processor, the Institute of Computing Technology is adapting everything from Java to OpenOffice for the Loongson chip and releasing it all under a free software license. Lemote positions its netbook as the only computer in the world with nothing but free software, right down to the BIOS burned into the motherboard chip that tells it how to boot up. It’s for this last reason that Richard “GNU/Linux” Stallman, granddaddy of the free software movement, uses a laptop with a Loongson chip.

Because GNU/Linux distros have already been ported to the Loongson chip, neither Java nor OpenOffice.org needs "adapting" so much as recompiling - hardly a challenging task. As for "releasing it all under a free software license", they had no choice.

But at least Wired got it right about the potential impact of the chip:

Loongson could also reshape the global PC business. “Compared to Intel and IBM, we are still in the cradle,” concedes Weiwu Hu, chief architect of the Loongson. But he also notes that China’s enormous domestic demand isn’t the only potential market for his CPU. “I think many other poor countries, such as those in Africa, need low-cost solutions,” he says. Cheap Chinese processors could corner emerging markets in the developing world (and be a perk for the nation’s allies and trade partners).

And that’s just the beginning. “These chips have implications for space exploration, intelligence gathering, industrialization, encryption, and international commerce,” says Tom Halfhill, a senior analyst for Microprocessor Report.

Yup.

Follow me @glynmoody on Twitter or identi.ca.

29 December 2009

The Lost Decades of the UK Web

This is a national disgrace:

New legal powers to allow the British Library to archive millions of websites are to be fast-tracked by ministers after the Guardian exposed long delays in introducing the measures.

The culture minister, Margaret Hodge, is pressing for the faster introduction of powers to allow six major libraries to copy every free website based in the UK as part of their efforts to record Britain's cultural, scientific and political history.

The Guardian reported in October that senior executives at the British Library and National Library of Scotland (NLS) were dismayed at the government's failure to implement the powers in the six years since they were established by an act of parliament in 2003.

The libraries warned that they had now lost millions of pages recording events such as the MPs' expenses scandal, the release of the Lockerbie bomber and the Iraq war, and would lose millions more, because they were not legally empowered to "harvest" these sites.

So, 20 years after Sir Tim Berners-Lee invented the technology, and well over a decade after the Web became a mass medium, and the British Library *still* isn't archiving every Web site?

History - assuming we have one - will judge us harshly for this extraordinary UK failure to preserve the key decades of the quintessential technology of our age. It's like burning down a local digital version of the Library of Alexandria, all over again.

Follow me @glynmoody on Twitter or identi.ca.

Copyright Infringement: A Modest Proposal

The UK government's Canute-like efforts to stem the tide of online copyright infringement have plumbed new depths, it seems:


Proposals to suspend the internet connections of those who repeatedly share music and films online will leave consumers with a bill for £500 million, ministers have admitted.

The Digital Economy Bill would force internet service providers (ISPs) to send warning letters to anyone caught swapping copyright material illegally, and to suspend or slow the connections of those who refused to stop. ISPs say that such interference with their customers’ connections would add £25 a year to a broadband subscription.

As Mike Masnick points out:

Note, of course, that the music industry itself claims that £200 million worth of music is downloaded in the UK per year (and, of course, that's only "losses" if you use the ridiculous and obviously incorrect calculation that each download is a "lost sale").

So this absurd approach will actually cost far more than it will save, even accepting the grossly-inflated and self-serving figures from the music industry.

Against that background, I have a suggestion.

Given that the UK government seems happy for huge sums of money to be spent on this fool's errand, why not spend it more effectively, in a way that sustains businesses, rather than penalising them, and which actually encourages people not to download copyrighted material from unauthorised sources?

This can be done quite simply: by giving everyone who wants it a free Spotify Premium subscription. These normally cost £120 per year, but buying a national licence for the 10 million families or so who are online would presumably garner a generous discount - say, of 50% - bringing the total price of the scheme to around £600 million, pretty much the expected cost of the current plans.
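The back-of-envelope arithmetic is easy to check (the subscriber count, discount rate and the £500 million comparison figure are the post's own rough estimates, not official numbers):

```python
# Rough cost sketch for a national Spotify Premium licence,
# using the estimates given in the post.
premium_per_year = 120.0      # GBP, normal retail price of Spotify Premium
online_families = 10_000_000  # rough number of UK online households
bulk_discount = 0.50          # assumed discount for a national licence

scheme_cost = premium_per_year * (1 - bulk_discount) * online_families
print(f"Estimated annual cost: £{scheme_cost / 1e6:.0f} million")
# -> £600 million, in the same ballpark as the £500 million the
#    Digital Economy Bill measures are expected to cost consumers.
```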

As I can attest, once you get the Spotify Premium habit, you really don't want to bother with downloading files and managing them: having everything there, in the cloud, nicely organised, is just *so* convenient (well, provided you don't lose your connection). I'm sure that my scheme would lead to falls in the levels of file sharing that the government is looking for; and anyway, it could hardly be worse than the proposals in the Digital Economy bill.

Update: On Twitter, Barbara Cookson suggested a clever tweak to this idea: "absolution for ISPs who include #spotify as part of package". Nice.

Follow me @glynmoody on Twitter or identi.ca.

28 December 2009

Making Money by Giving Stuff Away

Open source software is obviously extremely interesting to companies from a utilitarian viewpoint: it means they can reduce costs and – more significantly – decrease their dependence on single suppliers. But there's another reason why businesses should be following the evolution of this field: it offers important lessons about how the economics of a certain class of products is changing.

On Open Enterprise blog.

24 December 2009

ACTA as the (Fool's) "Gold Standard"

I've noted before that at the heart of the ACTA negotiations there is a con-trick being played upon the world: insofar as the mighty ones deign to pass down any crumbs of information to us little people, it is framed in terms of the dangers of counterfeit medicines and the like, and how we are being "protected". But, then, strangely, those counterfeit medicines morph into digital copies of songs - where there is obviously no danger whatsoever - but the same extreme measures are called for.

Unfortunately, the European Union has now joined in parroting this lie, and is pushing even harder for ACTA to be implemented:


The European Union appears to be preparing for adoption of the “gold standard” of enforcement, the Anti-Counterfeiting Trade Agreement (ACTA), as intellectual property law expert Annette Kur from the Max Planck Institute of Intellectual Property, Competition and Tax Law said it is now called.

At a conference of the Swedish EU Presidency on “Enforcement of Intellectual Property with a Special Focus on Trademarks and Patents” on 15-16 December in Stockholm, representatives from EU bodies, member states and industry supported a quick enforcement of ACTA, according to participants. A representative of the Justice, Freedom and Security Directorate General of the European Commission presented a plan for a quick restart of a legislative process in the EU to harmonise criminal law sanctions in the Community.

Worryingly:

Only two members of Parliament attended the conference in Stockholm, which despite its high-level panels was not much publicised by the Swedish presidency. Not even an agenda had been published beforehand.

That is, the inner circle of the EU, represented by the EU Presidency, was clearly trying to minimise scrutiny by the European Parliament, which has historically taken a more balanced view of intellectual monopolies and their enforcement. That matters, because:

Under the Lisbon Treaty, the European Parliament would be kept informed of the negotiation process in a manner similar to the Council, a Commission expert said. Furthermore, the ACTA text would be approved both by the Parliament and the Council.

In other words, the European Parliament now has powers that allow it to block things like ACTA, should it so desire. That's obviously a problem for those European politicians used to getting their way without such tiresome democratic obstacles.

Despite this shameful attempt to keep everything behind closed doors, the presentations show that even among those with access to the inner circle there are doubts about ACTA's "gold standard". Here's what the academic Annette Kur said in her presentation [.pdf]:

Using the public concern about serious crimes like fabrication of fake and noxious medicaments as an argument pushing for stronger legislation on IP infringement in general is inappropriate and dangerous

It is dangerous because it obscures the fact that to combat risks for public health is not primarily an IP issue

It is inappropriate because it will typically tend to encourage imbalanced legislation

Similarly Kostas Rossoglou from BEUC, the European Consumers’ Organisation, was deeply worried by the following aspects [.pdf]:

Counterfeiting used as a general term to describe all IPR Infringements and beyond!!!

Broad scope of IPRED Directive – all IPR infringements are presumed to be equally serious!!!

No distinction between commercial piracy and unauthorised use of copyright-protected content by individuals

No clear definition of the notion of “commercial scale”

Things are moving fast on the ACTA front in Europe, with a clear attempt to steamroller this through without scrutiny. This makes it even more vital that we call out those European politicians who try to justify their actions by equating counterfeiting and copyright infringement, and that we continue to demand a more reasoned and balanced approach that takes into account end-users as well as the holders of intellectual monopolies.

Follow me @glynmoody on Twitter or identi.ca.

23 December 2009

Coming up with a Copyright Assignment Strategy

One of the deep ironies of the free software world, which is predicated on freedom, is that it effectively requires people to become experts in copyright, an intellectual monopoly that is concerned with restricting freedom. That's because the GNU GPL, and licences that have followed its lead, all use copyright to achieve their aims. At times, though, that clever legal hack can come back to bite you, and nowhere more painfully than in the field of copyright assignment.

On Open Enterprise blog.

Google Opens up – about Google's Openness

Google could not exist without open source software: licensing costs would be prohibitive if it had based its business on proprietary applications. Moreover, free software gives it the possibility to customise and optimise its code – crucially important in terms of becoming and staying top dog in the highly competitive search market.

On Open Enterprise blog.

All Hail the Mighty Algorithm

As long-suffering readers of this blog will know, one of the reasons I regard software patents as dangerous is because software consists of algorithms, and algorithms are simply maths. So allowing software patents is essentially allowing patents on pure knowledge.

Against that background, this looks pretty significant:

Industries, particularly high tech, may be waiting for the U.S. Supreme Court decision, expected this coming spring, in the Bilski case to decide some fundamental questions of when you can patent business methods. But in the meantime, there’s a newly published decision from the Board of Patent Appeals and Interferences that establishes a new test to determine whether a machine or manufactured article that depends on a mathematical algorithm is patentable. The ruling is a big deal because it’s one of the few precedential decisions that the BPAI issues in a given year, and it will have a direct impact on patents involving computers and software.

For a claimed machine (or article of manufacture) involving a mathematical algorithm,

1. Is the claim limited to a tangible practical application, in which the mathematical algorithm is applied, that results in a real-world use (e.g., “not a mere field-of-use label having no significance”)?
2. Is the claim limited so as to not encompass substantially all practical applications of the mathematical algorithm either “in all fields” of use of the algorithm or even in “only one field?”

If the machine (or article of manufacture) claim fails either prong of the two-part inquiry, then the claim is not directed to patent eligible subject matter.

Now, the devil is in the details, and what impact this has will depend upon its interpretation. But what I find significant is that algorithms are foregrounded: the more people concentrate on this aspect, the harder it will be to justify software patents.
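Purely as an illustration of why this foregrounds the algorithm, the two-prong inquiry reads like a simple boolean check. This is just a sketch of the logic quoted above, not anything from the BPAI decision itself; all names are hypothetical:

```python
# Illustrative sketch of the BPAI two-prong inquiry for a claimed machine
# (or article of manufacture) involving a mathematical algorithm.
# Both prongs must be satisfied for the claim to be directed to
# patent-eligible subject matter.

def is_patent_eligible(tied_to_practical_application: bool,
                       preempts_substantially_all_uses: bool) -> bool:
    # Prong 1: the claim is limited to a tangible practical application
    # with a real-world use (not a mere field-of-use label).
    prong_1 = tied_to_practical_application
    # Prong 2: the claim does NOT encompass substantially all practical
    # applications of the algorithm, in all fields or even in one field.
    prong_2 = not preempts_substantially_all_uses
    return prong_1 and prong_2

# A claim that merely labels a field of use fails prong 1:
print(is_patent_eligible(False, False))  # False
# A claim tied to a concrete use that leaves other uses free passes:
print(is_patent_eligible(True, False))   # True
# A claim that would lock up the algorithm itself fails prong 2:
print(is_patent_eligible(True, True))    # False
```

The point the sketch makes is the one in the ruling: the mathematical algorithm is the object being tested at every step, which is exactly why its prominence makes software patents harder to justify.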

Follow me @glynmoody on Twitter or identi.ca.

16 December 2009

Hypocrisy, Thy Name is MPAA

I do love it when copyright maximalist organisations like the MPAA put out statements, because they invariably put their foot in it too. This "Testimony of Dan Glickman Chairman and CEO Motion Picture Association of America" is no exception. Here's a plum [.pdf]:

While not a Free Trade Agreement, the US motion picture industry – producers, studios and guilds -- has a keen interest in the Anti-Counterfeiting Trade Agreement (ACTA), in particular the provisions to address Internet piracy. We firmly believe that for the ACTA to address the enforcement challenges our industry confronts today, it MUST include robust protections for intellectual property online. Practical secondary liability regimes for online infringement are essential to motivate stakeholders to cooperate in implementing the reasonable practices that promote legitimate consumer options and make the online marketplace less hospitable for infringers. ACTA parties should refine their secondary liability regimes to reflect current realities and adopt modern, flexible systems where they do not exist.

What the MPAA wants is for ISPs, for example, to change their businesses "to reflect current realities and adopt modern, flexible systems where they do not exist": how strange, then, that the MPAA is not prepared to do the same by working according to the new digital rules instead of clinging to the old analogue ones...

Follow me @glynmoody on Twitter or identi.ca.

EC Says OK to MS IE Deal: How Much of a Win?

Neelie Kroes, European Commissioner for Competition Policy, had some news this morning:

Today is an important day for internet users in Europe. Today, the Commission has resolved a serious competition concern in a key market for the development of the internet, namely the market for web browsers. Now - for the first time in over a decade - Internet users in Europe will have an effective and unbiased choice between Microsoft’s Internet Explorer and competing web browsers, such as Mozilla Firefox, Google Chrome, Apple Safari and Opera....

On Open Enterprise blog.

15 December 2009

SFLC Gets Busy Around BusyBox

Contrary to some public perceptions, the Free Software Foundation is not keen on litigating against those who fail to respect the terms of the GNU GPL. Here's what Eben Moglen, very much the legal brains behind the organisation, told me a decade ago....

On Open Enterprise blog.

Australia Edges Us Towards the Digital Dark Ages

Last week, on my opendotdotdot blog, I was praising the Australian government for its moves to open up its data. I was rapidly – and rightly – taken to task in the comments for failing to mention that government's efforts to impose direct, low-level censorship on the country's Internet feed.

Although I was aware of these moves, I wasn't quite up to date on their progress. It seems that things have moved far and fast...

On Open Enterprise blog.

14 December 2009

Canadians *Do* Have a Sense of Humour

Want a good laugh?


One hour ago, a spoof press release targeted Canada in order to generate hurtful rumors and mislead the Conference of Parties on Canada's positions on climate change, and to damage Canada's standing with the international business community.

The release, from "press@enviro-canada.ca," alleges Canada's acceptance of unrealistic emissions-reduction targets, as well as a so-called "Climate Debt Mechanism," a bilateral agreement between Canada and Africa to furnish that continent with enormous sums in "reparation" for climate damage and to "offset" adaptation.

Of course, everyone should have known that Canada wouldn't do anything like accept massive emission reduction targets, or agree to reparations. No, this is what it *really* has in mind:

Today as always, Canada's binding responsibility is to supply the world - including its burgeoning developing portion - with those means of transport, health, and sustenance that prosperous markets require. Stopping short of these dictates would violate the very principles upon which our nations were founded, and endanger our very development.

As you will note, there's nothing here about that tiresome need to minimise climate change, it's all about "prosperous markets", yeah. Indeed:

Canada's current energy policy represents an elegant synthesis of the most advanced science, while remaining faithful to Canada's tradition of political pragmatism. Experts note, for example, that the much-decried oil sands of Alberta, contrary to environmentalists' dire assertions, are enabling Canada to meet ambitious emissions goals by providing her, as well as her neighbors, with the energy resources needed to transition to a cleaner energy future.

Cunning, no? Canada notes how using energy from one of the dirtiest sources, the "much-decried oil sands of Alberta", is in fact absolutely fine because it will allow a transition to a "cleaner energy future". Which means that we can justify *any* kind of energy source, no matter how dirty, provided it makes things better at some ill-specified time in the future.

If we have one, of course. (Via Tristan Nitot.)

Follow me @glynmoody on Twitter or identi.ca.