15 January 2010

Declaring War on European Computer Users

The eagle-eyed Monica Horten has spotted something:

Following a question on counterfeiting and piracy from Italian MEP Salvini, Michel Barnier admitted that he and the new Trade Commissioner, Karel De Gucht, will be negotiating with the US on the ACTA (Anti-Counterfeiting Trade Agreement). This is, I think, the first public admission by the EU that it is negotiating on ACTA. Following a further question asking what action he would take to support creators, he cited the ACTA negotiations as an important step. He said it would put Europe on the same level as all regions of the world.

He said " of course there is freedom of information, but there is also freedom of creation... it is necessary to balance that freedom of information with the right of artists to earn money". He said he would work with Commissioner Reding (Justice and Fundamental Rights). He said it was important to inform the public, but also to change the legislative framework. I think I understood him correctly - and if I did, this is a significant statement, because until now, my understanding is that the official line on ACTA is it not about changing the law.

That's certainly a crucial point, but I think that the post contains something even more important - and frankly terrifying:

In his opening statement to the European Parliament, he outlined his priorities. The second priority was to promote creation and innovation, and the protection of the rights of creators. In his view, it is necessary to adapt the rules for the electronic world. He said that 'la creation' is being weakened by counterfeiting and piracy, and he wants to eradicate it and will support the Observatory on Counterfeiting and Piracy which was set up under his predecessor.

"Eradicate piracy"? This man either knows nothing about the technology or nothing about people. Seeking to eradicate "piracy" is about as sensible as seeking to eradicate "drugs" or "terrorism": it shows yet another politician happy to mouth platitudes without any thought for their real consequences, which would be nothing less than declaring war on hundreds of millions of European computer users. The next few years are beginning to look grim.

Follow me @glynmoody on Twitter or identi.ca.

SABIP Finally Enters 21st Century

It looks like at least one government department, the Strategic Advisory Board for Intellectual Property Policy (SABIP), is starting to get a clue about the digital economy, and the fact that constantly harping on about *online* file-sharing misses the bigger picture:


Today sees the publication of the first comprehensive review of currently available national and international research into consumers’ attitudes and behaviours to obtaining and sharing digital content offline. Much of this activity infringes current copyright law in the UK.

Why? Because of what they find:

# Estimates indicate that between 7-16% of the UK population buy discs (DVDs, CDs, & video games) which infringe copyright. Very little is known about other forms of physical peer-to-peer file sharing (e.g. hard drive swapping) and the few estimates that exist vary greatly.

# Demographics for consumers who acquire offline/hard copies which infringe copyright appear to be different from those that engage in online copyright infringement: they are often older, with dependent(s), and are more likely to belong to lower socio-economic groups - ie. they are more ‘ordinary’ than the predominantly younger, well educated, technologically-savvy group who infringe copyright online.

# The evidence is mixed as to whether consuming content through infringement substitutes or complements legal consumption. For example, while the music industry points to falling sales, some evidence suggests that consuming music illegally does not substitute legal consumption but that both types of consumption may sit alongside each other.

# Initial evidence indicates that online downloading and file sharing is substituting offline counterfeit sales. Anecdotally some suppliers suggest that the market for counterfeit content is declining - this is corroborated by falling seizures of counterfeit discs.

They suggest:

* The sharing of digital content offline needs to be looked at through a new lens. It has been predominantly studied using criminology or social psychology. But these perspectives tend to carry value judgements about what is considered right or wrong which implicitly shape the research. This means that other factors, eg, economic criteria, have rarely been considered. Industry and government surveys suggest that these additional factors are very important to any consideration of copyright infringement.

* There is little research that looks at the effect of ignorance of IP law. Copyright law is complex, and difficult for the average consumer to fully understand (where consumers are aware it exists at all). The default position in previous criminology-based research is that people know that they are breaking the law and make a choice to do so, but this is not empirically proven.

Who would have thought that economic criteria might have played a role in people's decisions to share copyright materials offline? Similarly, who would have thought that the fact that in recent decades copyright has been framed solely for the benefit of content owners, and not content users, means that it is user-hostile to the point of opacity, and that "ordinary" people make no attempt to navigate its thickets?

Let's hope this report is the first of many that show some realism on the part of not just SABIP, but the UK government.

Follow me @glynmoody on Twitter or identi.ca.

13 January 2010

How Solid is the Great Firewall of China?

Here's the best description I've read of the situation:

Anybody inside China who really wants to get to Google.com -- or BBC or whatever site may be blocked for the moment -- can still do so easily, by using a proxy server or buying (for under $1 per week) a VPN service. Details here. For the vast majority of Chinese users, it's not worth going to that cost or bother, since so much material is still available in Chinese from authorized sites. That has been the genius, so far, of the Chinese "Great Firewall" censorship system: it allows easy loopholes for anyone who might get really upset, but it effectively keeps most Chinese Internet users away from unauthorized material.

That seems an incredibly important fact to keep in mind over the Google/China business. Talking of which, the above comes from a typically sane post from James Fallows on the subject: well worth reading.
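
Mechanically, the loophole Fallows describes is just traffic relayed through a machine sitting outside the filtered network. Here's a minimal sketch of the idea - the proxy address and port are purely illustrative, and it assumes the Python requests library is installed:

```python
import requests

# Route the request through an HTTP/HTTPS proxy running outside the
# filtered network (the address and port below are placeholders).
proxies = {
    "http": "http://203.0.113.10:3128",
    "https": "http://203.0.113.10:3128",
}

# A site that is blocked when contacted directly becomes reachable via the
# relay; a VPN achieves the same thing one layer further down the stack.
response = requests.get("http://www.google.com/", proxies=proxies, timeout=10)
print(response.status_code, len(response.text))
```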

Follow me @glynmoody on Twitter or identi.ca.

Going Googly in China

The news that Google was no longer prepared to censor its search results in China has a number of interesting and important aspects.

First, it's worth noting that this extremely big news was not announced at a press conference, or even with a press release, but on a blog. That's not so surprising given that Google prides itself on being geeky, but it has one huge implication: any journalist who is not following blogs (and I'd also add Twitter) is no longer adequately equipped to stay on top of the news. The press release is officially dead (and good riddance).

The other thing that is striking is what is not said. Google speaks of

a highly sophisticated and targeted attack on our corporate infrastructure originating from China

It then announces that as a result of this attack:

We have decided we are no longer willing to continue censoring our results on Google.cn

But hang on a minute: how on earth will removing censorship diminish the likelihood of further attacks? Censorship, it would seem, has nothing to do with those attacks - unless Google is suggesting that the Chinese government was behind those attacks in some way, with this latest move from the company as a direct retaliation and warning.

And that, of course, is precisely what is going on, but there is no mention of any of this explicitly in the blog post. Nonetheless, it is extraordinary that a company should thus publicly, if only implicitly, accuse a government of being involved in attacks on its system.

As to the consequences of Google's statement, I imagine that the Chinese government will just block Google completely - China's recent actions show that it is in no mood for compromise. Indeed, it is actually clamping down *harder* on the Internet, so anything like uncensored search results will be totally unacceptable.

Unfortunately, I don't think this will have much effect within China. The leading search engine, Baidu, will just pick up the slack. Many people on the Internet in China regard the West's attitude to censorship and freedom of speech as hypocritical, and as unwarranted meddling. It will be easy for the Chinese government to sell this as arrogant Westerners lecturing the Chinese (again), a convenient cover for Google's failure to topple Baidu.

More interesting is the effect this has on the West. The latter has been increasingly subservient to China recently, with its pathetic acceptance of the crackdown in Tibet and Xinjiang, and its acquiescence in the propaganda of the Olympic games. This was largely based on the fact that China's economy was perceived as so important that other issues fell by the wayside (although I was under the impression that there was a name for selling oneself in this way for a bit of dosh.)

Google's action, if picked up by others, might lead to the West taking a more principled approach, whereby it is not prepared to sell its soul for a few Renminbi. Sadly, it's probably too much to expect the West to drop its hypocritical actions (censorship, surveillance) as well....

12 January 2010

Stop the "Stop and Search" Shame

How can a government minister be so shameless as this?


Policing and Security Minister David Hanson MP said: "Stop and search under section 44 of the Terrorism Act 2000 is an important tool in a package of measures in the ongoing fight against terrorism."

How has it done one single thing to "fight against terrorism"? How many "terrorists" have they caught as a result of using "stop and search"? Zero, I'll be bound. All it seems to be used for is to oppress opponents of the government. Words fail me.

11 January 2010

What "Nothing to Hide" is Hiding

As governments around the world - but particularly in the UK - increase the surveillance of their hapless citizens, one argument above all is made in favour of doing so: "if you have nothing to hide, you have nothing to fear."

Of course, the rebuttal is that, indeed, we have nothing to hide, but we do value our privacy, and we should not be asked to sacrifice it for dubious government convenience. But as this excellent paper entitled "I've got nothing to hide, and other misunderstandings of privacy" points out, there is a particularly dangerous "strong" form of this argument that is not so easily brushed off:

Grappling with the nothing to hide argument is important, because the argument reflects the sentiments of a wide percentage of the population. In popular discourse, the nothing to hide argument’s superficial incantations can readily be refuted. But when the argument is made in its strongest form, it is far more formidable.

...


The NSA surveillance, data mining, or other government information-gathering programs will result in the disclosure of particular pieces of information to a few government officials, or perhaps only to government computers. This very limited disclosure of the particular information involved is not likely to be threatening to the privacy of law-abiding citizens. Only those who are engaged in illegal activities have a reason to hide this information. Although there may be some cases in which the information might be sensitive or embarrassing to law-abiding citizens, the limited disclosure lessens the threat to privacy. Moreover, the security interest in detecting, investigating, and preventing terrorist attacks is very high and outweighs whatever minimal or moderate privacy interests law-abiding citizens may have in these particular pieces of information. Cast in this manner, the nothing to hide argument is a formidable one. It balances the degree to which an individual’s privacy is compromised by the limited disclosure of certain information against potent national security interests. Under such a balancing scheme, it is quite difficult for privacy to prevail.


One of the key arguments of the paper revolves around data aggregation (not surprisingly):

Aggregation...means that by combining pieces of information we might not care to conceal, the government can glean information about us that we might really want to conceal. Part of the allure of data mining for the government is its ability to reveal a lot about our personalities and activities by sophisticated means of analyzing data. Therefore, without greater transparency in data mining, it is hard to claim that programs like the NSA data mining program will not reveal information people might want to hide, as we do not know precisely what is revealed. Moreover, data mining aims to be predictive of behavior, striving to prognosticate about our future actions. People who match certain profiles are deemed likely to engage in a similar pattern of behavior. It is quite difficult to refute actions that one has not yet done. Having nothing to hide will not always dispel predictions of future activity.


Moreover:

Another problem in the taxonomy, which is implicated by the NSA program, is the problem I refer to as exclusion. Exclusion is the problem caused when people are prevented from having knowledge about how their information is being used, as well as barred from being able to access and correct errors in that data. The NSA program involves a massive database of information that individuals cannot access. Indeed, the very existence of the program was kept secret for years. This kind of information processing, which forbids people’s knowledge or involvement, resembles in some ways a kind of due process problem. It is a structural problem involving the way people are treated by government institutions. Moreover, it creates a power imbalance between individuals and the government. To what extent should the Executive Branch and an agency such as the NSA, which is relatively insulated from the political process and public accountability, have a significant power over citizens? This issue is not about whether the information gathered is something people want to hide, but rather about the power and the structure of government.


Finally:

A related problem involves “secondary use.” Secondary use is the use of data obtained for one purpose for a different unrelated purpose without the person’s consent. The Administration has said little about how long the data will be stored, how it will be used, and what it could be used for in the future. The potential future uses of any piece of personal information are vast, and without limits or accountability on how that information is used, it is hard for people to assess the dangers of the data being in the government’s control.

None of these will come as any surprise to people thinking about privacy and computers, but it's interesting to read a lawyer's more rigorous take on the same ideas.
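
The paper's aggregation point, in particular, is easy to make concrete. Here is a minimal sketch - the records and field names are entirely invented - showing how two individually innocuous data sets, joined on a shared identifier, can suggest something their subject might very much want to keep private:

```python
# Two individually bland data sets (entirely made-up examples).
travel_log = {
    "alice": ["2009-11-02 bus to Harley Street", "2009-11-16 bus to Harley Street"],
    "bob":   ["2009-11-05 train to Manchester"],
}
purchases = {
    "alice": ["pharmacy: prescription collected", "bookshop: living-with-illness title"],
    "bob":   ["supermarket: groceries"],
}

# The simplest possible "aggregation": join the two sources per person.
for person in travel_log.keys() & purchases.keys():
    profile = travel_log[person] + purchases[person]
    # Neither source says much on its own; together they hint at, say, an
    # ongoing medical condition - exactly the kind of inference the paper warns about.
    print(person, profile)
```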

Follow me @glynmoody on Twitter or identi.ca.

Is Richard Stallman Mellowing?

Richard Stallman is sometimes presented as a kind of Old Testament prophet, hurling anathemas hither and thither (indeed, I've been guilty of this characterisation myself - well, he does *look* like one.) But just recently we've had a fascinating document that suggests that this is wrong – or that RMS is mellowing....

On Open Enterprise blog.

Mozilla Starts to Follow a New Drumbeat

I've written often enough about Firefox and its continuing steady gains of browser market share. Here's another nice stat:

Roughly keeping pace with previous years, Firefox grew 40% worldwide. Two regions in particular continued adopting Firefox at a breakneck pace — South America (64%) and Asia (73%).

Most of the 40% growth occurred recently. In the 4 months leading up to the holiday season, Firefox added 22.8 million active daily users. During that same period last year, Firefox added 16.4 million users.

That's all well and good, but it raises the question: what should Mozilla be doing *after* it conquers the browser world – that is, once it has 50% market share?

On Open Enterprise blog.

In Praise of TinyOgg

The Ogg sound and video formats finally seem to be gaining some traction; the only problem is that few of the major sites employ them - Flash is pretty much ubiquitous these days. That's where TinyOgg comes in:

This service allows you to watch and listen to Flash-based videos without the need for Flash technology. This gives you speed, safety, control, freedom and openness.

And if you want more detail on the latter advantages:

Choosing Ogg over Flash is both an ethical and technical question. We all want our computers to do better, so here is how our computers are better with Free Formats:

1. You will be able to enjoy the media "natively"; there is no need to install any plug-in or add-on to your standard-friendly browser.

2. You will not need to load heavy and possibly unsafe scripts, which helps with speed and stability.

3. You will support a free, open Web where no corporation monopolizes an important technology, such as video.

4. You will enjoy more control over your digital life with Free Formats and Open Standards because no corporation decides what you can and cannot do with the files you have.

If you're wondering how they can afford to host all the stuff, the answer is, they don't:

All videos on TinyOgg are hosted for 48 hours. We believe that most users are going to watch the video only once, so there is no need for the "for-ever hosting" feature, which is going to be costly.

However, you can download the converted files and view them at your leisure. Recommended.
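
If you would rather not depend on that 48-hour window at all, the same Flash-to-Ogg conversion can be done locally. A minimal sketch, assuming ffmpeg is installed with Theora and Vorbis support (the file names are placeholders):

```python
import subprocess

# Convert a downloaded Flash video into a free-format Ogg file with ffmpeg:
# libtheora encodes the video stream, libvorbis the audio stream.
subprocess.run(
    [
        "ffmpeg",
        "-i", "downloaded_video.flv",  # placeholder input file
        "-vcodec", "libtheora",
        "-acodec", "libvorbis",
        "output.ogv",                  # Ogg container holding Theora + Vorbis
    ],
    check=True,
)
```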

Follow me @glynmoody on Twitter or identi.ca.

10 January 2010

Personal Luggage *Will* be Subject to ACTA

One of the fairy tales being told about the oppressive ACTA is that it's only going to apply to large-scale criminal offenders, and that the Little People like you and me don't need to worry our pretty heads. But that's a lie, as this fascinating blog post has discovered:

It was very interesting to talk to Mr. Velasco. He said the negotiations could be understood, in a very very simplified way, as you basically could get cheap cars in exchange for IPR enforcement laws.

Interestingly enough, his materials published on the internet also provided some kind of explanation to why people are afraid of having their iPods searched. Under "What is new" in a presentation about Enforcement of IPR Mr. Velasco says:

[it] "No longer excludes from the scope of the regulation counterfeit or pirated goods in a traveler's personal baggage where such goods are suspected to be part of a larger-scale traffic."

But don't put any great hopes in that fig-leaf "where such goods are suspected to be part of a larger-scale traffic": you can bet that once customs officials have the power to search through your laptop or MP3 player, they damn well will.

After all, potentially a *single* unauthorised copy can be used to spawn thousands of copies that would certainly constitute "larger-scale traffic"; so surely that means that all it takes is for a sufficiently suspicious customs official to "suspect" that single copies on an MP3 player might be part of larger-scale traffic - and then Robert is your father's Brother.

Make no mistake: if ACTA is agreed in its current form, it will impact every one of us directly - and direly.

Follow me @glynmoody on Twitter or identi.ca.

08 January 2010

Help Stop EU Software Patents – Again

A few years back, there was a fierce battle between those wishing to lock down software with patents, and those who wanted to keep copyright as the main protection for computer code. Thankfully, the latter won. Here's what the BBC wrote at the time....

On Open Enterprise blog.

04 January 2010

E-book Industry Gets E-diotic

Learning nothing from the decade-long series of missteps by the music industry, publishers want to repeat that history in all its stupidity:


Digital piracy, long confined to music and movies, is spreading to books. And as electronic reading devices such as Amazon's Kindle, the Sony Reader, Barnes & Noble's Nook, smartphones and Apple's much-anticipated "tablet" boost demand for e-books, experts say the problem may only get worse.

Gosh, the sky is falling.

"Textbooks are frequently pirated, but so are many other categories," said Ed McCoyd, director of digital policy at AAP. "We see piracy of professional content, such as medical books and technical guides; we see a lot of general fiction and non-fiction. So it really runs the gamut."

Er, you don't think that might be because the students are being price-gouged by academic publishers that know they have a captive audience?

And how's this for a solution?

Some publishers may try to minimize theft by delaying releases of e-books for several weeks after physical copies go on sale. Simon & Schuster recently did just that with Stephen King's novel, "Under the Dome," although the publisher says the decision was made to prevent cheaper e-versions from cannibalizing hardcover sales.

In other words, they are *forcing* people who might pay for a digital edition to turn to unauthorised copies: smart move.

And it seems old JK doesn't get it either:

Some authors have even gone as far as to shrug off e-book technology altogether. J.K. Rowling has thus far refused to make any of her Harry Potter books available digitally because of piracy fears and a desire to see readers experience her books in print.

Well, I'm a big fan of analogue books too - indeed, I firmly believe it is how publishers will survive. But I wonder if JK has ever considered the point that reading digital versions is rather less pleasant than snuggling down with a physical book, and so once you've got people interested in the content - through digital versions - they might then go out and buy a dead tree version?

But no, instead we are going to get all the inane reasoning that we heard from the music publishers, all the stupid attempts to "lock down" texts, and the same flourishing of publishers despite all that.

Follow me @glynmoody on Twitter or identi.ca.

03 January 2010

Why Extending the DNA Database is Dangerous

Part of the problem with extending the DNA database is that doing so increases the likelihood of this happening:

After a seven-day trial, Jama had been convicted of raping a 40-year-old woman in the toilets at a suburban nightclub.

The only evidence linking him to the crime was a DNA sample taken from the woman's rape kit.

...

Jama had steadfastly denied the charge of rape and said he had never been to that nightclub, not on that cold Melbourne night, not ever. He repeatedly stated he was with his critically ill father on the other side of Melbourne, reading him passages from the Koran.

But the judge and the jury did not buy his alibi, despite supporting evidence from his father, brother and friend. Instead, they believed the forensic scientist who testified there was a one in 800 billion chance that the DNA belonged to someone other than the accused man.

This week Jama gave the lie to that absurdly remote statistic. After prosecutors admitted human error in the DNA testing on which the case against Jama was built, his conviction was overturned.

Prosecutors said they could not rule out contamination of the DNA sample after it emerged the same forensic medical officer who used the rape kit had taken an earlier sample from Jama in an unrelated matter. They admitted a "serious miscarriage of justice".

DNA is an important forensic tool - when used properly. But it is not foolproof, not least because contamination can lead to false positives.

The more DNA profiles that are stored on a database, the more likely there will be a match found due to such false positives. And such is the belief in the infallibility of DNA testing - thanks to the impressive-sounding "one in 800 billion chance that the DNA belonged to someone other than the accused man" - that it is likely to lead to more *innocent* people being convicted. The best solution is to keep the DNA database small, tight and useful.
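
The arithmetic behind that last point is worth spelling out. The "one in 800 billion" figure describes only the chance of a random profile match; it says nothing about contamination and handling errors, whose rate is vastly higher, and it is those errors that get multiplied by the size of the database. A rough sketch - the per-comparison error rate used here is purely illustrative:

```python
# Chance of at least one false match when a crime-scene profile is compared
# against every entry in a database of size n, given a per-comparison
# false-positive rate p that includes contamination and sample mix-ups.
def p_false_match(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

p = 1e-5  # illustrative error rate per comparison (not a measured figure)

for n in (100_000, 1_000_000, 5_000_000):
    print(f"database of {n:>9,} profiles: "
          f"P(at least one false match) = {p_false_match(p, n):.3f}")
```

Even with that modest (and invented) error rate, a database of a million profiles makes a false match a near certainty, which is precisely why small and tight beats big and indiscriminate.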

Follow me @glynmoody on Twitter or identi.ca.

02 January 2010

This Reminds Me of Something...

Interesting piece about the problems of remembering as we grow older:

if you are primed with sounds that are close to those you’re trying to remember — say someone talks about cherry pits as you try to recall Brad Pitt’s name — suddenly the lost name will pop into mind. The similarity in sounds can jump-start a limp brain connection. (It also sometimes works to silently run through the alphabet until landing on the first letter of the wayward word.)

This is exactly the method that I have developed in my old age: when I can't remember a name or word, I start saying apparently random sounds to myself, gradually focussing on those that *feel* close to the one I'm looking for. It sometimes takes a while, but I can generally find the word, and it usually has some connection with the ones that I pronounce in my journey towards it.

I found that this resonated with my experience too:

continued brain development and a richer form of learning may require that you “bump up against people and ideas” that are different. In a history class, that might mean reading multiple viewpoints, and then prying open brain networks by reflecting on how what was learned has changed your view of the world.

I find working in the field of computing useful here, since there are always new things to try. As the article says, it seems particularly helpful to try out things you are *not* particularly sympathetic to. It's the reason that I started twittering on 1 January last year: to force myself to do something new and something challenging. Well, that seemed to work out. Question is, what should I be doing this year?

Follow me @glynmoody on Twitter or identi.ca.

31 December 2009

What Lies at the Heart of "Avatar"?

If nothing else, "Avatar" is a computational tour-de-force. Here are some details of the kit they used:

It takes a lot of data center horsepower to create the stunning visual effects behind blockbuster movies such as King Kong, X-Men, the Lord of the Rings trilogy and most recently, James Cameron’s $230 million Avatar. Tucked away in Wellington, New Zealand are the facilities where visual effects company Weta Digital renders the imaginary landscapes of Middle Earth and Pandora at a campus of studios, production facilities, soundstages and a purpose-built data center.

...

The Weta data center got a major hardware refresh and redesign in 2008 and now uses more than 4,000 HP BL2×220c blades (new BL2×220c G6 blades announced last month), 10 Gigabit Ethernet networking gear from Foundry and storage from BluArc and NetApp. The system now occupies spot 193 through 197 in the Top 500 list of the most powerful supercomputers.

Here's info about Weta from the Top500 site:

Site: WETA Digital
System Family: HP Cluster Platform 3000BL
System Model: Cluster Platform 3000 BL 2x220
Computer: Cluster Platform 3000 BL2x220, L54xx 2.5 GHz, GigE
Vendor: Hewlett-Packard
Application area: Media
Installation Year: 2009
Operating System: Linux

Oh, look: Linux. Why am I not surprised...?

Follow me @glynmoody on Twitter or identi.ca.

30 December 2009

The Wisdom of the Conservatives

I don't have much time for either of the main UK political parties (or many of the others, come to that), but I must give some kudos to the Tories for latching onto an ironic weakness of Labour: its authoritarian hatred of openness. And here the former are at it again, showing the UK government how it should be done:


The Conservatives are today announcing a competition, with a £1million prize, for the best new technology platform that helps people come together to solve the problems that matter to them – whether that’s tackling government waste, designing a local planning strategy, finding the best school or avoiding roadworks.

This online platform will then be used by a future Conservative government to throw open the policy making process to the public, and harness the wisdom of the crowd so that the public can collaborate to improve government policy. For example, a Conservative government would publish all government Green Papers on this platform, so that everyone can have their say on government policies, and feed in their ideas to make them better.

This is in addition to our existing radical commitment to introduce a Public Reading Stage for legislation so that the public can comment on draft bills, and highlight drafting errors or potential improvements.

That said, the following is a bit cheeky:

Harnessing the wisdom of the crowd in this way is a fundamentally Conservative approach, based on the insight that using dispersed information, such as that contained within a market, often leads to better outcomes than centralised and closed systems.

Tories as bastions of the bottom-up approach? Stalin would have been proud of that bit of historical revisionism.

The only remaining question (other than whether the Conservatives will win the forthcoming UK General Election) is whether the software thus produced will be released under an open source licence. I presume so, since this would also be "a fundamentally Conservative approach"....

Follow me @glynmoody on Twitter or identi.ca.

What Took Wired So Loongson?

I've been writing about the Loongson chip for three years now. As I've noted several times, this chip is important because (a) it's a home-grown Chinese chip (albeit based on one from MIPS) and (b) Windows doesn't run on it, but GNU/Linux does.

It looks like Wired magazine has finally woken up to the story (better late than never):


Because the Loongson eschews the standard x86 chip architecture, it can’t run the full version of Microsoft Windows without software emulation. To encourage adoption of the processor, the Institute of Computing Technology is adapting everything from Java to OpenOffice for the Loongson chip and releasing it all under a free software license. Lemote positions its netbook as the only computer in the world with nothing but free software, right down to the BIOS burned into the motherboard chip that tells it how to boot up. It’s for this last reason that Richard “GNU/Linux” Stallman, granddaddy of the free software movement, uses a laptop with a Loongson chip.

Because GNU/Linux distros have already been ported to the Loongson chip, neither Java nor OpenOffice.org needs "adapting" so much as recompiling - hardly a challenging task. As for "releasing it all under a free software license", they had no choice.

But at least Wired got it right about the potential impact of the chip:

Loongson could also reshape the global PC business. “Compared to Intel and IBM, we are still in the cradle,” concedes Weiwu Hu, chief architect of the Loongson. But he also notes that China’s enormous domestic demand isn’t the only potential market for his CPU. “I think many other poor countries, such as those in Africa, need low-cost solutions,” he says. Cheap Chinese processors could corner emerging markets in the developing world (and be a perk for the nation’s allies and trade partners).

And that’s just the beginning. “These chips have implications for space exploration, intelligence gathering, industrialization, encryption, and international commerce,” says Tom Halfhill, a senior analyst for Microprocessor Report.

Yup.

Follow me @glynmoody on Twitter or identi.ca.

29 December 2009

The Lost Decades of the UK Web

This is a national disgrace:

New legal powers to allow the British Library to archive millions of websites are to be fast-tracked by ministers after the Guardian exposed long delays in introducing the measures.

The culture minister, Margaret Hodge, is pressing for the faster introduction of powers to allow six major libraries to copy every free website based in the UK as part of their efforts to record Britain's cultural, scientific and political history.

The Guardian reported in October that senior executives at the British Library and National Library of Scotland (NLS) were dismayed at the government's failure to implement the powers in the six years since they were established by an act of parliament in 2003.

The libraries warned that they had now lost millions of pages recording events such as the MPs' expenses scandal, the release of the Lockerbie bomber and the Iraq war, and would lose millions more, because they were not legally empowered to "harvest" these sites.

So, 20 years after Sir Tim Berners-Lee invented the technology, and well over a decade after the Web became a mass medium, the British Library *still* isn't archiving every Web site?

History - assuming we have one - will judge us harshly for this extraordinary UK failure to preserve the key decades of the quintessential technology of our age. It's like burning down a local digital version of the Library of Alexandria, all over again.

Follow me @glynmoody on Twitter or identi.ca.

Copyright Infringement: A Modest Proposal

The UK government's Canute-like efforts to stem the tide of online copyright infringement have plumbed new depths, it seems:


Proposals to suspend the internet connections of those who repeatedly share music and films online will leave consumers with a bill for £500 million, ministers have admitted.

The Digital Economy Bill would force internet service providers (ISPs) to send warning letters to anyone caught swapping copyright material illegally, and to suspend or slow the connections of those who refused to stop. ISPs say that such interference with their customers’ connections would add £25 a year to a broadband subscription.

As Mike Masnick points out:

Note, of course, that the music industry itself claims that £200 million worth of music is downloaded in the UK per year (and, of course, that's only "losses" if you use the ridiculous and obviously incorrect calculation that each download is a "lost sale").

So this absurd approach will actually cost far more than it will save, even accepting the grossly-inflated and self-serving figures from the music industry.

Against that background, I have a suggestion.

Given that the UK government seems happy for huge sums of money to be spent on this fool's errand, why not spend it more effectively, in a way that sustains businesses, rather than penalising them, and which actually encourages people not to download copyrighted material from unauthorised sources?

This can be done quite simply: by giving everyone who wants it a free Spotify Premium subscription. These normally cost £120 per year, but buying a national licence for the 10 million families or so who are online would presumably garner a generous discount - say, of 50% - bringing the total price of the scheme to around £600 million, pretty much the expected cost of the current plans.
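
The back-of-the-envelope sums are easy to check - all the figures below come from this post and the articles above, and the 50% discount is of course just my guess:

```python
# Rough cost comparison: Digital Economy Bill disconnection regime versus
# a national Spotify Premium licence (the discount assumption is a guess).
households_online = 10_000_000   # roughly 10 million online families
premium_per_year = 120           # GBP, the normal Spotify Premium price
assumed_discount = 0.5           # hypothetical bulk discount

spotify_scheme = households_online * premium_per_year * (1 - assumed_discount)
enforcement_cost = 500_000_000   # GBP, the ministers' own estimate
claimed_losses = 200_000_000     # GBP per year, the music industry's own figure

print(f"National Spotify licence:    £{spotify_scheme:,.0f}")
print(f"Disconnection regime:        £{enforcement_cost:,.0f}")
print(f"Industry's claimed 'losses': £{claimed_losses:,.0f}")
```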

As I can attest, once you get the Spotify Premium habit, you really don't want to bother with downloading files and managing them: having everything there, in the cloud, nicely organised, is just *so* convenient (well, provided you don't lose your connection). I'm sure that my scheme would lead to falls in the levels of file sharing that the government is looking for; and anyway, it could hardly be worse than the proposals in the Digital Economy bill.

Update: On Twitter, Barbara Cookson suggested a clever tweak to this idea: "absolution for ISPs who include #spotify as part of package". Nice.

Follow me @glynmoody on Twitter or identi.ca.

28 December 2009

Making Money by Giving Stuff Away

Open source software is obviously extremely interesting to companies from a utilitarian viewpoint: it means they can reduce costs and – more significantly – decrease their dependence on single suppliers. But there's another reason why businesses should be following the evolution of this field: it offers important lessons about how the economics of a certain class of products is changing.

On Open Enterprise blog.

24 December 2009

ACTA as the (Fool's) "Gold Standard"

I've noted before that at the heart of the ACTA negotiations there is a con-trick being played upon the world: insofar as the mighty ones deign to pass down any crumbs of information to us little people, it is framed in terms of the dangers of counterfeit medicines and the like, and how we are being "protected". But then, strangely, those counterfeit medicines morph into digital copies of songs - where there is obviously no danger whatsoever - and yet the same extreme measures are called for.

Unfortunately, the European Union has now joined in parroting this lie, and is pushing even harder for ACTA to be implemented:


The European Union appears to be preparing for adoption of the “gold standard” of enforcement, the Anti-Counterfeiting Trade Agreement (ACTA), as intellectual property law expert Annette Kur from the Max Planck Institute of Intellectual Property, Competition and Tax Law said it is now called.

At a conference of the Swedish EU Presidency on “Enforcement of Intellectual Property with a Special Focus on Trademarks and Patents” on 15-16 December in Stockholm, representatives from EU bodies, member states and industry supported a quick enforcement of ACTA, according to participants. A representative of the Justice, Freedom and Security Directorate General of the European Commission, presented a plan for a quick restart of a legislative process in the EU to harmonise criminal law sanctions in the Community.

Worryingly:

Only two members of Parliament attended the conference in Stockholm, which despite its high-level panels was not much publicised by the Swedish presidency. Not even an agenda had been published beforehand.

That is, the inner circle of the EU, represented by the EU Presidency, was clearly trying to minimise scrutiny by the European Parliament, which has historically taken a more balanced view of intellectual monopolies and their enforcement. That matters, because:

Under the Lisbon Treaty, the European Parliament would be kept informed of the negotiation process in a manner similar to the Council, a Commission expert said. Furthermore, the ACTA text would be approved both by the Parliament and the Council.

In other words, the European Parliament now has powers that allow it to block things like ACTA, should it so desire. That's obviously a problem for those European politicians used to getting their way without such tiresome democratic obstacles.

Despite this shameful attempt to keep everything behind closed doors, the presentations show that even among those with access to the inner circle there are doubts about ACTA's "gold standard". Here's what the academic Annette Kur said in her presentation [.pdf]:

Using the public concern about serious crimes like fabrication of fake and noxious medicaments as an argument pushing for stronger legislation on IP infringement in general is inappropriate and dangerous

It is dangerous because it obscures the fact that to combat risks for public health is not primarily an IP issue

It is inappropriate because it will typically tend to encourage imbalanced legislation

Similarly Kostas Rossoglou from BEUC, the European Consumers’ Organisation, was deeply worried by the following aspects [.pdf]:

Counterfeiting used as a general term to describe all IPR Infringements and beyond!!!

Broad scope of IPRED Directive – all IPR infringements are presumed to be equally serious!!!

No distinction between commercial piracy and unauthorised use of copyright-protected content by individuals

No clear definition of the notion of “commercial scale”

Things are moving fast on the ACTA front in Europe, with a clear attempt to steamroller this through without scrutiny. This makes it even more vital that we call out those European politicians who try to justify their actions by equating counterfeiting and copyright infringement, and that we continue to demand a more reasoned and balanced approach that takes into account end-users as well as the holders of intellectual monopolies.

Follow me @glynmoody on Twitter or identi.ca.

23 December 2009

Coming up with a Copyright Assignment Strategy

One of the deep ironies of the free software world, which is predicated on freedom, is that it effectively requires people to become experts in copyright, an intellectual monopoly that is concerned with restricting freedom. That's because the GNU GPL, and licences that have followed its lead, all use copyright to achieve their aims. At times, though, that clever legal hack can come back to bite you, and nowhere more painfully than in the field of copyright assignment.

On Open Enterprise blog.

Google Opens up – about Google's Openness

Google could not exist without open source software: licensing costs would be prohibitive if it had based its business on proprietary applications. Moreover, free software gives it the possibility to customise and optimise its code – crucially important in terms of becoming and staying top dog in the highly-competitive search market.

On Open Enterprise blog.

All Hail the Mighty Algorithm

As long-suffering readers of this blog will know, one of the reasons I regard software patents as dangerous is because software consists of algorithms, and algorithms are simply maths. So allowing software patents is essentially allowing patents on pure knowledge.
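
To make that point concrete: a program and the formula it computes are the same mathematical object written in two notations. A trivial sketch, with compound interest chosen purely as an example:

```python
# The "algorithm" below is nothing more than the formula A = P * (1 + r) ** n.
# A patent on the code would, in effect, be a patent on the maths itself.
def compound(principal: float, rate: float, years: int) -> float:
    return principal * (1 + rate) ** years

print(compound(1000.0, 0.05, 10))  # identical to evaluating the formula by hand
```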

Against that background, this looks pretty significant:

Industries, particularly high tech, may be waiting for the U.S. Supreme Court decision, expected this coming spring, in the Bilski case to decide some fundamental questions of when you can patent business methods. But in the meantime, there’s a newly published decision from the Board of Patent Appeals and Interferences that establishes a new test to determine whether a machine or manufactured article that depends on a mathematical algorithm is patentable. The ruling is a big deal because it’s one of the few precedential decisions that the BPAI issues in a given year, and it will have a direct impact on patents involving computers and software.

For a claimed machine (or article of manufacture) involving a mathematical algorithm,

1. Is the claim limited to a tangible practical application, in which the mathematical algorithm is applied, that results in a real-world use (e.g., “not a mere field-of-use label having no significance”)?
2. Is the claim limited so as to not encompass substantially all practical applications of the mathematical algorithm either “in all fields” of use of the algorithm or even in “only one field?”

If the machine (or article of manufacture) claim fails either prong of the two-part inquiry, then the claim is not directed to patent eligible subject matter.

Now, the devil is in the details, and what impact this has will depend upon its interpretation. But what I find significant is that algorithms are foregrounded: the more people concentrate on this aspect, the harder it will be to justify software patents.

Follow me @glynmoody on Twitter or identi.ca.

16 December 2009

Hypocrisy, Thy Name is MPAA

I do love it when copyright maximalist organisations like the MPAA put out statements, because they invariably put their foot in it too. This "Testimony of Dan Glickman Chairman and CEO Motion Picture Association of America" is no exception. Here's a plum [.pdf]:

While not a Free Trade Agreement, the US motion picture industry – producers, studios and guilds -- has a keen interest in the Anti-Counterfeiting Trade Agreement (ACTA), in particular the provisions to address Internet piracy. We firmly believe that for the ACTA to address the enforcement challenges our industry confronts today, it MUST include robust protections for intellectual property online. Practical secondary liability regimes for online infringement are essential to motivate stakeholders to cooperate in implementing the reasonable practices that promote legitimate consumer options and make the online marketplace less hospitable for infringers. ACTA parties should refine their secondary liability regimes to reflect current realities and adopt modern, flexible systems where they do not exist.

What the MPAA wants is for ISPs, for example, to change their businesses "to reflect current realities and adopt modern, flexible systems where they do not exist": how strange, then, that the MPAA is not prepared to do the same by working according to the new digital rules instead of clinging to the old analogue ones...

Follow me @glynmoody on Twitter or identi.ca.