11 January 2010

In Praise of TinyOgg

The Ogg sound and video formats finally seem to be gaining some traction; the only problem is that few of the major sites employ them - Flash is pretty much ubiquitous these days. That's where TinyOgg comes in:

This service allows you to watch and listen to Flash-based videos without the need for Flash technology. This gives you speed, safety, control, freedom and openness.

And if you want more detail on the latter advantages:

Choosing Ogg over Flash is both an ethical and technical question. We all want our computers to do better, so here is how our computers are better with Free Formats:

1. You will be able to enjoy the media "natively"; there is no need to install any plug-in or add-on to your standards-friendly browser.

2. You will not need to load heavy and possibly unsafe scripts, which helps with speed and stability.

3. You will support a free, open Web where no corporation monopolizes an important technology, such as video.

4. You will enjoy more control over your digital life with Free Formats and Open Standards because no corporation decides what you can and cannot do with the files you have.

If you're wondering how they can afford to host all the stuff, the answer is, they don't:

All videos on TinyOgg are hosted for 48 hours. We believe that most users are going to watch the video only once, so there is no need for the "for-ever hosting" feature, which is going to be costly.

However, you can download the converted files and view them at your leisure. Recommended.

Follow me @glynmoody on Twitter or identi.ca.

10 January 2010

Personal Luggage *Will* be Subject to ACTA

One of the fairy tales being told about the oppressive ACTA is that it's only going to apply to large-scale criminal offenders, and that the Little People like you and me don't need to worry our pretty heads. But that's a lie, as this fascinating blog post has discovered:

It was very interesting to talk to Mr. Velasco. He said the negotiations could be understood, in a very very simplified way, as you basically could get cheap cars in exchange for IPR enforcement laws.

Interestingly enough, his materials published on the internet also provided some kind of explanation of why people are afraid of having their iPods searched. Under "What is new" in a presentation about Enforcement of IPR Mr. Velasco says:

[it] "No longer excludes from the scope of the regulation counterfeit or pirated goods in a traveler's personal baggage where such goods are suspected to be part of a larger-scale traffic."

But don't put any great hopes in that fig-leaf "where such goods are suspected to be part of a larger-scale traffic": you can bet that once customs officials have the power to search through your laptop or MP3 player, they damn well will.

After all, potentially a *single* unauthorised copy can be used to spawn thousands of copies that would certainly constitute "larger-scale traffic"; so surely that means that all it takes is for a sufficiently suspicious customs official to "suspect" that single copies on an MP3 player might be part of larger-scale traffic - and then Robert is your father's Brother.

Make no mistake: if ACTA is agreed in its current form, it will impact every one of us directly - and direly.

Follow me @glynmoody on Twitter or identi.ca.

08 January 2010

Help Stop EU Software Patents – Again

A few years back, there was a fierce battle between those wishing to lock down software with patents, and those who wanted to keep copyright as the main protection for computer code. Thankfully, the latter won. Here's what the BBC wrote at the time....

On Open Enterprise blog.

04 January 2010

E-book Industry Gets E-diotic

Learning nothing from the decade-long series of missteps by the music industry, publishers want to repeat that history in all its stupidity:

Digital piracy, long confined to music and movies, is spreading to books. And as electronic reading devices such as Amazon's Kindle, the Sony Reader, Barnes & Noble's Nook, smartphones and Apple's much-anticipated "tablet" boost demand for e-books, experts say the problem may only get worse.

Gosh, the sky is falling.

"Textbooks are frequently pirated, but so are many other categories," said Ed McCoyd, director of digital policy at AAP. "We see piracy of professional content, such as medical books and technical guides; we see a lot of general fiction and non-fiction. So it really runs the gamut."

Er, you don't think that might be because the students are being price-gouged by academic publishers that know they have a captive audience?

And how's this for a solution?

Some publishers may try to minimize theft by delaying releases of e-books for several weeks after physical copies go on sale. Simon & Schuster recently did just that with Stephen King's novel, "Under the Dome," although the publisher says the decision was made to prevent cheaper e-versions from cannibalizing hardcover sales.

In other words, they are *forcing* people who might pay for a digital edition to turn to unauthorised copies: smart move.

And it seems old JK doesn't get it either:

Some authors have even gone as far as to shrug off e-book technology altogether. J.K. Rowling has thus far refused to make any of her Harry Potter books available digitally because of piracy fears and a desire to see readers experience her books in print.

Well, I'm a big fan of analogue books too - indeed, I firmly believe it is how publishers will survive. But I wonder if JK has ever considered the point that reading digital versions is rather less pleasant than snuggling down with a physical book, and so once you've got people interested in the content - through digital versions - they might then go out and buy a dead tree version?

But no, instead we are going to get all the inane reasoning that we heard from the music publishers, all the stupid attempts to "lock down" texts, and the same flourishing of publishers despite all that.

Follow me @glynmoody on Twitter or identi.ca.

03 January 2010

Why Extending the DNA Database is Dangerous

Part of the problem with extending the DNA database is that doing so increases the likelihood of this happening:

After a seven-day trial, Jama had been convicted of raping a 40-year-old woman in the toilets at a suburban nightclub.

The only evidence linking him to the crime was a DNA sample taken from the woman's rape kit.

...

Jama had steadfastly denied the charge of rape and said he had never been to that nightclub, not on that cold Melbourne night, not ever. He repeatedly stated he was with his critically ill father on the other side of Melbourne, reading him passages from the Koran.

But the judge and the jury did not buy his alibi, despite supporting evidence from his father, brother and friend. Instead, they believed the forensic scientist who testified there was a one in 800 billion chance that the DNA belonged to someone other than the accused man.

This week Jama gave the lie to that absurdly remote statistic. After prosecutors admitted human error in the DNA testing on which the case against Jama was built, his conviction was overturned.

Prosecutors said they could not rule out contamination of the DNA sample after it emerged the same forensic medical officer who used the rape kit had taken an earlier sample from Jama in an unrelated matter. They admitted a "serious miscarriage of justice".

DNA is an important forensic tool - when used properly. But it is not foolproof, not least because contamination can lead to false positives.

The more DNA profiles are stored in a database, the more likely it is that a match will be found because of such false positives. And such is the belief in the infallibility of DNA testing - thanks to impressive-sounding figures like the "one in 800 billion chance that the DNA belonged to someone other than the accused man" - that this is likely to lead to more *innocent* people being convicted. The best solution is to keep the DNA database small, tight and useful.
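To make the point concrete, here is a rough sketch with purely illustrative assumed numbers (they are not taken from the case): even a modest per-sample handling-error rate dwarfs the quoted random-match probability once a database holds many profiles.

```python
# Rough sketch with illustrative, assumed numbers: compare the quoted
# random-match probability with an assumed lab-error (contamination) rate
# as the number of profiles searched grows.
def p_at_least_one(p, n):
    """Probability of at least one spurious match across n independent comparisons."""
    return 1 - (1 - p) ** n

random_match_p = 1 / 800_000_000_000   # the "one in 800 billion" figure quoted above
error_p = 1 / 1000                     # assumed per-sample handling-error rate

for n in (100_000, 1_000_000, 5_000_000):
    print(f"{n:>9} profiles: random-match {p_at_least_one(random_match_p, n):.2e}, "
          f"handling-error {p_at_least_one(error_p, n):.3f}")
```

However small the per-comparison probabilities, the database size multiplies them; the handling-error term saturates towards certainty long before the random-match term registers at all.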

Follow me @glynmoody on Twitter or identi.ca.

02 January 2010

This Reminds Me of Something...

Interesting piece about the problems of remembering as we grow older:

if you are primed with sounds that are close to those you’re trying to remember — say someone talks about cherry pits as you try to recall Brad Pitt’s name — suddenly the lost name will pop into mind. The similarity in sounds can jump-start a limp brain connection. (It also sometimes works to silently run through the alphabet until landing on the first letter of the wayward word.)

This is exactly the method that I have developed in my old age: when I can't remember a name or word, I start saying apparently random sounds to myself, gradually focussing on those that *feel* close to the one I'm looking for. It sometimes takes a while, but I can generally find the word, and it usually has some connection with the ones that I pronounce in my journey towards it.

I found that this resonated with my experience too:

continued brain development and a richer form of learning may require that you “bump up against people and ideas” that are different. In a history class, that might mean reading multiple viewpoints, and then prying open brain networks by reflecting on how what was learned has changed your view of the world.

I find working in the field of computing useful here, since there are always new things to try. As the article says, it seems particularly helpful to try out things you are *not* particularly sympathetic to. It's the reason that I started twittering on 1 January last year: to force myself to do something new and something challenging. Well, that seemed to work out. Question is, what should I be doing this year?

Follow me @glynmoody on Twitter or identi.ca.

31 December 2009

What Lies at the Heart of "Avatar"?

If nothing else, "Avatar" is a computational tour-de-force. Here are some details of the kit they used:

It takes a lot of data center horsepower to create the stunning visual effects behind blockbuster movies such as King Kong, X-Men, the Lord of the Rings trilogy and most recently, James Cameron’s $230 million Avatar. Tucked away in Wellington, New Zealand are the facilities where visual effects company Weta Digital renders the imaginary landscapes of Middle Earth and Pandora at a campus of studios, production facilities, soundstages and a purpose-built data center.

...

The Weta data center got a major hardware refresh and redesign in 2008 and now uses more than 4,000 HP BL2×220c blades (new BL2×220c G6 blades announced last month), 10 Gigabit Ethernet networking gear from Foundry and storage from BlueArc and NetApp. The system now occupies spots 193 through 197 in the Top 500 list of the most powerful supercomputers.

Here's info about Weta from the Top500 site:

Site WETA Digital
System Family HP Cluster Platform 3000BL
System Model Cluster Platform 3000 BL 2x220
Computer Cluster Platform 3000 BL2x220, L54xx 2.5 Ghz, GigE
Vendor Hewlett-Packard
Application area Media
Installation Year 2009

Operating System Linux

Oh, look: Linux. Why am I not surprised...?

Follow me @glynmoody on Twitter or identi.ca.

30 December 2009

The Wisdom of the Conservatives

I don't have much time for either of the main UK political parties (or many of the others, come to that), but I must give some kudos to the Tories for latching onto an ironic weakness of Labour: its authoritarian hatred of openness. And here the former are at it again, showing the UK government how it should be done:

The Conservatives are today announcing a competition, with a £1million prize, for the best new technology platform that helps people come together to solve the problems that matter to them – whether that’s tackling government waste, designing a local planning strategy, finding the best school or avoiding roadworks.

This online platform will then be used by a future Conservative government to throw open the policy making process to the public, and harness the wisdom of the crowd so that the public can collaborate to improve government policy. For example, a Conservative government would publish all government Green Papers on this platform, so that everyone can have their say on government policies, and feed in their ideas to make them better.

This is in addition to our existing radical commitment to introduce a Public Reading Stage for legislation so that the public can comment on draft bills, and highlight drafting errors or potential improvements.

That said, the following is a bit cheeky:

Harnessing the wisdom of the crowd in this way is a fundamentally Conservative approach, based on the insight that using dispersed information, such as that contained within a market, often leads to better outcomes than centralised and closed systems.

Tories as bastions of the bottom-up approach? Stalin would have been proud of that bit of historical revisionism.

The only remaining question (other than whether the Conservatives will win the forthcoming UK General Election) is whether the software thus produced will be released under an open source licence. I presume so, since this would also be "a fundamentally Conservative approach"....

Follow me @glynmoody on Twitter or identi.ca.

What Took Wired So Loongson?

I've been writing about the Loongson chip for three years now. As I've noted several times, this chip is important because (a) it's a home-grown Chinese chip (albeit based on one from MIPS) and (b) Windows doesn't run on it, but GNU/Linux does.

It looks like Wired magazine has finally woken up to the story (better late than never):

Because the Loongson eschews the standard x86 chip architecture, it can’t run the full version of Microsoft Windows without software emulation. To encourage adoption of the processor, the Institute of Computing Technology is adapting everything from Java to OpenOffice for the Loongson chip and releasing it all under a free software license. Lemote positions its netbook as the only computer in the world with nothing but free software, right down to the BIOS burned into the motherboard chip that tells it how to boot up. It’s for this last reason that Richard “GNU/Linux” Stallman, granddaddy of the free software movement, uses a laptop with a Loongson chip.

Because GNU/Linux distros have already been ported to the Loongson chip, neither Java nor OpenOffice.org needs "adapting" so much as recompiling - hardly a challenging task. As for "releasing it all under a free software license", they had no choice.

But at least Wired got it right about the potential impact of the chip:

Loongson could also reshape the global PC business. “Compared to Intel and IBM, we are still in the cradle,” concedes Weiwu Hu, chief architect of the Loongson. But he also notes that China’s enormous domestic demand isn’t the only potential market for his CPU. “I think many other poor countries, such as those in Africa, need low-cost solutions,” he says. Cheap Chinese processors could corner emerging markets in the developing world (and be a perk for the nation’s allies and trade partners).

And that’s just the beginning. “These chips have implications for space exploration, intelligence gathering, industrialization, encryption, and international commerce,” says Tom Halfhill, a senior analyst for Microprocessor Report.

Yup.

Follow me @glynmoody on Twitter or identi.ca.

29 December 2009

The Lost Decades of the UK Web

This is a national disgrace:

New legal powers to allow the British Library to archive millions of websites are to be fast-tracked by ministers after the Guardian exposed long delays in introducing the measures.

The culture minister, Margaret Hodge, is pressing for the faster introduction of powers to allow six major libraries to copy every free website based in the UK as part of their efforts to record Britain's cultural, scientific and political history.

The Guardian reported in October that senior executives at the British Library and National Library of Scotland (NLS) were dismayed at the government's failure to implement the powers in the six years since they were established by an act of parliament in 2003.

The libraries warned that they had now lost millions of pages recording events such as the MPs' expenses scandal, the release of the Lockerbie bomber and the Iraq war, and would lose millions more, because they were not legally empowered to "harvest" these sites.

So, 20 years after Sir Tim Berners-Lee invented the technology, and well over a decade after the Web became a mass medium, and the British Library *still* isn't archiving every Web site?

History - assuming we have one - will judge us harshly for this extraordinary UK failure to preserve the key decades of the quintessential technology of our age. It's like burning down a local digital version of the Library of Alexandria, all over again.

Follow me @glynmoody on Twitter or identi.ca.

Copyright Infringement: A Modest Proposal

The UK government's Canute-like efforts to stem the tide of online copyright infringement have plumbed new depths, it seems:

Proposals to suspend the internet connections of those who repeatedly share music and films online will leave consumers with a bill for £500 million, ministers have admitted.

The Digital Economy Bill would force internet service providers (ISPs) to send warning letters to anyone caught swapping copyright material illegally, and to suspend or slow the connections of those who refused to stop. ISPs say that such interference with their customers’ connections would add £25 a year to a broadband subscription.

As Mike Masnick points out:

Note, of course, that the music industry itself claims that £200 million worth of music is downloaded in the UK per year (and, of course, that's only "losses" if you use the ridiculous and obviously incorrect calculation that each download is a "lost sale").

So this absurd approach will actually cost far more than it will save, even accepting the grossly-inflated and self-serving figures from the music industry.

Against that background, I have a suggestion.

Given that the UK government seems happy for huge sums of money to be spent on this fool's errand, why not spend it more effectively, in a way that sustains businesses, rather than penalising them, and which actually encourages people not to download copyrighted material from unauthorised sources?

This can be done quite simply: by giving everyone who wants it a free Spotify Premium subscription. These normally cost £120 per year, but buying a national licence for the 10 million families or so who are online would presumably garner a generous discount - say, of 50% - bringing the total price of the scheme to around £600 million, pretty much the expected cost of the current plans.
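The back-of-the-envelope sum above can be checked in a couple of lines; the household count, list price and discount are the post's own assumptions, not official figures:

```python
# Back-of-envelope check of the scheme's cost, using the assumptions in
# the post: ~10 million online households, £120/year list price, and a
# hypothetical 50% bulk discount on a national licence.
households = 10_000_000
list_price_gbp = 120
bulk_discount = 0.5

total_gbp = int(households * list_price_gbp * (1 - bulk_discount))
print(f"£{total_gbp:,}")   # £600,000,000
```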

As I can attest, once you get the Spotify Premium habit, you really don't want to bother with downloading files and managing them: having everything there, in the cloud, nicely organised, is just *so* convenient (well, provided you don't lose your connection). I'm sure that my scheme would lead to falls in the levels of file sharing that the government is looking for; and anyway, it could hardly be worse than the proposals in the Digital Economy bill.

Update: On Twitter, Barbara Cookson suggested a clever tweak to this idea: "absolution for ISPs who include #spotify as part of package". Nice.

Follow me @glynmoody on Twitter or identi.ca.

28 December 2009

Making Money by Giving Stuff Away

Open source software is obviously extremely interesting to companies from a utilitarian viewpoint: it means they can reduce costs and – more significantly – decrease their dependence on single suppliers. But there's another reason why businesses should be following the evolution of this field: it offers important lessons about how the economics of a certain class of products is changing.

On Open Enterprise blog.

24 December 2009

ACTA as the (Fool's) "Gold Standard"

I've noted before that at the heart of the ACTA negotiations there is a con-trick being played upon the world: insofar as the mighty ones deign to pass down any crumbs of information to us little people, it is framed in terms of the dangers of counterfeit medicines and the like, and how we are being "protected". But, then, strangely, those counterfeit medicines morph into digital copies of songs - where there is obviously no danger whatsoever - but the same extreme measures are called for.

Unfortunately, the European Union has now joined in parroting this lie, and is pushing even harder for ACTA to be implemented:

The European Union appears to be preparing for adoption of the “gold standard” of enforcement, the Anti-Counterfeiting Trade Agreement (ACTA), as intellectual property law expert Annette Kur from the Max Planck Institute of Intellectual Property, Competition and Tax Law said it is now called.

At a conference of the Swedish EU Presidency on “Enforcement of Intellectual Property with a Special Focus on Trademarks and Patents” on 15-16 December in Stockholm, representatives from EU bodies, member states and industry supported a quick enforcement of ACTA, according to participants. A representative of the Justice, Freedom and Security Directorate General of the European Commission, presented a plan for a quick restart of a legislative process in the EU to harmonise criminal law sanctions in the Community.

Worryingly:

Only two members of Parliament attended the conference in Stockholm, which despite its high-level panels was not much publicised by the Swedish presidency. Not even an agenda had been published beforehand.

That is, the inner circle of the EU, represented by the EU Presidency, was clearly trying to minimise scrutiny by the European Parliament, which has historically taken a more balanced view of intellectual monopolies and their enforcement. That matters, because:

Under the Lisbon Treaty, the European Parliament would be kept informed of the negotiation process in a manner similar to the Council, a Commission expert said. Furthermore, the ACTA text would be approved both by the Parliament and the Council.

In other words, the European Parliament now has powers that allow it to block things like ACTA, should it so desire. That's obviously a problem for those European politicians used to getting their way without such tiresome democratic obstacles.

Despite this shameful attempt to keep everything behind closed doors, the presentations show that even among those with access to the inner circle there are doubts about ACTA's "gold standard". Here's what the academic Annette Kur said in her presentation [.pdf]:

Using the public concern about serious crimes like fabrication of fake and noxious medicaments as an argument pushing for stronger legislation on IP infringement in general is inappropriate and dangerous

It is dangerous because it obscures the fact that to combat risks for public health is not primarily an IP issue

It is inappropriate because it will typically tend to encourage imbalanced legislation

Similarly Kostas Rossoglou from BEUC, the European Consumers’ Organisation, was deeply worried by the following aspects [.pdf]:

Counterfeiting used as a general term to describe all IPR Infringements and beyond!!!

Broad scope of IPRED Directive – all IPR infringements are presumed to be equally serious!!!

No distinction between commercial piracy and unauthorised use of copyright-protected content by individuals

No clear definition of the notion of “commercial scale”

Things are moving fast on the ACTA front in Europe, with a clear attempt to steamroller this through without scrutiny. This makes it even more vital that we call out those European politicians who try to justify their actions by equating counterfeiting and copyright infringement, and that we continue to demand a more reasoned and balanced approach that takes into account end-users as well as the holders of intellectual monopolies.

Follow me @glynmoody on Twitter or identi.ca.

23 December 2009

Coming up with a Copyright Assignment Strategy

One of the deep ironies of the free software world, which is predicated on freedom, is that it effectively requires people to become experts in copyright, an intellectual monopoly that is concerned with restricting freedom. That's because the GNU GPL, and licences that have followed its lead, all use copyright to achieve their aims. At times, though, that clever legal hack can come back to bite you, and nowhere more painfully than in the field of copyright assignment.

On Open Enterprise blog.

Google Opens up – about Google's Openness

Google could not exist without open source software: licensing costs would be prohibitive if it had based its business on proprietary applications. Moreover, free software gives it the ability to customise and optimise its code – crucially important in terms of becoming and staying top dog in the highly competitive search market.

On Open Enterprise blog.

All Hail the Mighty Algorithm

As long-suffering readers of this blog will know, one of the reasons I regard software patents as dangerous is that software consists of algorithms, and algorithms are simply maths. So allowing software patents is essentially allowing patents on pure knowledge.

Against that background, this looks pretty significant:

Industries, particularly high tech, may be waiting for the U.S. Supreme Court decision, expected this coming spring, in the Bilski case to decide some fundamental questions of when you can patent business methods. But in the meantime, there’s a newly published decision from the Board of Patent Appeals and Interferences that establishes a new test to determine whether a machine or manufactured article that depends on a mathematical algorithm is patentable. The ruling is a big deal because it’s one of the few precedential decisions that the BPAI issues in a given year, and it will have a direct impact on patents involving computers and software.

For a claimed machine (or article of manufacture) involving a mathematical algorithm,

1. Is the claim limited to a tangible practical application, in which the mathematical algorithm is applied, that results in a real-world use (e.g., “not a mere field-of-use label having no significance”)?
2. Is the claim limited so as to not encompass substantially all practical applications of the mathematical algorithm either “in all fields” of use of the algorithm or even in “only one field?”

If the machine (or article of manufacture) claim fails either prong of the two-part inquiry, then the claim is not directed to patent eligible subject matter.

Now, the devil is in the details, and what impact this has will depend upon its interpretation. But what I find significant is that algorithms are foregrounded: the more people concentrate on this aspect, the harder it will be to justify software patents.
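Purely as an illustration (this is a toy restatement, not legal analysis), the two-prong inquiry quoted above boils down to a simple conjunction:

```python
# Toy restatement of the BPAI two-prong test quoted above: a machine claim
# involving a mathematical algorithm is patent-eligible only if it has a
# tangible real-world application (prong 1) AND does not pre-empt
# substantially all practical uses of the algorithm (prong 2).
def claim_is_eligible(has_tangible_application: bool,
                      preempts_all_uses: bool) -> bool:
    return has_tangible_application and not preempts_all_uses

print(claim_is_eligible(True, False))   # eligible: applied, non-preemptive claim
print(claim_is_eligible(True, True))    # ineligible: monopolises the maths itself
```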

Follow me @glynmoody on Twitter or identi.ca.

16 December 2009

Hypocrisy, Thy Name is MPAA

I do love it when copyright maximalist organisations like the MPAA put out statements, because they invariably put their foot in it too. This "Testimony of Dan Glickman Chairman and CEO Motion Picture Association of America" is no exception. Here's a plum [.pdf]:

While not a Free Trade Agreement, the US motion picture industry – producers, studios and guilds -- has a keen interest in the Anti-Counterfeiting Trade Agreement (ACTA), in particular the provisions to address Internet piracy. We firmly believe that for the ACTA to address the enforcement challenges our industry confronts today, it MUST include robust protections for intellectual property online. Practical secondary liability regimes for online infringement are essential to motivate stakeholders to cooperate in implementing the reasonable practices that promote legitimate consumer options and make the online marketplace less hospitable for infringers. ACTA parties should refine their secondary liability regimes to reflect current realities and adopt modern, flexible systems where they do not exist.

What the MPAA wants is for ISPs, for example, to change their businesses "to reflect current realities and adopt modern, flexible systems where they do not exist": how strange, then, that the MPAA is not prepared to do the same by working according to the new digital rules instead of clinging to the old analogue ones...

Follow me @glynmoody on Twitter or identi.ca.

EC Says OK to MS IE Deal: How Much of a Win?

Neelie Kroes, European Commissioner for Competition Policy, had some news this morning:

Today is an important day for internet users in Europe. Today, the Commission has resolved a serious competition concern in a key market for the development of the internet, namely the market for web browsers. Now - for the first time in over a decade - Internet users in Europe will have an effective and unbiased choice between Microsoft’s Internet Explorer and competing web browsers, such as Mozilla Firefox, Google Chrome, Apple Safari and Opera....

On Open Enterprise blog.

15 December 2009

SFLC Gets Busy Around BusyBox

Contrary to some public perceptions, the Free Software Foundation is not keen on litigating against those who fail to respect the terms of the GNU GPL. Here's what Eben Moglen, very much the legal brains behind the organisation, told me a decade ago....

On Open Enterprise blog.

Australia Edges Us Towards the Digital Dark Ages

Last week, on my opendotdotdot blog, I was praising the Australian government for its moves to open up its data. I was rapidly – and rightly – taken to task in the comments for failing to mention that government's efforts to impose direct, low-level censorship on the country's Internet feed.

Although I was aware of these moves, I wasn't quite up to date on their progress. It seems that things have moved far and fast...

On Open Enterprise blog.

14 December 2009

Canadians *Do* Have a Sense of Humour

Want a good laugh?

One hour ago, a spoof press release targeted Canada in order to generate hurtful rumors and mislead the Conference of Parties on Canada's positions on climate change, and to damage Canada's standing with the international business community.

The release, from "press@enviro-canada.ca," alleges Canada's acceptance of unrealistic emissions-reduction targets, as well as a so-called "Climate Debt Mechanism," a bilateral agreement between Canada and Africa to furnish that continent with enormous sums in "reparation" for climate damage and to "offset" adaptation.

Of course, everyone should have known that Canada wouldn't do anything like accept massive emission reduction targets, or agree to reparations. No, this is what it *really* has in mind:

Today as always, Canada's binding responsibility is to supply the world - including its burgeoning developing portion - with those means of transport, health, and sustenance that prosperous markets require. Stopping short of these dictates would violate the very principles upon which our nations were founded, and endanger our very development.

As you will note, there's nothing here about that tiresome need to minimise climate change: it's all about "prosperous markets", yeah. Indeed:

Canada's current energy policy represents an elegant synthesis of the most advanced science, while remaining faithful to Canada's tradition of political pragmatism. Experts note, for example, that the much-decried oil sands of Alberta, contrary to environmentalists' dire assertions, are enabling Canada to meet ambitious emissions goals by providing her, as well as her neighbors, with the energy resources needed to transition to a cleaner energy future.

Cunning, no? Canada notes how using energy from one of the dirtiest sources, the "much-decried oil sands of Alberta", is in fact absolutely fine because it will allow a transition to a "cleaner energy future". Which means that we can justify *any* kind of energy source, no matter how dirty, provided it makes things better at some ill-specified time in the future.

If we have one, of course. (Via Tristan Nitot.)

Follow me @glynmoody on Twitter or identi.ca.

Monsoft or Microsanto?

Others (notably Roy Schestowitz) and I have noted the interesting similarities between Microsoft and Monsanto at various levels; but a major new story from the Associated Press makes the parallels even more evident.

For example:

One contract gave an independent seed company deep discounts if the company ensured that Monsanto's products would make up 70 percent of its total corn seed inventory. In its 2004 lawsuit, Syngenta called the discounts part of Monsanto's "scorched earth campaign" to keep Syngenta's new traits out of the market.

This is identical to the approach adopted by Microsoft in offering discounts to PC manufacturers that offered only its products.

Monsanto has followed Microsoft in placing increasing emphasis on patents:

Monsanto was only a niche player in the seed business just 12 years ago. It rose to the top thanks to innovation by its scientists and aggressive use of patent law by its attorneys.

First came the science, when Monsanto in 1996 introduced the world's first commercial strain of genetically engineered soybeans. The Roundup Ready plants were resistant to the herbicide, allowing farmers to spray Roundup whenever they wanted rather than wait until the soybeans had grown enough to withstand the chemical.

The company soon released other genetically altered crops, such as corn plants that produced a natural pesticide to ward off bugs. While Monsanto had blockbuster products, it didn't yet have a big foothold in a seed industry made up of hundreds of companies that supplied farmers.

That's where the legal innovations came in, as Monsanto became among the first to widely patent its genes and gain the right to strictly control how they were used. That control let it spread its technology through licensing agreements, while shaping the marketplace around them.

Monsanto also blocks the use of "open source" genetically modified organisms:

Back in the 1970s, public universities developed new traits for corn and soybean seeds that made them grow hardy and resist pests. Small seed companies got the traits cheaply and could blend them to breed superior crops without restriction. But the agreements give Monsanto control over mixing multiple biotech traits into crops.

The restrictions even apply to taxpayer-funded researchers.

Roger Boerma, a research professor at the University of Georgia, is developing specialized strains of soybeans that grow well in southeastern states, but his current research is tangled up in such restrictions from Monsanto and its competitors.

"It's made one level of our life incredibly challenging and difficult," Boerma said.

The rules also can restrict research. Boerma halted research on a line of new soybean plants that contain a trait from a Monsanto competitor when he learned that the trait was ineffective unless it could be mixed with Monsanto's Roundup Ready gene.

The result is yet another monoculture:

"We now believe that Monsanto has control over as much as 90 percent of (seed genetics). This level of control is almost unbelievable," said Neil Harl, agricultural economist at Iowa State University who has studied the seed industry for decades.

The key difference here, of course, is that this is no metaphor, but a *real* monoculture, with all the dangers that this implies.

Fortunately, things seem to be evolving for Monsanto just as they did for Microsoft, with a major anti-trust investigation in the offing:

Monsanto's business strategies and licensing agreements are being investigated by the U.S. Department of Justice and at least two state attorneys general, who are trying to determine if the practices violate U.S. antitrust laws.

Amazingly, David Boies, the lawyer who led the attack on Microsoft during that investigation, is also involved: he is representing Du Pont, one of Monsanto's rivals concerned about the latter's monopoly power.

Let's just hope that Monsanto becomes the subject of a full anti-trust action, and that the result is more effective than that applied to Microsoft. After all, we're not talking about software here, but the world's food supply, and monopolies - both intellectual and otherwise - are simply morally indefensible when billions of lives are at stake.

Follow me @glynmoody on Twitter or identi.ca.

13 December 2009

Of Access to Copyright Materials and Blindness

In a way, I suppose we should be grateful that the content industries have decided to dig their heels in over the question of providing more access to copyright materials for the visually impaired. For it leads to revelatory posts like this, which offer an extraordinary glimpse into the twisted, crimped souls of those fighting tooth and nail against the needs of the blind and visually impaired:

the treaty now being proposed would not be compatible with US copyright laws and norms, and would undermine the goal of expanded access that we all share. This overreaching treaty would also harm the rights of authors and other artists, and the incentives necessary for them to create and commercialize their works. We strongly believe improving access for one community should not mean that another loses its rights in the process.

Let's just look at that.

First, in what sense is providing more access to the visually impaired not compatible with US copyright laws? The proponents of this change have gone out of their way to make sure that the access given is within current copyright regimes, which are not serving this huge, disadvantaged constituency properly. And how would it undermine expanded access? It would, manifestly, provide access that is not available now; the publishers have proposed nothing that would address the problem other than saying the system's fine, we don't want to change it.

But the most telling - and frankly, sickening - aspect of this post is the way its author sets up the rights of authors against the rights of those with visual disabilities, as if the latter are little better than those scurvy "pirates" that "steal" copyright material from those poor authors.

In fact, *nothing* is being taken; these people simply wish to enjoy the right to read as others do - something that has been denied to them by an industry indifferent to their plight. And which author would not be happy to extend the pleasure of reading their works to those cut off from it by physical disability?

If Mark Esper thinks that is an unreasonable, outrageous goal for the visually impaired, and that maximalist copyright trumps all other humanitarian considerations, he is a truly sad human being, and I pity him. He should try looking in the mirror sometime - and be glad that he can, unlike those whose rights he so despises. (Via Jamie Love.)

Follow me @glynmoody on Twitter or identi.ca.

11 December 2009

The Future Impact of Openness

The European Commission has released a report [.pdf] with the rather unpromising title "Trends in connectivity technologies and their socio-economic impacts". Despite this, and a rather stodgy academic style, there are a number of interesting points made.

One of the best chapters is called "Projecting the future: Scenarios for tech trend development and impact assessment", which describes three possible future worlds: Borderless, Connecting and Scattered. What's interesting is that Connecting essentially describes a world where openness of all kinds is a major feature. The implications of these kinds of worlds are then examined in detail.

I wouldn't describe it as a gripping read, but there's a huge amount of detail that may be of interest to those pondering what may be, especially the role of openness there.

Uncommon Meditations on the Commons

It's significant that books about the commons are starting to appear more frequently now. Here's one that came out six months ago:

Who Owns the World? The Rediscovery of the Commons, has now been published by oekom Verlag in Berlin. (The German title is Wem gehört die Welt – Zur Wiederentdeckung der Gemeingüter.) The book is an anthology of essays by a wide range of international authors, including Elinor Ostrom, Richard Stallman, Sunita Narain, Ulrich Steinvorth, Peter Barnes, Oliver Moldenhauer, Pat Mooney and David Bollier.

Unfortunately, its text no longer seems available in English (please correct me if I'm wrong), although there is a version in Spanish [.pdf]. For those of you a little rusty in that tongue, there's a handy review and summary of the book that actually turns into a meditation on some unusual aspects of the commons in its own right. The original, in French, is also available.

Here's the conclusion:

Those who love the commons and reciprocity rightly highlight the risks entailed by their necessary relationships with politics and the State, with money and the market. This caution should not lead them to isolate the commons from the rest of the world, however, or from the reign of the State and market. State and market are not cadavers which can be nailed into a coffin and thrown into the sea. For a very, very long time, they will continue to contaminate or threaten the reciprocal relationships that lie at the heart of the commons, with their cold logic. We can only try to reduce their importance. We must hope that reciprocal relationships will grow in importance with respect to relationships of exchange and of authority.

Worth reading.

Follow me @glynmoody on Twitter or identi.ca.