31 July 2009

Hell Goes Sub-Zero, Sony Does Open Source

For me, Sony has always been the antithesis of open source. So this comes as something of a shock:

Sony Pictures Imageworks, the award-winning visual effects and digital character animation unit of Sony Pictures Digital Productions, is launching an open source development program, it was announced today by Imageworks' chief technology officer, Rob Bredow. Five technologies will be released initially:

* OSL, a programmable shading language for rendering
* Field3d, a voxel data storage library
* Maya Reticule, a Maya Plug-in for camera masking
* Scala Migration, a database migration tool
* Pystring, python-like string handling in C++

Imageworks' production environment, which is known for its photo-real visual effects, digital character performances, and innovative technologies to facilitate their creation, has incorporated open source solutions, most notably the Linux operating system, for many years. Now the company is contributing back to the open source community by making these technologies available. The software can be used freely around the world by both large and small studios. Each project has a team of passionate individuals supporting it who are interested in seeing the code widely used. The intention of the open source release is to build larger communities to adopt and further refine the code.

OK, so it's only a tiny, non-mainstream bit of Sony, but it's surely a sign of the times....

Follow me @glynmoody on Twitter or identi.ca.

Why Single Sign On Systems Are Bad

Wow, here's a really great article about identity management from, um, er, Microsoft. Actually, it's a rather remarkable Microsoft article, since it contains the following sentences:

On February 14, 2006, Microsoft Chairman Bill Gates declared that passwords would be gone where the dinosaurs rest in three to four years.

But as I write this in March 2009, it is pretty clear that Bill was wrong.

But it's not for that frisson that you should read it; it's for the following insight, which really needs hammering home:

The big challenge with respect to identity is not in designing an identity system that can provide SSO [Single Sign On], even though that is where most of the technical effort is going. It's not even in making the solution smoothly functioning and usable, where, unfortunately, less effort is going. The challenge is that users today have many identities. As I mentioned above, I have well over 100. On a daily basis, I use at least 20 or 25 of those. Perhaps users have too many identities, but I would not consider that a foregone conclusion.

The purist would now say that "SSO can fix that problem." However, I don't think it is a problem. At least it is not the big problem. I like having many identities. Having many identities means I can rest assured that the various services I use cannot correlate my information. I do not have to give my e-mail provider my stock broker identity, nor do I have to give my credit card company the identity I use at my favorite online shopping site. And only I know the identity I use for the photo sharing site. Having multiple identities allows me to keep my life, and my privacy, compartmentalized.

Yes yes yes yes yes. *This* is what the UK government simply does not want to accept: creating a single, all-powerful "proof" of identity is actually exactly the wrong thing to do. Once compromised, it is hugely dangerous. Moreover, it gives too much power to the provider of that infrastructure - which is precisely why the government *loves* it. (Via Ideal Government.)

Follow me @glynmoody on Twitter or identi.ca.

Open Source Cognitive Science

A new site with the self-explanatory name of "Open source cognitive science" has an interesting opening post about Tools for Psychology and Neuroscience, pointing out that:

Open source tools make new options available for designing experiments, doing analysis, and writing papers. Already, we can see hardware becoming available for low-cost experimentation. There is an OpenEEG project. There are open source eye tracking tools for webcams. Stimulus packages like VisionEgg can be used to collect reaction times or to send precise timing signals to fMRI scanners. Neurolens is a free functional neuroimage analysis tool.

It also has this information about the increasingly fashionable open source statistics package R that was news to me, and may be of interest to others:

R code can be embedded directly into a LaTeX or OpenOffice document using a utility called Sweave. Sweave can be used with LaTeX to automatically format documents in APA style (Zahn, 2008). With Sweave, when you see a graph or table in a paper, it’s always up to date, generated on the fly from the original R code when the PDF is generated. Including the LaTeX along with the PDF becomes a form of reproducible research, rooted in Donald Knuth’s idea of literate programming. When you want to know in detail how the analysis was done, you need look no further than the source text of the paper itself.
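For the curious, here's roughly what that looks like in practice (file name, chunk labels and data below are illustrative, not from the post): an Sweave source file is ordinary LaTeX with R code chunks delimited by `<<...>>=` and `@`, plus inline `\Sexpr{}` expressions; running it through Sweave executes the R and emits plain LaTeX, with figures generated and included automatically.

```latex
% paper.Rnw -- a minimal Sweave sketch (all names and data illustrative)
% Build with:  R CMD Sweave paper.Rnw && pdflatex paper.tex
\documentclass{article}
\begin{document}

<<data, echo=FALSE>>=
rt <- rnorm(100, mean = 350, sd = 40)  # stand-in reaction-time data
@

The mean reaction time was \Sexpr{round(mean(rt), 1)}~ms
(SD = \Sexpr{round(sd(rt), 1)}).

<<histogram, fig=TRUE, echo=FALSE>>=
hist(rt, main = "Reaction times (ms)")
@

\end{document}
```

Because the numbers and the figure are regenerated on every build, the PDF can never drift out of sync with the analysis, which is exactly the reproducibility point being made.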

Follow me @glynmoody on Twitter or identi.ca.

30 July 2009

Transparency Saves Lives

Here's a wonderful demonstration that the simple fact of transparency can dramatically alter outcomes - and, in this case, save lives:

Outcomes for adult cardiac patients in the UK have improved significantly since publication of information on death rates, research suggests.

The study also found more elderly and high-risk patients were now being treated, despite fears surgeons would not want to take them on.

It is based on analysis of more than 400,000 operations by the Society for Cardiothoracic Surgery.

Fortunately, people are drawing the right conclusions:

Experts said all surgical specialties should now publish data on death rates.

Follow me @glynmoody on Twitter or identi.ca.

Profits Without Intellectual Monopolies

Great interview with Mr Open Innovation, Eric von Hippel, who has these wise words of advice:

It is true that the most rapidly developing designs are those where many can participate and where the intellectual property is open. Think about open source software as an example of this. What firms have to remember is that they have many ways to profit from good new products, independent of IP. They’ve got brands; they’ve got distribution; they’ve got lead time in the market. They have a lot of valuable proprietary assets that are not dependent on IP.

If you’re going to give out your design capability to others, users specifically, then what you have to do is build your business model on the non-design components of your mix of competitive advantages. For instance, recall the case of custom semiconductor firms I mentioned earlier. Those companies gave away their job of designing the circuit to the user, but they still had the job of manufacturing those user-designed semiconductors, they still had the brand, they still had the distribution. And that’s how they make their money.

Follow me @glynmoody on Twitter or identi.ca.

29 July 2009

RIAA's War on Sharing Begins

Words matter, which is why the RIAA has always framed copyright infringement in terms of "piracy". But it has a big problem: most people call it "sharing"; and as everyone was told by their mother, it's good to share. So the RIAA needs to redefine things, and it seems that it's started doing just that in the Joel Tenenbaum trial:

"We are here to ask you to hold the defendant responsible for his actions," said Reynolds, a partner in the Boulder, Colorado office of Holme, Robert & Owen. "Filesharing isn't like sharing that we teach our children. This isn't sharing with your friends."

Got that? P2P sharing isn't *real* sharing, because it's not sharing with your friends; this is *evil* sharing, because it's bad to share with strangers. Apparently.

Watch out for more of this meme in the future.

Follow me @glynmoody on Twitter or identi.ca.

It's Not Open Science if it's Not Open Source

Great to see a scientist come out with this in an interesting post entitled "What, exactly, is Open Science?":

granting access to source code is really equivalent to publishing your methodology when the kind of science you do involves numerical experiments. I’m an extremist on this point, because without access to the source for the programs we use, we rely on faith in the coding abilities of other people to carry out our numerical experiments. In some extreme cases (i.e. when simulation codes or parameter files are proprietary or are hidden by their owners), numerical experimentation isn’t even science. A “secret” experimental design doesn’t give skeptics the ability to repeat (and hopefully verify) your experiment, and the same is true with numerical experiments. Science has to be “verifiable in practice” as well as “verifiable in principle”.

The rest is well worth reading too.

(Via @phylogenomics.)

Follow me @glynmoody on Twitter or identi.ca.

28 July 2009

Why Hackers Will Save the World

For anyone who might be interested, my keynote from the recent Gran Canaria Desktop Summit is now online as an Ogg video.

24 July 2009

Bill Gates Shows His True Identity

And so it starts to come out:

Microsoft is angling to work on India’s national identity card project, Mr. Gates said, and he will be meeting with Nandan Nilekani, the minister in charge. Like Mr. Gates, Mr. Nilekani stopped running the technology company he helped to start, Infosys, after growing it into one of the biggest players in the business. He is now tasked with providing identity cards for India’s 1.2 billion citizens starting in 2011. Right now in India, many records like births, deaths, immunizations and driving violations are kept on paper in local offices.

Mr. Gates was also critical of the United States government’s unwillingness to adopt a national identity card, or allow some businesses, like health care, to centralize data keeping on individuals.

Remind me again why we bother listening to this man...

Follow me @glynmoody on Twitter or identi.ca.

Why the GNU GPL v3 Matters Even More

A little while back, I wrote a post called "Why the GNU GPL Still Matters". I was talking in general terms, and didn't really distinguish between the historical GNU GPL version 2 and the new version 3. That's in part because I didn't really have any figures on how the latter was doing. Now I do, because Matt Asay has just published some plausible estimates:

In July 2007, version 3 of the GNU General Public License barely accounted for 164 projects. A year later, the number had climbed past 2,000 total projects. Today, as announced by Google open-source programs office manager Chris DiBona, the number of open-source projects licensed under GPLv3 is at least 56,000.

And that's just counting the projects hosted at Google Code.

In a hallway conversation with DiBona at OSCON, he told me roughly half of all projects on Google Code use the GPL and, of those, roughly half have moved to GPLv3, or 25 percent of all Google Code projects.

With more than 225,000 projects currently hosted at Google Code, that's a lot of GPLv3.

If we make the reasonable assumption that other open-source project repositories Sourceforge.net and Codehaus have similar GPLv3 adoption rates, the numbers of GPLv3 projects get very big, very fast.
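Asay's back-of-the-envelope numbers are easy to check; here is a quick sketch of the arithmetic, using his round figures rather than exact counts:

```python
# Rough check of the GPLv3 figures quoted above (all numbers approximate).
total_projects = 225_000               # projects hosted on Google Code
gpl_projects = total_projects // 2     # "roughly half of all projects ... use the GPL"
gplv3_projects = gpl_projects // 2     # "of those, roughly half have moved to GPLv3"

print(gplv3_projects)                  # 56250, consistent with "at least 56,000"
print(gplv3_projects / total_projects) # 0.25, i.e. 25% of all Google Code projects
```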

This is important not just because it shows that there's considerable vigour in the GNU GPL licence yet, but because version 3 addresses a particularly hot area at the moment: software patents. The increasing use of GPL v3, with its stronger, more developed response to that threat, is therefore very good news indeed.

Follow me @glynmoody on Twitter or identi.ca.

22 July 2009

Pat "Nutter" Brown Strikes Again

To change the world, it is not enough to have revolutionary ideas: you also need the inner force to realise them in the face of near-universal opposition/indifference/derision. Great examples of this include Richard Stallman, who ploughed his lonely GNU furrow for years before anyone took much notice, and Michael Hart, who did the same for Project Gutenberg.

Another of these rare beings with both vision and tenacity is Pat Brown, a personal hero of mine. Not content with inventing one of the most important experimental tools in genomics - DNA microarrays - Brown decided he wanted to do something ambitious: open access publishing. This urge turned into the Public Library of Science (PLoS) - and even that is just the start:

PLoS is just part of a longer range plan. The idea is to completely change the way the whole system works for scientific communication.

At the start, I knew nothing about the scientific publishing business. I just decided this would be a fun and important thing to do. Mike Eisen, who was a post-doc in my lab, and I have been brain-storming a strategic plan, and PLoS was a large part of it. When I started working on this, almost everyone said, “You are completely out of your mind. You are obviously a complete idiot about how publishing works, and besides, this is a dilettante thing that you're doing.” Which I didn't feel at all.

I know I'm serious about it and I know it's doable and I know it's going to be easy. I could see the thermodynamics were in my favor, because the system is not in its lowest energy state. It's going to be much more economically efficient and serve the customers a lot better being open access. You just need a catalyst to GET it there. And part of the strategy to get it over the energy barrier is to apply heat—literally, I piss people off all the time.

In case you hadn't noticed, that little plan "to completely change the way the whole system works for scientific communication" is coming along quite nicely. So, perhaps buoyed up by this, Brown has decided to try something even more challenging:

Brown: ... I'm going to do my sabbatical on this: I am going to devote myself, for a year, to trying to the maximum extent possible to eliminate animal farming on the planet Earth.

Gitschier: [Pause. Sensation of jaw dropping.]

Brown: And you are thinking I'm out of my mind.

Gitschier: [Continued silence.]

Brown: I feel like I can go a long way toward doing it, and I love the project because it is purely strategy. And it involves learning about economics, agriculture, world trade, behavioral psychology, and even an interesting component of it is creative food science.

Animal farming is by far the most environmentally destructive identified practice on the planet. Do you believe that? More greenhouse production than all transportation combined. It is also the major single source of water pollution on the planet. It is incredibly destructive. The major reason reefs are dying off and dead zones exist in the ocean—from nutrient run-off. Overwhelmingly it is the largest driving force of deforestation. And the leading cause of biodiversity loss.

And if you think I'm bullshitting, the Food and Agricultural Organization of the UN, whose job is to promote agricultural development, published a study, not knowing what they were getting into, looking at the environmental impact of animal farming, and it is a beautiful study! And the bottom line is that it is the most destructive and fastest growing environmental problem.

Gitschier: So what is your plan?

Brown: The gist of my strategy is to rigorously calculate the costs of repairing and mitigating all the environmental damage and make the case that if we don't pay as we go for this, we are just dumping this huge burden on our children. Paying these costs will drive up the price of a Big Mac and consumption will go down a lot. The other thing is to come up with yummy, nutritious, affordable mass-marketable alternatives, so that people who are totally addicted to animal foods will find alternatives that are inherently attractive to eat, so much so that McDonald's will market them, too. I want to recruit the world's most creative chefs—here's a REAL creative challenge!

I've talked with a lot of smart people who are very keen on it actually. They say, “You have no chance of success, but I really hope you're successful.” That's just the kind of project I love.

Pat, the world desperately needs nutters like you. Let's just hope that the thermodynamics are in your favour once more.

Follow me @glynmoody on Twitter or identi.ca.

No Patents for Circuits? Since You Insist...

I love this argument:

Arguments against software patents have a fundamental flaw. As any electrical engineer knows, solutions to problems implemented in software can also be realized in hardware, i.e., electronic circuits. The main reason for choosing a software solution is the ease in implementing changes, the main reason for choosing a hardware solution is speed of processing. Therefore, a time critical solution is more likely to be implemented in hardware. While a solution that requires the ability to add features easily will be implemented in software. As a result, to be intellectually consistent those people against software patents also have to be against patents for electronic circuits.

People seem to think that this is an invincible argument *for* software patents; what the poor darlings fail to notice is that it's actually an invincible argument *against* patents for circuits.

Since software is just algorithms, which are just maths, which cannot be patented, and this clever chap points out that circuits are just software made out of hardware, it follows that we shouldn't allow patents for circuits (though they can still be protected by copyright, just as software can).

So, thanks for the help in rolling back what is patentable...

21 July 2009

Has Google Forgotten Celera?

One of the reasons I wrote my book Digital Code of Life was that the battle between the public Human Genome Project and the privately-funded Celera mirrored so closely the battle between free software and Microsoft - with the difference that it was our genome that was at stake, not just a bunch of bits. The fact that Celera ultimately failed in its attempt to sequence and patent vast chunks of our DNA was the happiest of endings.

It seems someone else knows the story:

Celera was the company founded by Craig Venter, and funded by Perkin Elmer, which played a large part in sequencing the human genome and was hoping to make a massively profitable business out of selling subscriptions to genome databases. The business plan unravelled within a year or two of the publication of the first human genome. With hindsight, the opponents of Celera were right. Science is making and will make much greater progress with open data sets.

Here are some rea[s]ons for thinking that Google will be making the same sort of mistake as Celera if it pursues the business model outlined in its pending settlement with the AAP and the Author's Guild....

Thought provoking stuff, well worth a read.

Follow me @glynmoody on Twitter or identi.ca.

Building on Open Data

One of the great things about openness is that it lets people do incredible things by adding to it in a multiplicity of ways. The beauty is that those releasing material don't need to try to anticipate future uses: it's enough that they make it as open as possible. Indeed, the more open they make it, the more exciting the re-uses will be.

Here's an unusual example from the field of open data, specifically, the US government data held on Data.gov:

The purpose of Data.gov is to increase public access to high value, machine readable datasets generated by the Executive Branch of the Federal Government. Although the initial launch of Data.gov provides a limited portion of the rich variety of Federal datasets presently available, we invite you to actively participate in shaping the future of Data.gov by suggesting additional datasets and site enhancements to provide seamless access and use of your Federal data. Visit today with us, but come back often. With your help, Data.gov will continue to grow and change in the weeks, months, and years ahead.

Here's how someone intends to go even further:

Today I’m happy to announce Sunlight Labs is stealing an idea from our government. Data.gov is an incredible concept, and the implementation of it has been remarkable. We’re going to steal that idea and make it better. Because of politics and scale there’s only so much the government is going to be able to do. There are legal hurdles and boundaries the government can’t cross that we can. For instance: there’s no legislative or judicial branch data inside Data.gov and while Data.gov links off to state data catalogs, entries aren’t in the same place or format as the rest of the catalog. Community documentation and collaboration are virtual impossibilities because of the regulations that impact the way Government interacts with people on the web.

We think we can add value on top of things like Data.gov and the municipal data catalogs by autonomously bringing them into one system, manually curating and adding other data sources and providing features that, well, Government just can’t do. There’ll be community participation so that people can submit their own data sources, and we’ll also catalog non-commercial data that is derivative of government data like OpenSecrets. We’ll make it so that people can create their own documentation for much of the undocumented data that government puts out and link to external projects that work with the data being provided.

This is the future.

Follow me @glynmoody on Twitter or identi.ca.

20 July 2009

British Library Turns Traitor

I knew the British Library was losing its way, but this is ridiculous:

The British Library Business & IP Centre at St Pancras, London can help you start, run and grow your business.

And how might it do that?

Intellectual property can help you protect your ideas and make money from them.

Our resources and workshops will guide you through the four types of intellectual property: patents, trade marks, registered designs and copyright.

This once-great institution used to be about opening up the world's knowledge for the benefit and enjoyment of all: today, it's about closing it down so that only those who can afford to pay get to see it.

What an utter disgrace.

Follow me @glynmoody on Twitter or identi.ca.

Patents *Are* Monopolies: It's Official

As long-suffering readers of this blog will know, I refer to patents and copyrights as intellectual monopolies because, well, that's what they are. But there are some who refuse to accept this, citing all kinds of specious reasons why it's not correct.

Well, here's someone else who agrees with me:

A further and more significant change may come from the President's nomination of David Kappos of IBM to be the next Director of the Patent Office. While in the past, IBM was a prolific filer of patent applications, many of them covering business methods and software, it has filed an amicus brief in Bilski opposing the patentability of business method patents. However, and perhaps not surprisingly, IBM defends approval of software patents.

Mr. Kappos announced his opposition to business method patents last year by stating that "[y]ou're creating a new 20-year monopoly for no good reason."

Yup: the next Director of the USPTO says patents are monopolies: it's official. (Via @schestowitz.)

Follow me @glynmoody on Twitter or identi.ca.

17 July 2009

Harvard University Press on Scribd

This sounds like a great move:

It’s a recession. Save the $200,000 you were going to spend on that Harvard education and check out some of the books Harvard University Press is selling on Scribd starting today.

It's so obviously right: saving trees, and making academic materials more readily available, thus boosting the recognition achieved by the authors - exactly what they want. Plus readers can buy the books much more cheaply, allowing many more people to access rigorous if rather specialist knowledge.

Or maybe not: a quick look through the titles shows prices ranging from mid-teens up to $45. Come on, people, these are *electrons*: they are cheap. The whole point is to use this fact to spread knowledge, reputation and joy more widely.

When will they (HUP) learn?

Follow me @glynmoody on Twitter or identi.ca.

Gadzooks - it's ZookZ from Antigua

I've been following the rather entertaining case of Antigua vs. US for a few years now. Basically, the US government has taken a "do as I say, not as I do" attitude to the WTO - refusing to follow the latter's rules while seeking to enforce them against others. The net result is that plucky little Antigua seems to have won some kind of permission to ignore US copyright - up to a certain point - although nobody really knows what this means in practice.

That's not stopping an equally cheeky Antigua-based company from trying to make money from this situation:

ZookZ provides a new way to get pure movie and music enjoyment. We deliver unlimited, high-quality movies and music through a safe, legal and secure platform for one low monthly subscription fee.

ZookZ makes it simple for anyone to enjoy digital entertainment. Our user-friendly interface provides access to all our digital assets. We offer unlimited downloads of all movies and music for one low monthly price. Files are delivered in MP3 and MP4 formats that are compatible with most mobile devices and players so you can enjoy your entertainment when, where and how you want. ZookZ is changing the way people use and enjoy digital entertainment. Unlike other companies, once you download the file, you can view or listen to it on any medium of your choice, without restrictions.

ZookZ is not a peer-to-peer file sharing system and prohibits that use of its product. Customers directly download safe content from our secure database, not from an unknown third party. ZookZ guarantees that all our digital media is free from viruses, adware and spyware. We are dedicated to providing high-quality, safe and secure digital files.

ZookZ operates under the parameters of the 2007 WTO ruling between Antigua and the United States, and is the only website that can legally offer members unlimited digital entertainment.

The FAQ has more details.

I doubt whether the US media industries will sit back and let ZookZ try to implement its plan, and I suspect that this could get rather interesting to watch.

16 July 2009

Why Most Newspapers are Dying

This is something that's struck me too:

as is oh-so-typical in these situations, Osnos does nothing at all to engage or respond to the comments that call out his mistakes. You want to know why newspapers are failing? It's not because of Google, it's because of this viewpoint that some journalists still hold that they're the masters of the truth, handing it out from on high, wanting nothing at all to do with the riff raff in the comments.

This is perhaps the biggest single clue that newspapers do not understand how the Internet has changed relationships between writers and readers. Indeed, one of my disappointments with the Guardian's Comment is Free site is that practically *never* do the writers deign to interact with their readers. Given that the Guardian is probably the most Web-savvy of the major newspapers, this does not augur well...

(Open) Learning from Open Source

As regular readers of this blog will know, I am intrigued by the way that ideas from free software are moving across to different disciplines. Of course, applying them is no simple matter: there may not be an obvious one-to-one mapping of the act of coding to activities in the new domain, or there may be significant cultural differences that place obstacles in the way of sharing.

Here's a fascinating post that explores some of the issues around the application of open source ideas to open educational resources (OER):

For all my fascination with all things open-source, I'm finding that the notion of open source software (OSS) is one that's used far too broadly, to cover more categories than it can rightfully manage. Specifically, the use of this term to describe collaborative open education resource (OER) projects seems problematic. The notion of OSS points to a series of characteristics and truths that do not apply, for better or worse, to the features of collaborative learning environments developed for opening up education.

While in general, open educational resources are developed to adhere to the letter of the OSS movement, what they miss is what we might call the spirit of OSS, which for my money encompasses the following:

* A reliance on people's willingness to donate labor--for love, and not for money.

* An embrace of the "failure for free" model identified by Clay Shirky in Here Comes Everybody.

* A loose collaboration across fields, disciplines, and interest levels.

Open educational resources are not, in general, developed by volunteers; they are more often the product of extensive funding mechanisms that include paying participants for their labor.

Unusually, the post does not simply lament this problem, but goes on to explore a possible solution:

an alternate term for OERs designed in keeping with the open source ideals: community source software (CSS)

Worth reading in its entirety, particularly for the light it sheds on things we take for granted in open source.

Follow me @glynmoody on Twitter or identi.ca.

Now You Too Can Contribute to Firefox...

...with your money:

This pilot allows developers to request an optional dollar amount for their Firefox Add-on. Along with requesting this amount, we’re helping developers tell their stories with our new “About the Developer” pages, which explain to prospective contributors the motivations for creating an add-on and its future road map. Since contributions are completely optional, users will have ample time to evaluate an add-on to determine whether or not they want to help a developer.

Some details:

How will payments work?

We are working with PayPal on this pilot to provide a secure and international solution for facilitating payments. Developers can optionally create a PayPal ID for each of their Firefox Add-ons. Users will be presented with a “Contribute” button that gives them the option of paying the suggested amount or a different amount.

This is a nice touch, too:

Why did you call this “Contributions” and not “Donations”?

At Mozilla, we use the word “Contributor” for community members who contribute time and energy to our mission of promoting choice and innovation on the Internet. Our goal is that users who contribute money to developers are supporting the future of a particular add-on, as opposed to donating for something already received.

Quite: this isn't just about getting some well-deserved dosh to the coders, but also about giving users a way to feel more engaged. Great move.

15 July 2009

Bill Gates Gets Sharing...Almost

Yesterday I wrote about Microsoft's attempt to persuade scientists to adopt its unloved Windows HPC platform by throwing in a few free (as in beer) programs. Here's another poisoned chalice that's being offered:

In between trying to eradicate polio, tame malaria, and fix the broken U.S. education system, Gates has managed to fulfill a dream of taking some classic physics lectures and making them available free over the Web. The lectures, done in 1964 by noted scientist (and Manhattan Project collaborator) Richard Feynman, take notions such as gravity and explains how they work and the broad implications they have in understanding the ways of the universe.

Gates first saw the series of lectures 20 years ago on vacation and dreamed of being able to make them broadly available. After spending years tracking down the rights--and spending some of his personal fortune--Gates has done just that. Tapping his colleagues in Redmond to create interactive software to accompany the videos, Gates is making the collection available free from the Microsoft Research Web site.

What a kind bloke - spending his *own* personal fortune of uncountable billions, just to make this stuff freely available.

But wait: what do we find when we go to that "free" site:

Clicking will install Microsoft Silverlight.

So it seems that this particular free has its own non-free (as in freedom) payload: what a surprise.

That's a disappointment - but hardly unexpected; Microsoft's mantra is that you don't get something for nothing. But elsewhere in the interview with Gates, there's some rather interesting stuff:

Education, particularly if you've got motivated students, the idea of specializing in the brilliant lecture and text being done in a very high-quality way, and shared by everyone, and then the sort of lab and discussion piece that's a different thing that you pick people who are very good at that.

Technology brings more to the lecture availability, in terms of sharing best practices and letting somebody have more resources to do amazing lectures. So, you'd hope that some schools would be open minded to this fitting in, and making them more effective.

What's interesting is that his new-found work in the field of education is bringing him constantly face-to-face with the fact that sharing is actually rather a good thing, and that the more the sharing of knowledge can be facilitated, the more good results follow.

Of course, he's still trapped by the old Microsoft mindset, and constantly thinking how he can exploit that sharing, in this case by freighting it with all kinds of Microsoft gunk. But at least he's started on the journey, albeit unknowingly.

Follow me @glynmoody on Twitter or identi.ca.

14 July 2009

Hamburg Declaration = Humbug Declaration

You may have noticed that in the 10 years since Napster, the music industry has succeeded in almost completely ruining its biggest opportunity to make huge quantities of money, alienating just about anyone under 30 along the way (and a fair number of us old fogies, too).

Alas, it seems that some parts of the newspaper industry have been doing their job of reporting so badly that they missed that particular news item. For what does it want to do? Follow the music industry's lemming-like plunge off the cliff of "new intellectual property rights protection":

On the day that Commissioner Viviane Reding unveils her strategy for a Digital Europe during the Lisbon Council, and as the European Commission's consultation on the Content Online Report draws to a close this week, senior members of the publishing world are presenting to Information Society Commissioner Viviane Reding and Internal Market Commissioner Charlie McCreevy, a landmark declaration adopted on intellectual property rights in the digital world in a bid to ensure that opportunities for a diverse, free press and quality journalism thrive online into the future.

This is the first press communiqué on a significant meeting convened on 26th June in Berlin by news group Chief Executives from both the EPC and the World Association of Newspapers where the 'Hamburg Declaration' was signed, calling for online copyright to be respected, to allow innovation to thrive and consumers to be better served.

This comes from an extraordinary press release, combining arrogant self-satisfaction with total ignorance about how the Internet works:

A fundamental safeguard of democratic society is a free, diverse and independent press. Without control over our intellectual property rights, the future of quality journalism is at stake and with it our ability to provide our consumers with quality and varied information, education and entertainment on the many platforms they enjoy.

What a load of codswallop. What makes them think they are the sole guardians of that "free, diverse and independent press"? In case they hadn't noticed, the Internet is rather full of "quality and varied information, education and entertainment on the many platforms", most of it quite independent of anything so dull as a newspaper. As many others have pointed out, quality journalism is quite separate from old-style press empires, even if the latter have managed to produce the former from time to time.

Then there's this:

We continue to attract ever greater audiences for our content but, unlike in the print or TV business models, we are not the ones making the money out of our content. This is unsustainable.

Well, at least they got the last bit. But if they are attracting "ever greater audiences" for their content, but are not making money, does this not suggest that they are doing something fundamentally wrong? In a former incarnation, I too was a publisher. When things went badly, I did not immediately call for new laws: I tried again with something different. How about if newspaper publishers did the same?

This kind of self-pitying bleating would be extraordinary enough were it coming out of a vacuum; but given the decade of exemplary failure by the music industry taking *exactly* the same approach, it suggests a wilful refusal to look reality in the face that is quite extraordinary.

Speaking personally, the sooner all supporters of the Humbug Declaration are simply omitted from every search engine on Earth, the better: I'm sure we won't miss them, but they sure will miss the Internet...

Follow me @glynmoody on Twitter or identi.ca.

I Fear Microsoft Geeks Bearing Gifts...

Look, those nice people at Microsoft Research are saving science from its data deluge:

Addressing an audience of prominent academic researchers today at the 10th annual Microsoft Research Faculty Summit, Microsoft External Research Corporate Vice President Tony Hey announced that Microsoft Corp. has developed new software tools with the potential to transform the way much scientific research is done. Project Trident: A Scientific Workflow Workbench allows scientists to easily work with large volumes of data, and the specialized new programs Dryad and DryadLINQ facilitate the use of high-performance computing.

Created as part of the company’s ongoing efforts to advance the state of the art in science and help address world-scale challenges, the new tools are designed to make it easier for scientists to ingest and make sense of data, get answers to questions at a rate not previously possible, and ultimately accelerate the pace of achieving critical breakthrough discoveries. Scientists in data-intensive fields such as oceanography, astronomy, environmental science and medical research can now use these tools to manage, integrate and visualize volumes of information. The tools are available as no-cost downloads to academic researchers and scientists at http://research.microsoft.com/en-us/collaboration/tools.

Aw, shucks, isn't that just *so* kind? Doing all this out of the goodness of their hearts? Or maybe not:

Project Trident was developed by Microsoft Research’s External Research Division specifically to support the scientific community. Project Trident is implemented on top of Microsoft’s Windows Workflow Foundation, using the existing functionality of a commercial workflow engine based on Microsoft SQL Server and Windows HPC Server cluster technologies. DryadLINQ is a combination of the Dryad infrastructure for running parallel systems, developed in the Microsoft Research Silicon Valley lab, and the Language-Integrated Query (LINQ) extensions to the C# programming language.

So basically Project Trident is more Project Trojan Horse - an attempt to get Microsoft HPC Server cluster technologies into the scientific community without anyone noticing. And why might Microsoft be so keen to do that? Maybe something to do with the fact that Windows currently runs on just 1% of the top 500 supercomputing sites, while GNU/Linux has a share of over 88%.

Microsoft's approach here can be summed up as: accept our free dog biscuit, and be lumbered with a dog.

Follow me @glynmoody on Twitter or identi.ca.

Batik-Makers Say "Tidak" to Copyright

Yesterday I was talking about how patents are used to propagate Western ideas and power; here's a complementary story about local artists understanding that copyright just ain't right for them:

Joko, speaking at this year’s Solo Batik Fashion Festival over the weekend, said that the ancient royal city was one of the principal batik cities in Indonesia, with no fewer than 500 unique motifs created here that are not found in any other region. The inventory process, however, was hampered by the reluctance of the batik makers to claim ownership over pieces.

The head of the Solo trade and industry office, Joko Pangarso, said copyright registration work had begun last year, but was constantly held up when it was found a particular batik only had a motif name because the creator declined to attach their own.

“So far only 10 motifs have been successfully included in the list,” he said. “The creators acknowledged their creations but asked for minimal exposure.”

Interestingly, this is very close to the situation for software. The batik motifs correspond to sub-routines: both are part of the commons that everyone draws upon; copyrighting those patterns is as counter-productive as patenting subroutines, since it makes further creation almost impossible without "infringement". This reduces the overall creativity - precisely the opposite effect that intellectual monopolists claim. (Via Boing Boing.)

Follow me @glynmoody on Twitter or identi.ca.

13 July 2009

National Portrait Gallery: Nuts

This is so wrong:

Below is a letter I received from legal representatives of the National Portrait Gallery, London, on Friday, July 10, regarding images of public domain paintings in Category:National Portrait Gallery, London and threatening direct legal action under UK law. The letter is reproduced here to enable public discourse on the issue. For a list of sites discussing this event see User:Dcoetzee/NPG legal threat/Coverage. I am consulting legal representation and have not yet taken action.

Look, NPG, your job is to get people to look at your pix. Here's some news: unless they're in London, they can't do that. Put those pix online, and (a) they get to see the pix and (b) when they're in London, they're more likely to come and visit, no?

So you should be *encouraging* people to upload your pix to places like Wikipedia; you should be thanking them. The fact that you are threatening them with legal action shows that you don't have even an inkling of what you are employed to do.

Remind me not to pay the part of my UK taxes that goes towards your salary....

Are Patents Intellectual Monopolies? You Decide

Talking of intellectual monopolies, you may wonder why I use this term (well, if you've been reading this blog for long, you probably don't). But in any case, here's an excellent exposition as to why, yes, patents are indeed monopolies:

On occasion you get some defender of patents who is upset when we use the m-word to describe these artificial state-granted monopoly rights. For example here one Dale Halling, a patent attorney (surprise!) posts about "The Myth that Patents are a Monopoly" and writes, " People who suggest a patent is a monopoly are not being intellectually honest and perpetuating a myth to advance a political agenda."

Well, let's see.

Indeed, do read the rest of yet another great post from the Against Monopoly site.

Follow me @glynmoody on Twitter or identi.ca.

What Are Intellectual Monopolies For?

If you still doubted that intellectual monopolies are in part a neo-colonialist plot to ensure the continuing dominance of Western nations, you could read this utterly extraordinary post, which begins:

The fourteenth session of the WIPO Intergovernmental Committee on Genetic Resources, Traditional Knowledge and Folklore (IGC), convened in Geneva from June 29, 2009 to July 3, 2009, collapsed at the 11th hour on Friday evening as the culmination of nine years of work over fourteen sessions resulted in the following language; “[t]he Committee did not reach a decision on this agenda item” on future work. The WIPO General Assembly (September 2009) will have to untangle the intractable Gordian knot regarding the future direction of the Committee.

At the heart of the discussion lay a proposal by the African Group which called for the IGC to submit a text to the 2011 General Assembly containing “a/(n) international legally binding instrument/instruments” to protect traditional cultural expressions (folklore), traditional knowledge and genetic resources. Inextricably linked to the legally binding instruments were the African Group’s demands for “text-based negotiations” with clear “timeframes” for the proposed program of work. This proposal garnered broad support among a group of developing countries including Malaysia, Thailand, Fiji, Bolivia, Brazil, Ecuador, Philippines, Sri Lanka, Cuba, Yemen India, Peru, Guatemala, China, Nepal and Azerbaijan. Indonesia, Iran and Pakistan co-sponsored the African Group proposal.

The European Union, South Korea and the United States could not accept the two principles of “text-based negotiations” and “internationally legally binding instruments”.

Australia, Canada and New Zealand accepted the idea of “text-based negotiations” but had reservations about “legally binding instruments” granting sui generis protection for genetic resources, traditional knowledge and folklore.

We can't possibly have developing countries protecting their traditional medicine and national lore - "genetic resources, traditional knowledge and folklore" - from being taken and patented by the Western world. After all, companies in the latter have an inalienable right to turn a profit by licensing that same traditional knowledge back to the countries it was stolen from (this has already happened). That's what intellectual monopolies are for.

Follow me @glynmoody on Twitter or identi.ca.

10 July 2009

This Could Save Many Lives: Let's Patent It

Bill Gates is amazing; just look at this brilliant idea he's come up with:

using large fleets of vessels to suppress hurricanes through various methods of mixing warm water from the surface of the ocean with colder water at greater depths. The idea is to decrease the surface temperature, reducing or eliminating the heat-driven condensation that fuels the giant storms.

Against the background of climate change and increased heating of the ocean's surface in areas where hurricanes emerge, just imagine how many lives this could save - a real boon for mankind. Fantastic.

Just one problemette: he's decided to patent the idea, along with his clever old chum Nathan Myhrvold.

The filings were made by Searete LLC, an entity tied to Intellectual Ventures, the Bellevue-based patent and invention house run by Nathan Myhrvold, the former Microsoft chief technology officer. Myhrvold and several others are listed along with Gates as inventors.

After all, can't have people just going out there and saving thousands of lives without paying for the privilege, can we?

Follow me @glynmoody on Twitter or identi.ca.

Do We Need Open Access Journals?

One of the key forerunners of the open access idea was arxiv.org, set up by Paul Ginsparg. Here's what I wrote a few years back about that event:

At the beginning of the 1990s, Ginsparg wanted a quick and dirty solution to the problem of putting high-energy physics preprints (early versions of papers) online. As it turns out, he set up what became the arXiv.org preprint repository on 16 August, 1991 – nine days before Linus made his fateful “I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones” posting. But Ginsparg's links with the free software world go back much further.

Ginsparg was already familiar with the GNU manifesto in 1985, and, through his brother, an MIT undergraduate, even knew of Stallman in the 1970s. Although arXiv.org only switched to GNU/Linux in 1997, it has been using Perl since 1994, and Apache since it came into existence. One of Apache's founders, Rob Hartill, worked for Ginsparg at the Los Alamos National Laboratory, where arXiv.org was first set up (as an FTP/email server at xxx.lanl.org). Other open source programs crucial to arXiv.org include TeX, GhostScript and MySQL.

arxiv.org was and is a huge success, and that paved the way for what became the open access movement. But here's an interesting paper - hosted on arxiv.org:

Contemporary scholarly discourse follows many alternative routes in addition to the three-century old tradition of publication in peer-reviewed journals. The field of High-Energy Physics (HEP) has explored alternative communication strategies for decades, initially via the mass mailing of paper copies of preliminary manuscripts, then via the inception of the first online repositories and digital libraries.

This field is uniquely placed to answer recurrent questions raised by the current trends in scholarly communication: is there an advantage for scientists to make their work available through repositories, often in preliminary form? Is there an advantage to publishing in Open Access journals? Do scientists still read journals or do they use digital repositories?

The analysis of citation data demonstrates that free and immediate online dissemination of preprints creates an immense citation advantage in HEP, whereas publication in Open Access journals presents no discernible advantage. In addition, the analysis of clickstreams in the leading digital library of the field shows that HEP scientists seldom read journals, preferring preprints instead.

Here are the article's conclusions:

Scholarly communication is at a cross road of new technologies and publishing models. The analysis of almost two decades of use of preprints and repositories in the HEP community provides unique evidence to inform the Open Access debate, through four main findings:

1. Submission of articles to an Open Access subject repository, arXiv, yields a citation advantage of a factor five.

2. The citation advantage of articles appearing in a repository is connected to their dissemination prior to publication, 20% of citations of HEP articles over a two-year period occur before publication.

3. There is no discernable citation advantage added by publishing articles in “gold” Open Access journals.

4. HEP scientists are between four and eight times more likely to download an article in its preprint form from arXiv rather than its final published version on a journal web site.

On the one hand, it would be ironic if the very field that acted as a midwife to open access journals should also be the one that begins to undermine it through a move to repository-based open publishing of preprints. On the other, it doesn't really matter: what's important is open access to the papers. Whether these appear in preprint form or as fully-fledged articles in peer-reviewed open access journals is a detail, for the users at least; it's more of a challenge for publishers, of course... (Via @JuliuzBeezer.)

Follow me @glynmoody on Twitter or identi.ca.

08 July 2009

Not Kissing the Rod, Oh My Word, No

Becta today [6 July 2009] welcomes Microsoft's launch of the new Subscription Enrolment Schools Pilot (SESP) for UK schools, which provides greater flexibility and choice for schools who wish to use a Microsoft subscription agreement.

Great, and what might that mean, exactly?

The new licensing scheme removes the requirement that schools using subscription agreements pay Microsoft to licence systems that are using their competitor's technologies. So for the first time schools using Microsoft's subscription licensing agreements can decide for themselves how much of their ICT estate to licence.

So BECTA is celebrating the fact that schools - that is, we taxpayers - *no longer* have to "pay Microsoft to licence systems that are using their competitor's technologies"? They can now use GNU/Linux, for example, *without* having to pay Microsoft for the privilege?

O frabjous day! Callooh! Callay!

Follow me @glynmoody on Twitter or identi.ca.

Policing the Function Creep...

Remember how the poor darlings in the UK government absolutely *had to* allow interception of all our online activities so that those plucky PC Plods could maintain their current stunning success rate in their Whirr on Terruh and stuff like that? Well, it seems that things have changed somewhat:

Detectives will be required to consider accessing telephone and internet records during every investigation under new plans to increase police use of communications data.

The policy is likely to significantly increase the number of requests for data received by ISPs and telephone operators.

Just as every investigation currently has to include a strategy to make use of its subjects' financial records, soon CID officers will be trained to always draw up a plan to probe their communications.

The plans have been developed by senior officers in anticipation of the implementation of the Interception Modernisation Programme (IMP), the government's multibillion pound scheme to massively increase surveillance of the internet by storing details of who contacts whom online.

Er, come again? "CID officers will be trained to always draw up a plan to probe their communications"? How does that square with this being a special tool for those exceptional cases when those scary terrorists and real hard naughty criminals are using tricky high-tech stuff like email? Doesn't it imply that we are all terrorist suspects and hard 'uns now?

Police moves to prepare for the glut of newly accessible data were revealed today by Deputy Assistant Commissioner Janet Williams. She predicted always considering communications data will lead to a 20 per cent increase in the productivity of CID teams.

She told The Register IMP had "informed thinking" about use of communications data, but denied the plans gave the lie to the government line that massively increased data retention will "maintain capability" of law enforcement to investigate crime.

Well, Mandy Rice-Davies applies, m'lud...

Follow me @glynmoody on Twitter or identi.ca.

07 July 2009

Are Microsoft's Promises For Ever?

This sounds good:

I have some good news to announce: Microsoft will be applying the Community Promise to the ECMA 334 and ECMA 335 specs.

ECMA 334 specifies the form and establishes the interpretation of programs written in the C# programming language, while the ECMA 335 standard defines the Common Language Infrastructure (CLI) in which applications written in multiple high-level languages can be executed in different system environments without the need to rewrite those applications to take into consideration the unique characteristics of those environments.

"The Community Promise is an excellent vehicle and, in this situation, ensures the best balance of interoperability and flexibility for developers," Scott Guthrie, the Corporate Vice President for the .Net Developer Platform, told me July 6.

It is important to note that, under the Community Promise, anyone can freely implement these specifications with their technology, code, and solutions.

You do not need to sign a license agreement, or otherwise communicate to Microsoft how you will implement the specifications.

The Promise applies to developers, distributors, and users of Covered Implementations without regard to the development model that created the implementations, the type of copyright licenses under which it is distributed, or the associated business model.

Under the Community Promise, Microsoft provides assurance that it will not assert its Necessary Claims against anyone who makes, uses, sells, offers for sale, imports, or distributes any Covered Implementation under any type of development or distribution model, including open-source licensing models such as the LGPL or GPL.

But boring old sceptic that I am, I have memories of this:

The Software Freedom Law Center (SFLC), provider of pro-bono legal services to protect and advance free and open source software, today published a paper that considers the legal implications of Microsoft's Open Specification Promise (OSP) and explains why it should not be relied upon by developers concerned about patent risk.

SFLC published the paper in response to questions from its clients and the community about the OSP and its compatibility with the GNU General Public License (GPL). The paper says that the promise should not be relied upon because of Microsoft's ability to revoke the promise for future versions of specifications, the promise's limited scope, and its incompatibility with free software licenses, including the GPL.

That was then, of course, what about now? Well, here's what the FAQ says on the subject:

Q: Does this CP apply to all versions of the specification, including future revisions?

A: The Community Promise applies to all existing versions of the specifications designated on the public list posted at /interop/cp/, unless otherwise noted with respect to a particular specification.

Now, is it just me, or does Microsoft conspicuously fail to answer its own question? The question was: does it apply to all versions *including* future revisions? And Microsoft's answer is about *existing* versions: so doesn't that mean it could simply not apply the promise to a future version? Isn't this the same problem as with the Open Specification Promise? Just asking.

03 July 2009

The Engine of Scientific Progress: Sharing

Here's a post saying pretty much what I've been saying, but in a rather different way:

Here we present a simple model of one of the most basic uses of results, namely as the engine of scientific progress. Research results are more than just accumulated knowledge. Research results make possible new questions, which in turn lead to even more knowledge. The resulting pattern of exponential growth in knowledge is called an issue tree. It shows how individual results can have a value far beyond themselves, because they are shared and lead to research by others.
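
The "issue tree" idea is easy to make concrete: if every shared result opens some number of new questions, and each of those can yield a further result, accumulated knowledge grows geometrically. Here's a toy sketch of that arithmetic - the branching factor is an invented illustrative parameter, not something taken from the paper:

```python
# Toy "issue tree": each result opens `branching` new questions, each of
# which can yield a further result. After g generations the tree holds
# 1 + b + b^2 + ... + b^g results -- exponential growth whenever b > 1.
# The branching factor is a made-up parameter, purely for illustration.

def issue_tree_size(branching: int, generations: int) -> int:
    """Total results accumulated after `generations` rounds of follow-up work."""
    return sum(branching ** g for g in range(generations + 1))

if __name__ == "__main__":
    # Shared results (b = 2) versus hoarded results, where only one
    # line of inquiry continues (b = 1, i.e. merely linear growth):
    for g in range(6):
        print(g, issue_tree_size(2, g), issue_tree_size(1, g))
```

Even a modest branching factor makes the point: sharing turns each result into a multiplier rather than a dead end.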

Follow me @glynmoody on Twitter or identi.ca.

02 July 2009

Patents Don't Promote Innovation: Study

It's extraordinary how the myth that patents somehow promote innovation is still propagated and widely accepted; and yet there is practically *no* empirical evidence that it's true. All the studies that have looked at this area rigorously come to quite a different conclusion. Here's yet another nail in that coffin, using a very novel approach:

Patent systems are often justified by an assumption that innovation will be spurred by the prospect of patent protection, leading to the accrual of greater societal benefits than would be possible under non-patent systems. However, little empirical evidence exists to support this assumption. One way to test the hypothesis that a patent system promotes innovation is to simulate the behavior of inventors and competitors experimentally under conditions approximating patent and non-patent systems.

Employing a multi-user interactive simulation of patent and non-patent (commons and open source) systems ("PatentSim"), this study compares rates of innovation, productivity, and societal utility. PatentSim uses an abstracted and cumulative model of the invention process, a database of potential innovations, an interactive interface that allows users to invent, patent, or open source these innovations, and a network over which users may interact with one another to license, assign, buy, infringe, and enforce patents.

Data generated thus far using PatentSim suggest that a system combining patent and open source protection for inventions (that is, similar to modern patent systems) generates significantly lower rates of innovation (p<0.05), productivity (p<0.001), and societal utility (p<0.002) than does a commons system. These data also indicate that there is no statistical difference in innovation, productivity, or societal utility between a pure patent system and a system combining patent and open source protection.

The results of this study are inconsistent with the orthodox justification for patent systems. However, they do accord well with evidence from the increasingly important field of user and open innovation. Simulation games of the patent system could even provide a more effective means of fulfilling the Constitutional mandate ― "to promote the Progress of . . . useful Art" than does the orthodox assumption that technological innovation can be encouraged through the prospect of patent protection.
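
PatentSim itself is an interactive multi-user game, so nothing here reproduces it; but the intuition behind its result - that follow-on invention thrives on cheaply accessible prior art - can be caricatured in a few lines. Everything below (the success probabilities, the licence-cost parameter, the cap) is invented for illustration and is emphatically not the study's model:

```python
# Hypothetical back-of-envelope sketch, NOT the PatentSim model.
# Assumption: an inventor's chance of a successful follow-on invention
# rises with the amount of prior art they can afford to build on; a
# licence cost (0 = commons, approaching 1 = heavily patented) shrinks
# that accessible pool. All constants are invented for illustration.

def expected_innovation(rounds: int, agents: int, licence_cost: float) -> float:
    """Expected total innovations over `rounds`, with `agents` inventors
    each building on a shared pool of prior art."""
    pool = 1.0        # accessible prior art, seeded with one result
    total = 0.0
    for _ in range(rounds):
        # Success probability grows with affordable prior art, capped at 0.9.
        p = min(0.9, 0.1 + 0.01 * pool * (1 - licence_cost))
        new = agents * p          # expected new inventions this round
        total += new
        pool += new               # successes feed back into the pool
    return total

if __name__ == "__main__":
    for cost in (0.0, 0.5, 0.9):
        print(f"licence cost {cost}: {expected_innovation(20, 5, cost):.1f}")
```

The design choice is the feedback loop: because each success enlarges the pool the next round draws on, even a small licensing friction compounds into a visibly lower total - which is the qualitative shape of the study's finding, if not its method.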

When will people get the message and start sharing for mutual benefit?

Follow me @glynmoody on Twitter or identi.ca.

01 July 2009

Help Me Go Mano a Mano with Microsoft

Next week, I'm taking part in a debate with a Microsoft representative about the passage of the OOXML file format through the ISO process last year. Since said Microsoftie can draw on the not inconsiderable resources of his organisation to provide him with a little back-up, I thought I'd try to even the odds by putting out a call for help to the unmatched resource that is the Linux Journal community. Here's the background to the meeting, and the kind of info I hope people might be able to provide....

On Linux Journal.