23 February 2009

Dell *Does* Deliver (with Netbooks)

There's been a lot of sound and fury flying around about the split between GNU/Linux and Windows XP sales on netbooks, and what that means for the larger desktop sector. Some have used low figures for the former to suggest that GNU/Linux *still* stands no chance with the general public. But maybe what we need are more datapoints - ones like this, perhaps:

While MSI told us a few months back that Wind netbooks running SuSE Linux saw 4x higher return rates than that of XP machines, Dell has had quite the opposite experience with its Inspiron Mini 9 offering with Ubuntu. “A third of our Mini 9 mix is Linux, which is well above the standard attach rate for other systems that offer Linux. We have done a very good job explaining to folks what Linux is,” says Dell’s Jay Pinkert.

Dell attributes part of the Linux growth to competitive pricing on the Ubuntu SKUs. “When you look at the sweet spot for this category it is price sensitivity, and Linux enabled us to offer a lower price entry point,” added Dell senior product manager John New.

The key point here is that the manufacturer must make it clear what the customer is getting for the super-low price. Kudos to Dell that they seem to have managed that.

Oh, and could we please have less whining from other netbook manufacturers about their GNU/Linux sales? It might well be their *own* fault, not that of free software...

Ubuntu is So Last Year: Here's Kongoni

Well, a groovy African name worked for Ubuntu, so maybe it will for Kongoni:

Named after the Shona word for the GNU, Kongoni has a strong BSD-Unix influence and includes a ports-like package management system. The underlying code is, however, based on Slackware and the makers are promising to keep the distribution free of proprietary software.

Interestingly:

Technically, says Venter, Kongoni adopts a BSD ports-like approach to package management. “Ports represent a powerful way to distribute software as a set of tools that automatically fetch the sources of the program and then compile it locally,” he says. “This is more bandwidth friendly for users as source code is usually smaller than prebuilt packages. This benefit is particularly useful in Africa where bandwidth is expensive, and since Kongoni came from Africa this was a major concern.”

...

The core system includes a KDE 4.2 desktop as the default desktop manager, but the system is intended to be easy to remaster, says Venter. Users can easily build and replicate the system with their own preferred setups and desktops.
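The ports approach Venter describes boils down to a recipe that fetches sources, verifies them, and compiles them on the user's own machine. Here is a minimal Python sketch of that workflow; the recipe format, function names, and checksum scheme are all hypothetical illustrations, not Kongoni's actual tooling:

```python
# Hypothetical sketch of what a ports-style recipe automates: fetch a small
# source tarball rather than a large prebuilt binary, verify it against the
# recipe's checksum, then configure/compile/install locally.
import hashlib
import subprocess
import urllib.request

def verify_source(data: bytes, expected_sha256: str) -> bool:
    """Check the fetched sources against the checksum recorded in the recipe."""
    return hashlib.sha256(data).hexdigest() == expected_sha256

def build_port(name: str, url: str, sha256: str, prefix: str = "/usr/local") -> None:
    """Fetch, verify, and compile a port on the user's own machine."""
    tarball = f"{name}.tar.gz"
    urllib.request.urlretrieve(url, tarball)  # sources: bandwidth-friendly
    with open(tarball, "rb") as f:
        if not verify_source(f.read(), sha256):
            raise ValueError(f"checksum mismatch for {name}")
    subprocess.run(["tar", "xzf", tarball], check=True)
    # the classic local build: configure, compile, install
    for step in (["./configure", f"--prefix={prefix}"], ["make"], ["make", "install"]):
        subprocess.run(step, cwd=name, check=True)
```

The bandwidth point in the quote is exactly why the heavy lifting (compilation) happens locally: only the compact source archive crosses the wire.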

EU's Free Software Education Programme

Excellent news out of Europe:

A Consortium formed by three universities and led by the Free Knowledge Institute (FKI) has received the support from the EC's Lifelong Learning Programme to offer an international educational programme on Free Software. Following the Open Educational Resources movement, all learning materials will be freely available through the Internet. The use of Free Software (also referred to as Open Source software or Libre Software) is expanding rapidly in governmental and private organisations. However, still only a limited number of IT professionals, teachers and decision makers have sufficient knowledge and expertise in these new fields. In order to cover this gap, the Free Knowledge Institute and three European universities have founded the Free Technology Academy. The first course materials will be available after this summer.

I'd rather forgotten about the Free Knowledge Institute. It's a spin-off of the Internet Society Netherlands, and apparently:

a non-profit organisation that fosters the free exchange of knowledge in all areas of society. Inspired by the Free Software movement, the FKI promotes freedom of use, modification, copying and distribution of knowledge in four different but highly related fields: education, technology, culture and science.

(Via Heise.)

Microsoft's "Enervate America" Programme

You have to hand it to Microsoft; they certainly know how to scavenge off dead and dying bodies:

Microsoft Corp. today announced a new initiative, Elevate America, which will provide up to 2 million people over the next three years with the technology training needed to succeed in the 21st-century economy.

...

“Millions of Americans don’t have the technology skills needed in today’s economy. Through Elevate America, we want to help workers get the skills they need to succeed,” Passman said. “We are also providing a full range of work force development resources for state and local governments so they can offer specialized training for their workers.”

And if you were wondering what "specialized training" meant:

A new online resource, located at http://www.microsoft.com/ElevateAmerica, is available today. This new Web site helps individuals understand what types of technical skills they need for the jobs and entrepreneurial opportunities of today and tomorrow, and resources to help acquire these skills. The Web site provides access to several Microsoft online training programs, including how to use the Internet, send e-mail and create a résumé, as well as more advanced programs on using specific Microsoft applications.

Bet there's not much Firefox, Thunderbird or OpenOffice.org in *there*.

It's a clever ploy, because it means that the state and local governments in the US don't have to pay to re-train workers made redundant. For Microsoft, of course, it's a brilliant way to turn people desperate to improve their financial situation into vectors of its threatened techno-orthodoxy.

Sadly, the net result is that people are being trained how to use 20th-century technology for that 21st-century economy - more "Enervate America" than "Elevate America". For a telling contrast, just think of all those millions of young Brazilians who are growing up with a real understanding of what computers are about and for....

Nudging the NUJ Towards Bethlehem 2.0

Non-journos probably should avert their glances, but there is a cracking set of comments on this wonderful story from Adam Tinworth, blogger-in-chief at my old employer RBI, which concludes:

Nice to know that my union people associated with my union (self correcting in the interests of fairness), which I have been a member of for the last 15 years think that the journalistic field in which I work - blogging - is "effing blogs".

Like many of the people commenting, I too was once a member of the NUJ; I'm rather glad that I never went back judging by the extraordinary responses from a representative of that organisation to the original post (do read the whole thread if you can, it's highly entertaining.)

It contrasts nicely with the deft way that Anthony Gold responded to my critical article over on Open Enterprise, where he immediately offered to talk with me about the issues I raised (which turned into this interview). I emerged with enhanced respect for the man and the organisation he heads, since he did exactly what the NUJ representative did not: he addressed the issues I raised in a non-confrontational way.

Patently Not the Case

Professionals who work in the field of intellectual monopolies have a problem. Most of them are quite able to see there are serious problems with the system, but since their entire career has been built on it, they can hardly trash the whole thing. Instead, they not unreasonably try to come up with a "reasonable" compromise. Here's a good example:

how about a world in which there are fewer patents, but better ones — in the sense that they are more carefully examined, forced to comply more strictly both with legal criteria and market reality? That stricter compliance would make them more likely to be valid, in a world in which software inventions are under constant threat of being deemed retrospectively obvious, and useful to read.

But it's based on a false premise - that we actually *need* patents for business reasons:

Others praise patents, even for software applications. The patent protects investment that would not be directed at software development if that protection did not exist. The patent specification opens up inventions for everyone to read, thus enriching the state of the art and saving developers the need to reinvent the programmer's equivalent of the wheel.

This is patently untrue for the software world, which happily invested in software development for decades before software patent madness took over. Microsoft is actually a good example of a company that succeeded without feeling the need to try to patent its software, in the early days at least. And Bill Gates famously noted that the success of his company would actually have been impossible had software patents existed and been held by rival companies.

The second point is also manifestly untrue. Software patents are almost always totally generic - they do not "open up inventions for everyone to read": instead, they are written as opaquely as possible in the hope that they can be bent in court to apply to the most extreme situations. They do not "enrich the state of the art", because the whole purpose of gaining patents is to stop anyone using them for 20 years, by which time, technology has moved on so far that any content they originally had has been superseded.

There are simply *no* good reasons for software patents, and hence no justification for halfway houses, however reasonably framed, and however intelligent and reasonable the framer.

London Open Source Writers Meetup

As an experiment, @codepope and I have arranged to meet up as a possible kernel of open-sourcey writerish activity in London tonight. Anyone who writes - as a journalist, blogger, tweeter - and can make it is welcome, although I'm afraid PR on its own doesn't count (fine, if you do both).

We're aiming to meet for 6pm in one of the cafes at the Royal Festival Hall (allegedly furnished with Wifi), but first we'll hang around the entrance facing the Thames, on the main level. Any questions, contact me (glyn.moody@gmail or @glynmoody, or @codepope) before then.

See you there. Maybe.

Medvedev Confirms Free Software Support

Here's confirmation from the top that Russia is pushing ahead with its plans for introducing free software not just into its schools, but the entire domestic market:

Президент РФ так обозначил свою позицию по свободному ПО: «Ещё одна тема — это информационные технологии в социальной сфере. Сейчас нужно начинать массовое обучение школьных учителей новым технологиям. Мы, собственно, пытались это делать в рамках национального проекта. Наверное, кое-что удалось, но пока это только самое начало. Надо подумать и о том, чтобы двинуться дальше — к использованию отечественного свободного программного обеспечения. Я этой темой занимался, результаты у нас есть, мы подготовили уже свои программы, которые позволяют создать, по сути, продукт абсолютно качественный, на основе свободного программного обеспечения, но привязанный уже к нашим реалиям».

[Translation: The President of the Russian Federation set out his position on free software as follows: «One more topic is information technology in the social sphere. We now need to begin mass training of school teachers in the new technologies. We did, in fact, try to do this as part of a national project. Some things probably succeeded, but so far this is only the very beginning. We must also think about moving further, towards the use of domestic free software. I have worked on this topic, and we have results: we have already prepared our own programs that make it possible to create what is, in essence, a product of absolutely high quality, based on free software, but tied to our own realities».]

This is also worth noting:

Стоит также напомнить: недавно появлялось сообщение о том, что бюджет, выделенный в 2009 году для оснащения российских школ свободным ПО, оказался примерно втрое меньше ожидаемого (180-250 млн рублей против предполагаемых 650 млн).

[Translation: It is also worth recalling that it was recently reported that the budget allocated in 2009 for equipping Russian schools with free software turned out to be roughly a third of what was expected (180-250 million rubles against an anticipated 650 million).]

What that means in practice is that there is less money, and so more incentive to use free software. But the bigger news is that Medvedev has confirmed the wider roll-out to the general domestic Russian market.

22 February 2009

Variations on an Open Source Theme

One of the most extraordinary - and under-recognised - developments in free software is the blossoming of specialist software applications.

Once, a common jibe was that the only programs available were for hackers. This made the appearance of the GIMP, an ambitious image manipulation program aiming to rival Photoshop, such an important milestone.

Since then, of course, more and more programs have appeared for the most amazingly specialist areas. Here's another one:

Indiana University today announces the release of open source software to create a digital music library system. The software, called Variations, provides online access to streaming audio and scanned score images in support of teaching, learning, and research.

Variations enables institutions such as college and university libraries and music schools to digitize audio and score materials from their own collections, provide those materials to their students and faculty in an interactive online environment, and respect intellectual property rights.

...

This open source release of Variations complements IU’s earlier release of the open source Variations Audio Timeliner, which lets users identify relationships in passages of music, annotate their findings, and play back the results with simple point-and-click navigation. This tool is also included as a feature of the complete Variations system.

(Via DigitalKoans.)

Crowdsourcing an Astronomy Commons

This is a fab use of pooled images combined with automation:

Flickr hosts a wide range of beautiful images, but a new project built on top of Flickr's API only focuses on photos of the night sky from amateur astronomers. The Astrometry.net project constantly scans the Astrometry Flickr group for new images to catalog and to add to its open-source sky survey. At the same time, this project also provides a more direct service to the amateur astronomers, as it also analyzes each image and returns a high-quality description of the photo's contents.

The Astrometry group currently has over 400 members, and as Christopher Stumm, a member of the Astrometry.net team, told the Flickr Code blog, the back-end software uses geometric hashing to exactly pinpoint and describe the objects in the images. When you submit an image to the Flickr pool, the robot will not just respond with a comment that contains an exact description of what you see in the image, but it will also annotate the image automatically.
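The geometric hashing mentioned above works because a small group of stars can be turned into a code that doesn't change when the photo is shifted, rotated, or zoomed. Here is a simplified Python sketch of that invariance idea; the real Astrometry.net quad codes use a more elaborate coordinate frame, symmetry-breaking, and a pre-built index for lookup:

```python
# Simplified geometric hash for four star positions: the code is invariant
# under translation, rotation, and uniform scaling of the whole image, so
# the same stars produce the same hash in any amateur photo of that sky patch.
import itertools

def quad_hash(points):
    pts = [complex(x, y) for x, y in points]
    # use the two most widely separated stars to define the reference frame
    i, j = max(itertools.combinations(range(4), 2),
               key=lambda ij: abs(pts[ij[0]] - pts[ij[1]]))
    a, b = pts[i], pts[j]
    rest = [pts[k] for k in range(4) if k not in (i, j)]
    # map star a -> 0 and star b -> 1; where the other two stars land
    # in that frame becomes the hash code
    coords = sorted(((p - a) / (b - a) for p in rest),
                    key=lambda z: (z.real, z.imag))
    return tuple(round(v, 6) for z in coords for v in (z.real, z.imag))
```

Looking the code up in an index of known sky quads is then an ordinary database query, which is how the robot can identify what an arbitrary Flickr photo shows.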

What I'd like to see is something similar for terrestrial images, to build up a huge mosaic of everything, everywhere.

18 February 2009

Radio Opendotdotdot...

...is off the airwaves for a couple of days. Back soon.

17 February 2009

Crowdsourcing the Heavens

Sounds sensible:

The Galaxy Zoo files contain almost a quarter of a million galaxies which have been imaged with a camera attached to a robotic telescope (the Sloan Digital Sky Survey, no less). In order to understand how these galaxies — and our own — formed, we need your help to classify them according to their shapes — a task at which your brain is better than even the fastest computer.

More than 150,000 people have taken part in Galaxy Zoo so far, producing a wealth of valuable data and sending telescopes on Earth and in space chasing after their discoveries. Zoo 2 focuses on the nearest, brightest and most beautiful galaxies, so to begin exploring the Universe, click the ‘How To Take Part’ link above, or read ‘The Story So Far’ to find out what Galaxy Zoo has achieved to date.

It's ironic that the more data we produce, the more we need people to process it. And long may that be so.

The Kids Are Spot-on

Interesting figures from new research:

Marrakesh Records and Human Capital surveyed 1,000 15 to 24-year-olds highlighting not just how important music is to young people, but their changing attitudes to paying for content. 70 percent said they don't feel guilty for illegally downloading music from the internet. 61 percent feel they shouldn't have to pay for music. And around 43 percent of the music owned by this age group has not been paid for, increasing to 49 percent for the younger half of the group.

But the battle to get them to pay for music has not been lost entirely:

This age group felt £6.58 is a fair price for a CD album, but that a downloaded album should be just £3.91 and a single 39p - almost half the price charged by Apple's iTunes Store.

Clearly, if the music industry wants to stand any chance of retaining people's willingness to pay for content, it had better move its prices down to this level pretty sharply. If they don't, it's not hard to predict what will happen the next time they carry out this research.

Adobe and Nokia Fund Open Screen Project

The Open Screen Project was set up in May 2008:

Partners in the Open Screen Project are working together to provide a consistent runtime environment for open web browsing and standalone applications — taking advantage of Adobe Flash Player and, in the future, Adobe AIR. This consistent runtime environment will remove barriers to publishing content and applications across desktops, mobile phones, televisions, and other consumer electronics.

Now, Adobe's AIR ain't open source, so I'm a bit sceptical of the "open" bit in the name of Open Screen Project, but AIR does, at least, run on GNU/Linux. I've been using the AIR-based TweetDeck on Ubuntu, and memory leaks aside, it just works.

The Open Screen Project has received a wad of dosh:

At the GSMA Mobile World Congress, Adobe Systems Incorporated (Nasdaq:ADBE) and Nokia Corporation (NYSE: NOK) today announced a $10 million Open Screen Project fund designed to help developers create applications and services for mobile, desktop and consumer electronics devices using the Adobe Flash® Platform. The new fund is a result of the Open Screen Project, an industry-wide initiative of more than 20 industry leaders set to enable a consistent experience for web browsing and standalone applications. Additional Open Screen Project partners are expected to join the fund in the future.

Apparently, AIR projects are also eligible, which is something.

Now, if they could just open source AIR, as they will probably have to if they want to see off the threat from Microsoft's Silverlight...

Lack of Open Access to Geodata Costing Lives?

Here's another great example of why we need open access to public data:

The refusal of the government in Victoria, Australia, to provide data for Google's bushfire map mashup limited its scope and highlighted glaring problems with Crown copyright provisions, the search giant's top Australian engineer said yesterday.

...

The search giant's search for data to plot fires on public lands--which are managed by the Victorian Department of Sustainability and Environment--produced an entirely different result. With no public feed of the fires' location and an explicit denial of permission to access its own internal data, the engineers were ultimately unable to plot that data on the map as well.

It's not hard to imagine that such mashups, provided in a timely fashion, could have saved lives, either directly or indirectly. It's a perfect example of why governments have a duty to share such basic data as widely as possible.

Ubuntu Edges Further into the Data Centre

Everybody knows that Ubuntu is the most popular GNU/Linux distro for the desktop. Everybody knows that it has achieved that distinction by concentrating on that sector, unlike Red Hat, say, which is aiming at the corporate market. Everybody knows these things, and everybody is wrong. Because, very cunningly, Ubuntu is trying a tricky strategy: to insinuate itself into the highly-profitable corporate sector without losing its cachet as the user-friendly distro for newbies....

On Open Enterprise blog.

16 February 2009

BBC and Microsoft: Joined at the Hip?

Not another one?

Microsoft's UK online services group GM Sharon Baylay is becoming the BBC's director of marketing, comms and audiences, succeeding Tim Davie, who became audio and music director last year.

Why doesn't Microsoft just take over the BBC and be done with it?

Sketchory: Sharing CC Drawings

It's hard enough working out what collaboration might mean with words, but it's even harder with images. This probably explains why there aren't that many sites out there exploring the idea. Happily, here's one that's just opened its virtual doors, and it looks promising:

Drawings at Sketchory.com can be freely shared by keeping to this Creative Commons license (which includes commercial use but requires attribution, among other things) with the additional prerequisite that you don't share over 1000 sketches.

Below every sketch, you'll also find an embed code you can use. Please note we cannot promise to keep pics up forever, and may also remove certain images sometimes, or change images or image content (like the watermark).

What's really remarkable is the scale: there are currently *250,000* drawings on Sketchory. (Via Google Blogoscoped.)

Open Enterprise Interview: Brian Reale, Colosa CEO

Bolivia is not a country you might associate with free software, but one of the advantages of open source is that it can be created anywhere, drawing on the support of users around the world. Aside from Linus, one person who has proved that to be the case is Brian Reale. He's the founder of Colosa, an open source company based in Bolivia's capital, La Paz.

On Open Enterprise blog.

EU Puts "Three Strikes" on Ice

Here's a turn-up for the books:

The European Commission is set to put proposals to tackle online piracy on ice until the end of its current mandate, following heavy pressure from telecoms companies and consumer organisations alike, EurActiv has learned.

The EU executive had been expected to bring forward two initiatives in the first half of 2009, both of which could have forced a more restrictive EU-wide approach to free and illegal downloading.

The most anticipated measure was a follow-up to a Communication on online content, presented at the beginning of 2008, which hinted at restrictive measures to curb online piracy. Proposals included a mandate for Internet service providers (ISPs) "to suspend or cut access to the web for those who illegally file-share," the so-called three-step model proposed by France (EurActiv 10/12/07).

That's surprising, but what's really striking is the reason for this pause:

Brussels had planned to present actual proposals in the form of a recommendation in April. But now the plan has been frozen "after a radicalisation of the debate which has left no space for manoeuvre," a Commission official told EurActiv, referring to strong lobbying by the content industry (in particular music), supported mainly by France, in negotiations over the telecoms package.

"There will be no recommendation. The Commission will only later present issue papers," which may be used by the next Commission after it is sworn in at the end of 2009 or in 2010, explained Martin Selmayr, spokesman for Viviane Reding, the EU's information society commissioner.

This suggests the increasingly outrageous demands from the content industries have been their own undoing. Perhaps the era in which lobbyists can dictate legislation at will is finally coming to a close.

But we're not in the clear yet:

Consumers can rejoice too, although restrictive measures at national level are planned in many EU countries. Meanwhile, a new EU-wide attempt to regulate may be made during the current negotiations over the telecoms package, where the Council and the Parliament have the final say.

The fight goes on.

13 February 2009

UK Sticks with 70 Years for Music Copyright

Cold comfort, but the UK government is being more sensible than most others on the sound copyright extension:

David Lammy, the U.K. minister of state for intellectual property, has reaffirmed the British government's position on term extension by refusing to accept the European Parliament's legal affairs committee ruling on a 95-year copyright term for music recordings.

...

In a statement, Lammy effectively reiterated that support for a 70-year term for music recordings. The European ruling will ultimately be voted on by the Council of Ministers, in which Germany and France are supporters of the 95-year term.

So a certain amount of kudos is due. But not much.

Leak of Classified ACTA Dox Reveals Dissent

There's a battle going on for the soul of ACTA, and Knowledge Ecology International has a leaked document that spells it out:

Classified negotiating proposals for the Anti-Counterfeiting Trade Agreement (ACTA) obtained by Knowledge Ecology International and examined by Inside U.S. Trade reveal wrangling between Japan, the United States, European Union, Australia and Canada over issues of civil and criminal enforcement and how to apply border measures against infringing products.

The post contains the full details of what is known, but the following sections are of particular interest for EU citizens:

The section on empowering authorities to order infringers to provide information on other persons involved in their activities also appears in the Korea FTA and ACTA draft. In the document, the EU seeks to add language that would limit this provision so that it conforms with national laws such as those on personal data privacy.

...

In this section, the EU has sought a provision specifically designed to exclude non-commercial items in personal baggage, from the scope of the ACTA border measures. U.S. officials have said that the agreement would not lead to wholesale raids on laptops and iPods at airports, but the EU appears to be trying to make sure this is the case in this section.

If true, these are to the credit of the EU delegation, which is clearly trying to limit at least some of the most damaging aspects of ACTA. But other areas remain a concern:

The documents do not detail the subsection on Internet measures and these are known to be among the most controversial provisions.

Moreover:

Criminal trafficking in labels is defined as occurring even in the absence of willful piracy.

Which would seem to capture P2P sharing.

Although much remains shrouded in secrecy, it's good news that at least a little light is being shed on what is clearly a hugely important treaty. The fact that participants are still trying to negotiate it in secrecy so as to present a fait accompli is nothing short of scandalous.

How Openness Can Regulate the Real World

Yes, even the really messy bits:

Participatory regulation is arguably the best way to surface and defeat corruption in government and industry. I’ve highlighted a range of impressive efforts below. They range from Transparency International’s more top-down survey and index approach to the bottom-up Wikileaks site where anybody can post documents that uncover instances of corruption.

The post explores several examples: Transparency International’s Corruption Perceptions Index; The Kimberley Process (KP) - a joint government-industry-civil society initiative to stem the flow of conflict diamonds; and the Extractive Industries Transparency Initiative (EITI), which is "similar in intent to TI’s bribe payer’s index — it also aims to strengthen governance by improving transparency and accountability in the extractives sector" (apparently the "extractive industries" refer to mining, oil, gas and similar companies).

What's really noteworthy here is that openness is being used to make a difference not in airy-fairy realms of genteel, abstract concerns, but in some of the most brutal, real-world contexts imaginable. Who knows, it might even work for something as corrupt as the British political system.

Update: Simon Phipps has pointed out the new Stimulus Watch, which works on similar principles.

Open Proofs

The problem:

The world depends on having secure, accurate, and reliable software - but most software isn't. In some circumstances we need "high confidence" (aka "high assurance") software built on top of verified software. Verified software, in this context, is software that has been proved to have or not have some property using formal methods (formal methods apply mathematical techniques to prove properties of software). Yet developing verified software is currently very difficult to do, or improve on, because there are few fully-public examples of verified software. Verified software is often highly classified, sensitive, and/or proprietary. This lack of detailed examples impedes progress by software developers, tool developers, users, teachers, and even current practitioners.

Unlike a mathematical proof, software normally undergoes change due to changing conditions and needs. So just publishing unchangeable software, with an unchangeable proof, isn't enough. Instead, we need a number of "open proofs".

The solution:

"Open proofs" solve the problem by releasing implementation, proofs, and tools as FLOSS. With such rights, developers can build on the examples to build larger works, teachers and students can use the examples for learning and research, users can verify that the proof is valid, and tool suppliers can use real examples to improve tools. Both realistic examples (for building on and tool development) and small examples (for teaching) are needed.

Not all systems need to be revealed to the public, but we need public examples as "seed corn" to develop more verified software. To be high assurance, such software would need to come with some automated test suite, but that isn't a strict requirement to be an open proof.

Open proofs do not solve every possible problem, of course. For example: (1) the formal specification might be wrong or incomplete for its purpose; (2) the tools might be incorrect; (3) one or more assumptions might be wrong. But they would still be a big improvement from where we are today. Many formal method approaches have historically not scaled up to larger programs, but open proofs may help counter that by enabling tool developers to work with others.
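To make the idea concrete, here is a toy "open proof" in Lean 4 (an assumed choice of proof assistant; the text doesn't name one): a tiny function published together with a machine-checked proof of one of its properties, which anyone can re-verify, learn from, or build on:

```lean
-- A toy open proof: implementation and machine-checked property released
-- together as FLOSS, so others can re-verify, reuse, and extend them.
def myMax (a b : Nat) : Nat := if a ≤ b then b else a

-- Proof that myMax never returns something smaller than its first argument.
theorem myMax_ge_left (a b : Nat) : a ≤ myMax a b := by
  unfold myMax
  split
  · assumption           -- case a ≤ b: the goal is exactly the hypothesis
  · exact Nat.le_refl a  -- case ¬(a ≤ b): myMax a b reduces to a
```

A real open proof would cover a realistic program and a formal specification of its security or correctness requirements, but the shape is the same: the proof checker, not a human reviewer, vouches for the property.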

Firefox (In)Security Update Dynamics Exposed

One of the great things about Firefox is its automatic update scheme. Here's some interesting research on the subject:

Although there is an increasing trend for attacks against popular Web browsers, only little is known about the actual patch level of daily used Web browsers on a global scale. We conjecture that users in large part do not actually patch their Web browsers based on recommendations, perceived threats, or any security warnings. Based on HTTP useragent header information stored in anonymized logs from Google's web servers, we measured the patch dynamics of about 75% of the world's Internet users for over a year. Our focus was on the Web browsers Firefox and Opera. We found that the patch level achieved is mainly determined by the ergonomics and default settings of built-in auto-update mechanisms. Firefox' auto-update is very effective: most users installed a new version within three days. However, the maximum share of the latest, most secure version never exceeded 80% for Firefox users and 46% for Opera users at any day in 2007. This makes about 50 million Firefox users with outdated browsers an easy target for attacks. Our study is the result of the first global scale measurement of the patch dynamics of a popular browser.
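The measurement behind that study reduces to something quite simple: pull the browser version out of each logged User-Agent header and compute what share of users runs the latest release on a given day. Here is a toy Python sketch of that idea (the function names and the aggregation are my own illustration; the actual study worked on anonymized Google server logs at vastly larger scale):

```python
# Toy version of the patch-dynamics measurement: extract Firefox versions
# from User-Agent strings and compute each version's share of users.
import re
from collections import Counter

FIREFOX_RE = re.compile(r"Firefox/(\d+(?:\.\d+)*)")

def version_shares(user_agents):
    """Map each observed Firefox version to its fraction of Firefox users."""
    counts = Counter()
    for ua in user_agents:
        m = FIREFOX_RE.search(ua)
        if m:
            counts[m.group(1)] += 1
    total = sum(counts.values())
    if total == 0:
        return {}
    return {version: n / total for version, n in counts.items()}

def latest_share(user_agents, latest):
    """Fraction of Firefox users already running the latest release."""
    return version_shares(user_agents).get(latest, 0.0)
```

Run daily over a year of logs, a series of such shares is exactly the patch-dynamics curve the researchers describe: a rapid climb after each release, but a ceiling well below 100%.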

What's interesting, too, is that this was research done using data drawn from Google: there must be a lot of really useful info there to be mined - suitably anonymised, of course. (Via Bruce Schneier.)