06 January 2009

The (Intellectual Monopoly) Biter Bit

The author of a proposed Chilean law to fight copyright infringement was greeted with the warning message "This copy of Microsoft Office is not genuine" when he was making a presentation about it.

Whoops! [Google Translation.]

The Once and Future Economy

Great post by Tim O'Reilly about how we need to junk the idea that the economy can expand indefinitely, and move to a different system - one prefigured in the current sharing of code and content:


The consumption of electronic media perhaps gives a foretaste of an economy in which qualitative complexity might replace quantitative addition as the raw material of exchange. Obviously, we're not there yet, as we're still consuming lots of resources to build the substrate for our increasingly intellectual economy, but I love that he's broken the naive assumption that if we don't have growth, the only alternative is stasis.

This is yet another reason why the lock down of knowledge by intellectual monopolies is simply unacceptable in a world that will be predicated on sharing digital stuff, just as we used to share the physical stuff that Nature gave us a few hundred thousand years ago.

Brainstorming with GNOME's Stormy Peters

As I wrote last week, foundations are playing an increasingly important role in the development of free software. I cited Mozilla Foundation and GNOME Foundation - although Matthew Aslett rightly pointed out that Eclipse is a leader, too - but in one respect Mozilla and GNOME are somewhat different. We hear a lot about Mozilla's plans, articulated by Mitchell Baker, now ably abetted by Mark Surman, but GNOME is rather less high profile. The same goes for the head of the GNOME Foundation, Stormy Peters, so I was delighted to come across this very full interview with her....

(On Open Enterprise blog.)

05 January 2009

Computational Journalism

I like the sound of this:


the digital revolution that has been undermining in-depth reportage may be ready to give something back, through a new academic and professional discipline known in some quarters as "computational journalism." James Hamilton is director of the DeWitt Wallace Center for Media and Democracy at Duke University and one of the leaders in the emergent field; just now, he's in the process of filling an endowed chair with a professor who will develop sophisticated computing tools that enhance the capabilities — and, perhaps more important in this economic climate, the efficiency — of journalists and other citizens who are trying to hold public officials and institutions accountable.

Sounds like bringing in openness to government willy-nilly....
(Via @timoreilly.)

On Becoming a Twit

In the last three years, I've written just under 4000 blog posts. You might think that would be more than enough, but for some time I have been conscious that I don't always blog everything I could or even want to. Often I have multiple Firefox tabs sitting there holding juicy items that I think deserve passing on; and yet I never get around to writing about them. I've been pondering why that is, and what I can do about it.

I think it comes down to two things. First, it takes a certain minimum amount of time to craft even the simplest blog post: sometimes I just don't have the spare minutes or spare brain cycles to do that. Often, though, there is very little to say about the item in question - no profound comment is required beyond "take a butcher's at this". What I really need, I realised, is a lightweight way of passing on such stories quickly.

Enter Twitter.

One of the interesting trends over the last year has been the steady rise of Twitter. Increasingly, I am finding bloggers that I read referring to stuff they find via Twitter, or to conversations conducted there. Clearly this can be a very powerful medium, if used in the right way. I've always been sceptical about the idea of twittering about every mundane detail of your life, but using it as a kind of micro-blogging tool is an attractive solution to the problems I've been experiencing.

As a result, I've started using Twitter at twitter.com/glynmoody; updates aren't protected, so anyone can follow. Note that I won't generally be posting links to blog posts there, unless there's a particular reason for doing so. In part, that's because the info is meant to be complementary. But it's also because some kind soul - whose name escaped me at first, to my shame, and who is now revealed to be one Jonny Dover, to whom many thanks - has set up a separate Twitter feed for opendotdotdot (which also includes pointers to my other posts on Open Enterprise and Linux Journal) at twitter.com/opendotdotdot. This means that you can choose whether to follow just the longer-form stuff, or the new, reduced-fat posts, or - for masochists only - both.

A few early observations on the medium.

First, one of the reasons I have held off from Twitter is that its parsimonious format forces you to use a URL shortening service, the best known of which is TinyURL.com. I have inveighed against these several times, largely because they obscure the inherently linky nature of the Web. Fortunately, things have moved on somewhat: you can now give users of a shortened URL a preview. This means that (a) they can see the underlying link and (b) they can be slightly more sure you are not dumping them on some manifestly infected site.

Although TinyURL offers this service, I've plumped instead for is.gd, partly because it uses considerably fewer characters than TinyURL.com, partly because its preview URLs are shorter (you just add a hyphen to the end of the shortened URL), and partly because it uses buckets of open source:

is.gd runs on the CentOS operating system. The most major pieces of software used are Lighttpd (web server), PHP (scripting) and MySQL (database).

CentOS:

is an Enterprise-class Linux Distribution derived from sources freely provided to the public by a prominent North American Enterprise Linux vendor. CentOS conforms fully with the upstream vendor's redistribution policy and aims to be 100% binary compatible. (CentOS mainly changes packages to remove upstream vendor branding and artwork.) CentOS is free.

The coy "prominent North American Enterprise Linux vendor" is Red Hat, in case you were wondering.

The other aspect that has already struck me, after just a few days of using Twitter, is how you find people to follow. For me, at least, it's very similar to how I find blogs: I come across links to new ones in the blogs that I currently read. Similarly, I've found that a good way to find people who may be of interest is to look at whom the people I am following are following themselves. This leads to pools of people who tend to be reading and responding to each other - a micro-community at best, another echo chamber at worst.

I've also made up a few rough and ready rules: no news feeds (I want real people, their opinions and their daily lives - isn't that partly the point of Twitter?) and nobody who can't be bothered posting on a fairly regular basis. I've also avoided most of the Twitter super-stars (you know who you are) as a matter of principle: I don't really want to follow people who are almost totally famous for being famous on Twitter, for the same reason that I read relatively few of the A-list blogs.

Blogging has evolved considerably over the last few years, and I expect both it and Twitter to continue to do so - for example, in terms of them working together, fulfilling different functions (along with email, which completes the trinity of one-to-one, one-to-many and many-to-many interactions online). I've already found that I enjoy blogging more: I no longer feel obliged to blog about everything of interest, since I can push some stuff straight out on Twitter.

Part of the fun of blogging and twittering comes from participating in this huge, collaborative experiment in open writing and open thinking; this means that your comments/tweets on any of the above are even more welcome than usual.

04 January 2009

Another Reason to Run GNU/Linux...

And a pretty important one:


The Home Office has quietly adopted a new plan to allow police across Britain routinely to hack into people’s personal computers without a warrant.

So why might GNU/Linux help? Well:

He said the authorities could break into a suspect’s home or office and insert a “key-logging” device into an individual’s computer. This would collect and, if necessary, transmit details of all the suspect’s keystrokes. “It’s just like putting a secret camera in someone’s living room,” he said.

Police might also send an e-mail to a suspect’s computer. The message would include an attachment that contained a virus or “malware”. If the attachment was opened, the remote search facility would be covertly activated. Alternatively, police could park outside a suspect’s home and hack into his or her hard drive using the wireless network.

Er, and how are they going to break into my system to install the keylogger if they don't know the password? Attachments won't work: I'm generally clever enough *not* to open them, and even if I did, they wouldn't do much on a GNU/Linux box. And hacking my hard disc through the wireless network? I don't think so.

Looks like free software is becoming even more about freedom....

Not That Microsoft is Desperate, or Anything...

From the You Can't Even Give It Away department: the Ultimate List of Free Windows Software from Microsoft - 150 items. (Via @Jack Schofield.)

Major Win for ODF in Brazil

Great news for ODF in Brazil: it's becoming the official format for storing government agency dox:

Já no passado mês de Abril de 2008, o ODF (Open Document Format) tinha sido adoptado como Norma Nacional no Brasil, mas agora sabemos por um comunicado da SERPRO que foi publicada a versão 4.0 dos Padrões de Interoperabilidade de Governo Electrónico (e-PING) que torna obrigatória a utilização do ODF na administração pública federal.

A nova versão publicada pela Secretaria de Logística e Tecnologia da Informação (SLTI) do Ministério do Planejamento adota o Open Document Format (ODF), como formato padrão para guarda e troca de documentos eletrônicos no governo federal.

...

Até a última versão da e-Ping o formato ODF constava com o status de recomendado pelo documento, sendo facultativo aos órgãos o uso, na versão 4.0 o ODF assume característica de adotado, dessa forma, torna-se obrigatório para todos os órgãos da administração direta, autarquias e fundações.


[Translation: Back in April 2008, ODF (Open Document Format) had already been adopted as a national standard in Brazil, but we now know from a SERPRO announcement that version 4.0 of the Electronic Government Interoperability Standards (e-PING) has been published, making the use of ODF mandatory in the federal public administration.

The new version, published by the Secretariat of Logistics and Information Technology (SLTI) of the Ministry of Planning, adopts the Open Document Format (ODF) as the standard format for storing and exchanging electronic documents in the federal government.

...

Up to the previous version of e-PING, ODF had the status of "recommended" in the document, and its use was optional for agencies; in version 4.0 it becomes "adopted", and is therefore mandatory for all bodies of the direct administration, autonomous agencies and foundations.]

As ever, Brazil's decision is doubly significant: important in itself, given the size of the country, and important as an example to others.

Project Gutenberg Made Easy

In my view, Project Gutenberg doesn't get the respect it deserves. After all, this effort to make the world's literature freely available in a digital form pre-dates free software by a decade. Partly, I suspect, this is because people don't know much about the process. Here's a great hands-on intro:

Contributing my time, energy, and two books to PG was not my first excursion in UGC, but it is the first time I have allied myself with a high-profile international project. Adding content to PG requires patience, good social skills (for interacting with your proofreader), and the ability to intuit what needs to be done to get your contribution online. Here's a journal of my recent experience. (See the sidebar Project Gutenberg's Versions of the Steps on the right for the concise step-by-step directions for getting material into Project Gutenberg.)

DRM as Freedom-Eating Infection

I've often written about DRM, and how it is antithetical to free software. But here's an interview with Amazon's CTO, which provides disturbing evidence that it actively *reduces* the amount of free software in use:

InformationWeek: Amazon is known as an open source shop. Is that still true?

Vogels: Where in the past we could say this was a pure Linux shop, now in terms of the large pieces of the e-commerce platform, we're a pure Amazon EC2 shop. There's an easier choice of different operating systems. Linux is still very popular, but, for example, Windows Server is often a requirement, especially if you need to transcode video and things that have to be delivered through Windows DRM [digital rights management], so there is a variety of operating systems available for internal developers.

Another reason to fight the spread of DRM. (Via @storming.)

03 January 2009

Why IPv4 Addresses Are Like Oil

IPv4 addresses are an increasingly scarce resource. But I'd not spotted the parallel with oil until this:

the US was still the largest user of new IPv4 addresses in 2008 with 50.08 million addresses used. China was a close second with 46.5 million new addresses last year, an increase of 34 percent.

Although China and Brazil saw huge increases in their address use, suggesting that the developing world is demanding a bigger part of the pie while IPv4 addresses last, what's really going on is more complex. India is still stuck in 18th place between the Netherlands and Sweden at 18.06 million addresses—only a tenth of what China has. And Canada, the UK, and France saw little or no increase in their numbers of addresses, while similar countries like Germany, Korea, and Italy saw double-digit percentage increases.

A possible explanation could be that the big player(s) in some countries are executing a "run on the bank" and trying to get IPv4 addresses while the getting is good, while those in other countries are working on more NAT (Network Address Translation) and other address conservation techniques in anticipation of the depletion of the IPv4 address reserves a few years from now.

In other words, the greediest countries - the US and China - are rushing to burn up all the oil while there's some left, and to hell with what happens afterwards....

02 January 2009

Happy Public Domain Day...

...er, yesterday:

It is January 1st, which means that this morning at midnight a batch more “life-plus” copyrights expired in those countries — most of them — where copyright expires at the end of the Nth year following the death of the author.

Yes, folks, it’s Public Domain Day! And it’s international! There are little Public Domain Day virtual commemorations going on in places like Poland and Switzerland. Spread the word!

In the life+50 universe, which constitute the largest cohort of countries, including Canada, which collectively have the majority of the world’s population, life-plus copyrights expired at midnight for those authors, or last-surviving of multiple authors, who died in 1958.

(Via Michael Hart.)

Will OpenOffice.org Go to the Ball this Year?

I remain perplexed by the state of OpenOffice.org. After years of using Word 2 (yes, you read that correctly - by far the best version Microsoft ever produced), I jumped straight to OpenOffice.org as my main office software. Version 1.0 was, it is true, a little on the, er, rough side, but since 2.0, I've had practically no problems - no crashes at all that I can remember. It's reasonably fast, not a huge memory hog (certainly nothing compared to the old versions of Firefox, or even Firefox 3.0, which still regularly eats several hundred Meg of my RAM for breakfast) and does practically everything most people who aren't Excel macro junkies could possibly want: what's not to like?

On Open Enterprise blog.

Dear Mr Burnham....

Tom Watson is that rare thing: a tech-savvy MP. And since he has taken the trouble to ask what people think about Andy Burnham's proposals to adopt cinema-style ratings for the Internet, I think it would be churlish not to respond. Not least because this kind of thing should be the norm, not the exception, and needs to be nurtured.

Here's what I've posted on the site:

As someone who has been writing about the Internet for fifteen years now, I obviously agree with the majority sentiments expressed above: the idea simply won't work at multiple levels. If attempted, it will be costly, and cause great collateral damage in terms of maligning perfectly harmless sites.

But carping is easy: the real issue is what should be done instead.

I think the key to solving not just this problem, but myriad other technology-related issues, is to tap the huge reservoir of expertise that exists both in the UK and elsewhere. It is simply folly to attempt to come up with solutions to complex problems ex nihilo; instead, we need to build on what people already know, and what they've already tried. This means getting people involved, at all levels.

This would help not only in the current case, but generally when the UK government is grappling with the intersection of policy with technology. Sadly, previous decisions involving computers, the Internet and related areas have frequently ignored salient facts that have subsequently vitiated the proposed schemes.

In summary, please don't even think about implementing clumsy classification schemes until more general structures are in place to help arrive, collaboratively, at ones that will work better.

You may want to add your twopence.

01 January 2009

Laying Down the Law

Ever since RMS drew up the GNU GPL, code and law have been inextricably linked. Mark Radcliffe provides a good summary of the last year from a legal viewpoint:

Last year was one of the most active years for legal developments in the history of free and open source ("FOSS") - see http://lawandlifesiliconvalley.com/blog/?p=27. This year, 2008, has seen a continuation of important legal developments for FOSS. My list of the top ten FOSS legal developments in 2008 follows...

31 December 2008

A Good Foundation for 2009

If I had to pinpoint major open source trends in 2008, one of them would be the rise of the foundation as a major force in free software. The best-known examples of these are probably the Mozilla Foundation and the GNOME Foundation, both of which have expanded their ambitions recently. Here's what each has to say about its aims...

On Open Enterprise blog.

Proud to be Lesser

Matt has some thoughts on blogs - including this one:

my primary interest is in digging up what's not already "popular." Unfortunately, I'm as guilty as anyone of recycling "news," but real traffic comes from breaking new ground, and I find that by scouring Digg and much lesser-known blogs.

...

no Drudge Report for me. Instead I'll be reading OpenDotDotDot and other "lesser" blogs. Hopefully this will keep translating into rising Open Road readership in 2009. Maybe we'll break the top-5,000,000 by 2012. One can dream....

Thanks, Matt...I think.

Actually, I feel exactly the same way: I'd much rather read Matt's informed writing on The Open Road - born of real analytical intelligence *and* hands-on experience - than the frothy nonsense served up by "leading" blogs.

The latter are most interested in traffic and in maintaining their position as blogosphere personalities: famous for being famous. They rarely contribute a deeper understanding of the world they write about.

That's what we "lesser" blogs are for.

The Super-Stupid Super-Snooping Database Idea

This is just a jokette, right?

The private sector will be asked to manage and run a communications database that will keep track of everyone's calls, emails, texts and internet use under a key option contained in a consultation paper to be published next month by Jacqui Smith, the home secretary.

I mean, not content with attempting to put into place a total surveillance system, old Jacqui now seriously wants to outsource it? Which effectively means that it can be owned by anyone - including a foreign entity - that buys the company with the contract.

I can see the political advantages of doing so - "oh no, *we* didn't lose all your intimate data, blame the company" - but this is stupidity squared.

Linus Plays Prince of Persia - Again

Most people in the free software world know that before he wrote Linux, Linus was using the Minix operating system. To run it, he had to acquire his first "proper" PC - his main machine until then was the Sinclair QL (remember that?). As he told me a few years ago, the PC arrived early in 1991....

On Open Enterprise blog.

The Commons of Darkness

Those of us who are city-dwellers rarely see much in the sky at night; we have lost the commons of darkness. As a result, to view the terrifying multitude of stars out in countries with little street lighting is an almost mystical experience.

Against that, er, background, here's an interesting idea:


2009 has been designated by the United Nations as the International Year of Astronomy (IYA), marking the 400th anniversary of Galileo’s telescope. The excitement is starting early, with Galloway Forest Park in Scotland announcing its plans to become Europe’s first “dark sky park.”

The forest, which covers 300 square miles and includes the foothills of the Awful Hand Range, rates as a 3 on the Bortle scale. The scale, created by John Bortle in 2001, measures night sky darkness based on the observability of astronomical objects. It ranges from Class 9 – Inner City Sky – where "the only celestial objects that really provide pleasing telescopic views are the Moon, the planets, and a few of the brightest star clusters (if you can find them)," to Class 1 – Excellent Dark-Sky Site – where "the galaxy M33 is an obvious naked-eye object" and "airglow… is readily apparent." Class 3 is merely "Rural Sky," meaning that while "the Milky Way still appears complex... M33 is only visible with averted vision."

(Via A Blog Around the Clock.)

30 December 2008

Extreme Openness: the Rise of Wikileaks

There is a long journalistic tradition of looking back at the end of the year over the major events of the preceding 12 months - one that I have no intention of following. But I would like to point out an important development in the world of openness that has occurred over that time-span: the rise and rise of Wikileaks....

On Open Enterprise blog.

Collaboration Markets and Open Source

Here's a detailed and important piece that looks at the economics of scientific collaboration. One concept that may be of particular interest to readers of this blog is that of collaboration markets:

There are good reasons it's difficult to set up efficient collaboration markets in expert attention. Creative problems are often highly specialized one-off problems, quite unlike the commodities traded in most markets. Until very recently, markets in such specialized goods were relatively uncommon and rather limited even in the realm of physical goods. This has recently changed, with online markets such as eBay showing that it is possible to set up markets which are highly specialized, provided suitable search and reputational tools are in place.

To the extent such collaboration markets do exist in science, they still operate very inefficiently compared with markets for trade in goods. There are considerable trust barriers that inhibit trading relationships being set up. There is no medium of exchange (c.f. the posts by Shirley Wu and Cameron Neylon on this topic). The end result is that mechanisms for identifying and aggregating comparative advantage are downright primitive compared with markets for physical goods.

Perhaps the best existing examples of collaboration markets occur in the open source programming community. No single model is used throughout that community, but for many open source projects the basic model is to set up one or more online fora (email discussion lists, wikis, bug-tracking software, etcetera) which is used to co-ordinate activity. The fora are used to advertise problems people are having, such as bugs they’d like fixed, or features they’d like added. People then volunteer to solve those problems, with self-selection ensuring that work is most often done by people with a considerable comparative advantage. The forum thus acts as a simple mechanism for aggregating information about comparative advantage. While this mechanism is primitive compared with modern markets, the success of open source is impressive, and the mechanisms for aggregating information about comparative advantage in expert attention will no doubt improve.

Haque on Hacking Economics

And yes, it's all about openness, collaboration and respect:

companies who can build authentic, honest, open, collaborative relationships with consumers are significantly more profitable (and sustainably profitable) than companies who treat consumers deceptively, antagonistically, and manipulatively.

Timeo Danaos....

Perhaps the most neglected pioneer in computing is Ted Nelson, who came up with most of the ideas of hypertext and linking, but got sidetracked for most of his life with the ill-fated Project Xanadu. One of my favourite computing puns is "I fear the geeks bearing gifts". So putting them together is an irresistible combination:

Whether you love the computer world the way it is, or consider it a nightmare honkytonk prison, you'll giggle and rage at Ted Nelson's telling of computer history, its personalities and infights.

Computer movies, music, 3D; the eternal fight between Jobs and Gates; the tangled stories of the Internet and the World Wide Web; all these and more are punchily told in brief chapters on many topics such as The Web Browser Salad, Voting Machines, Google, Web 2.0 and much more. These short stories make great reading – it's a book to dip in and out of.

I have to say that's not exactly the book I would have expected Nelson to write, but then he's full of surprises.... (Via Iterating Towards Openness.)

29 December 2008

Business versus Business

It's pretty obvious why companies in sectors like oil production should be denying so vehemently that their products are major contributors to climate change. It's also pretty clear why many other industries would prefer not to think about the externalities of their business models, and how much they take from the environmental commons without replacing it. But there are a few non-green businesses that not only believe climate change and environmental degradation are happening, but that they are large scale - and already hugely expensive:

The past year has been one of the most devastating ever in terms of natural disasters, one of the world's biggest re-insurance companies has said.

Munich Re said the impact of the disasters was greater than in 2007 in both human and economic terms.

The company suggested climate change was boosting the destructive power of disasters like hurricanes and flooding.

...

"It is now very probable that the progressive warming of the atmosphere is due to the greenhouse gases emitted by human activity," said Prof Peter Hoppe, head of Munich Re's Geo Risks Research.

"The logic is clear: when temperatures increase there is more evaporation and the atmosphere has a greater capacity to absorb water vapour, with the result that its energy content is higher.

"The weather machine runs into top gear, bringing more intense severe weather events with corresponding effects in terms of losses."

The company said world leaders must put in place "effective and binding rules on CO2 emissions" to curb climate change and ensure that "future generations do not have to live with weather scenarios that are difficult to control".

"If we delay too long, it will be very costly for future generations," said Mr Jeworrek.

Not rabid greenies talking, but hard-headed representatives of a big business sector...