Economistical with the Truth
The Economist is a strange beast. It has a unique writing style, born of the motto "simplify, then exaggerate"; and it has an unusual editorial structure, whereby senior editors read every word written by those reporting to them - which means the editor reads every word in the magazine (at least, that's the way it used to work). Partly for this reason, nearly all the articles are anonymous: the idea is that they are in some sense a group effort.
One consequence of this anonymity is that I can't actually prove I've written for the title (which I have, although it was a long time ago). But on the basis of a recent showing, I don't think I want to write for it anymore.
The article in question, which is entitled "Open, but not as usual", is about open source, and about some of the other "opens" that are radiating out from it. Superficially, it is well written - as a feature that has had multiple layers of editing should be. But on closer examination, it is full of rather tired criticisms of the open world.
One of these in particular gets my goat:

...open source might already have reached a self-limiting state, says Steven Weber, a political scientist at the University of California at Berkeley, and author of “The Success of Open Source” (Harvard University Press, 2004). “Linux is good at doing what other things already have done, but more cheaply—but can it do anything new? Wikipedia is an assembly of already-known knowledge,” he says.
Well, hardly. After all, the same GNU/Linux can run globe-spanning grids and supercomputers; it can power back office servers (a market where it bids fair to overtake Microsoft soon); it can run on desktops without a single file being installed on your system; and it is increasingly appearing in embedded devices - MP3 players, mobile phones, etc. No other operating system has ever achieved this portability or scalability. And then there are the more technical aspects: GNU/Linux is simply the most stable, most versatile and most powerful operating system out there. If that isn't innovative, I don't know what is.
But let's leave GNU/Linux aside, and consider what open source has achieved elsewhere. Well, how about the Web for a start, whose protocols and underlying software have been developed in a classic open source fashion? Or what about programs like BIND (which runs the Internet's name system), or Sendmail, the most popular email server software, or maybe Apache, which is used by two-thirds of the Internet's public Web sites?
And then there's MediaWiki, the software that powers Wikipedia (and a few other wikis): even if Wikipedia were merely "an assembly of already-known knowledge", MediaWiki (built on the open source technologies PHP and MySQL) is an unprecedentedly large assembly, unmatched by any proprietary system. Enough innovation for you, Mr Weber?
But the saddest thing about this article is not so much these manifest inaccuracies as the reason why they are there. Groklaw's Pamela Jones (PJ) has a typically thorough commentary on the Economist piece. From corresponding with its author, she says "I noticed that he was laboring under some wrong ideas, and looking at the finished article, I notice that he never wavered from his theory, so I don't know why I even bothered to do the interview." In other words, the feature is not just wrong, but wilfully wrong, since others, like PJ, had carefully pointed out the truth. (There's an old saying among journalists that you should never let the facts get in the way of a good story, and it seems that The Economist has decided to adopt this as its latest motto.)
But there is a deeper irony in this sad tale, one carefully picked out by PJ:
There is a shocking lack of accuracy in the media. I'm not at all kidding. Wikipedia has its issues too, I've no doubt. But that is the point. It has no greater issues than mainstream articles, in my experience. And you don't have to write articles like this one either, to try to straighten out the facts. Just go to Wikipedia and input accurate information, with proof of its accuracy.
If you would like to learn about Open Source, here's Wikipedia's article. Read it and then compare it to the Economist article. I think then you'll have to agree that Wikipedia's is far more accurate. And it isn't pushing someone's quirky point of view, held despite overwhelming evidence to the contrary.
If Wikipedia gets something wrong, you can correct it by pointing to the facts; if The Economist gets it wrong - as in the piece under discussion - you are stuck with an article that is, at best, Economistical with the truth.
6 comments:
All this, of course, conveniently ignores some important facts about the Economist's article itself. Here's just one sentence from the opening paragraph, for example:
"More than two-thirds of websites are hosted using Apache, an open-source product that trounces commercial rivals."
See that word 'trounces'? The author clearly both acknowledges and understands the importance of Apache. Here's another short example of the article's objectivity, which illustrates the same understanding:
"The “open-source” process of creating things is quickly becoming a threat — and an opportunity — to businesses of all kinds."
These examples both occur in the first 150 words or so of the article. The remaining copy is strewn with equally objective observations, as well as a largely positive conclusion and plenty of quotes and examples that do more than counterbalance Steven Weber's opinions. Any sane and impartial reader would conclude that it was a fair and well-balanced article which did a pretty good job of explaining some fairly complicated ideas to a broad business audience whose primary expertise is usually not technology.
But of course, it gets a predictable reaction from open source zealots because it contains - gasp! - both sides of the story, rather than the fawning sycophancy that characterises so much of the open source movement. I agree absolutely with Pamela Jones' observation that mistakes are inevitable in any human endeavour (not least because I have made plenty of my own, both in print and in life); but can there be a bigger mistake than refusing to accept objective criticism and scepticism as important parts of the peer review process?
S.
(A word on my own objectivity here. I am the author of several books written under the Economist brand, but I have never been an employee of the magazine or its publisher and have no axe to grind one way or the other. For those who think this might still compromise my position, I was also the editor who commissioned Glyn's original Linux article for Wired magazine in 1996, on the grounds that it was a fascinating and important story with unprecedented repercussions. My views on that have not changed.)
“The author clearly both acknowledges and understands the importance of Apache.” Well, I should jolly well hope so: it's been quoted ad nauseam (not least by me) all over the place. I don't think the author can claim many brownie points for chiming in.
"The “open-source” process of creating things is quickly becoming a threat — and an opportunity — to businesses of all kinds." Becoming? Becoming?? Where has this man been? It's been a threat since 1997 – I seem to recall a certain magazine writing that GNU/Linux was “perhaps the only alternative to Windows NT” (and they weren't even my words). Open access has been a threat to traditional STM publishing since the launch of Public Library of Science in 2001.
“It gets a predictable reaction from open source zealots because it contains - gasp! - both sides of the story, rather than the fawning sycophancy that characterises so much of the open source movement.” Just because he presents opposing views does not mean that both are correct representations of the facts. In a “balanced” article I can praise someone and then say “on the other hand” and then reel off a series of lies: superficially balanced, but in reality, less so. So the issue then is what he is actually saying, not the fact that he is saying it.
As I said in my rant (it's a blog, remember....), one of the things that got me was the old chestnut that open source can't “remain innovative” in the long-run. In the long-run? The GNU project started in 1984, Linux in 1991: these are seriously long in the tooth. And yet, as I detailed, GNU/Linux is innovative in several dimensions, despite its cyber-greybeard status (in fact I missed one: apparently it's moving into the gaming world, too).
The other of the “two doubts about its staying power” - whether “the motivation of contributors can be sustained” - is equally specious. It seems plausible, but the facts so far don't bear it out: all of the major open source projects have been going for years, and they just keep getting stronger. There is some waxing and waning (Mozilla/Firefox being a good example), but if there is a real need for an application, there are always people willing to work on it.
Moreover, what the article fails to point out is that there has actually been a huge infusion of people willing “to rise at dawn for a day's dreary labour” because more and more companies are paying them to work on open source (think of IBM's huge investment). This isn't instead of volunteers, it's as well as. And this is only in the Western world: it's hard to work out just how many Chinese free software hackers there are, but I'd guess it's a non-trivial number. As the deployment of computers in so-called third-world countries increases, so the pool of potential hackers grows too – not least because there are many projects that use GNU/Linux systems from the start.
Alongside those “two doubts” there seem to be some others - about the lack of control, the inability to “ensure quality”, etc. These straw men are demolished at length by Groklaw, which explains how all the things the Economist is calling for in its benevolent wisdom are already in place for most of the major projects, but once again the author failed to notice.
Bringing in the GPL 3 is a complete red herring: it's addressing the fact that the world has changed mightily in the last 12 years – notably through the appearance of DRM, Trusted Computing and embedded systems (patents are only addressed for clarity's sake). Although it does represent something of a constitution for the free software world, GPL 3 is a tweaking rather than a radical overhaul.
Objective criticism and scepticism are, as you rightly say, crucial elements of the peer review process: you might like to read some of Linus' choice comments on what he sees as stupid ideas to see how much he values both. But the Economist feature is simply wrong in too many places to be accorded that role – particularly since it has always set itself up as striving for and attaining a higher standard of rigour than other titles (that's why Bill Gates reads it, no?).
Moreover, it wields too much power among the movers and shakers to be allowed to get away with this kind of sloppy reporting, which inevitably succours efforts by You Know Who at sowing FUD. As PJ makes plain in her commentary, the author clearly picked and chose among his correspondents those that backed up a preformed, distorted view. I mean, witnesses for the prosecution are Steven Weber and Christian Ahlert? Hello? - it's not that we're struggling, or anything....
There, I think that's half an hour.
Is it just me, or is this the dumbest criticism (and possibly the dumbest sentence) I've read from a major, respected publication in quite some time:
"Wikipedia is an assembly of already-known knowledge..."
Hunh? "Already-known knowledge?" As opposed to... "not-known knowledge?" Which would be, what?... prediction? fiction? prevarication? psychic echoes from the distant future?
Spank me if I'm wrong, but isn't an encyclopedia *supposed to be* an assembly of already-known knowledge? What's Britannica? What's Compton's? What's the OED? What's any reference work?
I'm so confused.... I'd check a source of "already-known knowledge" to see what's wrong with me, but, according to the Economist, I guess that's not good enough anymore.
Thanks for that - you saved me having to add another comment-rant on this subject to go alongside the other comment-rant on the post-rant.
I'm not sure I have any issues with the even-handed nature of the analysis. What bugged me about this article (see my rant) was that the Economist just got so much plain wrong about Open Source and its history.
Just like Glyn, I choked on the quote by Steven Weber about innovation and open source.
Having been around during the early days of the Internet, I remember compiling lots and lots of software from source. There were precious few commercial options and that didn't change when the web came along. Apache itself is the direct descendant of the NCSA and CERN servers.
It's a bit misleading picking out those very projects that were founded to replace established alternatives, such as MySQL (Oracle, Sybase, etc.), Wikipedia (paper encyclopedias) and Linux (various flavors of Unix -- but not Windows at the time) to make the point that Open Source is only good at replacing existing products.
For anyone who develops software for a living, it is clear that all the best and most innovative products come from Open Source. Hibernate, for one, has pretty much driven all the ORM (object-relational mapping) tool vendors to the brink of extinction. Now these various Java products are making their way over to the .NET platform (just prepend an "N" to any Java project and you are likely to see a .NET version under that name).
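For anyone who hasn't met ORM before, here's a rough sketch of the idea, with purely hypothetical class and column names: Hibernate-style annotations on a plain Java class tell the ORM layer which table and columns it corresponds to, so application code reads and writes objects rather than hand-written SQL.

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

// Hypothetical example: map a plain Java class onto an "articles" table.
@Entity
@Table(name = "articles")
public class Article {

    @Id                         // primary key column
    private Long id;

    @Column(name = "title")     // maps this field to the "title" column
    private String title;

    @Column(name = "body")
    private String body;

    // getters and setters omitted for brevity
}

Given a mapping like that, Hibernate generates the SQL for loading and saving Article objects - which is precisely the drudgery the proprietary ORM vendors used to sell tools for.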
Their discussion of volunteer incentives and business models was a bit shallow. I've seen better - for example, from Steven Weber himself, back when he was at Harvard. That's one reason I was so surprised to see that quote from him.
Thanks for the comment.
Good to see someone whose Internet credentials go back even further than mine making the same points and then some.