22 December 2006

XXX for XML on its Xth Birthday

Back in the good old Web 1.0 days, XML was really hot. Here's a useful reminder that (a) XML is 10 years old (gosh, doesn't time fly when you're having fun?) and (b) it's still hot.

Last month marked ten years since the World Wide Web Consortium (W3C) Standard Generalized Markup Language (SGML) on the Web Editorial Review Board publicly unveiled the first draft of Extensible Markup Language (XML) 1.0 at the SGML '96 conference. There, in November 1996, Tim Bray threw the printed 27-page XML spec into the audience from the stage, whence it fluttered lightly down; then he said, "If that had been the SGML spec, it would have taken out the first three rows." The point was made. Although SGML remains in production to this day, as a couple of sessions reminded attendees, the markup community rapidly moved on to XML and never looked back.

Two areas stand out in this report on the conference: XQuery and Darwin Information Typing Architecture (DITA). Here's to the next X.

Red Letter Day for Red Hat

Time to throw those hats in the air, methinks:

Red Hat, Inc., the world's leading provider of open source solutions, today announced financial results for its fiscal year 2007 third quarter.

Total revenue for the quarter was $105.8 million, an increase of 45% from the year-ago quarter and 6% from the prior quarter. Subscription revenue was $88.9 million, up 48% year-over-year and 5% sequentially.

Net income for the quarter was $14.6 million or $0.07 per diluted share compared with $11.0 million or $0.05 per diluted share for the prior quarter. Non-GAAP adjusted net income for the quarter was $29.6 million, or $0.14 per diluted share, after adjusting for stock compensation and tax expense as detailed in the tables below. This compares to non-GAAP adjusted net income of $22.7 million, or $0.11 per diluted share in the third quarter of last fiscal year.

These figures are important for a number of reasons (and no, I don't own any shares - never have, never will). They show that Red Hat has been unaffected by all of Larry's Machiavellian machinations; they also indicate the rude health of open source's bellwether. That's good not just for Red Hat, but for the whole free software ecosystem too.

Open Source: Just the Ticket for Librarians

Here's a well-written story about how librarians have undertaken a major open source project with great success:

The system, Evergreen, whose 1.0 release came in November, is an Integrated Library System (ILS): the software that manages, catalogs, and tracks the circulation of library holdings. It's written in C, JavaScript and Perl, is GPLed, runs on Linux with Apache, uses a PostgreSQL database, Jabber for messaging and XUL as client-side software. The system allows easy clustering and is based entirely on open protocols.

21 December 2006

Wengo's Wideo Widget

Wengo, the people behind OpenWengo, an open source VoIP project, are offering a free video widget (to the first 10,000 applicants, at least) that consists of just a few lines of HTML code (but uses Flash). (Via Quoi9.)

Allison Does the Noble Samba

Top Samba man Jeremy Allison, whom I had the pleasure of interviewing many moons ago, has done the decent thing, and cut his ties with (ex-)employer Novell:

I have decided to leave Novell.

This has been a very difficult decision, but one I feel I have no choice but to make.

As many of you will guess, this is due to the Microsoft/Novell patent agreement, which I believe is a mistake and will be damaging to Novell's success in the future. But my main issue with this deal is I believe that even if it does not violate the letter of the licence it violates the intent of the GPL licence the Samba code is released under, which is to treat all recipients of the code equally.

Sad day for Novell. Luckily, Jeremy will soon be snapped up elsewhere. Bravo for taking a stand.

Update: And the lucky winner is...Google - again.

Oh! to Be in Turkmenistan...

...now that Turkmenbashi's not there.

This is an area of the world that has always fascinated me; in the wake of Niyazov's unexpected death, it's about to get even more interesting....

Heading Towards 3D

Once this kind of thing becomes commonplace, there's no stopping the 3D wave. (Via TechCrunch.)

On the Statute Book

Great that we've finally been granted free beer access to our laws; pity that it's not free as in freedom. And, of course, positively treasonable, that we don't have access to the original Anglo-Norman texts. (Via Open Knowledge Foundation.)

The Sergeant's (Digital) Song

Well, here's a right rollicum-rorum:

Yesterday, UK telecom regulator Ofcom issued a Consultation paper on future uses of the "Digital Dividend" - the frequencies to be released when TV broadcasters migrate from analog to digital transmission.

At the same time, they released a related set of "preparatory reports" by several teams of consultants.

There is a significant difference of opinion between Ofcom and the consultants on the question of whether to reserve "Digital Dividend" frequencies for license exempt applications.

This difference leads Ofcom to encourage the public to use the just-launched consultation to provide better arguments and new proposals for worthwhile license exempt applications in the UHF band.

Ignoring highly-paid consultants? Whatever next?

Then Little Boney he’ll pounce down,
And march his men on London town!

(Via openspectrum.info.)

Scanning the Big Delta

"Delta Scan" sounds like one of those appalling airport potboilers involving mad scientists, terrorists and implausibly durable secret agents, but it's actually something much more exciting: an attempt to peek into the future of our science and technology. A hopeless task, clearly, but worth attempting if only as a five-neuron exercise.

The results are remarkably rich; considerable credit must go to the UK's Office of Science and Innovation for commissioning the report and - particularly - making it freely available. I was glad to see that there are plenty of links in the documents, which are short and to the point. Great for, er, scanning.

Open Peer Review: Not in Their Nature

One door opens, another door closes: Nature has decided to bin its open peer review experiment:

Despite the significant interest in the trial, only a small proportion of authors opted to participate. There was a significant level of expressed interest in open peer review among those authors who opted to post their manuscripts openly and who responded after the event, in contrast to the views of the editors. A small majority of those authors who did participate received comments, but typically very few, despite significant web traffic. Most comments were not technically substantive. Feedback suggests that there is a marked reluctance among researchers to offer open comments.

Nature and its publishers will continue to explore participative uses of the web. But for now at least, we will not implement open peer review.

I suspect that Nature was probably the worst possible place to try this experiment. Nature is simply the top spot for scientific publishing: getting a paper published there can make somebody's career. So the last thing most people want is anything that might increase the risk of rejection. Public discussion of submitted papers certainly falls into that category, both for the commenter and the commented-upon (think scientific mafia).

In a way, this is what makes PLoS ONE so important: it's a tabula rasa for this kind of thing, and can redefine what scientific publishing is about. Nature and its contributors are hardly likely to want to do the same. Kudos to the title for trying, but I bet they're relieved it flopped. (Via Techdirt.)

Open Sourcing Second Life

Here's a subject close to my heart: opening up Second Life. And this is what the alpha geek behind it, Cory Ondrejka, had to say on the subject yesterday:

As we’ve talked about, the long term goals for Second Life are to make it a more open platform. Part of that process is learning how projects like libSL can be beneficial to all of Second Life. We should be thrilled that we’ve built an interesting enough set of technologies and communities that people want to tinker and explore. In the long run, this is why we’ve talked about wanting to be able to Open Source eventually. My hope is that in 2007 we’ll be able to get there.

Also of note:

HTML and Firefox . . . ah my two favorite topics of all time. We have an external contractor who has tons of experience working on it right now. Basically we’ve been trying to make sure that we can get Flash working correctly because so many of the interesting parts of the Web are moving to Flash-based players/plugins/etc. Getting the control inputs and updates to work correctly is a bear but they do seem to be making progress, which is very exciting.

The order of operations will be to roll a full internal browser first, then supplement the parcel media types with URLs, and then move to full HTML on a prim. Note that HTML on a prim has several pieces, from being able to interpret straight HTML in order to build text, do layout, etc, all the way to having a face of a prim point at a web page.

In terms of timeline, the next major Firefox roll out will be in Q1 – ie, more functionality in the existing pages that use it plus a floater that is a browser – followed by the parcel URL in Q2. HTML on a prim will be part of a larger rearchitecture of textures – we need to go to materials per face rather than texture per face – which several of the devs are itching to work on, but will realistically not start until Q2.

Firefox in Second Life: perfect.

PLoS ONE: Plus One for Science

PLoS ONE, the new way of publishing scientific papers, has gone live. As well as fascinating papers on the Syntax and Meaning of Wild Gibbon Songs, to say nothing of populist numbers like Regulated Polyploidy in Halophilic Archaea, you can also find a sandbox for playing around with the new features of this site. It's obviously premature to say whether this experiment in Web 2.0 science publishing will work, but it certainly deserves to.

20 December 2006

The Real Story Behind Red (and White) Hat's Name

Computerworld has a short piece that has some background on Red Hat's unusual name. Bob Young is quoted as saying:

"When [Red Hat co-founder] Marc [Ewing] was at university he used to name his software projects red hat -- red hat one, red hat two -- just to differentiate them from his friends. So, of course, when he started his Linux project he just named it Red Hat Linux for lack of a better term," said Young, who left Red Hat in 2005 to focus his energies on another company he founded, online independent publishing marketplace Lulu.com.

Well, nearly right, Bob. This is the real story, as told to me by Marc Ewing himself some years back:

In college I used to wear my grandfather's lacrosse hat, which was red and white striped. It was my favorite hat, and I lost it somewhere in Philadelphia in my last year. I named the company to memorialize the hat. Of course, Red and White Hat Software wasn't very catchy, so I took a little liberty.

So there you have it: the company should really be called Red and White Hat.

The New Richard Stallman

It's hard to tell whether it's because RMS has changed, or whether the world has changed, but something is new here: RMS is starting to engage with politicians at the highest levels.

For example, a few months back I wrote about his meeting with Ségolène Royal - who, as I and many others predicted, is now officially a candidate for the French Presidential elections.

Now here's RMS hobnobbing with Rafael Correa, the new president of Ecuador. I had to laugh at this description:

Stallman spoke for almost 20 minutes, without Correa saying anything, just listening.

So some things remain the same.

In any case, the result seems good:

When Stallman finished speaking, there were a few questions, and a short conversation in which the two of them were in agreement on everything, and Correa asked his advisers if Ecuador should migrate to free software. They said yes, and everyone, including Stallman, left the meeting with a broad smile on their faces.

(Via Linux and Open Source Blog.)

LWN's 2006 Linux and free software timeline

A lot has happened in the last year in the world of free software. That makes it hard (a) to remember who exactly did what and (b) to get the big picture. One invaluable tool for doing both is LWN's 2006 Linux and free software timeline, which offers all the main events with handy links to the original stories. They've also got other timelines going back to 1998, if you want to see an even bigger picture. Great stuff for a trip down free memory lane.

Update 1: And here's C|net's list of top stories in the same field.

Update 2: Meanwhile, here's Matthew Aslett's open source year in quotations.

19 December 2006

Behold: Ajax3D the Great

Something that seems to have everything going for it: Ajax3D. Yup: Ajax meets 3D - or X3D, to be more precise. Here's what a rather useful white paper on the subject by Tony Parisi, one of the pioneers of the by-now antediluvian VRML standard, has to say:

Ajax3D combines the power of X3D, the standard for real-time 3D on the web, with the ease of use and ubiquity of Ajax. Ajax3D employs the X3D Scene Access Interface (SAI)—the X3D equivalent of the DOM— to control 3D worlds via Javascript. With the simple addition of an X3D plugin to today’s web browsers, we can bring the awesome power of video game technology to the everyday web experience.

The initial development has begun. Media Machines has created the first showcase applications and tutorials, and has launched a web site, www.ajax3d.org, as an open industry forum to explore technologies, techniques and best practices. This white paper describes the key technical concepts behind Ajax3D and, via examples, introduces the beginning of a formal programming framework for general use.

(Via Enterprise Open Source Magazine.)
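The quoted description boils down to a simple pattern: just as an Ajax script fetches data asynchronously and writes it into the HTML DOM, an Ajax3D script fetches data and writes it into fields of the live X3D scene graph through the SAI. Here's a minimal toy sketch of that pattern in plain JavaScript - note that the scene-graph object and every name in it are invented stand-ins for illustration, not the real SAI API:

```javascript
// Toy illustration of the Ajax3D pattern: treat the 3D scene as a
// DOM-like tree whose fields a script can read and write.
// All names here are invented; the real X3D SAI differs.

// A minimal stand-in for a live scene graph.
function makeScene() {
  const nodes = new Map();
  return {
    addNode(name, fields) { nodes.set(name, { ...fields }); },
    getNode(name) { return nodes.get(name); },
  };
}

// The Ajax half: fetch data asynchronously (a stub standing in for
// an XMLHttpRequest to the server).
async function fetchPosition() {
  return { x: 1, y: 2, z: 3 }; // pretend this came over the wire
}

// ...and the 3D half: write the result into the scene, SAI-style,
// by setting a field on a named node.
async function updateAvatar(scene) {
  const pos = await fetchPosition();
  scene.getNode("avatar").translation = [pos.x, pos.y, pos.z];
}

const scene = makeScene();
scene.addNode("avatar", { translation: [0, 0, 0] });
updateAvatar(scene).then(() => {
  console.log(scene.getNode("avatar").translation.join(",")); // "1,2,3"
});
```

The point of the pattern is that no page reload (or scene reload) happens: the world stays live while scripts mutate it, which is exactly what made Ajax transformative for 2D pages.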

ID'ing Reality

The truth begins to sink in:

The government has abandoned plans for a giant new computer system to run the national identity cards scheme.

Instead of a single multi-billion pound system, information will be held on three existing, separate databases.

Well, that's a start. Just as hopeful is the statement:

Home Secretary John Reid denied this was a "U-turn" saying it would save cash, boost efficiency and cut fraud.

So, presumably cancelling the whole thing would also not be a "U-turn", since it too "would save cash, boost efficiency and cut fraud"....

dmoz RIP?

DMOZ - now called the Open Directory Project - just doesn't have the respect it deserves. That's partly because it's had more names than even Firefox/Firebird/Phoenix.

It started out as GnuHoo, but RMS took exception to that, and it became NewHoo - which Yahoo promptly took exception to. It managed to avoid the horrible ZURL (shouldn't that be Zurg?), before metamorphosing into the Open Directory Project, also known as dmoz (from directory.mozilla.org) to its friends.

But its real importance is not as an open Yahoo: it was the direct inspiration for Nupedia - NewHoo, Nupedia - geddit? - which in turn gave rise to the complementary Wikipedia: need I say more?

So it's sad to hear that dmoz is fizzling. It may not serve much purpose at present, but it's had a glorious past. (Via John Battelle's Searchblog.)

Google Gets Earthier

Google has acquired the mapping company Endoxon:

Endoxon is a developer of internet mapping solutions, mobile services, data processing, cartography, direct marketing and the Trinity software suite. Since 1988, Endoxon and its 75 employees have created ground-breaking solutions for a wide variety of geographic needs. Endoxon is a pioneer in AJAX mapping technologies. Endoxon technologies enable the integration and processing of geo-referenced data and high-resolution aerial and satellite images for dynamic internet and mobile services.

What's interesting about this is that it shows Google pushing forward in the field of mapping, cartography and 3D interfaces - an area that is emerging as increasingly important. (Via Ogle Earth.)

Digital Library of the Commons

OnTheCommons has an interesting post about a new book called Understanding Knowledge as a Commons. This sounds great - see Peter Suber's comment below for details on open access to its contents. However, this article did mention something I'd not come across before: the Digital Library of the Commons.

This turns out to be a wonderful resource:

a gateway to the international literature on the commons. This site contains an author-submission portal; an archive of full-text articles, papers, and dissertations; the Comprehensive Bibliography of the Commons; a Keyword Thesaurus, and links to relevant reference sources on the study of the commons.

Among the list of commons areas, there is Information and Knowledge Commons:

anticommons, copyright, indigenous, local, scientific knowledge issues, intellectual property rights, the Internet, libraries, patents, virtual commons, etc.

Strange that free software is not included. But good, nonetheless.

18 December 2006

British Judges - Gawd Bless 'Em

Here's an interesting little tale of two nations sharing a common tongue but divided by patent culture:

In the US the courts found that Smith had infringed the patents in its use of similar designing software and ordered that it remove certain functions from its software.

The English court took a rigorous approach to analysing the patents and found that it did not adequately describe the system it sought to patent. In order to be valid a patent must describe a process so completely that a person who knows that subject area must be able to replicate it using only the contents of the patent.

The High Court found that Halliburton's patent did not do that, and the Court of Appeal has now agreed. Justice Jacob ruled that the patent was missing vital details, contained wrong equations, demanded a higher level of expertise than allowed and that it relied on material external to the patent, and therefore was not a valid patent.

Shot, sir!

A pity they didn't just chuck it on the basis you can't patent software, but at least the bewigged gents "took a rigorous approach."

It's a Small, Small, Small, Small World

And if anyone's wondering why I keep posting stuff about Chinese currencies - virtual or real - try this for a little hint about the interconnectedness of things (which is what this blog is all about), and the deep nature of a commons:

At least one-third of California's fine particulate pollution -- known as aerosol -- has floated across from Asia, says Steve Cliff, an atmospheric scientist at the University of California at Davis. "In May this year, almost all the fine aerosol present at Lake Tahoe [300 km east of San Francisco] came from China," says Tom Cahill, a UC Davis emeritus professor of atmospheric sciences. "So the haze that you see in spring at Crater Lake [Oregon] or other remote areas is in fact Chinese in origin."

...

The irony of finding Chinese mercury in American rivers, of course, is that much of it was emitted to produce goods being consumed in the United States. There's been a growing awareness that importing commodities from the rest of the world displaces pollution from the U.S. onto other countries; this story brings it full circle and demonstrates yet again that in this fishbowl called Earth, pollution can't be displaced "elsewhere" for long.

The World's Economic Centre of Gravity...

...just started to move to the right:

The euro held steady after the Iranian government said it had ordered the central bank to transform the state's dollar-denominated assets held abroad into euros and use the European currency for foreign transactions.

But I've no illusions that this is anything but the start of a shift even further eastward...

Open Public Data: Halfway There

Well, now, here's some progress:

The OFT's market study into the commercial use of public information has found that more competition in public sector information could benefit the UK economy by around £1billion a year.

Download Commercial use of public information (pdf 707 kb).

Examples of public sector information include weather observations collected by the Met Office, records held by The National Archives used by the public to trace their family history, and mapping data collated by Ordnance Survey. The underlying raw information is vital for businesses wanting to make value-added products and services such as in-car satellite navigation systems.

Public sector information holders (PSIHs) are usually the only source for much of this raw data, and although some make this available to businesses for free, others charge. A number of PSIHs also compete with businesses in turning the raw information into value-added products and services. This means PSIHs may have reason to restrict access to information provided solely by themselves.

The study found that raw information is not as easily available as it should be, licensing arrangements are restrictive, prices are not always linked to costs and PSIHs may be charging higher prices to competing businesses and giving them less attractive terms than their own value-added operations.

It's good news that the Office of Fair Trading has grasped that the public sector trough-scoffers cost taxpayers serious money through their greed; however, the realisation that making the information freely available - not just for commercial use - would generate far more dosh still seems some way off, I fear. (Via Open Knowledge Foundation.)