26 November 2007

Why Javascript, not Flash? - Ask Zoho

I've only just come across this, perhaps the best summary of why using Flash is the wrong way to create Web apps:


1. Native to the Web

A real web application should natively support web standards - HTML & CSS are pretty much synonymous with “web standards”. The biggest reason we started out with Javascript is that it is native to the web - in the sense that its core object model is the HTML/CSS Document Object Model. The DOM is a gift to web applications. Even with the annoying browser differences in DOM (which sophisticated libraries increasingly hide), it is still far better to have the DOM than not have it. Flash, for all its advantages, sits in a separate space from the browser. In that sense, Flash is not that different from Java-on-the-client. In fact, Flash is Java-on-the-client-done-right.

I am sure Flash will eventually find a way to natively integrate with the browser but it is not there yet.

2. Open Source Library Support

This is a big one. The depth and variety of libraries available in Javascript just keep getting better. It is mind boggling just how much open source development is going on in Javascript. Developers keep pushing the envelope. For one example, look at the jQuery solar system demo. It shocked me the first time I saw it. Pretty impressive that Javascript could do that, right? The capabilities of Javascript exceed the client requirements of office productivity applications today, and there are tons more innovations coming.

3. Vector Graphics in Browsers

This is another big one. Vector graphics formats like SVG (Firefox, Opera), VML (IE), and HTML Canvas (Firefox, Safari, Opera) are becoming ubiquitous in browsers. Yeah, it sucks that IE doesn’t support SVG, but that can be worked around. Even cooler is the fact that SVG & VML are XML and very Javascript friendly. You can do real magic.
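As a small illustration of that Javascript-friendliness - my own sketch, not from the Zoho post - here is a function that builds an SVG fragment from data. It generates plain markup so it runs in any Javascript host; in a browser you would insert the result into the page via the DOM, or build the elements directly with `document.createElementNS`.

```javascript
// Build a tiny SVG bar chart as markup from an array of values.
function svgBarChart(values, barWidth, height) {
  var max = Math.max.apply(null, values);
  var bars = values.map(function (v, i) {
    var h = Math.round((v / max) * height);   // scale each bar to the tallest
    return '<rect x="' + i * barWidth + '" y="' + (height - h) +
           '" width="' + (barWidth - 2) + '" height="' + h + '"/>';
  });
  return '<svg width="' + values.length * barWidth +
         '" height="' + height + '">' + bars.join('') + '</svg>';
}

var chart = svgBarChart([3, 1, 4, 1, 5], 20, 100);
```

A dozen lines of ordinary Javascript and string-joining is enough to produce a working chart - which is exactly the sort of thing that used to be assumed to need Flash.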

Obviously, number 2 is the heart of the matter: Javascript is just going to keep getting better, faster, thanks to the open development process. With Flash, you're dependent on the skills of one company (now, where have I heard that before?)

Open Bookshelf: Real-Time and Embedded Linux

Real-time and embedded Linux is an iceberg: for all its low visibility, it's pretty big below the surface, and getting bigger. If you want to get to know this world better - and you know you do - here's a bumper crop of light reading for you:

LinuxDevices.com is pleased to publish an overview and papers from the Ninth Real-Time Linux Workshop held in Linz, Austria, Nov. 2-3, 2007. The papers, available for free download without registration, span a broad range of topics, from fundamental real-time technologies to applications, hardware, and tools.

(Via Linux Today.)

Soaraway Open Source

Rupert Murdoch's tabloid Sun newspaper, better known for its fascination with chest-tops rather than laptops, is nonetheless starting to grok the Joy of Linux, thanks to the Asus EEE PC:

The crucial thing about the Eee is rather than running on Windows, it uses a Linux operating system. Now I'm a Microsoft man through and through, I've never been able to face switching from XP or Vista to the likes of OS X on an Apple. There's safety in what you know.

I'd certainly never consider running Linux on my home PC but by slimming down the software on this gadget, it allows it to have a much longer battery life - crucial for a product designed to be used on the move. It will also run faster and has instant on and off.

As I've said elsewhere, the Asus could really prove to be a breakthrough machine for GNU/Linux among general users. (Via FSDaily.)

Andy Updegrove on the War of the Words

The ODF/OOXML struggle has been one of the pivotal stories for the world of open source, open data and open standards. I've written about it here and elsewhere many times. But the person best placed to analyse it fully from a standards viewpoint - which is what it is all about, at heart - is undoubtedly Andy Updegrove, who is one of those fine individuals obsessed with an area most people find slightly, er, soporific, and capable of making it thrilling stuff.

News that he's embarked on an e-book about this continuing saga is therefore extremely welcome: I can't imagine anyone doing a finer job. You can read the first instalment now, with the rest following in tantalising dribs and drabs, following highly successful precedents set by Dickens and others. With the difference, of course, that this book - entitled ODF vs. OOXML: War of the Words - is about fact, not fiction, and that the events it describes have not even finished yet.

25 November 2007

Feel Free to Squeak

I don't know much about the open source programming language Squeak, but it does sound rather cool:

Squeak is a highly portable, open-source Smalltalk with powerful multimedia facilities. Squeak is the vehicle for a wide range of projects from educational platforms to commercial web application development.

...

Squeak stands alone as a practical environment in which a developer, researcher, professor, or motivated student can examine source code for every part of the system, including graphics primitives and the virtual machine itself. One can make changes immediately and without needing to see or deal with any language other than Smalltalk.

Our diverse and very active community includes teachers, students, business application developers, researchers, music performers, interactive media artists, web developers and many others. Those individuals use Squeak for a wide variety of computing tasks, ranging from child education to innovative research in computer science, or the creation of advanced dynamic web sites using the highly acclaimed continuation based Seaside framework.

Squeak runs bit-identical images across its entire portability base, greatly facilitating collaboration in diverse environments. Any image file will run on any interpreter even if it was saved on completely different hardware, with a completely different OS (or no OS at all!).

Now, though, it seems there is no excuse not to find out more:

To help more people get familiar with Squeak's very powerful programming environment, the new book Squeak by Example is now being made available under the Creative Commons Attribution-ShareAlike 3.0 license. It's intended for both students and developers and guides readers through the Squeak language and development environment by means of a series of examples and exercises. This is very useful to those who wish to become more familiar with the Croquet programming environment. You can either download the PDF for free, or you can buy a softcover copy from lulu.com.

What a classic combination: CC digital download, or an analogue version from Lulu.com.

Update 1: Alas, it seems you can't squeak freely - see comment below.

Update 2: Or maybe you can - see other comments below.

24 November 2007

(Copyright) Darkness Visible

The benighted policy of extending copyright terms again and again is made visible in a nice graphic accompanying this post:

The term of copyright has steadily expanded under U.S. law. The first federal copyright legislation, the 1790 Copyright Act, set the maximum term at fourteen years plus a renewal term (subject to certain conditions) of fourteen years. The 1831 Copyright Act doubled the initial term and retained the conditional renewal term, allowing a total of up to forty-two years of protection. Lawmakers doubled the renewal term in 1909, letting copyrights run for up to fifty-six years. The 1976 Copyright Act changed the measure of the default copyright term to life of the author plus fifty years. Recent amendments to the Copyright Act expanded the term yet again, letting it run for the life of the author plus seventy years.

What's wrong with this picture?

The Supreme Court has held that legislative trick constitutional, notwithstanding copyright’s implied policy aim of stimulating new authorship—not simply rewarding extant authors.
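The arithmetic of the term progression quoted above is easy to check; here is a quick sketch (my own, not from the post) of the maximum terms under the successive Acts:

```javascript
// Maximum US copyright terms under each Act, as described above:
// an initial term plus a conditional renewal term.
var acts = [
  { year: 1790, initial: 14, renewal: 14 },
  { year: 1831, initial: 28, renewal: 14 },  // initial term doubled
  { year: 1909, initial: 28, renewal: 28 }   // renewal term doubled
];

var maxTerms = acts.map(function (a) {
  return { year: a.year, max: a.initial + a.renewal };
});
// 1790: 28 years; 1831: 42 years; 1909: 56 years.
// From the 1976 Act onwards the measure changes entirely: life of the
// author plus 50 years, since extended to life plus 70.
```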

Open EMR

He beat me to it.

23 November 2007

Another Reason Why China is the Future...

...or rather *a* future:

Chinese trust the Internet over mainstream media and all sources of information according to a study done by Harris Interactive for Edelman.

(Via RConversation.)

Live Documents and Let Live Documents

It's not really clear whether we need yet another online office suite, but at least Live Documents seems to have understood the importance of freeing users from dependence on a certain offline one:


"From a technology and utility perspective, Live Documents offers two valuable improvements - firstly, it breaks Microsoft's proprietary format lock-in and builds a bridge with other document standards such as Open Office and secondly, our solution matches features found only in the latest version of Office (Office 2007) such as macros, table styles and databar conditional formatting in Excel 2007 and live preview of changes in PowerPoint 2007. Thus, Live Documents lets consumers and businesses derive the benefits of Office 2007 without having to upgrade," said Adarsh Kini, Chief Technology Officer, InstaColl.

KOffice Made Simpler

The high-profile nature of OpenOffice.org means that KOffice tends not to get the respect it deserves. Maybe the latest iteration will change that, because it offers an interesting addition:

Over two years ago, Inge Wallin proposed a simplified word processor to be used in school for kids. Thomas Zander, the KWord lead developer, made a proof of concept of this using the infrastructure of KOffice 2. This proved simpler than even Thomas would have believed, and KOffice 2.0 Alpha 5 now contains a first version of the KOffice for kids. Note that only the GUI is simplified, and that it still contains the full power of KOffice. This means that it can save and load the OpenDocument Format, which will make it easy to interact with other users of OpenOffice.org or the full KOffice suite.

These are precisely the kind of innovations that free software makes so easy: hacking together a quick prototype and then polishing it. Let's hope that other simplified versions follow, since an "Easy" Office would be useful far beyond its original target market, education.

It would also be a nice riposte to the never-ending complexification of Microsoft's own products, which are forced to add more and more obscure features - whether or not users want them - in a desperate attempt to justify yet another paid-for upgrade. Free software is under no such pressure, and can therefore downgrade applications when that might be appropriate, as here. Microsoft, by contrast, is trapped by its ratchet-based business model.

MS Explorer Is Sinking...

...no, really. Talk about symbolism.

We Demand Books on Demand

One of the interesting results of the move to digital texts is a growing realisation that analogue books still have a role to play. Similarly, it's clear that analogue books serve different functions, and that feeds into their particular physical form. So some books may be created as works of art, produced to the very highest physical standards, while others may simply be convenient analogue instantiations of digital text.

Public domain books are likely to fall into the latter class, which means that ideally there should be an easy way to turn such e-texts into physical copies. Here's one:

This is an experiment to see what the demand for reprints of public domain books would be. This free service can take any book from the Internet Archive (that is in public domain) and reprint it using Lulu.com. Prices of the books are rounded up from Lulu.com cost prices to the nearest $0.99 to cover the bandwidth and processing power that we rent from Amazon using their EC2 service. There is also a short post on my blog about it.

How Does It Work

Anyone with an email address can place a request on this page using an Internet Archive link or ID. Your request will be forwarded to our conversion server, which will convert the appropriate book to printable form and send it off to Lulu.com. When the book has been uploaded, it will be made available for immediate ordering and shipping, and you will receive a link to it via email. Currently, only soft cover books are supported in 6"x9", 6.625"x10.25" or 8"x11" trim sizes.
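The pricing rule quoted above - round the Lulu.com cost price up to the nearest $0.99 - can be sketched in a few lines; the function name is my own, not the service's:

```javascript
// Round a cost price up to the nearest $x.99, as the service describes.
function roundUpTo99(cost) {
  var cents = Math.round(cost * 100);            // work in whole cents
  var price = Math.floor(cents / 100) * 100 + 99;
  if (price < cents) price += 100;               // $x.99 must not undercut the cost
  return price / 100;
}

// e.g. a $7.20 cost price becomes $7.99; $8.99 stays $8.99.
```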

Interesting to see Lulu.com here, confirming its important place as a mediator between the digital and analogue worlds. (Via Open Access News.)

Openness: Purity of Essence

I wrote a piece for Linux Journal recently warning that Microsoft was beginning to hijack the meaning of the phrase "open source". But the problem is much bigger than this: the other opens face similar pressures, as Peter Murray-Rust notes.

In some ways it's even more serious for fledgling movements like open access and open data: there, the real meaning has barely been established, and so defending it is harder than for open source, which has had a formal definition for some time. Given the importance of labels, this is a matter that needs to be addressed with some urgency before "open access" and "open data" become little more than bland marketing terms.

Thank You, FOSS

Via GigaOM, I came across a link to this love-letter to Facebook:

Thinking about it, I've rarely used a service that has brought me so much emotional satisfaction...connecting with good friends is a feel-good thing and it is this emotional value that makes Facebook hard to beat in terms of the gratification other services can provide. So much so, here I am even writing a thank you note to the service (I can't remember doing that for any service...I've written about how "cool" stuff is, or how useful some service might be...but "thank you"? Never).

Although I think that Facebook is interesting - though not unproblematic, especially its recent moves - I'd never see it in this light. But it set me wondering whether there was anything comparable for me - a place of digital belonging of the kind offered by Facebook. And I realised there was, but not one that was crystallised in a single service. Rather, I feel this same sense of "connecting with good friends" with respect to the much larger, and more diffuse free software community.

This isn't a new thing. Back in the early years of this century, when I was writing Rebel Code, I was astonished at how helpful everyone was that I spoke to in that world. That stood in stark contrast to the traditional computing milieu, where many were full of their own (false) self-importance, and rather too fixated on making lots of money.

It seems I'm not alone in this sense of hacker camaraderie:

The key thing here is that in all the details, spats, debates, differences in direction and nitty-gritty, it is easy to forget that the core ingredients in this community are enthusiastic, smart, decent people who volunteer their time and energy to make Open Source happen. As Open Source continues to explode, and as we continue to see such huge growth and success as it spreads across the world and into different industries, we all need to remember that the raw ingredients that make this happen are enthusiastic, smart, decent people, and I for one feel privileged to spend every day with these people.

To paraphrase W. H. Auden:

Thank You, Thank You, Thank You, FOSS.

Public Domain Search

One of the big advantages of open content is that there are no problems with indexing it - unlike proprietary stuff, where owners can get unreasonably jumpy at the idea. Public domain materials are the ultimate in openness, and here's a basic search engine for some of them:

Major public domain sites were chosen, the most important being the US federal government sites:

* .gutenberg.org
* .fed.us
* .gov
* .mil

But there are plenty of exclusions. Also, it's a pity this is only for the US: the public domain is somewhat bigger. (Via Open Access News.)
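Building that kind of restricted search by hand is straightforward with the `site:` operator most major engines support; here's a sketch (my own, not the engine's actual implementation) that limits a query to the listed domains:

```javascript
// Limit a query to a list of public domain sites using the common
// "site:" operator; the restrictions are ORed together.
var publicDomainSites = ['gutenberg.org', 'fed.us', 'gov', 'mil'];

function publicDomainQuery(terms) {
  var restriction = publicDomainSites
    .map(function (s) { return 'site:' + s; })
    .join(' OR ');
  return terms + ' (' + restriction + ')';
}

var q = publicDomainQuery('federalist papers');
// → 'federalist papers (site:gutenberg.org OR site:fed.us OR site:gov OR site:mil)'
```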

22 November 2007

Happy Birthday Internet

Watch out, there's a meme about:


The Internet is 30 today. Exactly 30 years ago today on November 22, 1977 the first three networks were connected to become the Internet.

(Via Simon Willison's Weblog.)

Realising Virtual Worlds Through Openness

I mentioned Tim Berners-Lee below as an iconic figure. Philip Rosedale may not quite be there yet, but he stands a good chance of attaining that status if his vision works out. He's put together a useful summary of how that vision grew, and, more importantly, what Linden Lab is going to do to realise it more fully. Nice to see that at the heart of the strategy lies openness:

we need to keep opening SL up, as we’ve started talking about lately. This means formats, source code, partners, and more. We are working on turning our clear vision on this into more detailed plans. Virtual worlds, in their broadest form, will be more pervasive than the web, and that means that their systems will need to be open: extended and operated by many people and companies, not just us.

That Umair Bloke on Blogonomics 2007

Glad it's not just me that feels this way.

Tim B-L: On Moving from the WWW to the GGG

Tim Berners-Lee is an iconic figure for a reason: he's actually rather sharp. This makes his rare blog posts important and interesting - none more so than his most recent one about the Giant Global Graph (GGG):

In the long term vision, thinking in terms of the graph rather than the web is critical to us making best use of the mobile web, the zoo of wildly differing devices which will give us access to the system. Then, when I book a flight it is the flight that interests me. Not the flight page on the travel site, or the flight page on the airline site, but the URI (issued by the airlines) of the flight itself. That's what I will bookmark. And whichever device I use to look up the bookmark, phone or office wall, it will access a situation-appropriate view of an integration of everything I know about that flight from different sources. The task of booking and taking the flight will involve many interactions. And all throughout them, that task and the flight will be primary things in my awareness, the websites involved will be secondary things, and the network and the devices tertiary.

This is probably the best thing I've read about social graphs, not least because it anchors a trendy idea in several pre-existing areas of serious Webby development. (Via Simon Willison's Weblog.)
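The shift Berners-Lee describes - bookmarking the thing itself rather than pages about it - amounts to keying your data by URI and merging what each source says about that URI. A toy sketch of the idea (the URI and field names are invented for illustration):

```javascript
// A miniature "graph" keyed by URI: each source contributes statements
// about the same resource, and a lookup merges them into one view.
var graph = {};

function addStatements(uri, statements) {
  var node = graph[uri] || (graph[uri] = {});
  for (var key in statements) {
    if (statements.hasOwnProperty(key)) node[key] = statements[key];
  }
  return node;
}

// The airline and the travel site both describe the *same* flight URI.
addStatements('http://airline.example.com/flights/117', { departs: 'LHR', gate: '22' });
addStatements('http://airline.example.com/flights/117', { seat: '14A' });

var flight = graph['http://airline.example.com/flights/117'];
// flight now merges everything known about that flight from both sources.
```

The flight, not any particular web page, is the primary object - which is precisely the point of the graph view.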

21 November 2007

Interoperability: The New Battlefield

One word is starting to crop up again and again when it comes to Microsoft: interoperability - or rather the lack of it. It was all over the recent agreement with the EU, and it also lies at the heart of the OpenDocument Foundation's moves discussed below.

And now here we have some details of the next interoperability battles:

the EU Competition Commissioner’s office, with the first case decided by the EU Court of First Instance, now has started working intensively on the second case.

The new case involves three main aspects. First, Microsoft allegedly denied providers of other text document formats access to information that would allow them to make their products fully compatible with computers running on Microsoft’s operating systems. “You may have experienced that sometimes open office documents can be received by Microsoft users, sometimes not.”

Second, for email and collaboration software Microsoft also may have privileged their own products like Outlook with regard to interfacing with Microsoft’s Exchange servers. The third, and according to Vinje, most relevant to the Internet and work done at the IGF, was the problem of growing .NET-dependency for web applications. .NET is Microsoft’s platform for web applications software development. “It is a sort of an effort to ‘proprietise’ the Internet,” said Vinje.

That's a good summary of the problems, and suggests that the Commission is learning fast; let's hope that it doesn't get duped when it comes to remedies as it did the last time, apparently fooled by Microsoft's sleights of hand over patents and licences.

Decentralise Your Data - Or Lose It

Aside from the obvious one of not trusting the UK government with personal data, the other lesson to be learned from the catastrophic failure of "security" by HMG is the obverse of one of free software's key strengths: decentralisation. When you do centralise, you make it easy for some twerp - or criminal - to download all your information onto a couple of discs and then lose them. A decentralised approach is not without its problems, but at least it puts a few barriers in the way of fools and knaves.

Hardware is Like Software? - Ban Hardware Patents

I won't bother demolishing this sad little piece on why software patents are so delicious and yummy, because Mike Masnick has already done that with his customary flair.

But I would like to pick on something that purports to be an argument in the former:


One needs to understand that there is fundamentally no difference between software and hardware; each is frequently expressed in terms of the other, interchangeably describing the same thing. For example, many microprocessors are conceptualized as software through the use of hardware description languages (HDL) such as Bluespec System Verilog and VHDL. The resulting HDL software code is downloaded to special microprocessors known as FPGAs (field programmable gate arrays), which can mimic a prospective chip's design and functions for testing. Eventually, the HDL code may be physically etched into silicon. Voilà! The software becomes hardware.

Well, that's jolly interesting, isn't it? Because it means that such hardware is in fact simply an instantiation of algorithms - hard-wired, to be sure, but no different from chiselling those algorithms in granite, say. And as even the most hardened patent fan concedes, pure knowledge such as mathematics is not patentable.

So the logical conclusion of this is not that software is patentable, but that such hardware *shouldn't* be. I'd go further: I suspect that anything formed by instantiating digital information in an analogue form - but which is not essentially analogue - should not be patentable. The only things that might be patentable are purely analogue objects - what most people would recognise as patentable things.

There is an added benefit to taking this approach, since it also solves all those conundrums about whether virtual objects - in Second Life, for example - should be patentable. Clearly, they should not, because they are simply representations of digital entities. But if you wanted to make an analogue version - and not just a hard-wiring - you could reasonably seek a patent if it fulfilled the usual conditions.

Oh, Tell Me the Truth About...the ODF Bust-Up

The recent decision by the OpenDocument Foundation to shift its energies away from ODF to CDF has naturally provoked a lot of rather exaggerated comment. I wrote a piece for LWN.net (now out from behind the paywall) exploring what exactly was going on, and found out that there are bigger issues than simply document interoperability at play.

It turns out to be all about Microsoft's Sharepoint - software that I am beginning to see as one of the most serious threats to open source today. Read it and be very afraid.

GNU PDF Project

Around ten years ago I fought a fierce battle to get people to use HTML instead of PDF files, which I saw as part of a move to close the Web by making it less transparent.

You may have noticed that I lost.

Now, even the GNU project is joining in:

The goal of the GNU PDF project is to develop and provide a free, high-quality and fully functional set of libraries and programs that implement the PDF file format, and associated technologies.

...

PDF has become the de-facto standard for documentation sharing in the industry.

Almost all enterprises use PDF documents to communicate all kinds of information: manuals, design documents, presentations, etc, even if they are originally composed with OpenOffice, LaTeX or some other word processor.

Almost all enterprises use proprietary tools to compose, read and manipulate PDF files. Thus, the workers of these enterprises are forced to use proprietary programs.


I still think HTML, suitably developed, would be a better solution. (Via LXer.)

20 November 2007

Actuate's Actual Open Source Snapshot

One of the sure signs that open source is moving into the mainstream is the number of surveys about it that are being conducted. The great thing about these is that while individually they bolster the case for open source in different areas, collectively they are almost overwhelmingly compelling.

The latest such survey comes from Actuate. It's actually an update of an earlier, more circumscribed one, and it ranges far more widely:


Following research first conducted in November 2005, exclusively targeted at financial services companies in the UK and Europe, the 2007 Actuate Open Source Software Survey broadened its scope to include research into attitudes to open source systems in both North America and Germany. The 2007 survey also extended beyond financial services to include public services, manufacturing and telecommunications (telco) in the new regions and now uniquely provides a detailed local insight as well as interesting regional comparisons across the geographies and the vertical sectors within them.

The top-line result?

Half the organizations surveyed stated that open source is either the preferred option or is explicitly considered in the software procurement process. One surprising note is that one-third of the organizations surveyed are now likely to consider open source business intelligence in their evaluations. This is a huge shift from just a few years ago.

The survey is available free of charge, but registration is required.