
02 March 2011

Open Source by Any Other Name...

As I noted on Tuesday, the UK government has been pretty much a total disaster when it comes to using open source. Indeed, it has arguably been a total disaster when it comes to using computers of any kind, spending far more on this area than any comparable European government. Moreover, the stuff is almost always late, and rarely works properly.

On Open Enterprise blog.

21 March 2010

Open Source's (Not-so-)Secret Sauce: Modularity

Why does open source succeed? Apart, that is, from the fact that it is created by huge numbers of amazingly clever and generous people? Or, put another way, what is a key ingredient that must be present for the open source methodology to be applicable to other spheres?

Modularity.

If the stuff to hand isn't modular, you can't really share, because your stuff isn't compatible with other people's stuff. If it isn't modular, you can't share out tasks and scale. If you can't share out tasks, you can't have people working independently, at their own pace and in their own way, which means the project isn't really open. If it isn't modular, you can't swap in some new elements while leaving everything else untouched, which means no "release early, release often", no experimentation, no rapid evolution. Modularity is indispensable.
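
To see what this means in code terms, here's a minimal sketch - all names invented for illustration - of two independently developed modules that are interchangeable because they share one small interface:

import bz2
import gzip
from typing import Protocol


class Compressor(Protocol):
    # Any module honouring this interface can be swapped in.
    def compress(self, data: bytes) -> bytes: ...


class GzipCompressor:
    def compress(self, data: bytes) -> bytes:
        return gzip.compress(data)


class Bz2Compressor:
    # A drop-in replacement, developed separately, swapped in
    # without touching anything else in the system.
    def compress(self, data: bytes) -> bytes:
        return bz2.compress(data)


def archive(payload: bytes, compressor: Compressor) -> bytes:
    # The caller depends only on the interface, so each module can
    # "release early, release often" at its own pace.
    return compressor.compress(payload)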

I think that's why open source hardware has singularly failed to take off. It's difficult to make bunches of atoms modular in the way that bunches of bits are (at least until we have general 3D printers, in which case we're done...).

But could there be a way of introducing that modularity at a higher level so as to enjoy the benefits outlined above? I do believe there is, and with hindsight, it was pretty obvious (er, so why didn't I think of it?). It's called OpenStructures:

The OS (OpenStructures) project explores the possibility of a modular construction model where everyone designs for everyone on the basis of one shared geometrical grid. It initiates a kind of collaborative Meccano to which everybody can contribute parts, components and structures.

As you can see, the clever people behind this project have the magic word "modular" in there. Specifically, they have devised a very simple grid system that ensures that things fit together, even when they're made by different people at different times and for different purposes. Significantly, the grid is based on binary multiples and subdivisions:

If you choose to apply the OS grid for the dimensions of a part, at least one of the measurements of this part (length, wideness and thickness or height) should correspond to either 0,125cm / 0,25cm / 0,5cm / 1cm / 2cm and multiples of 2cm in order to be compatible with other parts. (see part examples)
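
Just to make the rule concrete, here's how it might be expressed as a quick check - a hypothetical helper, not anything the OpenStructures project itself provides (measurements converted to decimal-point centimetres):

# Hypothetical checker for the OS grid rule quoted above: a part fits
# the grid if at least one dimension is 0.125, 0.25, 0.5, 1 or 2 cm,
# or a whole multiple of 2 cm.
def on_os_grid(dim_cm: float, tol: float = 1e-9) -> bool:
    base_sizes = (0.125, 0.25, 0.5, 1.0, 2.0)
    if any(abs(dim_cm - b) <= tol for b in base_sizes):
        return True
    if dim_cm <= 0:
        return False
    remainder = dim_cm % 2.0
    return remainder <= tol or (2.0 - remainder) <= tol


def part_is_compatible(length_cm: float, width_cm: float, height_cm: float) -> bool:
    # At least one of the three measurements must land on the grid.
    return any(on_os_grid(d) for d in (length_cm, width_cm, height_cm))


print(part_is_compatible(4.0, 3.3, 0.7))   # True: 4 cm is a multiple of 2 cm
print(part_is_compatible(3.3, 3.1, 0.77))  # False: nothing lands on the grid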


What's really impressive about this project is not just this insight into the modularity of elements, but the completeness of the vision that results. For example, there is an explicit hierarchy of elements, starting from OS Parts, which combine to form OS Components, from which are made OS Structures, and finally OS Superstructures.
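
In code terms the hierarchy is straightforward composition - a hypothetical rendering, with invented field names:

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class OSPart:
    name: str
    dims_cm: Tuple[float, float, float]  # at least one on the OS grid


@dataclass
class OSComponent:
    parts: List[OSPart] = field(default_factory=list)


@dataclass
class OSStructure:
    components: List[OSComponent] = field(default_factory=list)


@dataclass
class OSSuperstructure:
    structures: List[OSStructure] = field(default_factory=list)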

It's an amazing vision, and I think it could have a major impact on the world of open source hardware, at least of this particular construction-set type. If you want to see some of the exciting objects that have already been created, don't miss the fab photos on the project's blog. (Via @opensourcerer.)

Follow me @glynmoody on Twitter or identi.ca.

18 September 2008

Enterprise Open Source 2.0

Yesterday I met up with Brian Gentile, the CEO of JasperSoft. He's relatively new to the job, although not new to the company, since he had already been on its board for some time. It was striking that much of our conversation was about marketing and management, and that's probably a fair reflection of why Gentile's there: he's been brought in essentially to take that little old open source startup to the next level – and that means worrying about all that tiresome adult stuff like articulating corporate strategies, conversion rates, and generally getting a good operational handle on things....

On Open Enterprise blog.

03 April 2008

You Know Open Source Has Really Arrived...

...when the two main political parties in the UK are squabbling over who is truer to the open source spirit:

David Cameron embraced Linux, open source and bottoms-up decision-making today as he detailed his vision of a Tory innovation policy in a speech at the National Endowment for Science, Technology and the Arts.

Cameron pledged that a Tory government would set the UK’s data free – but not in a bad way, like HMRC. Rather, he said, he wanted to ensure people could access information which allowed them to create “innovative applications that serve the public benefit”. This “information liberation” meant ensuring spending data was transparent for example, and that people could easily compare crime figures.

At the same time, he said, “We also want to see how open source methods can help overcome the massive problems in government IT programs”. Cameron said the Tories would reject Labour’s addiction to the mainframe model. Instead, he claimed, a Conservative government would follow private sector best practice and introduce open standards, “that enables IT contracts to be split up into modular components”.

...


Cameron’s pledge to open source comes just days after the minister for transformational government, Tom Watson, claimed that Labour is the party that really, you know, gets open source.

In his speech on Monday announcing the government’s Power of Information taskforce, he referred to an earlier speech where “I talked about the three rules of open source - one, nobody owns it. Two, everybody uses it. And three, anyone can improve it." He then recounted how the Tories immediately sent out an email “laying claim that in fact they are the ‘owners’ of these new ideas. I was accused of plundering policies from the Conservatives.”

Fight, fight, fight.

31 March 2008

The Marvels of Modularity

One word that has cropped up time and again on this blog is "modularity". It's one of the prime characteristics of the open source way - and one of its greatest strengths. No wonder, then, that Microsoft has finally cottoned on - helped, no doubt, by the abject failure of its Vista monster:


When Windows 7 launches sometime after the start of 2010, the desktop OS will be Microsoft's most "modular" yet. Having never really been comfortable with the idea of a single, monolithic desktop OS offering, Microsoft has offered multiple desktop OSes in the marketplace ever since the days of Windows NT 3.1, with completely different code bases until they were unified in Windows 2000. Unification isn't necessarily a good thing, however; Windows Vista is a sprawling, complex OS.

A singular yet highly modular OS could give Microsoft the best of all possible worlds: OSes that can be highly customized for deployment but developed monolithically. One modular OS to rule them all, let's say.

Modularity has another huge benefit for Microsoft: it will allow it to address the nascent ultraportable market, something that it finds hard to do with its current operating systems.

Needless to say, though, even in making this sensible move, Microsoft manages to add a touch of absurdity:

Unsurprisingly, Microsoft already has a patent on a "modular operating system" concept.

A *patent* on modularity? Give me a break....

27 February 2008

What Windows Server 2008 Learned from OSS

Fascinating stuff from Microsoft's Sam Ramji:

When I think about what works really well in open source development and technology, the following things stand out:

* Modular architectures
You can find these wherever you see participation at scale – and often a rearchitecture to a more modular system precedes expanded participation. Great examples of this are Firefox, OpenOffice, and X11 – from both the historical rearchitecture and the increased participation that resulted. The Apache HTTP server and APR are good examples that have been modular for as long as I can recall.

* Programming language agnostic
A given project uses a consistent language, but there are no rules on what languages are in scope or out of scope. Being open to more languages means opportunity to attract more developers – the diversity of PHP/Perl/Python/Java has been a core driver in the success of a number of projects including Linux.

* Feedback-driven development
The “power user” as product manager is a powerful shift in how to build and tune software – and this class of users includes developers who are not committing code back, but instead submitting CRs and defects – resulting in a product that better fits its end users.

* Built-for-purpose systems
Most frequently seen in applications of Linux, the ability to build a system that has just what is needed to fulfill its role and nothing else (think of highly customizable distributions like Gentoo or BusyBox, as well as fully custom deployments).

* Sysadmins who write code
The ability of a skilled system administrator to write the “last mile” code means that they can make a technology work in their particular environment efficiently and often provide good feedback to developers. This is so fundamental to Unix and Linux environments that most sysadmins are competent programmers.

* Standards-based communication
Whether the standard is something from the IETF or W3C, or simply the implementation code itself, where these are used projects are more successful (think of Asterisk and IAX2) and attract a larger ecosystem of software around them.

What's interesting about this is not that it's an astute analysis - which it is - but that Ramji doesn't mind making it public while admitting that Windows is learning from open source. Of course, it would be stupid not to, but it's nonetheless an important sign of how things are finally changing at Microsoft that it's prepared to trumpet the fact - and of the irresistible rise of the open source way.

19 November 2007

Modular Magazines

After modular books, now this:

Google may soon begin to offer users the ability to create customized, printed magazines from Internet content. And print ads included in the magazine would be customized, too.

The future is modular.

17 November 2007

Modular Books

Modularisation is one of the key elements of open processes: so why can't we have modular books? Well, we can, up to a point:


On Wednesday, the Arizona community college announced a partnership with Pearson Custom Publishing to allow Rio Salado professors to piece together single individualized textbooks from multiple sources. The result, in what could be the first institution-wide initiative of its kind, will be a savings to students of up to 50 percent, the college estimates, as well as a savings of time to faculty, who often find themselves revising course materials to keep pace with continuously updated editions.

However, this is only with texts from one source: imagine if you could do this with *any* text. (Via if:book.)

13 November 2007

Of Bazaars and Dangerous Co-location

I often bang on about modularity in this blog, and its critical importance to creating and running open projects. Here are some more thoughts on the subject, along with many interesting ruminations on creating a Raymondian bazaar, and the state of open source companies today. It concludes by answering a key question it posed itself:

Why do so many open-source projects not have the active community of external contributors they are hoping for? Because they have been largely developed by co-located teams of hired software engineers, 100% dedicated to the project, managed and organized like any traditional software development effort. This seems to be especially true for the new crop of ‘custom build’ open-source companies, which would like to take advantage of the open-source business model. They might hope to also enjoy the advantages of the open-source development model one day, but achieving that requires a conscious effort.

Good stuff.

19 May 2007

Microsoft Starts to Get the Modularity Bug

First, this incredible opening par:

Some of the changes in the upcoming release of Windows Server 2008 are a response to features and performance advantages that have made Linux an attractive option to Microsoft customers.

Er, say that again? Windows Server 2008 is explicitly responding to GNU/Linux?

Then, this little nugget:

"Having less surface area does reduce the servicing and the amount of code you have running and exposed, so we have done a lot of work in 2008 to make the system more modular. You have the server manager; every role is optional, and there are more than 30 components not installed by default, which is a huge change," Laing said.

Ah, yes, modularity....

18 May 2007

In Praise of Modularity (Again)

News that Firefox users tend to be more up-to-date with their security patches is interesting, especially on account of the suggested explanation:

Much of this patching success has to be credited to Firefox's automatic update mechanism, which debuted in version 1.5 but was improved in version 2.0. The browser checks to see if a new version is available and notifies the user when it finds one. The security updates tend to be small (around 200KB to 700KB), which also makes the updating process less painful.

Internet Explorer, in contrast, is typically updated along with the rest of the system with Windows Update. Regular users of Windows Update automatically got upgraded from IE 6 to IE 7, so it is not surprising that people still stuck on IE 6 are not updating as much as IE 7. It's possible to assume that many of the people who aren't using Windows Update are avoiding it because the Windows Update web site checks (using WGA) to see if the user has a legitimate copy of the operating system, but as critical updates for IE 6 are still automatically downloaded by Windows even if WGA fails, it seems more likely that the numbers include legitimate users who have turned automatic updates off.

Once again, the virtues of modularity become clear - and turn out to have very clear real-world benefits too, in this case.

16 May 2007

Fighting Climate Change with Open Data

Here's an interesting idea on several levels:

the Zerofootprint platform, powered by Business Objects, provides urban dwellers the ability to view their “environmental footprint” – the effect their daily habits have on pollution levels and the strain they place on our natural resources.

Enter accessible data — such as miles driven each year, miles flown, kilowatt hours used, location of home and office — and you can easily calculate your effects on the earth. The calculator measures not only the amount of carbon dioxide emitted (the carbon footprint) but also the use of resources such as land, trees and water. Once an individual's impact has been calculated, the Zerofootprint tool provides information on how to reduce it, measuring the results.

I think this makes an important point: if you can't measure something - in this case environmental impact - then you can't manage it. Providing direct feedback to people on the consequences of their day-to-day choices seems a sensible way to engage them in fighting climate change and the destruction of the environmental commons.
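
The underlying arithmetic is simple enough to sketch - though note that the emission factors below are rough illustrative figures of my own, not Zerofootprint's actual coefficients:

KG_CO2_PER_MILE_DRIVEN = 0.4  # typical petrol car (assumed)
KG_CO2_PER_MILE_FLOWN = 0.2   # per passenger-mile (assumed)
KG_CO2_PER_KWH = 0.5          # grid-average electricity (assumed)


def carbon_footprint_kg(miles_driven: float, miles_flown: float,
                        kwh_used: float) -> float:
    return (miles_driven * KG_CO2_PER_MILE_DRIVEN
            + miles_flown * KG_CO2_PER_MILE_FLOWN
            + kwh_used * KG_CO2_PER_KWH)


# For example, 8,000 miles driven, 4,000 flown and 3,500 kWh a year:
print(f"{carbon_footprint_kg(8000, 4000, 3500) / 1000:.1f} tonnes CO2")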

Interestingly, there's another level:

Much of the data gathered will be stored on the Insight database — and then the real work begins.

The challenge, or challenges, will not stop with the creation of a database. As soon as a representative sample size is available, business analysts and number crunchers everywhere can roll up their sleeves to use the information in meaningful ways.

For instance, imagine a visualization comparing the carbon footprint per kilowatt hour of electricity used in Paris versus Shanghai.

“When we are able to analyze and visualize this data, that is bound to suggest a myriad of solutions,” says Ron Dembo, founder of Zerofootprint, whose mission is nothing less than to change the world by helping people reduce their environmental footprint. “The database created here will be the ‘creative commons’ for building models for many different opportunities.”

Again, this is hardly a novel insight, but it is an important idea. Aggregation of open data in this way provides a whole that is greater than the sum of the parts. What's striking is that both this and the idea of providing some kind of feedback lie at the heart of open source and related open endeavours. Modularisation means that people can work on small elements that together contribute to a larger whole; and the feedback they get for their efforts - typically peer esteem - is what keeps them going.
(Via Ars Technica.)

09 May 2007

A Theory of Modularity

I've mentioned a few times how important modularity is to the efficiency of openness. This seems pretty obvious, intuitively, but it's nice to know that some academics have produced a rather nice, rigorous demonstration of why this should be the case for software:

Important software modularity principles, such as the information hiding criterion, have remained informal. DSM modeling and Baldwin and Clark’s design rule theory have the potential to formally account for how design rules create options in the form of independent modules and enable independent substitution.

This paper evaluated the applicability of the model and theory to real-world large-scale software designs by studying the evolution of two complex software platforms through the lens of DSMs and design rule theory. The results showed that (1) DSM models can precisely capture key characteristics of software architecture by revealing independent modules, design rules, and the parts of a system that are not well modularized; (2) design rule theory can formally explain why some software systems are more adaptable, and how a modularization activity, such as refactoring, conveys strategic advantages to a company.

Er, quite. (Via Michael Tiemann.)
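
For anyone who hasn't met them, a DSM is simply a square dependency matrix, and modularity shows up as an absence of cross-dependencies outside the agreed design rules. A toy sketch (element names invented, not taken from the paper):

# dsm[i][j] == 1 means element i depends on element j.
elements = ["ui", "ui_widgets", "core_api", "storage", "storage_index"]
dsm = [
    [0, 1, 1, 0, 0],  # ui            -> ui_widgets, core_api
    [0, 0, 1, 0, 0],  # ui_widgets    -> core_api
    [0, 0, 0, 0, 0],  # core_api      (the shared "design rule")
    [0, 0, 1, 0, 1],  # storage       -> core_api, storage_index
    [0, 0, 1, 0, 0],  # storage_index -> core_api
]


def independent(group_a, group_b, design_rules):
    # Two groups form independent modules if neither depends on the
    # other except through the agreed design rules (shared interfaces).
    idx = {name: i for i, name in enumerate(elements)}
    allowed = {idx[d] for d in design_rules}
    for a in group_a:
        for b in group_b:
            i, j = idx[a], idx[b]
            if j not in allowed and dsm[i][j]:
                return False
            if i not in allowed and dsm[j][i]:
                return False
    return True


# The UI and storage sides only touch through core_api, so either can
# be substituted without disturbing the other:
print(independent(["ui", "ui_widgets"], ["storage", "storage_index"],
                  design_rules=["core_api"]))  # True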

30 April 2007

Of Modules, Atoms and Packages

I commented before that I thought Rufus Pollock's use of the term "atomisation" in the context of open content didn't quite capture what he was after, so I was pleased to find that he's done some more work on the concept and come up with the following interesting refinements:

Atomization

Atomization denotes the breaking down of a resource such as a piece of software or collection of data into smaller parts (though the word atomic connotes irreducibility it is never clear what the exact irreducible, or optimal, size for a given part is). For example a given software application may be divided up into several components or libraries. Atomization can happen on many levels.

At a very low level when writing software we break things down into functions and classes, into different files (modules) and even group together different files. Similarly when creating a dataset in a database we divide things into columns, tables, and groups of inter-related tables.

But such divisions are only visible to the members of that specific project. Anyone else has to get the entire application or entire database to use one particular part of it. Furthermore anyone working on any given part of the application or database needs to be aware of, and interact with, anyone else working on it — decentralization is impossible or extremely limited.

Thus, atomization at such a low level is not what we are really concerned with, instead it is with atomization into Packages:

Packaging

By packaging we mean the process by which a resource is made reusable by the addition of an external interface. The package is therefore the logical unit of distribution and reuse and it is only with packaging that the full power of atomization’s “divide and conquer” comes into play — without it there is still tight coupling between different parts of a given set of resources.

Developing packages is a non-trivial exercise precisely because developing good stable interfaces (usually in the form of a code or knowledge API) is hard. One way to manage this need to provide stability but still remain flexible in terms of future development is to employ versioning. By versioning the package and providing ‘releases’ those who reuse the packaged resource can use a specific (and stable) release while development and changes are made in the ‘trunk’ and become available in later releases. This practice of versioning and releasing is already ubiquitous in software development — so ubiquitous it is practically taken for granted — but is almost unknown in the area of knowledge.

Tricky stuff, but I'm sure it will be worth the effort if the end-result is a practical system for modularisation, since this will allow open content to enjoy many of the evident advantages of open code.
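
In code terms, Pollock's packaging-plus-versioning point is the familiar one of a stable interface plus pinned releases - a hedged sketch, with invented names:

class DatasetPackage:
    # A packaged resource: everything behind records() is internal
    # detail; the method itself is the stable external interface.
    api_version = (1, 2, 0)  # a released, stable version

    def records(self):
        yield {"year": 2007, "value": 42}


def compatible(required: tuple, provided: tuple) -> bool:
    # Semantic-versioning-style rule: same major version, and at
    # least the minor/patch level the consumer was built against.
    return provided[0] == required[0] and provided[1:] >= required[1:]


pkg = DatasetPackage()
assert compatible((1, 0, 0), pkg.api_version)  # a 1.x consumer still works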

19 March 2007

Open Knowledge, Open Greenery and Modularity

On Saturday I attended the Open Knowledge 1.0 meeting, which was highly enjoyable from many points of view. The location was atmospheric: next to Hawksmoor's amazing St Anne's church, which somehow manages the trick of looking bigger than its physical size, inside the old Limehouse Town Hall.

The latter had a wonderfully run-down, almost Dickensian feel to it; it seemed rather appropriate as a gathering place for a ragtag bunch of ne'er-do-wells: geeks, wonks, journos, activists and academics, all with dangerously powerful ideas on their minds, and all more dangerously powerful for coming together in this way.

The organiser, Rufus Pollock, rightly placed open source squarely at the heart of all this, and pretty much rehearsed all the standard stuff this blog has been wittering on about for ages: the importance of Darwinian processes acting on modular elements (although he called the latter atomisation, which seems less precise, since atoms, by definition, cannot be broken up, but modules can, and often need to be for the sake of increased efficiency).

One of the highlights of the day for me was a talk by Tim Hubbard, leader of the Human Genome Analysis Group at the Sanger Institute. I'd read a lot of his papers when writing Digital Code of Life, and it was good to hear him run through pretty much the same parallels between open genomics and the other opens that I've made and make. But he added a nice twist towards the end of his presentation, where he suggested that things like the doomed NHS IT programme might be saved by the use of Darwinian competition between rival approaches, each created by local NHS groups.

The importance of the ability to plug into Darwinian dynamics also struck me when I read this piece by Jamais Cascio about carbon labelling:

In order for any carbon labeling endeavor to work -- in order for it to usefully make the invisible visible -- it needs to offer a way for people to understand the impact of their choices. This could be as simple as a "recommended daily allowance" of food-related carbon, a target amount that a good green consumer should try to treat as a ceiling. This daily allowance doesn't need to be a mandatory quota, just a point of comparison, making individual food choices more meaningful.

...

This is a pattern we're likely to see again and again as we move into the new world of carbon footprint awareness. We'll need to know the granular results of actions, in as immediate a form as possible, as well as our own broader, longer-term targets and averages.

Another way of putting this is that for these kind of ecological projects to work, there needs to be a feedback mechanism so that people can see the results of their actions, and then change their behaviour as a result. This is exactly like open source: the reason the open methodology works so well is that a Darwinian winnowing can be applied to select the best code/content/ideas/whatever. But that is only possible when there are appropriate metrics that allow you to judge which actions are better, a reference point of the kind Cascio is writing about.

By analogy, we might call this particular kind of environmental action open greenery. It's interesting to see that here, too, the basic requirement of modularity turns out to be crucially important. In this case, the modularity is at the level of the individual's actions. This means that we can learn from other people's individual success, and improve the overall efficacy of the actions we undertake.

Without that modularity - call it closed-source greenery - everything is imposed from above, without explanation or the possibility of local, personal, incremental improvement. That may have worked in the 20th century, but given the lessons we have learned from open source, it's clearly not the best way.

25 January 2007

Open Linux Router

When I wrote about the open source router Vyatta, I noted that it was slightly ironic that only now is free software addressing the area. So it's good to see another project, called simply the Open Linux Router doing the same:


The Open Linux Router will be a network appliance unlike any other. Its modular design will empower the user with the ability to pick and choose what features and/or services will and will not be included on the implementation. By scaling the features and services down, the Open Linux Router can easily be installed on a small, embedded device. Although, if the implementation demands functionality, it is just as easy to add the features, which provides the Open Linux Router with a wide and diverse demographic. Residential and small business implementations have a certain set of needs, while an enterprise implementation requires a more concentrated operation and that's what drives the modular approach to services and features. The learning curve is also greatly reduced through a consolidation of the nominal devices that your IT staff would currently have to master to rise to the same level of productivity. This project aims to encourage open source software for network systems and solutions.

(Via Linux and Open Source Blog.)

16 January 2007

We Are All Modular Now

One of the central theses of this blog is that for things like software, modularity produces more and better code, because it allows a kind of Darwinian selection to kick in on an atomistic basis.

But wait: isn't another of my theses that openness is appropriate across a whole range of activities - notably content production? And so...that would suggest that content should become more modular too, allowing a similar kind of winnowing process to take place.

Eek!

06 November 2006

Open Source as Archaeology

An interesting thought about the modular design of free software:

We have observed a number of projects where software development is driven by the identification, selection, and combination of working software systems. More often than not, these constituent parts are Open Source software systems and typically not designed to be used as components. These parts are then made to interoperate through wrappers and glue code. We think this trend is a harbinger of things to come.