
10 February 2013

What's the next big platform for Linux?

Linux has a problem: it's running out of platforms to conquer. It's already the top operating system for smartphones and supercomputers, and is widely used in embedded and industrial systems. It's true the Year of the GNU/Linux desktop continues to be five years in the future, but the rise of tablets makes up for that in part. 

On The H Open.

15 November 2010

Microsoft: Super - But Not Quite Super Enough

Once upon a time, the Netcraft Web server market share survey was reported eagerly every month because it showed open source soundly trouncing its proprietary rivals. We don't hear much about that survey these days - not because things have changed, but precisely because they haven't: it has simply become a boring fact of life that Apache has always been the top Web server, still is, and probably will be for the foreseeable future. I think we're fast approaching that situation with the TOP500 supercomputing table.

On Open Enterprise blog.

09 November 2010

Is it Time for Free Software to Move on?

A remarkable continuity underlies free software, going all the way back to Richard Stallman's first programs for his new GNU project. And yet within that continuity, there have been major shifts: are we due for another such leap?

On The H Open.

01 June 2010

GNU/Linux *Does* Scale – and How

As everyone knows, GNU/Linux grew up as a project to create a completely free alternative to Unix. Key parts were written by Richard Stallman while living the archetypal hacker's life at and around MIT, and by Linus Torvalds – in his bedroom. Against that background, it's no wonder that one of Microsoft's approaches to attacking GNU/Linux has been to dismiss it on technical grounds: after all, such a rag-bag of code written by long-haired hippies and near-teenagers could hardly be compared with the product of decades of serious, top-down planning by some of the best coding professionals money can buy, could it?

On Open Enterprise blog.

27 January 2010

Enter the (Big) Dragon

As part of my continuing service to report on the fascinating developments in the Chinese chip sector, I pass on the following:

It's official: China's next supercomputer, the petascale Dawning 6000, will be constructed exclusively with home-grown microprocessors. Weiwu Hu, chief architect of the Loongson (also known as "Godson") family of CPUs at the Institute of Computing Technology (ICT), a division of the Chinese Academy of Sciences, also confirms that the supercomputer will run Linux. This is a sharp departure from China's last supercomputer, the Dawning 5000a, which debuted at number 11 on the list of the world's fastest supercomputers in 2008, and was built with AMD chips and ran Windows HPC Server.

It won't come as a surprise to readers of this blog that China's new supercomputer will be running Linux - over 80% of the world's big machines do. What's fascinating is that this is being built out of that home-grown Loongson chip - the one that Windows doesn't run on. As the same article explains:

The arrival of Dawning 6000 will be an important landmark for the Loongson processor family, which to date has been used only in inexpensive, low-power netbooks and nettop PCs. When the Dawning 5000a was initially announced, it too was meant to be built with Loongson processors, but the Dawning Information Industry Company, which built the computer, eventually went with AMD chips, citing a lack of support for Windows, and the ICT's failure to deliver a sufficiently powerful chip in time.

That means that as China builds more and more of these, and pushes the technology further and further, it will be Linux that benefits, not Windows, and Linux that spreads...

China + Loongson + Linux: this is one to watch...

Follow me @glynmoody on Twitter or identi.ca.

12 October 2009

Windows Does Not Scale

Who's afraid of the data deluge?

Researchers and workers in fields as diverse as bio-technology, astronomy and computer science will soon find themselves overwhelmed with information. Better telescopes and genome sequencers are as much to blame for this data glut as are faster computers and bigger hard drives.

While consumers are just starting to comprehend the idea of buying external hard drives for the home capable of storing a terabyte of data, computer scientists need to grapple with data sets thousands of times as large and growing ever larger. (A single terabyte equals 1,000 gigabytes and could store about 1,000 copies of the Encyclopedia Britannica.)

The next generation of computer scientists has to think in terms of what could be described as Internet scale. Facebook, for example, uses more than 1 petabyte of storage space to manage its users’ 40 billion photos. (A petabyte is about 1,000 times as large as a terabyte, and could store about 500 billion pages of text.)

Certainly not GNU/Linux: the latest TOP500 supercomputer rankings show the GNU/Linux family with an 88.60% share. Windows? Glad you asked: 1%.

So, forget about whether there will ever be a Year of the GNU/Linux Desktop: the future is about massive data-crunchers, and there GNU/Linux already reigns supreme, and has done for years. It's Windows that's got problems....
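
If you want to sanity-check the scales the article quotes, here's a minimal Python sketch (mine, not the article's) that works through its own round numbers - decimal units throughout, with the per-page and per-photo sizes simply being what those figures imply:

# Decimal storage units, as the quoted article uses them
GB = 10**9          # bytes
TB = 1_000 * GB     # "a single terabyte equals 1,000 gigabytes"
PB = 1_000 * TB     # "a petabyte is about 1,000 times as large as a terabyte"

# The article: one petabyte stores about 500 billion pages of text
pages_per_pb = 500 * 10**9
print(f"Implied size of a text page: {PB // pages_per_pb} bytes")  # 2000 bytes, ~2 KB

# The article: Facebook keeps its 40 billion photos in just over 1 PB
photos = 40 * 10**9
print(f"Implied average photo size: {PB // photos // 1000} KB")    # 25 KB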

Follow me @glynmoody on Twitter or identi.ca.

07 October 2009

Meet Microsoft, the Delusional

This is hilarious:

Jean Philippe Courtois, president of Microsoft Europe, described the company as an underdog in Paris today.

He said Bing had between three and five percent market share in search and could only grow - although he admitted it could take a long time.

...

Despite Microsoft having to live with open source software for 10 years, it had retained its share in the market place, he said.

Er, what, like the browser sector, where Firefox now has nearly 24% market share worldwide, and Microsoft's share is decreasing? Or Apache's 54% in the Web server world, where Microsoft's share is decreasing? Or GNU/Linux's 88% market share of the top 500 supercomputers in the world, where Microsoft's share is static?

Microsoft the underdog? Or just a dog?

Follow me @glynmoody on Twitter or identi.ca.

23 June 2009

GNU/Linux Tops TOP500 Supercomputers Again

The fact that GNU/Linux totally dominates the top 500 supercomputing list is hardly news, but the fact that it has managed to *increase* its market share yet further is.

Here are the results for June 2009:

GNU/Linux 443 (88.6%)
Windows 5 (1.0%)
Unix 22 (4.4%)

and here are the figures for six months ago:

GNU/Linux 439 (87.8%)
Windows 5 (1.0%)
Unix 23 (4.6%)
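
(As a quick sanity check, here's a minimal Python sketch - mine, not part of the original post - that derives those percentages from the raw system counts; the TOP500 list always ranks exactly 500 machines:)

# Reproduce the percentage shares from the raw TOP500 system counts above
LIST_SIZE = 500

counts = {
    "June 2009":     {"GNU/Linux": 443, "Windows": 5, "Unix": 22},
    "November 2008": {"GNU/Linux": 439, "Windows": 5, "Unix": 23},
}

for edition, systems in counts.items():
    print(edition)
    for os_name, n in systems.items():
        print(f"  {os_name}: {n} ({n / LIST_SIZE:.1%})")

Running it reproduces the shares above exactly - including Windows' rock-steady 1.0%.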

Notice that plucky little Windows, from that small and hopelessly out-gunned company up in Seattle, has bravely managed to increase its share by precisely 0%: an impressive result considering the millions of dollars it has spent trying to break into this market.

Snarky? Moi?

Update: More details about the top 20, and GNU/Linux's dominance here.

Follow me @glynmoody on Twitter or identi.ca.

17 November 2008

The Super Windows That...Couldn't

One of the more bizarre accusations flung by Microsoft at GNU/Linux over the years is that it doesn't scale. This is part of a larger campaign to portray it as a kind of “toy” operating system – fine for low-end stuff, but nothing you'd want to run your enterprise on....

On Open Enterprise blog.

06 February 2008

Running the Internet - All of It - on GNU/Linux

Everyone knows that Google uses hundreds of thousands of commodity PCs running GNU/Linux to power its services. Well, IBM wants to go one further: running everything - the entire Internet, for example - on a Blue Gene/P supercomputer running GNU/Linux:

In this paper we described the vision and exploration of Project Kittyhawk, an ongoing effort at IBM Research which explores the construction of a next-generation compute platform capable of simultaneously hosting many web-scale workloads. At scales of potentially millions of connected computers, efficient provisioning, powering, cooling, and management are paramount.

...

To test our hypothesis, we are prototyping a stack consisting of a network-enabled firmware layer to bootstrap nodes, the L4 hypervisor for partitioning and security enforcement, Linux as a standard operating system, and an efficient software packaging and provisioning system. An important aspect is that while these building blocks allow us to run a large variety of standard workloads, none of these components are required and therefore can be replaced as necessary to accommodate many diverse workloads. This flexibility, efficiency, and unprecedented scale makes Blue Gene a powerhouse for running computation at Internet scale.

(Via The Reg.)

02 January 2008

Vista's Problem: Microsoft Does Not Scale

It is deeply ironic that once upon a time Linux - and Linus - was taxed with an inability to scale. Today, though, when Linux is running everything from most of the world's supercomputers to the new class of sub-laptops like the Asus Eee PC and increasing numbers of mobile phones, it is Microsoft that finds itself unable to scale its development methodology to handle this range. Indeed, it can't even produce a decent desktop system, as the whole Vista fiasco demonstrates.

But the issue of scaling goes much deeper, as this short but insightful post indicates:

The world has been scaling radically since the Web first came on the scene. But the success of large, open-ended collaborations -- a robust operating system, a comprehensive encyclopedia, some "crowd-sourced" investigative journalism projects -- now is not only undeniable, but is beginning to shape expectations. This year, managers are going to have to pay attention.

Moreover, it points out exactly why scaling is important - and it turns out to be precisely the same reason that open source works so well (surprise, surprise):

The scaling is due to the basic elements in the Web equation: Lots of people, bazillions of pieces of information, and gigabazillions of links among them all. As more of the market, more of the supply chain, and more of the employees spend more of their time online, the scaled world of the Web begins to set the agenda for the little ol' real world.

15 November 2007

Adding Some Lustre to Supercomputing

Everybody knows that GNU/Linux absolutely dominates the top 500 supercomputing listings: in the latest survey it notches up an 85% share (Windows manages 1.2%). Less well-known - to me, at least - is the fact that Lustre, an open source cluster file system, also does well:

Lustre highlights include:

The #1 fastest supercomputer in the world.

Lustre is being used on 7 out of the top 10 fastest supercomputers in the world.

Out of the top 30 fastest supercomputers in the world - Lustre can be found on 16 of them.

25 April 2007

Virtual Mouse Brain is Penguin-Powered

One of GNU/Linux's unique properties is its ability to run on dozens of platforms (whereas Windows runs on precisely one: Intel-compatible processors). GNU/Linux can power anything from an embedded processor in a tiny industrial device, through mobile phones, PCs, minicomputers and mainframes, right up to massively-parallel supercomputers.

One of these, IBM's Blue Gene/L, has recently been used to model part of a mouse brain in near-real-time. Which means that GNU/Linux has just added a platform, albeit as an emulation. (Via Jamais Cascio.)

14 November 2006

Top500 Supercomputers: Guess Who's Top?

The Top500 Supercomputer list is always fun, not least because it shows us where we will all be in a few years' time. There are all sorts of cuts of the main data, but the one you'll really be interested in is here; it shows that GNU/Linux ran a cool 75% of the Top500, and that a certain other operating system's share is so nugatory it's not even mentioned by name.

16 May 2006

Bird 'Flu vs. Open Source, Open Data

IBM pushes all the right buttons in this announcement of an open source, open data project to predict and help stem the spread of infectious diseases - like bird 'flu.

Central to the effort will be the use of advanced software technologies, elements of which IBM intends to contribute to the open-source community, that are designed to help share information on disease outbreaks electronically and use it to predict how diseases will spread.

And

Ultimately, those plans could include development and distribution of more effective and timely vaccines as IBM taps into knowledge gained through a planned collaborative initiative known as "Project Checkmate," in which IBM and The Scripps Research Institute propose to conduct advanced biological research on influenza viruses. The collaboration is designed to predict the way viruses will mutate over time using advanced predictive techniques running on high performance computing systems, such as IBM's BlueGene supercomputer, allowing effective vaccines to be developed by drug-makers, drawing on the immunology and chemistry expertise at Scripps.

Blue Gene runs GNU/Linux in part, so maybe open source will really save the world. (Via Boing Boing.)