Weird California Incident Last Year Points To The Real Threat To The Power Grid (Hint: It's Not Cyberattacks)
Via Bruce Schneier's blog, we learn of the following intriguing story published in Foreign Policy:
On Techdirt.
Posted by Glyn Moody at 1:57 pm 0 comments
Labels: bruce schneier, california, cyberattacks
A few months back, we wrote about the University of California's plan to lock up even more knowledge in the form of patents, in the hope that this would bring in lots of cash. But as Techdirt has reported time and again over the years, patenting research does not bring in more money to fund further research; in fact, it probably doesn't bring in any money at all once you allow for the costs of running tech transfer offices. Moreover, there's evidence that making the results of research freely available is much better for the wider economy than trying to turn them into intellectual monopolies.
Posted by Glyn Moody at 5:18 pm 0 comments
Labels: california, licensing, patents, techdirt, trolls
Techdirt has been monitoring the inexorable rise of open access in the academic world for a while now. But even against a background of major wins, this latest news from the University of California (UC) is still big, not least because it seems to represent a major shift there:
Posted by Glyn Moody at 10:24 am 0 comments
Labels: california, open access, patents, sharing, techdirt
At the end of last year, we wrote about an extraordinary attempt by the University of California (UC) to resuscitate the infamous "Eolas" patents that were thrown out earlier by a jury in East Texas. Clearly, the University of California likes patents, and the way that they can be used to extract money from people with very little effort. In fact, it likes them so much it is trying to privatize research produced by taxpayer-funded laboratories so that even more patents can be taken out on the work, and even more money obtained through licensing them. The background to this new approach, implemented via a new entity provisionally entitled "Newco", is described in a fantastic feature by Darwin BondGraham that appears in East Bay Express:
Posted by Glyn Moody at 8:15 pm 0 comments
Labels: california, eolas, monopolies, patents, techdirt
Two years ago, Techdirt wrote about the major report "Media Piracy in Emerging Economies", which explored how media and software piracy in emerging countries is largely a question of economics: people and companies there simply cannot afford Western-style pricing, and resort to alternative sourcing. That hasn't stopped media and computer companies from demanding that governments around the world inflict ever harsher punishments on their own people.
Posted by Glyn Moody at 3:11 pm 0 comments
Labels: california, piracy, software, techdirt
The acrimonious debate and serious lobbying that developed around California's Proposition 37, which would have required the labelling of genetically-modified ingredients in food products had it passed, is an indication that the subject inspires extreme views and involves big money. But an interesting post in Slate argues that GM labelling is really a minor issue compared to the main problem -- gene patents:
Posted by Glyn Moody at 9:56 pm 0 comments
Labels: california, gene patents, monsanto, techdirt
Elections seem like a no-brainer for openness: after all, fairness requires transparency, and you don't get more transparent than being fully open. And yet previous e-voting systems have proved notoriously fallible - not least because they weren't open. The Open Voting Consortium aims to solve these problems:
The Open Voting Consortium is a not-for-profit organization dedicated to the development, maintenance, and delivery of trustable and open voting systems for use in public elections. We are comprised of computer scientists, voting experts, and voting rights activists. We have a growing international membership base, but our organizing efforts are currently focused in California where we are actively engaged in legislation and implementing Open Voting as a model for the United States.
Needless to say, it's based on open source:
We have developed (1) a prototype of open-source software for voting machines, (2) an electronic voting machine that prints a paper ballot, (3) a ballot verification station that scans the paper ballot and lets a voter hear the selections, and (4) stations with functions to aid visually impaired people so they can vote without assistance. Open source means that anyone can see how the machines are programmed and how they work.
Posted by Glyn Moody at 7:24 am 0 comments
Labels: ballots, california, e-voting, open voting consortium
Here's some joined-up thinking: providing open access to key greenhouse gas figures:
The Global Warming Solutions Act of 2006 (A.B. 32) requires CARB to adopt regulations creating a greenhouse gas registry by Jan. 1, 2008, putting in place what appears to be the country's most comprehensive and sophisticated greenhouse gas registry.
The proposed regulations were developed with input from public and private stakeholders, state agencies and the general public. Modeled after the California Climate Action Registry (CCAR), a voluntary greenhouse gas reporting program started in 2001, the regulations detail which industrial sectors will report, what the reporting and verification thresholds and requirements will be, and how calculations will be made. Approximately 800 facilities will be required to report greenhouse gas (GHG) emissions, which CARB estimates will represent 94 percent of California's total carbon dioxide production from stationary sources.
(Via Open Access News.)
Posted by Glyn Moody at 6:33 pm 0 comments
Labels: budapest open access initiative, california, greenhouse gases
Well, well:
In what appears to be a surprise move, four state attorneys general who previously praised the effectiveness of Microsoft's antitrust settlement with the feds are now changing course.
In a nine-page court filing with U.S. District Judge Colleen Kollar-Kotelly on Thursday, officials in New York, Maryland, Louisiana and Florida said they were joining a group of six states, led by California, and the District of Columbia in calling for extending oversight on Redmond until 2012.
And listen to this:
The New York group's filing centers largely on what it calls the "indisputably resilient" monopoly that Microsoft holds in the operating system realm. The attorneys general said they were "mindful" that Windows' approximately 90 percent market share in client operating systems is not the only test for how successful the antitrust agreement has been. But they added, "the absence of meaningful erosion in Windows' market share is still problematic for the public interest."
What a fine phrase that is: "indisputably resilient". I think I could really get to like using that....
Posted by Glyn Moody at 4:49 pm 2 comments
Labels: california, florida, indisputably resilient, kollar-kotelly, louisiana, maryland, Microsoft, monopoly, New York, open source windows
If, like me, you somehow didn't make it to the Virtual Worlds 2007 conference, fear not: two reporters with, er, inimitable styles did attend, and have filed virtuoso reports on Philip Rosedale's speech. Read them both, and feel virtually there/hair.
Urizenus Sklar:
He talks about how the Mandelbrot program on his computer blew his mind. He and a friend follow the replication of a starfish in a diagram as they zoom in on regions of it. Did I imagine this or did he say *chocolate* starfish. “The area of diagram was the same as the surface of the earth” – the earth tiled with chocolate starfish. Imagine.
Prokofy Neva, Kremlindenologist:
So I walk into the 55th floor of the Millenium Hotel and I see it...The Hair. Our Hero's Hair is Holding Up. Relieved, I shake Philip Rosedale's hand and ask him how he's holding up, but the message has already been telegraphed to me: gelled, sturdy, stellar, architectural -- thank you very much. Philip's hair, if it could talk, would describe what it's like being the Cat in the Hat holding up all those sims, a rake, a plate, a cake...So...how many sims is it now? He gives me a figure..it's different than the figure Joe Miller gives later, you know, I don't think they really know, it's *almost organic* this stuff and out of control. 7800?
If you can imagine it's possible -- Philip's hair is *even more amazing* than it was at SOP II and SLCC I, which is when I first was exposed to the construction. People in New York don't do that kind of thing to their hair. I mean, you just never see it. Walk around, look. So this is So California. And...it's like...so cool and perfectly constructed, with just the right amount of mix of "bedhead" and "tousled bad boy" and "mad scientist". Gazing out over the sterilized wound of downtown, I couldn't help thinking of that time Nikola Tesla shorted out lower Manhattan with some experiment on Houston St...Philip looks more than ever like he stuck his hand in the socket and still finds it interesting...
Utterly brilliant.
Posted by Glyn Moody at 4:23 pm 2 comments
Labels: california, chocolate, hair, houston street, mandelbrot, New York, nikola tesla, philip rosedale, prokofy neva, starfish, urizenus sklar, virtual worlds 2007
One of the favourite games of scholars working on ancient texts that have come down to us from multiple sources is to create a family tree of manuscripts. The trick is to look for groups of textual divergences - a word added here, a mis-spelling there - to spot the gradual accretions, deletions and errors wrought by incompetent, distracted or bored copyists. Once the tree has been established, it is possible to guess what the original, founding text might have looked like.
You might think that this sort of thing is on the way out; on the contrary, though, it's an extremely important technique in bioinformatics - hardly a dusty old discipline. The idea is to treat genomes deriving from a common ancestor as a kind of manuscript, written using just the four letters - A, C, G and T - found in DNA.
Then, by comparing the commonalities and divergences, it is possible to work out which manuscripts/genomes came from a common intermediary, and hence to build a family tree. As with manuscripts, it is then possible to hazard a guess at what the original text - the ancestral genome - might have looked like.
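To make the analogy concrete, here is a minimal, purely illustrative sketch in Python (the species names and sequences below are invented for the example): it counts the pairwise differences between short DNA "manuscripts", much as a philologist tallies divergent readings, and then takes a naive per-position majority vote as a crude guess at an ancestral text. Real ancestral-genome work, of course, uses probabilistic models over whole-genome alignments rather than anything this simple.

```python
# A toy illustration only: the sequences and species names are invented.
from collections import Counter
from itertools import combinations

sequences = {
    "speciesA": "ACGTACGTAC",
    "speciesB": "ACGTACGTTC",
    "speciesC": "ACGAACGTAC",
    "speciesD": "TCGAACGTAC",
}

def hamming(s, t):
    """Count the positions at which two equal-length sequences differ."""
    return sum(a != b for a, b in zip(s, t))

# Pairwise divergence: the fewer the differences, the more recently the two
# "manuscripts" are likely to have shared a copyist (a common ancestor).
for (name1, seq1), (name2, seq2) in combinations(sequences.items(), 2):
    print(f"{name1} vs {name2}: {hamming(seq1, seq2)} differences")

# Naive guess at the ancestral text: at each position, take the letter that
# occurs most often across the descendant sequences (majority-rule consensus).
consensus = "".join(
    Counter(column).most_common(1)[0][0] for column in zip(*sequences.values())
)
print("Naive ancestral guess:", consensus)
```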
That, broadly, is the idea behind some research that David Haussler at the University of California at Santa Cruz is undertaking, and which is reported on in this month's Wired magazine (freely available thanks to the magazine's enlightened approach to publishing).
As I described in Digital Code of Life, Haussler played an important role in the closing years of the Human Genome Project:
Haussler set to work creating a program to sort through and assemble the 400,000 sequences grouped into 30,000 BACs [large-scale fragments of DNA] that had been produced by the laboratories of the Human Genome Project. But in May 2000, when one of his graduate students, Jim Kent, inquired how the programming was going, Haussler had to admit it was not going well. Kent had been a professional programmer before turning to research. His experience in writing code against deadlines, coupled with a strongly-held belief that the human genome should be freely available, led him to volunteer to create the assembly program in short order.
Kent later explained why he took on the task:
There was not a heck of a lot that the Human Genome Project could say about the genome that was more informative than 'it's got a lot of As, Cs, Gs and Ts' without an assembly. We were afraid that if we couldn't say anything informative, and thereby demonstrate 'prior art', much of the human genome would end up tied up in patents.
Using a hundred 800 MHz Pentiums - powerful machines in the year 2000 - running GNU/Linux, Kent was able to lash up a program, assemble the fragments and save the human genome for mankind.
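As an aside, the flavour of the assembly problem can be conveyed with a toy sketch (this is emphatically not Kent's program, and the fragments below are invented): repeatedly find the pair of fragments with the longest suffix-prefix overlap and merge them, until nothing overlaps any more. The real task involved hundreds of thousands of error-prone fragments and far cleverer data structures.

```python
# A toy greedy assembler, for illustration only.
def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that matches a prefix of b."""
    best = 0
    for k in range(min_len, min(len(a), len(b)) + 1):
        if a.endswith(b[:k]):
            best = k
    return best

def greedy_assemble(fragments):
    frags = list(fragments)
    while len(frags) > 1:
        # Find the pair with the largest overlap and merge it.
        best = (0, 0, 1)  # (overlap length, i, j)
        for i in range(len(frags)):
            for j in range(len(frags)):
                if i != j:
                    o = overlap(frags[i], frags[j])
                    if o > best[0]:
                        best = (o, i, j)
        o, i, j = best
        if o == 0:
            break  # nothing overlaps any more
        merged = frags[i] + frags[j][o:]
        frags = [f for k, f in enumerate(frags) if k not in (i, j)] + [merged]
    return frags

# Invented fragments of the made-up sequence "ACGTACGGTACT".
print(greedy_assemble(["ACGTACG", "TACGGTA", "GGTACT"]))
```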
Haussler's current research depends not just on the availability of the human genome, but also on all the other genomes that have been sequenced - the different manuscripts written in DNA that have come down to us. Using bioinformatics and even more powerful hardware than that available to Kent back in 2000, it is possible to compare and contrast these genomes, looking for tell-tale signs of common ancestors.
But the result is no mere dry academic exercise: if things go well, the DNA text that will drop out at the end will be nothing less than the genome of one of our ancient forebears. Even if Wired's breathless speculations about recreating live animals from the sequence seem rather wide of the mark - imagine trying to run a computer program recreated in a similar way - the genome on its own will be treasure enough. Certainly not bad work for those scholars who "cough in ink" in the world of open genomics.
Posted by Glyn Moody at 2:58 pm 2 comments
Labels: bioinformatics, california, cough, david haussler, GNU/Linux, human genome project, ink, jim kent, pentiums, prior art, santa cruz, scholars, Wired
The Economist is a strange beast. It has a unique writing style, born of the motto "simplify, then exaggerate"; and it has an unusual editorial structure, whereby senior editors read every word written by those reporting to them - which means the editor reads every word in the magazine (at least, that's the way it used to work). Partly for this reason, nearly all the articles are anonymous: the idea is that they are in some sense a group effort.
One consequence of this anonymity is that I can't actually prove I've written for the title (which I have, although it was a long time ago). But on the basis of a recent showing, I don't think I want to write for it anymore.
The article in question, which is entitled "Open, but not as usual", is about open source, and about some of the other "opens" that are radiating out from it. Superficially, it is well written - as a feature that has had multiple layers of editing should be. But on closer examination, it is full of rather tired criticisms of the open world.
One of these in particular gets my goat:
...open source might already have reached a self-limiting state, says Steven Weber, a political scientist at the University of California at Berkeley, and author of “The Success of Open Source” (Harvard University Press, 2004). “Linux is good at doing what other things already have done, but more cheaply—but can it do anything new? Wikipedia is an assembly of already-known knowledge,” he says.
Well, hardly. After all, the same GNU/Linux can run globe-spanning grids and supercomputers; it can power back-office servers (a market where it bids fair to overtake Microsoft soon); it can run on desktops without a single file being installed on your system; and it is increasingly appearing in embedded devices - MP3 players, mobile phones and so on. No other operating system has ever achieved this portability or scalability. And then there are the more technical aspects: GNU/Linux is simply the most stable, most versatile and most powerful operating system out there. If that isn't innovative, I don't know what is.
But let's leave GNU/Linux aside, and consider what open source has achieved elsewhere. Well, how about the Web for a start, whose protocols and underlying software have been developed in a classic open source fashion? Or what about programs like BIND (which runs the Internet's name system), or Sendmail, the most popular email server software, or maybe Apache, which is used by two-thirds of the Internet's public Web sites?
And then there's MediaWiki, which powers Wikipedia (and a few other wikis): even if Wikipedia were merely "an assembly of already-known knowledge", MediaWiki (built on the open source technologies PHP and MySQL) supports an unprecedentedly large assembly, unmatched by any proprietary system. Enough innovation for you, Mr Weber?
But the saddest thing about this article is not so much these manifest inaccuracies as the reason why they are there. Groklaw's Pamela Jones (PJ) has a typically thorough commentary on the Economist piece. From corresponding with its author, she says "I noticed that he was laboring under some wrong ideas, and looking at the finished article, I notice that he never wavered from his theory, so I don't know why I even bothered to do the interview." In other words, the feature is not just wrong, but wilfully wrong, since others, like PJ, had carefully pointed out the truth. (There's an old saying among journalists that you should never let the facts get in the way of a good story, and it seems that The Economist has decided to adopt this as its latest motto.)
But there is a deeper irony in this sad tale, one carefully picked out by PJ:
There is a shocking lack of accuracy in the media. I'm not at all kidding. Wikipedia has its issues too, I've no doubt. But that is the point. It has no greater issues than mainstream articles, in my experience. And you don't have to write articles like this one either, to try to straighten out the facts. Just go to Wikipedia and input accurate information, with proof of its accuracy.
If you would like to learn about Open Source, here's Wikipedia's article. Read it and then compare it to the Economist article. I think then you'll have to agree that Wikipedia's is far more accurate. And it isn't pushing someone's quirky point of view, held despite overwhelming evidence to the contrary.
If Wikipedia gets something wrong, you can correct it by pointing to the facts; if The Economist gets it wrong - as in the piece under discussion - you are stuck with an article that is, at best, Economistical with the truth.
Posted by Glyn Moody at 8:40 am 6 comments
Labels: bind, california, GNU/Linux, groklaw, mediawiki, pamela jones, sendmail, steven weber, the economist, wikimedia, wikipedia
To the extent possible under law, glyn moody has waived all copyright and related or neighbouring rights to this work. This work is published from: United Kingdom.