
07 September 2010

Open Access Meets Open Archaeology

How could I resist OA meets OA thanks to OA?

The OA Library has been developed by Oxford Archaeology in order to allow us to distribute grey literature client reports and other documents to wider audiences. Oxford Archaeology is committed to a policy of Open Access to archaeological data; this website allows us to disseminate material as widely as possible.

And all towards the greater good of open archaeology...

Follow me @glynmoody on Twitter or identi.ca.

22 July 2010

Openness: Just What the Doctoral Student Ordered

In 2007 the British Library (BL) and the JISC funded The Google Generation Information Behaviour of the Researcher of the Future research (CIBER, 2008), which focused on how researchers of the future, ‘digital natives’ born after 1993, are likely to access and interact with digital resources in five to ten years’ time. The research reported overall that the information literacy of young people has not improved with wider access to technology.

To complement the findings of the Google Generation research, the BL and the JISC commissioned this three‐year research study Researchers of Tomorrow focusing on the information‐seeking and research behaviour of doctoral students born between 1982 and 1994, dubbed ‘Generation Y’.

There's lots of interesting stuff in the first report, but what really caught my attention was the following:

The principles behind open access publishing and self‐archiving speak to the students’ desire for an all‐embracing, seamlessly accessible research information network in which restrictions on access do not constrain them. Similarly, many of the students favour open source technology applications (e.g. Linux, Mozilla) to support the way they want to work and organise their research, and are critical of the lack of technical support to open source applications in their own institutions.

However, as the report emphasises, students remain somewhat confused about what open access really is. This suggests fertile ground for a little more explanation by open access practitioners - the benefits of doing so could be considerable.

It's also rather ironic that one of those behind the report should be the British Library: as I've noted with sadness before, the BL is one of the leading opponents of openness in the academic world, choosing instead to push DRM and patent-encumbered Microsoft technologies for its holdings. It's probably too much to expect it to read the above sections and to understand that it is going in exactly the wrong direction as far as future researchers - its customers - are concerned...

Follow me @glynmoody on Twitter or identi.ca.

20 June 2010

Open Source Scientific Publishing

Since one of the key ideas behind this blog is to explore the application of the open source approach to other fields, I was naturally rather pleased to come across the following:

As a software engineer who works on open source scientific applications and frameworks, when I look at this, I scratch my head and wonder "why don't they just do the equivalent of a code review"? And that's really, where the germ of the idea behind this blog posting started. What if the scientific publishing process were more like an open source project? How would the need for peer-review be balanced with the need to publish? Who should bear the costs? Can a publishing model be created that minimizes bias and allows good ideas to emerge in the face of scientific groupthink?

It's a great question, and the post goes some way to sketching out how that might work in practice. It also dovetails nicely with my earlier post about whether we need traditional peer review anymore. Well worth reading.

Follow me @glynmoody on Twitter or identi.ca.

26 March 2010

The Battle for Scholarly Publishing's Soul

Before Peter Suber became Mr Open Access, he was a philosopher by trade. This is evident in the long, thoughtful essays he writes for the SPARC Open Access Newsletter, which help console us for his absence these days from the world of blogging.

Here's the latest of them, entitled "Open access, markets, and missions". It asks some deep questions about what kind of scholarly publishing we should strive for: market oriented or mission oriented? As he observes:

Profit maximizing limits access to knowledge, by limiting it to paying customers. If anyone thinks this is just a side-effect of today's market incentives, then we can put the situation differently: Profit maximizing doesn't always limit access to knowledge, but is always ready to do so if it pays better. This proposition has a darker corollary: Profit maximizing doesn't always favor untruth, but is always ready to do so if it would pay better. It's hard to find another explanation for the fake journals Elsevier made for Merck and the dishonest lobbying campaigns against OA policies. (Remember "Public access equals government censorship"? "If the other side is on the defensive, it doesn't matter if they can discredit your statements"?)

He concludes:

Instead of hypnotically granting the primacy of markets in all sectors, as if there were no exceptions, we should remember that many organizations compromise profits or relinquish revenues in order to foster their missions, and that we all benefit from their dedication. Which institutions and sectors ought to do so, and how should we protect and support them to pursue their missions? Instead of smothering these questions for offending the religion of markets, we should open them for wider discussion. Should scholarly publishing, with all of its mixed incentives and hard choices, migrate closer to market-oriented end of the spectrum or to the mission-oriented end of the spectrum? For me the answer depends on a prior question. Do we want scholarly publishing to serve a certain function in the community?

Follow me @glynmoody on Twitter or identi.ca.

08 February 2010

Beyond Open Access: Open Publishing

Another splendid piece from Cameron Neylon calling into question the value of traditional peer review:


Whatever value it might have we largely throw away. Few journals make referee’s reports available, virtually none track the changes made in response to referee’s comments enabling a reader to make their own judgement as to whether a paper was improved or made worse. Referees get no public credit for good work, and no public opprobrium for poor or even malicious work. And in most cases a paper rejected from one journal starts completely afresh when submitted to a new journal, the work of the previous referees simply thrown out of the window.

Much of the commentary around the open letter has suggested that the peer review process should be made public. But only for published papers. This goes nowhere near far enough. One of the key points where we lose value is in the transfer from one journal to another. The authors lose out because they’ve lost their priority date (in the worse case giving the malicious referees the chance to get their paper in first). The referees miss out because their work is rendered worthless. Even the journals are losing an opportunity to demonstrate the high standards they apply in terms of quality and rigor – and indeed the high expectations they have of their referees.

What Neylon has exposed here is that scientific publishing - even the kind that wears its open access badge with pride - simply isn't open in any deep way. We need to be able to see the whole process, for the reasons he mentions. Open access isn't enough, not even with open data: we need *open publishing*.
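As a purely hypothetical sketch (every name and field here is invented for illustration, not taken from Neylon's post), the kind of open publishing record he is pointing towards might look something like this: reviews and revisions permanently attached to the paper, so that nothing is thrown away when it moves between journals.

```python
# A toy data model for "peer review as code review": every referee report
# and every revision stays publicly attached to the paper, the priority
# date is fixed at first submission, and referees get public credit.

from dataclasses import dataclass, field

@dataclass
class Review:
    referee: str        # public credit (or opprobrium) for the referee
    verdict: str        # e.g. "accept", "revise", "reject"
    comments: str

@dataclass
class Revision:
    text: str
    responds_to: list[Review] = field(default_factory=list)

@dataclass
class Paper:
    title: str
    submitted: str      # priority date, fixed at first submission
    revisions: list[Revision] = field(default_factory=list)
    reviews: list[Review] = field(default_factory=list)

    def submit_revision(self, text, responding_to=None):
        # each revision records which reviews it answers, so a reader can
        # judge whether the paper was improved or made worse
        self.revisions.append(Revision(text, responding_to or []))

    def add_review(self, review):
        self.reviews.append(review)   # reviews are never discarded

paper = Paper("On Open Publishing", submitted="2010-02-08")
paper.submit_revision("Initial draft")
r = Review("referee-1", "revise", "Clarify the methods section")
paper.add_review(r)
paper.submit_revision("Draft with clarified methods", responding_to=[r])
print(len(paper.reviews), len(paper.revisions))  # → 1 2
```

On this model a transfer to a second journal is just another venue reading the same record, rather than a fresh start that discards the referees' work.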

And yes, that's going to be a huge shift, and painful for many. But if that's the price of producing better scientific papers - and hence better science - surely it's a price worth paying. (Via Nat Torkington.)

Follow me @glynmoody on Twitter or identi.ca.

30 November 2009

Harnessing Openness in Higher Education

Surprisingly, perhaps, education was one of the late-comers to the openness party (couldn't be all those fiercely protective academic egos, could it?) Happily, ground is rapidly being made up in areas like open access, open courseware and open educational resources (OER), with a steady stream of important studies looking at how openness can be applied to make education better.

I've not come across the Committee for Economic Development before, but I like their thinking in this new report "Harnessing Openness to Improve Research, Teaching and Learning in Higher Education". Here's a sample from the summary:

We do not expect OER to simply replace more closed, proprietary educational materials which themselves are increasingly becoming digital. And there are many issues that must be addressed if OER is to live up to its potential. OER has been supply driven, with creators posting whatever interests them regardless of how or even whether it is used; to be successful OER must meet the needs of users. We need to know how OER is actually being used, how effective it is, particularly in comparison with existing materials, and what impact it has on learners. We need to rethink our copyright rules to allow increased non-commercial educational uses of copyrighted materials beyond the traditional classroom in order to facilitate the further development of OER. Just as new approaches to sustainability are being developed to support open-source software and open-access scientific journals, we will need to see if there are ways to sustain the development and distribution of free high-quality, academically rigorous, and pedagogically sound OER that take full advantage of its digital nature.

It also shows a good appreciation of one of the key obstacles to openness in education - and elsewhere:

The intellectual property arguments that have been invoked to oppose public-access mandates for government-funded research and the digitization and partial display of the world’s books suggest to us the need to recalibrate our intellectual property rules for the digital age. Intellectual property rules should serve not only those who first create a work (and subsequent rights holders) but should also recognize the needs of users who often are follow-on creators. When the application of existing intellectual property rules appear to regularly have perverse effects — electronic books having text-to-speech capabilities turned off to the detriment of the visually impaired, or university presses, created to increase the accessibility of scholarly materials, invoking copyright protections to have their material removed from the globally accessible Web — it is time to step back and revisit not only the specific applications of the rules but the rules themselves. Given the complexity of these issues, universities should be forceful proponents for greater openness in legislative debates about IP, and should be educating their faculties about their intellectual property rights.

That's truly remarkable given the background of the Committee for Economic Development that is behind the report:

CED is a Trustee-directed organization. CED's Trustees are chairmen, presidents, and senior executives of major American corporations and university presidents. Trustees alone set CED's research agenda, develop policy recommendations, and speak out for their adoption. Unique among U.S. business organizations, CED offers senior executives a nonpolitical forum for exploring critical long-term issues and making an impact on U.S. policy decisions.

CED is proud of its reputation as a group of business and education leaders committed to improving the growth and productivity of the U.S. economy, a freer global trading system, and greater opportunity for all Americans. CED's Trustees understand that business, government, and individuals are jointly responsible for our mutual security and prosperity.

These are clearly not a bunch of sandal-wearing hippies, but a bunch of hard-headed business people who can see the economic case for more openness in education.

The rest of the report offers useful potted histories of openness in education, and even broadens out to include transparency - an interesting indication of this rising meme. Overall, well worth reading for those interested in this area.

Follow me @glynmoody on Twitter or identi.ca.

26 November 2009

Who Owns Science? The Manchester Manifesto

One of my heroes, Sir John Sulston, has a piece in the Guardian today with the intriguing headline "How science is shackled by intellectual property":

The myth is that IP rights are as important as our rights in castles, cars and corn oil. IP is supposedly intended to encourage inventors and the investment needed to bring their products to the clinic and marketplace. In reality, patents often suppress invention rather than promote it: drugs are "evergreened" when patents are on the verge of running out – companies buy up the patents of potential rivals in order to prevent them being turned into products. Moreover, the prices charged, especially for pharmaceuticals, are often grossly in excess of those required to cover costs and make reasonable profits.

IP rights are beginning to permeate every area of scientific endeavour. Even in universities, science and innovation, which have already been paid for out of the public purse, are privatised and resold to the public via patents acquired by commercial interests. The drive to commercialise science has overtaken not only applied research but also "blue-skies" research, such that even the pure quest for knowledge is subverted by the need for profit.

Great stuff, but this is actually just a teaser for the launch today of something called rather grandly "The Manchester Manifesto" [.pdf], which states the problem as follows:

It is clear that the dominant existing model of innovation, while serving some necessary purposes for the current operation of innovation, also impedes achievement of core scientific goals in a number of ways. In many cases it restricts access to scientific knowledge and products, thereby limiting the public benefits of science; it can restrict the flow of information, thereby inhibiting the progress of science; and it may hinder innovation through the costly and complicated nature of the system. Limited improvements may be achieved through modification of the current IP system, but consideration of alternative models is urgently required.

Unfortunately, after asking the right questions, the answer that the manifesto comes up with is pretty thin gruel:

We call for further research towards achieving more equitable innovation and enabling greater fulfilment of the goals of science as we see them.

Further research?

Modified and alternative models of innovation have the potential to address problems inherent in the current system. An investigation and evaluation of these models is required in order to determine whether they are likely to be more successful in facilitating the goals of science and innovation identified above, and if so how they may be deployed.

Hey, let's not get too radical, eh?

Follow me @glynmoody on Twitter or identi.ca.

24 November 2009

Promoting Open Source Science

Open source science certainly seems to be catching on lately: there have been as many articles on the subject in the last few months as in the previous few years. Here's a good one, an interview with Walter Jessen. This is his definition of what open source science means:

Open Source Science is a collaborative and transparent approach to science. To me, it means four things:

1. Open Source: the use of open and freely accessible software tools for scientific research and collaboration.
2. Open Notebook: transparency in experimental design and data management.
3. Open Data: public accessibility of scientific data, which allows for distribution, reuse and derived works.
4. Open Access: public access to scholarly literature.

It's well worth reading, not least for its useful links to related sites.

Follow me @glynmoody on Twitter or identi.ca.

21 October 2009

Why not Participatory Medicine?

If this participation thing is so great, why don't we apply it to something really important, like medicine? Why not, indeed?

Welcome to JoPM, a New Peer-Reviewed, Open Access Journal

Our mission is to transform the culture of medicine to be more participatory. This special introductory issue is a collection of essays that will serve as the 'launch pad' from which the journal will grow. We invite you to participate as we create a robust journal to empower and connect patients, caregivers, and health professionals.

More specifically:

Because the Journal of Participatory Medicine is a new journal publishing multidisciplinary articles on topics within a new, not yet defined field, we have established draft parameters that define the journal’s range of interest. We anticipate that these parameters will change somewhat as the field develops. In the meantime, the following characterize the field of participatory medicine.

I particularly liked the following section, with its emphasis on openness:

New Knowledge Creation stems from the collaboration of researchers and patients, as individuals and as groups.

1. Health professionals and patients sharing in the discussion of scientific methods, including open discussion about the level of evidence of the research

2. Open, transparent process that demonstrates collaboration and participation in research

3. Patients with significant interest in a topic joining together to create repositories for research, including (but not limited to) registries, tissue banks and genetic databases; demonstrating mutual respect for the contributions of the data owners and health research professionals with the tools to gain insight from those repositories. Interpretation of results and conclusions including involvement of all stakeholders.

Important stuff, worth a read.

Follow me @glynmoody on Twitter or identi.ca.

26 August 2009

Another Reason for Open Access

Yet again, Cameron Neylon is daring to ask the unasked questions that *should* be asked:

Many of us have one or two papers in journals that are essentially inaccessible, local society journals or just journals that were never online, and never widely enough distributed for anyone to find. I have a paper in Complex Systems (volume 17, issue 4 since you ask) that is not indexed in Pubmed, only available in a preprint archive and has effectively no citations. Probably because it isn’t in an index and no-one has ever found it. But it describes a nice piece of work that we went through hell to publish because we hoped someone might find it useful.

Now everyone agreed, and this is what the PLoS ONE submission policy says quite clearly, that such a paper cannot be submitted for publication. This is essentially a restatement of the Ingelfinger Rule. But being the contrary person I am I started wondering why. For a commercial publisher with a subscription business model it is clear that you don’t want to take on content that you can’t exert a copyright over, but for a non-profit with a mission to bring science to a wider audience does this really make sense? If the science is currently inaccessible and is of appropriate quality for a given journal and the authors are willing to pay the costs to bring it to a wider public, why is this not allowed?

Why not, indeed? For as Neylon points out:

If an author feels strongly enough that a paper will get to a wider audience in a new journal, if they feel strongly enough that it will benefit from that journal’s peer review process, and they are prepared to pay a fee for that publication, why should they be prevented from doing so? If that publication does bring that science to a wider audience, is not a public service publisher discharging their mission through that publication?

Which is only possible, of course, in open access journals adopting a funder-pays approach, since traditional publishers need to be able to point to the uniqueness of their content if they are trying to sell it - after all, why would you want to buy it twice? Open access journals have no such imperative: since they are giving the content away, readers have no expectation that it is unique and never seen before.

Follow me @glynmoody on Twitter or identi.ca.

21 August 2009

PLoS Reinvents Publishing and Saves the World

As someone who has been writing about open access for some years, I find myself returning again and again to the Public Library of Science. That's because, not content with pioneering open access, PLoS has time and again re-invented the broader world of scientific publishing. Now, it's done it again:

Today, after several months of work, I’m delighted to announce that PLoS is launching PLoS Currents (Beta) – a new and experimental website for the rapid communication of research results and ideas. In response to the recent worldwide H1N1 influenza outbreak, the first PLoS Currents research theme is influenza.

Note the emphasis on "rapid": this is absolutely crucial, as I've noted before. The current system of publishing papers is simply too slow to deal with pandemics, where speed is of the essence if we're to have a chance of nipping them in the bud. It's good to see PLoS stepping in to help address this major problem.

It's doing it in a very interesting way:

PLoS Currents: Influenza, which we are launching today, is built on three key components: a small expert research community that PLoS is working with to run the website; Google Knol with new features that allow content to be gathered together in collections after being vetted by expert moderators; and a new, independent database at the National Center for Biotechnology Information (NCBI) called Rapid Research Notes, where research targeted for rapid communication, such as the content in PLoS Currents: Influenza will be freely and permanently accessible. To ensure that researchers are properly credited for their work, PLoS Currents content will also be given a unique identifier by the NCBI so that it is citable.

...

The key goal of PLoS Currents is to accelerate scientific discovery by allowing researchers to share their latest findings and ideas immediately with the world’s scientific and medical communities. Google Knol’s features for community interaction, comment and discussion will enable commentary and conversations to develop around these findings. Given that the contributions to PLoS Currents are not peer-reviewed in detail, however, the results and conclusions must be regarded as preliminary. In time, it is therefore likely that PLoS Currents contributors will submit their work for publication in a formal journal, and the PLoS Journals will welcome these submissions.

PLoS Currents: Influenza is an experiment and a prototype for further PLoS Currents sites. It reflects our commitment to using online tools to the fullest extent possible for the open sharing of research results. As with any new project, we will be listening carefully to the reactions within and beyond the scientific and medical communities and welcoming suggestions for improvements.

This is really exciting from many viewpoints. It's pushing the ideas behind open access even further; it's reshaping publishing; and it may even save humanity. (Via James Boyle.)

Follow me @glynmoody on Twitter and identi.ca.

09 August 2009

Open Access Piles on the Pressure

Interesting:

it is not open access per se that is threatening Elsevier (High Energy Physics since long have had almost 100% open access uptake, so critical mass has long been reached in this special field, quite different from other physics areas), but that they are losing the battle for authors, possibly due to their reluctance to support SCOAP3. As I wrote, they have lost between 30% to 50% in submissions from authors during the last 4 years for their HEP journals. With such a massive reduction in size, prices also had to come down. In the new open access scholarly publishing market, journals will compete for authors even more than now. SCOAP3 certainly raised the awareness for both the scientific community's expectation to fully convert these journals to OA and the unsustainable prices that had risen to absurd record prices. It is clear that subscriptions are now under even more pressure because of the global economic crisis that especially hit American libraries very hard.

High energy physics (my old discipline) is certainly in the vanguard, but is probably just the first of many to follow this path. Go, open access.

Follow me @glynmoody on Twitter and identi.ca.

04 August 2009

Level Playing-Fields and Open Access

Yesterday, I wrote elsewhere about open standards, and how they sought to produce a level playing field for all. Similar thoughts have occurred to Stuart Shieber in this post about open access:


In summary, publishers see an unlevel playing field in choosing between the business models for their journals exactly because authors see an unlevel playing field in choosing between journals using the business models.

He has an interesting solution:

To mitigate this problem—to place open-access processing-fee journals on a more equal competitive footing with subscription-fee journals—requires those underwriting the publisher's services for subscription-fee journals to commit to a simple “compact” guaranteeing their willingness to underwrite them for processing-fee journals as well.

He concludes:

If all schools and funders committed to the compact, a publisher could more safely move a journal to an open-access processing-fee business model without fear that authors would desert the journal for pecuniary reasons. Support for the compact would also send a signal to publishers and scholarly societies that research universities and funders appreciate and value their contributions and that universities and funders promoting self-archiving have every intention of continuing to fund publication, albeit within a different model. Publishers willing to take a risk will be met by universities and funding agencies willing to support their bold move.

The new US administration could implement such a system through simple FRPAA-like legislation requiring funding agencies to commit to this open-access compact in a cost-neutral manner. Perhaps reimbursement would be limited to authors at universities and research institutions that themselves commit to a similar compact. As funding agencies and universities take on this commitment, we might transition to an efficient, sustainable journal publishing system in which publishers choose freely among business models on an equal footing, to the benefit of all.

Follow me @glynmoody on Twitter and identi.ca.

29 July 2009

It's Not Open Science if it's Not Open Source

Great to see a scientist come out with this in an interesting post entitled "What, exactly, is Open Science?":


granting access to source code is really equivalent to publishing your methodology when the kind of science you do involves numerical experiments. I’m an extremist on this point, because without access to the source for the programs we use, we rely on faith in the coding abilities of other people to carry out our numerical experiments. In some extreme cases (i.e. when simulation codes or parameter files are proprietary or are hidden by their owners), numerical experimentation isn’t even science. A “secret” experimental design doesn’t give skeptics the ability to repeat (and hopefully verify) your experiment, and the same is true with numerical experiments. Science has to be “verifiable in practice” as well as “verifiable in principle”.
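The post's central claim - that a numerical experiment is only verifiable in practice when its source is open - can be made concrete with a toy example (mine, not the author's). If the code, the parameters and the random seed are all published, any skeptic can repeat the "experiment" and check the result exactly; withhold the code and the design is secret, so repetition is impossible.

```python
# A deliberately simple numerical experiment: a Monte Carlo estimate of pi.
# Publishing the source plus the seed makes the run repeatable bit-for-bit,
# which is exactly what "verifiable in practice" requires.

import random

def run_experiment(n_samples: int, seed: int) -> float:
    """Estimate pi from n_samples random points in the unit square."""
    rng = random.Random(seed)            # seeded: the run is deterministic
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:         # point falls inside the quarter circle
            inside += 1
    return 4.0 * inside / n_samples

# The "published" experimental design: the code above plus these parameters.
params = {"n_samples": 100_000, "seed": 42}

original = run_experiment(**params)
replication = run_experiment(**params)   # a skeptic's independent rerun
assert original == replication           # identical, because nothing is hidden
print(f"estimate of pi: {original:.4f}")
```

With a proprietary simulation code, the same skeptic would be reduced to taking the reported number on faith - which is the author's point.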

The rest is well worth reading too.

(Via @phylogenomics.)

Follow me @glynmoody on Twitter and identi.ca.

22 July 2009

Pat "Nutter" Brown Strikes Again

To change the world, it is not enough to have revolutionary ideas: you must also have the inner force to be able to realise them in the face of near-universal opposition/indifference/derision. Great examples of this include Richard Stallman, who ploughed his lonely GNU furrow for years before anyone took much notice, and Michael Hart, who did the same for Project Gutenberg.

Another of these rare beings with both vision and tenacity is Pat Brown, a personal hero of mine. Not content with inventing one of the most important experimental tools in genomics - DNA microarrays - Brown decided he wanted to do something ambitious: open access publishing. This urge turned into the Public Library of Science (PLoS) - and even that is just the start:


PLoS is just part of a longer range plan. The idea is to completely change the way the whole system works for scientific communication.

At the start, I knew nothing about the scientific publishing business. I just decided this would be a fun and important thing to do. Mike Eisen, who was a post-doc in my lab, and I have been brain-storming a strategic plan, and PLoS was a large part of it. When I started working on this, almost everyone said, “You are completely out of your mind. You are obviously a complete idiot about how publishing works, and besides, this is a dilettante thing that you're doing.” Which I didn't feel at all.

I know I'm serious about it and I know it's doable and I know it's going to be easy. I could see the thermodynamics were in my favor, because the system is not in its lowest energy state. It's going to be much more economically efficient and serve the customers a lot better being open access. You just need a catalyst to GET it there. And part of the strategy to get it over the energy barrier is to apply heat—literally, I piss people off all the time.

In case you hadn't noticed, that little plan "to completely change the way the whole system works for scientific communication" is coming along quite nicely. So, perhaps buoyed up by this, Brown has decided to try something even more challenging:

Brown: ... I'm going to do my sabbatical on this: I am going to devote myself, for a year, to trying to the maximum extent possible to eliminate animal farming on the planet Earth.

Gitschier: [Pause. Sensation of jaw dropping.]

Brown: And you are thinking I'm out of my mind.

Gitschier: [Continued silence.]

Brown: I feel like I can go a long way toward doing it, and I love the project because it is purely strategy. And it involves learning about economics, agriculture, world trade, behavioral psychology, and even an interesting component of it is creative food science.

Animal farming is by far the most environmentally destructive identified practice on the planet. Do you believe that? More greenhouse production than all transportation combined. It is also the major single source of water pollution on the planet. It is incredibly destructive. The major reason reefs are dying off and dead zones exist in the ocean—from nutrient run-off. Overwhelmingly it is the largest driving force of deforestation. And the leading cause of biodiversity loss.

And if you think I'm bullshitting, the Food and Agricultural Organization of the UN, whose job is to promote agricultural development, published a study, not knowing what they were getting into, looking at the environmental impact of animal farming, and it is a beautiful study! And the bottom line is that it is the most destructive and fastest growing environmental problem.

Gitschier: So what is your plan?

Brown: The gist of my strategy is to rigorously calculate the costs of repairing and mitigating all the environmental damage and make the case that if we don't pay as we go for this, we are just dumping this huge burden on our children. Paying these costs will drive up the price of a Big Mac and consumption will go down a lot. The other thing is to come up with yummy, nutritious, affordable mass-marketable alternatives, so that people who are totally addicted to animal foods will find alternatives that are inherently attractive to eat, so much so that McDonald's will market them, too. I want to recruit the world's most creative chefs—here's a REAL creative challenge!

I've talked with a lot of smart people who are very keen on it actually. They say, “You have no chance of success, but I really hope you're successful.” That's just the kind of project I love.

Pat, the world desperately needs nutters like you. Let's just hope that the thermodynamics are in your favour once more.

Follow me @glynmoody on Twitter or identi.ca.

10 July 2009

Do We Need Open Access Journals?

One of the key forerunners of the open access idea was arxiv.org, set up by Paul Ginsparg. Here's what I wrote a few years back about that event:

At the beginning of the 1990s, Ginsparg wanted a quick and dirty solution to the problem of putting high-energy physics preprints (early versions of papers) online. As it turns out, he set up what became the arXiv.org preprint repository on 16 August, 1991 – nine days before Linus made his fateful “I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones” posting. But Ginsparg's links with the free software world go back much further.

Ginsparg was already familiar with the GNU manifesto in 1985, and, through his brother, an MIT undergraduate, even knew of Stallman in the 1970s. Although arXiv.org only switched to GNU/Linux in 1997, it has been using Perl since 1994, and Apache since it came into existence. One of Apache's founders, Rob Hartill, worked for Ginsparg at the Los Alamos National Laboratory, where arXiv.org was first set up (as an FTP/email server at xxx.lanl.org). Other open source programs crucial to arXiv.org include TeX, GhostScript and MySQL.

arxiv.org was and is a huge success, and that paved the way for what became the open access movement. But here's an interesting paper - hosted on arxiv.org:

Contemporary scholarly discourse follows many alternative routes in addition to the three-century old tradition of publication in peer-reviewed journals. The field of High-Energy Physics (HEP) has explored alternative communication strategies for decades, initially via the mass mailing of paper copies of preliminary manuscripts, then via the inception of the first online repositories and digital libraries.

This field is uniquely placed to answer recurrent questions raised by the current trends in scholarly communication: is there an advantage for scientists to make their work available through repositories, often in preliminary form? Is there an advantage to publishing in Open Access journals? Do scientists still read journals or do they use digital repositories?

The analysis of citation data demonstrates that free and immediate online dissemination of preprints creates an immense citation advantage in HEP, whereas publication in Open Access journals presents no discernible advantage. In addition, the analysis of clickstreams in the leading digital library of the field shows that HEP scientists seldom read journals, preferring preprints instead.

Here are the article's conclusions:

Scholarly communication is at a cross road of new technologies and publishing models. The analysis of almost two decades of use of preprints and repositories in the HEP community provides unique evidence to inform the Open Access debate, through four main findings:

1. Submission of articles to an Open Access subject repository, arXiv, yields a citation advantage of a factor of five.

2. The citation advantage of articles appearing in a repository is connected to their dissemination prior to publication: 20% of citations of HEP articles over a two-year period occur before publication.

3. There is no discernible citation advantage added by publishing articles in “gold” Open Access journals.

4. HEP scientists are between four and eight times more likely to download an article in its preprint form from arXiv than in its final published version on a journal web site.
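As a rough illustration of what a "citation advantage of a factor of five" (finding 1) means in practice, here is a minimal sketch: the ratio of mean citation counts between repository-deposited and journal-only articles. All the numbers and names below are invented for the example; the actual study's methodology is more sophisticated.

```python
# Hypothetical sketch of a citation-advantage ratio: mean citations of
# articles deposited on arXiv divided by mean citations of journal-only
# articles. The citation counts here are made up for illustration.

def citation_advantage(with_preprint, without_preprint):
    """Ratio of average citations between two groups of articles."""
    mean_with = sum(with_preprint) / len(with_preprint)
    mean_without = sum(without_preprint) / len(without_preprint)
    return mean_with / mean_without

# Invented per-article citation counts:
arxiv_deposited = [25, 40, 10, 35, 15]   # articles also available as preprints
journal_only = [5, 8, 2, 6, 4]           # articles behind a paywall only

print(round(citation_advantage(arxiv_deposited, journal_only), 1))  # → 5.0
```

A real analysis would of course control for field, article age and self-citation, but the headline figure is just this kind of ratio.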

On the one hand, it would be ironic if the very field that acted as a midwife to open access journals should also be the one that begins to undermine it through a move to repository-based open publishing of preprints. On the other, it doesn't really matter; what's important is open access to the papers. Whether these are in preprint form or appear as fully-fledged articles in peer-reviewed open access journals is a detail, for the users at least; it's more of a challenge for publishers, of course... (Via @JuliuzBeezer.)

Follow me @glynmoody on Twitter or identi.ca.

30 June 2009

Why Scientific Publishing Will Never be the Same

For those of us tracking open access and its wider import, it's pretty clear that scientific publishing has changed for ever. But for some within the industry, there remains the desperate hope that all this new-fangled open, collaborative stuff will just blow over.

Forget about it: as this magisterial analysis shows, not only is there no way for traditional publishing to get from there to here - it's a terrifying leap across a minimum to the next maximum - but the most exciting stuff is *already* happening in other forms of publishing:

What’s new today is the flourishing of an ecosystem of startups that are experimenting with new ways of communicating research, some radically different to conventional journals. Consider Chemspider, the excellent online database of more than 20 million molecules, recently acquired by the Royal Society of Chemistry. Consider Mendeley, a platform for managing, filtering and searching scientific papers, with backing from some of the people involved in Last.fm and Skype. Or consider startups like SciVee (YouTube for scientists), the Public Library of Science, the Journal of Visualized Experiments, vibrant community sites like OpenWetWare and the Alzheimer Research Forum, and dozens more. And then there are companies like Wordpress, Friendfeed, and Wikimedia, that weren’t started with science in mind, but which are increasingly helping scientists communicate their research. This flourishing ecosystem is not too dissimilar from the sudden flourishing of online news services we saw over the period 2000 to 2005.

It’s easy to miss the impact of blogs on research, because most science blogs focus on outreach. But more and more blogs contain high quality research content. Look at Terry Tao’s wonderful series of posts explaining one of the biggest breakthroughs in recent mathematical history, the proof of the Poincaré conjecture. Or Tim Gowers’ recent experiment in “massively collaborative mathematics”, using open source principles to successfully attack a significant mathematical problem. Or Richard Lipton’s excellent series of posts exploring his ideas for solving a major problem in computer science, namely, finding a fast algorithm for factoring large numbers. Scientific publishers should be terrified that some of the world’s best scientists, people at or near their research peak, people whose time is at a premium, are spending hundreds of hours each year creating original research content for their blogs, content that in many cases would be difficult or impossible to publish in a conventional journal. What we’re seeing here is a spectacular expansion in the range of the blog medium. By comparison, the journals are standing still.

What's even better about this piece is that it's not content to point out why traditional publishing has big problems: it also offers some practical suggestions of what people *should* be looking at:

These opportunities can still be grasped by scientific publishers who are willing to let go and become technology-driven, even when that threatens to extinguish their old way of doing things. And, as we’ve seen, these opportunities are and will be grasped by bold entrepreneurs. Here’s a list of services I expect to see developed over the next few years. A few of these ideas are already under development, mostly by startups, but have yet to reach the quality level needed to become ubiquitous. The list could easily be continued ad nauseam - these are just a few of the more obvious things to do.

Fantastic stuff - do read it all if you have time.

Follow me @glynmoody on Twitter or identi.ca.

19 June 2009

Elsevier Does a Microsoft with Open Access

Nice one, Elsevier:

A multinational journal giant is understood to be courting vice-chancellors in an effort to win their support for an alternative to open-access institutional research repositories.

Elsevier is thought to be mooting a new idea that could undermine universities' own open-access repositories. It would see Elsevier take over the job of archiving papers and making them available more widely as PDF files.

If successful, it would represent a new tactic by publishers in their battle to secure their future against the threat posed by the open-access publishing movement.

Most UK universities operate open-access repositories, where scholars can voluntarily deposit final drafts of their pay-to-access journal publications online. Small but growing numbers are also making such depositions mandatory.

I've seen these kinds of stories so many times in the world of open source, with Microsoft as the main protagonist, that it warms the cockles of my heart to see them popping up in other areas like open access. Why? Because if a multi-billion pound company like Elsevier is starting to stoop to this kind of tactic, it demonstrates just how profoundly worried it is - and how close open access is to widespread acceptance.

Follow me @glynmoody on Twitter or identi.ca.

03 June 2009

Big Open Access Win in UK

Great news:

University College London is set to become the first of the top tier of elite European universities to make all its research available for free at the click of a mouse, in a model it hopes will spread across the academic world.

UCL’s move to “open access” for all research, subject to copyright law, could boost the opportunities for rapid intellectual breakthroughs if taken up by other universities, thereby increasing economic growth.

Paul Ayris, head of the UCL library and an architect of the plan to put all its research on a freely accessible UCL website, said he had backed open access because the existing system of having to visit a library or pay a subscription fee to see research in journals erected “barriers” to the use of research. “This is not good for society if you’re looking for a cure for cancer,” he said.

What's pathetic is that some people are *still* spreading the FUD:

Martin Weale, director of the National Institute of Economic and Social Research, said: “If you read something in the American Economic Review, there’s a presumption that its quality has been examined with great care, and the article isn’t rubbish. But if you have open access, people who are looking for things ... will find it very difficult to sort out the wheat from the chaff.”

Hey, Martin, as you should know, open access and peer review are completely different things. The open access material at UCL can still be published in peer reviewed journals - including those that are also open access - in order "to sort the wheat from the chaff". The point is that *anyone* can access all the materials at any time - not just when publishers allow it upon payment of exorbitant fees.

Moreover, I seem to recall that there's this cute little company called Google that's pretty good at pointing people to content on the Web. And that's partly the point: once stuff is open access, all sorts of clever ways of finding it and using it are possible - and that's rarely true for traditional scientific publishing. (Via Mike Simons.)

Follow me @glynmoody on Twitter or identi.ca.

31 May 2009

Open Government: the Latest Member of the Open Family

One of the most exciting developments in the last few years has been the application of some of the core ideas of free software and open source to completely different domains. Examples include open content, open access, open data and open science. More recently, those principles are starting to appear in a rather surprising field: that of government, as various transparency initiatives around the world start to gain traction....

On Linux Journal.