
11 March 2006

Open University Meets Open Courseware

Great news (via Open Access News and the Guardian): the Open University is turning a selection of its learning materials into open courseware. To appreciate the importance of this announcement, a little background may be in order.

As its fascinating history shows, the Open University was born out of Britain's optimistic "swinging London" culture of the late 1960s. The idea was to create a university open to all - one on a totally new scale of hundreds of thousands of students (currently there are 210,000 enrolled). It was evident quite early on that this meant using technology as much as possible (indeed, as the history explains, many of the ideas behind the Open University grew out of an earlier "University of the Air" idea, based around radio transmissions).

One example of this is a close working relationship with the BBC, which broadcasts hundreds of Open University programmes each week. Naturally, these are open to all, and designed to be recorded for later use - an early kind of multimedia open access. The rise of the Web as a mass medium offered further opportunities to make materials available. By contrast, the holdings of the Open University Library require a username and password (although there are some useful resources available to all if you are prepared to dig around).

Against this background of a slight ambivalence to open access, the announcement that the Open University is embracing open content for at least some of its courseware is an extremely important move, especially in terms of setting a precedent within the UK.

In the US, there is already the trail-blazing MIT OpenCourseWare project. Currently, there are materials from around 1250 MIT courses, expected to rise to 1800 by 2007. Another well-known example of open courseware is the Connexions project, which has some 2900 modules. This was instituted by Rice University, but now seems to be spreading ever wider. In this it is helped by an extremely liberal Creative Commons licence that allows anyone to use Connexions material to create new courseware. MIT uses a similar Creative Commons licence, except that it forbids commercial use.

At the moment, there's not much to see at the Open University's Open Content Initiative site. There is an interesting link to information from the project's main sponsor, the William and Flora Hewlett Foundation, about its pioneering support for open content. This has some useful links at the foot of the page to related projects and resources.

One thing the Open University announcement shows is that open courseware is starting to pick up steam - maybe a little behind the related area of open access, but coming through fast. As with all open endeavours, the more there are, the more evident the advantages of making materials freely available become, and the more others follow suit. This virtuous circle of openness begetting openness is perhaps one of the biggest advantages that it has over the closed, proprietary alternatives, which by their very nature take an adversarial rather than co-operative approach to those sharing their philosophy.

07 October 2008

"IBM" Buys "Red Hat", Sort Of....

Well, that gives an idea of the importance of this move for the world of open access:

Open access pioneer BioMed Central has been acquired by Springer, ScientificAmerican.com has learned.

....

Those in the open access movement had watched BioMed Central with keen interest. Founded in 2000, it was the first for-profit open access publisher and advocates feared that when the company was sold, its approach might change. But Cockerill assured editors that a BMC board of trustees "will continue to safeguard BioMed Central's open access policy in the future." Springer "has been notable...for its willingness to experiment with open access publishing," Cockerill said in a release circulated with the email to editors.

11 May 2006

The Digital Sum of Human Knowledge

Most of us think of open access as a great way of reading the latest research online, so there is an implicit assumption that open access is only about the cutting edge. This also flows from the fact that most open access journals are recent launches, and those that aren't usually only provide content for volumes released after a certain (recent) date, for practical reasons of digital file availability, if nothing else.

This makes the joint Wellcome Trust and National Library of Medicine project to place 200 years of biomedical journals online by scanning them a major expansion not just of the open access programme, but of the whole concept of open access.

It also hints at what the end-goal of open access must be: the online availability of every journal, magazine, newspaper, pamphlet, book, manuscript, tablet, inscription, statue, seal and ostracon that has survived the ravages of history - the digital sum of all written human knowledge.

05 April 2009

Who Can Put the "Open" in Open Science?

One of the great pleasures of blogging is that your mediocre post tossed off in a couple of minutes can provoke a rather fine one that obviously took some time to craft. Here's a case in point.

The other day I wrote "Open Science Requires Open Source". This drew an interesting comment from Stevan Harnad, pretty much the Richard Stallman of open access, as well as some tweets from Cameron Neylon, one of the leading thinkers on and practitioners of open science. He also wrote a long and thoughtful reply to my post (including links to all our tweets, rigorous chap that he is). Most of it was devoted to pondering the extent to which scientists should be using open source:

It is easy to lose sight of the fact that for most researchers software is a means to an end. For the Open Researcher what is important is the ability to reproduce results, to criticize and to examine. Ideally this would include every step of the process, including the software. But for most issues you don’t need, or even want, to be replicating the work right down to the metal. You wouldn’t after all expect a researcher to be forced to run their software on an open source computer, with an open source chipset. You aren’t necessarily worried what operating system they are running. What you are worried about is whether it is possible read their data files and reproduce their analysis. If I take this just one step further, it doesn’t matter if the analysis is done in MatLab or Excel, as long as the files are readable in Open Office and the analysis is described in sufficient detail that it can be reproduced or re-implemented.

...

Open Data is crucial to Open Research. If we don’t have the data we have nothing to discuss. Open Process is crucial to Open Research. If we don’t understand how something has been produced, or we can’t reproduce it, then it is worthless. Open Source is not necessary, but, if it is done properly, it can come close to being sufficient to satisfy the other two requirements. However it can’t do that without Open Standards supporting it for documenting both file types and the software that uses them.

The point that came out of the conversation with Glyn Moody for me was that it may be more productive to focus on our ability to re-implement rather than to simply replicate. Re-implementability, while an awful word, is closer to what we mean by replication in the experimental world anyway. Open Source is probably the best way to do this in the long term, and in a perfect world the software and support would be there to make this possible, but until we get there, for many researchers, it is a better use of their time, and the taxpayer’s money that pays for that time, to do that line fitting in Excel. And the damage is minimal as long as source data and parameters for the fit are made public. If we push forward on all three fronts, Open Data, Open Process, and Open Source then I think we will get there eventually because it is a more effective way of doing research, but in the meantime, sometimes, in the bigger picture, I think a shortcut should be acceptable.

I think these are fair points. Science needs reproducibility in terms of the results, but that doesn't imply that the protocols must be copied exactly. As Neylon says, the key is "re-implementability" - the fact that you *can* reproduce the results with the given information. Using Excel instead of OpenOffice.org Calc is not a big problem provided enough detail is given.

However, it's easy to think of circumstances where *new* code is being written to run on proprietary engines where it is simply not possible to check the logic hidden in the black boxes. In these circumstances, it is critical that open source be used at all levels so that others can see what was done and how.

But another interesting point emerged from this anecdote from the same post:

Sometimes the problems are imposed from outside. I spent a good part of yesterday battling with an appalling, password protected, macroed-to-the-eyeballs Excel document that was the required format for me to fill in a form for an application. The file crashed Open Office and only barely functioned in Mac Excel at all. Yet it was required, in that format, before I could complete the application.

Now, this is a social issue: the fact that scientists are being forced by institutions to use proprietary software in order to apply for grants or whatever. Again, it might be unreasonable to expect young scientists to sacrifice their careers for the sake of principle (although Richard Stallman would disagree). But this is not a new situation. It's exactly the problem that open access faced in the early days, when scientists just starting out in their career were understandably reluctant to jeopardise it by publishing in new, untested journals with low impact factors.

The solution in that case was for established scientists to take the lead by moving their work across to open access journals, allowing the latter to gain in prestige until they reached the point where younger colleagues could take the plunge too.

So, I'd like to suggest something similar for the use of open source in science. When established scientists with some clout come across unreasonable requirements - like the need to use Excel - they should refuse. If enough of them put their foot down, the organisations that lazily adopt these practices will be forced to change. It might require a certain courage to begin with, but so did open access; and look where *that* is now...

Follow me on Twitter @glynmoody

04 December 2006

Open Science or Free Science?

The open science meme is rather in vogue at the moment. But Bill Hooker raises an interesting point (in a post that kindly links to a couple of items on this blog):

should we be calling the campaign to free up scientific information (text, data and software) "Free Science", for the same reasons Stallman insists on "Free Software"?

Interestingly, there is another parallel here:

Just as free software gained the alternative name "open source" at the Freeware Summit in 1998, so free online scholarship (FOS), as it was called until then by the main newsletter that covered it - written by Peter Suber, professor of philosophy at Earlham College - was renamed "open access" as part of the Budapest Open Access Initiative in December 2001. Suber's newsletter turned into Open Access News and became one of the earliest blogs; it remains the definitive record of the open access movement, and Suber has become its semi-official chronicler (the Eric Raymond of open access - without the guns).

05 September 2007

Microsoft Loves Openness

Well, some openness:

BioMed Central, the world’s largest publisher of peer-reviewed, open access research journals, is pleased to announce that Microsoft Research has agreed to be the premium sponsor of the BioMed Central Research Awards for 2007. The BioMed Central Research Awards, which began accepting nominations in late July, recognize excellence in research that has been made universally accessible by open access publication in one of the publisher’s 180 journals.

"Microsoft’s External Research group is proud to be a sponsor of the BioMed Central Research Awards and feel it is important to recognize excellence in research," said Lee Dirks, director, scholarly communications, Microsoft Research. "We are very supportive of the open science movement and recognize that open access publication is an important component of overall scholarly communications."

It may only be promoting open science and open access at the moment, but I predict Microsoft will one day love open source just as much. (Via Open Access News.)

03 August 2007

Parallel Universes?

Now, where have I heard this before?

Free and open source software (FOSS) has roots in the ideals of academic freedom and the unimpeded exchange of information. In the last five years, the concepts have come full circle, with FOSS serving as a model for Open Access (OA), a movement within academia to promote unrestricted access to scholarly material for both researchers and the general public.

"The philosophy is so similar that when we saw the success that open source was having, it served as a guiding light to us," says Melissa Hagemann, program manager for Open Access initiatives at the Open Society Institute, a private foundation for promoting democratic and accessible reform at all levels of society. Not only the philosophy, but also the history, the need to generate new business models, the potential empowerment of users, the impact on developing nations, and resistance to the movement make OA a near twin of FOSS.

Oh, I remember:

The parallels between this movement - what has come to be known as “open access” – and open source are striking. For both, the ultimate wellspring is the Internet, and the new economics of sharing that it enabled. Just as the early code for the Internet was a kind of proto-open source, so the early documentation – the RFCs – offered an example of proto-open access. And for both their practitioners, it is recognition – not recompense – that drives them to participate.

Great minds obviously think alike - and Bruce does have some nice new quotations. Read both; contrast and compare.

18 November 2009

Free Culture Forum: Getting it Together

As regular readers will know, I write a lot about the related areas of openness, freedom, transparency and the commons, but it's rare to find them literally coming together like this, in the Free Culture Forum:

Across the planet, people are recognizing the need for an international space to build and coordinate a common agenda for issues surrounding free culture and access to knowledge. The Free Culture Forum of Barcelona created one such space.

Bringing together key organizations and active voices in the free culture and knowledge space under a single roof, the Forum was a meeting point to sit and find answers to the pressing questions behind the present paradigm shift.

The Forum was an open space for drawing up proposals to present the position of civil society on the privatization of culture and access to knowledge. Participants debated the role of government in access to knowledge, on the creation and distribution of art and culture, and other areas.

The list of participants is impressive, and includes well-known names like the EFF, the P2P Foundation, Knowledge Ecology International, La Quadrature du Net, and many others. Even better is the extremely thorough charter; here is its opening section:

We are in the midst of a revolution in the way that knowledge and culture are created, accessed and transformed. Citizens, artists and consumers are no longer powerless and isolated in the face of the content-providing industries: now individuals across many different spheres collaborate, participate and decide. Digital technology has bridged the gap, allowing ideas and knowledge to flow. It has done away with many of the geographic and technological barriers to sharing. It has provided new educational tools and stimulated new possibilities for forms of social, economic and political organisation. This revolution is comparable to the far reaching changes brought about as a result of the printing press.

In spite of these transformations, the entertainment industry, most communications service providers, governments and international bodies still base the sources of their advantages and profits on control of content and tools and on managing scarcity. This leads to restrictions on citizens’ rights to education, access to information, culture, science and technology; freedom of expression; inviolability of communications and privacy. They put the protection of private interests above the public interest, holding back the development of society in general.

Today’s institutions, industries, structures or conventions will not survive into the future unless they adapt to these changes. Some, however, will alter and refine their methods in response to the new realities. And we need to take account of this.

That will all be pretty familiar to readers of this blog. There then follows an amazingly complete list of Things That We Need - which will also ring a few bells. Here are the areas covered:

Reverse Three-Step Test
Knowledge Commons and Public Domain
Defending access to Technological Infrastructures and Net Neutrality
Rights in digital context
Stimulating Creativity and Innovation
Access to works for persons with reading disabilities
Transparency

There's also an important section headed "Guidelines for Education and Access to Knowledge", which naturally considers open educational resources, and has this to say on free software, open standards and open formats:

Free/libre and Open Source Software allows people to study and learn concepts instead of black boxes, enables transparency of information processing, assures competition and innovation, provides independence from corporate interests and increases the autonomy of citizens.

The use of open standards and open formats is essential to ensure technical interoperability, provide a level playing field for competing vendors, enable seamless access to digital information and the availability of knowledge and social memory now and in the future. Thus we assert that:

* Educational entities should use Free/libre and Open Source Software as a learning tool, as a subject in itself and as the base for their IT infrastructure.
* All software developed in an educational environment and publicly funded must be released under a free license.
* Promote the use of Free/libre and Open Source Software in textbooks as an alternative to proprietary software to perform learning-related tasks such as numerical calculus, image editing, document composition, etc. where applicable.
* Develop, provide and promote free editing tools to elaborate and improve didactic materials.
* Technologies like Digital Rights Management must be refused to assure the permanent access to educational resources and enable lifelong learning.


All in all, this is an extraordinary document with which I find myself in pretty much total agreement. It's a great achievement, and will be a real reference point for everyone working in the fields of digital freedom, openness and transparency for years to come.

Follow me @glynmoody on Twitter or identi.ca.

21 March 2007

Learning about Open Educational Resources

Major European studies on open source are two a penny these days (and that's good), but some of the other opens have yet to achieve this level of recognition. So the appearance of a major EU report on Open Educational Resources from the Open e-Learning Content Observatory Services (OLCOS) project is particularly welcome.

At present a world-wide movement is developing which promotes unencumbered open access to digital resources such as content and software-based tools to be used as a means of promoting education and lifelong learning. This movement forms part of a broader wave of initiatives that actively promote the “Commons” such as natural resources, public spaces, cultural heritage and access to knowledge that are understood to be part of, and to be preserved for, the common good of society. (cf. Tomales Bay Institute, 2006)

With reference to the Open Educational Resources (OER) movement, the William and Flora Hewlett Foundation justifies their investment in OER as follows: “At the heart of the movement toward Open Educational Resources is the simple and powerful idea that the world’s knowledge is a public good and that technology in general and the Worldwide Web in particular provide an extraordinary opportunity for everyone to share, use, and re-use knowledge. OER are the parts of that knowledge that comprise the fundamental components of education – content and tools for teaching, learning and research.”

Since the beginning of 2006, the Open e-Learning Content Observatory Services (OLCOS) project has explored how Open Educational Resources (OER) can make a difference in teaching and learning. Our initial findings show that OER do play an important role in teaching and learning, but that it is crucial to also promote innovation and change in educational practices. The resources we are talking about are seen only as a means to an end, and are utilised to help people acquire the competences, knowledge and skills needed to participate successfully within the political, economic, social and cultural realms of society.

Despite its title, it covers a very wide area, including open courseware, open access and even open source. It's probably the best single introduction to open educational resources around today - and it's free, as it should be. (Via Open Access News.)

25 July 2014

Open Access: Looking Back, Looking Forwards

A couple of weeks ago, I spoke at a conference celebrating the tenth anniversary of the Berlin declaration on open access. More formally, the "Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities" is one of three seminal formulations of the open access idea: the other two are the Bethesda Statement (2003) and the original Budapest Open Access Initiative (2002).

On Open Enterprise blog.

25 January 2007

The Coming Victory of Open Access

In this blog, I've emphasised the parallels between open source and open access. We know that as Microsoft has become more and more threatened by the former, it has resorted to more and more desperate attempts to sow FUD. Now comes this tremendous story from Nature that the traditional scientific publishing houses are contemplating doing the same to attack open access:

Nature has learned, a group of big scientific publishers has hired the pit bull to take on the free-information movement, which campaigns for scientific results to be made freely available. Some traditional journals, which depend on subscription charges, say that open-access journals and public databases of scientific papers such as the National Institutes of Health's (NIH's) PubMed Central, threaten their livelihoods.

The "pit bull" is Eric Dezenhall:

his firm, Dezenhall Resources, was also reported by Business Week to have used money from oil giant ExxonMobil to criticize the environmental group Greenpeace.

These are some of the tactics being considered:

Dezenhall also recommended joining forces with groups that may be ideologically opposed to government-mandated projects such as PubMed Central, including organizations that have angered scientists. One suggestion was the Competitive Enterprise Institute, a conservative think-tank based in Washington DC, which has used oil-industry money to promote sceptical views on climate change. Dezenhall estimated his fee for the campaign at $300,000–500,000.

The Competitive Enterprise Institute, you may recall, are the people behind the risible "Carbon dioxide: they call it pollution, we call it life" campaign of misinformation about global warming.

This is a clear sign that we're in the end-game for open access's victory.

04 August 2009

Level Playing-Fields and Open Access

Yesterday, I wrote elsewhere about open standards, and how they sought to produce a level playing field for all. Similar thoughts have occurred to Stuart Shieber in this post about open access:


In summary, publishers see an unlevel playing field in choosing between the business models for their journals exactly because authors see an unlevel playing field in choosing between journals using the business models.

He has an interesting solution:

To mitigate this problem—to place open-access processing-fee journals on a more equal competitive footing with subscription-fee journals—requires those underwriting the publisher's services for subscription-fee journals to commit to a simple “compact” guaranteeing their willingness to underwrite them for processing-fee journals as well.

He concludes:

If all schools and funders committed to the compact, a publisher could more safely move a journal to an open-access processing-fee business model without fear that authors would desert the journal for pecuniary reasons. Support for the compact would also send a signal to publishers and scholarly societies that research universities and funders appreciate and value their contributions and that universities and funders promoting self-archiving have every intention of continuing to fund publication, albeit within a different model. Publishers willing to take a risk will be met by universities and funding agencies willing to support their bold move.

The new US administration could implement such a system through simple FRPAA-like legislation requiring funding agencies to commit to this open-access compact in a cost-neutral manner. Perhaps reimbursement would be limited to authors at universities and research institutions that themselves commit to a similar compact. As funding agencies and universities take on this commitment, we might transition to an efficient, sustainable journal publishing system in which publishers choose freely among business models on an equal footing, to the benefit of all.

Follow me @glynmoody on Twitter and identi.ca.

09 August 2009

Open Access Piles on the Pressure

Interesting:

it is not open access per se that is threatening Elsevier (High Energy Physics since long have had almost 100% open access uptake, so critical mass has long been reached in this special field, quite different from other physics areas), but that they are loosing the battle for authors, possibly due to their reluctance to support SCOAP3. As I wrote, they have lost between 30% to 50% in submissions from authors during the last 4 years for their HEP journals. With such a massive reduction in size, prices also had to come down. In the new open access scholarly publishing market, journals will compete for authors even more than now. SCOAP3 certainly raised the awareness for both the scientific community's expectation to fully convert these journals to OA and the unsustainable prices that had risen to absurd record prices. It is clear that subscriptions are now under even more pressure because of the global economic crisis that especially hit american libraries very hard.

High energy physics (my old discipline) is certainly in the vanguard, but is probably just the first of many to follow this path. Go, open access.

Follow me @glynmoody on Twitter and identi.ca.

18 June 2008

Open Access Increases Its Impact

Unless you're an academic, you probably don't care about "impact factors", but for the world of academic journals - and the people who publish there - they are a matter of life and death (sadly). Think of them as a kind of Google PageRank for publishing.
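
For anyone unfamiliar with the mechanics, the standard two-year impact factor is just a ratio. Here is a minimal sketch in Python, with purely illustrative numbers rather than any journal's real figures:

def impact_factor(citations_this_year, citable_items):
    # Standard two-year impact factor: citations received this year to the
    # journal's articles from the two preceding years, divided by the number
    # of citable items the journal published in those two years.
    return citations_this_year / citable_items

# Hypothetical journal: 2,700 citations in 2007 to 200 items from 2005-2006.
print(round(impact_factor(2700, 200), 1))  # prints 13.5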

Anyway, the news that the trail-blazing Public Library of Science titles have increased their impact factors is important:

The latest impact factors (for 2007) have just been released from Thomson Reuters. They are as follows:
PLoS Biology - 13.5
PLoS Medicine - 12.6
PLoS Computational Biology - 6.2
PLoS Genetics - 8.7
PLoS Pathogens - 9.3

As we and others have frequently pointed out, impact factors should be interpreted with caution and only as one of a number of measures which provide insight into a journal’s, or rather its articles’, impact. Nevertheless, the 2007 figures for PLoS Biology and PLoS Medicine are consistent with the many other indicators (e.g. submission volume, web statistics, reader and community feedback) that these journals are firmly established as top-flight open-access general interest journals in the life and health sciences respectively.

The increases in the impact factors for the discipline-based, community-run PLoS journals also tally with indicators that these journals are going from strength to strength. For example, submissions to PLoS Computational Biology, PLoS Genetics and PLoS Pathogens have almost doubled over the past year - each journal now routinely receives 80-120 submissions per month of which around 20-25 are published. The hard work and commitment of the Editors-in-Chief and the Editorial Boards (here, here and here) are setting the highest possible standards for community-run open-access journals.

This matters because many sceptics of open access would love PLoS to fail - either financially, in terms of academic influence or, ideally, both - and its continuing ascendancy in terms of impact factors is essentially a validation of the whole open access idea. And that has to be good for everyone, whether they care about academic PageRanks or not.

29 January 2009

Open Access Astro-Observatory Runs GNU/Linux

Montegancedo Observatory is the first free open access astronomical observatory in the world. It is located in Building 6 of the School of Computing. The dome is equipped with a computer-automated, robotized 10” telescope, and several computers operating as a web applications server. The observatory also links and broadcasts images and videos captured by the webcams arranged around the dome... All servers run on GNU/Linux systems.

Providing open access to facilities, mediated by the Internet; providing (presumably) open access to the results in real-time; using free software to run the whole thing: this is the future of science. (Via Open Access News.)

31 January 2007

Conservation Commons

A little while back I was urging you to sign a petition calling for open access in the European Union (you did sign, didn't you?). Now here's another worthy cause, asking for open access to environmental information - the ultimate, double commons:

Principles of the Conservation Commons

Open Access: The Conservation Commons promotes free and open access to data, information and knowledge for all conservation purposes.

Mutual Benefit: The Conservation Commons welcomes and encourages participants to both use these resources and to contribute data, information and knowledge.

Rights and Responsibilities: Contributors to the Conservation Commons have full right to attribution for any uses of their data, information, or knowledge, and the right to ensure that the original integrity of their contribution to the Commons is preserved. Users of the Conservation Commons are expected to comply, in good faith, with terms of uses specified by contributors.

You can sign up online. See you there. (Via Open Access News.)

04 May 2009

Another Reason We Need Open Access

One of the more laughable reasons that traditional science publishers cite in their attempts to rubbish open access is that it's somehow not so rigorous as "their" kind of publishing. There's usually a hint that standards might be dropped, and that open access journals aren't, well, you know, quite proper.

And then this comes along:

The Scientist has reported that, yes, it's true, Merck cooked up a phony, but real sounding, peer reviewed journal and published favorably looking data for its products in them. Merck paid Elsevier to publish such a tome, which neither appears in MEDLINE or has a website, according to The Scientist.

Now, open access in itself isn't going to stop this kind of thing, but it seems highly unlikely that anyone would try it, given that the results would be freely available for any Thomas, Richard or Harold to peruse.

One reason why Elsevier probably thought they could pull it off was that they knew few people would look at this stuff - which is why it's not in Medline, and why it doesn't have a website. Given enough eyes, all bugs are shallow and all that.

So, next time high-falutin' publishers look down on open access journals - especially if it's Elsevier - just remind them about the Australasian Journal of Bone and Joint Medicine episode....

19 July 2006

Open Access to Open Access, the Book

An important new collection of essays on open access has been published. It's called Open Access: Key Strategic, Technical and Economic Aspects. Hearteningly, most of the chapters have been self-archived by the authors: kudos to them for doing so, and to Chandos Publishing for being enlightened enough to allow it. (via Open Access News.)

08 August 2007

On the Necessity of Open Access and Open Data

One of the great things about open source is its transparency: you can't easily hide viruses or trojans, nor can you simply filch code from other people, as you can with closed source. Indeed, the accusations made from time to time that open source contains "stolen" code from other programs are deeply ironic, since it's almost certainly proprietary, closed software that has bits of thievery hidden deep within its digital bowels.

The same is true of open access and open data: when everything is out in the open, it is much easier to detect plagiarism or outright fraud. Equally, making it hard for people to access online, searchable text or the underlying data - by placing restrictions on their distribution - reduces the number of people checking them, and hence the likelihood that anyone will notice if something is amiss.

A nicely-researched piece on Ars Technica provides a clear demonstration of this:

Despite the danger represented by research fraud, instances of manufactured data and other unethical behavior have produced a steady stream of scandal and retractions within the scientific community. This point has been driven home by the recent retraction of a paper published in the journal Science and the recognition of a few individuals engaged in dozens of acts of plagiarism in physics journals.

By contrast, in the case of arXiv's preprint holdings, catching this stuff is relatively easy thanks to its open, online nature:

Computer algorithms to detect duplications of text have already proven successful at detecting plagiarism in papers in the physical sciences. The arXiv now uses similar software to scan all submissions for signs of plagiarized text. As this report was being prepared, the publishing service Crossref announced that it would begin a pilot program to index the contents of the journals produced by a number of academic publishers in order to expose them for the verification of originality. Thus, catching plagiarism early should be getting increasingly easy for the academic world.

Note, though, that open access allows *anyone* to check for plagiarism, not just the "authorised" keepers of the copyrighted academic flame.
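
As a concrete illustration of the kind of check that openness makes possible - this is only a sketch of the general shingling technique, not arXiv's actual software - comparing two openly available texts for duplicated passages takes just a few lines:

def shingles(text, n=5):
    # Break a text into overlapping word n-grams ("shingles").
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    # Fraction of shared shingles; values close to 1 suggest heavy duplication.
    return len(a & b) / len(a | b) if (a or b) else 0.0

# Hypothetical file names - any pair of openly accessible preprints would do.
paper_a = open("preprint_a.txt", encoding="utf-8").read()
paper_b = open("preprint_b.txt", encoding="utf-8").read()
print(f"shingle overlap: {jaccard(shingles(paper_a), shingles(paper_b)):.1%}")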

Similarly, open data means anyone can take a peek, poke around and pick out problems:

How did Dr. Deb manage to create the impression that he had generated a solid data set? Roberts suggests that a number of factors were at play. Several aspects of the experiments allowed Deb to work largely alone. The mouse facility was in a separate building, and "catching a mouse embryo at the three-cell stage had him in from midnight until dawn," Dr. Roberts noted. Deb was also on his second post-doc position, a time where it was essential for him to develop the ability to work independently. The nature of the data itself lent it to manipulation. The raw data for these experiments consisted of a number of independent grayscale images that are normally assigned colors and merged (typically in Photoshop) prior to analysis.

Again, if the "raw data" were available to all, as good open notebook science dictates that they should be, any manipulation could be detected more readily.
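
To make that concrete, the processing step described in the quotation - assigning colours to independent grayscale channels and merging them - amounts to something like the following sketch (the file names are hypothetical). Publishing only the merged composite, rather than the per-channel raw files, is precisely what makes after-the-fact checking so hard:

import numpy as np
from PIL import Image

# Load two independent grayscale channel images (hypothetical file names).
red_channel = np.array(Image.open("channel_red.tif").convert("L"))
green_channel = np.array(Image.open("channel_green.tif").convert("L"))
blue_channel = np.zeros_like(red_channel)  # unused channel left black

# Assign each grayscale image to a colour channel and merge into one composite.
composite = np.dstack([red_channel, green_channel, blue_channel]).astype(np.uint8)
Image.fromarray(composite, mode="RGB").save("merged_composite.png")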

Interestingly, this is not something that traditional "closed source" publishing can ever match using half-hearted fudges or temporary fixes, just as closed source programs can never match open ones for transparency. There is simply no substitute for openness.

22 July 2010

Openness: Just What the Doctoral Student Ordered

In 2007 the British Library (BL) and the JISC funded The Google Generation Information Behaviour of the Researcher of the Future research (CIBER, 2008), which focused on how researchers of the future, ‘digital natives’ born after 1993, are likely to access and interact with digital resources in five to ten years’ time. The research reported overall that the information literacy of young people has not improved with wider access to technology.

To complement the findings of the Google Generation research, the BL and the JISC commissioned this three‐year research study Researchers of Tomorrow focusing on the information‐seeking and research behaviour of doctoral students born between 1982 – 1994, dubbed ‘Generation Y’.

There's lots of interesting stuff in the first report, but what really caught my attention was the following:

The principles behind open access publishing and self‐archiving speak to the students’ desire for an all‐embracing, seamlessly accessible research information network in which restrictions on access do not constrain them. Similarly, many of the students favour open source technology applications (e.g. Linux, Mozilla) to support the way they want to work and organise their research, and are critical of the lack of technical support to open source applications in their own institutions.

However, as the report emphasises, students remain somewhat confused about what open access really is. This suggests fertile ground for a little more explanation by open access practitioners - the benefits of doing so could be considerable.

It's also rather ironic that one of those behind the report should be the British Library: as I've noted with sadness before, the BL is one of the leading opponents of openness in the academic world, choosing instead to push DRM and patent-encumbered Microsoft technologies for its holdings. It's probably too much to expect it to read the above sections and to understand that it is going in exactly the wrong direction as far as future researchers - its customers - are concerned...

Follow me @glynmoody on Twitter or identi.ca.