Showing posts with label ross anderson.

23 March 2009

The State of the Database State

A recurrent theme in these posts – and throughout Computerworld UK – has been the rise of vast, unnecessary and ultimately doomed databases in the UK.

But those stories have been largely sporadic and anecdotal; what has been lacking has been a consolidated, coherent and compelling analysis of what is going on in this area – what is wrong, and how we can fix it.

That analysis has just arrived in the form of the Database State report, commissioned by the Joseph Rowntree Foundation from the Foundation for Information Policy Research (FIPR).

On Open Enterprise blog.

04 February 2009

Light Blue Rebel Code

Cambridge University is celebrating its 800th anniversary in 2009. The official history tells the tale of the buildings; but what about the ideas?

Down through the years, Oxford has produced many powerful men and Cambridge many iconoclasts – scientists, philosophers and revolutionaries. The polarisation is by no means total: Oxford's alumni include the reformer John Wyclif and the father of economics Adam Smith, while ours include the Prime Minister Charles Grey, who abolished slavery and passed the Great Reform Bill. But we've long produced more of the rebels; way back in the Civil War, for example, we were parliamentarian while Oxford was royalist. Why should this be?

Read on for the rest of this splendidly iconoclastic history of Cambridge University by Ross Anderson, a man who managed a fair bit of iconoclasm in his undergraduate days, as I recall.... (Via John Naughton.)

03 September 2008

ContactPoint: What is it Good For?

Scrapping:

Anderson disagrees: "If you allow large numbers of people access to sensitive data it's never going to be secure. You can't protect it. ContactPoint should simply never have been built."

This is Prof Ross Anderson, and he knows whereof he speaketh.

23 July 2008

W(h)ither the UK Database Nation?

Interesting:

The court’s view was that health care staff who are not involved in the care of a patient must be unable to access that patient’s electronic medical record: “What is required in this connection is practical and effective protection to exclude any possibility of unauthorised access occurring in the first place.” (Press coverage here.)

A “practical and effective” protection test in European law will bind engineering, law and policy much more tightly together. And it will have wide consequences. Privacy campaigners, for example, can now argue strongly that the NHS Care Records service is illegal.

To say nothing of the central ID card database that permits all kinds of decentralised access....
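To put the engineering side of that "practical and effective" test in concrete terms: an unauthorised request has to be refused before any data leaves the system, rather than handed over and merely logged for a later audit. Here is a minimal sketch of that idea in Python; all the names (CareRecord, care_team and so on) are invented for illustration and bear no relation to how the NHS Care Records Service is actually built:

# Illustrative sketch only: deny access up front, don't just audit afterwards.

class AccessDenied(Exception):
    pass

class CareRecord:
    def __init__(self, patient_id, care_team, contents):
        self.patient_id = patient_id
        self.care_team = set(care_team)   # staff IDs currently treating this patient
        self._contents = contents

    def read(self, staff_id):
        # The check happens before any data is returned: staff not involved
        # in this patient's care simply cannot retrieve the record.
        if staff_id not in self.care_team:
            raise AccessDenied(f"{staff_id} is not on the care team for {self.patient_id}")
        return self._contents

record = CareRecord("patient-42", care_team={"dr-jones"}, contents="blood test results")
print(record.read("dr-jones"))        # permitted: a treating clinician

try:
    record.read("receptionist-7")     # not on the care team: refused outright
except AccessDenied as refusal:
    print("refused:", refusal)

The point of the sketch is that the refusal is structural: there is simply no code path that returns the record to someone outside the care team and trusts an audit trail to catch the misuse afterwards.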

17 June 2008

Insecurity is Bad for Your Health

Outrageous:

A shocking article appeared yesterday on the BMJ website. It recounts how auditors called 45 GP surgeries asking for personal information about 51 patients. In only one case were they asked to verify their identity; the attack succeeded against the other 50 patients.

31 August 2006

Security Engineering - the Book

I've mentioned Ross Anderson before in this blog, and my own failed attempt to interact with him. But I won't let a little thing like that get in the way of plugging his book Security Engineering - especially now that it can be freely downloaded. If you want to know why that's good news, try reading the intro to said tome, written by the other Mr Security, Bruce Schneier. (Via LWN.net.)

02 July 2006

The Economics of Security

In his latest Wired column, Bruce S. writes about a subject particularly dear to my heart: the economics of security. He was lucky enough to go up to the fifth Workshop on the Economics of Information Security at Cambridge; I had hoped to go, but a sudden influx of work prevented me.

My own interest in this area was sparked by a talk that Ross Anderson, now a professor at Cambridge, gave down in London. I vaguely knew Ross at university, when both of us had rather more hair than we do now. Since this was 30 years ago, it's not surprising that he didn't remember me when I introduced myself at the London talk, pointing out that the last time I had seen him was in Whewell's Court: he stared at me as if I was completely bonkers. Ah well.

Schneier gives a good summary of what this whole area is about, and why it is so important:

We generally think of computer security as a problem of technology, but often systems fail because of misplaced economic incentives: The people who could protect a system are not the ones who suffer the costs of failure.

When you start looking, economic considerations are everywhere in computer security. Hospitals' medical-records systems provide comprehensive billing-management features for the administrators who specify them, but are not so good at protecting patients' privacy. Automated teller machines suffered from fraud in countries like the United Kingdom and the Netherlands, where poor regulation left banks without sufficient incentive to secure their systems, and allowed them to pass the cost of fraud along to their customers. And one reason the internet is insecure is that liability for attacks is so diffuse.
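To see what "misplaced economic incentives" means in practice, here is a toy back-of-the-envelope sketch in Python, with entirely invented numbers: a bank choosing how much to spend on security when it bears the fraud losses, versus when it can pass them on to its customers.

# Toy illustration (entirely made-up numbers) of the incentive point in the
# quote above: whoever bears the cost of fraud decides how much security gets bought.

def expected_cost(spend, bank_bears_losses):
    """Total cost to the bank: its security spend, plus whatever fraud it must absorb."""
    fraud = 1_000_000 / (1 + spend / 100_000)   # invented curve: more spend, less fraud
    return spend + (fraud if bank_bears_losses else 0)

candidates = range(0, 1_000_001, 50_000)        # possible annual security budgets

for bank_bears_losses in (True, False):
    best = min(candidates, key=lambda s: expected_cost(s, bank_bears_losses))
    label = "bank bears the fraud losses" if bank_bears_losses else "losses passed on to customers"
    print(f"{label}: cost-minimising security spend = £{best:,}")

When the losses can be passed on, the bank's cost-minimising security spend is zero, which is exactly the ATM story Schneier tells about the UK and the Netherlands.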

Read the whole column, and then, if you are feeling strong, try Ross's seminal essay on the subject: "Why Information Security Is Hard -- An Economic Perspective".