January 31, 2018 · Posted in Breaches, Govt, Healthcare, Non-U.S.

This is the nightmare – that patients will stop sharing their serious and sensitive health or personal information with providers because the information is not being kept confidential. NHS should be focused on treating patients – not on being tin-star deputies for the Home Office on immigration issues.  

Alexander J. Martin reports:

The NHS has been told to “immediately stop sharing patient data with the Home Office” for the purpose of tracing immigration offenders.

Dr Sarah Wollaston MP, the chair of the House of Commons health select committee, has criticised a deal between NHS Digital and the Home Office that may be putting pregnant women and modern slavery victims at risk.

She said it was “deeply concerning” that NHS Digital “ever chose” to breach the patient confidentiality standards in order to assist immigration investigations.

Read more on Sky News.

January 30, 2018 · Posted in Breaches, Featured News, Online, Surveillance

Here’s the abstract of the article, which may scare you if you were counting on Bitcoin in conjunction with Tor to protect your privacy:

ABSTRACT

With the rapid increase of threats on the Internet, people are continuously seeking privacy and anonymity. Services such as Bitcoin and Tor were introduced to provide anonymity for online transactions and Web browsing. Due to its pseudonymity model, Bitcoin lacks retroactive operational security, which means historical pieces of information could be used to identify a certain user. We investigate the feasibility of deanonymizing users of Tor hidden services who rely on Bitcoin as a payment method by exploiting public information leaked from online social networks, the Blockchain, and onion websites. This, for example, allows an adversary to link a user with @alice Twitter address to a Tor hidden service with private.onion address by finding at least one past transaction in the Blockchain that involves their publicly declared Bitcoin addresses.

To demonstrate the feasibility of this deanonymization attack, we carried out a real-world experiment simulating a passive, limited adversary. We crawled 1.5K hidden services and collected 88 unique Bitcoin addresses. We then crawled 5B tweets and 1M BitcoinTalk forum pages and collected 4.2K and 41K unique Bitcoin addresses, respectively. Each user address was associated with an online identity along with its public profile information. By analyzing the transactions in the Blockchain, we were able to link 125 unique users to 20 Tor hidden services, including sensitive ones, such as The Pirate Bay and Silk Road. We also analyzed two case studies in detail to demonstrate the implications of the resulting information leakage on user anonymity. In particular, we confirm that Bitcoin addresses should always be considered exploitable, as they can be used to deanonymize users retroactively. This is especially important for Tor hidden service users who actively seek and expect privacy and anonymity.
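The core linkage step is simple enough to sketch in a few lines of code. The snippet below is only an illustration of the matching logic described in the abstract, with made-up addresses, variable names, and function names; the authors' actual pipeline rests on the large-scale crawling of hidden services, Twitter, and BitcoinTalk described above.

```python
# Minimal sketch of the address-linkage step described in the abstract.
# Assumes the crawling has already produced three inputs (all hypothetical here):
#   identities: {bitcoin_address: "@twitter_handle"} scraped from social networks
#   services:   {bitcoin_address: "xyz.onion"}       scraped from hidden-service pages
#   txs:        iterable of (input_addresses, output_addresses) pairs from the Blockchain

def link_users_to_services(identities, services, txs):
    """Return (handle, onion) pairs connected by at least one transaction."""
    links = set()
    for inputs, outputs in txs:
        # A payment from a user-controlled address to a service address
        # retroactively ties the public identity to the hidden service.
        payers = {identities[a] for a in inputs if a in identities}
        payees = {services[a] for a in outputs if a in services}
        for handle in payers:
            for onion in payees:
                links.add((handle, onion))
    return links

if __name__ == "__main__":
    identities = {"1AliceAddr": "@alice"}
    services = {"1PrivateOnionAddr": "private.onion"}
    txs = [({"1AliceAddr"}, {"1PrivateOnionAddr", "1ChangeAddr"})]
    print(link_users_to_services(identities, services, txs))
    # -> {('@alice', 'private.onion')}
```

The point is that the Blockchain itself does the linking: once an address is publicly tied to a persona anywhere, every transaction it ever touched becomes a potential breadcrumb.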

January 30, 2018 · Posted in Court, Featured News, Non-U.S., Surveillance

Matt Burgess reports what is some Very Big News:

In December 2016, the EU’s highest court ruled that governments keeping emails and electronic communications on a “general and indiscriminate” basis was illegal. In doing so, it pushed the UK’s controversial surveillance laws back into the country’s highest court.

Now, in what will be seen as a fresh blow for the government’s so-called Snooper’s Charter, the UK’s Court of Appeal has ruled previous surveillance law was illegal. After years of legal wrangling, the court said the Data Retention and Investigatory Powers Act 2014 (Dripa) didn’t put restrictions on the access to reams of data collected about people in the UK.

Read more on Wired UK.

January 29, 2018 · Posted in Business, Featured News, Healthcare

David Gershgorn reports:

Some of Google’s top AI researchers are trying to predict your medical outcome as soon as you’re admitted to the hospital.

A new research paper, published Jan. 24 with 34 co-authors and not peer-reviewed, claims better accuracy than existing software at predicting outcomes like whether a patient will die in the hospital, be discharged and readmitted, and their final diagnosis. To conduct the study, Google obtained de-identified data of 216,221 adults, with more than 46 billion data points between them. The data span 11 combined years at two hospitals, University of California San Francisco Medical Center (from 2012-2016) and University of Chicago Medicine (2009-2016).

Read more on Quartz.

OK, now if this is accurate, it sounds really promising, right? But I wondered how they got so much de-identified medical data on so many people. So I took a look at the paper’s methods section, and here’s what it says:

We included EHR data from the University of California, San Francisco (UCSF) from 2012-2016, and the University of Chicago Medicine (UCM) from 2009-2016. We refer to each health system as Hospital A and Hospital B. All electronic health records were de-identified, except that dates of service were maintained in the UCM dataset. Both datasets contained patient demographics, provider orders, diagnoses, procedures, medications, laboratory values, vital signs, and flowsheet data, which represents all other structured data elements (e.g. nursing flowsheets), from all inpatient and outpatient encounters. The UCM dataset (but not UCSF) additionally contained de-identified, free-text medical notes. Each dataset was kept in an encrypted, access-controlled, and audited sandbox.

Ethics review and institutional review boards approved the study with waiver of informed consent or exemption at each institution.

So if you went to either of these hospitals, the hospital might have subsequently waived your informed consent and just turned over data on you that everyone believes is de-identified. Now it’s great that it was kept encrypted, access-controlled, and in an audited sandbox, but here’s the thing: are you okay with a hospital waiving your informed consent? How difficult might it be to re-identify the data?
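To make that last question concrete, here is a toy illustration of a classic linkage attack, the usual way “de-identified” records get matched back to real people. Every value below is invented for the example and has nothing to do with the UCSF or UCM datasets; it just shows why retained dates of service and demographics matter.

```python
# Toy linkage attack: join a "de-identified" hospital extract to outside
# information on quasi-identifiers. All data here is made up.
import pandas as pd

# De-identified extract: no names, but dates of service and demographics
# remain (as the paper notes dates were kept in the UCM dataset).
hospital = pd.DataFrame({
    "admit_date": ["2015-03-02", "2015-03-02", "2015-07-19"],
    "zip3": ["606", "606", "941"],
    "birth_year": [1948, 1982, 1975],
    "sex": ["F", "M", "F"],
    "diagnosis": ["CHF", "appendicitis", "fracture"],
})

# Outside clue that names a person plus the same quasi-identifiers
# (a news story, a social media post about a hospital stay, a voter roll).
outside = pd.DataFrame({
    "name": ["Jane Roe"],
    "admit_date": ["2015-03-02"],
    "zip3": ["606"],
    "birth_year": [1948],
    "sex": ["F"],
})

# If the quasi-identifier combination is unique, the join re-identifies the
# "de-identified" row and attaches a diagnosis to a name.
reidentified = outside.merge(hospital, on=["admit_date", "zip3", "birth_year", "sex"])
print(reidentified[["name", "diagnosis"]])
```

With enough quasi-identifiers, a single outside clue can be enough to pick out one row, which is why “we removed the names” is not the same thing as anonymity.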

I know a lot of people feel that it’s okay for entities to do this (waive consent) because it’s in the best interests of public health and progress, but of course, I focus on the individual’s rights. So think about it… is this okay, and if it’s not, how does that affect your use of a particular hospital? Would you say or do anything different?