Any fair-minded assessment of the dangers of the deal between Britain's National Health Service (NHS) and DeepMind must start by acknowledging that both sides mean well. DeepMind is one of the leading artificial intelligence (AI) companies in the world. The potential of this work applied to health care is very great, but it could also lead to a further concentration of power in the tech giants. It is against that background that the information commissioner, Elizabeth Denham, has issued her damning verdict against the Royal Free hospital trust under the NHS, which handed over to DeepMind the records of 1.6 million patients in 2015 on the basis of a vague agreement that took far too little account of the patients' rights and their expectations of privacy.

DeepMind has almost apologized. The NHS trust has mended its ways. Further arrangements - and there may be many - between the NHS and DeepMind will be carefully scrutinised to ensure that all necessary permissions have been asked of patients and all unnecessary data has been cleaned. There are lessons about informed patient consent to learn. But privacy is not the only angle in this case, and not even the most important one.

Ms. Denham chose to concentrate the blame on the NHS trust, since under existing law it "controlled" the data and DeepMind merely "processed" it. But this distinction misses the point: it is processing and aggregation, not the mere possession of bits, that gives the data value. The great question is who should benefit from the analysis of all the data that our lives now generate. Privacy law builds on the concept of damage to an individual from identifiable knowledge about them. That misses the way the surveillance economy works. The data of an individual there gains its value only when it is compared with the data of countless millions more.

The use of privacy law to curb the tech giants in this instance feels slightly maladapted; it does not address the real worry. It is not enough to say that the algorithms DeepMind develops will benefit patients and save lives. What matters is that they will belong to a private monopoly which developed them using public resources. If software promises to save lives on the scale that drugs now can, big data may be expected to behave as big pharma has done. We are still at the beginning of this revolution, and small choices now may turn out to have gigantic consequences later. A long struggle will be needed to avoid a future of digital feudalism. Ms. Denham's report is a welcome start.