“I’ve just seen a face” *
The September edition of your PL&B UK Report is the first to feature two cover-page stories focusing on tech issues. The first covers the use of Artificial Intelligence in the context of algorithmic bias; the second examines the privacy risks associated with smart home innovations.
There has been wide recent coverage of the use of biometric identification to verify identity: the King's Cross Estate scheme in London, now abandoned; the High Court case in Cardiff concerning police use of the technology; and the trial at Gatwick Airport, now going ahead.
In the corporate context, some of the same issues apply and some different ones arise. The different issues include the use of facial recognition in the private domain and consent in the workplace. Common concerns, however, include the drive for accuracy, the risks of false positives and false negatives, and both over-reliance and under-reliance on biometric identification schemes.
Biometric identification roundtable: To address these issues, Privacy Laws & Business has organised a roundtable on 29 January 2020, Balancing privacy with biometric techniques used in a commercial context, to be hosted by Macquarie Group in London. We will learn about the latest technological developments in this field, explore current practice to assess what is lawful, and examine fair collection and use of personal data in different company contexts.
Companies are increasingly using voice or facial recognition and other biometric techniques in a variety of commercial and employment-related contexts, such as workplace attendance and fraud prevention in financial services.
The objective of this Roundtable is for companies and law firms to exchange experience on how they are implementing and using biometric techniques in different scenarios, and how they plan to use them in the future. While the technology is constantly developing, the question facing companies, as always, is: just because a technique can be developed, should it be used? What are, and what should be, the legal and ethical constraints on its use? Informing customers and staff about the use of biometric techniques is necessary and has a preventative role. What would privacy regulators regard as striking a fair balance between using a technique to reduce fraud and enhancing fairness to customers? How have different companies approached this subject?
If you would like to join this roundtable to share experience, please contact me with your ideas. Registration details and further information will follow shortly.
Finally, Laura Linkomies, Editor, and all of our editorial team are very gratified by the prompt and positive responses to my request a couple of days ago for comments on how you find PL&B UK Report useful. You can see several of these comments at the foot of the subscription page. Others are welcome!
Regards,
Stewart Dresner, Publisher
* The Beatles - “I've just seen a face, I can't forget the time or place”
UK Report 105
Lead stories:
AI-powered Onfido one of first selected for the ICO’s Sandbox
Onfido, an identity verification company, will research how to identify and mitigate algorithmic bias in machine learning models used for remote biometric identification. By Ali Vaziri of Lewis Silkin LLP.
Smart-home study weighs the privacy risks involved
Martin Kraemer and William Seymour at the University of Oxford report on an ICO-funded research project investigating how ‘smart’ doesn’t have to mean invasive.