“I’ve just seen a face” *
The September edition of your PL&B UK Report is the first to feature two tech stories on the cover page. The first covers the use of Artificial Intelligence in the context of algorithmic bias (p.1), and the second the privacy risks associated with smart home innovations (p.1).
There has been wide coverage recently of the use of biometric identification to verify identity: the King's Cross Estate scheme in London, now abandoned; the High Court case involving police use of this technology in Cardiff (p.15); and the experiment at Gatwick Airport, now going ahead.
In the corporate context, some of the same and some different issues apply. Different issues include use of facial recognition in the private domain and consent in the workplace. However, the drive for accuracy, the risks of false positives and false negatives, and both over-reliance and under-reliance on biometric identification schemes are common concerns.
Biometric identification roundtable: To address these issues, Privacy Laws & Business has organised a roundtable on 29 January 2020, “Balancing privacy with biometric techniques used in a commercial context”, to be hosted by Macquarie Group in London. We will learn about the latest tech developments in this field, explore current practice to assess what is lawful, and examine fair collection and use of personal data in different company contexts.
Companies are increasingly using voice recognition, facial recognition and other biometric techniques in a variety of commercial and employment-related contexts, such as attendance in the workplace and fraud prevention in financial services.
The objective of this Roundtable is for companies and law firms to exchange experience on how they are implementing and using biometric techniques in different scenarios, and how they plan to use them in the future. While the technology is constantly developing, the question facing companies, as always, is: just because a technique can be developed, should it be used? What are, and should be, the legal and ethical constraints on its use? Informing customers and staff of the use of biometric techniques is necessary and has a preventative role. What would privacy regulators regard as achieving a fair balance between use of a technique to reduce fraud and enhance fairness to customers? How have different companies approached this subject?
If you would like to join this roundtable to share experience, please contact me with your ideas. Registration details and further information will be available shortly.
Finally, Laura Linkomies, Editor, all of our editorial team and I are very gratified by the prompt and positive responses to my request a couple of days ago for comments on how you find PL&B UK Report useful. You can see several of these comments at the foot of the subscription page. Others are welcome!
Stewart Dresner, Publisher
UK Report 105
AI-powered Onfido one of first selected for the ICO’s Sandbox
Onfido, an identity verification company, will research how to identify and mitigate algorithmic bias in machine learning models used for remote biometric identification. By Ali Vaziri of Lewis Silkin LLP.
Contents also include:
- Comment: Brexit data protection issues
- Busy year for the ICO
- Smart-home privacy risks
- Finding a legal ground for AI
- Data protection risk management
- Subject Access Requests
- The ICO’s cookie guidance
- Book Review: Data Protection Strategy
- Council for Internet Safety
- DP Act immigration exemption in High Court
- DMA gathers views on GDPR and e-Privacy
- Facial recognition scrapped
- Bias in algorithmic decisions
- Responsible Marketing award
- Government smart data initiative
- Changes for privacy and DP claims
- No set age when children understand privacy
- ICO consults on data sharing code
- Draft Environment Bill would ‘restrict EIR rights’