Balancing privacy with biometric techniques in a commercial context
Date: 29 January 2020
Location: Macquarie Group, London
Registration: 09.00 - 09.30
Roundtable: 09.30 - 16.30
Facial recognition is a common example of a biometric technique, and its use in public areas, such as railway stations, and to help verify identity has attracted the attention of privacy regulators. Meanwhile, companies are increasingly using voice recognition, facial recognition and other biometric techniques in a variety of commercial and employment-related contexts, such as monitoring attendance in the workplace and preventing fraud in financial services.
The objective of this Roundtable is for companies to exchange experience on how they are implementing and using biometric techniques in different scenarios, and how they plan to use them in the future. While the technology is constantly developing, the question facing companies, as always, is: just because a technique can be developed, what are, and what should be, the legal and ethical constraints on its use? Informing customers and staff of the use of biometric techniques is necessary and has a preventative role. What would privacy regulators regard as a fair balance between using a technique to reduce fraud and fairness to customers? How have different companies approached this subject?
Privacy Laws & Business, hosted by Macquarie Group, is facilitating a Roundtable immediately after Data Protection Day 2020, to learn about the latest tech developments in this field and explore current practice to assess what is not only lawful but also fair in the collection and use of personal data in different company contexts.
- Which biometric techniques are most used to verify identity or for other purposes related to:
- 1.1 the workplace?
- 1.2 customers?
- Which biometric techniques and third-party service providers are you planning to use in the next 3 years?
- How do you navigate through the various third-party related privacy issues?
- 2.1 Choice of provider
- 2.2 Compliant contracts
- 2.3 Data security
- 2.4 Data Protection by Default as a criterion for selection
- 2.5 Data Protection Impact Assessment as a criterion for selection
- 2.6 Use of data and purpose limitation
- 2.7 Period of retention of data
- 2.8 De-identification and destruction of data
- 2.9 Other
- How have you assessed the GDPR applying to biometric technologies?
- What, if any, specific legal issues are related to specific biometric technologies, including identification techniques?
- 5.1. Facial recognition
- 5.2. Voice recognition
- 5.3. Iris recognition
- 5.4. Gait
- 5.5. Keyboard and mouse patterns
- 5.6. Other
- To what extent does Artificial Intelligence help any of the processes, including the verification process?
- Have labour unions, works councils, and/or privacy advocates opposed the introduction of biometric technologies, including for identification purposes, in your organisation, or sought to limit their use?
- 7.1. If so, on which grounds?
- 7.2. Have any harms been identified?
- 7.3. If so, which harms?
- How have you addressed, or do you plan to address, these concerns?
- To what extent can consent be freely given and informed in this context?
- What other privacy legal issues arise? How have you dealt with them?
- A case study from the ICO’s sandbox programme with comments from both the company and the ICO.
- Identify legal issues, ethical or commercial/risk considerations in specific jurisdictions.
- Action points in general and in specific jurisdictions.
SRA hours available
Detailed programme coming soon
| | Early Bird Rate | Standard Rate |
| --- | --- | --- |
| Single registration, price per person | £250 + VAT | £350 + VAT |
To register click BOOK NOW at the top of this page or email email@example.com.