The Information Commissioner’s Office (ICO) has warned companies to steer clear of ‘emotional analysis’ technologies or face fines, citing the ‘pseudo-scientific’ nature of the field.
It is the first time the regulator has issued a blanket warning about the ineffectiveness of a new technology, said deputy commissioner Stephen Bonner, but the warning is justified by the harm that could be caused if companies make significant decisions based on meaningless data.
The ICO is the UK’s independent regulator for data protection and freedom of information. It upholds information rights in the public interest and protects data privacy for individuals.
Companies that do not act responsibly, pose risks to vulnerable people, or fail to meet ICO expectations will be investigated.
Emotional analysis technologies process data such as gaze tracking, sentiment analysis, facial movements and expressions, gait analysis, heartbeats, and skin moisture.
This includes, for example, monitoring the physical health of workers through wearable detection devices, or using visual and behavioral methods (body position, speech, and eye and head movements) to register students for exams.
Emotion analysis relies on collecting, storing, and processing a range of personal data, including subconscious behavioral or emotional responses and, in some cases, special category data. This type of data use is far riskier than traditional biometric technologies used to verify or identify a person.
Because the algorithms are not yet developed enough to detect emotional signals reliably, there is a risk of systemic bias, inaccuracy, and even discrimination.
“Developments in the biometrics and emotional AI market are immature. They may not work yet, or ever. While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgments about a person that are inaccurate and lead to discrimination,” says Bonner.
“The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science. As things stand, we have yet to see emotional AI technology developed in a way that satisfies data protection requirements, and we have broader questions about proportionality, fairness and transparency in this area.
“The ICO will continue to scrutinise the market, identifying stakeholders who are seeking to create or deploy these technologies, and explaining the importance of enhanced data privacy and compliance, whilst encouraging trust and confidence in how these systems work.”
The ICO is also helping businesses and organizations developing biometric products and services to adopt a ‘privacy by design’ approach, reducing risk factors and ensuring they operate securely and lawfully.
Additional information is available in two new reports, published this week, to help companies navigate the use of emerging biometric technologies.
Emotional AI is one of four topics the ICO has identified in a study on the future of biometrics. It is accompanied by another study providing an introduction to the state of biometrics and regulation in the UK.
Examples of current use of biometric technologies:
Financial companies use facial recognition to verify people’s identities by comparing a photo ID with a selfie. The computer systems then assess the likelihood that the document is genuine and that the person in both images is the same.
Airports are looking to simplify the passenger journey by using facial recognition at check-in, self-service bag drop, and gates.
Other companies use voice recognition to allow users to access secure platforms instead of using passwords.
Biometric technologies are also expected to have a major impact on the following sectors:
The finance and commerce sectors, which are rapidly deploying behavioral biometrics and technologies such as voice, gait, and vein shape for identification and security purposes.
The fitness and health sector, which is expanding the range of biometric data it collects, with consumer devices being repurposed for health data.
The employment sector, which has begun to deploy biometrics for interview analysis and staff training.
Preschool education, where behavioral analysis is becoming an important, albeit more distant, concern.
Biometrics will also be integral to the success of immersive entertainment.