Mirror, mirror: Navigating privacy and AI compliance with digital clones


Contributors:
Frida Alim
Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains.
Digital clones — whether lifelike avatars, voice replicas or interactive chatbots — present exciting opportunities for creators to interact with fans, offer one-on-one coaching at scale and continue their legacies online. For companies, however, the generation and operation of these digital clones present novel challenges under laws — both new and old — relating to privacy, rights of publicity and artificial intelligence transparency.
Companies that create digital clones face risks around identity verification, exploitation and misuse, and should be aware of key compliance areas and security considerations.
Biometric information collection
The creation of a digital clone or replica may involve analyzing photos, videos or audio recordings to generate a realistic digital replica of an identifiable individual. This digital clone may mimic the individual's appearance, gestures or voice. Companies that create digital clones should evaluate whether the process involves the creation of biometric information based on photos, videos or audio.
Illinois' Biometric Information Privacy Act, for example, applies to biometric information, meaning certain biometric identifiers — including voiceprints or facial geometry — that can be used to uniquely identify an individual. Under the law, companies that collect biometric information have a number of obligations, including obtaining consent to the collection and processing of the data, refraining from selling or otherwise profiting from it, and adhering to specific retention periods.