With the rise of avatars and digital twins, models, celebrities and talent need to protect their image. Companies like The Diigitals market virtual supermodels and, since 2023, brands like Mango and Levi’s have produced campaigns generated entirely by AI. H&M plans to create 30 digital clones of models and allow them to license their avatars to other brands. This approach, hailed for its ethical dimension, raises a number of questions: how should exploitation rights be managed? How should creation and distribution be remunerated? What uses should be prohibited or restricted?
Legal experts already have several reference points. In the United States, the Fashion Workers Act (FWA), which came into force in New York in June 2025, requires that any agreement relating to a digital replica (digital twin) be distinct from the representation contract, specifying scope, purpose, remuneration and duration. The FWA prohibits agencies from requiring an exclusive commitment of more than three years, or from charging more than 20% commission. The law also requires the client to obtain clear written consent before creating or using a digital replica. The OnLabor article stresses that this consent must be independent of the agency contract, and warns that clients could nevertheless pressure models by working only with those who have agreed to be duplicated.
In California, two laws (AB 1836 and AB 2602) that took effect on January 1, 2025 strengthen protections. AB 1836 prohibits the use of digital replicas of deceased persons without the consent of their estates and provides for minimum statutory damages of $10,000. AB 2602 renders unenforceable contract clauses that authorize the use of a digital replica without a reasonably specific description of the use or without legal representation of the talent. These provisions complement European regulations: the AI Act requires AI-generated content to be clearly labeled and training data to be transparent, while the General Data Protection Regulation (GDPR) governs the use of biometric data.
To protect models and talent, contracts must include precise, respectful clauses:
Definition of digital replica: specify that it is a computer-generated representation capable of reproducing the voice, face or body of an individual. Exclude minor, standard retouching from this definition.
Purpose: detail the intended use (advertising campaign for fashion, cosmetics, perfume, watches or jewelry, e-commerce presentation, in-house training) and prohibit any unintended use.
Separate written consent: require the talent to sign a separate document specifying the creation and use of the replica, in compliance with the FWA.
Duration and scope of use: set the duration (e.g. maximum three years) and territories (worldwide or specific zones) of the clone’s use.
Staggered remuneration: set a fixed fee for the creation of the digital twin, then pay per use (per campaign, per video, per platform) to align interests.
Protection against re-training: prohibit the training of AI models on talent data without consent, along the lines of Californian laws.
Prohibition of sensitive uses: explicitly exclude uses that could damage reputation (non-consensual nudity, political use, misleading deepfakes). The FWA already requires compliance with decency laws.
Revocation and right of withdrawal: enable talent to withdraw consent and demand the removal of generated content within a reasonable timeframe, with fair compensation.
Transparency and traceability: require the brand to document the use of the clone (dates, campaigns, platforms) and provide reports to the talent. California regulations require a specific description of usage.
Indemnity and insurance: include an indemnity clause to cover damage caused by unauthorized use, and require clients to take out liability insurance for the use of digital replicas.
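As an illustration only, the checklist above can be captured as a structured record that agency tooling validates before a deal memo is signed. This is a minimal sketch, not a legal tool: every name (`ReplicaConsent`, `check_clauses`) and every threshold is a hypothetical encoding of the clauses listed above, not a statement of what the FWA or California law requires.

```python
from dataclasses import dataclass, field

@dataclass
class ReplicaConsent:
    """Hypothetical record of the negotiated terms for one digital replica."""
    talent_name: str
    separate_consent_signed: bool      # consent distinct from the agency contract
    purposes: list[str]                # e.g. ["advertising", "e-commerce"]
    duration_years: float              # contractual term of use
    territories: list[str]             # e.g. ["worldwide"]
    fee_creation: float                # fixed fee for creating the twin
    fee_per_use: float                 # paid per campaign, video or platform
    retraining_allowed: bool = False   # AI training on talent data: off by default
    prohibited_uses: list[str] = field(
        default_factory=lambda: ["nudity", "political", "deepfake"]
    )

def check_clauses(c: ReplicaConsent) -> list[str]:
    """Return the checklist items the record fails to satisfy (empty = all pass)."""
    issues = []
    if not c.separate_consent_signed:
        issues.append("consent must be a separate written document")
    if c.duration_years > 3:
        issues.append("duration exceeds the three-year maximum assumed here")
    if not c.purposes:
        issues.append("purpose of use must be specified")
    if c.fee_per_use <= 0:
        issues.append("per-use remuneration must be set")
    return issues
```

A deal memo would then be screened before signature, e.g. `check_clauses(deal)` returning an empty list when every clause in the checklist is covered.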
These clauses should be included in deal memos and framework contracts. Fashion agencies can use template contracts and negotiation playbooks to guide their sales teams. The aim is to protect talent while providing fashion, cosmetics, perfume, watch and jewelry houses with a clear framework for exploiting AI responsibly. Ongoing dialogue between lawyers, bookers and clients is essential: a misunderstood clause can provoke disputes or erode trust. By drawing on the expertise of firms like Palmer IA, Elite can act as a trusted third party and support the smooth adoption of digital avatars and clones.