Bryan Cranston and SAG-AFTRA say OpenAI is taking their deepfake concerns seriously

TribeNews
3 Min Read

Elissa Welle

is a NYC-based AI reporter and is currently supported by the Tarbell Center for AI Journalism. She covers AI companies, policies, and products.


Actors, studios, agents, and the actors union SAG-AFTRA have all expressed their concerns about appearing in Sora 2’s AI-generated videos ever since the deepfake machine was released last month. Now a joint statement from actor Bryan Cranston, OpenAI, the union, and others says that after videos of him appeared on Sora — one even showed him taking a selfie with Michael Jackson — the company has “strengthened guardrails” around its opt-in policy for likeness and voice.

The joint statement said that OpenAI “expressed regret for these unintentional generations.” It was also cosigned by the talent agencies United Talent Agency and Creative Artists Agency, along with the Association of Talent Agents, which had criticized the company’s lack of protections for artists in the past. OpenAI did not give specifics on how it would change the app, or reply to The Verge’s request for comment in time for publication.


OpenAI appeared to reaffirm its commitment to stronger protections for those who do not opt in: “All artists, performers, and individuals will have the right to determine how and whether they can be simulated.” It also said it would “expeditiously” review complaints about breaches of the policy.

Cranston said he is “grateful to OpenAI for its policy and for improving its guardrails.” While Cranston’s case came to a positive resolution, SAG-AFTRA president Sean Astin said in the joint statement that performers need a law to protect them from “massive misappropriation by replication technology,” and pointed to the proposed Nurture Originals, Foster Art, and Keep Entertainment Safe Act, or NO FAKES Act.


OpenAI launched Sora 2 with an opt-out policy for copyright holders, before reversing course following public outcry and videos of Nazi SpongeBob, promising to “give rightsholders more granular control over generation of characters, similar to the opt-in model for likeness but with additional controls.”
