Trevor Noah says AI-powered video generators like OpenAI’s Sora could be ‘disastrous’

 

Comedian Trevor Noah (left) interviews Code.org CEO Hadi Partovi at an event at the Microsoft campus in Redmond, Washington, on Thursday to launch the company’s new Microsoft Elevate Washington initiative for AI and education. (GeekWire Photo/Taylor Soper)

Trevor Noah is concerned about where things are going with controversial AI video generators.

The comedian and former Daily Show host said AI video apps like OpenAI’s Sora could be “disastrous” if they continue to use people’s likenesses without permission.

“I have to find out exactly what they’re doing and how they’re doing it,” he told GeekWire. “But I don’t think any of it ends well if they’re not dealing with permission.”

We spoke to Noah — Microsoft’s “chief questions officer” — on Thursday, following his appearance at the company’s headquarters in Redmond, where he helped launch a new AI education initiative in Washington state.

OpenAI last week released Sora 2, a new version of its AI video generation system that creates hyper-realistic clips from text prompts or existing footage. The new version adds a “Cameo” feature that allows users to generate videos with human likenesses by uploading or referencing existing photos.

The update made the invite-only Sora one of the most viral consumer technology products of 2025 – it’s the top free app in Apple’s App Store.

It has also drawn intense resistance from major Hollywood talent agencies, which criticized the software for allowing the use of a person’s image or likeness without explicit consent or compensation.

Meanwhile, AI-generated videos depicting deceased celebrities such as Robin Williams and George Carlin have sparked public outrage from their families.

Noah told GeekWire that “this could end up being the most disastrous thing for everyone involved.”

He pointed to Denmark, which recently introduced legislation that would give individuals ownership of their digital likeness.

“I think the U.S. needs to catch up as quickly as possible,” Noah said.

Legal experts say the next wave of AI video tools – including those from Google and Meta – will test existing publicity and likeness laws. Kraig Baker, a Seattle-based media lawyer with Davis Wright Tremaine, said the problem likely won’t be deliberate misuse by advertisers, but rather the flood of casual or careless content that includes images of people, now enabled by AI.

He added that the issue can be especially thorny for deceased public figures whose estates no longer actively manage image rights.

There are broader potential impacts, as New York Times columnist Brian Chen observed: “Technology could spell the end of visual fact – the idea that video could serve as an objective record of reality – as we know it. Society as a whole will have to treat videos with as much skepticism as people already treat words.”

OpenAI published a Sora 2 safety document outlining its consent-based approach to likeness. “Only you decide who can use your cameo, and you can revoke access at any time,” the company says. “We have also taken steps to block depictions of public figures (except those using the cameo feature, of course).”

Sora initially launched with an opt-out policy for copyrighted characters. But in an update, OpenAI CEO Sam Altman said the company now plans to give “rights holders more granular control over character generation” and establish a revenue model for copyright holders.

The wave of attention on AI video generators is creating opportunities for startups like Loti, a Seattle company that helps celebrities, politicians and other prominent individuals protect their digital likeness.

“Everyone is concerned about how AI will use their image and is looking for trusted tools and partners to help guide them,” said Loti CEO Luke Arrigoni.

He said Loti’s business is “booming right now,” with roughly 30-fold month-over-month growth in sign-ups. The startup raised $16.2 million earlier this year.

