December 19, 2025 at 12:55 pm

OpenAI’s Sora 2 AI Video Generation Tool Is More Realistic Than Ever, And That’s Causing Ethical And Legal Concerns

by Michael Levanduski

Deep fake technology (Shutterstock)

Sora 2 is the latest version of the text-to-video AI tool from OpenAI (the company behind ChatGPT). The tool is very easy to use and can produce some impressive videos. If you spend any amount of time on TikTok, Reels, YouTube Shorts, or any similar platform, you have undoubtedly started to see videos generated with this tool, even if you don’t yet realize it.

On the surface, this is a fun tool that has a lot of potential, but there are some risks as well.

The video below is a potential example of this. Gabriel Petersson created a video of OpenAI CEO Sam Altman stealing graphics cards at Target. The video was made to look like a CCTV feed catching the CEO grabbing the cards, and the result was strikingly realistic.

Of course, Altman didn’t actually steal these cards. He has no need to after securing a $100 billion deal with NVIDIA to supply the cards OpenAI needs.

While the video itself is funny, the potential problems it highlights are not. Going forward, any time the police enter CCTV footage into evidence, defense attorneys will have a much easier time discrediting it.

CCTV is known for its low resolution, which makes faking it even easier and authenticating it more difficult. Video evidence may be all but useless in the courts very soon.

Another issue involves a feature called Cameos, which allows you to insert yourself (or anyone who gives you permission) into an artificially generated video. The company explains:

“With cameos, you are in control of your likeness end-to-end with Sora. Only you decide who can use your cameo, and you can revoke access or remove any video that includes it at any time.”


While it sounds like OpenAI is taking privacy and safety seriously, the fact that Sora 2 can do this at all means that third-party companies will be able to copy the capability, without the restrictions, before you know it. The widespread generation of deep fakes using this technology will be almost impossible to escape. It is easy to imagine an endless number of inappropriate ways this could be used for blackmail, for embarrassing other people, or for outright scams.

The bottom line: while this technology is really cool, it is also really dangerous. Unfortunately, the cat seems to be out of the bag, and there isn’t much that can be done to stop it.
