In the last few weeks, something deeply troubling has been trending on the internet. People are using artificial intelligence to create deepfake pornographic images of celebrities like Taylor Swift. The incident has raised widespread concern about how we use artificial intelligence in our daily lives. Let’s take a closer look at what’s going on.

What Actually Is a Deepfake?

Deepfake AI is a kind of artificial intelligence that crafts realistic fake images, audio, and videos. The term “deepfake” combines “deep learning” and “fake” to describe both the technology and the deceptive content it produces. These AI systems can either swap faces in existing content or generate entirely new content, making it seem like someone is doing or saying something they never actually did or said.
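To make the face-swap idea concrete, here is a minimal conceptual sketch of the classic deepfake architecture: a single shared encoder compresses any face into a common latent code, and a separate decoder per person renders that code back as that person's face. Swapping means encoding person A's photo and decoding it with person B's decoder. This is a toy illustration only; the random linear maps below stand in for trained deep convolutional networks, and all sizes and names are invented for the example.

```python
import random

random.seed(0)

LATENT = 4   # size of the shared latent code
PIXELS = 8   # size of a (very) tiny flattened "face image"

def make_weights(rows, cols):
    """A random linear map, standing in for a trained network layer."""
    return [[random.uniform(-1, 1) for _ in range(cols)] for _ in range(rows)]

def apply(weights, vec):
    """Multiply a weight matrix by a vector (one 'layer' forward pass)."""
    return [sum(w * x for w, x in zip(row, vec)) for row in weights]

# One encoder shared by both identities, plus one decoder per person.
encoder   = make_weights(LATENT, PIXELS)
decoder_a = make_weights(PIXELS, LATENT)
decoder_b = make_weights(PIXELS, LATENT)

def face_swap(image, target_decoder):
    """Encode with the shared encoder, then decode as the target person."""
    latent = apply(encoder, image)
    return apply(target_decoder, latent)

# A stand-in "photo of person A", re-rendered through person B's decoder:
photo_of_a = [random.uniform(0, 1) for _ in range(PIXELS)]
swapped = face_swap(photo_of_a, decoder_b)
print(len(swapped))  # same image size as the input, different "identity"
```

The key design point is the *shared* encoder: because both identities are squeezed through the same latent space during training, the latent code learns pose and expression while each decoder learns one person's appearance, which is what makes the swap possible.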

The Big Challenge: Tricking AI Image Generators

What led to this situation was a daily challenge on the website 4chan, where users compete to find ways to trick AI image generators into producing pictures they shouldn’t. These pictures are explicit, and the targets are mostly famous women. Participants treat it as a game to see if they can get past the safety rules on services like Microsoft Designer and OpenAI’s DALL-E 3.


Taylor Swift’s Problem: Just the Tip of the Iceberg

Even though Taylor Swift’s deepfakes are getting the most attention, she is not the only target on 4chan. The competition shows that anyone, famous or not, can become a victim of this kind of abuse.


The Problem Spreads: From 4chan to Mainstream Platforms

At first, these fake pictures circulated only on 4chan, but they quickly spread to other platforms such as Telegram and X. Shockingly, one fake image of Taylor Swift remained on X for a long time, where many users saw it before the platform took action.

The Circumvention Continues: Filters Aren’t Keeping Up

Even though the companies have blocked certain prompt keywords, people on 4chan keep sharing ideas for getting around those restrictions. They treat producing this explicit content like a game, and whenever the rules change, they simply find new ways to break them.

What the Politicians Say: Taking It Seriously

As deepfake pornography spreads, lawmakers are taking notice. The White House issued a statement condemning the explicit images, and Congressman Joe Morelle of New York hopes to draw attention to legislative efforts that would classify deepfake pornography as a federal crime, punishable by fines and jail time.

More Than Just Taylor Swift: Many Famous People in Trouble

Taylor Swift is not the only one dealing with this problem. Other celebrities, including Billie Eilish, Ariana Grande, and Emma Watson, have also been targeted with fake images. Despite how serious the issue is, their representatives have not yet commented.

A Big Problem for Everyone: Even Regular People

Although celebrities have faced deepfakes for years, ordinary people are now being targeted too, a sign that these AI tools are becoming more capable and more accessible. Victims often find themselves with little recourse, struggling to have these explicit images taken down.

We Need to Do Something Now: Finding a Balance

As of January 30, users were still requesting more fake images of Taylor Swift, including extremely explicit ones. We need to understand that some people treat this as a game, and that mindset has to be confronted. The challenge is to keep benefiting from AI without hurting people, and making strong rules and laws is urgently important. It’s a big problem that needs fixing fast.