The Rise of AI-Generated Porn
Imagine for a second that you have lived a life free from any controversy. Then suddenly, you are alerted that an image of you performing sex acts is being shared all over the internet; the only thing is, you’ve never done such an act. You’ve never done anything close to porn, yet there’s an image spreading rapidly online with your exact likeness. You want to stop it. You try to shut it down, but it’s too late. The damage has been done. This is the reality of deepfake AI pornography.
As I was thinking about how to cover this topic, it struck me that whenever new technology emerges, we take something that could be used for incredible good and use it for evil, like making artificial versions of people having sex online. It’s just incredibly weird.
But welcome to the future. We have access to the sum total of all information known to mankind, and we use it to create AI-generated porn. We are truly living in pathetic times. This technology isn’t new; deepfakes have been around for a while now. I don’t know how many of you are familiar with the deepfake clip of Obama, but it’s so convincing that if Jordan Peele hadn’t done his reveal, I would have believed it was actually Obama.
After seeing this, I knew we had a massive problem, and when you tie that problem to the porn industry, it is the perfect storm of depravity. What makes this technology so nefarious is that you can be featured in pornographic content without your consent, and the results can be devastating.
This brings us to the incident involving well-known Twitch streamer Atrioc. While Atrioc was doing a livestream, some users noticed something peculiar in his browser tabs: he had a tab open to an AI porn site that featured some of his female colleagues in the Twitch community.
After this incident blew up, Atrioc issued an apology in which he is seen crying with his wife by his side. In it, he describes the moments leading up to clicking the deepfake site, including visiting a “regular” site before clicking the link that led him to deepfakes of his colleagues. That “regular” site he describes is PornHub.
This “regular” website also happens to feature strangers having sex on the internet. No big deal. The fact that he sees PornHub as just a “regular” site makes his willingness to click the link, pull out his wallet, and pay for deepfake AI porn not remotely surprising.
Now, I do believe he is sincerely sorry, but that doesn’t change the damage that’s been done. He was watching videos of his colleagues’ likenesses superimposed onto AI-generated pornographic content. Another Twitch streamer, QTCinderella, was devastated to learn her image was featured on this website. Here’s a video showing her reaction.
In the video, the turmoil these AI-generated images caused this woman is very clear. You can visibly see her pain, and it’s heartbreaking. She rightly identifies those who look at pornographic content featuring women who did not consent as being the problem. She is right in calling out men who see women as objects. She is right when she expresses anger at having to take legal action to get this content removed.
Now, where I think she’s wrong is that she implied it would have been alright to create AI porn using someone’s image if they were benefitting from it; I don’t believe that’s true. My stance is that pornography in any form is wrong. It’s an exploitative industry whose culmination of depravity has opened Pandora’s box, and now it can’t be closed. The fact that it’s AI pornography only compounds the problem, because now anyone’s image can be used to create this type of content: your wife, your family, children, friends, and colleagues. No one is safe from this kind of exploitation unless you have no internet presence, which is nearly impossible in modern times.
As this tech develops, it’s only going to improve. What is easily detectable now will soon be nearly impossible to detect. No one will be able to tell what is real and what is fantasy. Here’s an example of a photo from an article by the New York Post:
It’s not perfect, but the imperfections are subtle, and if you weren’t looking for them, you wouldn’t notice. This is what we are seeing within the first year of this technology being widely available to the public. In its current form, AI is already creating problems. As it evolves, the problems will be exacerbated. Anyone whose images are available for public consumption will be vulnerable to a level of digital manipulation that seemed impossible several years ago.
It’s not just porn that can be created of you without your consent. It’s literally anything, and that should be cause for concern. I’m someone who has spent the better part of three years making video content. It wouldn’t be difficult to make deepfakes of me saying things I would never say, completely destroying my reputation.
Just think of the implications of technology like this when it comes to politics. An opposing candidate could create a deepfake video of a rival saying or doing things they would never do, and it could go viral instantly. Even if the video is ultimately proven to be false, many people will have already made up their minds, and the rival’s reputation will be forever tainted.
This kind of technology makes us question the nature of reality itself and is going to create an environment where it will be nearly impossible to discern truth from fiction. There are people developing technology to combat deepfakes, and legal battles will be fought to pass legislation, but these are not bulletproof answers and are a long way off. I don’t know of a single way to stop this, but I do think we can help by cutting out pornography completely and holding websites that feature this content accountable.
As for you and me, we are going to have to be vigilant in how we view content. We are going to need to be slow in drawing conclusions. Obviously, there will be people who blame deepfakes for things that actually did happen, but it’s an all-around good practice to be slow to react to what we see on social media until more information becomes available. People’s entire lives can be destroyed in an instant if we prematurely pull the trigger.
Pandora’s box has been opened, and deepfake AI is a reality. We live in a world where your image can be used in nefarious ways we never thought possible. The days of believing what you see just because there’s video footage are gone.