The latest high-profile victim of deepfake porn has spoken out after her likeness was used in a graphic video that has spread across the internet.
The popular video game streamer QTCinderella is accustomed to engaging with her 831,000 fans on the streaming platform Twitch. Last week, the 28-year-old discovered someone had used advanced software to modify an existing pornographic video to make her its “star.”
The video's creator or creators used artificial intelligence to turn her life into a nightmare. The clip's alarming accuracy rattled QTCinderella, whose real name is Blaire, to her core, according to the New York Post.
The video was posted to a website that hosts such content, and from there it spread.
It didn’t matter that Blaire never engaged in any sexual activity on video. She still found herself in front of her camera in tears explaining her harrowing experience to complete strangers online.
“F*** the f***ing internet,” she said as she wept. “F*** the people DM-ing me pictures of myself from that website. F*** you all! This is what it looks like, this is what the pain looks like.”
The young woman said she was “exhausted,” and she looked traumatized.
WARNING: The following video contains graphic language.
“This is what it looks like to feel violated,” she said through tears. “This is what it looks like to be taken advantage of. This is what it looks like to see yourself naked against your will being spread all over the internet. This is what it looks like.”
She also vowed to those behind the production of the video, "I promise you, with every part of my soul, I'm going to f***ing sue you."
Blaire also shared some of her thoughts on the matter on Twitter.
The amount of body dysmorphia I’ve experienced since seeing those photos has ruined me.
It’s not as simple as “just” being violated. It’s so much more than that.
— QTCinderella (@qtcinderella) January 31, 2023
Unfortunately, Blaire joins other women who have not only been victimized by deepfake porn but also have little to no legal recourse. Her video does not fall under the revenge porn laws dozens of states have in place.
Ironically, that is because Blaire never actually appeared in the video.
But if you want an in-depth look into just how accurate these videos are, look no further than this old clip of President Joe Biden saying things he would never say about a segment of the population he would never condemn.
In 2020, someone created a deepfake of Biden obliterating the argument that men can become women. To a casual observer, this video might seem 100 percent authentic.
This video is fake, and it looks like the text they plugged into the AI Biden voice generator originated in a 2020 4chan post.
It won’t be long before these deepfakes are indistinguishable from the real thing. Skynet has already won. pic.twitter.com/qALZxscyAf
— jimtreacher.substack.com (@jtLOL) February 4, 2023
Advances in technology have helped humanity in immeasurable ways. But on the other side of the coin, high-tech software can mislead, harm and traumatize people who are, by all appearances, entirely well-meaning.
As deepfake technology improves, the potential for a larger calamity grows.
Imagine if a video of Biden were altered by someone in a basement somewhere to spread a false declaration of nuclear war against China, Russia or one of the country's other geopolitical foes.
Thankfully, we’re not yet at a place where nerds with advanced computer skills can spark a global war — or at the very least an international incident.
But even without the hypothetical threat of a global conflict, the technology is already capable of making life a living hell for ordinary people.
A streamer whose charming personality and passion for sharing games built an online community has had her life brought to a standstill. Blaire has not posted to Twitch for almost a week and has given no indication of when she will return to the platform.
It would be easy to write this off as a cautionary tale for people who enjoy sharing their lives with the world.
But the simple truth is Blaire and others like her have done nothing wrong.
This technology is here to stay, and if the past is in any way indicative of the future, deepfake technology will only improve. Blaire is unlikely to be the last victim of such an invasion of privacy and dignity.