I think this needs some discussion, as lawmakers are starting to grapple with the tech involved. I was brought up short by this. I had no idea this could be done with no recourse at all for the girls who were victims.
That could've been done without A.I. It's been around for a long time, you just have to have a bit of knowledge of an image editor.
Yes, but it takes some time and skill to learn to use Photoshop effectively. AI puts that skill and knowledge into the hands of a HS sophomore and presumably lets him produce believable results without making any real effort. I know it's possible to do that; what surprised me was finding out that the girls who were victimized have no recourse. They can't do anything about it.
To do as detailed and convincing a job without AI would have taken a lot more than merely "a bit of knowledge of an image editor"; it would have taken someone with a lot of experience several hours. With AI audio-visual editing tools, anyone can crank out professional-quality, hard-to-detect fakes in a matter of minutes. That problem has not been around a long time, and it means we are going to see a lot more fake images and videos.
It's not that fake images and videos are a new thing. It's that newly developed and still-developing AI tools are going to lead to an exponential increase in the number of fake images and videos convincing enough to pass as real. That will lead to an exponential increase in the number of people victimized and traumatized by them. It's also going to be used to gaslight people, in the original meaning of the word.
Imagine someone creating a video of you doing the most embarrassing thing you can think of. Imagine that part of the video comes from footage of something you actually did. Imagine it's from a time when you were intoxicated. Imagine the faked part of that video seeming so real that you have to reassure yourself you weren't so intoxicated that you forgot actually doing what the video shows.
The deviously dysfunctional people in the world are going to use AI audio-visual tools to take cyber trolling and bullying to a level of harm that has never been seen before, especially if there are no laws prohibiting them from doing it.