Scammers Are Now Using AI to Generate Fake Kidnapping Images, Then Using Them to Extort Family Members

It’s time to regulate this.

By Braden Bjella

Published 19 minutes ago in Wtf

Have you recently received an image of a loved one being kidnapped? Yes? Then what the heck are you doing reading this?! You have other things to take care of!


But if you haven’t (or if that situation has resolved itself), you should be aware of a new scam the FBI is warning about. Specifically, there’s been an increase in cases of scammers using AI to fabricate kidnapping photos, then using those photos to extort money from friends and relatives.


How it works is frustratingly simple. First, a scammer finds your image on Instagram or Facebook. Then, they use an AI program to generate a picture of you bound, gagged, or otherwise in danger. After that, they send the image to your family, either posing as you or as a kidnapper demanding ransom.


Of course, if this happens to you, you should immediately run the image through an AI detector and report it to the police, no matter what the person on the other end says. Then, you should write to your Congressman and tell them that we need to rein in this AI business. It’s getting ridiculous!
