EP 29: Virtual Kidnapping: How AI Has Turned an Old Scam into a High-Tech Nightmare

Behind the Scams

July 28, 2025 | 00:45:02
Show Notes

Virtual kidnappings have evolved from crude phone scams to chilling high-tech nightmares, thanks to the rise of artificial intelligence. Imagine receiving a call that sounds exactly like your child, pleading for help—your heart races, and your rational mind fades away. In Episode 29 of "Behind the Scams," Nick and Sue delve into the terrifying world of AI-driven scams, sharing Amanda's harrowing story and revealing how scammers exploit our deepest fears. Learn how these sophisticated tactics work and discover essential strategies to protect yourself and your loved ones from becoming victims of this modern menace.

Episode Transcript

[00:00:00] Speaker A: From the dark corners of deception to the cutting edge of crime fighting, this is Behind the Scams. I'm Miles, your announcer, and today's episode is one you won't want to miss. Nick and Sue are pulling back the curtain on one of the most terrifying scams out there: a chilling con called virtual kidnapping. But this isn't the old-school version. This is AI-powered, voice-cloned, heart-stopping fraud. How real can fake really sound? Stay with us to find out.
[00:00:39] Speaker B: Thanks, Miles. So we've got something really important for our listeners today. At the same time, this is a scam that is absolutely terrifying, but super vital to know about. Welcome back to Behind the Scams, everyone. I'm Sue, and as always, I'm here with my partner in crime fighting, Nick.
[00:01:03] Speaker C: Hey there. Good to be back. And yeah, what Sue was referring to is something we all need to be talking about right now.
[00:01:10] Speaker B: Exactly. And it quite honestly gave me chills when you and I were researching it. We're talking about virtual kidnappings, but not just any virtual kidnappings. These are powered by AI now, which makes them, you know, about a thousand times more convincing and scary.
[00:01:29] Speaker C: Absolutely. And I've got to say, when you first mentioned this topic to me, I was kind of skeptical, like, how bad could it really be? But then you showed me some of those examples and... wow, right?
[00:01:44] Speaker B: It's pretty mind-blowing. And the thing is, these scams aren't actually new. Virtual kidnappings have been around for a while, but what's changed is the technology behind them. Artificial intelligence has completely transformed how convincing these scams can be. So for our listeners who might not be familiar with just what a virtual kidnapping is, we will briefly explain it before we dive in. But before we do, I just wanted to mention that, due to the importance of this topic, this episode required extensive research, and it's something you listeners should know more about. Now, Nick, you're our resident retired crime fighter and tech guru, so please go ahead and explain what a virtual kidnapping is and why it is important for us to know about this scam.
[00:02:41] Speaker C: So, Sue, a virtual kidnapping is basically when someone calls you claiming they've kidnapped your loved one, right? But in reality, they haven't actually taken anyone. The whole thing is a complete fabrication designed to terrify you into paying a ransom immediately, without checking if your family member is actually missing. And traditionally, these scammers would use, what, just generic crying sounds, or maybe some vague statements that could apply to anyone?
[00:03:13] Speaker B: Exactly. They'd use these really generic approaches and hope that your panic would fill in the blanks. But now, with AI voice cloning, they can actually make it sound exactly like your daughter, your son, or your spouse crying and begging for help on the other end of the line. It's absolutely terrifying.
[00:03:36] Speaker C: That's what really gets me. The technology has reached this point where, with just a small sample of someone's voice from social media or wherever, these criminals can create incredibly realistic voice clones.
[00:03:52] Speaker B: Yeah. And they're combining this with all kinds of personal information they scrape from your social profiles to make the whole scenario super believable. Like they know your kid's name, where they work, what car you drive. All these little details that make you think this has to be real.
[00:04:13] Speaker C: I imagine the psychological impact of hearing what sounds exactly like your child in distress must... I mean, it must completely short-circuit your rational thinking, right?
[00:04:24] Speaker B: Totally. Your brain just goes into this immediate fight-or-flight response. You're not thinking, is this really my daughter? You're thinking, my daughter is in danger and I need to save her right now. That's exactly what these scammers are counting on.
[00:04:39] Speaker C: And I understand we've got some pretty disturbing examples to share with our listeners today.
[00:04:44] Speaker B: We do, and trust me, they're eye-opening. We're going to start with Amanda's story, which illustrates exactly how these scams play out and why they're so effective. Her story is going to sound familiar to a lot of parents out there, which is what makes it so scary.
[00:05:06] Speaker C: I think what's most disturbing is how this technology is becoming more accessible. It's not just sophisticated criminal organizations anymore, is it?
[00:05:16] Speaker B: No, not at all. The barrier to entry for these scams is getting lower and lower. Almost anyone with Internet access can get their hands on AI-based voice cloning tools now. And that's why we wanted to do this episode, because awareness is really the best defense against these scams.
[00:05:38] Speaker C: So let's dive into Amanda's story, which really shows how these scams unfold in real life. Amanda was just having a completely normal day, right?
[00:05:49] Speaker B: Yeah, totally normal. She was just going about her morning routine, getting ready for work, making breakfast, and seeing her teenage daughter off to her summer job. You know, just a regular Tuesday morning like any other parent might have.
[00:06:06] Speaker C: And then everything changed in an instant.
[00:06:08] Speaker B: Exactly. So what happened was, about five minutes after her daughter left for work, Amanda's phone rings. And it's showing a local number, which is important because that makes her more likely to pick up. Right. So she answers, and immediately she hears what sounds exactly like her daughter crying.
[00:06:28] Speaker C: Oh, man, I can't even imagine how terrifying that would be as a parent. Hearing your child in distress just hits you at this primal level.
[00:06:41] Speaker B: Absolutely terrifying. And before Amanda can even process what's happening, this man's voice comes on the line and says something like, "We have your daughter. If you ever want to see her again, you need to follow my instructions exactly." And Amanda's just frozen, you know, because she can hear what sounds 100% like her daughter sobbing in the background.
[00:07:07] Speaker C: What did Amanda do in that moment? I mean, most people would completely panic, right?
[00:07:12] Speaker B: Well, that's exactly what happened. She went into complete panic mode, her heart racing, hands shaking. The scammer was counting on that reaction, because when you're in that state, you can't think clearly. He told her not to hang up the phone, not to call anyone else, and to immediately start driving to her bank.
[00:07:33] Speaker C: And the whole time she's hearing what she believes is her daughter in the background.
[00:07:37] Speaker B: Yes. And this is where the AI component becomes so incredibly scary. In the past, these scammers would just have someone crying generically. But with AI voice cloning, what Amanda heard sounded exactly like her daughter's voice, saying things like, "Mom, please help me. They're going to hurt me." It was specific enough that there was no doubt in her mind it was her child.
[00:08:04] Speaker C: That's absolutely chilling. The technology has reached a point where it can recreate someone's voice with just a small sample.
[00:08:12] Speaker B: Right, exactly. And think about it: how many videos do our kids post on Instagram or TikTok? You know, the scammer probably just needed a 30-second clip from her daughter's social media to create a voice that was convincing enough to fool her own mother. It's terrifying how good voice cloning has gotten.
[00:08:36] Speaker C: So what happened next? Did Amanda end up paying?
[00:08:39] Speaker B: Well, she was absolutely ready to. The scammer kept her on the phone while she drove to the bank, giving her specific instructions about how much money to withdraw. I think it was like $8,000 in this case. But here's where Amanda got lucky. When she got to the bank, the teller recognized the signs of a scam.
[00:09:03] Speaker C: Oh, thank goodness. So the bank teller had seen this before?
[00:09:06] Speaker B: Yeah. The teller noticed how distressed Amanda was, how she kept looking at her phone, and just her general panic. And they've been trained to spot these situations. The teller asked to speak to Amanda privately and gently suggested they try calling her daughter on another phone.
[00:09:26] Speaker C: And I'm guessing her daughter was completely fine.
[00:09:29] Speaker B: Yes. Her daughter answered right away, totally confused about why her mom was calling her in such a panic. She was fine, just at work like normal. But here's the thing that's so important to understand: Amanda was ready to empty her savings account, because the panic and fear completely overwhelmed her rational thinking.
[00:09:53] Speaker C: That's such a critical point. These scammers are exploiting our most basic protective instincts, especially as parents. They're counting on that immediate emotional response.
[00:10:03] Speaker B: Absolutely. And they use these really specific psychological tactics to maintain control. Like, they'll tell you that they're watching you or that they have accomplices following you. They'll threaten to hurt your loved one if you try to contact anyone else or if you hang up the phone.
[00:10:23] Speaker C: So they're basically isolating you at the moment when you most need clear thinking or outside help.
[00:10:29] Speaker B: Exactly. They create this bubble of terror where you feel like your only option is to comply, and they use urgency as a weapon. "You have 30 minutes to get the money, or we'll hurt her." That time pressure prevents victims from stopping to think or verify.
[00:10:50] Speaker C: And I imagine they also say things to make it sound more credible, like mentioning specific details about the supposed victim.
[00:10:57] Speaker B: Oh, yeah, absolutely. This is where social media really becomes their accomplice. They'll say things like, "your daughter in the blue shirt," or reference something specific they've learned from stalking social profiles. Those little details make the whole scenario feel real and immediate.
[00:11:16] Speaker C: What I find particularly disturbing is how they can use AI to make these scams work even on people who might be somewhat aware of the concept of virtual kidnappings.
[00:11:27] Speaker B: That's another important point. Even if you've heard about these scams before, hearing what sounds exactly like your child's voice begging for help just bypasses all your rational defenses. Your brain doesn't go to, this might be a scam. It goes straight to, my child is in danger.
[00:11:48] Speaker C: The emotional manipulation is just so powerful. And from what I understand, these scammers are getting better at keeping people on the hook, right?
[00:11:58] Speaker B: They're getting incredibly sophisticated. They study how real kidnappers operate, they rehearse their scripts, and they've learned exactly what buttons to push to keep victims compliant. It's truly frightening how effective they've become at exploiting our basic human instincts.
[00:12:19] Speaker C: So let's talk about what these virtual kidnappings actually are. I mean, we've heard Amanda's story, but what exactly is happening in these situations?
[00:12:30] Speaker B: So virtual kidnappings are basically elaborate scams where criminals convince you that they've kidnapped someone you love when they actually haven't. The "virtual" part means that no actual kidnapping has taken place. Your loved one is totally fine, going about their day, but the scammer creates this complete illusion that they're in danger.
[00:12:56] Speaker C: And how exactly is this different from a traditional kidnapping threat?
[00:13:00] Speaker B: The key difference is that in a virtual kidnapping, the criminals don't have physical custody of anyone. They're creating this illusion through really sophisticated psychological manipulation. And with AI now, they can make it sound so real that even the most skeptical person might believe it, especially when it's the voice of someone you know intimately.
[00:13:29] Speaker C: I've heard these scams have been around for a while, but now they seem to be getting more and more sophisticated.
[00:13:34] Speaker B: Oh, absolutely. These scams have actually been around for years. They started becoming really common in Mexico and other parts of Latin America maybe 15 years ago, and then they spread to the U.S. But they used to be much more crude, you know, like just random crying in the background.
[00:13:59] Speaker C: So what makes them so effective at causing panic? Is it just the subject matter?
[00:14:03] Speaker B: Well, that's part of it. I mean, there's nothing that triggers our fight-or-flight response faster than believing our child or loved one is in danger. But these scammers are also, like, really skilled at psychological manipulation. They use what psychologists call amygdala hijacking, essentially overwhelming your brain's threat response system.
[00:14:29] Speaker C: Amygdala hijacking. That sounds serious.
[00:14:31] Speaker B: It is. See, when your amygdala, that's the part of your brain that processes fear, when it gets triggered, it literally bypasses your rational thinking centers. So all those critical thinking skills just go out the window. And these scammers know exactly how to trigger that response.
[00:14:55] Speaker C: What are some of the specific tactics they use?
[00:14:58] Speaker B: So first, they create immediate shock with the sound of your loved one crying or screaming. Then they jump in with threats and time pressure: "We have your daughter. You have one hour to get the money or we'll hurt her." They keep you on the phone constantly so you can't think or verify anything. And they isolate you by threatening worse consequences if you tell anyone.
[00:15:21] Speaker C: That sounds like the same tactics that cult leaders and abusers use.
[00:15:25] Speaker B: Exactly. It's the same psychological playbook. They create this bubble where you're cut off from outside perspectives. They'll even say things like, "We're watching you right now. If you hang up or call the police, we'll know and we'll kill her immediately." And when you're terrified, that seems completely plausible.
[00:15:45] Speaker C: So they're basically exploiting our protective instincts.
[00:15:48] Speaker B: Yes, 100%. These scammers are counting on the fact that when it comes to our children or loved ones, most of us will act first and question later. Like, if there's even a 1% chance the threat is real, what parent wouldn't pay? And they use that protective instinct against us.
[00:16:09] Speaker C: How do they prevent victims from verifying if their loved ones are actually in danger?
[00:16:14] Speaker B: That's such a crucial part of their strategy. First, they keep you on the phone, right? They tell you that if you hang up, they'll hurt your loved one. They might claim they're watching you, so you feel like you can't safely try to contact the person they claim to have kidnapped.
[00:16:32] Speaker C: And I'm guessing time pressure plays a big role too.
[00:16:36] Speaker B: Absolutely. They'll say things like, "You have 30 minutes to get the money," or "Start driving to Western Union now." This creates such urgency that you don't have time to think clearly or check facts. And you know what's particularly insidious? They'll often specifically target people during times when their loved ones might legitimately be unreachable. Like, they'll call parents when they know the kids are in school with phones turned off. Or they'll target families of people who are traveling, especially internationally. They look for situations where it's plausible that you can't immediately verify your loved one's safety. And they'll specifically ask questions like, "Where is your daughter right now?" to get you to reveal times when verification would be difficult.
[00:17:28] Speaker C: That's incredibly calculated.
[00:17:30] Speaker B: It really is. And another tactic they use is what experts call forced confirmation. They'll ask vague questions like, "Is your daughter the one with long hair?" And when you say yes, they use that information to make the scam more convincing. They build this illusion of knowledge that makes you think they really do have your loved one.
[00:17:54] Speaker C: And now with AI in the mix, I'm guessing these scams have gotten even more convincing.
[00:17:59] Speaker B: Oh, my God, yes. The AI component has made these scams just terrifyingly effective. Before, they might have someone crying generically in the background. But now, with just a small sample of your loved one's voice from social media, they can generate realistic cries for help in their actual voice. It's like your worst nightmare come to life, right on the phone.
[00:18:25] Speaker C: That's truly terrifying. So they're essentially creating a perfect psychological storm.
[00:18:31] Speaker B: Exactly. They combine shock, fear, time pressure, isolation, and now voice technology to create this perfect storm that bypasses our rational thinking. Even people who consider themselves too smart to fall for scams can be vulnerable when they think their child's life is at stake. That's what makes these virtual kidnappings so dangerous. They target our most basic protective instincts.
[00:19:02] Speaker C: So now we're getting to what makes these scams truly next-level terrifying. Artificial intelligence has really transformed this whole space, hasn't it?
[00:19:12] Speaker B: Oh, absolutely, Nick. The AI revolution in scamming has completely changed the game. What we're seeing now with these virtual kidnappings is like light years beyond what scammers could do even just a few years ago. The technology has advanced so rapidly that, honestly, it's kind of terrifying.
[00:19:36] Speaker C: I've heard some things about voice cloning technology, but how exactly does that work in these scams?
[00:19:42] Speaker B: So voice cloning is exactly what it sounds like. It's AI technology that can literally create a digital copy of someone's voice. And what's really scary is how little material these systems need now. In the early days of this technology, you might have needed, like, hours of someone talking. And now, with the latest AI models, scammers can create a convincing voice clone with just 30 seconds of audio. That's it. Just 30 seconds of someone's voice. And these systems can generate that person saying literally anything the scammer wants. And the quality is just getting better and better every month.
[00:20:32] Speaker C: Where do they even get these voice samples from?
[00:20:34] Speaker B: That's the really scary part, Nick. They're absolutely everywhere. Think about it: how many videos have you posted on social media where you're talking? Maybe you've left a voicemail for someone that could be compromised in a data breach. Or perhaps you've done a video call with someone who recorded it without your knowledge.
[00:20:54] Speaker C: Oh, I hadn't thought about that. What about all those voice messages people send on WhatsApp and other platforms?
[00:21:02] Speaker B: Exactly. Voice messages, TikTok videos, Instagram stories, YouTube videos. We're constantly putting our voice out there without thinking about it. And for public figures or people with any kind of online presence, it's basically impossible to keep your voice private anymore.
[00:21:24] Speaker C: So these scammers can just take those samples and create fake audio of someone crying or begging for help?
[00:21:31] Speaker B: Precisely. They can generate audio of your daughter crying and saying specific things like, "Mom, please help me, they're going to hurt me." And because it sounds exactly like your daughter's voice, with her unique vocal patterns, her exact way of saying "mom," it triggers this primal panic response that's absolutely chilling.
[00:21:54] Speaker C: And I'm guessing the technology to do this isn't even that hard to get.
[00:21:58] Speaker B: No, not at all. And that's what's really changed the landscape. See, a few years ago, this kind of technology was only available to, like, high-level researchers or big tech companies with serious resources. But now there are open-source AI voice cloning tools that anyone can download. There are even commercial services that advertise voice cloning for supposedly legitimate purposes, like audiobook narration or personalized content.
[00:22:30] Speaker C: So basically, anyone with basic technical skills can do this now?
[00:22:33] Speaker B: Yes. And it gets worse. There are now criminal services on the dark web that specifically cater to scammers. They offer what they call voice cloning as a service, where you just upload your audio sample and they give you back a fully functional voice clone. No technical skills needed at all. You just pay them in cryptocurrency and you get a tool for scamming.
[00:23:01] Speaker C: And I'm guessing the quality of these fake voices is pretty good?
[00:23:04] Speaker B: The quality is, honestly, incredible. And it's improving at a shocking rate. The latest models can even replicate emotional states, so they can make the voice sound scared or crying or under duress. They can add background noise to make it sound like the person is in a car trunk or a warehouse. And they can mimic specific speech patterns. Like, if your daughter has a certain way of saying things or uses specific phrases, they can capture that too.
[00:23:39] Speaker C: So what about those little quirks that make someone's voice recognizable? Can AI really capture those too?
[00:23:46] Speaker B: That's what's so disturbing, Nick. These systems are now sophisticated enough to pick up on those subtle voice characteristics. The tiny hesitations, the specific way someone pronounces certain words, even their breathing patterns between sentences. And for a parent who's heard their child's voice every day for years, those subtle cues are exactly what makes the voice instantly recognizable to them.
[00:24:17] Speaker C: It sounds like we're reaching a point where we can't trust our own ears anymore.
[00:24:23] Speaker B: That's exactly right. We've always said hearing is believing, right? Like, if you hear something with your own ears, you trust it. But we're entering this era of what experts are calling the end of audio truth, where we simply cannot trust that what we're hearing is real. And when that's combined with the emotional manipulation tactics we talked about earlier, it's just a perfect storm for scammers.
[00:24:51] Speaker C: What's really scary to me is how fast this has all happened. Like, this wasn't even possible a few years ago, right?
[00:24:58] Speaker B: No, it wasn't. The acceleration has been just mind-blowing. In 2017, creating a halfway decent voice clone might have required 30 minutes to an hour of high-quality audio and significant computing resources. By 2020, that was down to maybe five minutes of audio. And now we're talking about 30 seconds or less, potentially scraped from a single social media video. And the processing can happen on a decent laptop.
[00:25:28] Speaker C: So how are these scammers combining this voice technology with their psychological tactics?
[00:25:33] Speaker B: Well, they're creating these multi-layered attacks that are just devastatingly effective. So imagine: first, you get a call and hear what is undeniably your daughter's voice crying and begging for help. That immediately triggers that amygdala response we talked about. Pure panic. Then a threatening male voice comes on and starts giving you instructions.
[00:25:57] Speaker C: And because you just heard your daughter's voice, you're already convinced the threat is real.
[00:26:03] Speaker B: Exactly. The AI-generated voice creates instant credibility for the threat. And then they layer on all those psychological pressure tactics: isolation, time pressure, threats of violence. They'll even use information gleaned from social media to make the scenario more convincing, like mentioning places your daughter frequently visits or names of her friends.
[00:26:30] Speaker C: What about caller ID spoofing? Is that part of this too?
[00:26:33] Speaker B: Oh, absolutely. These scammers are using every technical tool at their disposal. They'll spoof the caller ID to make it look like the call is coming from your daughter's actual phone number, or from a local police department, or from an unknown number in an area where your loved one might be. That's another layer of credibility that makes the scam even more convincing.
[00:26:59] Speaker C: So they're basically creating this perfect storm of technical deception and psychological manipulation.
[00:27:06] Speaker B: That's exactly it, Nick. And what makes it so effective is that even if there's a part of your brain that's questioning it, like maybe something seems slightly off about the voice, or the scenario doesn't quite make sense, the fear response is so overwhelming that most people err on the side of caution. Because, I mean, what parent wouldn't? The stakes just feel too high to take any chances.
[00:27:34] Speaker C: Are there any ways to detect these fake voices? Like, do they have any telltale signs?
[00:27:39] Speaker B: There are some subtle indicators, but they're getting harder and harder to spot, especially in a high-stress situation. Sometimes the AI-generated voices have slight robotic qualities or unnatural pauses. They might struggle with certain sounds or emotional transitions. But honestly, the technology is advancing so quickly that these imperfections are disappearing.
[00:28:04] Speaker C: That's not very reassuring to me.
[00:28:06] Speaker B: No, it's not. And we've actually seen cases where scammers combine AI voices with real humans for flexibility. So they might use the AI-generated voice of your loved one crying for help, but then a real human scammer takes over for the ransom demands. This gives them the ability to respond naturally to questions or unexpected situations, making the whole scam even more convincing.
[00:28:33] Speaker C: So this really is like a technological arms race between the scammers and the rest of us?
[00:28:38] Speaker B: It absolutely is. And right now, the scammers have some pretty powerful weapons. But that doesn't mean we're powerless. There are definitely ways to protect ourselves, which I'm sure we'll get into later in the show.
[00:28:52] Speaker C: There is a cybersecurity expert named Vijay Balasubrahmanian who specializes in voice technology and security. His work is becoming more and more important every day, since more and more virtual kidnapping scams use AI to clone voices, and it's central to combating the scammers behind them. And, Vijay, if you're listening to this podcast, I hope I pronounced your last name somewhere close to how it is supposed to be pronounced. If not, I apologize.
[00:29:30] Speaker B: That's right. And according to Vijay, at the most basic level, voice cloning uses what is called deep learning. Neural networks are trained to understand and then reproduce the unique characteristics of a person's voice. These AI systems analyze things like pitch, tone, cadence, accent, and even the way someone pauses between words.
[00:29:56] Speaker C: So, Sue, it sounds like the AI is similar to learning someone's vocal fingerprint, right? Their unique way of speaking.
[00:30:03] Speaker B: That's exactly right, Nick. Every person has what we might call a voice print. It's, you know, completely unique to them, just like your fingerprint. And modern AI systems have gotten remarkably good at capturing those unique features and then reproducing them in new sentences that the original person never actually said.
[00:30:28] Speaker C: And how has this technology evolved, Sue? Because I'm guessing this wasn't always so accessible.
[00:30:34] Speaker B: Oh, not at all, Nick. The evolution has been... well, frankly, it's been staggering. Five years ago, this kind of technology was primarily in research labs. You needed specialized knowledge, expensive computing resources, and lots and lots of training data. We're talking hours of someone's speech.
[00:30:57] Speaker C: Fortunately or unfortunately, now it's just everywhere, right? Like, I've seen apps where you can just upload a short clip and get a voice clone back in minutes.
[00:31:06] Speaker B: Well, Nick, we've seen this massive democratization of the technology. What used to require a PhD in machine learning can now be done with a simple app or web service. And the quality has improved exponentially, while the amount of sample audio needed has dropped dramatically. So scammers have adapted.
[00:31:29] Speaker C: Now, Sue, I have some experience with technology and artificial intelligence, but what kind of sample length are we talking about now? Like, how much of someone's voice do these scammers need?
[00:31:43] Speaker B: Great question, Nick. With the most advanced systems we're seeing today, they can create a reasonably convincing voice clone with as little as three to five seconds of clear audio. Now, it won't be perfect with such a small sample. But in a high-stress situation, like a supposed kidnapping, when you're panicking about your loved one, those minor imperfections are likely to go completely unnoticed.
[00:32:10] Speaker C: That's absolutely terrifying. And the systems that need just a few seconds, are those, like, widely available, or are we talking about cutting-edge research tools?
[00:32:22] Speaker B: Unfortunately, they're becoming increasingly available. There are commercial services that actually advertise instant voice cloning capabilities. And while many have legitimate use cases, like helping people who've lost their voice due to medical conditions, the same technology can be misused. And we're also seeing specialized criminal offerings on underground forums that are specifically marketed for fraud.
[00:32:51] Speaker C: So what makes these AI voices so convincing, Sue? Is it just that they sound like the person, or is there more to it?
[00:32:59] Speaker B: There's definitely more to it. The latest models don't just capture the basic sound of someone's voice. They capture all those little idiosyncrasies that make a voice recognizable. The way someone might slightly emphasize certain syllables, their unique rhythm of speech, even those little vocal quirks, like, you know, a slight rasp, or the way they might trail off at the end of sentences.
[00:33:26] Speaker C: I've noticed that a lot of these scams involve voices that sound distressed or crying. How do they manage that part?
[00:33:32] Speaker B: That's where things get even more sophisticated. Modern voice synthesis systems can now apply what we call emotional modeling to generated speech. So they can make a voice sound like it's under duress: crying, whispering, or frightened. And this is critical for these scams, because that emotional content bypasses our rational thinking and triggers an immediate emotional response.
[00:34:01] Speaker C: I'm guessing the technical quality has improved too, right? Like, do these voices still have that robotic quality that older text-to-speech systems had?
[00:34:11] Speaker B: The robotic quality is largely gone in high-end systems. The latest models produce voices with natural-sounding breathing patterns, appropriate pauses, and even those little ums and uhs that characterize human speech. Some can even replicate the sound of someone speaking in different environments, like over a phone line, in a car, or in what might sound like a kidnapper's hideout, with specific background noises.
[00:34:43] Speaker C: That's just incredible and terrifying at the same time.
[00:34:45] Speaker B: According to Vijay, the cybersecurity expert, it comes down to three key factors that create a perfect storm for the scammers' use of AI voice cloning.
[00:34:56] Speaker C: Really? Sue, tell our listeners what Vijay identified as the three key factors.
[00:35:01] Speaker B: Well, Nick, according to Vijay, the first is what he calls an emotional trigger. Hearing what sounds exactly like your loved one in distress creates an immediate fight-or-flight response, where critical thinking goes out the window. Second, the time pressure these scammers create doesn't give victims a chance to verify anything. And third, these voice scams exploit one of our most fundamental human trusts: that we can recognize the voices of our loved ones.
[00:35:34] Speaker C: I've heard about some cases where scammers combine AI voice with other technologies. Is that something you're seeing too?
[00:35:41] Speaker B: Absolutely. We're seeing increasingly sophisticated multichannel attacks. Scammers might use AI to clone a voice, then combine that with spoofed caller ID, fake text messages that appear to come from the victim's phone, and even deepfake video in some cases. They're creating what security experts call blended threats that use multiple technologies to create a convincing deception.
[00:36:08] Speaker C: That's, like, so next level. Do you think there's, you know, organized criminal groups behind this, or is it more like individual scammers?
[00:36:17] Speaker B: Great question, Nick. In all the scam data I analyze, I have definitely been seeing both. There are sophisticated criminal organizations that operate almost like businesses, with different people specializing in different aspects of the scam. Some focus on gathering data and voice samples, others on the technical voice generation, and others on making the actual scam calls. But we're also seeing individual scammers buying these capabilities as services on criminal forums, essentially scam-as-a-service offerings.
[00:36:57] Speaker C: What about the international aspect of this? Are these scams coming from specific regions or countries?
[00:37:03] Speaker B: While these scams can originate from anywhere, data does show clusters of activity in certain regions. Traditionally, many phone scams originated from call center operations in countries where there's a combination of technical skills, English language proficiency, and limited law enforcement capacity to tackle cybercrime. But with the globalization of these technologies and techniques, we're really seeing this become a worldwide phenomenon.
[00:37:35] Speaker C: I'm curious about how the underlying AI technology itself is evolving. Like, did you read anything about Vijay's opinion on what he is seeing on the horizon that might make this problem even worse?
[00:37:48] Speaker B: That's a great question, Nick. According to Vijay, what concerns him the most is the convergence of different AI technologies. He is already seeing systems that can generate convincing voice in real time, responding dynamically to the conversation. The next evolution, according to Vijay, is likely to be fully interactive AI systems that can not only sound like your loved one, but can actually hold a conversation while sounding distressed, answering questions, providing details, and maintaining the deception throughout a call.
[00:38:29] Speaker C: That sounds like something out of a sci-fi movie, but I guess it's really happening.
[00:38:35] Speaker B: Frighteningly, Vijay says it absolutely is. The technology to do this exists today, though it's not yet widely accessible. But based on how rapidly these systems are evolving, and how quickly they move from research to commercial applications to criminal use, I think we're looking at maybe a 12-to-18-month time frame before we see these more sophisticated interactive voice scams becoming common.
[00:39:04] Speaker C: So, Sue, what should people actually do if they get one of these calls? Like, in the moment, when they're panicking and hearing what sounds like their child in distress?
[00:39:15] Speaker B: This is where preparation is key, Nick. The most important thing is to have a verification system in place with your family members before something like this happens. This could be a code word or a specific question that only your real family member would know how to answer correctly.
[00:39:35] Speaker C: Like a security question, right? Something personal that wouldn't be on social media or easily guessable.
[00:39:41] Speaker B: Exactly. And if you receive a call like this, try to stay calm enough to ask that verification question. The other critical step is to try to reach your loved one through other means. Call their cell phone, contact their friends, or reach out to places where they're supposed to be, all while keeping the scammer on the line, if possible.
[00:40:06] Speaker C: That's really good advice. It must be so hard to think clearly in that moment, though.
[00:40:08] Speaker B: It is incredibly difficult, which is why having these plans in place beforehand is so important. Another thing to remember is that these scammers rely on keeping you in a state of panic. If you can create even a small moment to take a breath and think critically, you might spot inconsistencies or red flags in what they're saying.
[00:40:33] Speaker C: You know, speaking of spotting those inconsistencies, are there any technical telltale signs that a voice might be AI-generated? Like, things that the technology still struggles with?
[00:40:47] Speaker B: There are some subtle indicators, though they're getting harder to spot. Current AI systems sometimes struggle with very long sentences. The voice quality might degrade or sound slightly different as the sentence progresses. They can also have trouble with certain sounds, particularly laughter, coughing, or very emotional crying. And sometimes the emotional tone doesn't perfectly match the content of what's being said.
[00:41:18] Speaker C: I imagine background noise might be another challenge for these systems.
[00:41:22] Speaker B: That's a good point, Nick. While the latest systems can simulate background noise, they don't always get the acoustics quite right. If the background sounds too clean when someone is supposedly being held in a noisy environment, or if the background noise doesn't change naturally when the person is supposedly moving, that could be a red flag.
[00:41:45] Speaker C: This is all just... it's really eye-opening, Sue. For our listeners who might be worried about their own voice being used in scams like this, do you have any advice on how they can protect their voice data?
[00:41:59] Speaker B: This is challenging in today's connected world, Nick, but there are some precautions our listeners can take. Be mindful of where your voice is publicly available. Consider making videos on social media private if possible. Be wary of voice-based surveys or games that might be collecting voice samples. And consider using voice authentication carefully. While it can be secure when implemented properly, not all systems protect your voice data adequately.
[00:42:34] Speaker C: Sue, this is a good point to end this very important episode on virtual kidnapping. We hope our listeners all learned something and recognize the dangers of artificial intelligence and its use in virtual kidnappings. On another note, I just wanted to say what a fantastic job you did hosting this episode, Sue.
[00:42:55] Speaker B: Thank you. I appreciate the compliment. But you know, it's such an important scam trend to highlight that you just can't help being passionate when telling others. Speaking of telling others, Miles, would you please close out this episode of Behind the Scams? Bye for now.
[00:43:16] Speaker A: You've been listening to Behind the Scams, where Nick and Sue expose the tactics behind today's most dangerous cons. In this episode, we unpacked the terrifying rise of virtual kidnappings: calls that make you believe a loved one's been taken when it's all an elaborate scam. We explained how AI voice cloning makes these hoaxes sound real, how scammers use panic and isolation to keep you from thinking clearly, and the simple steps you can take now, like family code words and slowing down, to protect yourself. If this episode opened your eyes, share it with someone you care about. Subscribe to Behind the Scams so you don't miss future episodes, and visit stampoutscams.org for more tools and resources. Stamp Out Scams leverages a nationwide law enforcement network and cutting-edge technology to combat scammers. On their site and on YouTube, you'll find Scam TV, their official video channel dedicated to exposing scams, educating the public, and sharing real-life stories from victims. They created Scam TV because they believe prevention starts with connection: real stories, real people, and real knowledge. Whether you're watching to protect yourself, a loved one, or your entire community, Scam TV is here to help, because scam victims unfortunately come from all walks of life and all generations. Stay tuned for our next episode of Behind the Scams. Until then, stay safe and stay scam-free. Bye for now.

Other Episodes

June 24, 2025 00:59:54

EP 23: Crypto Scam Kingpins: The $245M Scam That Shocked America!

Crypto Scam: Episode Overview In one of our most jaw-dropping episodes to date, Behind the Scams pulls back the curtain on a newly unsealed...

February 24, 2025 00:43:03

EP 10: Pig Butchering Scams EXPOSED: How Scammers Manipulate & Steal Millions! 

How Organized Criminals Are Turning Trust Into a $100 Million Cryptocurrency Scam Industry What Is a Pig Butchering Scam? In this shocking episode of ...

February 13, 2025 00:41:40

EP 9: The 80-Year-Old Money Mule: The Curious Romance Scam Case of 80-Year-Old Glenda

The Glenda Seim Case: When Romance Turns Criminal A Love Story Fueled by Lies In this episode of Behind the Scams, Nick and Sue...
