Gobbles

A stranger can clone your grandchild's voice from a 10-second Instagram video — and call you with it asking for bail money within the hour.


The Phone Call That Can Empty Your Account in 10 Minutes: Your Grandchild's Voice, a Stranger's Demand

Your phone rings. It's your grandson's voice, shaking, saying he's been in a car accident and needs you to wire money right now — please don't call Mom, she'll panic. The voice is perfect. The fear in it is real. And not one syllable of it came from your grandson.

AI voice-cloning scams — sometimes called "grandparent scams" — work by feeding a few seconds of someone's audio into software that can cost as little as $5 a month. A short video on Facebook or TikTok is enough raw material. The result is a voice clone that can say anything the scammer types. The FTC and FCC have both issued alerts on this pattern, warning that scammers pair the fake voice with a fabricated crisis — an arrest, a crash, a medical emergency — and demand immediate payment through wire transfer, gift cards, or Zelle before you have time to think. In one of the first documented cases of executive voice fraud, in 2019, the CEO of a British energy company wired $243,000 after scammers used a cloned voice of his boss at the firm's German parent company. The tools to do that now fit inside a cheap monthly subscription.

The money lost is real, but so is something harder to name: the feeling of having heard someone you love begging for help, and realizing it never happened.

Gobbles Gobble's Take: If a voice on the phone is urgent and wants money, hang up immediately and call that person back on a number already saved in your phone.

Source: The Firing Line


The Free Fix That Beats a $5-a-Month AI Scam: Set a Family Safe Word Tonight

Hearing a familiar voice used to be enough. It isn't anymore. But there is a defense that costs nothing and takes about two minutes to set up: a family safe word.

The idea is simple. You pick a word or short phrase — something that would never come up naturally in a crisis conversation — and share it privately with the people closest to you. If someone calls claiming to be a family member in trouble, you ask for the safe word before doing anything else. No word, no money, no exceptions. This works because AI voice cloning can perfectly replicate how someone sounds, but it cannot produce a secret it was never given. Researchers and consumer advocates at institutions including Vanderbilt University Medical Center have recommended this exact step as a first-line defense against voice impersonation. Choose something specific enough to be unmistakable — a childhood nickname, the name of a family pet from decades ago — and keep it off social media.

A safe word turns the scammer's best weapon into a single, answerable question: do you know the word or not?

Gobbles Gobble's Take: Pick a safe word with your family tonight — it's the cheapest fraud protection you'll ever have, and it works even against perfect AI voices.

Sources: Aidarsi.com · VUMC News


He Saw the Face, Heard the Voice, and Wired the Money Anyway — Because It Was All AI

A retired government officer in Kerala, India, got a WhatsApp video call from what looked and sounded exactly like a trusted former colleague. The face was right. The voice was right. The emergency felt real. He transferred the money. The "colleague" was a deepfake.

This case illustrates how far the fraud has moved beyond voice calls alone. Scammers now combine three techniques in sequence: smishing (fake text messages that establish initial contact), vishing (voice phishing calls that build urgency), and AI-generated deepfake video that supplies the visual "proof" a victim needs to believe the story. The FBI has separately warned about campaigns in which bad actors impersonated senior U.S. officials using this same combination of tactics. What makes deepfake video calls particularly dangerous is that they remove every traditional warning sign — no bad grammar, no strange accent, no awkward pause. The person on screen looks directly at you and speaks naturally, and the whole exchange is over in minutes, before you've had a chance to question it.

Speed is part of the design. The scam is engineered to close before doubt has time to open.

Gobbles Gobble's Take: If a video call from a friend or colleague opens with an urgent money request, treat it as a deepfake until you've called them back on a number you already had saved.

Source: Deepfake Scams Exposed Podcast


That 'Bank Representative' on the Phone Might Be AI — And It's Not Asking for Your Password Yet

The AI-powered scam calls arriving in 2025 and 2026 don't open by asking for your account number. That's what makes them different — and more dangerous than the obvious fraud attempts most people have learned to spot.

Today's AI-driven impersonators can sound exactly like a bank customer service line, a Medicare representative, or a tech support agent. They speak clearly, answer follow-up questions naturally, and often spend the first call doing nothing more suspicious than "confirming your account" with information they already gathered from a data breach or your social media. The ask — wire a payment to a "safe account," install remote-access software, read back a one-time code — comes later, once trust is established. According to a consumer advisory on AI fraud patterns from Wright-Patt Credit Union, the combination of personalization and perfect speech removes the obvious red flags that once helped people catch scam calls early.

The most dangerous call you'll receive this year may be the one that sounds completely normal.

Gobbles Gobble's Take: Any unexpected call from a "trusted" organization asking you to act fast deserves one response: hang up and call them back on the number printed on their official website or your card.

Sources: Wright-Patt Credit Union · Guy Kawasaki / Substack


Quick Hits

  • Free fraud-prevention guides from AOL: AOL has published a plain-English resource on how to recognize and avoid common bank fraud and scam calls — worth bookmarking and forwarding to older relatives. AOL
  • Your voice can be cloned in 10 seconds: A widely shared explainer walks through exactly how little audio scammers need — and why posting video on public social accounts raises your family's risk. Substack

