    Brooklyn Retiree Lost $1.6M in AI 'Pig Butchering' Romance Scam

A single text message from a woman named “Jenny” cost 76-year-old Ron Williams his entire $1.6 million in retirement savings. The scammer's videos, photos, and investment “dashboard” were all AI-fabricated, and his own son had to use AI to prove it.

    12 min read · By AuthentiLens Editorial
    [Image: A smartphone with a generic text-message interface beside a blurred AI-generated portrait on a kitchen table]

    What happened

    Ron Williams is 76 years old. He is a retired insurance agent who spent decades building a life in Brooklyn. Last year, a single text message from a stranger cost him every dollar he had saved.

    The woman who sent that message called herself “Jenny.” She said she was 33 years old, a Christian living in Boston. Over weeks of conversation, she sent him photos and videos. She talked about her faith, her dreams, her day-to-day life. She seemed real. She seemed trustworthy. She seemed like someone who cared about him.

    None of it was real. Jenny was a ghost built from artificial intelligence. Her face was generated by AI image models. Her videos were deepfakes. The investment platform she recommended, which showed Williams's money growing to $4 million, was a fake dashboard designed to look legitimate. Every piece of evidence Williams thought proved Jenny was real was, in fact, a product of the same technology that destroyed him.

    According to an NBC News report by Vicky Nguyen, the moment Williams finally accepted the truth came from an unexpected source. His son used publicly available AI tools to generate a video of “Jenny” from scratch, proving to his father how easily a complete human identity can be fabricated. The demonstration, not the warnings, was what broke through.

    The anatomy of the scam: how Ron Williams lost $1.6 million

    The scam began the way most pig butchering operations begin: with a wrong-number text. Williams received a message from someone he did not know. “Jenny” introduced herself, and the two began chatting.

    NBC News reported that the relationship developed over months. Jenny sent Williams increasingly personal messages. She sent photos. She sent videos. She talked about her life as a 33-year-old Christian woman in Boston. Williams, who was lonely, began to trust her.

    Once the emotional bond was established, the financial hook came. Jenny told Williams she had become wealthy by investing in an online cryptocurrency platform. She encouraged him to do the same. Over the course of about six months, Williams invested his entire life savings, roughly $1.6 million, into the platform Jenny recommended.

    The platform showed Williams his balance climbing. At its peak, the fake dashboard displayed approximately $4 million, more than double what he had deposited. He believed he was making the investment of a lifetime.

    When Williams attempted to withdraw his money, the scam moved to its final phase. Jenny told him he needed to pay additional funds to cover “taxes” on his gains before he could access his money. This is the signature tell of pig butchering scams: a platform that shows enormous gains but requires endless fees before allowing any withdrawal.

    Williams grew suspicious. He had also tried to meet Jenny in person several times, according to NBC News, and she always had an excuse. She would not appear on video calls that felt spontaneous. She would not share verifiable details about her life.

    His son began investigating. He searched for the videos Jenny had sent and found the same footage being used across multiple different social media profiles, a clear sign of a synthetic identity. To drive the point home, the son used AI tools to generate his own video of “Jenny,” demonstrating how trivially easy it had become to create a convincing fake person.
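What the son did manually, automated tools do at scale by fingerprinting images. As a minimal illustrative sketch (not any specific product's pipeline), a perceptual “average hash” reduces a downscaled grayscale frame to 64 bits; reused footage produces nearly identical hashes even after re-encoding or light edits. The 8×8 grids below are stand-ins for downscaled video frames:

```python
def ahash(grid):
    """Average hash: 1 bit per pixel, set if the pixel is brighter than the mean."""
    flat = [p for row in grid for p in row]
    avg = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > avg)

def hamming(h1, h2):
    """Number of differing bits between two hashes (0 = likely the same image)."""
    return bin(h1 ^ h2).count("1")

# Stand-ins for 8x8 downscaled grayscale frames (values 0-255)
frame_a = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
frame_b = [[min(255, v + 3) for v in row] for row in frame_a]   # same frame, slight brightness shift
frame_c = [[255 - v for v in row] for row in frame_a]           # genuinely different frame

print(hamming(ahash(frame_a), ahash(frame_b)))  # 0  -> flagged as reused media
print(hamming(ahash(frame_a), ahash(frame_c)))  # 64 -> unrelated
```

Real systems hash actual downscaled frames and use more robust fingerprints, but the principle is the same: reused media collides under a perceptual hash, which is how recycled videos like “Jenny's” get flagged across profiles.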

    Williams accepted that he had been scammed. He reported the incident to the police and the FBI. His money is almost certainly gone forever.

    The $110,000 detail: a glimpse into the operation's mechanics

    One detail from the NBC reporting reveals how sophisticated these criminal operations have become. According to the coverage, at one point during the scam, “Jenny” claimed she had loaned Williams $110,000 to help him invest more. When Williams attempted to repay this fake loan via bank transfer to China, the transaction was blocked. The scammers then sent a courier to Williams's home to collect the money in cash, in person, with a suitcase.

    This detail is chilling for two reasons. First, it shows that pig butchering operations are not purely digital. They have physical infrastructure: couriers, money mules, and local agents who can appear at a victim's doorstep. Second, the $110,000 fake loan is a psychological weapon. By claiming to have lent Williams money, Jenny made him feel indebted to her. She also made him feel like they were partners in the investment: not a scammer and a victim, but two people working together toward a shared financial goal. This is the “fattening” phase of pig butchering: building trust so deep that the victim stops asking critical questions.

    Why it matters

    What happened to Ron Williams is not a one-in-a-million tragedy. It is the dominant shape of online fraud in 2026, and the numbers are staggering.

    Chainalysis: $5.5 billion in pig butchering losses

    According to blockchain analytics firm Chainalysis, pig butchering scams, also known as romance-baiting investment fraud, generated approximately $5.5 billion in criminal revenue across roughly 200,000 individual cases in 2024. The average loss per victim was about $27,500.

    Chainalysis reported that illicit revenue from pig butchering scams surged nearly 40% year-over-year in 2024, with the number of crypto deposits into these scams increasing by approximately 210%, suggesting a dramatic expansion in the victim pool.

    The firm also documented a troubling evolution: scammers have begun targeting unemployed individuals with fake work-from-home job offers, generating smaller but faster revenue than traditional romance scams. AI service vendors supporting these operations saw their revenue skyrocket by 1900% in 2024.

    FTC: older Americans lost $2.4 billion in 2024

    The Federal Trade Commission's 2024–2025 report on protecting older consumers found that total fraud losses among adults aged 60 and older have quadrupled since 2020, rising from approximately $600 million to $2.4 billion in 2024. The FTC identified investment scams, romance scams, and impersonation schemes as the most financially devastating categories for older Americans. Tech support scams accounted for more than $159 million in losses last year alone.

    FBI: $632 million in AI-enabled investment scams

    As AuthentiLens reported in our coverage of the FBI's 2025 Internet Crime Report, the bureau documented $632 million in losses from investment scams with a confirmed AI component, a category that did not exist in the IC3's reporting until this year. That figure is almost certainly a dramatic undercount. It only includes cases where victims filed a complaint and where investigators could definitively prove AI was involved.

    The underreporting problem

    Perhaps the most important number is the one that does not appear in any report. Security researchers estimate that only 7% to 10% of romance scam victims ever report the crime to authorities. Shame, embarrassment, and the belief that nothing can be recovered keep the vast majority of cases in the dark. For every Ron Williams who appears on NBC News to warn others, there are nine others who suffer in silence.

    Why AI changed everything

    Romance scams are not new. “Catfishing” (using a fake identity to deceive someone online) has existed for as long as the internet. But AI has fundamentally altered the economics and effectiveness of these operations.

    Before AI: the limits of manual catfishing

    A decade ago, running a romance scam required significant human labor. A scammer needed to:

    • Source stolen photos of a real person (risking those photos being reverse-image searched)
    • Maintain consistent backstories across multiple conversations
    • Avoid video calls (the fastest way to expose a fake)
    • Manage perhaps five to ten active relationships at a time

    The scams worked, but they were labor-intensive and had hard limits.

    After AI: industrial scale, perfect consistency

    Today, a single operator can run dozens of “Jennys” simultaneously using AI tools:

    • AI-generated faces eliminate the need for stolen photos. Every persona can have a unique, never-before-seen face that will not appear in reverse image searches.
    • AI chatbots can carry on hundreds of simultaneous conversations, each one personalized, grammatically correct, and emotionally calibrated to the individual victim.
    • Deepfake video tools can generate convincing “live” footage, including real-time video calls. Scammers are now using real-time deepfake overlays during live video calls to trick victims into believing they are speaking to a real person.
    • AI-generated dashboards create fake investment platforms that show any balance the scammer wants, complete with fabricated trading histories and growth charts.

    Investigations have documented fraudsters using generative AI to manage hundreds of high-quality profiles simultaneously. These are no longer obvious bots with broken English and stock photos. They are industrialized operations that can mimic human empathy, respond in real time, and bypass standard Trust and Safety filters.

    Why warnings didn't work, and what finally did

    One of the most important lessons from the Ron Williams case is also one of the most frustrating: when a victim is inside the frame of a scam, almost nothing the outside world says will get through.

    Williams's family tried to warn him. His son pleaded with him to stop sending money. But Williams believed Jenny was real. He had seen her face. He had heard her voice. He had exchanged countless messages with her. To him, the evidence of her existence was overwhelming.

    What finally broke through was not an article, a warning, or a desperate conversation. It was a demonstration. Williams's son used AI tools to generate his own video of “Jenny”, building a fake person from scratch in front of his father's eyes. Only when Williams saw how easy it was did he accept that his Jenny had been fabricated the same way.

    This is a crucial insight for families dealing with a loved one who may be caught in a romance scam. Logic and pleading often fail. A demonstration that shows how the technology works, building a fake persona in real time, can succeed where words cannot.

    How to protect yourself

    The AuthentiLens editorial team has distilled the Ron Williams case and the broader pig butchering epidemic into six concrete protections.

    1. Treat any “wrong number” text as a potential scam. The cold opener, “Hey, is this Michael?” or “Long time no talk!”, is the standard entry point for pig butchering operations. Do not engage. Do not correct them. Do not politely explain that they have the wrong number. Block the number and report it as spam. Every reply tells the scammer that your number is active and that you are willing to engage with strangers.
    2. Never invest in a platform recommended by someone you have only met online. Full stop. This is the single most important rule on this list. If the “investment opportunity” only exists inside the conversation, if you cannot find independent, verifiable information about it from sources unrelated to the person who recommended it, it is not an opportunity. It is a trap.
    3. Do not trust dashboards that show “gains.” Pig butchering scams run fake front-end interfaces that can display any number the operator wants. Your money is not growing. The only real test is an actual withdrawal to a bank account in your name. If the platform says you need to pay “taxes” or “fees” before you can withdraw, you are being scammed.
    4. Video calls are no longer proof of identity. Real-time deepfake tools now allow scammers to run convincing live video calls while typing responses from a script. If you are on a video call and suspect a deepfake, security experts recommend asking the person to do something unscripted and physically specific: “Hold up your hand and wave it slowly across your face” or “Turn your head 90 degrees to the side.” Current AI models often glitch or tear the digital mask during these movements. Even this test is not foolproof; verify through an independent channel before trusting anyone you have only met online.
    5. Have “the conversation” with elderly parents this week. Sit down with them and walk through the warning signs:
      • Has a stranger texted them recently?
      • Are they in an online relationship with someone they have never met in person?
      • Has that person mentioned cryptocurrency, investing, or a “secret” way to make money?
      • Have they been asked to send money, gift cards, or cryptocurrency?
      Agree on a family rule: no financial decision over a certain amount, even $500, is made without a second phone call to a trusted family member. This single rule would have saved Ron Williams.
    6. Scan suspicious messages, videos, and profiles before you reply. You are not expected to become a deepfake detection expert. Paste suspicious content into AuthentiLens. Our detection engine flags romance-scam language patterns, AI-generated photos and videos, and the urgency signals that pig butchering operators use to close the deal, all within seconds, before you reply, click, or pay.
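The language patterns mentioned in step 6 can be approximated by even a toy heuristic. The sketch below is illustrative only (it is not AuthentiLens's actual engine, and the phrase lists are invented for the example), but it shows the idea: score a message against known red-flag phrases before you reply:

```python
# Toy red-flag scorer; phrase lists are illustrative, not a real detection model.
RED_FLAGS = {
    "wrong-number opener": ["is this michael", "long time no talk", "hey stranger"],
    "investment hook": ["crypto", "investment platform", "guaranteed returns"],
    "withdrawal trap": ["pay taxes first", "unlock your funds", "processing fee"],
    "urgency": ["act now", "today only", "last chance"],
}

def scan(message: str) -> list[str]:
    """Return the red-flag categories whose phrases appear in the message."""
    text = message.lower()
    return [name for name, phrases in RED_FLAGS.items()
            if any(p in text for p in phrases)]

print(scan("I made my fortune on a crypto investment platform. Act now!"))
# ['investment hook', 'urgency']
print(scan("See you at dinner tonight"))
# []
```

Production detectors rely on trained models rather than phrase lists, but even this crude scoring would have flagged the combination of an investment pitch and urgency pressure that defines the pig butchering playbook.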

    A note on shame

    Ron Williams went on national television to tell his story. He let NBC News use his name and his face. He did this knowing that some people would judge him, would call him foolish, would wonder how anyone could fall for such a thing. He did it because he wanted to warn others.

    There is no shame in being scammed. Scams succeed because they exploit normal human responses: loneliness, trust, hope, the desire for a better life. The people who run pig butchering operations are professionals. They study psychology. They use the most advanced technology available. They run their enterprises like businesses, complete with market assessments, customer relationship management, and performance metrics.

    The shame belongs to them. Not to Ron Williams. Not to the thousands of other victims who will never appear on television.

    If you or someone you love has been scammed, the most important thing you can do is report it. File a complaint with the FBI's IC3 at ic3.gov. Contact your bank. Tell your family. The silence is what the scammers count on.
