A sophisticated online scam recently duped a French woman into giving away €830,000. The con artist? An AI-generated version of Hollywood star Brad Pitt. This shocking case highlights the risks of AI in the wrong hands and the emotional manipulation behind cyber fraud.
Falling for an AI “Romance”
The 53-year-old woman, an interior designer, first received a message from an account claiming to be Jane Etta Pitt, Brad Pitt’s mother. The message was flattering: she was the “perfect match” for the actor. The very next day, she was contacted by an AI-generated Brad Pitt.
He didn’t just send casual texts. Over the next year and a half, the imposter poured on charm with poems, love declarations, and even a marriage proposal.
“There are few men who write to you like that,” the victim told French media. “He knew how to talk to women. It was very well put together.”
At the time, she was going through a divorce from her millionaire husband, a man 19 years her senior. Vulnerable and looking for emotional support, she found herself swept away by what she thought was a romantic connection.
A Web of Lies
The scam began with seemingly small requests. The “actor” claimed to have sent her luxury handbags but asked her to cover the customs fees, starting with €9,000.
Soon, the stakes escalated. The scammer said he was in a hospital, battling kidney cancer. He added a layer of urgency by claiming his ex-wife, Angelina Jolie, had frozen his bank accounts, leaving him penniless despite his supposed fortune.
To make the story more believable, the con artist even fabricated a “doctor” who emailed the victim, confirming the dire situation. Convinced of her lover’s plight, she transferred nearly €800,000 to an account in Turkey.
Ignoring Red Flags
When her daughter warned her that the relationship was a scam, the woman stood by her “lover.” “You’ll see when he’s here in person, then you’ll say sorry,” she replied confidently.
Her belief was shattered when the real Brad Pitt appeared publicly with his girlfriend, Inès de Ramon, in the summer of 2024.
The Emotional and Financial Toll
The betrayal left the victim devastated, both financially and emotionally. She was hospitalized for depression after the ordeal. Speaking out, she admitted she initially doubted the messages but was overwhelmed by the personal and emotional nature of the communication.
“We’re talking about Brad Pitt here, and I was stunned,” she said. “At first, I thought it was fake, but I didn’t really understand what was happening to me.”
AI in the Wrong Hands
This incident underscores the growing dangers of AI technology in scams. Using advanced tools, criminals can create realistic fake personas, complete with images and personalized messages, to exploit their victims.
The AI-generated Brad Pitt sent:
- Poems and heartfelt messages to build trust.
- Images of himself in a hospital to evoke sympathy.
- Emails from “doctors” to reinforce credibility.
These tactics illustrate how scammers leverage technology to mimic real-life situations and manipulate emotions.
Preventing Future Scams
Cases like these serve as a warning about the increasing sophistication of online fraud. Here are some tips to protect yourself:
- Verify the identity of online connections. A reverse image search can help spot fake photos.
- Be skeptical of urgent financial requests. No legitimate celebrity or professional will ask a stranger for money online.
- Consult family or friends. A second opinion can often reveal red flags.
- Report suspicious activity. If you suspect fraud, inform authorities immediately.
Cybersecurity in the AI Era
As AI technology develops, so does its potential for misuse. This story is a chilling reminder of how emotionally manipulative, AI-assisted scams can devastate lives.
Signs of an AI-Driven Scam
| Tactic | What to Look For |
| --- | --- |
| Personalized messages | Overly flattering or emotional language. |
| Fake photos or videos | Images that seem overly polished or generic. |
| Urgent financial requests | Requests for money with high emotional stakes. |
| Complicated backstories | Elaborate stories involving illness or legal issues. |