
Horror Stories from a Crypto-Fraud-Fighter

I've seen dead people… at work. 

The old man's eyes were so cloudy that his pupils were hardly visible. His gaunt face and withered, pale skin reminded me of the "Screaming Man," his bones and muscles seemingly frozen from the moment he struggled to draw his last breath. I'm no forensic pathologist, but the man appeared to have been deceased for two weeks by the time the withdrawal request to drain his entire BTC wallet was submitted. Of course, he had "assistance" from the convenience of his casket.

Throughout my career in the crypto-fraud-fighting space, spooky cases such as these have been common escalations. While the underlying motives and details always vary, they all have one thing in common… a selfie.

Let's Take a Step Back

When banking and finance transitioned from brick-and-mortar to online, we applied the same mechanisms of trust and safety that felt comfortable. For example, I walk into my local bank and what do I need? Typically just my person and a government-issued ID. Whether you're an extravert or an introvert, there's no escaping that we're all innately social beings who require face-to-face interaction to give and gain trust.

However, the problem arises when we apply that same method of trust to our digital environments. Though face-to-face trust is core to our nature, selfie- and document-scanning solutions create intrusive, friction-filled user experiences while also generating costly manual review queues and allowing determined bad actors to slip through the cracks. Though these solutions are necessary for every crypto or fintech company, they should be implemented judiciously within a step-up authentication approach, not used as your first and only line of defense. In the spirit of Halloween, allow me to explain using real-life examples from my days fighting fraud on the crypto frontlines and advising others in this ever-evolving space.

Real-Life Fraud Horror Stories

  1. Deepfake Case: In the world of increasingly deceptive deepfake technology, discerning who is a real human being and who is not is becoming nearly impossible. This conundrum once reached an apex when a bad actor took control of a high-net-worth client's account and submitted a BTC withdrawal worth five million USD. The hacker used an ancestry service to produce a 3D rendering of the client's face, recreated the necessary "look left, look straight, look right" selfies, and successfully passed the identity verification check to send the withdrawal into processing mode. Meanwhile, a high-achieving fraud investigator decided to review all selfie and ID images associated with approved withdrawal requests that day. He noticed the ancestry service's logo in the bottom right corner and, just in the nick of time, canceled the withdrawal. Luck, not the identity solution, was what saved the day here.
  2. Elder Scam Case: Not so lucky are victims of elder fraud. The anonymous nature of blockchain technology makes crypto exchanges an enticing place for bad actors to trick and steal from older individuals. Investment scams are one such approach: bad actors establish rapport with older folks and promise immense returns from investing and trading crypto on their behalf. One elderly woman - we'll call her Ellen - fell into this evil trap and gave the bad actor full control of both her bank account and her crypto account, the latter established with the help of her newly entrusted "financial advisor". Once the crypto account was created and linked to her bank, the fraudster swiftly transferred Ellen's pension, converted it into crypto, and withdrew it to an external wallet address he controlled. Throughout the process, Ellen eagerly completed the identity verification links the fraudster forwarded her, assuming her gains were on the way. Little did she know she was assisting a fraudster in the unrecoverable exfiltration of her life savings.
  3. Hostage Case: Elder fraud can occur online (as in Ellen's case) as well as in person. Though the convenience of the former renders it more pervasive, the latter introduces an added dimension of danger and physical threat. Escalations involving a hand physically manipulating an elder's head in a selfie were unfortunately too numerous to count. In one extreme case, a fraud investigator noticed a mirror in the background of a selfie: it showed the man's hands zip-tied behind his back as three men stood in front of him, holding up a mobile phone between themselves and their hostage. The fraudsters' motives mirrored those in Ellen's case, except instead of hiding behind a computer screen, the bad actors chose physical force. The case took many twists and turns, but in the end, with help from law enforcement, the gentleman thankfully walked away with his life and his funds.

Conclusion

Had Prove’s identity authentication and verification solutions been in place, it’s likely that the bad actors in the above examples would have been stopped in their tracks. This is because Prove's approach to onboarding and servicing satisfies a “PRO” model of identity verification and authentication that traditional doc-scanning solutions alone do not:

  1. Possession – Confirm the user is in physical possession of the device with "something-you-have" authentication.
  2. Reputation – Screen for risk events to ensure the phone number being asserted for authentication has not been compromised or used by a bad actor.
  3. Ownership – Verify the phone number is associated with the rightful owner or true consumer.
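To make the step-up idea concrete, here is a minimal sketch of how the three PRO checks could gate a withdrawal request, escalating to a document scan only when phone-based signals look risky. All names here (`PhoneSignal`, `decide_verification`, the field names) are hypothetical illustrations, not any real Prove API:

```python
# Hypothetical step-up flow following the "PRO" model described above.
# Names and fields are illustrative only -- not a real Prove API.
from dataclasses import dataclass


@dataclass
class PhoneSignal:
    possession_confirmed: bool  # Possession: device is in the user's hands
    recent_sim_swap: bool       # Reputation: risky carrier event detected
    owner_matches_user: bool    # Ownership: number tied to claimed identity


def decide_verification(signal: PhoneSignal) -> str:
    """Return the next verification step for a withdrawal request."""
    if not signal.possession_confirmed:
        return "deny"  # can't even prove possession of the device
    if signal.recent_sim_swap:
        return "step_up_to_doc_scan"  # reputation risk: escalate, don't approve
    if signal.owner_matches_user:
        return "approve"  # all three PRO checks passed silently
    return "step_up_to_doc_scan"  # ownership unclear: escalate


# A number with a recent SIM swap gets escalated rather than auto-approved.
print(decide_verification(PhoneSignal(True, True, True)))
```

The point of the design is that the document scan becomes a targeted escalation path rather than the first and only gate, which is exactly the step-up posture argued for earlier.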

So, this Halloween, consider saving your customers from being "tricked" and instead "treat" them to better, safer experiences by leveraging Prove's identity solutions.
