Dr. Muriel Frank, a researcher at the Interdisciplinary Centre for Security, Reliability and Trust (SnT) at the University of Luxembourg, investigates how online scams function and evolve.

As many of the traditional ways to spot online fraud are disappearing in the age of an AI-dominated internet, she offers insights into how one might stay cautious.

From princes to ‘pig-butchering’

“Online fraud has evolved significantly in recent years, becoming more sophisticated and widespread,” she explains.

While many are familiar with mass emails containing basic, impersonal, and urgent requests to transfer upfront funds in exchange for riches that never materialise, today’s scams have become more intricate and often involve an emotional element.

The romance scam first “introduced emotional grooming and the development of an intimate, often fabricated, relationship before making any financial requests,” she says.

Similarly, investment scams have “evolved to mimic professional advice, using language about risk, opportunity, and financial literacy to appear legitimate, and increasingly leveraging social media,” Frank adds.

Recently, Frank and her colleagues Ayse Nur Asyali and Pol Hölzmer investigated the “pig-butchering scam,” which, she explains, draws inspiration from these earlier schemes, combining the false promise of professional financial guidance with emotional support.

“Scammers maintain daily contact and gradually shift conversations from intimacy to investment,” she explains.

Thanks to crypto-based payment systems, fraudulent trading platforms, and industrialised training materials, these scams are increasingly difficult to detect and increasingly easy for fraudsters to scale.

“Thus, pig butchering is not merely an evolution of past scams, but rather a fusion of emotional manipulation and financial fraud,” she says.

Many such operations have been made possible by the advent of online banking and digital money transfers.

“One central issue is that more and more online scams use professional-looking, yet fake, investment platforms or cloned banking websites,” notes Frank.

“These sites closely mimic legitimate services and often feature fabricated positive reviews or online recommendations, making them nearly impossible to detect. As a result, they create a false sense of credibility, causing people to transfer money to accounts or invest in financial products they would normally question,” she adds.

Another aspect is the use of cryptocurrency, warns Frank, as “once victims transfer funds into crypto wallets controlled by scammers, it is extremely difficult, if not impossible, to recover the money.”

AI’s threat

Most of the issues described above are only worsening with the rise of AI tools, “which are making deception easier, faster, and more convincing,” says Frank.

These tools allow cybercriminals to operate vast numbers of authentic-looking fake social media profiles, which can then be used to interact with victims.

“They also use advanced face-changing and voice cloning technologies to impersonate real people during video calls or in pre-recorded messages, which makes it much more difficult for victims to detect anything amiss,” she says.

AI-powered translation tools also enable scammers to target anyone, anywhere. “Conversely, this means that traditional warning signs, such as poor grammar, mismatched voices, and low-quality images, can no longer be relied upon.”

The researchers also found that many cybercriminals openly share their tactics on social media, including methods for bypassing established security protocols such as ‘Know Your Customer’ checks, which banks use to verify customers’ identities.

“This is a troubling development because AI can now undermine security measures that were once considered reliable and which have now become increasingly vulnerable to manipulation,” she says.

To do so, scammers can use AI tools to generate “realistic ID photos, alter facial features, and produce manipulated selfies that can successfully deceive automated verification systems.”

Improving online safety

Frank observes that “many people still underestimate how skilfully and strategically cybercriminals operate. Rather than exploiting technical vulnerabilities, they prey on human emotions. This makes people of all ages vulnerable, albeit in different ways.”

Academic research into this topic thus plays an important role in better understanding how such scams function and evolve with time and technology.

In their pig-butchering study, the researchers examined 26 scam manuals used to train new recruits.

“By analysing these materials, we can better understand the precise strategies that scammers are taught to use, including how they build trust, apply emotional pressure, and scale their operations with industrial efficiency. This kind of insight is invaluable,” she says.

The important takeaway is that these scams are never isolated incidents but well-orchestrated schemes, and tackling them requires a comprehensive response from authorities and regulators.

While Frank admits that current technological advancements and the disappearance of traditional warning signs have made scams harder to spot, she offers one guiding principle: “be sceptical when online.”

“A big red flag is when an unfamiliar contact suddenly brings up cryptocurrency investments, sometimes within just a few days of making contact. This tactic is typical of pig butchering schemes,” she says.

“I would also advise people to be wary of anything that seems too good to be true. This includes offers of quick profits, assurances of risk-free gains, unexpectedly intense personal attention early in an online interaction, and lots of questions about age, profession, family status, property, and cars,” Frank adds.

Finally, she believes that raising awareness, which includes openly talking about experiences with online scams without shame and alerting friends and colleagues, “can significantly reduce the likelihood of others falling victim.”
