In a disturbing case of cybercrime, scammers used a fake profile picture of a policeman on WhatsApp to deceive a businessman. The criminals accused the businessman of being involved in human trafficking, leveraging his fear and trust in authority to manipulate him. They sent him a fabricated arrest warrant and a seizure order via an online link, further escalating the pressure on the victim. In a brazen move, one of the scammers even impersonated a Supreme Court judge during a phone call with the businessman.
Through these deceptive tactics, the fraudsters convinced the businessman that he needed to undergo a "fund legalization process" and deposit his money into an account purportedly held by the Reserve Bank of India (RBI). The scam, which unfolded over a gruelling period of seven to eight hours, resulted in a significant financial loss of Rs 1.3 crore for the victim.
Despite the severity of such incidents, victims often find themselves without adequate support. While the government has publicized a cybercrime helpline number, 1930, it merely directs complainants to file their cases on the website www.cybercrime.gov.in. Even after a complaint is lodged, the responsibility to follow up and ensure action is taken largely falls on the victim.
This case highlights the broader issue of law enforcement agencies not playing a proactive role in assisting citizens who fall prey to online fraudsters. The lack of timely intervention and investigation into cybercrimes exacerbates the distress faced by victims. As cybercrime rates continue to rise, there is a pressing need for law enforcement to enhance their responsiveness and take on a more active role in protecting citizens from such sophisticated digital threats.
On Thursday, the Three Brotherhood Alliance, which launched a surprise offensive in Shan state on the country's northern border in late October, took control of the city from Myanmar's military administration. The rebel alliance claims the military has relinquished control of the Kokang region, an area roughly the size of Lebanon.
Since the campaign began, the coalition has signaled its intention to dismantle the organized scam operations that have flourished under the watch of militias loyal to the ruling junta.
“To eradicate telecommunications fraud, fraud dens and their protective umbrellas across the country, including the China-Myanmar border areas, our three coalition forces decided to jointly carry out this military operation,” the coalition stated upon the launch of the offensive.
The rebel groups' emphasis on the flourishing scam sector is likely an attempt to win over China, which has grown weary of seeing its citizens trafficked into the compounds to conduct scams, or targeted by so-called 'pig butchering' scams.
Over the weekend, junta leader Senior Gen. Min Aung Hlaing met with Chinese Vice Foreign Minister Sun Weidong in Naypyidaw to discuss border security and organized crime.
“The two sides will jointly maintain peace and stability on the China-Myanmar border, cooperate to combat cross-border criminal activities such as telecommunications fraud, and jointly promote regional peace, tranquillity, development and prosperity,” the Chinese Foreign Ministry said of the meeting.
According to the state media outlet China Daily, Wang Xiaohong, China's Minister of Public Security, also held a virtual meeting with Myanmar's Home Affairs Minister, Lt. Gen. Yar Pyae, in which the two agreed to strengthen law enforcement cooperation to protect security and stability in border areas, particularly by stepping up efforts against online and telecom fraud.
According to a UN report from August 2023, around 120,000 individuals have been coerced into scamming operations in Myanmar. Pig butchering scams typically involve a con artist establishing a rapport with a victim via social media, dating services, or messaging apps before luring them into fraudulent investments.
On January 5, Chinese state media reported that 41,000 individuals implicated in telecom fraud in Myanmar had been turned over to Chinese police over the previous year. It is unknown how many of those taken into custody were themselves trafficking victims.
Observers have cautioned that despite the crackdown in northern Myanmar, operations could easily relocate to criminal enclaves elsewhere in the country, particularly near the borders with Thailand and Laos.
The latest tactic adopted by threat actors is deepfakes, in which a cybercriminal manipulates audio and video media to carry out extortion and other frauds. In some cases, fraudsters have used AI-generated voices to impersonate someone close to the targeted victim, making it very difficult for the target to realize they are being defrauded.
According to ABC13, the most recent instance involved an 82-year-old Texan named Jerry, who fell victim to a criminal posing as a sergeant with the San Antonio Police Department. The con artist told the victim that his son-in-law had been arrested and that Jerry would need to provide $9,500 in bond to secure his release. Jerry was then duped into paying an additional $7,500 to "complete the process." The victim, who lives in a senior living facility, is considering getting a job to recover the money he lost, while the criminals remain at large.
This is not the first time AI has been used for fraud, however. According to Reuters, a Chinese man was defrauded of more than half a million dollars earlier this year after a cybercriminal fooled him into transferring the money by posing as his friend using an AI face-swapping tool.
Cybercriminals often employ similar tactics, such as sending morphed media of a person close to the victim in an attempt to extort money under the guise of an emergency. Impostor fraud is not new, but generative AI has given it a contemporary twist and significantly raised the stakes: the FTC reported in February 2023 that Americans lost around $2.6 billion to this type of scam in 2022.
Beyond ignoring calls or texts from suspicious numbers, one safeguard is to establish a unique codeword with loved ones, so the person on the other end of a call can be verified. One can also attempt to contact the supposed caller directly to confirm whether they are really in a difficult circumstance. Experts likewise advise hanging up and calling the individual back on a known number, or at least double-checking the information before acting.
Unfortunately, voice cloning is only one of several AI-based attacks scammers employ. Extortion using deepfaked content is a related threat: there have recently been multiple attempts by nefarious actors to blackmail people with graphic images generated by artificial intelligence. A report by The Washington Post documented numerous examples of deepfakes upending the lives of young people. In such cases, it is advisable to contact law enforcement immediately rather than handle the matter alone.