Voice cloning sounds like science fiction, but it's already happening!

The rapid development of artificial intelligence (AI) has brought both benefits and risks.


One worrying trend is the abuse of voice cloning. In seconds, scammers can clone a voice and make people think a friend or family member needs money urgently.


News outlets, including CNN, are warning that millions of people could be affected by this type of scam. As technology makes it easier for criminals to invade our personal spaces, being cautious about how we use it is more important than ever.

Image from: unsplash.com

How do criminals use it?


Cybercriminals use voice cloning technology to impersonate celebrities, authorities, or ordinary people for fraud.


They create urgency, gain the victim's trust, and demand money via gift cards, wire transfers, or cryptocurrency.


The process starts by collecting audio samples from sources like YouTube and TikTok. The technology then analyzes the audio to create new recordings.


Once the voice is cloned, it can be used for deceptive communication, often in conjunction with spoofing the caller ID to appear credible.


Many cases of voice cloning scams have made headlines. For example, criminals cloned the voice of a company director in the United Arab Emirates to carry out a $51 million heist.


A businessman in Mumbai fell victim to a voice cloning scam involving a fake call from the Indian Embassy in Dubai.


In Australia, scammers recently used a voice clone of Queensland Premier Steven Miles to try to trick people into investing in Bitcoin.


Teenagers and children are also targets. In one kidnapping scam in the US, a teenager's voice was cloned and her parents were manipulated into complying with the scammers' demands.


How widespread is it?


Recent research shows that 28% of adults in the UK have come across a voice cloning scam in the past year, with 46% unaware of this type of scam.


This shows a significant lack of knowledge, putting millions of people at risk of fraud. In 2022, almost 240,000 Australians reported being victims of voice cloning scams, resulting in financial losses of A$568 million.

Image from: unsplash.com

How can people and organizations guard against it?


The risks posed by voice cloning require a multidisciplinary response. People and organizations can take a number of steps to protect themselves against misuse of voice cloning technology.


First, public awareness campaigns and education can help protect people and organizations and reduce this type of fraud. Public-private collaboration can provide clear information and consent options for voice cloning.


Second, people and organizations should strive to use biometric security with liveness detection, a new technology that can recognize and verify a live voice instead of a fake one. Organizations using voice recognition should consider implementing multi-factor authentication.
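As a concrete illustration of the second point, a voice match can be paired with a time-based one-time code (the standard TOTP scheme from RFC 6238) generated on a device the real person controls. The sketch below is a minimal, hypothetical Python example, not a production authentication system; the function names are illustrative only:

```python
import hashlib
import hmac
import struct
import time


def totp(secret, for_time=None, digits=6, step=30):
    """RFC 6238 time-based one-time password (HMAC-SHA1)."""
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    # Dynamic truncation: take 4 bytes at the offset, mask the sign bit,
    # then keep the last `digits` decimal digits.
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)


def caller_is_verified(voice_match, secret, submitted_code):
    # A convincing voice alone is not proof of identity: also require a
    # valid one-time code from a second device the real person controls.
    return voice_match and hmac.compare_digest(totp(secret), submitted_code)
```

The point of the design is that a cloned voice cannot produce the one-time code, so even a perfect-sounding impersonation fails the second check.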


Third, improving investigative capabilities against voice cloning is another critical law enforcement measure.


Finally, precise and up-to-date regulations are needed for countries to manage the associated risks.


Law enforcement agencies in Australia recognize the potential benefits of AI. However, concerns about the "dark side" of this technology have prompted research into the "criminal use of artificial intelligence to target victims".


There are also calls for potential intervention strategies that law enforcement agencies could use to combat the problem. Such efforts should be linked to an overall national plan to combat cybercrime, which focuses on proactive, reactive and restorative strategies.


The national plan also establishes a duty of care for service providers, reflected in the Australian Government's new legislation to protect the public and small businesses. The legislation introduces new obligations to prevent, detect, report and disrupt fraud.


This will apply to regulated organizations such as telecommunications companies, banks and digital platform providers. The aim is to protect customers by preventing, detecting, reporting and disrupting cyber fraud involving deception.


Video from: https://youtu.be/E_jP1R6aiUU


