Deepfakes: A Looming Threat to Identity and Financial Security

In the digital tussle between reality and deception, Deepfakes have emerged as a game-changer, dramatically reshaping the delicate boundary between truth and fiction. A Deepfake is synthetic, AI-generated media with an uncanny ability to distort reality, and it raises fundamental questions about authenticity, trust, and the evidentiary value of what we perceive. Originally designed for harmless entertainment, the technology has swiftly trickled down into malicious hands as a means of exploitation. These sophisticated tools are now increasingly used to commit serious crimes, particularly identity theft and financial fraud.

The term ‘Deepfake’ was coined by a Reddit user combining ‘deep learning’ and ‘fake’, highlighting the use of deep learning techniques to disguise fraudsters as figures you trust. One might ponder, ‘How could such a technique even be convincing?’ Deepfakes leverage artificial intelligence and machine learning algorithms to create hyper-realistic videos, images, or audio that mimic real people. These manipulated media files can be produced with such precision that distinguishing them from genuine content is nearly impossible for the untrained eye.
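To make the mechanism concrete, below is a minimal, purely illustrative sketch (in PyTorch) of the classic face-swap idea behind early Deepfake tools: a shared encoder learns pose and expression from face images, while a separate decoder is trained for each identity, so encoding a frame of person A and decoding it with person B’s decoder yields B’s face mimicking A’s expression. All names and sizes here are hypothetical simplifications for explanation only; real pipelines add face detection, alignment, adversarial losses, and far larger models.

```python
# Illustrative sketch of the shared-encoder / per-identity-decoder idea
# behind early face-swap Deepfakes. Toy sizes, random placeholder data.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Maps a 64x64 RGB face to a latent code capturing pose/expression."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs one specific person's face from the shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()   # one decoder per identity

# Training (sketch): each decoder learns to reconstruct its own person's
# faces through the shared encoder; in a real loop this loss is backpropagated.
faces_a = torch.rand(8, 3, 64, 64)            # placeholder batch of person A
recon_a = decoder_a(encoder(faces_a))
loss = F.mse_loss(recon_a, faces_a)

# The "swap": encode person A's frame, decode with person B's decoder,
# producing B's face driven by A's pose and expression.
swapped = decoder_b(encoder(faces_a))
print(swapped.shape)                          # torch.Size([8, 3, 64, 64])
```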

Popularised in 2017, Deepfake or ‘face-swapping’ technology was first used by perpetrators to superimpose celebrity faces onto non-consensual explicit content. It was thereafter increasingly used to spread misleading information and perpetrate financial frauds ranging from loan and credit scams and investment frauds to the spreading of fake trading information.

The Rise of Deepfakes in Identity Theft and Financial Fraud

Imagine receiving a video conference call from your boss asking you to urgently transfer funds for a project. Suspecting nothing, you obligingly comply, only to discover later that your boss never made the call. A chillingly similar incident took place in Hong Kong, where a bank manager was scammed into transferring $35 million after receiving a call from what he believed was the CEO of his company, whose voice had been convincingly replicated using Deepfake technology.

Similarly, in India, a resident of Kozhikode, Kerala received a call from what appeared to be a former colleague requesting financial assistance for a relative admitted to hospital, and was distressingly scammed out of Rs. 40,000/-.

Such scenarios are cases of Identity Theft, a crime in which a fraudster uses the personal details of a real or deceased individual without their permission. Traditionally, identity theft involved stealing physical documents or hacking into personal accounts; the emergence of Deepfakes, however, has added new layers of complexity and risk to the crime. Typically, cybercriminals use Deepfakes through mechanisms such as:

1. Impersonation of an individual: Cybercriminals use Deepfakes to create realistic videos or voice recordings of people, enabling them to impersonate someone with astonishing accuracy. A fraudster sitting in any part of the world can digitally clone themselves into a known relative or a trusted financial executive, allowing criminals to carry out far more sophisticated scams. Imagine a fraudster calling a bank and pretending to be a client, using a cloned voice to convince the bank to change account details or approve financial transactions.

2. Synthetic Identities: Fraudsters can create entirely new digital identities by combining Deepfake Technology with stolen personal data. By crafting realistic photos, videos, and even voice recordings of individuals, they can effectively “become” someone else and commit fraudulent activities without raising suspicion.

3. Impersonation of the Deceased: Deepfakes are used to impersonate deceased individuals, convincing family members or beneficiaries to transfer inheritance funds to fraudulent accounts. Scammers create Deepfakes to fake the identity of deceased policyholders, claiming life insurance benefits or other payouts.

Why is Deepfake technology such a significant threat to both individuals and financial institutions? It can be used by perpetrators with alarming ease to rob individuals of their personality rights and right to privacy, guaranteed under Article 21 of the Constitution of India. If a financial institution falls victim to a Deepfake scam, it can suffer reputational damage: clients may lose trust in the security of their accounts or become hesitant to engage in financial transactions, and this loss of trust can have long-lasting consequences for a bank’s credibility.

Legal Provisions in India to combat the ongoing Deepfake menace:

The growing use of Deepfakes and AI-powered technology in committing financial fraud has created worldwide turbulence. Most countries have started to acknowledge the threats that accompany this advance in technology and the need for specialized legislation to combat the issue.

The European Union tackles Deepfakes under the Digital Services Act and the General Data Protection Regulation (GDPR), holding platforms accountable for harmful content and protecting personal data. China has enacted the Regulation on Deep Synthesis of Audio and Video, which requires Deepfake creators to disclose synthetic content, though its effectiveness is limited by the lack of penal provisions.

In India, the legal framework addressing Deepfakes, particularly in the context of identity theft and financial fraud, is still developing. Upcoming legislation, like the Personal Data Protection Bill, is expected to provide further protections. Still, certain provisions under existing laws can be invoked in cases involving Deepfake fraud. Under the Information Technology Act, 2000, Section 66D penalizes cheating by personation using computer resources, and Section 66E addresses violations of privacy, making it illegal to capture or transmit images of a person without consent. The IT Rules also mandate that social media platforms remove morphed content of any individual upon notification. Further, even though such issues are not explicitly tackled under the Indian Penal Code (IPC), Sections 499 and 500 of the IPC address defamation, criminalizing actions that harm an individual’s reputation, while Sections 419 and 420 cover cheating by personation and cheating that dishonestly induces the delivery of property or money.

Eminent cases that paved the way for legal actions against Deepfake-related offenses in India

In India, the misuse of Deepfakes has led to various legal challenges, particularly in cases involving personality rights, privacy violations, and financial fraud. For instance, in Anil Kapoor vs. Simply Life India & Ors. and Jaikishan Kakubhai Saraf Alias Jackie Shroff vs. The Peppy Store & Ors., the courts addressed violations of personality rights, which are closely linked to an individual’s right to privacy.

Additionally, when the image and voice of a Bollywood legend like Amitabh Bachchan were used without his consent for commercial purposes, legal action was taken to prevent the unauthorized use of his identity. In Amitabh Bachchan vs. Rajat Negi (2022 SCC OnLine Del 4110), the Delhi High Court directed the Ministry of Electronics and Information Technology as well as internet service providers to take down the websites and URLs infringing the personality rights of Mr. Bachchan, underscoring the need to protect personality rights against misuse.

Conclusion

Deepfakes represent a rapidly evolving threat landscape in identity theft and financial fraud. As technology continues to advance, both individuals and organizations must remain vigilant and proactive in safeguarding against these sophisticated forms of deception. By implementing robust security measures and fostering awareness, it is possible to mitigate the risks associated with this emerging threat.

Author: Rashmi Roy

Department of Law, University of Calcutta, 5th Year
