When Worldcoin 2.0 (proof of personhood) was released last month with advanced features, AIM predicted it could be one of the strongest counters to future online fraud. In less than a month, the threat of AI-driven identity theft has become more real than ever, and Worldcoin's relevance has only grown.
‘Online KYC Compromised’
A recent image of a person’s online verification has been doing the rounds on the internet. The scary part is that the image is entirely AI-generated, reportedly created using Stable Diffusion. It depicts exactly what everyone had been fearing: AI-enabled identity theft.
Source: X
The near-perfect image of a person holding a placard can easily pass off as an online identity verification, a method commonly used for KYC (Know Your Customer). A number of banks and brokerage firms allow photo or video KYC, a practice that picked up sharply during the pandemic. However, this has also opened up the possibility of manipulation via AI tools.
A number of AI image-generation tools have sprung up in the last year, and most of them are still not governed by copyright laws, which leaves their use largely unchecked. The recently released Midjourney V6 allows one to generate images of any celebrity or brand without restriction. Furthermore, with the rise of deepfakes, video as a form of identification is also compromised.
Recently, Nithin Kamath, co-founder of Zerodha, spoke about the threat of online identity fraud by sharing his own deepfake video. Kamath explained the typical verification process, where ID and address proof are retrieved from either DigiLocker or Aadhaar, followed by an ID check via webcam, a step he believes is jeopardised in today's age.
In Comes a Legit Solution
A number of measures are commonly suggested, such as adopting two-factor authentication, updating passwords regularly, and verifying and reporting duplicate accounts; however, none of them are truly preventive. At the moment, the only legitimate method of online identification, one that may seem far-fetched but might just work, is Worldcoin.
World ID, a digital passport, seeks to validate unique human identities online while prioritising privacy. With the latest update, companies including Reddit, Shopify, and Discord have integrated World ID 2.0 into their platforms, allowing users to verify their accounts with it.
Worldcoin founder’s response to the AI-generated image for KYC. Source: X
The integration of Worldcoin across platforms in retail, gaming, and e-commerce aims to give users a safer experience. Fraud through duplicate accounts, availing multiple discounts, automated bots, and other bad actors are some of the issues Worldcoin can address. Proof of personhood also protects against unauthorised account sharing, a potential way for hackers to gain access to multiple accounts.
Furthermore, even options such as CAPTCHAs used for online human verification could be eliminated with Worldcoin.
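To make the account-verification step concrete, here is a minimal sketch of how a platform's backend might gate signups on a proof of personhood. The endpoint URL, field names, and helper functions below are illustrative assumptions for this article, not Worldcoin's documented API; the general pattern is that the user's device produces a zero-knowledge proof, and the platform checks it with the identity provider before creating the account.

```typescript
// Hypothetical server-side check: the platform receives a zero-knowledge proof
// from the user's wallet/app and forwards it to a verification service.
// The URL and field names are illustrative placeholders, not Worldcoin's real API.

interface PersonhoodProof {
  proof: string;          // zero-knowledge proof generated on the user's device
  nullifierHash: string;  // per-app identifier that stops the same person verifying twice
  merkleRoot: string;     // root of the verified-identity set the proof was made against
  action: string;         // what the user is verifying for, e.g. "create-account"
}

async function verifyPersonhood(p: PersonhoodProof): Promise<boolean> {
  // Placeholder URL; a real integration would call the provider's documented endpoint.
  const res = await fetch("https://verifier.example.com/api/verify", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(p),
  });
  if (!res.ok) return false;
  const { success } = (await res.json()) as { success: boolean };
  return success;
}

// Usage inside a signup handler: the account is created only if the proof
// checks out, so one person cannot mass-create duplicate accounts.
async function handleSignup(username: string, proof: PersonhoodProof) {
  if (!(await verifyPersonhood(proof))) {
    throw new Error("Proof of personhood failed; signup rejected.");
  }
  // ...create the account as usual...
}
```

Because the nullifier is unique per person per application, the same check that blocks duplicate accounts can, in principle, also replace a CAPTCHA on bot-prone actions.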
Mass Integration
The idea may sound perfect, but implementation is where the problem arises. For these measures to work effectively, Worldcoin or similar products need to be adopted at a near-universal level. However, owing to security concerns, Worldcoin is still viewed with suspicion.
The biometric data captured by Worldcoin's Orbs drew controversy when the project was initially launched. Though the company later published blogs explaining how the data is handled securely, claiming that it does not store anyone's data, adoption remains slow. Currently, Worldcoin has over 2 million users across 120 countries.
What was presumably the next obvious step for AI, becoming infamous for identity theft, is now emerging as a serious problem. Unless applications like Worldcoin and other digital proof-of-personhood methods are adopted, AI-enabled fraud will continue to rise.