The rise of artificial intelligence (AI) has raised concerns about identity verification tools on cryptocurrency exchanges. Rapidly advancing AI technology has made creating deepfake identity proofs easier than ever, and the resulting risks in the crypto world have prompted discussion among industry leaders.
CZ’s Statement on the Matter
Changpeng Zhao, founder and CEO of the global cryptocurrency exchange Binance, raised the alarm about malicious actors using AI in the crypto space in a tweet on August 9th.
“This is quite scary in terms of video verification. Even if someone sends you a video, do not send them coins.”
Like many other crypto exchanges, Binance requires investors to provide video evidence as part of its Know Your Customer (KYC) process before performing certain transactions.
The Binance CEO was referring to a video featuring Joshua Xu, the founder and CEO of HeyGen, in which an AI-generated avatar resembling the real HeyGen CEO replicated his facial expressions, voice, and speech patterns.
Xu stated, “Both of these video clips were created by 100% artificial intelligence, which includes my avatar and voice clone.” He added that HeyGen’s lifestyle avatar has improved significantly in video quality and voice technology and can now mimic his unique accent and speech patterns. According to Xu, the tool will soon move to production and become accessible to everyone.
Issues with Deepfakes
HeyGen’s CEO stated that, once it is made available to the public, the AI tool will let anyone create a lifelike digital avatar in just two minutes. The public release of AI creation tools like HeyGen’s could cause serious identity verification problems for crypto exchanges such as Binance. Like many other exchanges, Binance implements KYC measures that require users to submit a video containing their personal information and specific documents in order to access services and even withdraw funds from the platform.
Binance’s video statement requirements specifically ask users to submit the video along with official identification documents such as an ID card, driver’s license, or passport. The policy also requires users to state the date and their specific request in the recording, and it explicitly instructs, “Please do not watermark or edit your videos.”
Binance’s Chief Security Officer, Jimmy Su, had previously warned about the risks posed by AI-generated deepfakes. In late May, Su argued that AI technology had advanced to the point where deepfakes could soon become undetectable by a human verifier.