Stephen Henry thought, “It has to be official. If not, how could you get the Prime Minister?”
A Toronto man says he lost $12,000 to a cryptocurrency scam that used Justin Trudeau’s image to promote a fraudulent investment platform.
The scam was spread via a YouTube video manipulated with AI and voice-cloning technology to make it appear as though Trudeau was endorsing a cryptocurrency exchange and investment platform meant “to help Canadians protect their financial futures.”
“I thought, ‘It has to be legal, it has to be perfect. If not, how could you get the Prime Minister?’ So I thought, ‘It has to be official,’” Stephen Henry told CTV.
Henry initially invested $250, then kept putting in his savings, believing his investment had grown to more than $40,000.
When Henry tried unsuccessfully to withdraw some of his money, he realized he had been scammed.
“I’ve been cut off from any chance of getting ahead. That was all my money,” he said.
Henry is far from alone. Scams that use the likenesses of politicians and celebrities to trick individuals have increased along with improvements in the quality and availability of deepfake technology.
Taylor Swift, Pope Francis and Ukrainian President Volodymyr Zelensky are just a few examples of individuals whose likenesses have been co-opted into deepfake scams and disinformation campaigns.
Scammers use AI and voice-cloning technologies to create convincing but fraudulent endorsements. AI and machine-learning algorithms can overlay faces and mimic voices, reproducing mannerisms and vocal patterns.
Even unconvincing ads can be effective, especially on those unfamiliar with advances in AI technology.
Facebook users may have recently spotted an ad on the platform featuring a deepfaked Justin Trudeau promoting a cryptocurrency scam.
The fake ad uses footage from a CBC interview, but Trudeau speaks with an Australian accent.
“A hallmark of scams is that they have to be realistic enough to hook someone, but also fake enough that the people they do hook are likely to follow through,” Aengus Bridgman, an assistant professor at McGill University, told the National Post earlier this month.
While Bridgman said the Trudeau ad was poorly done, he noted that its crudeness also served a purpose: filtering out savvier viewers so the scam could focus on people more likely to put money in.
“That’s the type of person you want to catch with these ads, someone who isn’t digitally literate, in the same way that seniors in Canada fall victim to phone scams and identity theft,” Bridgman said.
The Prime Minister’s Office, through press secretary Jenna Ghassabeh, acknowledged the challenges posed by deepfake technology and the spread of false information targeting elected officials in a statement to CTV.
“The amount of fraudulent, false and misleading information and accounts targeting elected officials is increasingly concerning and unacceptable, especially in the age of deepfake technology,” Ghassabeh said.
As the federal government struggles to keep up with advances in the technology, officials say educating communities and encouraging critical engagement with information are key protection strategies.
“Public norms and discourse around deepfakes need to be fostered to create a social environment where people are not only more skeptical about what they see, but are also encouraged to challenge the informational claims of others,” notes the Canadian Security Intelligence Service (CSIS).
Some tech companies and social media platforms are using a combination of human review and automated methods to detect deepfakes, while there is also a push for legal frameworks that could hold the creators and spreaders of deepfakes accountable and offer protection to victims of defamation.
“To change societal norms, thought leaders and those most central to social networks are key,” CSIS adds. “Educational resources, including digital literacy training, are useful tools, especially if they target influential people. Videos explaining political deepfakes have been found to reduce uncertainty and thus can increase trust in the media. But norms only really change through collective action.”