MrDeepFakes, a website known for non-consensual deepfake pornography, was exposed by CBC’s visual investigations team, revealing a Canadian pharmacist as a key figure behind the site. Following the revelation, a Danish MP is seeking the extradition of the pharmacist, David Do, to face prosecution in Denmark, where deepfake porn is illegal. The website, which hosted nearly 70,000 non-consensual deepfake videos and images, including those of Danish public figures, was shut down after the investigation. Currently, while Canada does not have laws criminalizing deepfakes, the Prime Minister’s Office has stated it will work to make both the creation and distribution of non-consensual sexual deepfakes a criminal offence.
Read the original article here
A Danish MP is calling for the extradition of the Canadian behind a notorious AI porn site, and it is a serious situation. The sheer scale of the alleged offences, nearly 70,000 deepfake videos and images, is shocking.
The core issue is that a Canadian citizen, David Do, is implicated in running MrDeepFakes, a site that hosted an obscene number of non-consensual and sometimes violent deepfake videos. Canada has extradition treaties with both Denmark and the Netherlands, the countries that would be making the request, so the potential for legal action is there. The challenge, however, is that deepfake porn is not yet explicitly illegal in Canada. Still, it is possible that Do could be extradited under existing harassment or defamation laws.
Notably, Denmark is not the only country involved; the Netherlands is also pursuing action. The hope is that the prosecution moves forward quickly, though the case of Kim Dotcom, who has been fighting extradition for over a decade, shows how long these legal battles can drag on.
The response to this situation is mixed. On one hand, there is a sense of justice and a desire to shut down this type of AI-generated content, along with a feeling that those who create and distribute these videos should face serious consequences, including potential prison sentences. On the other hand, there is skepticism about the likelihood of extradition: some believe Canada tends to protect its own citizens, and that the extradition process is incredibly complex.
The power and potential dangers of AI are the root of the problem. Once you’ve trained a model and written a prompt, a decent computer can churn out a video in a matter of minutes. Some suggest that anyone with a relatively modern graphics card has the technological capacity to create such content.
Tools like Photoshop, which have existed for decades and have long been used to create fake images of celebrities, are brought up as a point of comparison. That was not treated as criminal behaviour at the time, which raises questions about the legal distinctions being drawn for AI-generated content. The argument seems to be that AI simply makes it easier: the fundamental nature of the act, creating images or videos of people without their consent, remains the same. The difference lies in the scale.
Much of the emphasis falls on that scale. Creating deepfake porn at the volume of MrDeepFakes is a massive problem that needs to be addressed. It is not about judging every thought or artistic expression, but about the clear violation of consent and the potential for harm.
The legal framework also matters. The hope is that Danish law criminalizes Photoshop-created and AI-generated content equally, but that depends on the wording of the statute.
Some argue that the technology is readily available, so shutting down one site won’t eliminate the problem. The tools are out there, and the ability to create deepfakes will only grow.
There is also a debate about people generated entirely by AI. Should there be exceptions for content involving randomly generated individuals, or is any use of AI to create pornographic content inherently problematic? Some argue that the ethics of using AI for artistic purposes, such as creating art references, need to be carefully considered.
There is skepticism about how likely extradition is. Some think Canada may protect its own citizens, making the process unlikely to succeed. Others compare running a site of this scale to running an organized crime operation; the scale and the impact matter.
Making porn of people without their consent is fundamentally wrong: it means using someone else's body and likeness to create pornography without ever asking.
The question of whether the law draws a distinction is also raised. Does it treat Photoshop and AI differently, or does it criminalize this type of activity regardless of how the image is created?
Others believe that people who need the threat of punishment to know what they should not do are the worst kind. There are also claims that Canada has found legal loopholes to avoid honouring extradition requests, and that the inability to extradite people such as Nur Chowdhury is not the sign of a functional system. There is further discussion of where art ends and defamation begins.
It is important to remember that the core issue is the distribution of non-consensual images. The fact that they are shared with the public is what separates a private matter from the harms associated with creating and distributing deepfakes.
It is not about an individual's thoughts; it is about the distribution of AI-generated porn to others. The focus should be on the harm caused by sharing deepfakes and on the criminal implications of creating these videos without consent.
It is stressed that creating porn of people without their consent is deeply wrong. If this man misrepresented his creations by presenting them as authentic, he should be pursued in court for that; presenting them as real is the key factor. The point emphasized is that this is not the same thing as creating a fake for one's own private pleasure.
Some note that refusing to extradite someone to face the death penalty is a matter of moral principle. But the focus here is on a person accused of spreading harmful images, and on a blatant disregard for people's bodily autonomy.
