{"id":4939,"date":"2025-09-29T14:09:15","date_gmt":"2025-09-29T12:09:15","guid":{"rendered":"https:\/\/calysta.eu\/is-extra-protection-needed-against-deepfakes-via-ai-denmarks-bold-move-and-the-eus-path\/"},"modified":"2025-09-29T14:15:36","modified_gmt":"2025-09-29T12:15:36","slug":"is-extra-protection-needed-against-deepfakes-via-ai-denmarks-bold-move-and-the-eus-path","status":"publish","type":"post","link":"http:\/\/calysta.eu\/nl\/is-extra-protection-needed-against-deepfakes-via-ai-denmarks-bold-move-and-the-eus-path\/","title":{"rendered":"Is extra protection needed against deepfakes via AI: Denmark\u2019s Bold Move and the EU\u2019s Path"},"content":{"rendered":"<p>In an era defined by Artificial Intelligence technology, deepfakes, and disinformation, Denmark\u2019s recent initiative represents a major shift in European legal thinking. Last month, the country unveiled plans to fundamentally redefine digital identity, becoming the first government in Europe to propose granting individuals actual copyright over their image, voice, and physical traits.\u00a0(complete proposal in Danish, dated 06\/2025: <a href=\"https:\/\/kum.dk\/fileadmin\/_kum\/1_Nyheder_og_presse\/2025\/Aftale.pdf\">https:\/\/kum.dk\/fileadmin\/_kum\/1_Nyheder_og_presse\/2025\/Aftale.pdf<\/a>)<\/p>\n<p>This bold move signals a strong political commitment to reclaim agency in the face of generative AI\u2019s unchecked growth and the mounting threat of identity misuse driven by emerging technologies.<\/p>\n<p><strong>Here are our thoughts:<br \/>\n<\/strong>The concept itself isn\u2019t bad: giving people a way to combat the unauthorized use of their likeness in deepfake videos is important. However, we don\u2019t believe this idea will gain widespread traction across the EU for several reasons.<\/p>\n<ol>\n<li><strong>Personality Rights Already Offer Protection<\/strong><\/li>\n<\/ol>\n<p>Personality rights are fundamental and inalienable rights inherent to every human being. 
They protect personal attributes such as private life, image, and voice, and guarantee moral integrity. These rights also encompass respect for human dignity and physical integrity, including the principle of free disposal of one\u2019s body and the prohibition of inhuman or degrading treatment.<\/p>\n<p>In the context of deepfakes, personality rights already provide a legal basis for individuals to challenge the misuse of their likeness. For example:<\/p>\n<ul>\n<li>Image rights allow individuals to prevent others from using their visual representation without consent.<\/li>\n<li>Voice rights protect against unauthorized replication or manipulation of someone\u2019s voice.<\/li>\n<li>The right to privacy ensures that personal life details cannot be exploited or exposed without permission.<\/li>\n<\/ul>\n<p>These protections are especially relevant in cases of reputational harm, emotional distress, or exploitation through deepfake technologies. Therefore, introducing new legislation might be redundant when existing personality rights already offer robust safeguards.<\/p>\n<p>In other words, personality rights already provide a legal framework to challenge the unauthorized use of deepfakes.<\/p>\n<p>However, another question arises: could the more lenient application of personality rights to public figures have broader implications?<\/p>\n<p>For instance, regarding image rights, the image of a public figure, such as a minister, may be published without their consent if it was taken during the exercise of their official duties. However, any use that is commercial, degrading, or otherwise inappropriate remains prohibited.<\/p>\n<p>In this context, we struggle to see how deepfakes could be used without exceeding these legal boundaries. 
While their use might be tolerated in a clearly humorous or satirical setting, it\u2019s difficult to identify a legitimate context where deepfakes could benefit from the more lenient application of personality rights. After all, deepfakes are not necessary for conveying factual information, except, perhaps, for spreading misinformation.<\/p>\n<p>&nbsp;<\/p>\n<ol start=\"2\">\n<li><strong>Copyright Law Isn\u2019t Designed for This<\/strong><\/li>\n<\/ol>\n<p>Copyright protects original works that have been given a concrete form; it does not protect ideas or appearances per se. To apply copyright logic here, one would have to treat a person\u2019s appearance as a \u201cwork\u201d and assume it is inherently original. This reasoning doesn\u2019t align with the principles of copyright law.<\/p>\n<p>Moreover, copyright presupposes a creator and an act of creation. If we follow this logic, would our parents be considered the creators of our appearance, and thus the copyright holders, unless they granted a license? Also, copyright assumes intentional creation, which doesn\u2019t apply to natural human features.<\/p>\n<p>Additionally, copyright law is designed to protect creators, not to protect individuals from being impersonated. Stretching copyright to cover deepfakes would require a fundamental rethinking of its principles, which could lead to unintended consequences and legal confusion.<\/p>\n<p>As mentioned earlier, personality rights already offer effective protection against such misuse.<\/p>\n<p>&nbsp;<\/p>\n<ol start=\"3\">\n<li><strong>The AI Act Provides Regulatory Safeguards<\/strong><\/li>\n<\/ol>\n<p>The European Union has already introduced the AI Act, which imposes transparency requirements on AI-generated content, including deepfakes. 
Companies that fail to comply risk significant fines.<\/p>\n<p>In addition to the AI Act, the EU\u2019s directive on violence against women criminalizes the non-consensual creation or alteration of content that depicts individuals in sexual scenarios. This includes:<\/p>\n<ul>\n<li>Deepfake pornography<\/li>\n<li>AI-generated sexual content<\/li>\n<li>Manipulated videos that simulate real people in compromising situations<\/li>\n<\/ul>\n<p>This directive ensures that victims of deepfake abuse have legal recourse, even if the content is digitally fabricated. Member states are required to implement these rules by June 2027, reinforcing the EU\u2019s commitment to protecting individuals from AI-related harm.<\/p>\n<p>&nbsp;<\/p>\n<ol start=\"4\">\n<li><strong>National Trends and Influence from France<\/strong><\/li>\n<\/ol>\n<p>In Belgium, we often follow France\u2019s legislative lead. In 2024, France updated its criminal code to prohibit the sharing of any AI-generated visual or audio content, such as deepfakes, without the consent of the person depicted.<\/p>\n<p>Reshared content must be clearly labelled as AI-generated. The law also includes a specific ban on pornographic deepfakes, even if they are clearly marked as fake.<\/p>\n<p>Given this trend, it\u2019s reasonable to expect that Belgium will adopt similar measures, reinforcing the idea that new EU-wide legislation may be unnecessary when national laws are already evolving in this direction.<\/p>\n<p>&nbsp;<\/p>\n<p>This is why, in our view, Denmark\u2019s approach to combating deepfakes through copyright law is unlikely to gain widespread support across other EU member states. 
The fight against deepfakes is a European challenge, and we believe it should not be addressed through fragmented national legislation.\u00a0(see the minister\u2019s intermediate step of urging other EU member states to follow Denmark\u2019s example, \u201c<strong>The Minister of Culture wants to extend deepfake law to the rest of Europe<\/strong>\u201d, dated 16\/07\/2025: <a href=\"https:\/\/kum.dk\/aktuelt\/nyheder\/kulturministeren-vil-udbrede-deepfake-lov-til-resten-af-europa\">https:\/\/kum.dk\/aktuelt\/nyheder\/kulturministeren-vil-udbrede-deepfake-lov-til-resten-af-europa<\/a>)<\/p>\n<p>&nbsp;<\/p>\n<p>Instead, efforts should be grounded in the shared legal framework provided by the European Union, namely the new directive on violence against women, the AI Act, and personality rights. These instruments are better suited to address the complex realities of deepfake technology. While Denmark\u2019s use of copyright law is bold and symbolic, it may not be the most effective or appropriate tool for the task.<\/p>\n<p>Source image: https:\/\/www.weforum.org\/stories\/2025\/07\/deepfake-legislation-denmark-digital-id\/<\/p>\n<p>Johan Dedeckel &amp; L\u00e9opold Buscemi<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In an era defined by Artificial Intelligence technology, deepfakes, and disinformation, Denmark\u2019s recent initiative represents a major shift in European legal thinking. 
Last month, the country unveiled plans to fundamentally redefine digital identity, becoming the first government in Europe to propose granting individuals actual copyright over their image, voice, and physical traits.\u00a0(complete proposal in DK [&hellip;]<\/p>\n","protected":false},"author":25,"featured_media":4910,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[130],"tags":[335,144,146],"acf":[],"_links":{"self":[{"href":"http:\/\/calysta.eu\/nl\/wp-json\/wp\/v2\/posts\/4939\/"}],"collection":[{"href":"http:\/\/calysta.eu\/nl\/wp-json\/wp\/v2\/posts\/"}],"about":[{"href":"http:\/\/calysta.eu\/nl\/wp-json\/wp\/v2\/types\/post\/"}],"author":[{"embeddable":true,"href":"http:\/\/calysta.eu\/nl\/wp-json\/wp\/v2\/users\/25\/"}],"replies":[{"embeddable":true,"href":"http:\/\/calysta.eu\/nl\/wp-json\/wp\/v2\/comments\/?post=4939"}],"version-history":[{"count":1,"href":"http:\/\/calysta.eu\/nl\/wp-json\/wp\/v2\/posts\/4939\/revisions\/"}],"predecessor-version":[{"id":4941,"href":"http:\/\/calysta.eu\/nl\/wp-json\/wp\/v2\/posts\/4939\/revisions\/4941\/"}],"wp:featuredmedia":[{"embeddable":true,"href":"http:\/\/calysta.eu\/nl\/wp-json\/wp\/v2\/media\/4910\/"}],"wp:attachment":[{"href":"http:\/\/calysta.eu\/nl\/wp-json\/wp\/v2\/media\/?parent=4939"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/calysta.eu\/nl\/wp-json\/wp\/v2\/categories\/?post=4939"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/calysta.eu\/nl\/wp-json\/wp\/v2\/tags\/?post=4939"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}