{"id":990,"date":"2026-01-14T05:21:51","date_gmt":"2026-01-13T21:21:51","guid":{"rendered":"https:\/\/obagg.com\/index.php\/2026\/01\/14\/senate-passes-defiance-act-for-a-second-time-to-address-grok-deepfakes\/"},"modified":"2026-01-14T05:21:51","modified_gmt":"2026-01-13T21:21:51","slug":"senate-passes-defiance-act-for-a-second-time-to-address-grok-deepfakes","status":"publish","type":"post","link":"https:\/\/obagg.com\/index.php\/2026\/01\/14\/senate-passes-defiance-act-for-a-second-time-to-address-grok-deepfakes\/","title":{"rendered":"Senate passes Defiance Act for a second time to address Grok deepfakes"},"content":{"rendered":"<p>The Senate has passed the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act with unanimous consent, according to the bill\u2019s co-sponsor <a target=\"_blank\" class=\"link\" href=\"https:\/\/www.durbin.senate.gov\/newsroom\/press-releases\/durbin-successfully-passes-bill-to-combat-nonconsensual-sexually-explicit-deepfake-images\" data-i13n=\"cpos:1;pos:1\">Senator Dick Durbin (D-IL)<\/a>. The bill lets the subjects of nonconsensual, sexually explicit deepfakes take civil action against the people who create and host them.<\/p>\n<p>Deepfakes are a known issue online, but without proper protections, easy access to AI-powered image and video generation tools has made it possible for anyone to create compromising content using another person&#8217;s likeness. 
This has become <a target=\"_blank\" class=\"link\" href=\"https:\/\/www.engadget.com\/ai\/elon-musks-grok-ai-posted-csam-image-following-safeguard-lapses-140521454.html\" data-i13n=\"cpos:2;pos:1\">a particular problem on X<\/a>, where the integration of Grok, the AI assistant created by <a target=\"_blank\" class=\"link\" href=\"https:\/\/www.engadget.com\/social-media\/xai-elon-musks-ai-company-just-purchased-x-elon-musks-social-media-company-221503759.html\" data-i13n=\"cpos:3;pos:1\">X&#8217;s parent company xAI<\/a>, makes it possible for anyone to turn the content of another person&#8217;s post into an image-generating prompt. Over the last month, that&#8217;s allowed users to create sexually explicit images of children, just by replying to a post with @grok and a request.<\/p>\n<p>In response, Ofcom, the UK&#8217;s media regulator, has already <a target=\"_blank\" class=\"link\" href=\"https:\/\/www.engadget.com\/big-tech\/uk-regulator-ofcom-opens-a-formal-investigation-into-x-over-csam-scandal-120000312.html\" data-i13n=\"cpos:4;pos:1\">opened an investigation into X<\/a> for potentially violating the <a target=\"_blank\" class=\"link\" href=\"https:\/\/www.gov.uk\/government\/collections\/online-safety-act\" data-i13n=\"cpos:5;pos:1\">Online Safety Act<\/a>. The chatbot has also been <a target=\"_blank\" class=\"link\" href=\"https:\/\/apnews.com\/article\/grok-malaysia-indonesia-block-c7cb320327f259c4da35908e1269c225\" data-i13n=\"cpos:6;pos:1\">outright blocked in Malaysia and Indonesia<\/a>. 
The DEFIANCE Act won&#8217;t prevent Grok or other AI tools from generating nonconsensual deepfakes, but it would make creating or hosting that content potentially very expensive for anyone on the receiving end of a lawsuit.<\/p>\n<p>The Senate passed <a target=\"_blank\" class=\"link\" href=\"https:\/\/www.congress.gov\/bill\/118th-congress\/senate-bill\/3696\/text\" data-i13n=\"cpos:7;pos:1\">an earlier version<\/a> of the DEFIANCE Act in 2024, but it stalled in the House. Given the urgency of Grok&#8217;s deepfake problem, the hope is that this <a target=\"_blank\" class=\"link\" href=\"https:\/\/www.congress.gov\/bill\/119th-congress\/senate-bill\/1837\/text\" data-i13n=\"cpos:8;pos:1\">new version of the bill<\/a> won&#8217;t see the same resistance. Congress passed an earlier piece of deepfake regulation last year, the <a target=\"_blank\" class=\"link\" href=\"https:\/\/www.engadget.com\/big-tech\/trump-will-sign-the-take-it-down-act-criminalizing-ai-deepfakes-today-184358916.html\" data-i13n=\"cpos:9;pos:1\">Take It Down Act<\/a>, with bipartisan support. That bill was focused on the companies that host nonconsensual, sexually explicit content, rather than the people exploited by it.<\/p>\n<p>This article originally appeared on Engadget at https:\/\/www.engadget.com\/ai\/senate-passes-defiance-act-for-a-second-time-to-address-grok-deepfakes-212151712.html?src=rss<\/p>","protected":false},"excerpt":{"rendered":"<p>The Senate has passed the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act with unanimous consent, according to the bill\u2019s co-sponsor Senator Dick Durbin (D-IL). 
The bill lets the subjects of nonconsensual, sexually explicit deepfakes take civil action against the people who create and host them. Deepfakes are a known issue online, but without the proper protections, easy access to AI-powered image and video generation tools has made it possible for anyone to create compromising content using another person&#8217;s likeness. This has become a particular problem on X, where the integration of Grok, the AI assistant created by X&#8217;s parent company xAI, makes it possible for anyone to turn the content of another person&#8217;s post into an image-generating prompt. Over the last month, [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-990","post","type-post","status-publish","format-standard","hentry","category-share"],"_links":{"self":[{"href":"https:\/\/obagg.com\/index.php\/wp-json\/wp\/v2\/posts\/990","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/obagg.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/obagg.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/obagg.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/obagg.com\/index.php\/wp-json\/wp\/v2\/comments?post=990"}],"version-history":[{"count":0,"href":"https:\/\/obagg.com\/index.php\/wp-json\/wp\/v2\/posts\/990\/revisions"}],"wp:attachment":[{"href":"https:\/\/obagg.com\/index.php\/wp-json\/wp\/v2\/media?parent=990"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/obagg.com\/index.php\/wp-json\/wp\/v2\/categories?post=990"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/obagg.com\/index.php\/wp-json\/wp\/v2\/tags?post=990"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}