A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.
The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”
I don’t think general enforcement against deepfake porn consumption is a practical application of this proposed law in civil court. The practical applications are shutting down US-based deepfake porn sites and their advertising. As for possessors, consider cases of non-celebrities being deepfaked by their IRL acquaintances. In a scenario where the victim is aware enough of the deepfake to bring the matter of possession to court, don’t you agree it’s tantamount to sexual harassment? All I’m seeing here is the law catching up to cover disruptive tech with established legal principle.