Deepfake pornography bill seeks to pass U.S. House by end of year
(ABC 6 News) – U.S. Senators from Minnesota and Texas have joined forces to fight against one of the latest concerns stemming from artificial intelligence — deepfake pornography.
Deepfakes – pieces of media like pictures, videos or audio – are created by A.I. in a way that makes the content look convincingly real, to the point where the average person would not know the material is fake.
Many examples of deepfakes on the internet today are either designed to scam people or, in increasingly prevalent cases, are highly realistic, sexually explicit materials released without the subject’s consent.
“It is estimated that 1 in 12 American adults has had some type of image distributed without their consent,” said Sen. Amy Klobuchar, DFL-Minn., during a virtual press conference about the bill.
Sen. Klobuchar and Sen. Ted Cruz, R-Texas, have set their sights on the deepfake issue with their latest piece of legislation, the “Take It Down Act.”
The act seeks to protect victims of these crimes, who range from celebrities like Taylor Swift to people like Molly Kelley of Otsego, Minnesota.
Kelley was one of more than 80 women whose images a close friend used to create deepfake pornography, and she spoke during the press conference about her experience.
“The knowledge that these images and video exist, and will likely exist forever, is an intolerable weight that has pushed me to speak here today,” she said.
The Take It Down Act, which has already passed the Senate unanimously, would make it a federal offense to make or share explicit deepfake materials without consent.
It would also require social media companies like Meta and Snapchat to remove such images from their platforms at the victim’s request.
The bill now sits in the House, awaiting approval.
Both Sen. Klobuchar and Sen. Cruz are confident it will pass and be on President Biden’s desk by the end of the year.