Taylor Swift deepfakes and AI-generated CSAM highlight need for new laws

A series of apparent deepfakes of Taylor Swift has led lawmakers to propose new legislation that would address the problem in much the same way as existing laws covering so-called revenge porn …

Governments are also increasingly concerned about the use of AI to create child sexual abuse material (CSAM).

Deepfakes of Taylor Swift

Social network X was recently flooded with sexually explicit images purporting to show Taylor Swift, but which are actually fakes created using artificial intelligence. 404 Media reports that a Microsoft AI tool was used to create the images.

The report notes that one example alone received 45 million views and 24,000 retweets before it was removed. X acknowledged the problem, saying it had removed the images and taken action against the accounts behind them.

New Law Proposed: DEFIANCE Act

The Verge reports that a bipartisan group of senators has proposed new legislation to address the problem. Essentially, it would prohibit fake explicit images in the same way that existing law prohibits the posting of real ones without consent.

US lawmakers have proposed allowing people to sue for fake pornographic images of themselves following the spread of explicit photos of Taylor Swift created using artificial intelligence. The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would add a civil right of action for intimate “digital forgeries” depicting an identifiable person without their consent, allowing victims to recover financial damages from anyone who “knowingly created or possessed” the image with the intent to distribute it […]

It is based on a provision of the Violence Against Women Act Reauthorization Act of 2022 that added a similar right of action for real explicit images.

Senate Majority Whip Dick Durbin (D-Ill.), joined by Senators Lindsey Graham (R-S.C.), Amy Klobuchar (D-Minn.), and Josh Hawley (R-Mo.), specifically cited the Taylor Swift images as an example of how AI fakes can be used to exploit and harass women.

Although the bill was prompted by concerns about the abuse of AI image-generation tools, it would also cover hand-crafted fakes, such as those created using conventional image-editing apps like Photoshop.

Law enforcement fears flood of AI-generated CSAM

The New York Times reports similar concerns about AI-generated CSAM.

Law enforcement agencies are preparing for an explosion of AI-generated material that realistically depicts children being sexually exploited, making it difficult to identify victims and combat such abuse […]

By simply entering a prompt, users can generate realistic images, videos, and text in minutes, creating new images of real children as well as explicit images of children who do not actually exist. These may include […] ordinary classroom photos, altered so that all the children appear naked.

Predictably, however, politicians are using the issue as an excuse to call for a ban on end-to-end encrypted messaging.

Tom Tugendhat, the UK’s security minister, said the move would empower child predators around the world.

“Meta's decision to implement end-to-end encryption without strong security features makes these images available to millions of people without fear of being caught,” Mr. Tugendhat said in a statement.

Photo: Ronald Vaughan/CC2.0 (cropped)