Non-consensual intimate images are being created using iPhone apps
Congress has sent sharp letters to tech executives like Apple's Tim Cook expressing concern about the prevalence of deepfake non-consensual intimate images.
The letters follow earlier reports that nude deepfakes were being created using dual-use apps. Ads promoting face swaps have appeared across social media, and users have exploited the apps to place real people's faces into nude or pornographic images.
According to a new report from 404 Media, the US Congress is taking action based on these reports, asking tech companies how they plan to stop non-consensual intimate images from being generated on their platforms. The letters were sent to Apple, Alphabet, Microsoft, Meta, ByteDance, Snap, and X.
Apple's letter specifically addresses the company's approval of such dual-use apps despite its App Review Guidelines. It asks what Apple plans to do to curb the spread of deepfake pornography, and cites the TAKE IT DOWN Act.
Tim Cook was asked the following questions:
- What plans are in place to proactively address the proliferation of fake pornography on your platform, and what is the timeline for implementing these measures?
- What individuals or stakeholders, if any, are involved in developing these plans?
- What is the process once a report is submitted, and what oversight is in place to ensure that reports are responded to in a timely manner?
- What is the process for determining whether an app should be removed from your store?
- What remedies, if any, are available to users who report that their image has been included in fake content without their consent?
Apple acts as the steward of the App Store, and in doing so it takes the blame whenever something gets into the store that shouldn't be there. Examples like deepfake tools or kids' apps turning into casinos become fuel for the argument that Apple shouldn't be the gatekeeper of its app platform.
The apps identified in previous reports, which Apple has since removed, showed clear signs of potential abuse. For example, users could download videos from Pornhub to use as face-swap sources.
Apple Intelligence cannot create nude or pornographic images, and Apple has blocked Sign in with Apple from working on deepfake sites. However, these steps are only the bare minimum Apple can and should do to combat deepfakes.
As the Congressional letter suggests, Apple needs to take steps to ensure that dual-use apps don't make it through App Review. At the very least, additional scrutiny should be applied to apps that promise AI-powered image and video manipulation, especially those offering face-swapping capabilities.