
Adobe's terms have been clarified: the company will never own your work, or use it to train artificial intelligence.

Adobe's change in terms broke the Internet yesterday after a number of professional users of the company's apps reacted with anger and confusion to the scary-looking wording.

The company initially issued a rather dismissive statement to the effect that its terms had been in place for many years and it was simply clarifying them, but it subsequently wrote a blog post addressing the issue in more detail …

The controversy over the updated Adobe terms

Yesterday, Adobe Creative Cloud users opened their apps and found that they were forced to agree to new terms that included some scary language, which seemed to suggest that Adobe was claiming rights to their work.

Even worse, there was no way to continue using the apps, contact support to ask for clarification, or even uninstall the apps without first accepting the terms.

A number of well-known professionals didn't hold back in weighing in on the matter.

At the time, we noted that possible explanations included things like thumbnailing customer work and CSAM scanning – and the company has since confirmed that both apply.

Adobe's initial statement

When we asked Adobe for comment, the company's initial statement was unhelpful, with a dismissive “nothing to see here, move on” tone.

This policy has been in place for many years. As part of our commitment to being transparent with our customers, earlier this year we added clarifying examples to our Terms of Service regarding when Adobe can access user content. Adobe accesses user-generated content for a number of reasons, including the ability to provide some of our most innovative cloud-based features, such as Photoshop's neural filters and background removal in Adobe Express, and to take action against prohibited content. Adobe does not access, view, or listen to content stored locally on any user's device.

Subsequent explanations

However, the company subsequently realized that the issue wasn't going to go away until it offered a proper explanation, which it did in a blog post.

The company gave a general explanation that it wants to be transparent about the content reviews it carries out, and also published a changelog highlighting what had changed.

The purpose of this update was to more clearly outline the improvements we have implemented to our moderation processes. Given the explosive growth of generative AI and our commitment to responsible innovation, we have added more human moderation to our review processes for submitted content.

The highlighted changes reflect the fact that Adobe now uses both automated and manual scanning – specifically, content flagged by the automated scan is then passed to a human for review.

They further clarify that this review is intended to catch CSAM, as well as uses of the apps that violate the company's terms of use, such as spamming or hosting adult content outside the designated areas.

The company also confirmed that generating thumbnails of customer work is one of the reasons for these terms.

Finally, it provided two key assurances:

  • Adobe does not train Firefly Gen AI models on customer content. Firefly's generative AI models are trained on a dataset of licensed content, such as Adobe Stock, and public domain content whose copyright has expired. Read more here: https://helpx.adobe.com/firefly/faq.html#training-data
  • Adobe will never take ownership of a customer's work. Adobe hosts content to enable customers to use our applications and services. Customers own their content, and Adobe does not take any ownership rights in a customer's work.

It said it will “clarify the acceptance of the Terms of Service that customers see when opening apps.”

Photo by Emily Bernal on Unsplash
