Adobe Responds to ‘Terms of Use’ Controversy, Says It Isn’t Spying on Users

Adobe released a new blog post explaining the changes to its Terms of Use, when Adobe applications may access a user’s content, and whether that content will be used to train Adobe’s artificial intelligence (AI) models and services.

The need for clarification came after numerous users, including some established creative professionals, received pop-up notifications in Adobe applications that said, among other things, that Adobe could access user content through automated and manual methods. The resulting anger among the creative community is easy to understand.

The pop-up, which required consent for a person to continue using Adobe software, failed to explain precisely what had been updated in the Terms of Use and how Adobe may access someone’s content. Adobe’s opaqueness left the door open for speculation, confusion, and fear.

Among the most prominent concerns was that Adobe would essentially be spying on a person’s work. This is concerning in general but especially so for those who use Adobe software on projects covered by non-disclosure agreements (NDAs). Given Adobe’s prominence in the creative software segment, that fear applies to many people.

There were also concerns that Adobe was claiming ownership of a person’s work, which is very murky given the way some Adobe Creative Cloud services operate. Ultimately, no, Adobe doesn’t own work someone creates inside Creative Cloud apps or uploads to Adobe platforms, but it must hold some form of license to provide specific services. Rather boilerplate legalese tucked into terms of use and end-user license agreements, though common to many services with asset uploading and sharing tools, can seem terrifying to those who actually read it.

It wouldn’t be an Adobe controversy without people wondering if Adobe is training its Firefly AI using customer content, so that understandable worry popped up again, too.

“We recently made an update to our Terms of Use with the goal of providing more clarity on a few specific areas and pushed a routine re-acceptance of those terms to Adobe Creative Cloud and Document Cloud customers. We have received a number of questions resulting from this update and want to provide some clarity,” Adobe writes in its new blog post. “We remain committed to transparency, protecting the rights of creators and enabling our customers to do their best work.”

As referenced in the pop-up that launched the anger this week, Adobe updated language in sections two and four of its Terms of Use. The precise changes Adobe made are detailed in its blog post, but the primary revisions of note concern how Adobe may access, view, or listen to user content in “limited ways, and only as permitted by law.” Reasons for doing so include responding to customer feedback and support, detecting and preventing legal and technical issues, and enforcing content terms, like those that prohibit using Adobe software to create child sexual abuse material (CSAM).

Adobe further details its content moderation policies on a separate section of its website dedicated to transparency.

“To be clear, Adobe requires a limited license to access content solely for the purpose of operating or improving the services and software and to enforce our terms and comply with law, such as to protect against abusive content,” Adobe continues.

The company outlines three instances when Adobe applications and services may access user content. These include when access is required to provide essential services and functions, such as when opening and editing files for the user or creating thumbnails or previews for sharing.

Access is also required to provide some cloud-based features, including Photoshop’s Neural Filters, Liquid Mode, or Remove Background. People can learn more about how content may be viewed and analyzed in these instances in Adobe’s Content Analysis FAQ. For those working on sensitive, confidential material, it is perhaps worth considering the limited situations in which Adobe may view that content, including with real humans.

Finally, Adobe may access content that is processed or stored on Adobe servers. In these instances, Adobe may, automatically or using humans, screen for certain types of illegal content (like CSAM).

Adobe reaffirms that it “does not train Firefly Gen AI models on customer content.” Firefly is trained using licensed content, such as media on Adobe Stock and public domain content.

Further, Adobe says it “will never assume ownership of a customer’s work.”

“Adobe hosts content to enable customers to use our applications and services,” the tech giant explains. “Customers own their content and Adobe does not assume any ownership of customer work.”

“We appreciate our customers who reached out to ask these questions which has given us an opportunity to clarify our terms and our commitments. We will be clarifying the Terms of Use acceptance customers see when opening applications,” Adobe concludes.

Hopefully, these changes will reach customers sooner rather than later, because it is easy to see how this situation unfolded so rapidly. Without the context Adobe failed to include in its pop-up message, some standard terms of use seemed anything but.


Image credits: Adobe
