From PC World: Adobe, the maker of Photoshop, Premiere, and other industry-standard tools in the Creative Suite package, has put its foot in its mouth. Last week an update to the Creative Cloud terms of service set off alarms across the web as users interpreted the new wording to mean that the company was using their cloud storage files to train its generative AI systems. Not true, says Adobe in a non-apology blog post.
According to the message from Creative Cloud design leader Scott Belsky and legal, security, and policy lead Dana Rao, it's all been a big misunderstanding. The language customers noticed, which says the company's automated systems may "access, view, or listen to your Content," certainly sounds like the kind of clause that would allow generative AI systems to be trained on user files, the same kind of AI systems Adobe has been pushing in Creative Suite for the better part of a year.
The blog makes a few things clear. One, customers own the files and content uploaded to Creative Cloud and edited with Adobe’s tools. Two, generative AI isn’t trained on it. Three, Adobe never scans local files saved on your computer, only the files that are uploaded to Creative Cloud.
Why, then, is Adobe scanning the files saved in cloud storage? That's the point of contention here. According to Belsky and Rao, the automated scanning systems exist to make sure uploaded files do not contain child sexual abuse material. In less legalistic terms, Adobe is using auto-scanning tools to make sure it isn't hosting child porn. If the system flags an image, video, or other file, it triggers a manual review by a human. Adobe also reserves the right to view user content "to otherwise comply with the law," e.g., when it is served a warrant to access private content.
View: Full Article