
Adobe’s thoughts on the ethics of AI-generated images (and paying its contributors for them)

“We’re at a tipping point where AI is going to break trust in what you see and hear — and democracies can’t survive when people don’t agree on facts. You have to have a baseline of understanding of facts,” Dana Rao, Adobe’s general counsel and chief trust officer, told me. And while that’s not necessarily a new observation, the company’s launch of its Firefly generative image creator and overall GenAI platform this week puts it in a different context.

Maybe more so than any other company, Adobe is deeply embedded in both the creative economy and the world of marketing. And while the Adobe Summit, the company’s annual digital marketing event, unsurprisingly focused on how generative AI can help marketers market more effectively, there was no escaping the discussions around AI ethics, especially in the context of Firefly. Indeed, Adobe itself put AI ethics in the spotlight because the company clearly believes that this is, in part, what allows it to differentiate its generative AI offerings from those of its competition.

Adobe Summit on Tuesday, March 21, 2023, in Las Vegas. Image Credits: David Becker/AP Images for Adobe

“I manage the AI ethics program. We have a really good relationship with the engineering team as we’re developing new technologies,” Rao explained. “We’ve been reviewing AI features for the last five years. Every single AI feature that goes to market goes through the review board.” This team, it’s worth noting, also ensures that the AI-generated results are not just commercially safe but also free of bias (to ensure that when you ask for images related to an occupation, for example, the results cover a broad demographic set).  

As Adobe used its Adobe Stock service to train the model (in addition to openly licensed and public domain images), the company doesn’t have to worry about having the rights to these images. The photographers that contribute to Stock already have a commercial relationship with Adobe, after all, and are likely creating the kind of commercially safe images that Adobe’s customers are looking for — and that the company can then train its AI on. And since the licensing is clear, Adobe’s users won’t have to worry about breaking any copyright laws themselves.

“That stock database of images is the perfect place to go if you want to create something designed to be commercially safe. And we have the license for it — a direct license with the contributor. And that helps on both the ethics side and the copyright side,” Rao explained.


But that also raises questions about how to pay these contributors for the content they’ve licensed to Adobe, especially if services like Firefly take off. Today, stock photographers tend to receive royalties every time their photo is licensed on a platform like Adobe Stock. And while Adobe has the rights to use this content to train its model, Adobe Stock contributors will surely want to get paid for helping the company train these models, too. In the company’s defense, it’s been quite open about this, though how it expects to handle that compensation remains a bit vague. Rao didn’t offer specifics, but he did explain the company’s thinking in a bit more detail.

“What we’ve said is that we’re really reviewing all the different ways you could possibly do this and we’re going to do that through the beta,” he said. “I think the number one thing is that we’re committed. We feel it’s the right thing to do. We’re committed to compensating the people who are contributing their work to these databases. That’s what we want to make sure. That’s the message we want to get out there.” 

He stressed that Adobe wants there to be a value exchange between the contributors and the company. He argued that there are lots of different ways Adobe could pay contributors for how their images influence the AI-generated content, but because it’s hard to know exactly which images influenced how the model created a new one, it will also be hard for Adobe to decide how to compensate the creators who contributed to every AI-generated image. But Rao believes there may be proxies the company could use, and the solution may actually be another AI system.


“Speculatively, you could use an AI to analyze an image and say: where do we think it came from? There’s just a number of ways to imagine how you could come up with the right model. But there’s no need right now for us to solve that problem while we’re in beta,” said Rao.

He also noted that Adobe could pay photographers when a user asks for an image that’s specifically influenced by their individual style.

“When I think about a forward-looking [compensation] model, it’s style. Right now, that’s a negative. Artists don’t want their style to be ripped off. But what if you can monetize it? What if we can say: you give us your assets. We’ll plug it into Firefly and then if someone says: I want it to look like Dana Rao, we pop up a message saying for $2, you can get something in the style of Dana Rao, all of a sudden, I get a new revenue stream,” he explained. He also noted that it is now up to everybody who works in the creative economy to figure out new ways to make money.

That’s for Adobe and its content partners to figure out, though. For users who want to use Firefly to create their own assets, that’s not a problem they have to worry about. There are, however, some interesting questions around how, or even if, you can copyright AI-generated images.

“Where the copyright office is now — and I think there’s a decent chance that’ll stick, because technically speaking, it’s almost nonsensical, otherwise. Right now, they’re saying that if you type in a text prompt, the resulting image: no one owns it. You need a human to add expression in order to get copyright,” explained Rao. How much value a human would have to add to copyright an image, though, is still a bit unclear.

Adobe, together with a large number of partners, has long championed the Content Authenticity Initiative, which is developing standards and tools for tracking how an image was created and manipulated over time. And while this initiative has mostly focused on fighting deepfakes and misinformation, it may also come into play in this context, because it will allow companies to prove that they added their own expression to an AI-generated image.

Adobe’s thoughts on the ethics of AI-generated images (and paying its contributors for them) by Frederic Lardinois originally published on TechCrunch

https://techcrunch.com/2023/03/22/adobes-thoughts-on-the-ethics-of-ai-generated-images-and-paying-its-contributors-for-them/

