Why is Adobe indemnifying some users against AI legal challenges?
*Cover image by Lumapoche on Pixabay.*
Since the use of generative AI has gone mainstream, there have been numerous allegations about these generators stealing content to create their outputs. Just this year, multiple lawsuits have been initiated by writers, artists, and copyright owners like Getty against tech companies including OpenAI (the creator of ChatGPT) and Stability AI. In these suits, the creators claim these companies have infringed their copyrights by downloading and using copyrighted content in AI training datasets. As proof of this activity, creators point out that these generators can create images in a particular artist’s style or provide summaries of entire books.
Adobe’s new product, Firefly, brings generative AI to Photoshop. Given this flood of litigation and its own read of customer concerns, Adobe anticipates that businesses may have reservations about using AI-powered tools to make content.[1] To mitigate these concerns, Adobe has taken a unique step: the company has announced that it will indemnify its enterprise customers for Firefly content.
What does “indemnify” mean?
Indemnification is a contract-law concept. You are most likely to see indemnity clauses toward the end of a long commercial contract. These clauses are essential to understanding your liability because they determine who pays if a third party (i.e., someone who is not you or the other contracting party) takes issue with something that results from your agreement.
If you indemnify someone, you are responsible for their legal defense if they get sued by a third party on certain claims. Under an indemnity, you might also agree to pay any damages if that defense is not successful in protecting against those claims.
Indemnity clauses can vary widely in the type of claims they will cover, and it is important to read these clauses carefully before assuming that they will protect you. In fact, it is likely that many agreements you sign on a regular basis require that you indemnify the other party. For example, the Adobe General Terms of Use state that the user agrees to indemnify Adobe against claims that are connected to the user’s content. This makes you, the user, responsible for the legal costs of a lawsuit against Adobe because of something you make using its software.
What is Adobe’s indemnification policy for Firefly?
Adobe has chosen to indemnify certain business users against claims “that allege the Firefly output directly infringes or violates any third party’s patent, copyright, trademark, publicity rights or privacy rights.”[2] This indemnity is limited: it will not cover customer-added content or certain prohibited customer conduct (as defined in Adobe’s contracts with those businesses).[3] For example, Adobe would likely not indemnify you for an infringement claim if you told the AI, in as much detail as possible, to generate a visual that resembled another person’s piece of art. That type of copying would be the result of your own conduct. Additionally, the exact terms of the indemnity clause are not yet public, so the coverage may differ from one use case to the next.
This move shows that Adobe does not anticipate substantial intellectual property liability arising from Firefly. That confidence rests in large part on the fact that the AI is trained only on Adobe Stock content, public domain works, and other openly licensed content.[4] This sets Adobe apart from other creators of AI tools, which train their models on content with widely varying permissions — which is why you are unlikely to see a similar promise from OpenAI anytime soon, if ever.
What does this mean for creators and Adobe users?
On one hand, this announcement is a great ad for Adobe’s attention to the legal risks emerging in the generative AI landscape. In addition to this indemnity and its promise to train the AI on a narrow subset of content, Adobe is working on developing other methods for creators to profit from the use of their content.[5] This focus also highlights the pitfalls of other generative tools, which are facing more controversy every day over the ways that their AI systems may infringe on creators’ rights or the privacy rights of individuals online. Where other companies are using content with impunity, Adobe is attempting to respect the rights and wishes of creatives that want control over how their work is used.
Still, Adobe users and creators alike should remain alert about the risks that these tools pose. Creators should review the terms under which their works are licensed or sold, and understand what rights they are handing over under those terms, along with other implications of AI training on those works. As Adobe and others begin to offer ways to profit from your work in this space, it will be important to monitor the uses of that work to make sure you are being properly compensated. Furthermore, those who use these tools should understand that the safeguards these companies put in place are not impenetrable, and they do not guarantee that you will be safe from liability in all instances.
If you want more information on indemnification, intellectual property rights, or other legal issues concerning generative AI, you should speak with an experienced attorney who can talk through these new and complex issues.
Notes
[1] See Scott Belsky Discusses Adobe, Art, Design & People in an Age of Generative AI, 2023 Upfront Summit (discussion of Adobe’s initiatives using generative AI tools).
[2] Firefly Legal FAQs – Enterprise Customers, Adobe, p. 3 (June 12, 2023).
[3] Firefly Legal FAQs, p. 3.
[4] Ron Miller, Adobe indemnity clause designed to ease enterprise fears about AI-generated art, TechCrunch (June 26, 2023).
[5] Some methods include allowing an artist to profit from software that lets users generate content explicitly in that artist’s style, as well as systems that track the use of the artist’s content in generated output. Scott Belsky, 2023 Upfront Summit, supra note [1].