AI Product Contracts
Artificial Intelligence Product Contracts -- Having a clearly delineated AI product agreement can limit AI software product liability
New York Law Journal September 24, 2024
By Jonathan Bick
Bick is counsel at Brach Eichler in Roseland and chairman of the firm's patent, intellectual property, and information technology group. He is also an adjunct professor at Pace Law School and Rutgers Law School.
Artificial Intelligence (AI) is software. Consequently, proper AI product contracts address the same relationships and requirements as traditional software contracts. As with traditional software contracts, ambiguous use terms, unique termination clauses, intellectual property ownership, and ongoing programming support are present in AI product contracts. In both traditional and AI software agreements, these issues can lead to unexpected costs, legal disputes, and operational disruptions. However, significant differences between traditional software contracts and AI contracts may be found in contract clauses related to algorithms.
Algorithms are sets of instructions that, when properly implemented, accomplish a task. A non-computer example of an algorithm is a baking recipe. A cupcake recipe, for example, is a replicable set of steps to create cupcakes from scratch. Both baking recipes and computer algorithms have inputs, which contain all the items needed to perform a task, and a sequence of steps necessary to produce a specific output.
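By way of illustration only, the following sketch (hypothetical, with made-up quantities and steps) expresses the recipe analogy as a short computer algorithm in Python: defined inputs, a fixed sequence of steps, and a specific output.

    # Hypothetical illustration: an algorithm is inputs plus an ordered sequence of steps yielding an output.
    def cupcake_recipe(flour_cups, sugar_cups, eggs):
        # Inputs: everything needed to perform the task.
        batter = f"{flour_cups} cups flour + {sugar_cups} cups sugar + {eggs} eggs, mixed"
        # Steps: combine the ingredients, then bake the batter.
        baked = batter + ", baked at 350F for 18 minutes"
        # Output: the specific result the steps were designed to produce.
        return baked + " -> cupcakes"

    print(cupcake_recipe(2, 1, 3))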
AI software and traditional software differ primarily in who writes the algorithm. Programmers (people) write traditional software algorithms; computers write the algorithms for AI products.
Legal liability arises when a legal entity is sued for damages and the court finds that entity financially responsible for those damages. When a software algorithm contains an error, harm may arise. An issue unique to AI products is who will be held liable if the AI system makes a mistake. The potentially liable parties may include the user, the AI programmer, the AI software owner, or the AI itself.
In some cases, the AI system may be solely responsible. In other cases, the AI programmer who created the software, or the people using it, may be partially or fully responsible. Proper AI product contracts ameliorate or eliminate questions about who is responsible for AI mistakes.
Programmers are legal entities. They have legal rights and responsibilities. A legal entity can own property, enter into contracts, and appear in court. However, AI software is not a legal entity. Consequently, it is not a separate "person" in the eyes of the law, and it cannot be held accountable for its actions.
Both AI product contracts and traditional software contracts set out the responsibilities, rights, and obligations of the product seller and the product buyer, including the product scope, timelines, intellectual property rights, payment terms, and dispute resolution mechanisms.
AI product contracts and traditional software contracts must both address the legal difficulties associated with algorithm-related damage and algorithm intellectual property. However, they must address them differently because a programmer is a legal entity and a computer is not.
Generally, two types of software product contracts exist. The first is a content development agreement. The second is a product purchase agreement.
The content development agreement is used when content is created, such as software applications, websites, and media projects. Key elements of this type of agreement are defining project scope and cost, intellectual property rights, confidentiality, success metrics, timeframes, and warranties. Collaboration between the content owner and the developer is usually required.
The second type of software contract is a product purchase agreement. It is a contract between the product owner and the end-user, or customer, who intends to purchase the developed product. This agreement outlines the terms of sale, such as pricing, delivery terms, warranties, and any licensing or usage restrictions associated with the product.
Content development contracts typically do not raise algorithm issues in clauses governing project scope, success metrics, or time frames, but they usually do in clauses governing intellectual property rights, confidentiality, and warranties. Consequently, AI agreements and traditional software agreements must address intellectual property, confidentiality, and warranty clauses differently.
Regarding intellectual property, for example, the copyright in a software algorithm written by a programmer is initially owned by the writer (17 USC §201(a)). The writer may transfer those rights to others (17 USC §201(d)).
A Copyright Office ruling (March 16, 2023) stated that if a work is solely generated by an AI and lacks human authorship, there is no copyright protection; therefore, no one can own the copyright to the generated work, because it is in the public domain. The Copyright Office's authority over this type of matter has been confirmed (see generally Norris Indus. v. Int'l Tel. & Tel. Corp., 696 F.2d 918, 922 (11th Cir. 1983)).
Applying the reasoning noted above, the Copyright Office has refused to register any work that is solely generated by AI. A federal district court, in upholding the Office's decision to refuse registration for such a work, affirmed these principles (Thaler v. Perlmutter, No. CV 22-1564 (BAH) (D.D.C. Aug. 18, 2023)).
Product purchase agreements usually do not involve algorithm issues in contract clauses related to terms of sale, such as pricing, delivery terms, and warranties. However, the licensing clause is likely to turn on who wrote the product's algorithms. Consequently, AI agreements and traditional software agreements must address the licensing clauses differently.
Regarding licensing clauses, for example, a standard license agreement is a contract between the licensor (the intellectual property owner) and the licensee that grants the licensee permission to use identified intellectual property.
In traditional software product purchase licenses, the intellectual property owner is usually the person who created the work of authorship and hence is usually the licensor. When those rights are assigned or licensed to others, those others may become the licensor.
AI product purchase agreements, by definition, involve AI-created algorithms. Since an AI is not a legal entity, it cannot license the AI content.
AI software, including algorithms generated as part of a purchased AI product, cannot be licensed. Consequently, the intellectual property clause must address this matter.
AI product agreements should also take into account that a product liability action can be brought under any or all of three theories: negligence, strict products liability, and breach of warranty. These claims may arise when it is shown that a product was "defective" or "not reasonably safe" because it incorporated a flaw when made, a design defect, or inadequate instructions or warnings.
Consequently, in addition to having an AI software product agreement clearly delineate legal liability, other actions may be taken to limit damages. For example, the AI software product user, owner, programmer, or distributor can minimize legal liability by documenting that they exercised "reasonable care" in design or manufacture, even if they did not adopt the "safest possible" practice (Cover v. Cohen, 461 N.E.2d 864 (1984)).
AI software product liability may also be reduced by preemptively addressing certain strict liability issues. For example, the AI programmer, owner, or distributor can warn of the dangers of unintended uses of a product, provided those uses are reasonably foreseeable (see Lugo v. LJN Toys, Ltd., 146 A.D.2d 168 (N.Y. App. Div. 1989)). Generally speaking, an acceptable warning passes three tests: it adequately communicates the specific hazard, suitably conveys the scale of the hazard, and amply notifies the user how to avoid the hazard.