AI vendor contracts · vendor agreements · small business · contract red flags · data ownership · SaaS agreements · AI tools

6 AI Vendor Contract Red Flags Every Small Business Should Catch Before Signing

AI vendor agreements are loaded with one-sided terms that put your data, your budget, and your legal rights at risk. Here are six red flags to spot before you sign — and what to negotiate instead.

ACPrivilege.ai · Legal Technology · April 6, 2026 · 8 min read

You found an AI tool that could save your team hours every week. The demo looked great. The sales rep sent over a contract. You skim the terms, click "I agree," and move on.

Six months later, you discover the vendor owns everything the tool created using your data. Or they quietly raised prices 40% with a clause buried on page nine. Or your confidential business information was used to train a model that now serves your competitors.

This isn't hypothetical. A 2026 analysis found that 92% of AI vendors claim broad data usage rights in their agreements, while only 17% commit to full regulatory compliance. The gap between what AI vendors promise in demos and what they guarantee in contracts is enormous — and small businesses are the ones absorbing the risk.

Here are six red flags to watch for before you sign any AI vendor agreement.

1. The Vendor Claims Rights to Your Data (and Everything the AI Produces)

This is the most common — and most dangerous — clause in AI vendor contracts. Look for language like "a non-exclusive, worldwide, royalty-free license to use, reproduce, modify, and create derivative works from Customer Data."

What that means in plain English: the vendor can use your data however they want, including to train their AI models, improve their product for other customers, and create new offerings based on your proprietary information.

Even worse, some agreements give the vendor joint ownership of AI-generated outputs. So that market analysis the tool produced using your sales data? The vendor might own it too.

What to negotiate instead: Insist on language that says you retain full ownership of your data and all outputs generated from it. The vendor should have a limited license to process your data solely to provide the service — nothing more. And make sure the contract explicitly prohibits using your data for model training unless you opt in.

2. The Privacy Policy Doubles as a Confidentiality Waiver

Here's where things get legally dangerous, especially after the United States v. Heppner ruling from the Southern District of New York in February 2026.

In that case, a defendant used a consumer AI platform to prepare legal defense strategies. The court ruled those communications were not protected by attorney-client privilege — in part because the platform's privacy policy allowed the company to collect user inputs, use them for training, and share data with third parties including government authorities.

The lesson extends far beyond legal work. If your AI vendor's privacy policy says they can retain, analyze, or share your inputs, then any confidential business information you feed into that tool may not be confidential anymore. That matters for trade secrets, privileged communications, competitive intelligence, and regulatory compliance.

What to look for: Read the privacy policy as carefully as the contract. If it mentions data retention for "product improvement," sharing with "affiliates or partners," or cooperation with "regulatory authorities," you need to understand exactly what you're giving up.

What to negotiate instead: Look for vendors that offer enterprise-tier data handling with explicit no-training clauses, data processing agreements (DPAs), and contractual commitments that your data won't be used for anything beyond delivering the service.

3. The Vendor Can Change Pricing Whenever They Want

Many AI vendor contracts include a clause that lets the vendor modify pricing with as little as 30 days' notice — or sometimes no notice at all until your next billing cycle. For a small business budgeting $500 a month for an AI tool, a sudden jump to $800 can wreck quarterly projections.

This is especially common with usage-based pricing models. The per-unit cost stays the same, but the vendor redefines what counts as a "unit" — suddenly, each API call that used to be one unit is now three.

What to negotiate instead: Lock in pricing for the full contract term. If the vendor insists on the right to adjust, negotiate a cap on annual increases (10-15% is reasonable) and require 90 days' written notice. For usage-based models, get the unit definitions in writing as part of the agreement, not just the current documentation.
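To see why a cap matters, here is a quick back-of-envelope comparison. The figures are illustrative (the $500 base and 60% jump echo the example above; the 10% cap and three-year term are assumptions, not terms from any real contract):

```python
def price_after_years(base: float, annual_increase: float, years: int) -> float:
    """Compound a monthly price by a fixed annual increase rate."""
    return base * (1 + annual_increase) ** years

base = 500.0  # illustrative monthly price

# Uncapped: a single surprise 60% jump ($500 -> $800)
uncapped = base * 1.60

# Capped at 10% per year, compounded over a three-year term
capped = price_after_years(base, 0.10, 3)

print(f"uncapped jump:   ${uncapped:.2f}/mo")  # $800.00/mo
print(f"10% cap, 3 yrs:  ${capped:.2f}/mo")    # $665.50/mo
```

Even fully compounded, a 10% annual cap over three years leaves you well under the uncapped single-jump scenario, and every step is predictable at signing.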

4. You Can't Leave Without Losing Your Data

Vendor lock-in is a known problem in SaaS, but it's worse with AI tools because the "data" at stake often includes months of training, customization, and institutional knowledge baked into the system.

Watch for contracts that don't include data portability provisions. If you cancel, can you export your data in a usable format? How long does the vendor retain it? What happens to custom models or workflows you built inside the platform?

Some vendors make termination technically possible but practically impossible: they'll export your raw data in a proprietary format that no other tool can read, or they'll charge an "extraction fee" that rivals the cost of the entire contract.

What to negotiate instead: Include a data portability clause that guarantees export in standard formats (CSV, JSON, or whatever is standard for your data type) within 30 days of termination. Get written confirmation that the vendor will delete your data within 90 days of contract end, and that any custom configurations or models built on your data belong to you.

5. The Indemnification Only Runs One Way

Here's a stat that should concern every small business owner: only 33% of AI vendors provide indemnification for third-party intellectual property claims. That means if the AI tool generates content that infringes someone's copyright or trademark, two-thirds of vendors leave you holding the bag.

At the same time, nearly every AI vendor contract requires the customer to indemnify the vendor — meaning if something goes wrong on your end (even if the vendor's tool contributed to the problem), you're paying for the vendor's legal defense.

What to negotiate instead: Push for mutual indemnification. The vendor should indemnify you against IP infringement claims arising from the tool's outputs, and against data breaches caused by the vendor's security failures. Yes, you should indemnify the vendor for misuse on your end — but the obligations should be balanced.

6. There Are No Performance Guarantees

Would you hire an employee with no job description and no performance metrics? That's essentially what you're doing when you sign an AI vendor contract with no service level agreement (SLA).

Only 17% of AI contracts include warranties related to compliance with their own documentation — compared to 42% in traditional SaaS contracts. That means the tool might not do what the sales deck promised, and you have no contractual remedy.

What to negotiate instead: Get an SLA that specifies uptime guarantees (99.5% or better), response time for support tickets, and accuracy benchmarks if applicable. Include a right to terminate without penalty if the vendor consistently fails to meet these standards. And make sure the contract references the vendor's current documentation as a baseline for expected functionality.
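Uptime percentages sound abstract until you convert them into hours of allowed downtime. A rough sketch (using an average month of about 730 hours, an assumption for illustration):

```python
def allowed_downtime_hours(uptime_pct: float, hours_in_period: float = 730.0) -> float:
    """Downtime budget implied by an uptime guarantee, per billing period."""
    return (1 - uptime_pct / 100) * hours_in_period

for pct in (99.0, 99.5, 99.9):
    print(f"{pct}% uptime -> {allowed_downtime_hours(pct):.2f} h/month allowed downtime")
# 99.0% -> 7.30 h/month
# 99.5% -> 3.65 h/month
# 99.9% -> 0.73 h/month
```

In other words, a 99.5% guarantee still permits roughly three and a half hours of outage per month; if the tool is business-critical, that is the number to negotiate over.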

Why This Matters More for Small Businesses

Large enterprises have legal departments and procurement teams that negotiate these clauses as a matter of course. Small businesses usually don't. You're signing the same vendor's contract template that a Fortune 500 company would negotiate heavily — except you're accepting it as-is.

The financial exposure is real. A data breach caused by a vendor with no liability cap could cost your business tens of thousands in notification costs, regulatory fines, and lost customer trust. An IP infringement claim from AI-generated content could mean litigation costs that exceed your annual revenue.

This is exactly the kind of contract review where having an attorney in the loop matters — not just for negotiation leverage, but for privilege protection. When you review a vendor agreement with your attorney through a platform like acprivilege.ai, that analysis is protected by attorney-client privilege. If a dispute arises later, your internal assessment of the contract's risks can't be used against you in court.

Compare that to pasting the contract into ChatGPT and asking for a risk analysis. As the Heppner ruling made clear, that analysis has no privilege protection at all. It's fully discoverable — meaning the vendor's lawyers could potentially access your own AI-generated assessment of where the contract was weakest.

The Bottom Line

AI vendor contracts in 2026 are written to protect the vendor, not you. That's not unusual in business — every contract favors the drafter. But the gap between what AI vendors promise and what they guarantee is wider than in almost any other category of business software.

Before you sign your next AI vendor agreement, slow down and look for these six red flags. Better yet, have an attorney review it. The cost of a contract review is a fraction of the cost of discovering — after the fact — that your data, your budget, and your legal rights were never protected in the first place.


This post is for informational purposes only and does not constitute legal advice.
